[Gluster-Maintainers] Build failed in Jenkins: netbsd-periodic #492

jenkins at build.gluster.org
Sun Dec 17 10:12:17 UTC 2017


See <https://build.gluster.org/job/netbsd-periodic/492/display/redirect?page=changes>

Changes:

[Xavier Hernandez] glusterd: Fix buffer overflow in glusterd_get_volopt_content

------------------------------------------
[...truncated 235.97 KB...]
Byte-compiling python modules (optimized versions) ...
__init__.py  gf_event.py  eventsapiconf.py  eventtypes.py  utils.py  handlers.py
 <https://build.gluster.org/job/netbsd-periodic/ws/install-sh> -c -d '/build/install/libexec/glusterfs/events'
 /usr/bin/install -c <https://build.gluster.org/job/netbsd-periodic/ws/events/src/glustereventsd.py> '/build/install/libexec/glusterfs/events'
 <https://build.gluster.org/job/netbsd-periodic/ws/install-sh> -c -d '/build/install/etc/glusterfs'
 /usr/bin/install -c -m 644 <https://build.gluster.org/job/netbsd-periodic/ws/events/src/eventsconfig.json> '/build/install/etc/glusterfs'
 <https://build.gluster.org/job/netbsd-periodic/ws/install-sh> -c -d '/build/install/libexec/glusterfs'
 /usr/bin/install -c <https://build.gluster.org/job/netbsd-periodic/ws/events/src/peer_eventsapi.py> '/build/install/libexec/glusterfs'
Making install in tools
 <https://build.gluster.org/job/netbsd-periodic/ws/install-sh> -c -d '/build/install/share/glusterfs/scripts'
 /usr/bin/install -c <https://build.gluster.org/job/netbsd-periodic/ws/events/tools/eventsdash.py> '/build/install/share/glusterfs/scripts'
make  install-data-hook
/usr/bin/install -c -d -m 755 /build/install/var/db/glusterd/events
 <https://build.gluster.org/job/netbsd-periodic/ws/install-sh> -c -d '/build/install/lib/pkgconfig'
 /usr/bin/install -c -m 644 glusterfs-api.pc libgfchangelog.pc libgfdb.pc '/build/install/lib/pkgconfig'

Start time Sun Dec 17 10:06:52 UTC 2017
Run the regression test
***********************

tset: standard error: Inappropriate ioctl for device
chflags: /netbsd: No such file or directory
umount: /mnt/nfs/0: Invalid argument
umount: /mnt/nfs/1: Invalid argument
umount: /mnt/glusterfs/0: Invalid argument
umount: /mnt/glusterfs/1: Invalid argument
umount: /mnt/glusterfs/2: Invalid argument
umount: /build/install/var/run/gluster/patchy: No such file or directory
/dev/rxbd0e: 4096.0MB (8388608 sectors) block size 16384, fragment size 2048
	using 23 cylinder groups of 178.09MB, 11398 blks, 22528 inodes.
super-block backups (for fsck_ffs -b #) at:
32, 364768, 729504, 1094240, 1458976, 1823712, 2188448, 2553184, 2917920,
...............................................................................

... GlusterFS Test Framework ...


The following required tools are missing:

  * dbench

<https://build.gluster.org/job/netbsd-periodic/ws/>

================================================================================
[10:06:53] Running tests in file ./tests/basic/0symbol-check.t
Skip Linux specific test
./tests/basic/0symbol-check.t .. 
1..2
ok 1, LINENUM:
ok 2, LINENUM:
ok
All tests successful.
Files=1, Tests=2,  0 wallclock secs ( 0.03 usr  0.01 sys +  0.06 cusr  0.09 csys =  0.19 CPU)
Result: PASS
End of test ./tests/basic/0symbol-check.t
================================================================================


================================================================================
[10:06:53] Running tests in file ./tests/basic/afr/add-brick-self-heal.t
./tests/basic/afr/add-brick-self-heal.t .. 
1..34
ok 1, LINENUM:6
ok 2, LINENUM:7
ok 3, LINENUM:8
ok 4, LINENUM:9
ok 5, LINENUM:10
ok 6, LINENUM:11
ok 7, LINENUM:12
ok 8, LINENUM:14
ok 9, LINENUM:15
ok 10, LINENUM:24
ok 11, LINENUM:27
ok 12, LINENUM:30
ok 13, LINENUM:31
ok 14, LINENUM:34
ok 15, LINENUM:35
ok 16, LINENUM:36
ok 17, LINENUM:38
ok 18, LINENUM:39
ok 19, LINENUM:40
ok 20, LINENUM:42
ok 21, LINENUM:43
ok 22, LINENUM:44
ok 23, LINENUM:45
ok 24, LINENUM:46
ok 25, LINENUM:47
ok 26, LINENUM:50
ok 27, LINENUM:53
ok 28, LINENUM:54
ok 29, LINENUM:57
ok 30, LINENUM:60
ok 31, LINENUM:61
ok 32, LINENUM:63
ok 33, LINENUM:64
ok 34, LINENUM:65
ok
All tests successful.
Files=1, Tests=34, 22 wallclock secs ( 0.03 usr  0.04 sys +  1.99 cusr  2.70 csys =  4.76 CPU)
Result: PASS
End of test ./tests/basic/afr/add-brick-self-heal.t
================================================================================


================================================================================
[10:07:15] Running tests in file ./tests/basic/afr/arbiter-add-brick.t
./tests/basic/afr/arbiter-add-brick.t .. 
1..40
ok 1, LINENUM:6
ok 2, LINENUM:7
ok 3, LINENUM:10
ok 4, LINENUM:11
ok 5, LINENUM:12
ok 6, LINENUM:13
ok 7, LINENUM:14
ok 8, LINENUM:15
ok 9, LINENUM:16
ok 10, LINENUM:19
ok 11, LINENUM:20
ok 12, LINENUM:21
ok 13, LINENUM:25
ok 14, LINENUM:26
ok 15, LINENUM:29
ok 16, LINENUM:30
ok 17, LINENUM:32
ok 18, LINENUM:33
ok 19, LINENUM:36
ok 20, LINENUM:37
ok 21, LINENUM:38
ok 22, LINENUM:39
ok 23, LINENUM:40
ok 24, LINENUM:41
not ok 25 Got "7" instead of "0", LINENUM:42
FAILED COMMAND: 0 get_pending_heal_count patchy
ok 26, LINENUM:45
ok 27, LINENUM:46
ok 28, LINENUM:47
ok 29, LINENUM:48
ok 30, LINENUM:49
ok 31, LINENUM:52
ok 32, LINENUM:53
ok 33, LINENUM:56
ok 34, LINENUM:57
ok 35, LINENUM:60
ok 36, LINENUM:61
ok 37, LINENUM:64
ok 38, LINENUM:65
ok 39, LINENUM:68
ok 40, LINENUM:69
Failed 1/40 subtests 

Test Summary Report
-------------------
./tests/basic/afr/arbiter-add-brick.t (Wstat: 0 Tests: 40 Failed: 1)
  Failed test:  25
Files=1, Tests=40, 133 wallclock secs ( 0.03 usr  0.02 sys + 1567550.38 cusr 1358549.22 csys = 2926099.65 CPU)
Result: FAIL
./tests/basic/afr/arbiter-add-brick.t: bad status 1

       *********************************
       *       REGRESSION FAILED       *
       * Retrying failed tests in case *
       * we got some spurious failures *
       *********************************

./tests/basic/afr/arbiter-add-brick.t .. 
1..40
ok 1, LINENUM:6
ok 2, LINENUM:7
ok 3, LINENUM:10
ok 4, LINENUM:11
ok 5, LINENUM:12
ok 6, LINENUM:13
ok 7, LINENUM:14
ok 8, LINENUM:15
ok 9, LINENUM:16
ok 10, LINENUM:19
ok 11, LINENUM:20
ok 12, LINENUM:21
ok 13, LINENUM:25
ok 14, LINENUM:26
ok 15, LINENUM:29
ok 16, LINENUM:30
ok 17, LINENUM:32
ok 18, LINENUM:33
ok 19, LINENUM:36
ok 20, LINENUM:37
ok 21, LINENUM:38
ok 22, LINENUM:39
ok 23, LINENUM:40
ok 24, LINENUM:41
not ok 25 Got "7" instead of "0", LINENUM:42
FAILED COMMAND: 0 get_pending_heal_count patchy
ok 26, LINENUM:45
ok 27, LINENUM:46
ok 28, LINENUM:47
ok 29, LINENUM:48
ok 30, LINENUM:49
ok 31, LINENUM:52
ok 32, LINENUM:53
ok 33, LINENUM:56
ok 34, LINENUM:57
ok 35, LINENUM:60
ok 36, LINENUM:61
ok 37, LINENUM:64
ok 38, LINENUM:65
ok 39, LINENUM:68
ok 40, LINENUM:69
Failed 1/40 subtests 

Test Summary Report
-------------------
./tests/basic/afr/arbiter-add-brick.t (Wstat: 0 Tests: 40 Failed: 1)
  Failed test:  25
Files=1, Tests=40, 132 wallclock secs ( 0.04 usr  0.01 sys + 731580.53 cusr 731582.33 csys = 1463162.91 CPU)
Result: FAIL
End of test ./tests/basic/afr/arbiter-add-brick.t
================================================================================
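Subtest 25 fails identically on both runs: `get_pending_heal_count patchy` still reports 7 pending heal entries when the test expects it to drop to 0 within the timeout, so this looks like a genuine stall in self-heal after the arbiter brick is added rather than a spurious failure. The test framework's `EXPECT_WITHIN`-style checks poll a command until its output matches the expected value or a timeout expires. A simplified, hypothetical sketch of that polling pattern (for illustration only, not the framework's actual helper code):

```shell
#!/bin/sh
# Hypothetical sketch of an EXPECT_WITHIN-style check: re-run a command
# until its output equals the expected value, or give up after a timeout.
# (Illustrative only; the real helpers live in the test framework itself.)
expect_within() {
    timeout=$1
    expected=$2
    shift 2
    while [ "$timeout" -gt 0 ]; do
        got=$("$@")
        # Success as soon as the observed value matches the expectation.
        [ "$got" = "$expected" ] && return 0
        sleep 1
        timeout=$((timeout - 1))
    done
    # Mirror the log's failure message format on the way out.
    echo "Got \"$got\" instead of \"$expected\"" >&2
    return 1
}
```

Under this sketch, the failing line corresponds to something like `expect_within "$TIMEOUT" 0 get_pending_heal_count patchy` never seeing the heal count reach 0 before the timeout.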


Run complete
================================================================================
Number of tests found:                             3
Number of tests selected for run based on pattern: 3
Number of tests skipped as they were marked bad:   0
Number of tests skipped because of known_issues:   0
Number of tests that were run:                     3

Tests ordered by time taken, slowest to fastest: 
================================================================================
./tests/basic/afr/arbiter-add-brick.t  -  133 second
./tests/basic/afr/add-brick-self-heal.t  -  22 second
./tests/basic/0symbol-check.t  -  0 second

1 test(s) failed 
./tests/basic/afr/arbiter-add-brick.t

0 test(s) generated core 


Result is 1

tar: Removing leading / from absolute path names in the archive
Logs archived in http://nbslave7c.cloud.gluster.org/archives/logs/glusterfs-logs-20171217100652.tgz
error: fatal: change is closed

fatal: one or more reviews failed; review output above
Build step 'Execute shell' marked build as failure

