[Gluster-Maintainers] Build failed in Jenkins: netbsd-periodic #527

jenkins at build.gluster.org
Sat Jan 20 14:22:50 UTC 2018


See <https://build.gluster.org/job/netbsd-periodic/527/display/redirect?page=changes>

Changes:

[Amar Tumballi] protocol: make on-wire-change of protocol using new XDR definition.

------------------------------------------
[...truncated 281.48 KB...]
./tests/basic/afr/granular-esh/granular-indices-but-non-granular-heal.t .. 
1..29
ok 1, LINENUM:11
ok 2, LINENUM:12
ok 3, LINENUM:14
ok 4, LINENUM:15
ok 5, LINENUM:16
ok 6, LINENUM:17
ok 7, LINENUM:18
ok 8, LINENUM:19
ok 9, LINENUM:20
ok 10, LINENUM:22
ok 11, LINENUM:25
ok 12, LINENUM:34
ok 13, LINENUM:39
ok 14, LINENUM:39
ok 15, LINENUM:43
ok 16, LINENUM:46
ok 17, LINENUM:47
ok 18, LINENUM:48
ok 19, LINENUM:51
ok 20, LINENUM:52
ok 21, LINENUM:53
ok 22, LINENUM:54
ok 23, LINENUM:60
ok 24, LINENUM:63
ok 25, LINENUM:68
ok 26, LINENUM:68
ok 27, LINENUM:72
ok 28, LINENUM:73
ok 29, LINENUM:74
ok
All tests successful.
Files=1, Tests=29, 24 wallclock secs ( 0.04 usr  0.00 sys +  2.43 cusr  3.03 csys =  5.50 CPU)
Result: PASS
End of test ./tests/basic/afr/granular-esh/granular-indices-but-non-granular-heal.t
================================================================================


================================================================================
[14:20:02] Running tests in file ./tests/basic/afr/granular-esh/replace-brick.t
./tests/basic/afr/granular-esh/replace-brick.t .. 
1..34
ok 1, LINENUM:7
ok 2, LINENUM:8
ok 3, LINENUM:9
ok 4, LINENUM:10
ok 5, LINENUM:11
ok 6, LINENUM:12
ok 7, LINENUM:13
ok 8, LINENUM:14
ok 9, LINENUM:15
ok 10, LINENUM:17
ok 11, LINENUM:26
ok 12, LINENUM:29
ok 13, LINENUM:32
ok 14, LINENUM:35
ok 15, LINENUM:38
ok 16, LINENUM:41
ok 17, LINENUM:43
ok 18, LINENUM:44
ok 19, LINENUM:46
ok 20, LINENUM:47
ok 21, LINENUM:48
ok 22, LINENUM:49
ok 23, LINENUM:50
ok 24, LINENUM:53
ok 25, LINENUM:56
ok 26, LINENUM:59
ok 27, LINENUM:60
ok 28, LINENUM:63
ok 29, LINENUM:65
ok 30, LINENUM:68
ok 31, LINENUM:69
ok 32, LINENUM:71
ok 33, LINENUM:72
ok 34, LINENUM:73
ok
All tests successful.
Files=1, Tests=34, 24 wallclock secs ( 0.05 usr  0.02 sys +  1.93 cusr  2.96 csys =  4.96 CPU)
Result: PASS
End of test ./tests/basic/afr/granular-esh/replace-brick.t
================================================================================


================================================================================
[14:20:26] Running tests in file ./tests/basic/afr/heal-info.t
./tests/basic/afr/heal-info.t .. 
1..9
ok 1, LINENUM:21
ok 2, LINENUM:22
ok 3, LINENUM:23
ok 4, LINENUM:24
ok 5, LINENUM:25
ok 6, LINENUM:26
ok 7, LINENUM:27
ok 8, LINENUM:33
ok 9, LINENUM:34
ok
All tests successful.
Files=1, Tests=9, 29 wallclock secs ( 0.04 usr  0.01 sys +  4.12 cusr  5.73 csys =  9.90 CPU)
Result: PASS
End of test ./tests/basic/afr/heal-info.t
================================================================================


================================================================================
[14:20:55] Running tests in file ./tests/basic/afr/heal-quota.t
touch: /mnt/glusterfs/0/b: Socket is not connected
dd: block size `1M': illegal number
cat: /proc/9290/cmdline: No such file or directory
Usage: gf_attach uds_path volfile_path (to attach)
       gf_attach -d uds_path brick_path (to detach)
dd: block size `1M': illegal number
./tests/basic/afr/heal-quota.t .. 
1..19
ok 1, LINENUM:10
ok 2, LINENUM:11
ok 3, LINENUM:12
ok 4, LINENUM:13
ok 5, LINENUM:14
ok 6, LINENUM:16
ok 7, LINENUM:17
ok 8, LINENUM:18
ok 9, LINENUM:19
ok 10, LINENUM:20
not ok 11 , LINENUM:22
FAILED COMMAND: touch /mnt/glusterfs/0/a /mnt/glusterfs/0/b
ok 12, LINENUM:24
ok 13, LINENUM:26
ok 14, LINENUM:27
ok 15, LINENUM:28
ok 16, LINENUM:29
ok 17, LINENUM:30
ok 18, LINENUM:32
ok 19, LINENUM:33
Failed 1/19 subtests 

Test Summary Report
-------------------
./tests/basic/afr/heal-quota.t (Wstat: 0 Tests: 19 Failed: 1)
  Failed test:  11
Files=1, Tests=19, 25 wallclock secs ( 0.03 usr  0.01 sys +  1.67 cusr  2.67 csys =  4.38 CPU)
Result: FAIL
./tests/basic/afr/heal-quota.t: bad status 1

       *********************************
       *       REGRESSION FAILED       *
       * Retrying failed tests in case *
       * we got some spurious failures *
       *********************************

touch: /mnt/glusterfs/0/b: Socket is not connected
dd: block size `1M': illegal number
cat: /proc/26398/cmdline: No such file or directory
Usage: gf_attach uds_path volfile_path (to attach)
       gf_attach -d uds_path brick_path (to detach)
dd: block size `1M': illegal number
./tests/basic/afr/heal-quota.t .. 
1..19
ok 1, LINENUM:10
ok 2, LINENUM:11
ok 3, LINENUM:12
ok 4, LINENUM:13
ok 5, LINENUM:14
ok 6, LINENUM:16
ok 7, LINENUM:17
ok 8, LINENUM:18
ok 9, LINENUM:19
ok 10, LINENUM:20
not ok 11 , LINENUM:22
FAILED COMMAND: touch /mnt/glusterfs/0/a /mnt/glusterfs/0/b
ok 12, LINENUM:24
ok 13, LINENUM:26
ok 14, LINENUM:27
ok 15, LINENUM:28
ok 16, LINENUM:29
ok 17, LINENUM:30
ok 18, LINENUM:32
ok 19, LINENUM:33
Failed 1/19 subtests 

Test Summary Report
-------------------
./tests/basic/afr/heal-quota.t (Wstat: 0 Tests: 19 Failed: 1)
  Failed test:  11
Files=1, Tests=19, 26 wallclock secs ( 0.05 usr  0.01 sys +  1.69 cusr  2.36 csys =  4.11 CPU)
Result: FAIL
./tests/basic/afr/heal-quota.t: 4 new core files
End of test ./tests/basic/afr/heal-quota.t
================================================================================


Run complete
================================================================================
Number of tests found:                             29
Number of tests selected for run based on pattern: 29
Number of tests skipped as they were marked bad:   4
Number of tests skipped because of known_issues:   0
Number of tests that were run:                     25

Tests ordered by time taken, slowest to fastest: 
================================================================================
./tests/basic/afr/gfid-mismatch-resolution-with-fav-child-policy.t  -  160 second
./tests/basic/afr/gfid-mismatch-resolution-with-cli.t  -  112 second
./tests/basic/afr/entry-self-heal.t  -  101 second
./tests/basic/afr/arbiter-add-brick.t  -  61 second
./tests/basic/afr/arbiter.t  -  51 second
./tests/basic/afr/granular-esh/conservative-merge.t  -  40 second
./tests/basic/afr/gfid-self-heal.t  -  38 second
./tests/basic/afr/durability-off.t  -  37 second
./tests/basic/afr/arbiter-remove-brick.t  -  34 second
./tests/basic/afr/granular-esh/granular-esh.t  -  32 second
./tests/basic/afr/arbiter-mount.t  -  31 second
./tests/basic/afr/heal-info.t  -  29 second
./tests/basic/afr/client-side-heal.t  -  26 second
./tests/basic/afr/heal-quota.t  -  25 second
./tests/basic/afr/granular-esh/replace-brick.t  -  24 second
./tests/basic/afr/granular-esh/granular-indices-but-non-granular-heal.t  -  24 second
./tests/basic/afr/gfid-heal.t  -  22 second
./tests/basic/afr/add-brick-self-heal.t  -  22 second
./tests/basic/afr/granular-esh/add-brick.t  -  21 second
./tests/basic/afr/data-self-heal.t  -  20 second
./tests/basic/afr/arbiter-statfs.t  -  18 second
./tests/basic/afr/compounded-write-txns.t  -  11 second
./tests/basic/afr/gfid-mismatch.t  -  10 second
./tests/basic/afr/arbiter-cli.t  -  5 second
./tests/basic/0symbol-check.t  -  0 second

1 test(s) failed 
./tests/basic/afr/heal-quota.t

1 test(s) generated core 
./tests/basic/afr/heal-quota.t

Result is 1

tar: Removing leading / from absolute path names in the archive
Cores and build archived in http://nbslave7c.cloud.gluster.org/archives/archived_builds/build-install-20180120140336.tgz
Open core using the following command to get a proper stack...
Example: From root of extracted tarball
       gdb -ex 'set sysroot ./'   -ex 'core-file ./build/install/cores/xxx.core'   <target, say ./build/install/sbin/glusterd>
NB: this requires a gdb built with 'NetBSD ELF' osabi support,  which is available natively on a NetBSD-7.0/i386 system
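The steps above can be sketched as a small shell walk-through. This is an illustrative sketch only: the tarball URL and the `xxx.core` file name are placeholders taken from this build's message, the download/extract commands are commented out because the slave host may no longer serve the archive, and a NetBSD-aware gdb is assumed per the NB note above.

```shell
#!/bin/sh
# Hypothetical sketch of the core-inspection steps described above.
# The archive URL and core-file name come from this specific build;
# adjust both for another build.
set -e

ARCHIVE_URL="http://nbslave7c.cloud.gluster.org/archives/archived_builds/build-install-20180120140336.tgz"
WORKDIR="/tmp/archived-build"

mkdir -p "$WORKDIR"
cd "$WORKDIR"

# Fetch and unpack the archived build (commented out so the sketch
# runs offline; the archive link in the message may have expired):
# curl -O "$ARCHIVE_URL"
# tar xzf "$(basename "$ARCHIVE_URL")"

# From the root of the extracted tarball, open a core against the
# bundled sysroot so gdb resolves the build's own libraries, e.g.
# for a glusterd core (xxx.core is a placeholder):
GDB_CMD="gdb -ex 'set sysroot ./' -ex 'core-file ./build/install/cores/xxx.core' ./build/install/sbin/glusterd"
echo "$GDB_CMD"
```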
tar: Removing leading / from absolute path names in the archive
Logs archived in http://nbslave7c.cloud.gluster.org/archives/logs/glusterfs-logs-20180120140336.tgz
error: fatal: change is closed

fatal: one or more reviews failed; review output above
Build step 'Execute shell' marked build as failure