[Gluster-Maintainers] Build failed in Jenkins: netbsd-periodic #534

jenkins at build.gluster.org
Sat Jan 27 14:24:59 UTC 2018


See <https://build.gluster.org/job/netbsd-periodic/534/display/redirect?page=changes>

Changes:

[Kotresh H R] geo-rep: Detailed JSON output for config

------------------------------------------
[...truncated 351.64 KB...]
./tests/basic/afr/granular-esh/granular-indices-but-non-granular-heal.t .. 
1..29
ok 1, LINENUM:11
ok 2, LINENUM:12
ok 3, LINENUM:14
ok 4, LINENUM:15
ok 5, LINENUM:16
ok 6, LINENUM:17
ok 7, LINENUM:18
ok 8, LINENUM:19
ok 9, LINENUM:20
ok 10, LINENUM:22
ok 11, LINENUM:25
ok 12, LINENUM:34
ok 13, LINENUM:39
ok 14, LINENUM:39
ok 15, LINENUM:43
ok 16, LINENUM:46
ok 17, LINENUM:47
ok 18, LINENUM:48
ok 19, LINENUM:51
ok 20, LINENUM:52
ok 21, LINENUM:53
ok 22, LINENUM:54
ok 23, LINENUM:60
ok 24, LINENUM:63
ok 25, LINENUM:68
ok 26, LINENUM:68
ok 27, LINENUM:72
ok 28, LINENUM:73
ok 29, LINENUM:74
ok
All tests successful.
Files=1, Tests=29, 23 wallclock secs ( 0.02 usr  0.02 sys +  1.80 cusr  2.46 csys =  4.30 CPU)
Result: PASS
End of test ./tests/basic/afr/granular-esh/granular-indices-but-non-granular-heal.t
================================================================================


================================================================================
[14:22:26] Running tests in file ./tests/basic/afr/granular-esh/replace-brick.t
./tests/basic/afr/granular-esh/replace-brick.t .. 
1..34
ok 1, LINENUM:7
ok 2, LINENUM:8
ok 3, LINENUM:9
ok 4, LINENUM:10
ok 5, LINENUM:11
ok 6, LINENUM:12
ok 7, LINENUM:13
ok 8, LINENUM:14
ok 9, LINENUM:15
ok 10, LINENUM:17
ok 11, LINENUM:26
ok 12, LINENUM:29
ok 13, LINENUM:32
ok 14, LINENUM:35
ok 15, LINENUM:38
ok 16, LINENUM:41
ok 17, LINENUM:43
ok 18, LINENUM:44
ok 19, LINENUM:46
ok 20, LINENUM:47
ok 21, LINENUM:48
ok 22, LINENUM:49
ok 23, LINENUM:50
ok 24, LINENUM:53
ok 25, LINENUM:56
ok 26, LINENUM:59
ok 27, LINENUM:60
ok 28, LINENUM:63
ok 29, LINENUM:65
ok 30, LINENUM:68
ok 31, LINENUM:69
ok 32, LINENUM:71
ok 33, LINENUM:72
ok 34, LINENUM:73
ok
All tests successful.
Files=1, Tests=34, 23 wallclock secs ( 0.06 usr  0.00 sys +  1.75 cusr  2.80 csys =  4.61 CPU)
Result: PASS
End of test ./tests/basic/afr/granular-esh/replace-brick.t
================================================================================


================================================================================
[14:22:49] Running tests in file ./tests/basic/afr/heal-info.t
./tests/basic/afr/heal-info.t .. 
1..9
ok 1, LINENUM:21
ok 2, LINENUM:22
ok 3, LINENUM:23
ok 4, LINENUM:24
ok 5, LINENUM:25
ok 6, LINENUM:26
ok 7, LINENUM:27
ok 8, LINENUM:33
ok 9, LINENUM:34
ok
All tests successful.
Files=1, Tests=9, 15 wallclock secs ( 0.03 usr  0.01 sys +  2.09 cusr  2.94 csys =  5.07 CPU)
Result: PASS
End of test ./tests/basic/afr/heal-info.t
================================================================================


================================================================================
[14:23:04] Running tests in file ./tests/basic/afr/heal-quota.t
touch: /mnt/glusterfs/0/b: Socket is not connected
dd: block size `1M': illegal number
cat: /proc/23902/cmdline: No such file or directory
Usage: gf_attach uds_path volfile_path (to attach)
       gf_attach -d uds_path brick_path (to detach)
dd: block size `1M': illegal number
./tests/basic/afr/heal-quota.t .. 
1..19
ok 1, LINENUM:10
ok 2, LINENUM:11
ok 3, LINENUM:12
ok 4, LINENUM:13
ok 5, LINENUM:14
ok 6, LINENUM:16
ok 7, LINENUM:17
ok 8, LINENUM:18
ok 9, LINENUM:19
ok 10, LINENUM:20
not ok 11, LINENUM:22
FAILED COMMAND: touch /mnt/glusterfs/0/a /mnt/glusterfs/0/b
ok 12, LINENUM:24
ok 13, LINENUM:26
ok 14, LINENUM:27
ok 15, LINENUM:28
ok 16, LINENUM:29
ok 17, LINENUM:30
ok 18, LINENUM:32
ok 19, LINENUM:33
Failed 1/19 subtests 

Test Summary Report
-------------------
./tests/basic/afr/heal-quota.t (Wstat: 0 Tests: 19 Failed: 1)
  Failed test:  11
Files=1, Tests=19, 21 wallclock secs ( 0.04 usr  0.00 sys +  1.56 cusr  2.30 csys =  3.90 CPU)
Result: FAIL
./tests/basic/afr/heal-quota.t: bad status 1

       *********************************
       *       REGRESSION FAILED       *
       * Retrying failed tests in case *
       * we got some spurious failures *
       *********************************

touch: /mnt/glusterfs/0/b: Socket is not connected
dd: block size `1M': illegal number
cat: /proc/7901/cmdline: No such file or directory
Usage: gf_attach uds_path volfile_path (to attach)
       gf_attach -d uds_path brick_path (to detach)
dd: block size `1M': illegal number
./tests/basic/afr/heal-quota.t .. 
1..19
ok 1, LINENUM:10
ok 2, LINENUM:11
ok 3, LINENUM:12
ok 4, LINENUM:13
ok 5, LINENUM:14
ok 6, LINENUM:16
ok 7, LINENUM:17
ok 8, LINENUM:18
ok 9, LINENUM:19
ok 10, LINENUM:20
not ok 11, LINENUM:22
FAILED COMMAND: touch /mnt/glusterfs/0/a /mnt/glusterfs/0/b
ok 12, LINENUM:24
ok 13, LINENUM:26
ok 14, LINENUM:27
ok 15, LINENUM:28
ok 16, LINENUM:29
ok 17, LINENUM:30
ok 18, LINENUM:32
ok 19, LINENUM:33
Failed 1/19 subtests 

Test Summary Report
-------------------
./tests/basic/afr/heal-quota.t (Wstat: 0 Tests: 19 Failed: 1)
  Failed test:  11
Files=1, Tests=19, 24 wallclock secs ( 0.05 usr  0.00 sys +  1.61 cusr  2.39 csys =  4.05 CPU)
Result: FAIL
./tests/basic/afr/heal-quota.t: 4 new core files
End of test ./tests/basic/afr/heal-quota.t
================================================================================


Run complete
================================================================================
Number of tests found:                             29
Number of tests selected for run based on pattern: 29
Number of tests skipped as they were marked bad:   4
Number of tests skipped because of known_issues:   0
Number of tests that were run:                     25

Tests ordered by time taken, slowest to fastest: 
================================================================================
./tests/basic/afr/gfid-mismatch-resolution-with-fav-child-policy.t  -  325 second
./tests/basic/afr/gfid-mismatch-resolution-with-cli.t  -  112 second
./tests/basic/afr/entry-self-heal.t  -  107 second
./tests/basic/afr/arbiter-add-brick.t  -  62 second
./tests/basic/afr/granular-esh/conservative-merge.t  -  46 second
./tests/basic/afr/arbiter.t  -  46 second
./tests/basic/afr/arbiter-remove-brick.t  -  37 second
./tests/basic/afr/gfid-self-heal.t  -  31 second
./tests/basic/afr/arbiter-mount.t  -  30 second
./tests/basic/afr/granular-esh/granular-esh.t  -  30 second
./tests/basic/afr/durability-off.t  -  29 second
./tests/basic/afr/granular-esh/replace-brick.t  -  23 second
./tests/basic/afr/granular-esh/granular-indices-but-non-granular-heal.t  -  23 second
./tests/basic/afr/gfid-heal.t  -  22 second
./tests/basic/afr/heal-quota.t  -  21 second
./tests/basic/afr/add-brick-self-heal.t  -  21 second
./tests/basic/afr/granular-esh/add-brick.t  -  20 second
./tests/basic/afr/client-side-heal.t  -  19 second
./tests/basic/afr/data-self-heal.t  -  18 second
./tests/basic/afr/arbiter-statfs.t  -  18 second
./tests/basic/afr/heal-info.t  -  15 second
./tests/basic/afr/compounded-write-txns.t  -  11 second
./tests/basic/afr/gfid-mismatch.t  -  10 second
./tests/basic/afr/arbiter-cli.t  -  5 second
./tests/basic/0symbol-check.t  -  0 second

1 test(s) failed 
./tests/basic/afr/heal-quota.t

1 test(s) generated core 
./tests/basic/afr/heal-quota.t

Result is 1

tar: Removing leading / from absolute path names in the archive
Cores and build archived in http://nbslave7c.cloud.gluster.org/archives/archived_builds/build-install-20180127140246.tgz
Open core using the following command to get a proper stack...
Example: From root of extracted tarball
       gdb -ex 'set sysroot ./'   -ex 'core-file ./build/install/cores/xxx.core'   <target, say ./build/install/sbin/glusterd>
NB: this requires a gdb built with 'NetBSD ELF' osabi support,  which is available natively on a NetBSD-7.0/i386 system
tar: Removing leading / from absolute path names in the archive
Logs archived in http://nbslave7c.cloud.gluster.org/archives/logs/glusterfs-logs-20180127140246.tgz
error: fatal: change is closed

fatal: one or more reviews failed; review output above
Build step 'Execute shell' marked build as failure
