[Bugs] [Bug 1214772] New: gluster xml empty output volume status detail
bugzilla at redhat.com
Thu Apr 23 13:46:18 UTC 2015
https://bugzilla.redhat.com/show_bug.cgi?id=1214772
Bug ID: 1214772
Summary: gluster xml empty output volume status detail
Product: GlusterFS
Version: 3.6.2
Component: glusterd
Assignee: bugs at gluster.org
Reporter: gluster at jahu.sk
CC: bugs at gluster.org, gluster-bugs at redhat.com
Description of problem:
- The XML output of the "volume status volname detail" command is empty.
- The XML output of the plain "volume status" command (without "detail") is correct.
- The non-XML (plain text) output of "volume status detail" is also correct.
- The underlying filesystem is btrfs; in other cases (without btrfs) the XML
output of the status detail command is ok.
- This command is run during geo-replication start; when its output is empty,
geo-replication will not start.
Version-Release number of selected component (if applicable):
3.6.2, and 3.6.3beta2
I have also tried upgrading to the current beta with bugfixes for the 3.6 branch.
How reproducible:
gluster --xml volume status volname detail
gluster --xml --remote-host=host volume status volname detail
Actual results:
empty response
Expected results:
xml response with volume status detail information
Additional info:
# gluster --xml volume status volname
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<cliOutput>
<opRet>0</opRet>
<opErrno>0</opErrno>
<opErrstr/>
<volStatus>
<volumes>
<volume>
<volName>volname</volName>
<nodeCount>1</nodeCount>
<node>
<hostname>host</hostname>
<path>/mnt/...</path>
<peerid>....</peerid>
<status>1</status>
<port>49158</port>
<pid>14099</pid>
</node>
<tasks/>
</volume>
</volumes>
</volStatus>
</cliOutput>
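For reference, a consumer such as the geo-replication monitor has to parse this cliOutput document. A minimal Python sketch (assumed consumer code, not GlusterFS source; the parse_status helper is hypothetical) shows why an empty response immediately breaks such a caller:

```python
# Hypothetical consumer of "gluster --xml volume status" output.
import xml.etree.ElementTree as ET

sample = """<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<cliOutput>
  <opRet>0</opRet>
  <opErrno>0</opErrno>
  <volStatus><volumes><volume>
    <volName>volname</volName>
    <node><hostname>host</hostname><status>1</status></node>
  </volume></volumes></volStatus>
</cliOutput>"""

def parse_status(output):
    # An empty string is not a valid XML document, so a caller must
    # treat it as an error before even reaching the parser.
    if not output.strip():
        raise ValueError("empty XML response from gluster")
    root = ET.fromstring(output)
    if root.findtext("opRet") != "0":
        raise RuntimeError("gluster reported an error")
    return [n.findtext("hostname") for n in root.iter("node")]

print(parse_status(sample))  # ['host']
```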
# gluster --xml volume status volname detail
#
# gluster volume status volname detail
Status of volume: volname
------------------------------------------------------------------------------
Brick : Brick host:/mnt/...
Port : 49158
Online : Y
Pid : 14099
File System : N/A
Device : N/A
Mount Options : N/A
Inode Size : N/A
Disk Space Free : 2.0TB
Total Disk Space : 2.6TB
Inode Count : N/A
Free Inodes : N/A
# tail cli.log
[2015-04-23 13:39:05.928307] D [cli-xml-output.c:84:cli_begin_xml_output]
0-cli: Returning 0
[2015-04-23 13:39:05.928333] D [cli-xml-output.c:131:cli_xml_output_common]
0-cli: Returning 0
[2015-04-23 13:39:05.928346] D
[cli-xml-output.c:1375:cli_xml_output_vol_status_begin] 0-cli: Returning 0
[2015-04-23 13:39:05.928390] D
[cli-xml-output.c:322:cli_xml_output_vol_status_common] 0-cli: Returning 0
[2015-04-23 13:39:05.928412] D
[cli-xml-output.c:429:cli_xml_output_vol_status_detail] 0-cli: Returning -2
[2015-04-23 13:39:05.928422] D
[cli-xml-output.c:1756:cli_xml_output_vol_status] 0-cli: Returning -2
[2015-04-23 13:39:05.928433] E [cli-rpc-ops.c:6742:gf_cli_status_cbk] 0-cli:
Error outputting to xml
[2015-04-23 13:39:05.928471] D [cli-cmd.c:384:cli_cmd_submit] 0-cli: Returning
-2
[2015-04-23 13:39:05.928492] D [cli-rpc-ops.c:6912:gf_cli_status_volume] 0-cli:
Returning: -2
[2015-04-23 13:39:05.928508] D
[cli-cmd-volume.c:1930:cli_cmd_volume_status_cbk] 0-cli: frame->local is not
NULL (0x366700009c0)
[2015-04-23 13:39:05.928530] I [input.c:36:cli_batch] 0-: Exiting with: -2
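The log points at cli_xml_output_vol_status_detail returning -2, which propagates up and aborts the whole document, leaving the empty output shown above. A hypothetical Python sketch (not GlusterFS source; field names are assumptions) of how treating a missing optional field as fatal could produce exactly this symptom on btrfs, which reports no inode information:

```python
# Hypothetical sketch of the suspected failure mode: if the XML writer
# treats a missing optional field (e.g. inode size, which btrfs does not
# report -- shown as "N/A" in text mode) as a fatal error, the whole
# document is aborted, matching the "Returning -2" seen in the cli log.
import xml.etree.ElementTree as ET

def build_detail(node_info):
    root = ET.Element("node")
    # Required field.
    ET.SubElement(root, "hostname").text = node_info["hostname"]
    # Optional fields: on btrfs these may simply be absent.
    for key in ("inodeSize", "inodesTotal", "inodesFree"):
        if key not in node_info:
            return None, -2  # fatal: aborts the whole XML output
        ET.SubElement(root, key).text = str(node_info[key])
    return root, 0

# btrfs brick: no inode info available, so no XML is emitted at all.
xml_node, ret = build_detail({"hostname": "host"})
print(ret)  # -2
```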
- Geo-replication fails to start with an error when running this command:
[2015-04-21 14:58:23.372023] I [monitor(monitor):141:set_state] Monitor: new
state: Initializing...
[2015-04-21 14:58:23.582990] E [resource(monitor):221:errlog] Popen: command
"gluster --xml --remote-host=host volume status volumename detail" returned
with 2
[2015-04-21 14:58:23.583717] I [syncdutils(monitor):214:finalize] <top>:
exiting.
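The monitor's failure path can be sketched as follows (illustrative only; function names are assumptions, the real logic lives in geo-replication's monitor and resource modules):

```python
# Illustrative sketch: the geo-replication monitor runs the status
# command and gives up on any non-zero exit, which is why the empty
# XML output above prevents geo-replication from starting.
import subprocess

def status_detail_cmd(host, volume):
    return ["gluster", "--xml", "--remote-host=" + host,
            "volume", "status", volume, "detail"]

def run_status(cmd, runner=subprocess.run):
    proc = runner(cmd, capture_output=True, text=True)
    if proc.returncode != 0:
        # Corresponds to: Popen: command "..." returned with 2
        raise SystemExit('command "%s" returned with %d'
                         % (" ".join(cmd), proc.returncode))
    return proc.stdout
```

The runner parameter is only there so the sketch can be exercised without a live gluster installation.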