<div dir="ltr"><br><div class="gmail_extra"><br><div class="gmail_quote">On Wed, Sep 5, 2018 at 9:21 AM, yu sun <span dir="ltr">&lt;<a href="mailto:sunyu1949@gmail.com" target="_blank">sunyu1949@gmail.com</a>&gt;</span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr">thank you , I will investigate and have a try.</div></blockquote><div><br></div><div>Please get all the extended attributes of src and dst from all bricks. That should help us to (dis)prove whether its same bug.</div><div> <br></div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div><br></div><div>BYW, I found another thing:</div><div>1. I found this only occur when I try to mv dir to a subdir.</div><div>2. I create a similar volume with default option and without quota enabled, i found its ok, mv seems no problem.</div><div><br></div><div>so I think it&#39;s possibly the volume option led to this problem, but I dont know which options and  how to resolve this, the volume have about 25T data.</div><div><br></div><div>Best Regards.</div></div><br><div class="gmail_quote"><div dir="ltr">Raghavendra Gowdappa &lt;<a href="mailto:rgowdapp@redhat.com" target="_blank">rgowdapp@redhat.com</a>&gt; 于2018年9月5日周三 上午10:50写道:<br></div><div><div class="h5"><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><br><div class="gmail_extra"><br><div class="gmail_quote">On Tue, Sep 4, 2018 at 5:28 PM, yu sun <span dir="ltr">&lt;<a href="mailto:sunyu1949@gmail.com" target="_blank">sunyu1949@gmail.com</a>&gt;</span> wrote:<br><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr">Hi all:<div><br><div>I have a <span style="color:rgb(0,0,0);white-space:pre-wrap">replicated volume project2 with info:</span><div><font color="#000000"><span style="white-space:pre-wrap">Volume Name: project2
Type: Distributed-Replicate
Volume ID: 60175b8e-de0e-4409-81ae-7bb5eb5cacbf
Status: Started
Snapshot Count: 0
Number of Bricks: 84 x 2 = 168
Transport-type: tcp
Bricks:
Brick1: node20:/data2/bricks/project2
Brick2: node21:/data2/bricks/project2
Brick3: node22:/data2/bricks/project2
Brick4: node23:/data2/bricks/project2
Brick5: node24:/data2/bricks/project2
Brick6: node25:/data2/bricks/project2
Brick7: node26:/data2/bricks/project2
Brick8: node27:/data2/bricks/project2
Brick9: node28:/data2/bricks/project2
Brick10: node29:/data2/bricks/project2
Brick11: node30:/data2/bricks/project2
Brick12: node31:/data2/bricks/project2
Brick13: node32:/data2/bricks/project2
Brick14: node33:/data2/bricks/project2
Brick15: node20:/data3/bricks/project2
Brick16: node21:/data3/bricks/project2
Brick17: node22:/data3/bricks/project2
Brick18: node23:/data3/bricks/project2
Brick19: node24:/data3/bricks/project2
Brick20: node25:/data3/bricks/project2
Brick21: node26:/data3/bricks/project2
Brick22: node27:/data3/bricks/project2
Brick23: node28:/data3/bricks/project2
Brick24: node29:/data3/bricks/project2
Brick25: node30:/data3/bricks/project2
Brick26: node31:/data3/bricks/project2
Brick27: node32:/data3/bricks/project2
Brick28: node33:/data3/bricks/project2
Brick29: node20:/data4/bricks/project2
Brick30: node21:/data4/bricks/project2
Brick31: node22:/data4/bricks/project2
Brick32: node23:/data4/bricks/project2
Brick33: node24:/data4/bricks/project2
Brick34: node25:/data4/bricks/project2
Brick35: node26:/data4/bricks/project2
Brick36: node27:/data4/bricks/project2
Brick37: node28:/data4/bricks/project2
Brick38: node29:/data4/bricks/project2
Brick39: node30:/data4/bricks/project2
Brick40: node31:/data4/bricks/project2
Brick41: node32:/data4/bricks/project2
Brick42: node33:/data4/bricks/project2
Brick43: node20:/data5/bricks/project2
Brick44: node21:/data5/bricks/project2
Brick45: node22:/data5/bricks/project2
Brick46: node23:/data5/bricks/project2
Brick47: node24:/data5/bricks/project2
Brick48: node25:/data5/bricks/project2
Brick49: node26:/data5/bricks/project2
Brick50: node27:/data5/bricks/project2
Brick51: node28:/data5/bricks/project2
Brick52: node29:/data5/bricks/project2
Brick53: node30:/data5/bricks/project2
Brick54: node31:/data5/bricks/project2
Brick55: node32:/data5/bricks/project2
Brick56: node33:/data5/bricks/project2
Brick57: node20:/data6/bricks/project2
Brick58: node21:/data6/bricks/project2
Brick59: node22:/data6/bricks/project2
Brick60: node23:/data6/bricks/project2
Brick61: node24:/data6/bricks/project2
Brick62: node25:/data6/bricks/project2
Brick63: node26:/data6/bricks/project2
Brick64: node27:/data6/bricks/project2
Brick65: node28:/data6/bricks/project2
Brick66: node29:/data6/bricks/project2
Brick67: node30:/data6/bricks/project2
Brick68: node31:/data6/bricks/project2
Brick69: node32:/data6/bricks/project2
Brick70: node33:/data6/bricks/project2
Brick71: node20:/data7/bricks/project2
Brick72: node21:/data7/bricks/project2
Brick73: node22:/data7/bricks/project2
Brick74: node23:/data7/bricks/project2
Brick75: node24:/data7/bricks/project2
Brick76: node25:/data7/bricks/project2
Brick77: node26:/data7/bricks/project2
Brick78: node27:/data7/bricks/project2
Brick79: node28:/data7/bricks/project2
Brick80: node29:/data7/bricks/project2
Brick81: node30:/data7/bricks/project2
Brick82: node31:/data7/bricks/project2
Brick83: node32:/data7/bricks/project2
Brick84: node33:/data7/bricks/project2
Brick85: node20:/data8/bricks/project2
Brick86: node21:/data8/bricks/project2
Brick87: node22:/data8/bricks/project2
Brick88: node23:/data8/bricks/project2
Brick89: node24:/data8/bricks/project2
Brick90: node25:/data8/bricks/project2
Brick91: node26:/data8/bricks/project2
Brick92: node27:/data8/bricks/project2
Brick93: node28:/data8/bricks/project2
Brick94: node29:/data8/bricks/project2
Brick95: node30:/data8/bricks/project2
Brick96: node31:/data8/bricks/project2
Brick97: node32:/data8/bricks/project2
Brick98: node33:/data8/bricks/project2
Brick99: node20:/data9/bricks/project2
Brick100: node21:/data9/bricks/project2
Brick101: node22:/data9/bricks/project2
Brick102: node23:/data9/bricks/project2
Brick103: node24:/data9/bricks/project2
Brick104: node25:/data9/bricks/project2
Brick105: node26:/data9/bricks/project2
Brick106: node27:/data9/bricks/project2
Brick107: node28:/data9/bricks/project2
Brick108: node29:/data9/bricks/project2
Brick109: node30:/data9/bricks/project2
Brick110: node31:/data9/bricks/project2
Brick111: node32:/data9/bricks/project2
Brick112: node33:/data9/bricks/project2
Brick113: node20:/data10/bricks/project2
Brick114: node21:/data10/bricks/project2
Brick115: node22:/data10/bricks/project2
Brick116: node23:/data10/bricks/project2
Brick117: node24:/data10/bricks/project2
Brick118: node25:/data10/bricks/project2
Brick119: node26:/data10/bricks/project2
Brick120: node27:/data10/bricks/project2
Brick121: node28:/data10/bricks/project2
Brick122: node29:/data10/bricks/project2
Brick123: node30:/data10/bricks/project2
Brick124: node31:/data10/bricks/project2
Brick125: node32:/data10/bricks/project2
Brick126: node33:/data10/bricks/project2
Brick127: node20:/data11/bricks/project2
Brick128: node21:/data11/bricks/project2
Brick129: node22:/data11/bricks/project2
Brick130: node23:/data11/bricks/project2
Brick131: node24:/data11/bricks/project2
Brick132: node25:/data11/bricks/project2
Brick133: node26:/data11/bricks/project2
Brick134: node27:/data11/bricks/project2
Brick135: node28:/data11/bricks/project2
Brick136: node29:/data11/bricks/project2
Brick137: node30:/data11/bricks/project2
Brick138: node31:/data11/bricks/project2
Brick139: node32:/data11/bricks/project2
Brick140: node33:/data11/bricks/project2
Brick141: node20:/data12/bricks/project2
Brick142: node21:/data12/bricks/project2
Brick143: node22:/data12/bricks/project2
Brick144: node23:/data12/bricks/project2
Brick145: node24:/data12/bricks/project2
Brick146: node25:/data12/bricks/project2
Brick147: node26:/data12/bricks/project2
Brick148: node27:/data12/bricks/project2
Brick149: node28:/data12/bricks/project2
Brick150: node29:/data12/bricks/project2
Brick151: node30:/data12/bricks/project2
Brick152: node31:/data12/bricks/project2
Brick153: node32:/data12/bricks/project2
Brick154: node33:/data12/bricks/project2
Brick155: node20:/data13/bricks/project2
Brick156: node21:/data13/bricks/project2
Brick157: node22:/data13/bricks/project2
Brick158: node23:/data13/bricks/project2
Brick159: node24:/data13/bricks/project2
Brick160: node25:/data13/bricks/project2
Brick161: node26:/data13/bricks/project2
Brick162: node27:/data13/bricks/project2
Brick163: node28:/data13/bricks/project2
Brick164: node29:/data13/bricks/project2
Brick165: node30:/data13/bricks/project2
Brick166: node31:/data13/bricks/project2
Brick167: node32:/data13/bricks/project2
Brick168: node33:/data13/bricks/project2
Options Reconfigured:
performance.force-readdirp: on
performance.write-behind: off
performance.stat-prefetch: on
performance.client-io-threads: on
nfs.disable: on
transport.address-family: inet
features.quota: on
features.inode-quota: on
features.quota-deem-statfs: on
cluster.readdir-optimize: on
cluster.lookup-optimize: on
dht.force-readdirp: off
client.event-threads: 10
server.event-threads: 10
performance.readdir-ahead: on
performance.io-cache: on
performance.flush-behind: on
performance.cache-size: 5GB
performance.cache-max-file-size: 1MB
performance.write-behind-window-size: 10MB
performance.read-ahead: off
network.remote-dio: enable
performance.strict-o-direct: disable
performance.io-thread-count: 25

The volume looks OK, and I mount it on my client machine:

mount -t glusterfs -o oom-score-adj=-999 -o direct-io-mode=disable -o use-readdirp=no node20:/project2 /mnt/project2

I have a directory in /mnt/project2/, but when I mv that directory into another directory, files under it go missing when I run tree or ls on it. My operations are listed below.

Looks very similar to:
https://bugzilla.redhat.com/show_bug.cgi?id=1118762
https://bugzilla.redhat.com/show_bug.cgi?id=1337394
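To (dis)prove that, you could compare the trusted.gfid extended attribute of the source and destination directories directly on the bricks. A minimal sketch, using the paths from the transcript below and one brick as an example (run as root, and repeat for every brick, /data2/bricks/project2 through /data13/bricks/project2, on every node):

getfattr -d -m . -e hex /data2/bricks/project2/371_37829/face_landmarks
getfattr -d -m . -e hex /data2/bricks/project2/371_37829/test-dir/face_landmarks

If both directories report the same trusted.gfid value on any brick, it is the same gfid-collision problem as in the bugs above.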
root@ml-gpu-ser129.nmg01:/mnt/project2/371_37829$ mkdir test-dir
root@ml-gpu-ser129.nmg01:/mnt/project2/371_37829$ tree
.
├── face_landmarks
│   └── alive
│       └── logs_20180823_28
│           ├── info_000000.out
│           ├── info_000001.out
│           ├── info_000002.out
│           ├── info_000003.out
│           ├── info_000004.out
│           ├── info_000005.out
│           ├── info_000006.out
│           ├── info_000007.out
│           ├── info_000008.out
│           ├── info_000009.out
│           ├── info_000010.out
│           ├── info_000011.out
│           ├── info_000012.out
│           ├── info_000013.out
│           ├── info_000014.out
│           ├── info_000015.out
│           ├── info_000016.out
│           ├── info_000017.out
│           ├── info_000018.out
│           └── info_000019.out
└── test-dir

4 directories, 20 files
root@ml-gpu-ser129.nmg01:/mnt/project2/371_37829$ mv face_landmarks/ test-dir/
root@ml-gpu-ser129.nmg01:/mnt/project2/371_37829$ tree
.
└── test-dir
    └── face_landmarks

2 directories, 0 files
root@ml-gpu-ser129.nmg01:/mnt/project2/371_37829$ cd test-dir/face_landmarks/
root@ml-gpu-ser129.nmg01:/mnt/project2/371_37829/test-dir/face_landmarks$ ls
root@ml-gpu-ser129.nmg01:/mnt/project2/371_37829/test-dir/face_landmarks$ cd ..
root@ml-gpu-ser129.nmg01:/mnt/project2/371_37829/test-dir$ mv face_landmarks/ ..
root@ml-gpu-ser129.nmg01:/mnt/project2/371_37829/test-dir$ cd ..
root@ml-gpu-ser129.nmg01:/mnt/project2/371_37829$ tree
.
├── face_landmarks
│   └── alive
└── test-dir

3 directories, 0 files
root@ml-gpu-ser129.nmg01:/mnt/project2/371_37829$

I think I made some mistake with the volume options, but I am not sure. How can I find the lost files? The files seem to still be in the directory, because I cannot remove it; rm reports that the directory is not empty.

It is likely that the src and dst of the mv have the same gfid, and that is what is causing the issue. Can you look at both the src and dst paths on all bricks? The union of the contents of both directories should give all the files that were in the src directory before the mv. Once found, you can:
* keep a backup of the contents of src and dst on all bricks
* remove the trusted.gfid xattr on src and dst from all bricks
* remove the gfid handle (.glusterfs/<first two characters of gfid>/<second set of two characters of gfid>/<gfid>) on each brick
* disable readdirplus in the entire stack (maybe you can use a tmp mount for this) [1]
* stat src and dst on a mount point with readdirplus disabled
* now you will see both directories, src and dst, on the mount point; you can copy the contents of both into a new directory

[1] https://lists.gluster.org/pipermail/gluster-users/2017-March/030148.html
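A rough, untested sketch of those steps, using the paths from the transcript above. The gfid value, the backup directory, and the temporary mount point below are only placeholders; substitute the value getfattr actually reports, and repeat every per-brick command on all bricks of the volume:

# 1. Back up src and dst on every brick first (example for one brick):
mkdir -p /root/gfid-fix-backup/src /root/gfid-fix-backup/dst
cp -a /data2/bricks/project2/371_37829/face_landmarks /root/gfid-fix-backup/src/
cp -a /data2/bricks/project2/371_37829/test-dir/face_landmarks /root/gfid-fix-backup/dst/

# 2. Note the gfid, then remove the trusted.gfid xattr from src and dst:
getfattr -n trusted.gfid -e hex /data2/bricks/project2/371_37829/face_landmarks
setfattr -x trusted.gfid /data2/bricks/project2/371_37829/face_landmarks
setfattr -x trusted.gfid /data2/bricks/project2/371_37829/test-dir/face_landmarks

# 3. Remove the gfid handle under .glusterfs on each brick. The GFID value
#    below is hypothetical; use the uuid form of what getfattr printed
#    (trusted.gfid=0xd0c4... -> d0c4....-....).
GFID=d0c4bbec-5bd5-4d42-8b54-7bd0a472f1a3
BRICK=/data2/bricks/project2
rm -f "$BRICK/.glusterfs/${GFID:0:2}/${GFID:2:2}/$GFID"

# 4. Disable readdirplus through the whole stack ([1] has the details; the
#    option names used here are the readdirp-related ones already present
#    in the volume's options), then make a fresh temporary mount with
#    readdirplus turned off on the client:
gluster volume set project2 performance.readdir-ahead off
gluster volume set project2 performance.force-readdirp off
gluster volume set project2 dht.force-readdirp off
mount -t glusterfs -o use-readdirp=no node20:/project2 /mnt/tmp-project2

# 5. stat src and dst on the temporary mount:
stat /mnt/tmp-project2/371_37829/face_landmarks
stat /mnt/tmp-project2/371_37829/test-dir/face_landmarks

# 6. Both directories should now be visible; copy the contents of both
#    into a new directory:
mkdir /mnt/tmp-project2/371_37829/face_landmarks.recovered
cp -a /mnt/tmp-project2/371_37829/face_landmarks/. /mnt/tmp-project2/371_37829/face_landmarks.recovered/
cp -a /mnt/tmp-project2/371_37829/test-dir/face_landmarks/. /mnt/tmp-project2/371_37829/face_landmarks.recovered/

Once the recovered copy looks complete, the duplicated src and dst directories can be cleaned up and the readdirplus-related options set back to their previous values.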
Any suggestions are appreciated.
Many thanks.

Best regards,
Yu

_______________________________________________
Gluster-users mailing list
Gluster-users@gluster.org
https://lists.gluster.org/mailman/listinfo/gluster-users