From container-build-reports at centos.org Wed Jul 3 20:50:36 2019
From: container-build-reports at centos.org (container-build-reports at centos.org)
Date: Wed, 03 Jul 2019 20:50:36 +0000
Subject: [heketi-devel] [registry.centos.org] FAILED: Container build gluster/storagesig-heketi:testing
Message-ID: <5d1d151c.O3GzvmQuD/1kcWYC%container-build-reports@centos.org>

Build Status: Failure

Cause of build: Git commit 5c1d94a27199191c07d857bbd13d668e8e5210ec to branch origin/master of repo https://github.com/heketi/heketi.

--
Do you have a query?
Talk to CentOS Container Pipeline team on #centos-devel at freenode
https://wiki.centos.org/ContainerPipeline

From container-build-reports at centos.org Wed Jul 3 20:55:16 2019
From: container-build-reports at centos.org (container-build-reports at centos.org)
Date: Wed, 03 Jul 2019 20:55:16 +0000
Subject: [heketi-devel] [registry.centos.org] FAILED: Container build gluster/storagesig-heketi:latest
Message-ID: <5d1d1634.HbY2XlH1QTfjKR3q%container-build-reports@centos.org>

Build Status: Failure

Cause of build: Git commit 5c1d94a27199191c07d857bbd13d668e8e5210ec to branch origin/master of repo https://github.com/heketi/heketi.

From container-build-reports at centos.org Sun Jul 7 06:50:46 2019
From: container-build-reports at centos.org (container-build-reports at centos.org)
Date: Sun, 07 Jul 2019 06:50:46 +0000
Subject: [heketi-devel] [registry.centos.org] SUCCESS: Weekly scan for gluster/storagesig-heketi:latest
Message-ID: <5d219646.4y25v8RtcHRxatj1%container-build-reports@centos.org>

Scan status: Success

Repository: https://registry.centos.org/gluster/storagesig-heketi

From container-build-reports at centos.org Sun Jul 7 06:51:58 2019
From: container-build-reports at centos.org (container-build-reports at centos.org)
Date: Sun, 07 Jul 2019 06:51:58 +0000
Subject: [heketi-devel] [registry.centos.org] SUCCESS: Weekly scan for gluster/storagesig-heketi:testing
Message-ID: <5d21968e.7uEcDNxfaTIBnFwq%container-build-reports@centos.org>

Scan status: Success

Repository: https://registry.centos.org/gluster/storagesig-heketi

From container-build-reports at centos.org Mon Jul 8 19:20:39 2019
From: container-build-reports at centos.org (container-build-reports at centos.org)
Date: Mon, 08 Jul 2019 19:20:39 +0000
Subject: [heketi-devel] [registry.centos.org] FAILED: Container build gluster/storagesig-heketi:testing
Message-ID: <5d239787.V75OE/pn/fFphxaH%container-build-reports@centos.org>

Build Status: Failure

Cause of build: Git commit 6e94431db9ecad18d5827060fe1c562d30e5d15c to branch origin/master of repo https://github.com/heketi/heketi.

From container-build-reports at centos.org Mon Jul 8 19:25:20 2019
From: container-build-reports at centos.org (container-build-reports at centos.org)
Date: Mon, 08 Jul 2019 19:25:20 +0000
Subject: [heketi-devel] [registry.centos.org] FAILED: Container build gluster/storagesig-heketi:latest
Message-ID: <5d2398a0.H5NcRu5E14591tFT%container-build-reports@centos.org>

Build Status: Failure

Cause of build: Update to build configurations of the container image

From container-build-reports at centos.org Mon Jul 8 20:00:39 2019
From: container-build-reports at centos.org (container-build-reports at centos.org)
Date: Mon, 08 Jul 2019 20:00:39 +0000
Subject: [heketi-devel] [registry.centos.org] FAILED: Container build gluster/storagesig-heketi:testing
Message-ID: <5d23a0e7.Bc9YcsNtcIqoN8P2%container-build-reports@centos.org>

Build Status: Failure

Cause of build: Git commit 2a568195fcd564bcbeda1070f9423f303304fcd5 to branch origin/master of repo https://github.com/heketi/heketi.

From container-build-reports at centos.org Mon Jul 8 20:05:18 2019
From: container-build-reports at centos.org (container-build-reports at centos.org)
Date: Mon, 08 Jul 2019 20:05:18 +0000
Subject: [heketi-devel] [registry.centos.org] FAILED: Container build gluster/storagesig-heketi:latest
Message-ID: <5d23a1fe.2yN8Qa87OiOMtLzo%container-build-reports@centos.org>

Build Status: Failure

Cause of build: Git commit 2a568195fcd564bcbeda1070f9423f303304fcd5 to branch origin/master of repo https://github.com/heketi/heketi.

From container-build-reports at centos.org Mon Jul 8 20:43:36 2019
From: container-build-reports at centos.org (container-build-reports at centos.org)
Date: Mon, 08 Jul 2019 20:43:36 +0000
Subject: [heketi-devel] [registry.centos.org] FAILED: Container build gluster/storagesig-heketi:latest
Message-ID: <5d23aaf8.PkMrqyZFia7avMDc%container-build-reports@centos.org>

Build Status: Failure

Cause of build: Git commit f9df38680d39d2898a0f84a22f7bdba5d5069537 to branch origin/master of repo https://github.com/heketi/heketi.

From container-build-reports at centos.org Mon Jul 8 20:50:35 2019
From: container-build-reports at centos.org (container-build-reports at centos.org)
Date: Mon, 08 Jul 2019 20:50:35 +0000
Subject: [heketi-devel] [registry.centos.org] FAILED: Container build gluster/storagesig-heketi:testing
Message-ID: <5d23ac9b.MjkXHSpW9sfRVrSw%container-build-reports@centos.org>

Build Status: Failure

Cause of build: Update to build configurations of the container image

From container-build-reports at centos.org Sun Jul 14 06:50:37 2019
From: container-build-reports at centos.org (container-build-reports at centos.org)
Date: Sun, 14 Jul 2019 06:50:37 +0000
Subject: [heketi-devel] [registry.centos.org] SUCCESS: Weekly scan for gluster/storagesig-heketi:latest
Message-ID: <5d2ad0bd.AWIGzUf1l+z99UOb%container-build-reports@centos.org>

Scan status: Success

Repository: https://registry.centos.org/gluster/storagesig-heketi

From container-build-reports at centos.org Sun Jul 14 06:51:38 2019
From: container-build-reports at centos.org (container-build-reports at centos.org)
Date: Sun, 14 Jul 2019 06:51:38 +0000
Subject: [heketi-devel] [registry.centos.org] SUCCESS: Weekly scan for gluster/storagesig-heketi:testing
Message-ID: <5d2ad0fa.vhZAbDL0FFz7bm2n%container-build-reports@centos.org>

Scan status: Success

Repository: https://registry.centos.org/gluster/storagesig-heketi

From container-build-reports at centos.org Fri Jul 19 20:20:49 2019
From: container-build-reports at centos.org (container-build-reports at centos.org)
Date: Fri, 19 Jul 2019 20:20:49 +0000
Subject: [heketi-devel] [registry.centos.org] FAILED: Container build gluster/storagesig-heketi:testing
Message-ID: <5d322621.5Xk+IElVNTPv00Yl%container-build-reports@centos.org>

Build Status: Failure

Cause of build: Git commit 72f417a06ebad0bfff97bec832995cac786492f7 to branch origin/master of repo https://github.com/heketi/heketi.

From container-build-reports at centos.org Fri Jul 19 20:25:16 2019
From: container-build-reports at centos.org (container-build-reports at centos.org)
Date: Fri, 19 Jul 2019 20:25:16 +0000
Subject: [heketi-devel] [registry.centos.org] FAILED: Container build gluster/storagesig-heketi:latest
Message-ID: <5d32272c.dNFrR9Tq2nL5TBR1%container-build-reports@centos.org>

Build Status: Failure

Cause of build: Git commit 72f417a06ebad0bfff97bec832995cac786492f7 to branch origin/master of repo https://github.com/heketi/heketi.

From container-build-reports at centos.org Sun Jul 21 06:49:38 2019
From: container-build-reports at centos.org (container-build-reports at centos.org)
Date: Sun, 21 Jul 2019 06:49:38 +0000
Subject: [heketi-devel] [registry.centos.org] SUCCESS: Weekly scan for gluster/storagesig-heketi:latest
Message-ID: <5d340b02.X+Sdbdf+qOcUo/jc%container-build-reports@centos.org>

Scan status: Success

Repository: https://registry.centos.org/gluster/storagesig-heketi
From container-build-reports at centos.org Sun Jul 21 06:52:48 2019
From: container-build-reports at centos.org (container-build-reports at centos.org)
Date: Sun, 21 Jul 2019 06:52:48 +0000
Subject: [heketi-devel] [registry.centos.org] SUCCESS: Weekly scan for gluster/storagesig-heketi:testing
Message-ID: <5d340bc0.hHoRHjgDOBFdlHls%container-build-reports@centos.org>

Scan status: Success

Repository: https://registry.centos.org/gluster/storagesig-heketi

From rtalur at redhat.com Mon Jul 22 21:32:54 2019
From: rtalur at redhat.com (Raghavendra Talur)
Date: Mon, 22 Jul 2019 17:32:54 -0400
Subject: [heketi-devel] Directory-based bricks
In-Reply-To:
References:
Message-ID:

On Thu, Jul 18, 2019 at 4:11 AM Dmitry Kireev wrote:
>
> Hello,
>
> Is there any way to use Heketi in the VPS model, when there is only 1 block storage (say, /dev/sda) and this is where root partition is located.
>
> Are there any recent changes that would allow me using directory-based bricks to create volumes dynamically?

We don't have any provision to use directories as devices in heketi. I think it should be possible to use partitions though.

> Thank you.
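[Editorial sketch of the partition-based alternative suggested in the reply above: heketi's topology file lists whole block devices per node, and a partition path can be supplied wherever a disk would go. Everything below — the 192.0.2.10 address, the /dev/sda4 partition, the zone number — is hypothetical illustration, not a tested configuration.]

```python
import json

# Hypothetical single-node heketi topology that hands heketi a spare
# partition (/dev/sda4) instead of a whole disk. Heketi turns every
# listed device into an LVM physical volume, so the entry must be an
# unused block device, never the mounted root partition or a directory.
topology = {
    "clusters": [{
        "nodes": [{
            "node": {
                "hostnames": {
                    "manage": ["192.0.2.10"],   # placeholder address
                    "storage": ["192.0.2.10"],
                },
                "zone": 1,
            },
            "devices": ["/dev/sda4"],  # a partition, not a directory path
        }]
    }]
}

print(json.dumps(topology, indent=2))
```

A file like this would be loaded with `heketi-cli topology load --json=topology.json`; the key point from the reply above is that each `devices` entry must be a block device (disk or partition) that heketi can claim for LVM.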
>
> --
> Sincerely,
> Dmitry Kireev
>
> _______________________________________________
> heketi-devel mailing list
> heketi-devel at gluster.org
> https://lists.gluster.org/mailman/listinfo/heketi-devel

From container-build-reports at centos.org Sun Jul 28 06:48:33 2019
From: container-build-reports at centos.org (container-build-reports at centos.org)
Date: Sun, 28 Jul 2019 06:48:33 +0000
Subject: [heketi-devel] [registry.centos.org] SUCCESS: Weekly scan for gluster/storagesig-heketi:latest
Message-ID: <5d3d4541.JpQ5D/A2e34LOQcC%container-build-reports@centos.org>

Scan status: Success

Repository: https://registry.centos.org/gluster/storagesig-heketi

From container-build-reports at centos.org Sun Jul 28 06:52:08 2019
From: container-build-reports at centos.org (container-build-reports at centos.org)
Date: Sun, 28 Jul 2019 06:52:08 +0000
Subject: [heketi-devel] [registry.centos.org] SUCCESS: Weekly scan for gluster/storagesig-heketi:testing
Message-ID: <5d3d4618.0rC5BBVX0ovxeaDf%container-build-reports@centos.org>

Scan status: Success

Repository: https://registry.centos.org/gluster/storagesig-heketi

From container-build-reports at centos.org Wed Jul 31 13:20:50 2019
From: container-build-reports at centos.org (container-build-reports at centos.org)
Date: Wed, 31 Jul 2019 13:20:50 +0000
Subject: [heketi-devel] [registry.centos.org] FAILED: Container build gluster/storagesig-heketi:testing
Message-ID: <5d4195b2.F2d+yPl3z2V44szx%container-build-reports@centos.org>

Build Status: Failure

Cause of build: Git commit 752f7de0df3ef9b83d5ba17bf83947907cef0e38 to branch origin/master of repo https://github.com/heketi/heketi.
From container-build-reports at centos.org Wed Jul 31 13:23:36 2019
From: container-build-reports at centos.org (container-build-reports at centos.org)
Date: Wed, 31 Jul 2019 13:23:36 +0000
Subject: [heketi-devel] [registry.centos.org] FAILED: Container build gluster/storagesig-heketi:latest
Message-ID: <5d419658.vzP9Vt8NPRWVp/y3%container-build-reports@centos.org>

Build Status: Failure

Cause of build: Git commit 752f7de0df3ef9b83d5ba17bf83947907cef0e38 to branch origin/master of repo https://github.com/heketi/heketi.

From johannes.grumboeck at porscheinformatik.at Wed Jul 31 06:48:56 2019
From: johannes.grumboeck at porscheinformatik.at (Grumboeck Johannes (POI - AT/Salzburg))
Date: Wed, 31 Jul 2019 06:48:56 +0000
Subject: [heketi-devel] [EXT] volume provisioning fails with "No space"
In-Reply-To: <6CC3B27B260F254B9F0D7B04E645A380011C446B@Cap1ca1vexc03.canada01.ifdsnet.int>
References: <6CC3B27B260F254B9F0D7B04E645A380011C446B@Cap1ca1vexc03.canada01.ifdsnet.int>
Message-ID:

Hi Robert,

We had a similar issue, and it turned out that heketi stores and counts free/used space in its topology separately from the underlying LVM. This means the two can run out of sync. I found this article, which is a nice walkthrough of how to sync them again:
https://www.ibm.com/support/knowledgecenter/en/SSBS6K_3.1.0/troubleshoot/heketi_disk.html

Mit freundlichen Grüßen / Best regards

i.A. Dipl.-Ing. (FH) Johannes Grumböck
Infrastructure Architect, Computing & Platform Services
Infrastructure & Common Platforms
Telefon +43 (0)662 4670-6323
Telefax +43 (0)662 4670-16323
johannes.grumboeck at porscheinformatik.at

Porsche Informatik Gesellschaft m.b.H. | A-5020 Salzburg | Louise-Piëch-Straße 9
Sitz: Salzburg | FN 72830 d / Landesgericht Salzburg | DVR 88439 | UID ATU 36773309
http://www.porscheinformatik.at/

Internal

From: heketi-devel-bounces at gluster.org On behalf of Polasek, Robert
Sent: Thursday, 25 October 2018 22:30
To: heketi-devel at gluster.org
Subject: [EXT] [heketi-devel] volume provisioning fails with "No space"

Hi everybody,

I am testing heketi autoprovisioning of gluster volumes and I am hitting a problem where heketi reports "No space" in the log, even though there should be space. Would somebody be able to point me to what I need to look into or where my problem is?

Here are the details of my problem. I am using curl to simulate my failing request from kubernetes. As a side note, our kubernetes is old, 1.4, and therefore I cannot

robert at workstation:~/src/ifds/helm/tapremium$ curl -v -X POST -H "Authorization: bearer ****" -H "Content-Type: application/json" POST http://localhost:8080/volumes -d@/tmp/volume.json

=================================================================================================================
This is the content of the /tmp/volume.json file:

{
  "size": 10,
  "name": "",
  "durability": {
    "type": "replicate",
    "replicate": {
      "replica": 3
    },
    "disperse": {}
  },
  "gid": 2000,
  "snapshot": {
    "enable": false,
    "factor": 0
  }
}

This is my topology:
=================================================================================================================

Cluster Id: 7396c8b03593f9bfa1288cdebd9412ac

    Volumes:

        Name: vol_4708719dddaacda301e3b71763cb73fa
        Size: 10
        Id: 4708719dddaacda301e3b71763cb73fa
        Cluster Id: 7396c8b03593f9bfa1288cdebd9412ac
        Mount: 192.168.48.9:vol_4708719dddaacda301e3b71763cb73fa
        Mount Options: backup-volfile-servers=
        Durability Type: none
        Snapshot: Disabled

        Bricks:
            Id: 99b33f7f23347c1691e4a93c7220af04
            Path: /var/lib/heketi/mounts/vg_05df77bacca68b941f8d3d20d64a1aa2/brick_99b33f7f23347c1691e4a93c7220af04/brick
            Size (GiB): 5
            Node: 31e1ac3014029ff48398a989c10952b6
            Device: 05df77bacca68b941f8d3d20d64a1aa2

            Id: acebc8961ffb304fa2b97c201c7af8da
            Path: /var/lib/heketi/mounts/vg_05df77bacca68b941f8d3d20d64a1aa2/brick_acebc8961ffb304fa2b97c201c7af8da/brick
            Size (GiB): 5
            Node: 31e1ac3014029ff48398a989c10952b6
            Device: 05df77bacca68b941f8d3d20d64a1aa2

    Nodes:

        Node Id: 31e1ac3014029ff48398a989c10952b6
        State: online
        Cluster Id: 7396c8b03593f9bfa1288cdebd9412ac
        Zone: 1
        Management Hostname: 192.168.48.9
        Storage Hostname: 192.168.48.9
        Devices:
            Id:05df77bacca68b941f8d3d20d64a1aa2   Name:/dev/gluster_vg/heketi_volume   State:online   Size (GiB):99   Used (GiB):10   Free (GiB):89
                Bricks:
                    Id:99b33f7f23347c1691e4a93c7220af04   Size (GiB):5   Path: /var/lib/heketi/mounts/vg_05df77bacca68b941f8d3d20d64a1aa2/brick_99b33f7f23347c1691e4a93c7220af04/brick
                    Id:acebc8961ffb304fa2b97c201c7af8da   Size (GiB):5   Path: /var/lib/heketi/mounts/vg_05df77bacca68b941f8d3d20d64a1aa2/brick_acebc8961ffb304fa2b97c201c7af8da/brick
            Id:6c0325d31e1b84a4f3aa088ba352e2ec   Name:/dev/gluster_vg/heketi_volume2   State:online   Size (GiB):24   Used (GiB):0   Free (GiB):24
                Bricks:
            Id:bac78b5e4d558d387218d12fdfb7bfa0   Name:/dev/gluster_vg/heketi_volume1   State:online   Size (GiB):24   Used (GiB):0   Free (GiB):24
                Bricks:

=================================================================================================================
and this is the log output:

[negroni] Started POST /volumes
[heketi] DEBUG 2018/10/25 14:08:28 /src/github.com/heketi/heketi/apps/glusterfs/volume_entry.go:101: [e8cb84a8b977a00d77e137a05a5b2530] Replica 3
[heketi] DEBUG 2018/10/25 14:08:28 /src/github.com/heketi/heketi/apps/glusterfs/volume_entry.go:580: Using the following clusters: [7396c8b03593f9bfa1288cdebd9412ac]
[heketi] DEBUG 2018/10/25 14:08:28 /src/github.com/heketi/heketi/apps/glusterfs/volume_entry_allocate.go:43: brick_size = 10485760
[heketi] DEBUG 2018/10/25 14:08:28 /src/github.com/heketi/heketi/apps/glusterfs/volume_entry_allocate.go:44: sets = 1
[heketi] DEBUG 2018/10/25 14:08:28 /src/github.com/heketi/heketi/apps/glusterfs/volume_entry_allocate.go:45: num_bricks = 3
[heketi] INFO 2018/10/25 14:08:28 Allocating brick set #0
[heketi] DEBUG 2018/10/25 14:08:28 /src/github.com/heketi/heketi/apps/glusterfs/brick_allocate.go:251: 0 / 3
[heketi] DEBUG 2018/10/25 14:08:28 /src/github.com/heketi/heketi/apps/glusterfs/device_entry.go:411: expected space needed for amount=10485760 snapFactor=1 : 10539008
[heketi] DEBUG 2018/10/25 14:08:28 /src/github.com/heketi/heketi/apps/glusterfs/device_entry.go:368: device 6c0325d31e1b84a4f3aa088ba352e2ec[26079232] > required size [10539008] ?
[heketi] DEBUG 2018/10/25 14:08:28 /src/github.com/heketi/heketi/apps/glusterfs/brick_allocate.go:251: 1 / 3
[heketi] DEBUG 2018/10/25 14:08:28 /src/github.com/heketi/heketi/apps/glusterfs/volume_entry_allocate.go:455: Error detected. Cleaning up volume e8cb84a8b977a00d77e137a05a5b2530: Len(0)
[heketi] DEBUG 2018/10/25 14:08:28 /src/github.com/heketi/heketi/apps/glusterfs/volume_entry_allocate.go:56: No space, re-trying with smaller brick size
[heketi] DEBUG 2018/10/25 14:08:28 /src/github.com/heketi/heketi/apps/glusterfs/volume_entry_allocate.go:43: brick_size = 5242880
[heketi] DEBUG 2018/10/25 14:08:28 /src/github.com/heketi/heketi/apps/glusterfs/volume_entry_allocate.go:44: sets = 2
[heketi] DEBUG 2018/10/25 14:08:28 /src/github.com/heketi/heketi/apps/glusterfs/volume_entry_allocate.go:45: num_bricks = 6
[heketi] INFO 2018/10/25 14:08:28 Allocating brick set #0
[heketi] DEBUG 2018/10/25 14:08:28 /src/github.com/heketi/heketi/apps/glusterfs/brick_allocate.go:251: 0 / 3
[heketi] DEBUG 2018/10/25 14:08:28 /src/github.com/heketi/heketi/apps/glusterfs/device_entry.go:411: expected space needed for amount=5242880 snapFactor=1 : 5271552
[heketi] DEBUG 2018/10/25 14:08:28 /src/github.com/heketi/heketi/apps/glusterfs/device_entry.go:368: device bac78b5e4d558d387218d12fdfb7bfa0[26079232] > required size [5271552] ?
[heketi] DEBUG 2018/10/25 14:08:28 /src/github.com/heketi/heketi/apps/glusterfs/brick_allocate.go:251: 1 / 3
[heketi] DEBUG 2018/10/25 14:08:28 /src/github.com/heketi/heketi/apps/glusterfs/volume_entry_allocate.go:455: Error detected. Cleaning up volume e8cb84a8b977a00d77e137a05a5b2530: Len(0)
[heketi] DEBUG 2018/10/25 14:08:28 /src/github.com/heketi/heketi/apps/glusterfs/volume_entry_allocate.go:56: No space, re-trying with smaller brick size
[heketi] DEBUG 2018/10/25 14:08:28 /src/github.com/heketi/heketi/apps/glusterfs/volume_entry_allocate.go:43: brick_size = 2621440
[heketi] DEBUG 2018/10/25 14:08:28 /src/github.com/heketi/heketi/apps/glusterfs/volume_entry_allocate.go:44: sets = 4
[heketi] DEBUG 2018/10/25 14:08:28 /src/github.com/heketi/heketi/apps/glusterfs/volume_entry_allocate.go:45: num_bricks = 12
[heketi] INFO 2018/10/25 14:08:28 Allocating brick set #0
[heketi] DEBUG 2018/10/25 14:08:28 /src/github.com/heketi/heketi/apps/glusterfs/brick_allocate.go:251: 0 / 3
[heketi] DEBUG 2018/10/25 14:08:28 /src/github.com/heketi/heketi/apps/glusterfs/device_entry.go:411: expected space needed for amount=2621440 snapFactor=1 : 2637824
[heketi] DEBUG 2018/10/25 14:08:28 /src/github.com/heketi/heketi/apps/glusterfs/device_entry.go:368: device 05df77bacca68b941f8d3d20d64a1aa2[94179328] > required size [2637824] ?
[heketi] DEBUG 2018/10/25 14:08:28 /src/github.com/heketi/heketi/apps/glusterfs/brick_allocate.go:251: 1 / 3
[heketi] DEBUG 2018/10/25 14:08:28 /src/github.com/heketi/heketi/apps/glusterfs/volume_entry_allocate.go:455: Error detected. Cleaning up volume e8cb84a8b977a00d77e137a05a5b2530: Len(0)
[heketi] DEBUG 2018/10/25 14:08:28 /src/github.com/heketi/heketi/apps/glusterfs/volume_entry_allocate.go:56: No space, re-trying with smaller brick size
[heketi] DEBUG 2018/10/25 14:08:28 /src/github.com/heketi/heketi/apps/glusterfs/volume_entry_allocate.go:43: brick_size = 1310720
[heketi] DEBUG 2018/10/25 14:08:28 /src/github.com/heketi/heketi/apps/glusterfs/volume_entry_allocate.go:44: sets = 8
[heketi] DEBUG 2018/10/25 14:08:28 /src/github.com/heketi/heketi/apps/glusterfs/volume_entry_allocate.go:45: num_bricks = 24
[heketi] INFO 2018/10/25 14:08:28 Allocating brick set #0
[heketi] DEBUG 2018/10/25 14:08:28 /src/github.com/heketi/heketi/apps/glusterfs/brick_allocate.go:251: 0 / 3
[heketi] DEBUG 2018/10/25 14:08:28 /src/github.com/heketi/heketi/apps/glusterfs/device_entry.go:411: expected space needed for amount=1310720 snapFactor=1 : 1318912
[heketi] DEBUG 2018/10/25 14:08:28 /src/github.com/heketi/heketi/apps/glusterfs/device_entry.go:368: device 05df77bacca68b941f8d3d20d64a1aa2[94179328] > required size [1318912] ?
[heketi] DEBUG 2018/10/25 14:08:28 /src/github.com/heketi/heketi/apps/glusterfs/brick_allocate.go:251: 1 / 3
[heketi] DEBUG 2018/10/25 14:08:28 /src/github.com/heketi/heketi/apps/glusterfs/volume_entry_allocate.go:455: Error detected. Cleaning up volume e8cb84a8b977a00d77e137a05a5b2530: Len(0)
[heketi] DEBUG 2018/10/25 14:08:28 /src/github.com/heketi/heketi/apps/glusterfs/volume_entry_allocate.go:56: No space, re-trying with smaller brick size
[heketi] ERROR 2018/10/25 14:08:28 /src/github.com/heketi/heketi/apps/glusterfs/volume_entry_allocate.go:37: Minimum brick size limit reached. Out of space.
[heketi] DEBUG 2018/10/25 14:08:28 /src/github.com/heketi/heketi/apps/glusterfs/volume_entry.go:445: Cluster 7396c8b03593f9bfa1288cdebd9412ac can not accommodate volume (Minimum brick size limit reached. Out of space.), trying next cluster
[heketi] ERROR 2018/10/25 14:08:28 /src/github.com/heketi/heketi/apps/glusterfs/operations_manage.go:89: Create Volume Build Failed: No space
[negroni] Completed 500 Internal Server Error in 3.397009ms

This message is marked Public

________________________________

Please consider the environment before printing this email and any attachments. This e-mail and any attachments are intended only for the individual or company to which it is addressed and may contain information which is privileged, confidential and prohibited from disclosure or unauthorized use under applicable law. If you are not the intended recipient of this e-mail, you are hereby notified that any use, dissemination, or copying of this e-mail or the information contained in this e-mail is strictly prohibited by the sender. If you have received this transmission in error, please return the material received to the sender and delete all copies from your system.

-------------- next part --------------
An HTML attachment was scrubbed...
URL:
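[Editorial note on the log above: the retry pattern it shows — brick_size halving from 10485760 KiB while the set count doubles, until the minimum brick size is hit — can be reproduced with a small sketch. The 1 GiB minimum brick size and the reading of the logged numbers as KiB are assumptions inferred from the log itself, not taken from heketi's source.]

```python
def allocation_attempts(volume_kib, min_brick_kib=1024 * 1024):
    """Reproduce the (brick_size, sets) retry sequence seen in the heketi log:
    start with one brick set covering the whole volume, then halve the brick
    size and double the set count until a brick would drop below the minimum."""
    attempts = []
    brick, sets = volume_kib, 1
    while brick >= min_brick_kib:
        attempts.append((brick, sets))
        brick //= 2
        sets *= 2
    return attempts

# A 10 GiB request, as in the /tmp/volume.json above (10 GiB = 10485760 KiB).
# This yields the same four rounds the log shows before
# "Minimum brick size limit reached".
print(allocation_attempts(10 * 1024 * 1024))
```

One plausible reading of each round's "0 / 3" then "1 / 3" then cleanup sequence: the first brick of each replica-3 set is placed, but no further node is available for the next replica. Replica 3 ordinarily requires bricks on distinct nodes, and the topology quoted above has a single node, which would explain "No space" even with free capacity on the devices.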