2017 Jul 07
2
I/O error for one folder within the mountpoint
...1d087e51-fb40-4606-8bb5-58936fb11a4c>
<gfid:bb0352b9-4a5e-4075-9179-05c3a5766cf4>
<gfid:40133fcf-a1fb-4d60-b169-e2355b66fb53>
<gfid:00f75963-1b4a-4d75-9558-36b7d85bd30b>
<gfid:2c0babdf-c828-475e-b2f5-0f44441fffdc>
<gfid:bbeff672-43ef-48c9-a3a2-96264aa46152>
<gfid:6c0969dd-bd30-4ba0-a7e5-ba4b3a972b9f>
<gfid:4c81ea14-56f4-4b30-8fff-c088fe4b3dff>
<gfid:1072cda3-53c9-4b95-992d-f102f6f87209>
<gfid:2e8f9f29-78f9-4402-bc0c-e63af8cf77d6>
<gfid:eeaa2765-44f4-4891-8502-5787b1310de2>
Status: Connected
Number of entries: 29
Brick ipvr9.xxx:/mnt/glust...
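
For reference, a minimal sketch of how a <gfid:...> entry from "heal info" can be traced back to a real path on a brick through the .glusterfs tree. The brick directory below is a placeholder (the real path is truncated to /mnt/glust... above); the gfid is one of the entries listed:

# Run on a brick node (e.g. ipvr9.xxx). Adjust BRICK to the actual brick
# directory; the value here is only an assumption for illustration.
BRICK=/mnt/glusterfs/applicatif/brick
GFID=6c0969dd-bd30-4ba0-a7e5-ba4b3a972b9f    # one entry from "heal info"

# Every object on a brick is also linked under .glusterfs/aa/bb/<gfid>.
GFID_PATH="$BRICK/.glusterfs/${GFID:0:2}/${GFID:2:2}/$GFID"
ls -l "$GFID_PATH"

# For a regular file that entry is a hard link: find the real name(s).
find "$BRICK" -path "$BRICK/.glusterfs" -prune -o -samefile "$GFID_PATH" -print

# For a directory it is a symlink into the parent's gfid path, so
# readlink shows where it sits in the tree.
readlink "$GFID_PATH"
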
2017 Jul 07
0
I/O error for one folder within the mountpoint
On 07/07/2017 01:23 PM, Florian Leleu wrote:
>
> Hello everyone,
>
> first time on the ML, so excuse me if I'm not following the rules well;
> I'll improve if I get comments.
>
> We have one volume "applicatif" on three nodes (2 plus 1 arbiter); each
> of the following commands was run on node ipvr8.xxx:
>
> # gluster volume info applicatif
>
> Volume
2017 Jul 07
2
I/O error for one folder within the mountpoint
Hello everyone,
first time on the ML, so excuse me if I'm not following the rules well;
I'll improve if I get comments.
We have one volume "applicatif" on three nodes (2 plus 1 arbiter); each of
the following commands was run on node ipvr8.xxx:
# gluster volume info applicatif
Volume Name: applicatif
Type: Replicate
Volume ID: ac222863-9210-4354-9636-2c822b332504
Status: Started
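
For reference, a minimal sketch of the standard gluster CLI calls that produce the kind of output quoted in this thread and are typically used to narrow down an I/O error on a replicated volume with an arbiter. The volume name is taken from the post above; everything else is generic:

# Volume layout and health, run from any node in the trusted pool.
gluster volume info applicatif
gluster volume status applicatif

# Pending self-heal entries (the gfid list shown earlier comes from this).
gluster volume heal applicatif info

# Entries needing manual attention; split-brain is a common cause of
# EIO on a single directory in a replicated volume with an arbiter.
gluster volume heal applicatif info split-brain

# Client-side errors for a FUSE mount usually land on the client in
# /var/log/glusterfs/<mount-point-with-dashes>.log
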