search for: b278

Displaying 3 results from an estimated 3 matches for "b278".

2017 Oct 26
0
not healing one file
Hey Richard, Could you share the following information please? 1. gluster volume info <volname> 2. getfattr output of that file from all the bricks: getfattr -d -e hex -m . <brickpath/filepath> 3. glustershd & glfsheal logs Regards, Karthik On Thu, Oct 26, 2017 at 10:21 AM, Amar Tumballi <atumball at redhat.com> wrote: > On a side note, try the recently released health
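(A minimal sketch of how the requested diagnostics might be collected. The volume name "home" is taken from the log excerpt further down, and the brick path /data/brick1/home is purely illustrative; the getfattr command needs to be run on every brick that holds a copy of the file.)

    gluster volume info home
    getfattr -d -e hex -m . /data/brick1/home/path/to/file
    # self-heal daemon and heal logs usually live under /var/log/glusterfs/,
    # e.g. glustershd.log and glfsheal-home.log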
2017 Oct 26
3
not healing one file
On a side note, try the recently released health report tool and see if it diagnoses any issues in the setup. Currently you may have to run it on all three machines. On 26-Oct-2017 6:50 AM, "Amar Tumballi" <atumball at redhat.com> wrote: > Thanks for this report. This week many of the developers are at Gluster > Summit in Prague, will be checking this and respond next
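(A hedged sketch only, assuming the tool referred to is the gluster-health-report project, which installs a command of the same name; installation and invocation details may differ by release. As suggested above, it would be run separately on each of the three machines.)

    pip install gluster-health-report
    gluster-health-report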
2017 Oct 26
2
not healing one file
...n.c:1327:afr_log_selfheal] 0-home-replicate-0: Completed metadata selfheal on e5238381-fe56-4168-8437-71e7ca024861. sources=0 [2] sinks=1
[2017-10-25 10:40:24.543242] I [MSGID: 108026] [afr-self-heal-common.c:1327:afr_log_selfheal] 0-home-replicate-0: Completed data selfheal on cd751a17-a8bd-4d31-b278-6ab29ab32ae1. sources=0 [2] sinks=1
[2017-10-25 10:40:24.546004] I [MSGID: 108026] [afr-self-heal-metadata.c:52:__afr_selfheal_metadata_do] 0-home-replicate-0: performing metadata selfheal on cd751a17-a8bd-4d31-b278-6ab29ab32ae1
[2017-10-25 10:40:24.557640] I [MSGID: 108026] [afr-self-heal-common...
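(The log above shows self-heal completing on gfid cd751a17-a8bd-4d31-b278-6ab29ab32ae1 in the home-replicate-0 subvolume. A quick way to check whether any entries are still pending heal, assuming the volume is named "home" as the translator name suggests:)

    gluster volume heal home info
    gluster volume heal home info split-brain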