search for: kiebzak

Displaying 5 results from an estimated 5 matches for "kiebzak".

2017 Jun 29
2
Multi petabyte gluster
...for archival - near-cold storage. Anything, from your experience, to keep in mind while planning large installations? Sent from my Verizon, Samsung Galaxy smartphone -------- Original message -------- From: Serkan Çoban <cobanserkan at gmail.com> Date: 6/29/17 4:39 AM (GMT-05:00) To: Jason Kiebzak <jkiebzak at gmail.com> Cc: Gluster Users <gluster-users at gluster.org> Subject: Re: [Gluster-users] Multi petabyte gluster I am currently using a 10PB single volume without problems. 40PB is on the way. EC is working fine. You need to plan ahead with large installations like this. Do c...
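Serkan's 10PB volume is a dispersed (erasure-coded) Gluster volume. As a minimal sketch of how such a volume is created, assuming hypothetical host names server1..server20 and brick paths that do not come from this thread, a 16+4 EC set like the one he describes later in the thread would look roughly like:

    # One disperse set: 16 data bricks + 4 redundancy bricks; the set
    # survives the loss of any 4 bricks.
    gluster volume create bigvol disperse-data 16 redundancy 4 \
        server{1..20}:/bricks/b1/bigvol
    gluster volume start bigvol

Supplying bricks in further multiples of 20 extends the volume with additional disperse sets of the same 16+4 shape (a distributed-disperse volume).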
2017 Jun 30
2
Multi petabyte gluster
...using 3.7.11 and the only problem is slow rebuild time when a disk > fails. It takes 8 days to heal an 8TB disk. (This might be related to > my EC configuration 16+4) > 3.9+ versions have some improvements about this but I cannot test them > yet... > > On Thu, Jun 29, 2017 at 2:49 PM, jkiebzak <jkiebzak at gmail.com> wrote: > > Thanks for the reply. We will mainly use this for archival - near-cold > > storage. > > > > > > Anything, from your experience, to keep in mind while planning large > > installations? > > > > > > Sent fro...
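A quick back-of-envelope check on the figure above: healing 8TB in 8 days is an effective rebuild rate of roughly 11.6 MB/s, well below what a single disk can stream sequentially. Assuming decimal terabytes (the thread does not say which), the arithmetic is:

    # 8 TB over 8 days, in bytes per second
    echo $(( 8 * 10**12 / (8 * 86400) ))   # 11574074, i.e. ~11.6 MB/s

which is consistent with rebuild time, rather than raw capacity, being the pain point in wide EC configurations.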
2017 Jun 30
0
Multi petabyte gluster
...ng large installations? I am using 3.7.11 and the only problem is slow rebuild time when a disk fails. It takes 8 days to heal an 8TB disk. (This might be related to my EC configuration 16+4) 3.9+ versions have some improvements about this but I cannot test them yet... On Thu, Jun 29, 2017 at 2:49 PM, jkiebzak <jkiebzak at gmail.com> wrote: > Thanks for the reply. We will mainly use this for archival - near-cold > storage. > > > Anything, from your experience, to keep in mind while planning large > installations? > > > Sent from my Verizon, Samsung Galaxy smartphone >...
2017 Jun 30
0
Multi petabyte gluster
...oblem is slow rebuild time when a disk >> fails. It takes 8 days to heal an 8TB disk. (This might be related to >> my EC configuration 16+4) >> 3.9+ versions have some improvements about this but I cannot test them >> yet... >> >> On Thu, Jun 29, 2017 at 2:49 PM, jkiebzak <jkiebzak at gmail.com> wrote: >> > Thanks for the reply. We will mainly use this for archival - near-cold >> > storage. >> > >> > >> > Anything, from your experience, to keep in mind while planning large >> > installations? >> >...
2017 Jun 28
1
Multi petabyte gluster
Has anyone scaled to a multi-petabyte gluster setup? How well does erasure coding do with such a large setup? Thanks
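On the erasure-coding question, the capacity overhead at least is easy to pin down for the 16+4 configuration discussed in the replies above; usable capacity is simply the data-brick fraction of the raw total. A small illustrative calculation (only the 16+4 numbers come from the thread):

    # usable fraction = data bricks / (data + redundancy bricks)
    echo "16 / (16 + 4)" | bc -l   # .80 -> 80% usable; 20PB raw yields 16PB usable

Performance and heal behaviour at scale are the harder part of the question, which the replies above address directly.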