Since upgrading my server from CentOS 4.5 to 4.6 I've been getting the
following error from amanda backups:

  mutilate /home lev 1 FAILED [compress got signal 11, /bin/tar got
  signal 13]

I was away from the house for most of the end of December and had a couple
of other issues come up that could have been related but apparently weren't
(why is it that several things all go wrong at once?). After getting those
other issues resolved I was still getting the above error, so I tried
running the following command as root:

  /bin/tar -X /etc/amanda/exclude-list/exclude.txt -cvf - /home | gzip -v -c > /share/dave/Home.tar.gz

Initially tar would die while attempting to back up one of the IMAP folders,
which had quite a few fairly large e-mails (some pictures my brother had
sent). I removed the larger e-mails and tar proceeded past the IMAP folder
that had been the problem, only to die later:

  ...
  /home/judy/Judy's Stuff/School/
  /home/judy/Judy's Stuff/School/2007 Spring/
  /home/judy/Judy's Stuff/School/2007 Spring/Mynametemplate.ppt
  /home/judy/Judy's Stuff/School/2007 Spring/MyNameSamples.doc
  /home/judy/Judy's Stuff/School/2007 Spring/Myname105.ppt
  /home/judy/Judy's Stuff/School/2007 Spring/TYP Types.doc
  /home/judy/Judy's Stuff/School/2007 Spring/Photoshop_CS2.exe
  Segmentation fault

The copy of PhotoShop is the trial version my wife downloaded about a year
ago for a class she was taking. Given my tape rotation, this directory has
been backed up at least every thirty days since then. If I remove the
Photoshop_CS2.exe file, the backup completes normally.

So, is this a tar bug (doesn't like big files now), or is there some other
issue, like available shared memory, causing the problem?

Cheers,
Dave

-- 
Politics, n. Strife of interests masquerading as a contest of principles.
                 -- Ambrose Bierce
On Jan 8, 2008 7:00 PM, David G. Miller <dave at davenjudy.org> wrote:
> Since upgrading my server from CentOS 4.5 to 4.6 I've been getting the
> following error from amanda backups:
>
>   mutilate /home lev 1 FAILED [compress got signal 11, /bin/tar got
>   signal 13]
>
> [snip]
>
> _______________________________________________
> CentOS mailing list
> CentOS at centos.org
> http://lists.centos.org/mailman/listinfo/centos

Dave,

First you have to figure out whether the problem occurs in tar or in gzip.
Do you get the problem if you tar first and then gzip, or only when they
are combined (make an uncompressed tar archive)? If not, is the pipe
somehow the problem: do you get the same crash with tar's -z option rather
than piping to gzip?

Finally, you need to get a stack trace. For that you need to raise the core
size limit above the default of 0 with "ulimit -c unlimited" before running
the command. You can then use gdb to get a stack trace: run
"gdb <path to exec file> <path to core file>" and type "where". The result
is a stack trace that you can post in a relevant forum for further
examination by the developers.

- Nicolas
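A minimal sketch of the isolation and core-dump steps Nicolas describes.
The tar command and paths are taken from Dave's message; the location and
name of the core file depend on the system's kernel.core_pattern, which is
an assumption here:

```shell
# Step 1: isolate tar from gzip -- write an uncompressed archive first.
# If this alone segfaults, the bug is in tar; if only the pipeline
# fails, look at gzip or the pipe itself.
#   /bin/tar -X /etc/amanda/exclude-list/exclude.txt -cf /share/dave/Home.tar /home
#   /bin/tar -X /etc/amanda/exclude-list/exclude.txt -czf /share/dave/Home.tar.gz /home

# Step 2: allow core dumps. The default soft limit is 0, so no core
# file is written at all when a process crashes.
ulimit -c unlimited
ulimit -c    # confirm: should now print "unlimited"

# Step 3: after re-running the failing command, a segfault should
# leave a core file (often ./core or ./core.<pid>). Load it into gdb
# together with the binary that crashed, and print the crash stack:
#   gdb /bin/tar core
#   (gdb) where
```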
David G. Miller wrote:
> Since upgrading my server from CentOS 4.5 to 4.6 I've been getting the
> following error from amanda backups:
>
>   mutilate /home lev 1 FAILED [compress got signal 11, /bin/tar got
>   signal 13]
>
> [snip]

Sounds like your tmp directory isn't big enough to handle the creation of
the tar file. I'm pretty sure it's stored there until it's created and
zipped, then once done it's moved to the destination. If you do a df from
the command line several times while the script is running, you can see
whether an area is filling up before completion.

John Hinton
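John's suggestion to watch for a filesystem filling up can be scripted.
This is a sketch only: a throwaway demo directory and /tmp output paths
stand in for the real source and destination, which you would substitute.

```shell
# Demo setup -- substitute the real backup source/destination here.
SRC=$(mktemp -d); echo hello > "$SRC/file.txt"
OUT=/tmp/demo-backup.tar.gz
LOG=/tmp/df-during-backup.log

# Run the backup in the background and snapshot disk usage while it
# runs; if a filesystem fills up just before a crash, the last few
# df entries in the log will show it.
( tar -cf - "$SRC" 2>/dev/null | gzip -c > "$OUT" ) &
backup_pid=$!

: > "$LOG"
while kill -0 "$backup_pid" 2>/dev/null; do
    df -h >> "$LOG"
    sleep 1
done
wait "$backup_pid"

# One final snapshot after the backup finishes (the loop may exit
# without writing anything if the backup is very fast).
df -h >> "$LOG"
```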
"Nicolas Sahlqvist" <nicco77 at gmail.com> wrote:
> First you have to figure out if the problem occurs in tar or gzip [snip]
> Finally you need to get a stack trace [snip]

and John Hinton <webmaster at ew3d.com> wrote:
> Sounds like your tmp directory isn't big enough to handle the creation
> of the tar file. [snip]

When I checked the server this morning it had a kernel panic (in
sendbackup). I got it cleaned up and running again and tried to recreate
the problem. This time my tar command worked just fine (Nicolas, I'm using
tar piped to gzip to mimic what amanda does). I tried it again and got a
seg fault from a file that had just been tarred and gzipped with no
problem. I fired it off again and got the seg fault from a different file.

At this point I'm thinking I have an intermittent hardware problem; it just
happened to seg fault twice on the same file when I tried it yesterday.
After several runs, there appear to be certain files I preferentially get
the seg fault on if I rerun the backup several times. Sometimes it works;
sometimes it seg faults, and when it does it's frequently on certain
specific files. Kind of weird.
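The repeated manual re-runs described above can be scripted to count how
often the pipeline fails. A sketch, assuming bash (for `set -o pipefail`)
and using a throwaway demo directory in place of /home:

```shell
# Re-run the same tar|gzip pipeline several times and record which
# runs crash. Intermittent failures on varying files point away from
# one bad file and toward flaky hardware.
set -o pipefail   # report tar's failure, not just gzip's

SRC=$(mktemp -d); echo test > "$SRC/file.txt"   # stand-in for /home

for i in 1 2 3; do
    if tar -cf - "$SRC" 2>/dev/null | gzip -c > "/tmp/run-$i.tar.gz"; then
        echo "run $i: ok"
    else
        echo "run $i: FAILED"
    fi
done
```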
Hopefully it's nothing worse than the CPU fan having ingested too much cat
hair (http://davenjudy.org/interests/pets/img011.jpeg.medium.jpeg), leaving
the CPU running hot.

Thanks for the help.

Cheers,
Dave

-- 
Politics, n. Strife of interests masquerading as a contest of principles.
                 -- Ambrose Bierce