Anas Alnaffar
2010-Jan-23 12:23 UTC
[CentOS] The directory that I am trying to clean up is huge
The directory that I am trying to clean up is huge. Every time I get this error msg:

-bash: /usr/bin/find: Argument list too long

Please advise

Anas
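As background for the thread below: "Argument list too long" comes from the kernel's cap on the total size of arguments passed to a single command, so it typically appears when the shell expands a glob over a huge directory before the command even runs. A minimal sketch (the `*.access*` pattern and file counts are illustrative, not from the original post):

```shell
# The kernel caps the combined size of argv + environment for one exec().
# ARG_MAX reports that limit in bytes:
getconf ARG_MAX

# With enough files, an unquoted glob expands past the limit and bash
# reports the error before the command is ever invoked, e.g.:
#   touch file{1..500000}.access   # illustrative: create many files
#   rm *.access                    # -> -bash: /bin/rm: Argument list too long
```

The fixes discussed later in the thread (piping `find` output to `xargs`, or quoting the pattern so `find` matches it internally) all work by keeping the expansion out of a single argument list.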
Marcelo M. Garcia
2010-Jan-23 12:34 UTC
[CentOS] The directory that I am trying to clean up is huge
Anas Alnaffar wrote:
> The directory that I am trying to clean up is huge. Every time I get this
> error msg:
>
> -bash: /usr/bin/find: Argument list too long
>
> Please advise
>
> Anas

Hi

Could you post the complete command? Please provide more details.

Regards

mg.
Robert Heller
2010-Jan-23 12:39 UTC
[CentOS] The directory that I am trying to clean up is huge
At Sat, 23 Jan 2010 15:23:58 +0300 CentOS mailing list <centos at centos.org> wrote:
> The directory that I am trying to clean up is huge. Every time I get this
> error msg:
>
> -bash: /usr/bin/find: Argument list too long
>
> Please advise
>
> Anas
>
> _______________________________________________
> CentOS mailing list
> CentOS at centos.org
> http://lists.centos.org/mailman/listinfo/centos

'man xargs'

find <mumble> -print | xargs rm

-- 
Robert Heller -- 978-544-6933
Deepwoods Software -- Download the Model Railroad System
http://www.deepsoft.com/ -- Binaries for Linux and MS-Windows
heller at deepsoft.com -- http://www.deepsoft.com/ModelRailroadSystem/
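A minimal sketch of the find-plus-xargs approach Robert suggests, assuming the goal is deleting `*.access*` files under the current directory (the directory and file names here are illustrative; the `-print0`/`-0` pair is a refinement that makes names with spaces safe):

```shell
# Sandbox with sample files (names are assumptions for illustration).
mkdir -p /tmp/xargs-demo && cd /tmp/xargs-demo
touch app.access.1 app.access.2 "with space.access" keep.log

# Stream matches to xargs instead of building one giant argument list;
# xargs batches them into as many rm invocations as ARG_MAX allows.
# -print0 / -0 delimit names with NUL so whitespace in names is safe.
find . -name '*.access*' -print0 | xargs -0 rm -f

ls    # only keep.log remains
```

Because `xargs` splits the input into kernel-sized batches itself, this never hits "Argument list too long" no matter how many files match.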
Kai Schaetzl
2010-Jan-23 12:49 UTC
[CentOS] The directory that I am trying to clean up is huge
http://www.google.com/search?as_epq=Argument+list+too+long

Kai

-- 
Get your web at Conactive Internet Services: http://www.conactive.com
Anas Alnaffar
2010-Jan-23 13:07 UTC
[CentOS] The directory that I am trying to clean up is huge
I tried to run this command:

find -name "*.access*" -mtime +2 -exec rm {} \;

and I have the same error message.

Anas

-----Original Message-----
From: centos-bounces at centos.org [mailto:centos-bounces at centos.org] On Behalf Of Marcelo M. Garcia
Sent: Saturday, January 23, 2010 3:34 PM
To: CentOS mailing list
Subject: Re: [CentOS] The directory that I am trying to clean up is huge

> Could you post the complete command? Please provide more details.
Kevin Krieser
2010-Jan-23 13:15 UTC
[CentOS] The directory that I am trying to clean up is huge
On Jan 23, 2010, at 7:07 AM, Anas Alnaffar wrote:
> I tried to run this command
>
> find -name "*.access*" -mtime +2 -exec rm {} \;
>
> and I have same error message
>
> Anas

There must have been more to it, since the command above is invalid. You need to specify where to start the find.
John Doe
2010-Jan-25 11:14 UTC
[CentOS] The directory that I am trying to clean up is huge
From: Anas Alnaffar <a.alnaffar at tijaritelecom.com>
> I tried to run this command
> find -name "*.access*" -mtime +2 -exec rm {} \;
> and I have same error message

How many "*.access*" are there...?

JD
Chan Chung Hang Christopher
2010-Jan-25 14:49 UTC
[CentOS] The directory that I am trying to clean up is huge
Anas Alnaffar wrote:
> I tried to run this command
>
> find -name "*.access*" -mtime +2 -exec rm {} \;

Should have been:

find ./ -name \*.access\* -mtime +2 -exec rm -f {} \;
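A runnable sketch of the corrected invocation (directory and file names are illustrative). Escaping the pattern, or equivalently quoting it, keeps the shell from expanding the glob before `find` runs; the GNU find alternatives in the comments avoid spawning one `rm` per file:

```shell
# Sandbox (assumption: names and the 3-day age are illustrative).
mkdir -p /tmp/find-exec-demo && cd /tmp/find-exec-demo
touch old.access.log recent.txt
touch -d '3 days ago' old.access.log   # back-date so -mtime +2 matches

# Quoted pattern: find does the matching internally, one file at a time,
# so the argument list to rm never grows.
find ./ -name '*.access*' -mtime +2 -exec rm -f {} \;

# GNU find equivalents that batch or skip the rm entirely:
#   find ./ -name '*.access*' -mtime +2 -exec rm -f {} +
#   find ./ -name '*.access*' -mtime +2 -delete
```

With `-exec ... \;` the command is run once per matched file, which is slow for huge trees but immune to the argument-list limit.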
James B. Byrne
2010-Jan-25 16:05 UTC
[CentOS] The directory that I am trying to clean up is huge
On Mon, January 25, 2010 10:31, Robert Nichols wrote:
> Now if the "{}" string appears more than once then the command line
> contains that path more than once, but it is essentially impossible
> to exceed the kernel's MAX_ARG_PAGES this way.
>
> The only issue with using "-exec command {} ;" for a huge number of
> files is one of performance. If there are 100,000 matched files,
> the command will be invoked 100,000 times.
>
> -- 
> Bob Nichols    RNichols42 at comcast.net

Since the OP reported that the command he used:

find -name "*.access*" -mtime +2 -exec rm {} \;

in fact failed, one may infer that more than performance is at issue. The OP's problem lies not with the -exec construction but with the unstated, but nonetheless present, './' of his find invocation. He therefore begins a recursive descent into that directory tree. Since neither the depth of that tree nor its contents is given to us, we may only infer that there must be some number of files therein which cause the MAX_ARG_PAGES limit to be exceeded before the recursion returns.

I deduce that he could provide the -prune option or the -maxdepth option to avoid this recursion instead. I have not tried either, but I understand that one, or both, should work.

-- 
*** E-Mail is NOT a SECURE channel ***
James B. Byrne                mailto:ByrneJB at Harte-Lyne.ca
Harte & Lyne Limited          http://www.harte-lyne.ca
9 Brockley Drive              vox: +1 905 561 1241
Hamilton, Ontario             fax: +1 905 561 0757
Canada  L8E 3C3
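A sketch of the depth-limiting idea Byrne mentions, under the assumption that the files to delete sit directly in the starting directory (names here are illustrative; `-maxdepth 1` confines matching to the starting directory and skips the recursive descent):

```shell
# Sandbox: one matching file at the top, one in a subdirectory.
mkdir -p /tmp/maxdepth-demo/sub && cd /tmp/maxdepth-demo
touch top.access sub/deep.access

# -maxdepth 1 stops find from descending into subdirectories,
# so only the top-level match is removed.
find . -maxdepth 1 -name '*.access*' -exec rm -f {} \;
```

Note that `-maxdepth` takes its number as a separate word in GNU find (no `=`), and it must precede tests like `-name` to avoid a warning.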