Hi all,

Sorry to post again about my 220 million files in 4 TB. I have already made a first rsync pass with the applications online. Now I want to make a final rsync pass with the applications offline, before changing disks.

The problem is: I cannot take an outage of more than 6 hours, and building the file list alone takes 36 hours. I know that only files less than 2 months old can have changed between the two rsync passes. Is there a special rsync option to build the file list from only files younger than 2 months?

For my part, I am going to try it with a Bash command, but I think I may hit a hard limit from ARG_MAX!

Please help me again. I am very sorry, but my client has a very particular production environment. Thanks a lot for everything you have done for me, and for your great work on rsync.

Thanks in advance,
Bye bye
On Fri 08 Feb 2008, Sylvain Gargasson wrote:
> Sorry to post again about my 220 million files in 4 TB.
> [...]
> Is there a special rsync option to build the file list from only files
> younger than 2 months?

Use the --files-from=filelist.txt option?

Paul Slootman
On Fri, 2008-02-08 at 12:41 +0100, Sylvain Gargasson wrote:
> Is there a special rsync option to build the file list from only files
> younger than 2 months?

Paul's suggestion to use --files-from is good. Use something like "find SRC -mtime -60" to generate the file list.

However you accomplish this copy, I suggest that you take this opportunity to move the files to an LVM volume. That way, in the future, you can take block-level snapshots and move the partition to another disk as necessary with no downtime.

Matt
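A minimal sketch of combining the two suggestions above. The source path, the destination host, and the 60-day window are placeholders for the poster's real setup; the demo builds a throwaway tree so the find step can be shown end to end. Note that the paths in a --files-from list are interpreted relative to the rsync source argument, so the list is generated with a `cd` into the source first. Because find streams its results into a file, there is no ARG_MAX concern.

```shell
# Build a throwaway source tree with one old and one recent file
# (GNU touch -d assumed; stands in for the real /data/src).
src=$(mktemp -d)
touch -d "90 days ago" "$src/old.txt"   # outside the 2-month window
touch "$src/new.txt"                    # changed recently

# Generate the list relative to the source dir; only files modified
# in the last 60 days are included.
( cd "$src" && find . -type f -mtime -60 ) > /tmp/filelist.txt
cat /tmp/filelist.txt

# Then transfer only the listed files (destination host is hypothetical):
# rsync -a --files-from=/tmp/filelist.txt "$src"/ backup:/data/dst/
```

Since --files-from implies --relative, the directory structure from the list is recreated under the destination, which is what you want for a top-up pass over an existing copy.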