Hello,

I have a problem with rsync and hard links.

I have one folder, P, with two subfolders, S1 and S2. S2 contains many hard links to files stored in S1:

  P:
    - S1
    - S2  (files in S2 are hard links to files in S1)

I would like to rsync the folder P to another computer, but the subfolders S1 (110 GB) and S2 (10 GB plus hard links to 100 GB of S1) contain many thousands of files, and when I try to rsync the folder P I get an out-of-memory error.

The command used is:

  rsync --recursive --hard-links -e ssh --stats --delete --links --perms --times

So I tried to rsync the subfolders S1 and S2 with two separate rsync commands (with the same arguments as above), but then the hard links between S2 and S1 are not preserved.

Is there a solution to keep the hard links between S2 and S1 when running two separate commands?

Thank you,
Limulezzz
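For reference, with a source and destination filled in (the host "remote" and the destination path are placeholders, not taken from the original message), the failing single-pass command would look something like:

  # Single pass over P with hard-link tracking; this is what runs out of
  # memory when S1 and S2 together hold many thousands of files.
  rsync --recursive --hard-links --links --perms --times --delete --stats \
      -e ssh P/ remote:P/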
At 09:50 28.09.2007 +0200, limule pika wrote:

> Hello,
>
> I have a problem with rsync and hard links.
>
> I have one folder, P, with two subfolders, S1 and S2. S2 contains many
> hard links to files stored in S1.
>
> I would like to rsync the folder P to another computer, but the subfolders
> S1 (110 GB) and S2 (10 GB plus hard links to 100 GB of S1) contain many
> thousands of files, and when I try to rsync the folder P I get an
> out-of-memory error.
>
> The command used is: rsync --recursive --hard-links -e ssh --stats
> --delete --links --perms --times
>
> So I tried to rsync the subfolders S1 and S2 with two separate rsync
> commands (with the same arguments as above), but then the hard links
> between S2 and S1 are not preserved.
>
> Is there a solution to keep the hard links between S2 and S1 when running
> two separate commands?

I don't know an answer to this, but if possible you could use an rsync built from CVS. The current development version uses an incremental file list (in remote mode both rsync binaries have to support this). This should save you from the memory problems, and you would be able to do the copy in one step.

bye Fabi
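For anyone trying Fabian's suggestion later: the incremental file list landed in what became rsync 3.0.0 (protocol version 30), and it is only used when both ends support it. A quick way to check, assuming the other machine is reachable as "remote" (a placeholder host name):

  # The first line of output shows the version and protocol version on each
  # side; both must report protocol 30 or later for incremental recursion.
  rsync --version | head -n 1
  ssh remote rsync --version | head -n 1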
Fabian's suggestion to use the CVS rsync with incremental recursion is good; that will be an improvement. However, rsync still has to remember all files in S1 that had multiple hard links in case they show up again in S2. If remembering the contents of even one of the directories makes rsync run out of memory, you'll have to do something different.

On 9/28/07, limule pika <limulezzz@gmail.com> wrote:
> Is there a solution to keep the hard links between S2 and S1 when running
> two separate commands?

Not in the general case, but if the hard links are between corresponding files (e.g., S1/path/to/X and S2/path/to/X, as is often the case in incremental backups), you can simply use --link-dest on the second run, like this:

  rsync <options> P/S1/ remote:P/S1/
  rsync <options> --link-dest=../S1/ P/S2/ remote:P/S2/

(Note the ../S1/, because basis directory paths are interpreted relative to the destination directory.)

If you do this and use the incremental recursion mode, rsync will remember only up to a few thousand files at a time and won't run out of memory.

You can even do the copy in a single pass if you like: create a directory "P/basis" containing a symlink "S2" -> "../S1", and then run something like:

  rsync <options> --link-dest=basis/ P/ remote:P/

Matt
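For concreteness, a minimal sketch of Matt's two-pass approach, re-using the option set from the original command; "remote" is a placeholder host, and --hard-links is dropped here on the assumption that the only hard links are the S2-to-S1 ones that --link-dest now takes care of:

  # Pass 1: copy S1 normally.
  rsync --recursive --links --perms --times --delete --stats -e ssh \
      P/S1/ remote:P/S1/

  # Pass 2: copy S2. Files identical to their S1 counterparts are hard-linked
  # against the S1 already on the destination instead of being re-sent.
  # --link-dest paths are resolved relative to the destination directory,
  # hence ../S1/ rather than P/S1/.
  rsync --recursive --links --perms --times --delete --stats -e ssh \
      --link-dest=../S1/ P/S2/ remote:P/S2/

Note that --link-dest only hard-links files whose size, timestamp and (with --perms) permissions match the basis copy; anything that differs is transferred as a regular file.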