Hi,

I have been using rsync for many years and never had any kind of problem.
Lately I am running out of RAM trying to do an incremental backup to a box
that only has 2G of RAM. The entire directory structure I'm mirroring is
about 200G of files. A minority of subdirectories have many files.

Is there a way to do an incremental backup with the --delete option that
does not use as much memory? Is there a way to tell rsync to use a tempfile
instead of RAM for keeping track of whatever it does?

And would it be useful to add ignores for the subdirectories I know have
many files and back them up separately? Is --delete safe to use in this
case, as in does --delete with --ignore somedir/ not delete files in other
target dirs that are not in the ignore path?

Thanks,

/jl
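For reference, the kind of mirror run described above would look roughly
like the following sketch; the source path, destination path, and host name
are assumptions, not taken from the thread:

    # Hypothetical mirror run: push the ~200G tree to the 2G-RAM box,
    # pruning files on the receiver that no longer exist on the source
    rsync -a --delete /data/ backupbox:/backup/data/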
On Fri 25 Mar 2016, John Long wrote:

> I have been using rsync for many years and never had any kind of problem.
> Lately I am running out of RAM trying to do an incremental backup to a box
> that only has 2G of RAM. The entire directory structure I'm mirroring is
> about 200G of files. A minority of subdirectories have many files.
>
> Is there a way to do an incremental backup with the --delete option that
> does not use as much memory? Is there a way to tell rsync to use a
> tempfile instead of RAM for keeping track of whatever it does?

No to the last question; you could consider adding (more) swap space to the
system, which is effectively like using a tempfile.

> And would it be useful to add ignores for the subdirectories I know have
> many files and back them up separately? Is --delete safe to use in this
> case, as in does --delete with --ignore somedir/ not delete files in other
> target dirs that are not in the ignore path?

There's no --ignore; you probably mean --exclude. I don't really understand
what you're asking in your last question... Why should --exclude somedir/
affect what --delete does elsewhere? --delete will still delete stuff
elsewhere if necessary.

Also look at the descriptions of --delete and --delete-excluded. If you have
any questions about what's in the manpage then feel free to ask them here,
but for now I get the impression you haven't spent much time reading the
manpage.

Paul
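One way to follow the swap suggestion above, assuming the receiving box runs
Linux (on other systems the mechanism differs); the /swapfile path and the
2G size are assumptions for illustration:

    # Create and enable a 2G swap file on the receiving box
    dd if=/dev/zero of=/swapfile bs=1M count=2048
    chmod 600 /swapfile
    mkswap /swapfile
    swapon /swapfile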
On Fri, Mar 25, 2016 at 09:54:14AM +0000, John Long wrote:

> Hi,
>
> I have been using rsync for many years and never had any kind of problem.
> Lately I am running out of RAM trying to do an incremental backup to a box
> that only has 2G of RAM. The entire directory structure I'm mirroring is
> about 200G of files. A minority of subdirectories have many files.
>
> Is there a way to do an incremental backup with the --delete option that
> does not use as much memory? Is there a way to tell rsync to use a
> tempfile instead of RAM for keeping track of whatever it does?
>
> And would it be useful to add ignores for the subdirectories I know have
> many files and back them up separately? Is --delete safe to use in this
> case, as in does --delete with --ignore somedir/ not delete files in other
> target dirs that are not in the ignore path?

I didn't phrase this part very well. Is --delete safe to use with --ignore,
meaning will rsync avoid deleting files in the ignore path on the target
side? I think the answer is probably yes, but since I'm crashing the target
box with --delete I don't want to have to try this too many times.

Really I'm looking for a workaround to the high memory consumption so I can
sync up the file trees without exceeding the small RAM capacity of the
target box.

Any suggestions appreciated.

Thanks,

/jl
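To illustrate the distinction the manpage draws here: with --delete, a
pattern given via --exclude also protects matching files on the receiving
side from deletion, and --delete-excluded removes that protection. A sketch
with hypothetical paths and directory names:

    # bigdir/ is neither transferred nor deleted on the receiver
    rsync -a --delete --exclude='bigdir/' /data/ backupbox:/backup/data/

    # same, but bigdir/ is also removed from the receiver
    rsync -a --delete --delete-excluded --exclude='bigdir/' \
        /data/ backupbox:/backup/data/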
If you were using --link-dest to make multiple backups you wouldn't need
--delete, because the target is always a new empty directory (with
--link-dest pointing to the previous backup run). So you get the benefit of
having multiple backups to restore from, and rsync doesn't have to --delete.
When you run low on space you just rm -rf some old backups (takes a while
but doesn't need much RAM).

On 03/25/2016 05:54 AM, John Long wrote:
> Hi,
>
> I have been using rsync for many years and never had any kind of problem.
> Lately I am running out of RAM trying to do an incremental backup to a box
> that only has 2G of RAM. The entire directory structure I'm mirroring is
> about 200G of files. A minority of subdirectories have many files.
>
> Is there a way to do an incremental backup with the --delete option that
> does not use as much memory? Is there a way to tell rsync to use a
> tempfile instead of RAM for keeping track of whatever it does?
>
> And would it be useful to add ignores for the subdirectories I know have
> many files and back them up separately? Is --delete safe to use in this
> case, as in does --delete with --ignore somedir/ not delete files in other
> target dirs that are not in the ignore path?
>
> Thanks,
>
> /jl

-- 
Kevin Korb                          Phone:    (407) 252-6853
Systems Administrator               Internet:
FutureQuest, Inc.                   Kevin at FutureQuest.net  (work)
Orlando, Florida                    kmk at sanitarium.net (personal)
Web page:                           http://www.sanitarium.net/
PGP public key available on web site.
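A sketch of the --link-dest scheme described above, using hypothetical
dated directories and host names:

    # Each run goes into a fresh directory; files unchanged since the
    # previous run are hard-linked to it, so they take no extra space
    rsync -a --link-dest=/backups/2016-03-24/ /data/ backupbox:/backups/2016-03-25/

    # Pruning old backups needs no --delete; just remove whole runs
    ssh backupbox rm -rf /backups/2016-03-01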
Hi,

On Fri, Mar 25, 2016 at 11:16:47AM -0400, Kevin Korb wrote:

> If you were using --link-dest to make multiple backups you wouldn't need
> --delete, because the target is always a new empty directory (with
> --link-dest pointing to the previous backup run).

The source is around 200G and the target box only has 500G total, and some
of it is used for other data. What I want to do is mirror the source on the
target and be able to prune from the target the files that get deleted from
the source. I don't have enough space to back up the whole thing that way,
and it would be very time consuming anyway over a 100Mb link, which is why I
was using --delete. For a long time it was ok, but now I don't have enough
RAM.

There is one giant directory that is probably problematic because it has a
huge number of files. I suspect this is the one that's causing me problems,
but it is relatively static. I suppose it could be backed up and cleaned up
separately.

Is there any way to reduce RAM consumption on the target box while still
getting the benefit of the --delete function? I am thinking of trying to
back up everything but the gigantic directory with a large number of files,
and then backing up only that directory. Is this a reasonable strategy?

I just couldn't understand whether --delete with --exclude would delete
files from the target outside the --exclude path. I guess the answer is no,
but it would be a very time consuming mistake. I'm trying to make sure
before I try it.

Thanks for your help, and I'm sorry for my poorly worded post(s).

/jl
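The two-pass approach described above might look like the following; the
hugedir name and all paths are hypothetical. Pass 1's --delete cannot touch
the excluded path, and pass 2's --delete only operates within the huge
directory itself, so each run holds a smaller file list in memory. If both
ends run rsync 3.x, incremental recursion should also keep the in-memory
list smaller, unless an option such as --delete-before forces the full list
to be built up front.

    # Pass 1: everything except the huge directory; the exclude also
    # protects that directory on the receiver from --delete
    rsync -a --delete --exclude='/hugedir/' /data/ backupbox:/backup/data/

    # Pass 2: the huge directory on its own, in a separate run
    rsync -a --delete /data/hugedir/ backupbox:/backup/data/hugedir/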