Matthias Schniedermeyer
2007-Jul-29 07:24 UTC
Can Rsync handle large exclude lists without slowdown?
Hi

Let's say I wanted to exclude 100,000 files by naming them one by one in a file to be used by --exclude-from. Can rsync cope with that without bigger problems?

I'm currently thinking about how I could make backing up my computers more efficient. If I exclude every single file that I can reproduce another way, the number of files I need to back up would shrink by a large amount.

To be more precise: my distribution is Debian SID, which uses packages in .deb format. So if I keep all the .deb files and make a list of the files provided by each .deb package, I only need to back up the .deb files instead of the uncompressed files. And since I have several computers, I can save even more, because I would only need a single copy of the .deb files. (A rough sketch of what I mean follows below my signature.)

So, can I go forward with my idea, or does rsync stand in the way of my happiness? ;-)

Until then
--
Real Programmers consider "what you see is what you get" to be just as bad a concept in Text Editors as it is in women. No, the Real Programmer wants a "you asked for it, you got it" text editor -- complicated, cryptic, powerful, unforgiving, dangerous.
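The sketch, assuming a Debian system where dpkg-query is available; /tmp/pkg-files and /backup/ are placeholders, and this is only a rough outline, not a tested recipe:

    # Build an exclude list of every regular file owned by an installed
    # package. dpkg-query -L also prints directories; those must be
    # skipped, or rsync would exclude everything underneath them,
    # including files no package owns.
    dpkg-query -W -f='${Package}\n' \
        | xargs dpkg-query -L \
        | sort -u \
        | while IFS= read -r p; do [ -f "$p" ] && printf '%s\n' "$p"; done \
        > /tmp/pkg-files

    # The listed paths are absolute, so they anchor at the transfer
    # root when the backup is taken from /.
    rsync -a --exclude-from=/tmp/pkg-files / /backup/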
Matt McCutchen
2007-Jul-29 14:56 UTC
Can Rsync handle large exclude lists without slowdown?
On 7/29/07, Matthias Schniedermeyer <ms@citd.de> wrote:
> Let's say I wanted to exclude 100,000 files by naming them one by one in
> a file to be used by --exclude-from. Can rsync cope with that without
> bigger problems?

Rsync would work correctly, but file-list building would probably be slow, because rsync would check each file against all 100,000 exclude patterns. Instead, you might consider (1) using per-directory filter files or (2) preparing a list of the files you do want backed up and feeding it to rsync with --files-from.

To prepare the list for #2, you could use "find" to list all the files on the system and "comm" to remove the excluded files from that list; a sketch follows below.

Matt
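A rough sketch of #2, assuming the backup is taken from / and that the exclude list has been rewritten into the same "./path" form and sort order that find produces; /tmp/excluded-files and /backup/ are placeholders:

    # 1. List every file on the system, relative to the transfer root,
    #    sorted so that comm can compare the lists line by line.
    cd / && find . -type f | sort > /tmp/all-files

    # 2. comm -23 prints only the lines unique to the first list,
    #    i.e. it drops every file that appears in the exclude list.
    comm -23 /tmp/all-files /tmp/excluded-files > /tmp/backup-files

    # 3. Hand the remaining names to rsync; --files-from paths are
    #    interpreted relative to the source directory (here /).
    rsync -a --files-from=/tmp/backup-files / /backup/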