Gidday Jose...
With the amount of data you are talking about, I would suggest you use
'--update', '--times' and '--delete'. '--archive' appears not to
update but to overwrite,
so you end up transferring the entire data set over and over. I get the
impression that '--backup' will force a rewrite as well. You may also want
to use '--compress' if the LAN connection is slowish. Do some tests; it
can be very useful if you have large database files (lots of white space).
I also have a path-too-long problem. Since the same problem happens to
general Windoze users, I tend to use the same share names as them, further
down the directory tree. This is also an administrative function:
ensuring that users don't exceed the path length limit, as exceeding it
can cause all sorts of lost-data problems.
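As an administrative check, something like the sketch below can hunt for over-long paths before they bite. The 240-character threshold and the scanned root are example values I've picked for illustration, not anything rsync itself mandates:

```shell
# A quick check for over-long paths (sketch only: the 240-character
# threshold and the scanned root are example values).
check_paths() {
    root=$1 limit=$2
    # Print "length path" for every path under $root longer than $limit.
    find "$root" | awk -v lim="$limit" 'length($0) > lim { print length($0), $0 }'
}

# Demo on a throwaway tree containing one deliberately deep path:
DEMO=$(mktemp -d)
mkdir -p "$DEMO/$(printf 'x%.0s' $(seq 1 120))/$(printf 'y%.0s' $(seq 1 120))"
check_paths "$DEMO" 240
```

Run it against the share root periodically and chase up the owners of anything it reports.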
So my thoughts:
--recursive --update --delete --ignore-errors --compress --times --stats
--force --temp-dir=/cygdrive/c/temp (along with --links, --owner,
--devices etc. as needed)
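Assembled into one command line, that might look like the sketch below. The source share and the destination host/path are placeholders I've made up, not your actual layout:

```shell
# Sketch only: /cygdrive/d/share and backupserver:/backups/current
# are placeholder paths, to be replaced with the real source and target.
rsync --recursive --update --delete --ignore-errors \
      --compress --times --stats --force \
      --temp-dir=/cygdrive/c/temp \
      /cygdrive/d/share/ backupserver:/backups/current/
```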
Of course this may not satisfy your data backup/integrity requirements.
One issue I have is that I'd like to preserve the last week's worth of
deleted data on my backup server, "just in case" someone needs it.
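For what it's worth, one way I could get that is a sketch along these lines, assuming rsync's --backup-dir option and a dated directory layout (the /backups paths and host are hypothetical): divert files rsync would delete into a per-day directory, then prune anything older than a week.

```shell
# Sketch: divert deleted/overwritten files into a dated directory,
# e.g. (paths and host are hypothetical):
#   rsync -av --delete --backup --backup-dir=/backups/deleted/$(date +%F) \
#         /cygdrive/d/share/ backupserver:/backups/current/
#
# The pruning half is plain shell on the backup server, demonstrated
# here on a throwaway directory standing in for /backups/deleted:
BASE=$(mktemp -d)
mkdir -p "$BASE/old-snapshot" "$BASE/recent-snapshot"
touch -d '10 days ago' "$BASE/old-snapshot"   # simulate a week-old run
# Remove dated snapshot dirs whose mtime is more than 7 days old:
find "$BASE" -mindepth 1 -maxdepth 1 -type d -mtime +7 -exec rm -rf {} +
ls "$BASE"
```

The prune step would run daily from cron on the backup server.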
Cheers Bob (Oz)
José Luis Poza wrote:
> Hi all,
>
> I have to make a backup of a file server based on Windows 2000 Server
> with the following considerations:
>
> - NTFS 5.0 File System
>
> - Long directory and file names. (I currently get the error "File name
> too long (91)")
>
> - Approximately 76 GB
>
> - Files in use by system and users
>
> - A mixed network based on Unix, Linux and NT, with a Windows
> machine on the client side (which launches the rsync command via cwrsync)
>
>
> My current parameter configuration:
>
> - "-av --stats --backup --force --ignore-errors
> --temp-dir=/cygdrive/c/temp --timeout=5400";
>
>
> My problems.
>
> - Too many hours per operation (more than 15 hours)
>
> - Problems with file name length: "File name too long
> (91)"
>
> I would like to obtain a more optimal configuration; any suggestions?
>
> Thank you very much!!
>
> José Luis Poza
>