Hi,

I have set up a cron job to rsync files between two servers every hour:

30 * * * * rsync -avz --delete rsyncsrv1::ftp/ /export/ftp/

This works fine most of the time, but if I have a very large file that
needs to be transferred (many tens of MBs), I run into a problem. Since the
link between my two servers is slow, it can take more than an hour to
complete the rsync transfer. So at the end of the hour, when the next rsync
job is started by cron, the big file transfer gets aborted and a new
transfer is started. Because of this, the big file transfer never gets
completed.

I have tried to increase the time between successive cron jobs, but that
is only a temporary fix until I run into an even bigger file which causes
the new settings to fail. Is there a way I can control this behaviour and
avoid this looping?

Thanks & regards,
--
Derric Lewis
CAD/System Administrator
Virtual IP Group, Inc.
How about scheduling a script with some logic built into it that first
checks whether a previous rsync is still running? If so, it backs off; if
not, it starts the sync.
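One way to sketch that back-off logic is with flock(1) from util-linux
instead of a hand-rolled flag file: the kernel releases the lock when the
process exits, so a crash can never leave a stale lock behind. The script
path and lock file name below are just illustrative; the rsync command is
the one from the original crontab:

```shell
#!/bin/sh
# Write a hypothetical wrapper script that cron would invoke instead of
# calling rsync directly. flock -n exits non-zero immediately if another
# copy already holds the lock, so an hourly run that overlaps a long
# transfer simply skips its turn and the transfer continues undisturbed.
cat > /tmp/rsync-hourly.sh <<'EOF'
#!/bin/sh
exec flock -n /var/tmp/rsync.lock \
    rsync -avz --delete rsyncsrv1::ftp/ /export/ftp/
EOF
chmod +x /tmp/rsync-hourly.sh
```

The crontab entry would then point at the wrapper (e.g.
"30 * * * * /tmp/rsync-hourly.sh") rather than at rsync itself.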
Rgds,
Bart Coninckx
Network Administrator
CNE, ASE
*************************************
Watco ICT Services
Lilsedijk 19
B-2340 Beerse
Belgium
e-mail: bart.coninckx@sita.be
Tel: + 32 (0) 14 60 99 42
Fax: + 32 (0) 14 62 41 47
*************************************
On 10/24/2002 21:23, CAD/SysAdmin Manager <Cad_Manager@vipg.com> wrote to
rsync@lists.samba.org (via rsync-admin@lists.samba.org), subject "Howto to
control rsync through cron":
Try wrapping your rsync in a script that has some sort of lock file
mechanism:

#!/bin/sh
LOCKFILE=/var/tmp/rsync_is_running
[ -f "$LOCKFILE" ] && exit 0
touch "$LOCKFILE"
trap 'rm -f "$LOCKFILE"' EXIT
rsync -avz --delete rsyncsrv1::ftp/ /export/ftp/

or something like that. (Note the lock file has to be removed when the
transfer finishes, otherwise every later run will exit immediately.)
Hi Thomas, Bart,

Thanks for the suggestion. I have written a little wrapper script that
checks the flag (/var/tmp/rsync_is_running) before starting a new job, and
it seems to be working just fine now.

Thanks & best regards,
--
Derric Lewis
CAD/System Administrator
Virtual IP Group, Inc.