Hi, I think this is related to Petter Osterlund's post...

I have two Cygwin machines: a Win98 box (400 MHz, on a cable modem) and a Win2K box (1.4 GHz, attached to a token ring network) that I am trying to sync with rsync over the Internet.

I got things working for a small directory (150 files, ~5 MB). I was using the -azvPu flags with --delete, --modify-window=1, and -e ssh; the ssh server was on the Win2K machine. I then tried it with a large directory (~300 MB, >1500 files). It ran for an hour with high CPU on the Win98 side (I couldn't tell what it did to the Win2K side), then it reported some sort of bad I/O error (I think it was error 11) and died.

I figured maybe ssh was slowing things down, so I set up an rsync daemon (with user authorization and host identification -- BTW, nice job with the tcp_wrapper-like implementation) on my local machine (the Win98) and used the :: syntax. Again it worked with a small directory, and then failed the same way on the larger one. I switched to just the -a flag (since I thought the -z option was eating CPU), but it crashed again on my last try before I gave up last night.

Any suggestions on how to diagnose this, or what might be going on? One thought I had was to try the --bwlimit option. Has anyone else done this sort of thing (large-ish directories over the Internet between two Cygwin machines using rsync)?

For reference, the commands I was running looked roughly like the following (hostnames, paths, and the module name are placeholders, not my real ones):

Regards,
Arthur
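The ssh-transport run (client on the Win98 box, sshd on the Win2K box):

    rsync -azvPu --delete --modify-window=1 -e ssh /cygdrive/c/data/ user@win2k-host:/cygdrive/c/data/

The daemon run (client on the Win2K box, rsync daemon on the Win98 box; "data" is a made-up module name):

    rsync -avPu --delete --modify-window=1 win98-host::data /cygdrive/c/data/

And the throttled variant I was thinking of trying, e.g. capped at 50 KB/s:

    rsync -avPu --delete --modify-window=1 --bwlimit=50 win98-host::data /cygdrive/c/data/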
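And the daemon config on the Win98 side was roughly like this (again, the module name, user, and path are placeholders):

    [data]
        path = /cygdrive/c/data
        auth users = arthur
        secrets file = /etc/rsyncd.secrets
        hosts allow = win2k-host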