similar to: rsync exclude files based on filesize

Displaying 20 results from an estimated 4000 matches similar to: "rsync exclude files based on filesize"

2010 Jul 30
2
rsync mirror solution: how to prevent accidental mirror deletion
I had a recent disaster scenario with rsync. I was wondering if there were any suggestions to guard against it in the future: I used to maintain "mirror" backups of the /home dir on our production_server using rsync to a backup_server. The primary server had an rsyncd daemon running and the backup_server had this line in the crontab: 10 01 * * * rsync -av --delete root at
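One possible safeguard (a sketch of my own, not from the thread; module name and paths are hypothetical) is to cap how many files a single run may delete and divert deletions into a dated backup directory instead of discarding them:

    # Crontab sketch: --max-delete stops further deletions past the limit,
    # and --backup/--backup-dir keeps anything deleted under a per-day folder.
    # (% must be escaped as \% inside crontab entries.)
    10 01 * * * rsync -av --delete --max-delete=100 \
        --backup --backup-dir=/backup/deleted/$(date +\%F) \
        root@production_server::home /home_mirror

If the limit is hit, rsync stops deleting and exits non-zero, so a wrapper script can flag the run for review.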
2009 Jun 10
2
rsync excluded file syntax errors
I cannot figure out where I am going wrong with my excluded-files syntax! rsync backs up the very folders I'm trying to exclude. rsync -av --exclude-from=/etc/rsync_excluded.conf --delete root@polaris::polhome /pol_home_bkup cat /etc/rsync_excluded.conf - /home/agokhale - /home/anand - /home/asalazar etc. These are all top-level folders with the same names. On the rsync server
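The excerpt does not show the resolution; one common cause (an assumption on my part) is that exclude patterns are matched against paths relative to the transfer root, so if the polhome module already points at /home, the leading /home/ in the patterns never matches anything. A minimal sketch of that reading:

    # Assumes the polhome module's path is /home, so entries inside the
    # transfer are seen as /agokhale, /anand, ... rather than /home/agokhale.
    # /etc/rsync_excluded.conf
    - /agokhale
    - /anand
    - /asalazar

    rsync -av --exclude-from=/etc/rsync_excluded.conf --delete \
        root@polaris::polhome /pol_home_bkup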
2009 Jun 20
2
which server to make client and which the server for rsync.
I am using rsync to keep a mirror of my 800GB /home (server1). The backup machine is a separate server (server2). Currently I am running the rsync daemon on server2 and invoking rsync daily via cron on server2. Are there design / performance considerations that influence which machine is made the server and which the client? Also, is the rsync daemon the preferred way to do this backup? I can also
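For reference, a sketch of the two usual layouts (hostnames and paths are my assumption, not from the thread): let the backup host initiate the transfer, either against an rsync daemon on the data source or plainly over ssh:

    # On server2 (the backup host), in cron:
    0 2 * * * rsync -a --delete server1::home /backup/home                 # pull from rsyncd on server1
    # or, over ssh instead of a daemon:
    0 2 * * * rsync -a --delete -e ssh root@server1:/home/ /backup/home/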
2010 Jul 27
3
Getting rsync to store timing information in its logs
Is there a way to know from the rsync logs how long it took to do a backup? The only timing info I see is this at the end: sent 3067328 bytes received 7853035429 bytes 1187888.83 bytes/sec total size is 1559866450336 speedup is 198.55 Can I use it to figure out how long the operation took? Does the above mean it took 2.5 secs of send time and 1.8 hours of receive time so (roughly) the
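As a rough check (my arithmetic, not from the thread), the reported rate is approximately total bytes divided by elapsed time, so the overall duration can be recovered from the summary; for explicit timings, bracketing the run with date stamps is one simple option:

    # (3067328 + 7853035429) bytes / 1187888.83 bytes/sec  ~= 6613 s  ~= 1 h 50 min
    # Hypothetical crontab entry that records start and end times around the run:
    10 01 * * * ( date; rsync -av --delete src/ dst/; date ) >> /var/log/rsync-backup.log 2>&1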
2009 Jun 04
1
rsync --daemon. Can I open more than one instances?
Is there a way to speed up rsync by opening more than one daemon in parallel? I use rsync --daemon to start rsync. I was wondering if opening more than one instance is recommended or feasible. I know that for services like nfs, for example, I have seen opening many instances improve performance. I have 4 cores available, so if there are any other parallelization modes I'd be glad to know since I
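A single rsync daemon already accepts multiple simultaneous connections, so parallelism is usually obtained on the client side by splitting the tree into separate transfers. A sketch under that assumption (module and paths are hypothetical):

    # Run several client transfers against the same daemon, one per subtree.
    rsync -a server::module/dir1/ /backup/dir1/ &
    rsync -a server::module/dir2/ /backup/dir2/ &
    rsync -a server::module/dir3/ /backup/dir3/ &
    rsync -a server::module/dir4/ /backup/dir4/ &
    wait    # block until all four background transfers finish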
2017 Mar 03
2
How do you exclude a directory that is a symlink?
The directory I'm trying to copy from is: /home/blah/dir The symlink is /home/blah/dir/unwanted_symlinked_dir On Fri, Mar 3, 2017 at 8:10 AM, Paul Slootman <paul+rsync at wurtel.net> wrote: > On Fri 03 Mar 2017, Steve Dondley wrote: > > I'm trying to rsync a directory from a server to my local machine that > > has a symbolic link to a directory I don't
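Using the paths from the message, a sketch (remote host and destination are hypothetical) that anchors the exclude to the transfer root, which matches the entry whether it is a plain directory or a symlink:

    # The source ends in a slash, so the transfer root is the contents of .../dir
    # and the symlink appears as /unwanted_symlinked_dir inside the transfer.
    rsync -av --exclude='/unwanted_symlinked_dir' \
        user@server:/home/blah/dir/ ./dir/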
2004 Aug 04
2
refresh filesize of samba shares with win2k
Hi, these days I'm trying Samba because I would like to replace an NT file server. I found this problem: when I'm connected to the Samba server from a Win2k client, I can't see the latest state of my shared directory on the Samba server. Example: if I create a new file named foo.txt, I can see it with filesize = 0 bytes; then I edit foo.txt and add some characters into the
2007 May 12
3
flac filesize limitation
Hi, is there a filesize limitation for FLAC files because of the encoder or decoder for some reason?
2005 Apr 12
7
Max filesize for rsync?
What is the maximum filesize rsync can transfer? I'm trying to rsync one of my servers to another, but the rsync is croaking on a file that's barely 1GB. Tips, hints, suggestions? rsync server is AIX 4.3.3 ML11 - rsync 2.6.3 rsync client is AIX 5.3 ML1 - rsync 2.6.4 Thanks -Jeff -- Jeff Schoby Unix/Network Admin City of Columbia, Missouri 573.874.6320
2011 Jun 06
3
rsync and many files
Hello everyone, I have a question about using rsync with many files. We are using rsync via rsnapshot, but that is not essential here. It is used to back up many servers (above 100) and works very well. Now there is one server with many (several million) files. The files are not very big, so the complete backup is about 500 GB. Now my problem is that the backup needs about 14 hours - the most time
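Two mitigations often suggested for huge file counts (my sketch, not from the thread; paths are hypothetical): run rsync 3.x on both ends so the file list is built incrementally during the transfer, and split the job into parallel transfers over independent subtrees:

    # One transfer per top-level directory, run concurrently.
    for d in /srv/data/*/ ; do
        rsync -a "$d" "backup::data/$(basename "$d")/" &
    done
    wait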
2007 May 13
2
flac filesize limitation
On May 13, 2007, at 05:45, Harry Sack wrote: > If I encode 192 kHz sound @ 24 bit for some days (WAV file) and I > encode it to FLAC, I think you can have a very big file and 1.5 TB > is reached very quickly. > And in the future audio will even get bigger, when used for HD-DVD > and Blu-ray media and 5.1 channels is considered the 'minimum' > setting for surround
2006 Jan 16
1
Max Filesize MySQL CentOS 3
Anybody offhand know the maximum filesize for a MySQL table? CentOS 3.6 Linux native ext3 filesystem MySQL version 3.23.58 TIA John Hinton
2003 Jun 11
1
rsync limit to filesize
Hi, I am trying to rsync about 460 GB of data from one server to another server using rsync 2.5.6. I started the transfer at about 11:45am, and it has been about 4 1/2 hrs since then and it is still building the file list. Here's the command I run: `rsync -avW --numeric-ids --delete --exclude-from=/tmp/EXCLUDE 192.168.0.75::vgroup00/* /vgroup00/`; I am trying to sync from server A
2013 Aug 21
1
FileSize changing in GlusterNodes
Hi, When I upload files into the gluster volume, it replicates all the files to both gluster nodes. But the file size slightly varies by (4-10KB), which changes the md5sum of the file. Command to check file size: du -k *. I'm using GlusterFS 3.3.1 with CentOS 6.4. This is creating inconsistency between the files on both the bricks. What is the reason for this changed file size and how can
2013 Mar 11
1
do not update dirtimes on --include='*/' --exclude='*'
Hello, I'm trying to do something that did not sound difficult, but no option I've tried seems to be working. I apologize in advance if I'm missing something obvious. I need to sync only directories from one tree to a similar, but older tree *without* updating the modtimes of directories that already exist in the destination. Or phrased the other way, I want to entirely skip
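One option that seems made for this (assuming an rsync new enough to have it) is --omit-dir-times / -O, which syncs directories but does not set or refresh their modification times:

    # Copy only the directory structure; existing destination dir mtimes are left alone.
    rsync -a -O --include='*/' --exclude='*' src/ dst/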
2003 Sep 14
1
How to calculate exact bitrate/filesize w/ Vorbis? Plz help
Hi, I'm quite familiar w/ mp3 cbr/abr/vbr encoding, as well as mpeg4 (cbr/vbr,etc). And I can always calc the bit rate for a given file size with: file size * 8000 / length in seconds = kbits/sec Works great w/ mpeg4 + mp3. BUT FOR THE LIFE OF ME: I cannot get oggenc (1.0x version) to give me the file size I want. I calc. it with the above formula, and nothing comes out right. Then I do
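By default oggenc encodes in quality (VBR) mode, so a size predicted from a fixed bitrate will not line up. A sketch (assuming a vorbis-tools release with bitrate management; filenames are hypothetical) that pins the average much closer to the formula above:

    # Target ~128 kbit/s average; size_bytes ~= 128 * 1000 / 8 * seconds.
    oggenc --managed -b 128 -m 128 -M 128 input.wav -o output.ogg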
2012 Jul 31
1
[LLVMdev] ARM JIT support status?
Hi Rahul, I believe that ARM support is working in the MCJIT engine (as of llvm 3.1). If it wasn't working in the legacy JIT engine 10 months ago then it probably still isn't. -Andy
2005 May 19
2
Bug#305932: rsync on a directory transfers the files of this directory
Hi, I got the following report from a Debian user, about --files-from transferring the contents of a dir (i.e. including the files in it) specified in the input, even though the files aren't listed in the input. This happens only when the dir name ends with a slash. I asked him to cook up a script to reproduce this (as it wasn't quite clear to me at first what happened exactly). Any
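A minimal script to reproduce what the report describes (hypothetical paths; it demonstrates the reported behaviour, not a claim about which way is correct):

    mkdir -p src/dir && touch src/dir/file
    printf 'dir/\n' > list.slash       # directory named WITH a trailing slash
    printf 'dir\n'  > list.noslash     # same directory, no trailing slash
    rsync -a --files-from=list.slash   src/ dst-slash/     # reportedly copies dir/file too
    rsync -a --files-from=list.noslash src/ dst-noslash/   # copies only the directory entry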
2012 Jul 26
2
Passing arguments to SQL Query in R
Hello all, I am a newbie at R, with some experience in Perl. I have a database table that contains the following data:

Name      | Score
========= | =====
Sachin T  | 25
Sachin T  | 53
Sachin T  | 57
Sachin T  | 34
Rahul D   | 38
Rahul D   | 31
Rahul D   | 53
Ricky P   | 7
Ricky P   | 45
Ricky P   | 27
Ricky P   | 17
Ricky P   | 86
Ricky P   | 48
Jacques K | 23
Jacques K | 86
Jacques K | 32

I
2008 Oct 07
1
FW: Reading Data
Rahul Agarwal, Analyst, Equities Quantitative Research, UBS_ISC, Hyderabad, On Net: 19 533 6363. Hi, let me explain the problem: we have a database which is in this format:

Stocks  30-Jan-08  28-Feb-08  31-Mar-08  30-Apr-08
a       1.00       3.00       7.00       3.00
b       2.00       4.00       4.00       7.00
c       3.00       8.00       655.00     3.00
d       4.00       23.00      4.00       5.00
e       5.00       78.00      6.00       5.00

and we have a query