similar to: rsync incremental backup

Displaying 20 results from an estimated 4000 matches similar to: "rsync incremental backup"

2008 Apr 22
2
using rsync with scripts (cronjobs) and automated backups
Hello all, I am wondering if it would be possible to write a script or a cron job in Linux using rsync to run an automated backup of a server, or several servers if possible. I am very new to writing scripts and such, so any help or suggestions on how to get started would be great! Thanks ahead of time for the help! ----- Computers are like air conditioners. They both don't work if you
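A minimal pattern for this, as a sketch only (the script name, host names, and paths are placeholders, not from the thread), is an rsync-over-SSH script driven by cron:

    #!/bin/bash
    # backup.sh - nightly push of several directories to a backup host.
    # -R keeps the full source paths under DEST (dest/etc, dest/var/www, ...).
    DEST="backupuser@backuphost:/backups/$(hostname)/"
    rsync -aR --delete /etc /home /var/www "$DEST"

    # crontab entry: run at 02:00 every night, logging output
    # 0 2 * * * /usr/local/bin/backup.sh >> /var/log/backup.log 2>&1

For several servers, the same script can loop over a list of host names.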
2008 May 18
1
Some files vanished...
I run an rsync to back up my mail server every 4 hours. Sometimes, I get these sorts of warnings: Rsyncing... file has vanished: "/usr/local/virtual/*munged1*/new/ 1210876129.43402_0.mail.server.tld" file has vanished: "/usr/local/virtual/*munged2*/courierimapkeywords/. 4036254.1210876052.43312_0.mail.server.tld" file has vanished: "/usr/local/virtual/*munged3*/cur/
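These warnings are usually harmless on a live mail spool: files were deleted between rsync's scan and its transfer. rsync signals this with exit code 24 ("partial transfer due to vanished source files"). A minimal wrapper, with hypothetical paths, can treat that code as success:

    #!/bin/bash
    # Sketch: ignore exit code 24 (files vanished mid-transfer), fail on anything else.
    rsync -a /usr/local/virtual/ backuphost:/backups/mail/
    rc=$?
    if [ "$rc" -eq 24 ]; then
        echo "warning: some source files vanished during transfer" >&2
        rc=0
    fi
    exit $rc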
2010 Jul 16
4
--compare-dest weirdness
Hi All, I am writing a backup program for my computer. A brief outline is as follows. Running Ubuntu 10.04; 2 main partitions, / and /home, both ext3; 1 external USB HDD, ext3, mounted to /backups/main. Once every couple of days, rsync backs up everything worth backing up in the / and /home partitions to a folder /backups/main/Full, using the following command: "rsync -vrhRupElog
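For reference, --compare-dest makes the receiver skip any file that is identical to its counterpart in the given directory, so the destination ends up holding only new or changed files. A sketch against the layout above (the Incr directory name is hypothetical):

    # Unchanged files relative to Full are skipped entirely, so the dated
    # Incr directory collects only what actually changed.
    rsync -a --compare-dest=/backups/main/Full/ /home/ \
          /backups/main/Incr-$(date +%F)/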
2011 Jun 23
3
Using rsync as an incremental backup
I'm using rsync to do an incremental backup of my desktop here, to a remote server as follows: #!/usr/bin/bash old=$(date -d 'now - 1 week' +%Y-%m-%d) new=$(date +%Y-%m-%d) rsync -avP --delete --link-dest=../$old /home/bakers bakers at perturb.org:/home/bakers/backup/$new/ This is actually working GREAT! The only problem is that sometimes the cronjob won't complete (internet is
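One way to harden that script (a sketch, not from the thread): instead of assuming the run from exactly one week ago succeeded, link against the newest snapshot that actually exists on the receiver:

    #!/bin/bash
    # Pick the most recent existing backup directory on the server as the
    # hard-link reference; host and paths are taken from the command above.
    new=$(date +%Y-%m-%d)
    old=$(ssh bakers@perturb.org 'ls -1d /home/bakers/backup/????-??-??' | sort | tail -n 1)
    rsync -avP --delete --link-dest="$old" /home/bakers \
          bakers@perturb.org:/home/bakers/backup/$new/

An absolute --link-dest path is resolved on the receiving side, so pointing it at the directory found over SSH still works when an intermediate run failed or was skipped.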
2009 Apr 09
3
Help creating incremental backups using --backup-dir.
Normally I would use the --link-dest option to do this but I can't since I'm rsyncing from a Mac to a Samba share on a Linux box and hard links don't work. What I want to do is create a 10 day rotating incremental backup. I used the first script example on the rsync examples page as a template. The only thing I changed was the destination to be a local directory and paths for
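A sketch of one such rotation, with hypothetical paths: keep a current mirror, and let --backup-dir catch each day's overwritten or deleted files in a slot numbered 0-9:

    #!/bin/bash
    # Ten-slot rotating incremental backup without hard links (sketch).
    slot=$(( $(date +%s) / 86400 % 10 ))          # advances by one each day
    rm -rf /backups/increments/$slot              # recycle the oldest slot
    rsync -a --delete --backup --backup-dir=/backups/increments/$slot/ \
          /Users/me/Documents/ /backups/current/

/backups/current is always a full mirror; each increments/N holds only the files that changed that day, which is roughly what --link-dest would have given, minus the hard links.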
2011 Feb 07
1
Incremental backup with only delta into a separate file.
Hi All, I am presently doing a small POC with rsync for incremental backup and restore strategies. I have come up with certain questions along the way; can anyone help me with an explanation? Used the config and ideas from: http://www.mikerubel.org/computers/rsync_snapshots/ The commands executed on two machines in sequence Machine 1: root at Andruil:~# vim testfile root at Andruil:~# ls
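If the goal is literally "only the delta into a separate file", rsync's batch mode does that directly. A sketch with hypothetical local paths:

    # Record the changes needed to bring the copy up to date into one file...
    rsync -a --write-batch=/tmp/delta.batch /data/ /backups/data/
    # ...then replay that same delta against another identical copy.
    rsync -a --read-batch=/tmp/delta.batch /mnt/offsite/data/

--read-batch takes only a destination, since the changes themselves come from the batch file.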
2009 Feb 11
2
--link-dest=server::location/folder
I'm wondering if I can use link-dest to compare rsync to a remote directory. rsync -aCHh --stats --link-dest=akane::backup/ranma.daily.1 / myserver::akane/ranma.daily.0 Something like that? Is it going to work like it does when doing a local rsync with a local link-dest directory? Is it going to chew massive amounts of bandwidth? Is there something else I should be doing
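As far as I know this will not work as written: hard links can only be created within one filesystem on the receiving host, so --link-dest must name a directory reachable on the same receiver as the destination, not a second remote spec. A relative path is resolved against the destination directory, e.g. (sketch, reusing the names above):

    rsync -aCHh --stats --link-dest=../ranma.daily.1 / myserver::akane/ranma.daily.0

Bandwidth-wise this behaves like the local case: the link-dest comparison happens entirely on the receiving side.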
2005 Dec 12
2
date handling
Hi, Given a data frame with calendar dates: "2005-07-01", "2005-07-02","2005-07-03","2005-07-04","2005-07-05", etc. I want to extract the following from these dates: week number, month number, year number. Any ideas how to accomplish this? Many thanks. Regards, Richard
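The thread is about R, but for comparison the same fields fall out of GNU date in shell (the language of most examples on this page; the input value is hypothetical):

    d="2005-07-03"
    date -d "$d" +%V   # ISO-8601 week number
    date -d "$d" +%m   # month number
    date -d "$d" +%Y   # year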
2011 Dec 13
1
Mac OS X : "get_xattr_names: llistxattr("some/path/here", 1024) failed" error
Hi, I'm trying to make a small script to get rid of Apple's TimeMachine. The aim is to back up the files of my company. I set up a MacMini with a lot of storage attached to it. The MacMini connects every once in a while to our data server (XServe) through SSH and pulls the files that need to be saved (I'm using the --link-dest option to limit the amount of files transferred, and to keep
2010 Apr 08
6
Mac OS X "rsync: unpack_smb_acl: sys_acl_get_info(): Unknown error: 0 (0)"
I am a developer on the LBackup project. An LBackup user recently posted a question to the mailing list asking about the following error. > "rsync: unpack_smb_acl: sys_acl_get_info(): Unknown error: 0 (0)" Link to thread: <http://www.mail-archive.com/lbackup-discussion at lists.connect.homeunix.com/msg00040.html> My understanding of this error is that when copying files via
2008 Mar 15
3
Incremental backups?
So I thought I'd get a head start for next week - I have a low-power Linux box that has a few Samba shares mounted, and limited hard disk space. This box is connected to a tape library via SCSI card. I want to find the best way to create a full, then incremental, backup of the Samba mounts directly to tape. Some of the Samba mounts are appliances that cannot run any special
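One possible approach (a sketch, not from the thread): GNU tar's listed-incremental mode writes level-0 and incremental archives straight to the tape device. The device name and paths are placeholders:

    # The first run with a fresh .snar snapshot file produces a full (level 0)
    # archive; every later run with the same .snar archives only what changed.
    tar --create --listed-incremental=/var/lib/backup/shares.snar \
        --file=/dev/nst0 /mnt/samba-shares

Because tar only needs to read the mounts, the appliance shares that cannot run any software of their own are still covered.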
2010 May 14
2
command line to backup my documents to external drive
I'm really confused with all the examples out there and all the different types of incremental backups. I tried several scripts but cannot reduce the size of my backup folders. What I want is to back up my documents to my external drive every month and save as much disk space as possible. Let's say I have 3 backup directories in the external drive, backup03.10, backup04.10 and backup05.10. I want
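A sketch of a space-saving monthly scheme matching that naming (the paths and the "latest" symlink are hypothetical): unchanged documents become hard links to the previous month's copy, so each extra month costs only the changed files:

    #!/bin/bash
    dest=/media/external/backup$(date +%m.%y)       # e.g. backup05.10
    rsync -a --delete --link-dest=/media/external/latest/ \
          ~/Documents/ "$dest"/
    rm -f /media/external/latest
    ln -s "$dest" /media/external/latest            # next month links against this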
2010 Jun 22
2
few questions on rsync
Hi, I have a few questions that I could not find answers to in the documentation. Different filesystems: Let's say I want to keep all extended attributes and everything else, so I use -A, -X, --perms etc., together with --fake-super. Now, let's say the source FS supports some attributes not supported on the target FS (for example, XFS extended attributes). Would this work as expected (i.e. extended attributes are
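For reference, --fake-super applies to the side it is given on; to enable it for a remote receiver, one way (assuming rsync 3.1 or newer on both ends; host and paths are hypothetical) is the remote-option form:

    # The receiving rsync stores ownership and other privileged attributes in
    # user.rsync.* extended attributes instead of applying them for real.
    rsync -aAX -M--fake-super /data/ backuphost:/backups/data/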
2016 Jun 19
1
rsync script for snapshot backups
On 19.06.2016 at 19:27, Simon Hobson wrote: > Dennis Steinkamp <dennis at lightandshadow.tv> wrote: > >> I tried to create a simple rsync script that should create daily backups from ZFS storage and put them into a timestamped folder. >> After creating the initial full backup, subsequent backups should only contain "new data" and the rest will be referenced
2015 Apr 17
1
Recycling directories and backup performance. Was: Re: rsync --link-dest won't link even if existing file is out of date (fwd)
How do you handle snapshotting? Or do you leave that to the block/fs virtualization layer? /kc On Fri, Apr 17, 2015 at 01:35:27PM +1200, Henri Shustak said: >> Our backup procedures have provision for looking back at previous directories, but there is not much to be gained with recycled directories. Without recycling, and after a failure, the latest available backup may not have much
2006 Jan 22
23
calculate users age
I know it's probably really simple: how do I work out someone's age if I have their d.o.b. stored as a date in my db? Cheers -- Posted via http://www.ruby-forum.com/.
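The thread is about Ruby, but the usual integer trick is language-independent; in shell (the date of birth shown is hypothetical):

    dob=19900115                                   # YYYYMMDD
    echo $(( ( $(date +%Y%m%d) - dob ) / 10000 ))  # age in whole years

Subtracting the two YYYYMMDD numbers and integer-dividing by 10000 handles the not-yet-had-a-birthday-this-year case automatically.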
2010 Oct 01
3
Converting a dataframe column from string to datetime
Hi, I have a dataframe column of the form v<-c("Fri Feb 05 20:00:01.43000 2010","Fri Feb 05 20:00:02.274000 2010","Fri Feb 05 20:00:02.274000 2010","Fri Feb 05 20:00:06.34000 2010") I need to convert this to datetime form. I did the following: lapply(v,function(x){strptime(x, "%a %b %d %H:%M:%OS %Y")}) This gives me a list that looks like
2006 Jun 26
3
no true incrementals with rsync?
For example's sake: With traditional backup systems, you keep a base (full backup, let's say every 30 days), then build incrementals on top of that, e.g. what has changed since the base. So, to restore, you copy over your base, then copy each incremental over the base to rebuild up to the latest snapshot (*copying new incremental files over older base files*). With rsync, (using
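With the --link-dest snapshot scheme discussed in several of the threads above, that replay step disappears: every dated directory presents a complete tree (unchanged files are just hard links to earlier snapshots), so a restore is a single copy. Sketch, with a hypothetical snapshot name:

    rsync -a /backups/2006-06-26/ /restore/home/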
2005 Jul 23
2
link_stat
Hi there, I set up my company's backup server using rsync. And I've got a strange problem. I searched the archives of this list, but none of the results seems to give me an idea how to solve the problem. If anyone can help, I would be grateful. I'm using cron as a non-wheel/non-admin user to rsync every day during the night. The cron is set in the server to transfer the backing-up
2007 Dec 31
1
Help with full and incremental dumps
I have an Overland Arcvault 12 library with a full LTO3 magazine of 400/800 GB tapes. It is connected directly to the fileserver via a SCSI card/cable. The two main directories I want to back up are /var/log, which is on one filesystem, and /home, which is on another. There are _currently_ no databases to worry about, but there may be active users logged in and active jobs running.
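One classic fit for this layout (a sketch, not from the thread): dump(8) levels map directly onto the full-plus-incremental scheme, one filesystem at a time. The tape device name is a placeholder, and dump is best run on a quiet filesystem, so the active users mentioned above are a real caveat:

    dump -0u -f /dev/nst0 /home    # level 0: full dump, recorded in /etc/dumpdates
    dump -1u -f /dev/nst0 /home    # later: level 1, only changes since level 0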