Displaying 20 results from an estimated 900 matches similar to: "Brandysnap -- a new rsync-based snapshot management script"
2011 Jun 09 (0 replies)
rsync 3.0.7 hangs with unreadable hard-links files
Hello rsync list
I've stumbled across a fairly obscure situation in which rsync hangs --
it just waits until I press ctrl-C.
Here's a transcript. Note that directory foo contains two small files
that are hard-linked together, and are unreadable.
----------------------------------------------------------------------
$ uname -a
Linux ferox 2.6.35-28-generic-pae #50-Ubuntu SMP Fri Mar
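A minimal, hypothetical recreation of the setup described above (run as a non-root user so the files are actually unreadable):
    mkdir foo && echo data > foo/a
    ln foo/a foo/b           # hard-link the two small files together
    chmod 000 foo/a          # both links now point at an unreadable inode
    rsync -avH foo/ dest/    # with -H/--hard-links, this reportedly hangs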
2011 Jun 19 (1 reply)
[Bug 8246] New: rsync hangs with --hard-links and unreadable files
https://bugzilla.samba.org/show_bug.cgi?id=8246
Summary: rsync hangs with --hard-links and unreadable files
Product: rsync
Version: 3.0.7
Platform: All
OS/Version: Linux
Status: NEW
Severity: minor
Priority: P5
Component: core
AssignedTo: wayned at samba.org
ReportedBy: cgdennis at
2012 Dec 17 (1 reply)
--list-only ordering
Hello rsync people
I've noticed an apparent inconsistency in the ordering of output from
the --list-only option.
For example:
$ ls
d1 d2 d2-x d3 f1 f2 f2-x f3
$ rsync --list-only .
drwxr-xr-x 4096 2012/12/17 15:18:05 .
-rw-r--r-- 0 2012/12/17 15:17:52 f1
-rw-r--r-- 0 2012/12/17 15:17:52 f2
-rw-r--r-- 0 2012/12/17 15:17:52 f2-x
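The preview cuts off before the inconsistency itself is shown. As a workaround sketch (not from the thread): since the name is the fifth column of --list-only output, a deterministic ordering can be imposed externally:
    rsync --list-only . | sort -k5    # sort the listing on the name column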
2013 Jun 18 (1 reply)
rsync equivalent of 'cp -al'?
Hello rsync people
I thought I knew how to use rsync, but I can't work out how to use it to
do the equivalent of
cp -al dir1 dir2
where dir1 and dir2 are both local and on the same disk.
In other words I want to make dir2 a copy of dir1, with every file
hard-linked to its counterpart in dir1.
Why not just use cp? Because I want to be able to do it as a user who
has sudo
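The usual answer to this question is to sync a directory against itself with --link-dest; a sketch (note that a relative --link-dest path is taken relative to the destination, so an absolute path is safest):
    # make dir2 a clone of dir1 with every unchanged file hard-linked
    rsync -a --link-dest="$PWD/dir1" dir1/ dir2/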
2016 Jun 19 (1 reply)
rsync script for snapshot backups
On 19.06.2016 at 19:27, Simon Hobson wrote:
> Dennis Steinkamp <dennis at lightandshadow.tv> wrote:
>
>> I tried to create a simple rsync script that should create daily backups from ZFS storage and put them into a timestamped folder.
>> After creating the initial full backup, the following backups should only contain "new data" and the rest will be referenced
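A minimal sketch of the daily-snapshot pattern being described, with hypothetical paths (assumes GNU date; unchanged files are hard-linked against the previous snapshot via --link-dest):
    #!/bin/sh
    src=/tank/data/                        # hypothetical ZFS-backed source
    dest=/backups/daily                    # hypothetical backup root
    today=$(date +%Y-%m-%d)
    rsync -a --link-dest="$dest/latest" "$src" "$dest/$today/"
    ln -sfn "$dest/$today" "$dest/latest"  # repoint 'latest' at the new snapshot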
2013 Nov 25 (2 replies)
rsync seems to overwhelm a failing hard disk
Hello rsync people
Today I was recovering data from a beginning-to-fail external USB hard disk.
I started with my usual 'rsync -av --ignore-errors <source> <dest>', and
that was fine until it got to the first I/O errors. It paused but
continued after the first couple of errors, but then the disk started
buzzing and rsync gave error messages for every file (I'm afraid
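Not from the thread, but one hedged mitigation for this scenario is to throttle rsync so a struggling drive is read more gently (whether that actually helps failing hardware is situational; a dedicated recovery tool such as ddrescue is usually a better fit for failing media):
    # cap the transfer rate at ~1000 KiB/s and give up on long stalls
    rsync -av --ignore-errors --bwlimit=1000 --timeout=60 <source> <dest>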
2010 Apr 08 (6 replies)
Mac OS X "rsync: unpack_smb_acl: sys_acl_get_info(): Unknown error: 0 (0)"
I am a developer on the LBackup project.
An LBackup user recently posted a question to the mailing list asking about the following error.
> "rsync: unpack_smb_acl: sys_acl_get_info(): Unknown error: 0 (0)"
Link to thread: <http://www.mail-archive.com/lbackup-discussion at lists.connect.homeunix.com/msg00040.html>
My understanding of this error is that when copying files via
2010 May 14 (2 replies)
command line to backup my documents to external drive
I'm really confused by all the examples out there and all the different types
of incremental backups. I tried several scripts but cannot reduce the size
of my backup folders. What I want is to back up my documents to my external
drive every month and save as much disk space as possible.
Let's say I have 3 backup directories on the external drive, backup03.10,
backup04.10 and backup05.10. I want
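A sketch of the space-saving monthly pattern this seems to call for (hypothetical mount point; files unchanged since last month are hard-linked, so they take no extra space):
    prev=/mnt/external/backup04.10    # last month's backup
    curr=/mnt/external/backup05.10    # this month's backup
    rsync -a --link-dest="$prev" ~/Documents/ "$curr/"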
2010 Jul 16 (4 replies)
--compare-dest weirdness
Hi All,
I am writing a backup program for my computer. A brief outline is as follows.
Running Ubuntu 10.04.
2 main partitions, / and /home, both ext3. 1 external USB HDD, ext3,
mounted at /backups/main.
Once every couple of days, rsync backs up, using the following command,
everything worth backing up in the / and /home partitions to a folder
/backups/main/Full. Command: "rsync -vrhRupElog
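The command is cut off above; for context, the --compare-dest mechanism being used works roughly like this (a sketch with hypothetical paths; files identical to their counterparts under the compare-dest dir are skipped entirely, not linked):
    # incremental pass: copy only what differs from the Full backup
    rsync -a --compare-dest=/backups/main/Full /home/ /backups/main/incr/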
2010 Jun 22 (2 replies)
few questions on rsync
Hi,
I have a few questions that I could not find answers to in the documentation.
Different filesystems:
Let's say I want to keep all extended attributes and everything else,
so I use -A, -X, --perms etc., together with --fake-super.
Now, let's say the source FS supports some attributes not supported on the
target FS (for example, XFS extended attributes). Would this work as
expected (i.e. extended attributes are
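For reference, the combination under discussion looks roughly like this (a sketch; --fake-super makes the receiver stash privileged attributes it cannot apply as user.rsync.* extended attributes instead):
    # -A = ACLs, -X = xattrs; -a already implies --perms
    rsync -aAX --fake-super /mnt/xfs-src/ /mnt/other-fs-dest/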
2015 Apr 17 (1 reply)
Recycling directories and backup performance. Was: Re: rsync --link-dest won't link even if existing file is out of date (fwd)
How do you handle snapshotting? Or do you leave that to the block/fs
virtualization layer?
/kc
On Fri, Apr 17, 2015 at 01:35:27PM +1200, Henri Shustak said:
>> Our backup procedures have provision for looking back at previous directories, but there is not much to be gained with recycled directories. Without recycling, and after a failure, the latest available backup may not have much
2015 Apr 16 (2 replies)
Recycling directories and backup performance. Was: Re: rsync --link-dest won't link even if existing file is out of date (fwd)
rsync folks,
Henri Shustak <henri.shustak at gmail.com> wrote:
> LBackup always starts a new backup snapshot with an empty directory. I
> have been looking at extending --link-dest options to scan beyond just
> the previous successful backup to (failed backups / older backups).
> However, there are all kinds of edge cases which are worth considering
> with such a change. At
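Worth noting: stock rsync already accepts multiple --link-dest directories (up to 20), which covers part of what is being proposed; a sketch with hypothetical paths:
    # search several previous snapshots for identical files to hard-link
    rsync -a \
        --link-dest=/backups/2015-04-15 \
        --link-dest=/backups/2015-04-14 \
        --link-dest=/backups/2015-04-13 \
        /data/ /backups/2015-04-16/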
2009 Sep 10 (2 replies)
originate sync from the daemon server
Is there anything special needed to do this from the daemon server? I've set
up /etc/rsyncd.conf with some filesystems, and I would rather originate
(control) my rsyncs from this server and not from the hosts that have
the data I want, i.e. I want to pull, not push.
For instance, my rsyncd.conf:
[www]
comment = www
path = /snaps/www
numeric ids = true
log file = /snaps/rsync/logs/www.log
pid file
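If the goal is simply to pull, the daemon module is not what initiates anything; the server just runs rsync as an ordinary client. A sketch (hypothetical host name), pulling over ssh into the module's path:
    # run on the backup server: fetch from the data host on a schedule
    rsync -a webhost:/var/www/ /snaps/www/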
2008 Nov 08 (2 replies)
Differential backup
Hello,
I have got three folders:
- /home/backup/2008-10-20 - place for differential backup
- /mnt/for_backup - folder with files for backup
- /home/backup/2008-10-01 - place where the last full backup is
My question: is the command below prepared correctly to make a differential
backup?
rsync -avPbn --backup-dir=/home/backup/2008-10-20/ --exclude "System
Volume Information" --exclude
2015 Apr 06 (6 replies)
rsync --link-dest won't link even if existing file is out of date
Feature request: allow --link-dest dir to be linked to even if file exists
in target.
This statement from the man page is adhered to too strongly IMHO:
"This option works best when copying into an empty destination hierarchy, as
rsync treats existing files as definitive (so it never looks in the link-dest
dirs when a destination file already exists)".
I was surprised by this behaviour
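For context, the pattern the man page assumes (which sidesteps the complaint) is to populate a fresh, empty snapshot directory each run and link against the previous one; a sketch:
    # destination is empty, so --link-dest is consulted for every file
    rsync -a --link-dest=/backups/prev /data/ /backups/new/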
2014 Mar 19 (1 reply)
Beating a dead horse
Sorry to do this .... AGAIN
Every year or two I get stuck on this same problem involving
excluding.
Seems I learn how it's done, then two years later I've totally forgotten, and
when I look up my notes ... this new need is just different enough
that they don't apply.
Here's the problem. (Simplified... and I've skipped some of the
repetitive output.)
On remote:
ls A/
a/ b/ c/ d e
on
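The snippet ends before the actual requirement is stated, but the usual tools for this class of problem are include/exclude filter rules checked with a dry run; a generic sketch (hypothetical host, not the poster's exact case):
    # copy only A/a and A/b, excluding everything else at any depth
    rsync -avn --include='a/***' --include='b/***' --exclude='*' remote:A/ A/
    # drop -n (dry run) once the listed files look right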
2010 Jul 27 (3 replies)
Getting rsync to store timing information in its logs
Is there a way to know from the rsync logs how long it took to do a backup?
The only timing info I see is this at the end:
sent 3067328 bytes received 7853035429 bytes 1187888.83 bytes/sec
total size is 1559866450336 speedup is 198.55
Can I use it to figure out how long the operation took?
Does the above mean it took 2.5 secs of send time and 1.8 hours of
receive time so (roughly) the
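A note on the arithmetic: the bytes/sec figure is total traffic divided by total elapsed time, not separate send and receive rates, so the run time falls straight out:
    # elapsed ~ (sent + received) / rate
    echo $(( (3067328 + 7853035429) / 1187889 ))    # ~6613 s, about 1.8 hours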
2008 Nov 09 (7 replies)
How to delete files older than X on backup during sync?
Hi list,
I have been reading man pages and list archives, but have not found the
answer to my question, though I am sure it must be possible to achieve
what I want.
I wish to use rsync to create a backup BUT only keep the files for a
limited period of time, e.g. two weeks.
I have not yet been able to figure out how to do this inside rsync
(while the backup is being performed) and my understanding of the
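rsync itself has no age-based expiry; the common answer is to pair it with find on the backup side (a sketch assuming GNU find, and assuming pruning by modification time matches the intended retention policy):
    rsync -a /data/ /backup/data/
    # then prune backup files untouched for more than 14 days
    find /backup/data -type f -mtime +14 -delete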
2011 Jun 23 (3 replies)
Using rsync as an incremental backup
I'm using rsync to do an incremental backup of my desktop here, to a
remote server, as follows:
#!/usr/bin/bash
old=$(date -d 'now - 1 week' +%Y-%m-%d)
new=$(date +%Y-%m-%d)
# link unchanged files against last week's snapshot (--link-dest is
# taken relative to the destination directory)
rsync -avP --delete --link-dest=../$old /home/bakers \
    bakers@perturb.org:/home/bakers/backup/$new/
This is actually working GREAT! The only problem is that sometimes the
cronjob won't complete (internet is
2011 Feb 05 (2 replies)
rsync not reporting diskfull error
I am involved with the development of LBackup. This message to the rsync mailing list is related to the following thread on the lbackup-discussion mailing list: http://tinyurl.com/lbackup-discussion-diskfull
Essentially, I am curious whether anyone using rsync 3.0.7 on Mac OS X (10.6) Server has experienced an out-of-disk-space error without a message similar to the following being reported:
>
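The quoted message is cut off above. Not from the thread, but a common defensive habit when error output may go missing is to check rsync's exit status in the calling script:
    # a failed run still yields a nonzero exit code even if messages are lost
    rsync -a src/ dest/ || echo "rsync exited with status $?" >&2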