Displaying 20 results from an estimated 200 matches similar to: "Extending --log-format"
2002 Apr 03
3
metadata in dryrun mode
As I reported a while back, rsync doesn't handle metadata (permissions and
ownership) in dry-run mode.
I offered to make a patch and that offer still stands. I didn't have the
time for it until now and want to pick it up again. I had an ugly hack
back then, but I want to redo it in a clean way.
I would like some input on my thoughts.
IMHO, it would be ideal if the check for dry_run
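A minimal sketch of the guard being proposed, in Python rather than rsync's C: report the metadata change in dry-run mode but apply it only when dry_run is off. The function and flag names here are illustrative, not rsync's actual internals.

    import os

    def apply_metadata(path, mode, uid, gid, dry_run=True):
        # Report what would change, but only touch the file when dry_run is off.
        st = os.stat(path)
        if (st.st_mode & 0o7777) != mode:
            print("would chmod %s to %o" % (path, mode))
            if not dry_run:
                os.chmod(path, mode)
        if (st.st_uid, st.st_gid) != (uid, gid):
            print("would chown %s to %d:%d" % (path, uid, gid))
            if not dry_run:
                os.chown(path, uid, gid)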
1999 Nov 19
2
\\host\root
This is a slight variant of what's in the FAQ
(http://us1.samba.org/samba/docs/FAQ/#25) because I don't
understand why the answer provided there seems to miss the
entire point.
Why does the "root" share exist at all?
It certainly isn't part of my smb.conf!
Am I missing the obvious?
What other hidden shares are there?
(I didn't find anything in the doc about this
2001 Aug 06
1
merge rsync+ into rsync (was Re: rsync-2.4.7 NEWS file)
> Just curious: what about the rsync+ patch?
Thanks for the reminder.
I've just committed Jos's rsync+ patch onto the
"branch_mbp_rsyncplus_merge" branch. If it works OK and nobody
screams I will move it across onto the main tree tomorrow or
Wednesday.
I see the patch doesn't add documentation about the new options to the
man page, so we should fix that in the future.
2012 Feb 18
4
FADV_DONTNEED support
While going through an old todo list I found that these patches had fallen by
the wayside. About a year ago I initiated a discussion[1] with the Linux
kernel folks regarding the lack of any usable fadvise support on the kernel
side. As a result, I was observing extremely poor performance on my server
after a backup, as executable pages were being swapped out in favor of data
waiting to be flushed
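A hedged sketch of the fadvise idea behind these patches, assuming the goal is simply to tell the kernel that just-written backup data need not stay in the page cache (so it stops evicting more useful pages). This uses Python's os.posix_fadvise for illustration and is not the actual rsync patch.

    import os

    def write_and_drop_cache(path, data):
        fd = os.open(path, os.O_WRONLY | os.O_CREAT | os.O_TRUNC, 0o644)
        try:
            os.write(fd, data)
            os.fsync(fd)  # flush first: only clean pages can be dropped from the cache
            os.posix_fadvise(fd, 0, 0, os.POSIX_FADV_DONTNEED)
        finally:
            os.close(fd)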
2019 May 25
3
Error suggestions please
May 25 15:01:27 imap(xxxxx at italy1.com)<10256><LsrOdL2J3uisCqP7>: Error: Mailbox INBOX: UID=208: read(/home/vpopmail/domains/italy1.com/xxxxx/Maildir/cur/1154731221.9257.azz.italy1.com,S=26421:2,S) failed: Cached message size smaller than expected (26421 < 26511, box=INBOX, UID=208) (read reason=mail stream)
May 25 15:01:27 imap(xxxxxx at
2007 Jul 19
2
Subsetting dataframes
Dear all!
W2k, R 2.5.1
I am working with an ongoing malting barley variety evaluation within
Sweden. The structure is 25 cultivars tested each year at four sites, in
field trials with three replicates and 'lattice' structure (the replicates
are divided into five sub blocks in a structured way). As we are normally
keeping around 15 varieties from each year to the next, and take in 10 new
2010 Jun 25
1
Confused: Looping in dataframes
Hey,
I have a data frame x which consists of say 10 vectors. I essentially want
to find out the best fit exponential smoothing for each of the vectors.
The problem is that while I'm getting results when I say
> lapply(x,ets)
I am getting an error when I say
>> myprint
function(x)
{
    for (i in 1:length(x))
    {
        ets(x[i], model = "AZZ", opt.crit = c("amse"))
    }
}
The error message is
2013 Jan 10
4
Fixing corrupt flac files
So, let's provide some information then :-)
----------------------------------------------------------------------------------------------
soa2ii at thor /mnt/files/music/Slime/Alle gegen Alle $ flac -aF 02\
Störtebecker.flac
flac 1.2.1, Copyright (C) 2000,2001,2002,2003,2004,2005,2006,2007 Josh Coalson
flac comes with ABSOLUTELY NO WARRANTY. This is free software, and you are
welcome
2004 Sep 03
1
more filelist --stats
The attached diff causes rsync to show how much time it spends
on building and sending its filelist. I'd appreciate it if you
could consider this change for inclusion in a future release.
diff -ru rsync-2.6.3pre1/flist.c rsync-2.6.3pre1+tykhe/flist.c
--- rsync-2.6.3pre1/flist.c 2004-08-12 14:20:07.000000000 -0400
+++ rsync-2.6.3pre1+tykhe/flist.c
2004 Sep 10
1
which files were newer and not transferred?
I almost always use "-u" with rsync so that I don't overwrite remote
files that have changed. The only way to get rsync to tell me which
remote files are "newer" is to use a double-v (-vv), which produces way
more output than I care to see. (In true Unix fashion, I don't care to
see what was done successfully; I only want to see what failed.)
I've always had to
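A rough illustration, outside rsync, of the report the poster wants: walk the source tree and list only the destination files whose mtime is newer (i.e. the ones -u would leave alone). The paths and the bare mtime comparison are illustrative assumptions, not rsync's logic.

    import os

    def newer_on_dest(src_root, dest_root):
        for dirpath, _dirs, files in os.walk(src_root):
            rel = os.path.relpath(dirpath, src_root)
            for name in files:
                src = os.path.join(dirpath, name)
                dst = os.path.join(dest_root, rel, name)
                if os.path.exists(dst) and os.path.getmtime(dst) > os.path.getmtime(src):
                    yield dst

    for path in newer_on_dest("/data/local", "/data/mirror"):
        print(path)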
2008 Jan 27
18
Reporting/Analysing program
Does anybody know of a graphical reporting/analysing program for Shorewall
4.0.7, or do I have to do it via accounting?
--
Javier
Martínez
Technical Manager
2010 Jun 28
1
Exponential Smoothing: Forecast package
Hey,
I am using the ets() function in the forecast package to find out the
best-fit parameters for my time series. I have about 50 sets of time-series data.
I'm currently using the function as follows:
ets(x, model = "AZZ", opt.crit = "mse")
From what I have observed, about 5-10 of them have been identified by ets as having
a trend, and alpha and beta values have been returned -
2001 Nov 13
2
direct write patch
I have attached a patch that supports a new "--direct-write" option.
Using this option writes directly to the destination
files instead of to a temporary file first.
This patch is needed for rsyncing to a device that
is full or nearly full.
Say that I am writing to a device that has 1 MB free, and a 2 MB file
on that device is out of date.
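A small sketch of the two strategies being contrasted, under the assumption that the default behaviour is "write a temporary copy, then rename" while --direct-write overwrites the destination in place. The helper names are made up for illustration.

    import os

    def write_via_temp(dest, data):
        tmp = dest + ".tmp"          # needs room for a second copy of the file
        with open(tmp, "wb") as f:
            f.write(data)
        os.replace(tmp, dest)        # atomic swap once the new copy is complete

    def write_direct(dest, data):
        with open(dest, "wb") as f:  # no extra space needed, but the destination
            f.write(data)            # is in an inconsistent state while writing

The trade-off is the one described above: the in-place write fits on a nearly-full device, at the cost of losing the safety of the temporary-file-plus-rename step if the transfer is interrupted.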
2003 Nov 17
0
[PATCH] --source-filter && --dest-filter for rsync 2.5.6
Hi,
I needed to filter the content of files (encrypt them) before they are sent over the network to the backup server.
The easiest way to do this was to modify Kyle Jones's "--dest-filter" patch.
Somebody asked for this feature here in the past, so I'm sending this patch to the list.
Implementation details:
- filtering disables the rsync algorithm
- the source filter makes temporary files in /tmp
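A hedged sketch of the source-filter step described above, assuming it amounts to "run each file through an external command and transfer the temporary result from /tmp instead". gzip stands in here for the encryption filter, and the helper name is made up.

    import subprocess, tempfile

    def filter_source(path, filter_cmd=("gzip", "-c")):
        # Write the filtered output to a temporary file (the patch uses /tmp)
        # and return its name; the caller would transfer this file instead.
        tmp = tempfile.NamedTemporaryFile(dir="/tmp", delete=False)
        with open(path, "rb") as src:
            subprocess.run(filter_cmd, stdin=src, stdout=tmp, check=True)
        tmp.close()
        return tmp.name

Since the receiver only ever sees the filtered bytes, the delta-transfer algorithm has nothing stable to match against, which is presumably why the patch notes that filtering disables it.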
2004 Feb 09
1
[patch] Add `--link-by-hash' option.
This patch adds the --link-by-hash=DIR option, which hard links received
files in a link farm arranged by MD4 file hash. The result is that the system
will only store one copy of the unique contents of each file, regardless of
the file's name.
Anyone have an example of an MD4 collision so I can test that case? :)
Patch Summary:
-1 +1 Makefile.in
-0 +304 hashlink.c (new)
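A minimal sketch of the link-farm idea described above: hash the received file and hard-link it under DIR by digest so identical contents are stored only once. MD5 is used here only because Python's hashlib does not reliably provide MD4 (the hash the patch actually uses), and the two-level directory layout is an assumption, not the patch's.

    import hashlib, os

    def link_by_hash(received, farm_dir):
        h = hashlib.md5()
        with open(received, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 16), b""):
                h.update(chunk)
        digest = h.hexdigest()
        target = os.path.join(farm_dir, digest[:2], digest[2:])
        os.makedirs(os.path.dirname(target), exist_ok=True)
        if not os.path.exists(target):
            os.link(received, target)   # first occurrence: seed the farm
        else:
            os.remove(received)         # duplicate: re-point at the stored copy
            os.link(target, received)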
2013 Dec 16
2
Real hardware for opus
On Sun, Dec 15, 2013 at 10:33 PM, Adam Sampson <ats at offog.org> wrote:
> Carsten Mattner <carstenmattner at gmail.com> writes:
>
>> What are my best options for a portable player I can put opus on and
>> have 10 hours of opus playback?
>
> I use a SanDisk Clip+ running Rockbox; these are available for around
> £25 refurbished. They get about 14h playing
2004 Feb 23
0
[patch] Add `--link-by-hash' option (rev 4).
This patch adds the --link-by-hash=DIR option, which hard links received
files in a link farm arranged by MD4 file hash. The result is that the system
will only store one copy of the unique contents of each file, regardless of
the file's name.
(rev 4)
* Updated for committed robust_rename() patch, other changes in CVS.
(rev 3)
* Don't link empty files.
* Roll over to new file when
2004 Feb 17
0
[patch] Add `--link-by-hash' option (rev 3).
This patch adds the --link-by-hash=DIR option, which hard links received
files in a link farm arranged by MD4 file hash. The result is that the system
will only store one copy of the unique contents of each file, regardless of
the file's name.
(rev 3)
* Don't link empty files.
* Roll over to new file when filesystem maximum link count is reached.
* If link fails for another reason, leave
2004 Feb 23
0
[patch] Add `--link-by-hash' option (rev 5).
This patch adds the --link-by-hash=DIR option, which hard links received
files in a link farm arranged by MD4 file hash. The result is that the system
will only store one copy of the unique contents of each file, regardless of
the file's name.
(rev 5)
* Fixed silly logic error.
(rev 4)
* Updated for committed robust_rename() patch, other changes in CVS.
(rev 3)
* Don't link empty
2004 Feb 16
1
[patch] Add `--link-by-hash' option (rev 2).
This patch adds the --link-by-hash=DIR option, which hard links received
files in a link farm arranged by MD4 file hash. The result is that the system
will only store one copy of the unique contents of each file, regardless of
the file's name.
(rev 2)
* This revision is actually against CVS HEAD (I didn't realize I was working
from a stale rsync'd CVS).
* Apply permissions after