similar to: RSync and large amounts of data

Displaying 20 results from an estimated 1000 matches similar to: "RSync and large amounts of data"

2001 Dec 19
0
[Bug 54] ssh does not respect /etc/host.conf
http://bugzilla.mindrot.org/show_bug.cgi?id=54 sfllaw at engmail.uwaterloo.ca changed: Status: NEW -> RESOLVED; Resolution: INVALID ------- Additional Comments From sfllaw at
2007 Aug 08
1
prediction using gam
I am fitting a two-dimensional smoother in gam, say junk = gam(y~s(x1,x2)), to a response variable y that is always positive and pretty well behaved; both x1 and x2 are contained within [0,1]. I then create a new dataset for prediction with values of (x1,x2) within the range of the original data. predict(junk,newdata,type="response") My predicted values are a bit strange
2008 Feb 01
0
rsync Digest, Vol 62, Issue 1
Yep Zane -----Original Message----- From: rsync-bounces+zane_brady=trimble.com@lists.samba.org [mailto:rsync-bounces+zane_brady=trimble.com@lists.samba.org] On Behalf Of rsync-request@lists.samba.org Sent: Friday, February 01, 2008 7:01 AM To: rsync@lists.samba.org Subject: rsync Digest, Vol 62, Issue 1 Send rsync mailing list submissions to rsync@lists.samba.org To subscribe or unsubscribe
1999 Dec 08
3
permission problems
Samba Techs, ***Question*** Can you prevent file permission problems on Samba mounts? I have Samba 2.0.5a loaded on our SUN SPARCcenter 2000's and our HP-UX 9000's, and I have had a file permission problem ever since 1.9.8 was installed way back when. Every once in a while we have a situation where someone copies a file from, we'll say, /info8/pub/word.document, to their personal
2004 Oct 25
1
unable to open connection
Hi there: I used the function source to download the package but found > source("http://www.bioconductor.org/getBioC.R") Error in file(file, "r") : unable to open connection In addition: Warning message: unable to resolve 'www.bioconductor.org'. Then I downloaded the packages from CRAN and found > local({a <- CRAN.packages() +
2017 Dec 01
0
time foo
On 12/01/2017 02:32 PM, hw wrote: > > Hm. Foo is a program that imports data into a database from two CSV files, > using a connection for each file and forking to import both files at once. > > So this would mean that the database (running on a different server) takes > almost two times as much as foo --- which I would consider kinda > excruciatingly > long because
2006 Jan 05
3
Parsing key-value files
Hi, I am very very new to both Ruby and Ruby on Rails. I have been given a task to create a web application which is able to edit individual entries in a key-value file. In other words, I would need to create an application which can parse a key-value file. Could someone guide me on where I should start in a case like that? I was able to access and edit database table entries using Rails, but I
2016 Jan 23
0
LGPL relicense port of rsync
Hi, from my point of view: On Sat, 9 Jan 2016 14:48:09 +0100 Per Lundqvist <perlundq at gmail.com> wrote: > ... > > Getting the approval for a relicensing I think the contributions to > > rsync have to be analyzed in detail to approach a reasonable number of > > contributors. > > > > I experienced that finding a responsible person that is willing to >
2007 Nov 22
2
--delete not working - due to 200+G of files?
All of the rsync pages say that "rsync -a --delete src dest" will do a full mirror but it just isn't so for us. I've only found one Google item similar: http://www.linuxquestions.org/questions/linux-software-2/rsync-doesnt-seem-to-delete-599985/ We have around 9200 directories in 250G in /home on a Sun 4500 with Solaris 8 and rsync 2.6.5 - which we are trying to mirror on Sun
2006 Jun 26
3
no true incrementals with rsync?
for example's sake: With traditional backup systems, you keep a base (full backup, let's say every 30 days), then build incrementals on top of that, eg. (what has changed since the base). So, to restore, you copy over your base, then copy each incremental over the base to rebuild up to the latest snapshot. (*copying new incremental files over older base files*) With rsync, (using
2020 Nov 15
5
(C8) root on mdraid
Hello everyone. I'm trying to install CentOS 8 with root and swap partitions on software RAID. The plan is: - create md0 RAID level 1 with 2 hard drives: /dev/sda and /dev/sdb, using a Linux Rescue CD, - install CentOS 8 in VirtualBox on my laptop, - rsync the CentOS 8 root partition onto /dev/md0p1, - chroot into the CentOS 8 root partition, - configure /etc/mdadm.conf, grub.cfg, initramfs, install
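The plan above can be sketched as shell commands. This is an outline only, not a tested procedure: device names come from the poster's plan, every command here is destructive, and the exact grub/dracut invocations vary by setup:

```shell
# Build the RAID1 array from the two drives named in the plan:
mdadm --create /dev/md0 --level=1 --raid-devices=2 /dev/sda /dev/sdb

# (partition md0 and mkfs as desired, then copy the installed root)
rsync -aAXH /source-root/ /mnt/md0p1/

# chroot and rebuild the boot pieces:
mount --bind /dev /mnt/md0p1/dev    # likewise /proc and /sys
chroot /mnt/md0p1
mdadm --detail --scan >> /etc/mdadm.conf
dracut -f                           # regenerate initramfs with md modules
grub2-mkconfig -o /boot/grub2/grub.cfg
```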
2007 Nov 16
1
continusync issue
I am experimenting with Matt McCutchen's excellent continusync script, and I'm having an issue. (My copy of continusync has been modified from the original http://mattmccutchen.net/utils/continusync by adding this @ line 227: <$fromInwt>; as suggested by Matt.) The problem is easily reproduced: # mkdir ~/foo # continusync ~/foo root@remotehost:~/foo & # vi
2006 Jul 03
0
Scaffolds auto generate tables fields.. and Database amounts
Small story: I am moving databases over from FileMaker. In the FileMaker databases there are something like 100 fields for one table. Luckily they are not relational to anything, just big fat one-page docs of fields.. First I am wondering if this is smart from a database-speed point of view. Or if they should be broken up into many different database tables.. Though they all need to be on screen for the
2007 Mar 27
1
Managing large amounts of zones
I'm wondering what others are using for an interface to managing very large numbers of zones and frequent updates? We aren't using DNSSEC at all and are considering changing the code to directly query a backend database and/or having a backend database maintain the zone files for pre-compilation. Thanks Christopher
2004 Jun 16
2
Samba consuming massive amounts of memory (CONT)
Concerning Samba consuming memory, I forgot to add one thing, sorry. The process listing does not show smbd consuming the memory. "free -m" just reports a drastic increase when I use Samba. Relevant portions of ps -fwaux:
root     6412  0.0  0.0 10060 2548 ?  S  08:01  0:00 /usr/sbin/smbd -D -s /etc/samba/smb.conf
reisuser 6478  0.0  0.0 10420 3136 ?  S  08:01
2003 Feb 12
2
Syncing large amounts of data
I need some suggestions. Here's my setup:
- 800GB of data
- 14,000,000+ files
- no changes, just additions
- files range in size from 30k - 190k
The files are laid out in a tree fashion like:
BASE
 \- Directory ( numerical directory name from 0 - 1023 )
    \- Directory ( numerical directory name from 0 - 1023 )
       \- Files ( up to 1024 files each directory )
This allows for a
2001 Dec 10
1
Graphics with moderately large amounts of data
Hi, A major attraction of both R and S-plus is the graphics. (Up to now my experience is with STATA and SAS.) Most of the graphical examples that I have seen in the documentation are for relatively small data sets. I am working with a moderately large data set -- the order of magnitude is 180,000 observations by 50 variables. There seem to be standard problems that I keep bumping into in
2013 Jun 27
0
[LLVMdev] Heads up, I've backed out significant amounts of the multiple address space conversion changes
On Thu, Jun 27, 2013 at 12:49 PM, Micah Villmow <micah.villmow at smachines.com > wrote: > That said, changes of this magnitude should be done in a branch instead of > mainline trunk. I strongly disagree. If you think this is the case, we should probably start a new thread (rather than resurrecting this one) with the context of what you want to do and why you think it should be on a
2011 Sep 09
2
How to translate the 2D-density matrix (the output of bkde2D function) into matrix of datapoints' amounts?
It is known that function bkde2D (package "KernSmooth") returns a matrix of density estimates over the mesh induced by x1 and x2. In Details it is written that "... heights of the kernel, scaled by the bandwidths, at each datapoint are summed. This sum, after a normalization, is the corresponding fhat value in the output". There are several questions: 1) How to calculate
2009 Nov 09
3
DO NOT REPLY [Bug 6881] New: --bwlimit option uses KiB/s, but is documented as (what amounts to) kB/s
https://bugzilla.samba.org/show_bug.cgi?id=6881 Summary: --bwlimit option uses KiB/s, but is documented as (what amounts to) kB/s Product: rsync Version: 3.1.0 Platform: All OS/Version: All Status: NEW Severity: trivial Priority: P3 Component: core AssignedTo: wayned at
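For scale, the mismatch this bug describes works out to about 2.4%: --bwlimit=N throttles to N KiB/s (N*1024 bytes/s), while a literal "kB/s" reading of the documentation would suggest N*1000 bytes/s:

```shell
# At --bwlimit=100, the cap rsync actually enforces, in bytes/s:
echo $((100 * 1024))   # KiB/s reading: 102400
# versus what the documented "kB/s" would mean:
echo $((100 * 1000))   # kB/s reading: 100000
```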