similar to: Is the --sparse option suitable for .dbf files

Displaying 20 results from an estimated 1000 matches similar to: "Is the --sparse option suitable for .dbf files"

2012 Sep 06
1
Huge rsyncd.log file - what do I grep for to debug rsync failures
Hi, -rw-r--r-- 1 root other 5291457346 Sep 6 13:44 rsyncd.log. What pattern should I grep for to send you more information on rsync failures (server side)? The client-side message is: rsync of /oradb/d10 appeared to complete with NON-NOMINAL status (rc=12) at Thu Sep 6 07:33:58 PDT 2012 with the following files reported in output: receiving incremental file list
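Exit code 12 is rsync's "error in rsync protocol data stream", so the server-side log usually holds the other half of the story. A grep along these lines is a plausible starting point, not an exhaustive pattern set:

    # Pull error-ish lines out of a multi-GB log without paging through all of it
    grep -E 'rsync error|rsync:|denied|timeout|refused|reset by peer' rsyncd.log | tail -100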
2012 Aug 29
1
Destination file is larger than source file
We are using the standard -av switch, and both filesystems are the same (UFS). /opt/rsync/bin/rsync -av -e "ssh -l root" --delete --exclude-from=/var/scripts/exclude --password-file=/var/scripts/transfer.passwd <username>@<source host>::<source dir>/ /<destination dir> On the source system: <source host>:<source dir># du -sh * 1K nohup.out 20G
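One cause worth ruling out, given the thread this listing was matched against: sparse source files. du reports allocated blocks, and a plain copy fills in holes, so the destination can honestly measure larger than the source. A hedged sketch (GNU tools assumed; the stock Solaris du has no --apparent-size):

    du -sh --apparent-size /source/dir /dest/dir   # compare logical sizes, not allocated blocks
    rsync -avS source::module/ /dest/dir/          # -S/--sparse recreates holes on the receiver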
2012 Sep 05
1
rsync in daemon mode
Hi, We use one server from which many other clients download files. It operates in daemon mode over ssh. Is it possible that there is a maximum number of connections the rsync daemon can accept on the server? Where is this value set? This is what I see on the client side: rsync: read error: Connection reset by peer (131) rsync error: error in rsync protocol data stream (code 12) at io.c(759)
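There is such a knob: the rsyncd.conf parameter max connections (default 0, meaning unlimited), enforced per module by record locking on the lock file. Illustrative values, not the poster's configuration:

    # /etc/rsyncd.conf
    [files]
        path = /export/files
        max connections = 25
        lock file = /var/run/rsyncd.lock

A client turned away by the limit normally sees an explicit "max connections reached" error, so a bare "connection reset by peer" may point elsewhere, e.g. at timeouts or the transport.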
2012 Aug 25
1
do the "::" mean that rsync is in server daemon mode
and is rsh the only supported mechanism in this mode?
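For reference, the address syntax distinguishes the transports, and rsh is required by neither (host and module names are placeholders):

    rsync -av host:/src/ /dst/            # single colon: remote shell (ssh, or historically rsh)
    rsync -av host::module/ /dst/         # double colon: rsync daemon on TCP 873
    rsync -av rsync://host/module/ /dst/  # URL form of the same daemon protocol
    rsync -av -e ssh host::module/ /dst/  # daemon protocol tunnelled over a remote shell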
2012 Aug 30
1
we have about 60 directories that rsync fine
The issue is with 3 directories; the other ~60 rsync fine. Is it possible that some files in those directories are "open", i.e. being written to, and that this causes the issue? "Connection reset by peer" is the message we get. This is rsync in daemon mode on TCP port 873.
2012 Aug 21
1
weird rsync issue
rsync fails on some directories while on others it works without issue. Here are the important items. On the server from which directories are being copied: more /etc/rsyncd.conf <snip>
    [abcd]
        path = /xyz/abcd
        comment = abcd
        uid = 0
        gid = 3
        read only = yes
        list = no
        auth users = test-abcd
        secrets file = /etc/rsyncd.passwd
        strict modes = true
        hosts allow =
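When only a few modules misbehave, a more verbose client run against one failing path, read alongside the daemon's log, usually narrows things down. An illustrative invocation (the host name is a placeholder; user and module follow the config above):

    # -vvv raises client verbosity; --stats summarizes what was attempted
    rsync -avvv --stats test-abcd@server::abcd/ /tmp/abcd-test/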
2012 Sep 05
1
Is --sparse suitable for general purpose use?
Hi, I'm using rsync with --link-dest to do backups. I don't have any sparse files, but someday I might. Should I be using --sparse? I notice that -S is not implied by -a. This makes me suspicious that --sparse is not (yet?) suitable for general purpose use. There also seem to be outstanding bugs related to --sparse. Thanks. Karl <kop at meme.com> Free Software: "You
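The observation is right that -S is not implied by -a: archive mode covers recursion and metadata, not block layout. Syntactically, --sparse combines fine with --link-dest, since unchanged files are hard-linked and never rewritten; the documented restriction in rsync of that era was that --sparse could not be used with --inplace. A sketch with made-up paths:

    # -a preserves metadata; -S additionally writes holes instead of runs of zero bytes
    rsync -aS --link-dest=/backups/2012-09-04 /data/ /backups/2012-09-05/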
2012 Sep 14
1
rsync in daemon mode, no lock file generated
We are running rsync in daemon mode (::, two colons), and /etc/rsyncd.conf specifies a lock file:
    log file = /var/adm/rsyncd.log
    pid file = /var/run/rsyncd.pid
    lock file = /var/run/rsync.lock
But I do not see the lock file.
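One plausible explanation, hedged, based on the parameter's documented purpose: the daemon uses the lock file only to enforce max connections via record locking, so with max connections unset (0, unlimited) there may simply be nothing to lock. Setting a limit is a cheap way to test that reading:

    # /etc/rsyncd.conf -- give the daemon a reason to touch the lock file
    max connections = 4

If the file then shows up during an active transfer, its earlier absence was benign.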
2010 Jan 13
1
column width in .dbf files using write.dbf ... to be continued
Dear UseRs, I did not get any answer to my previous message ("Is there a way to define column widths "manually" when using the write.dbf function from the foreign library?"), so I tried to modify the write.dbf function to do what I want. Here is my modified version: write.dbfMODIF <- function (dataframe, file, factor2char = TRUE, max_nchar = 254, width = d) {
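Patching write.dbf is one route; a lighter workaround, since write.dbf sizes each character field from the longest string actually present, is to pad the column to the desired width before writing. A sketch, where df, name, and the width 20 are placeholders:

    library(foreign)
    # Left-justify and pad to 20 characters so the DBF field is created 20 wide
    df$name <- formatC(as.character(df$name), width = 20, flag = "-")
    write.dbf(df, "out.dbf")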
2011 May 16
1
reading multiple .dbf files
Hello, I am currently working on running random forest to make predictions. For that I have a bunch of .dbf files from shapefiles. Earlier I was running random forest on those dbf files individually, but now I have >1,000 such files and processing them one by one is not an option. I started by trying to read multiple dbf files as: > setwd(..) > a = list.files() > for (x in
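A loop over list.files() works; the more idiomatic R shape is list.files plus lapply. A minimal sketch assuming the working directory holds the .dbf files:

    library(foreign)
    files <- list.files(pattern = "\\.dbf$", ignore.case = TRUE)
    dbf_list <- lapply(files, read.dbf)   # one data frame per file
    names(dbf_list) <- files
    # If every file has the same columns, stack them for a single fit:
    # combined <- do.call(rbind, dbf_list)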
2013 May 22
1
column width in .dbf files using write.dbf ... to be continued
Hello Arnaud, You posted this question a long, long time ago; however, I found the answer, so I decided to post it anyway in case somebody else has the same problem you and I had. You were actually very close to finding your solution. The function DoWritedbf is an internal function of the foreign package. To access it outside the package, just do: foreign:::DoWritedbf So in your line:
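For readers who have not met the operator: ::: reaches objects a package does not export, which is why the plain name fails. A quick illustration, with the caveat that internal signatures can change between foreign releases:

    library(foreign)
    getAnywhere("DoWritedbf")   # shows where the unexported object lives
    foreign:::DoWritedbf        # ':::' accesses it directly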
2011 Aug 29
1
Problem exporting table with many columns to dbf
Hello, I'm a newbie in R and I have a problem exporting a table with many columns to a dbf file. I get an error when I open the resulting DBF file in other software, and also when importing it into R again. Here is an example snippet of the problem (on a GNU/Linux OS): http://pastebin.com/0SMJqqwb Is it a bug? Thank you, Nacho V
2009 Oct 08
2
Bringing dbf Data With SQL
I have a large dataset saved in dbf format. What I want is to bring that data into R with SQL statements, like: I want columns 1, 4, 5, and only rows where column 4 > 30. Sorry for asking here instead of searching the manuals further, but it seems that there are too many ways of doing it. So what's the appropriate package to work with, considering also that I'm dealing with many gigabytes, so the
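If the file fits in memory, foreign::read.dbf plus ordinary indexing already expresses that query without SQL (the file name is made up):

    library(foreign)
    d <- read.dbf("big.dbf")
    sel <- d[d[[4]] > 30, c(1, 4, 5)]   # columns 1, 4, 5 where column 4 > 30

For data that genuinely exceeds memory, an ODBC dBase driver via RODBC can push the WHERE clause outside R, at the cost of driver setup.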
2005 Aug 19
1
Summary: Unexpected result of read.dbf
Hi there, This is a summary of, and a patch for, a bug in read.dbf, demonstrated in Message-Id: <20050818150446.697835cb.stanimura-ngs at umin.ac.jp>. After consulting Rjpwiki, an online community of R users in Japan, the cause was found and a patch was proposed. Overflow occurs when read.dbf reads a dbf file that has a long signed integer field. For example, $
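The mechanics, for anyone hitting the same symptom: R's integer type is 32-bit, so a DBF numeric field wide enough for values beyond 2^31 - 1 cannot be read as integer and has to come in as double. The boundary in two lines:

    .Machine$integer.max   # 2147483647, the largest 32-bit R integer
    2147483647L + 1L       # NA, with an integer-overflow warning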
2006 May 23
1
exporting long character vectors to dbf
Hi - I need to export data to OpenOffice Base, where one of the elements is a long character vector (>255 characters). write.dbf exports it as varchar, truncating the data. Any idea how to do this? Thanks, -eduardo
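The truncation is a limit of the format rather than of write.dbf: plain DBF character fields cap out at 254 bytes (hence the max_nchar = 254 default visible in the signature quoted earlier in this listing), and the memo type for longer text needs a .dbt/.fpt sidecar that foreign does not produce. One workaround is to split the text across several 254-character fields; split254 below is a hypothetical helper:

    # Split one long string into 254-character pieces
    split254 <- function(x) {
      starts <- seq(1, nchar(x), by = 254)
      substring(x, starts, pmin(starts + 253, nchar(x)))
    }
    split254(paste(rep("a", 600), collapse = ""))   # chunks of 254, 254, and 92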
2002 May 10
2
RODBC for importing dbf
Hi, I know that it is very easy to import data from a dbf file into R, for instance by saving the data as csv. However, I have several hundred files, so I thought of using RODBC to read the dbf files and save them as data frames. However, I cannot even start (this is my first time using such a package): > library(RODBC) > bdades <- odbcConnect("prova.DBF") Warning
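The warning is expected: odbcConnect wants a registered DSN name, not a file path. On Windows, RODBC ships a convenience wrapper for dBase files; elsewhere, looping foreign::read.dbf avoids ODBC entirely. Both are hedged sketches, not drop-in fixes:

    # Windows + ODBC route: odbcConnectDbase takes a .dbf path; the table name is the file stem
    library(RODBC)
    ch <- odbcConnectDbase("prova.DBF")
    bdades <- sqlFetch(ch, "prova")
    close(ch)

    # Portable route: no ODBC, one data frame per file
    library(foreign)
    frames <- lapply(list.files(pattern = "\\.dbf$", ignore.case = TRUE), read.dbf)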
2011 Jan 31
1
Files dbf and cdx - net linux
Hello. I am trying to access a piece of software on a network, from a terminal. The software is a Windows program, so I use Wine. I can execute the software, but some errors appear. It seems that it couldn't create the indexes in the database (.dbf and .cdx files) correctly. I think the software was programmed in Visual FoxPro (but I am not sure). I will appreciate your comments and
2011 Aug 23
1
Re: Files dbf and cdx - net linux
Hi, I have these BMIMS .dbf/.cdx/.fpt files, but I no longer have BMIMS. I would like to convert these files to another readable format. I tried opening them with Monarch 4.0, but it does not give me a proper format. What do I do? Thanks.
2005 Oct 17
0
DBF database hangs when running from a Samba 3.0.14 share
Hi everyone, Since last week I have been trying to fix the following problem: I have a DOS application operating on DBF databases. I've set up a Samba 3.0.14a-2 server running as a PDC on Fedora Core 4 (in fact, Samba was part of the FC4 distribution). I've set up a share and moved all the data for this application onto that share. The problem is that the application hangs
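Multi-user DBF applications on Samba shares of that vintage were commonly tripped up by opportunistic locking. A frequently suggested direction, hedged since it may not be this poster's root cause, was to disable oplocks for the database files in smb.conf:

    [dbfshare]
        # Do not hand out oplocks on shared database files
        veto oplock files = /*.dbf/*.DBF/*.cdx/*.CDX/*.fpt/*.FPT/
        oplocks = no
        level2 oplocks = no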
2009 Apr 15
0
RE: Fox Pro DBF open problems Solved.....FOLLOW UP
Hi List, I wanted to share the solution that proved to fix this problem. On this particular server I observed we had consumed all memory resources and were using a decent-sized swap file. After installing more RAM, the problem and symptoms disappeared. FINALLY I have happy users. L. Kipp