similar to: Reading in 9.6GB .DAT File - OK with 64-bit R?

Displaying 20 results from an estimated 1000 matches similar to: "Reading in 9.6GB .DAT File - OK with 64-bit R?"

2012 Mar 10
1
Subsetting a data.frame -> Read in with FWF format from .DAT file
Hi there, I am having trouble subsetting a data frame by a condition on one column (of many). I read the file into R with "read.fwf", where I specified the column widths. The original data is a .DAT file. I then used the "names" function to assign column headings. For one column, PRVDR_NUM, I wish to reduce the entire data set to only the rows where PRVDR_NUM == 050108. This is
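A minimal R sketch of that kind of subset, with a hypothetical file name, widths and companion column names (only PRVDR_NUM comes from the post), reading PRVDR_NUM as character so the leading zero in "050108" survives:

    ## Hypothetical widths and extra columns; pass colClasses through to read.table
    ## so PRVDR_NUM stays character and the leading zero is kept.
    dat <- read.fwf("claims.dat",
                    widths     = c(6, 8, 10),
                    col.names  = c("PRVDR_NUM", "CLAIM_DT", "AMT"),
                    colClasses = c("character", "numeric", "numeric"))

    ## Keep only the rows for one provider; compare as character, not numeric.
    sub_dat <- dat[dat$PRVDR_NUM == "050108", ]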
2012 Mar 07
1
Convert Numeric (20090101) to Date
Hi there, Is there a way for R to convert a numeric date (20090101) to a "proper" date format? (Ideally dd-mm-yyyy.) The original data (in this case) is in .DAT format. I read the multi-column data with the read.fwf function, where I specified the column width for the eight-digit date (example above). After the .DAT data is read in & formatted in R, it is to be exported to Excel.
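One common approach (a sketch, not tied to the original data): convert the number to character, parse it with as.Date(), and then format it as dd-mm-yyyy for output:

    x <- 20090101                          # numeric yyyymmdd
    d <- as.Date(as.character(x), "%Y%m%d")
    format(d, "%d-%m-%Y")                  # "01-01-2009" (dd-mm-yyyy, as character)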
2012 Apr 26
2
Merge function - Return NON matches
Hi there, I wish to merge on a common variable between a list and a data.frame & return the rows of the data.frame where there is NO match. Here are some details: The list, where the variable/col.name = CLAIM_NO CLAIM_NO 20 83 1440 4439 7002 ... > dim(hrc78_clm_no) [1] 6678 1 The data.frame, where there exists a variable with the same name, CLAIM_NO. > dim(bestPartAreadmin) [1] 13068
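Two hedged sketches of returning the non-matching rows, using the object names from the post (hrc78_clm_no and bestPartAreadmin) and assuming both really do carry a CLAIM_NO column:

    ## Simple anti-join with %in%: rows whose CLAIM_NO is absent from the other object.
    no_match <- bestPartAreadmin[!(bestPartAreadmin$CLAIM_NO %in% hrc78_clm_no$CLAIM_NO), ]

    ## The same idea via merge(): keep all data.frame rows, flag matches with a
    ## hypothetical indicator column, and keep the rows where the flag stayed NA.
    hrc <- data.frame(CLAIM_NO = hrc78_clm_no$CLAIM_NO, matched = TRUE)
    m   <- merge(bestPartAreadmin, hrc, by = "CLAIM_NO", all.x = TRUE)
    no_match2 <- m[is.na(m$matched), ]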
2003 Jul 17
1
2 GB Limit when writing to smbfs filesystems
I'm running RedHat 8.0 with samba-2.2.7-5.8.0 (installed from the RedHat distribution). When I use cpio to write a backup (> 2GB) to a smbfs filesystem, I get the error "File size limit exceeded". I get the same error when I copy (cp) a file (> 2GB) from a Linux ext3 filesystem to the smbfs filesystem. The smbfs filesystem is mounted from a Windows 2000 Professional workstation. After
2009 Jan 28
2
ZFS+NFS+refquota: full filesystems still return EDQUOT for unlink()
We have been using ZFS for user home directories for a good while now. When we discovered the problem with full filesystems not allowing deletes over NFS, we became very anxious to fix this; our users fill their quotas on a fairly regular basis, so it's important that they have a simple recourse to fix this (e.g., rm). I played around with this on my OpenSolaris box at home, read around
2003 Dec 02
1
rdiff
Is there any chance for rdiff? I need to frequently synchronize a big text file (60MB+) that undergoes small changes, and I am interested in the differences between subsequent versions [DNS RBL data in dnsbl format, 1E6+ lines of text, new version every 20m, on average 50 new entries (lines) in every synchronization]. I would like to get a (small) diff file as the result of the rsync session and apply it to
2012 Oct 20
2
can't find the error in if function... maybe i'm blind?
Hi everybody, the following always gives me the error "Fehler in if (File$X.Frame.Number[a] + 1 == File$X.Frame.Number[a + 1]) (File$FishNr[a] <- File$FishNr[a - : Fehlender Wert, wo TRUE/FALSE nötig ist" (error in if (...): missing value where TRUE/FALSE needed). Maybe it's stupid, but I'm not getting why... Maybe someone can help me. Thanks a lot! for (i in unique(BigFile$TrackAll)) { File <-
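That error is what if() raises when its condition evaluates to NA, which here happens as soon as a + 1 runs past the last row of File. A hedged sketch of one guard; the loop body is reconstructed only loosely because the original line is truncated:

    ## Stop one row early and skip NA comparisons; the right-hand side of the
    ## assignment is hypothetical -- the line in the original post is cut off.
    for (a in seq_len(nrow(File) - 1)) {
      nxt <- File$X.Frame.Number[a + 1]
      if (!is.na(nxt) && File$X.Frame.Number[a] + 1 == nxt) {
        File$FishNr[a] <- File$FishNr[a + 1]
      }
    }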
2016 Oct 26
3
NFS help
On Tue, Oct 25, 2016 at 12:48 PM, Matt Garman <matthew.garman at gmail.com> wrote: > On Mon, Oct 24, 2016 at 6:09 PM, Larry Martell <larry.martell at gmail.com> wrote: >> The machines are on a local network. I access them with putty from a >> windows machine, but I have to be at the site to do that. > > So that means when you are offsite there is no way to access
2007 Nov 27
1
Syncing to multiple servers
Hello everyone, Let's say we have 3 servers, 2 of them have the latest (stable) version of rsyncd running (2.6.9) <Server1> ==> I N T E R N E T ==> <Server2 (rsyncd running)> ==> LAN ==> <Server3 (rsyncd running)> Suppose I want to send a big file (bigfile.big) from Server1 to both Server2 and Server3. It would be a good idea to send first from Server1
2005 Jul 06
2
OpenSSH and non-blocking mode
Dear OpenSSH developers, OpenSSH setting non-blocking mode on its standard files creates serious problems. Setting non-blocking mode violates many of the semantics of how files are supposed to behave and most programs (and most, if not all, stdio libraries) are not prepared to deal with it. That wouldn't be a problem except that non-blocking mode is not a property of the file descriptor but
2007 Mar 02
1
--delete --force Won't Remove Directories With Dotnames
--delete --force Won't Remove Directories With Dotnames rsync 2.6.9 Me, personally, I reckon this to be an irritant ... but perhaps (and having thought about this a bit I decided there's a good chance) this is an intentional and useful behaviour. But it's a nuisance if you call your --partial-dir .partial, as I happen to do, since now if you remove a directory which was aborted in
2015 Sep 11
2
Cannot open: No space left on device
On Fri, Sep 11, 2015 at 3:19 PM, Dario Lesca <d.lesca at solinos.it> wrote: > the result. # du -sc /* /.??* --exclude /proc|sort -n 0 /.autofsck 0 /.autorelabel 0 /misc 0 /net 0 /sys 4 /cgroup 4 /media 4 /mnt 4 /selinux 4 /srv 8 /opt 16 /home 16 /lost+found 16 /tmp 112 /root 188 /dev 7956 /bin
2009 Apr 22
2
purge-empty-dirs and max-file-size confusion
I want to use --min-size to copy just large files (and their necessary parent directories), but everything I've tried copies *all* the source directories, and creates them empty on the destination even if they don't have any big files in them. I only want the minimal directory hierarchies that contain the big files. This doesn't work: $ rm -rf /tmp/foo $ rsync -ai --min-size
2002 Mar 27
2
Linux 2.4.18 on RH 7.2 - odd failures
Hi there, I'm using RH7.2 (with the 2.4.9-30 kernel and its required components) as a base for a server system running kernel 2.4.18. I've gone to this version to get around non-performing aic7xxx drivers in the stock 7.2 kernels, and to get updated gigabit ethernet drivers. I have a raid unit (Medea) attached to an Adaptec 3916, coming up as sdb. It has 2kb blocks, but the fault
2002 May 03
3
skipping columns with read.fwf?
I have a file in fwf. It is rather large, about 40,000 rows and 40 variables (columns). I only need about 10 variables from the data set for the analysis at hand. Unfortunately, these 10 variables are not contiguous in the file; for example, the first is position 1-8, the next position 25-27, then 40. Is there a way to read the selected variables that I need without reading in the entire data
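read.fwf() lets you skip fields by giving negative entries in widths (skip that many characters), so only the wanted columns are ever read. A small sketch using the positions quoted above, with the file name and variable names made up:

    ## Read cols 1-8, skip 9-24, read 25-27, skip 28-39, read position 40.
    dat <- read.fwf("bigfile.dat",
                    widths    = c(8, -16, 3, -12, 1),
                    col.names = c("var1", "var2", "var3"))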
2012 Sep 14
1
Any way to get read.table.ffdf() (in the ff package) to pass colClasses or comment.char parameters through to read.fwf() ?
Hi everyone, my apologies if I'm overlooking something obvious in the documentation. I'm relatively inexperienced with the (awesome) ff package. My goal is to use the read.table.ffdf() function to call the read.fwf() function and pass through the colClasses and comment.char arguments. The code below shows exactly what doesn't work for me. If the colClasses and comment.char
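One pattern sometimes used (sketched here as an assumption, not as a confirmed answer to the pass-through question) is to point read.table.ffdf() at read.fwf via its FUN argument and let the extra arguments ride along with "...":

    library(ff)

    ## Sketch only: read.table.ffdf() calls FUN chunk-wise and forwards "...",
    ## so widths, colClasses and comment.char should end up in read.fwf();
    ## whether they actually survive the pass-through is the original question.
    dat <- read.table.ffdf(file = "data.fwf",
                           FUN  = "read.fwf",
                           widths       = c(8, 3, 1),
                           colClasses   = c("character", "integer", "factor"),
                           comment.char = "")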
2006 Oct 30
4
read.fwf and header
Hi! I have data (also in attached file) in the following form: num1 num2 num3 int1 fac1 fac2 cha1 cha2 Date POSIXt 1 1 f q 1900-01-01 1900-01-01 01:01:01 2 1.0 1316666.5 2 a g r z 1900-01-01 01:01:01 3 1.5 1188830.5 3 b h s y 1900-01-01 1900-01-01 01:01:01 4 2.0 1271846.3 4 c i t x 1900-01-01 1900-01-01 01:01:01 5 2.5 829737.4 d j u w 1900-01-01 6 3.0
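read.fwf() with header = TRUE expects the header line to be delimited by sep (a tab by default), which a space-separated header like the one above is not, so one hedged workaround is to read the names yourself and pass them as col.names:

    ## Pull the header names off the first line, then read the fixed-width body.
    hdr <- scan("data.txt", what = character(), nlines = 1, quiet = TRUE)
    dat <- read.fwf("data.txt",
                    widths = c(2, 5, 12, 5, 5, 5, 5, 5, 11, 20),  # hypothetical widths
                    skip = 1, col.names = hdr)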
1999 Oct 10
1
Using metric scaling
I want to enter a symmetric matrix containing distances for use in the cmdscale() metric scaling function. The matrix currently sits on a file in lower triangular form looking like this: 1 AWANUI RIVER .000 2 BLENHEIM .510 .000 3 COLLINGWOOD .510 .109 .000 4 FOXTON .510 .141 .141 .000 5 GISBORNE .549 .549 .549
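A hedged sketch of getting such a lower-triangular file into cmdscale(), assuming the distances (without the leading row indices and place names) have already been read into a hypothetical numeric vector lower, in row order:

    ## Filling the upper triangle column-wise matches the lower triangle read
    ## row by row; mirror it to get the symmetric matrix, then run classical MDS.
    labs <- c("AWANUI RIVER", "BLENHEIM", "COLLINGWOOD", "FOXTON", "GISBORNE")  # first 5 sites shown
    n    <- length(labs)
    m    <- matrix(0, n, n, dimnames = list(labs, labs))
    m[upper.tri(m, diag = TRUE)] <- lower      # 'lower' is a hypothetical numeric vector
    m    <- m + t(m) - diag(diag(m))           # symmetrize; diagonal was counted twice
    xy   <- cmdscale(as.dist(m), k = 2)        # 2-D metric scaling coordinates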
2005 Oct 20
4
read.fwf doesn't work with header = TRUE (PR#8226)
Full_Name: Emmanuel Paradis Version: 2.1.1 OS: Linux Submission from: (NULL) (193.49.41.105) read.fwf(..., header = TRUE) does not work properly since: 1/ the original header is printed on the console and not in FILE; 2/ the different 'parts' of the header should be separated with tabs to work with the call to read.table. Here is a suggested fix for src/library/utils/R/read.fwf.R: