similar to: Reading the data from specific columns

Displaying 20 results from an estimated 9000 matches similar to: "Reading the data from specific columns"

2008 May 01
2
Error while making R package
Hi All, I am trying to make an R package using R 2.6.2, and I am getting the following error when I run R CMD check t1\ ---------- Making package t1 ------------ adding build stamp to DESCRIPTION making DLL ... making CGHseg_rewrite.d from CGHseg_rewrite.c making rowMedians.d from rowMedians.c making runavg.d from runavg.c gcc-sjlj -std=gnu99 -Ic:/R/R-2.6.2/include -O3 -Wall -c
2008 Apr 30
3
How to stop buffering of "cat"
Hi All, My R code takes a very long time to finish processing, and I want to see what stage the script has reached. So I wrote some progress messages using cat, but instead of being displayed at the different stages they are buffered and only appear at the end, once the entire processing is done. Can you please suggest how to stop this buffering, or some alternative way to display messages? Thank
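A common workaround, sketched below with hypothetical stage names: message() writes to stderr, which is not buffered, and flush.console() pushes any pending console output (it matters mainly in the Windows GUI).

    for (stage in c("load", "fit", "plot")) {   # hypothetical stage names
      cat("starting stage:", stage, "\n")
      flush.console()                  # flush pending console output (Windows GUI)
      message("finished stage: ", stage)  # stderr is written immediately
    }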
2008 Apr 04
1
Cannot run R from command prompt
Hello, I have installed R 2.6 on Windows 2000, and now when I type R at the command prompt it shows "'R' is not recognized as an internal or external command, operable program or batch file." If I change to C:\R\R-2.6.2\bin and then type R, it runs fine. I have changed my environment variable Path to C:\R\R-2.6.2\bin; but it is still not working. Can you please give some
2008 Apr 29
1
c code working in linux and hanging in windows
Hi All, I am calling some C code from R. It successfully builds the .dll and .so files. When I run the .so on Linux it works perfectly, but it hangs on Windows: the DLL loads, but the call never returns from the C function. Can you please suggest the possible cause of this? Thank you, vidhu
2008 Jul 16
1
RSQLite maximum table size
Hi All, I am trying to write a table with RSQLite, and I get the error mentioned below: mat<-as.data.frame(matrix(rnorm(n=244000000),nrow=244000,ncol=1000)) > dbWriteTable(con, "array", mat) [1] FALSE Warning message: In value[[3]](cond) : RS-DBI driver: (error in statement: too many SQL variables) Can someone please tell me what is the maximum size of a table (max number of
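SQLite caps both the columns and the bound variables allowed in a single statement (historically 999 variables), so a 1000-column insert fails. One workaround, sketched here with a small in-memory database, is to reshape the matrix to long format so each inserted row binds only three variables:

    library(DBI)
    library(RSQLite)
    con <- dbConnect(SQLite(), ":memory:")
    mat <- as.data.frame(matrix(rnorm(50), nrow = 5, ncol = 10))
    # wide -> long: one (row, col, value) record per cell
    long <- data.frame(row   = rep(seq_len(nrow(mat)), times = ncol(mat)),
                       col   = rep(seq_len(ncol(mat)), each  = nrow(mat)),
                       value = unlist(mat, use.names = FALSE))
    dbWriteTable(con, "array_long", long)
    dbDisconnect(con)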
2008 Apr 09
1
read table not reading lines containing single quotes
Hi, I am using the read.table command as follows: kegg<-read.table("c:/IDs.tab",header =TRUE,quote= "'", sep="\t") A fragment of the file is as follows: ID Pathway 04916 Melanogenesis 04920 Adipocytokine signaling pathway 04930 Type II diabetes mellitus 04940 Type I diabetes mellitus 04950 Maturity onset diabetes of the young 05010
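The quote= "'" argument is the likely culprit: an apostrophe inside a pathway name opens a quoted string that swallows the following lines. A minimal sketch of the usual fix, disabling quoting entirely (same file path as above):

    kegg <- read.table("c:/IDs.tab", header = TRUE, sep = "\t",
                       quote = "", comment.char = "")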
2008 Apr 25
1
making a dll for the windows environment for R
Hi All, I have a .c file and I need to make a DLL so that it can be called from R. I have successfully made a .so from the .c file and it works great, but I have no clue how to make a DLL. I tried googling for help, but without much use. Please suggest something. Regards, Vidhu
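For reference, the same tool that produced the .so builds the DLL on Windows once Rtools is installed; a minimal sketch, using a hypothetical source file name:

    system("R CMD SHLIB runavg.c")   # hypothetical .c file; emits runavg.dll on Windows
    dyn.load(paste0("runavg", .Platform$dynlib.ext))   # ".dll" or ".so" per platform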
2009 Sep 23
1
read.delim very slow in reading files with lots of columns
Hi, I am trying to read a tab-delimited file into R (version 2.8). The machine I am using is 64-bit Linux with 16 GB. The file is basically a matrix (~600x700000) and as large as 3 GB. read.delim() ran extremely slowly (hours), even on a subset of the file (31 MB, 6x700000). I monitored the memory usage and found it constantly took less than 1% of the 16 GB of memory. Does read.delim()
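Much of read.delim()'s time goes into guessing a type for each of the ~700000 columns. A sketch of two common mitigations, assuming a hypothetical file name and an all-numeric matrix: declare colClasses up front, or bypass data frames entirely with scan():

    x <- read.delim("big.tab", header = FALSE, colClasses = "numeric")  # skip type inference
    v <- scan("big.tab", what = numeric(), sep = "\t")                  # or: flat vector
    m <- matrix(v, nrow = 600, byrow = TRUE)                            # then reshape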
2012 Feb 08
2
Problems reading tab-delim files using read.table and read.delim
Hello, I used read.xlsx to read in Excel files, but for large files it turned out to be not very efficient. For that reason I use a programme which writes each sheet of an Excel file into tab-delimited txt files. I then tried using read.table and read.delim to read in those txt files. Unfortunately, the results are not as expected. To show you what I mean, I created a tiny Excel sheet with some
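With sheets exported to text, the usual mismatches come from read.table's defaults: quote and # comment characters inside cell text, plus column-name mangling. A hedged sketch with a hypothetical file name:

    d <- read.delim("sheet1.txt", quote = "", comment.char = "",
                    check.names = FALSE, stringsAsFactors = FALSE)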
2008 May 30
1
Skipping columns to save memory
I have a very large tab-delimited file (~1.97 GB) that I need to read into R. The data contain 10 columns and millions of rows. I need all rows of the data, but only the first column. I was looking at ?read.delim and am trying to see whether it is possible to tell this function to read in only the first column and skip the others. The help file says the number of
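colClasses is the relevant argument: a "NULL" entry tells read.delim to skip that column without ever storing it. A minimal sketch, assuming the first column is numeric (use "character" otherwise):

    first_col <- read.delim("big_file.tab", header = FALSE,
                            colClasses = c("numeric", rep("NULL", 9)))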
2012 Nov 17
1
Strange problem with reading a pipe delimited file
I am trying to read in a pipe-delimited file that has rows with varying numbers of columns. Here is my sample data: A|B|C|D A|B|C|D|E|F A|B|C|D|E A|B|C|D|E|F|G|H|I A|B|C|D A|B|C|D|E|F|G|H|I|J You can see line 6 has 10 columns. Yet I can't explain why R behaves like so: > test <- read.delim("mypaths4.txt", sep="|", quote=NULL, header=F,
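read.table guesses the column count from the first five lines only, so the 10-column line 6 gets wrapped rather than widening the table. A sketch of the usual fix: count the widest row with count.fields() and pass explicit col.names:

    n <- max(count.fields("mypaths4.txt", sep = "|"))
    test <- read.delim("mypaths4.txt", sep = "|", header = FALSE,
                       col.names = paste0("V", seq_len(n)))  # pads short rows with NA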
2012 Sep 26
3
Reading multiple files
Hi, I have 35 data files to read, and I would like a program that reads all 35 files at once. They are all of the form Dados1.raw, Dados2.raw, and so on. If the files have the same number of columns, I can read them with the following commands: rm(list=ls()) filenames = list.files(path="~/Silvano/Arq", pattern="Dados+.*raw") names = substr(filenames, 1, 7) for(i in
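A compact way to finish this off is lapply() over the file list, which yields one data frame per file; header = TRUE here is an assumption about the .raw files:

    filenames <- list.files(path = "~/Silvano/Arq", pattern = "Dados.*\\.raw$",
                            full.names = TRUE)
    dados <- lapply(filenames, read.table, header = TRUE)  # list of 35 data frames
    names(dados) <- basename(filenames)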
2006 Sep 13
2
recursive methods for concatenating sets of files
Hello, I would like to read sets of files within a folder, perhaps using recursive methods. Right now I rename the files before import. It would be even better to do this without renaming files and without providing explicit filenames, perhaps by importing files based on chronology and translating each filename into a header. Please excuse my ignorance, and help cure my clunky programming
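A sketch of one way to get all three wishes (no renaming, no explicit names, chronological order) with list.files(recursive = TRUE); the folder name is hypothetical, and the filename is carried along as a column rather than a header:

    files <- list.files("top_folder", recursive = TRUE, full.names = TRUE)
    files <- files[order(file.mtime(files))]      # import in chronological order
    combined <- do.call(rbind, lapply(files, function(f) {
      d <- read.delim(f)
      d$source <- basename(f)                     # keep the filename with the data
      d
    }))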
2014 Jul 01
1
combining data from multiple read.delim() invocations.
Is there a better way to do the following? I have data in a number of tab-delimited files. I am using read.delim() to read them in a loop. I am invoking my code on Linux Fedora 20, from the BASH command line, using Rscript. The code I'm using looks like: arguments <- commandArgs(trailingOnly=TRUE); # initialize the capped_data data.frame capped_data <- data.frame(lpar="NULL",
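Assuming the loop grows capped_data with rbind() on every iteration (the snippet is truncated), the usual improvement is to read everything first and bind once, since repeated rbind() recopies the accumulated frame each time:

    arguments <- commandArgs(trailingOnly = TRUE)            # one file per argument
    capped_data <- do.call(rbind, lapply(arguments, read.delim))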
2010 Oct 06
3
Help troubleshooting silent failure reading huge file with read.delim
I am trying to read a tab-delimited 1.25 GB file of 4,115,119 records, each with 52 fields. I am using R 2.11.0 on a 64-bit Windows 7 machine with 8 GB of memory. I have tried the following two statements, with the same results: d <- read.delim(filename, as.is=TRUE) d <- read.delim(filename, as.is=TRUE, nrows=4200000) I have tried starting R with this parameter but that changed
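A frequent cause of silent short reads is an unbalanced quote or a # in the data, both of which read.delim treats specially by default. A hedged sketch: disable both, and take colClasses from a small sample so the full pass is cheap and predictable:

    cls <- sapply(read.delim(filename, nrows = 1000, as.is = TRUE,
                             quote = "", comment.char = ""), class)
    d <- read.delim(filename, as.is = TRUE, quote = "", comment.char = "",
                    colClasses = cls, nrows = 4200000)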
2009 Sep 18
1
Reading clipboard with read.delim("clipboard") crash (PR#13957)
Full_Name: Liam Gretton Version: 2.9.2 OS: openSUSE 11.1 (x86_64) Submission from: (NULL) (143.210.13.77) Reading a large number of rows of delimited data via the clipboard results in a segfault or double free error. I've tested copying from various applications, but gedit will do. This problem exists in the openSUSE-supplied 2.8.1, I've just built 2.9.2 to see if it's still there,
2004 Aug 09
5
How to import specific column(s) using "read.table"?
Dear R people, I have a very big tab-delimited txt file with a header, and I only want to import several columns into R. I checked the options for "read.table" and only found "nrows", which lets you specify the maximum number of rows to read in. Although I can use a text editor (e.g., WordPad) to edit the txt file before running R, I feel it's not very convenient. The
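The trick is colClasses rather than nrows: mark unwanted columns "NULL" and they are never imported. A sketch that selects columns by name (the file and column names here are hypothetical); NA lets R infer the class of the kept columns:

    hdr  <- read.delim("big.txt", nrows = 1)              # header row only
    keep <- c("ID", "Pathway")                            # hypothetical column names
    cls  <- ifelse(names(hdr) %in% keep, NA, "NULL")
    dat  <- read.delim("big.txt", colClasses = cls)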
2008 Jan 16
1
Error in plot.new() : figure margins too large
Hi, I am getting the error: Error in plot.new() : figure margins too large Can you please help. -----code is ------------------------------------------------------------ temp <- seq(5500000/2, 22000000, 5500000/2) sV2 <- c(1, 1+temp[-length(temp)]) eV2 <- temp i=1 bitmap(paste("c:/vidhu/poster/poster_56half_2", LETTERS[i], ".png", sep = ""), "png256",
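The error usually means the device is too small for the requested margins. A minimal sketch with a hypothetical output path: enlarge the device and/or shrink the margins before plotting (note bitmap() needs Ghostscript; png() is a simpler choice on Windows):

    bitmap("c:/vidhu/poster/test.png", "png256",
           width = 10, height = 8, res = 144)   # a larger device
    par(mar = c(4, 4, 2, 1))                    # and/or smaller margins
    plot(1:10)
    dev.off()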
2024 Sep 07
1
Reading a txt file from internet
On 2024-09-07 4:52 p.m., Jeff Newmiller via R-help wrote: > When you specify LE in the encoding type, you are logically telling the decoder that you know the two-byte pairs are in little-endian order... which could override whatever the byte-order-mark was indicating. If the BOM indicated big-endian then the file decoding would break. If there is a BOM, don't override it unless you have to
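In R's connection interface that advice translates to passing "UTF-16" without an LE/BE suffix, so the decoder consults the byte-order mark itself; a sketch with a hypothetical file name:

    con <- file("data_utf16.txt", encoding = "UTF-16")  # BOM decides endianness
    txt <- readLines(con)
    close(con)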
2002 Apr 22
2
skipping specific rows in read.table
Hi, We are considering organizing some of our ASCII files with multiple "column names", like so: a.long.but.complete.name a.different.complex.name short.name.1 short.name.2 1 7 2 8 3 9 [more data] The basic idea is that we want to keep, in one location, both a long descriptive name of each variable (in row 1) and a short convenient name (in row 2). I could imagine keeping other
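One workable pattern for such files: pull the long names off line 1 with scan(), then let read.table treat line 2 as the real header; the file name is hypothetical:

    long_names <- scan("two_headers.txt", what = "", nlines = 1)
    dat <- read.table("two_headers.txt", skip = 1, header = TRUE)
    attr(dat, "long.names") <- long_names   # keep the descriptions alongside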