Displaying 20 results from an estimated 6000 matches similar to: "how to read CSV file in R?"
2010 Jun 03
5
import text file into R
can anyone tell me how to import a text file in R? the text file I want to
import is a large file, about 800MB in size. Thanks in advance.
I tried using the following
data<-read.table("file",header=T,sep="\t")
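For a file of this size, giving read.table the column types up front, or switching to data.table::fread, usually helps a lot. A minimal sketch, assuming a tab-delimited file called "file" with a header row as in the call above (the colClasses shown are placeholders for the real column types):
# Option 1: base R; declaring colClasses avoids expensive type guessing
data <- read.table("file", header = TRUE, sep = "\t",
                   colClasses = c("character", "numeric", "numeric"),
                   quote = "", comment.char = "")
# Option 2: data.table::fread is typically much faster on files this large
library(data.table)
data <- fread("file", sep = "\t", header = TRUE)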
2009 Nov 10
3
Error: cannot allocate vector of size...
I'm trying to import a table into R the file is about 700MB. Here's my first
try:
> DD<-read.table("01uklicsam-20070301.dat",header=TRUE)
Error: cannot allocate vector of size 15.6 Mb
In addition: Warning messages:
1: In scan(file, what, nmax, sep, dec, quote, skip, nlines, na.strings, :
Reached total allocation of 1535Mb: see help(memory.size)
2: In scan(file, what,
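The 1535Mb ceiling suggests a 32-bit R session, so the real fix is 64-bit R with more RAM; short of that, declaring colClasses and an upper bound on the row count keeps read.table from over-allocating. A sketch only; the classes and row count below are placeholders, not values from the original post:
# nrows is a generous upper bound; colClasses must match the file's real columns
DD <- read.table("01uklicsam-20070301.dat", header = TRUE,
                 nrows = 2000000,
                 colClasses = c("integer", "character", "numeric"),
                 comment.char = "")
# If it still does not fit, read the file in chunks with skip= and nrows=,
# or use the ff or data.table packages.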
2012 Feb 08
2
Problems reading tab-delim files using read.table and read.delim
Hello,
I used read.xlsx to read in Excel files, but for large files it turned out
not to be very efficient.
For that reason I use a programme which writes each sheet in an Excel file
into tab-delim txt files.
After that I tried using read.table and read.delim to read in those txt
files. Unfortunately, the results
are not as expected. To show you what I mean I created a tiny Excel sheet
with some
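A common cause of surprises with tab-delimited Excel exports is stray quote or '#' characters inside fields; turning both off is a cheap first check. A sketch, with "sheet1.txt" standing in for one of the exported files:
x <- read.delim("sheet1.txt", header = TRUE, quote = "",
                comment.char = "", stringsAsFactors = FALSE)
dim(x)   # verify the expected number of rows and columns came through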
2010 May 03
2
advice?
All-
Thank you in advance for any help you might be able to lend. Here is
my issue. I am trying to open a fairly large .dat file. The file
originally was downloaded as a GZ file but I unzipped it (with 7-zip) into
its current 1.86 GB .dat format. I know that the data is "just a plain
ASCII file with 720 columns and 360 rows per time step (month). It should be
readable by
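Given a plain ASCII grid of 720 columns by 360 rows per monthly time step, one way to read it is to scan the numbers and fold them into an array. A sketch under assumptions: "data.dat" and n_months are placeholders for the real file name and month count.
n_months <- 12                                   # placeholder
vals <- scan("data.dat", what = numeric())       # one long numeric vector
grid <- array(vals, dim = c(720, 360, n_months)) # columns, rows, time steps
# grid[c, r, m] is then column c, row r of month m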
2013 Jul 19
4
Error read.csv
Dear all,
I have a CSV file with 1,200,000 records separated by ";" and when I try
to open it I get the following error:
form<-read.csv("Usr1.csv" , sep=';' , na.strings = "NA", header=T)
Warning messages:
In scan(file, what, nmax, sep, dec, quote, skip, nlines, na.strings, :
invalid input found on input connection
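"invalid input found on input connection" usually points at a character-encoding mismatch rather than a broken record, so declaring the encoding explicitly often fixes it. A sketch: "latin1" is an assumption, try "UTF-8" if the file was exported that way instead.
form <- read.csv("Usr1.csv", sep = ";", na.strings = "NA", header = TRUE,
                 fileEncoding = "latin1")  # or fileEncoding = "UTF-8"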
2013 Jul 20
1
Error read.csv
Following Velez Jorge's suggestion I used read.delim2; it reads the file and
creates the object, but when I try to display it I get the error:
> form<-read.delim2("ASSEUsr1.csv", header=T, sep=";")
> form
Error: C stack usage is too close to the limit
I forgot to say that I run R with RStudio on Ubuntu 13.04
Thanks
On 19 July 2013 at 15:36, Carlos Ortega
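Printing a 1.2-million-row data frame by typing its name is rarely what is wanted; inspecting its structure and a few rows avoids the problem entirely. A minimal sketch:
str(form)       # column types and a short preview of each column
head(form, 20)  # first 20 rows only
dim(form)       # number of rows and columns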
2010 Jun 18
5
extract date time from a text file
I have a text file where every line looks like this:
"2007-12-03 13:50:17 Juan Perez"
("yy-mm-dd hh:mm:ss First Name Second Name")
I would like to make a data frame with two column one for date and the
other one for name.
When I used read.delim it was turned into a data frame with 4 columns.
Bye,
Sebastián.
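Since the first 19 characters of every line are the timestamp, one way is to read whole lines and split them by position; a sketch, with "log.txt" standing in for the real file:
lines <- readLines("log.txt")
datetime <- as.POSIXct(substr(lines, 1, 19), format = "%Y-%m-%d %H:%M:%S")
name <- trimws(substring(lines, 21))
df <- data.frame(datetime = datetime, name = name)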
2010 Jun 21
2
Return value associated with a factor
I am using the code below to extract census tract information.
save.tract$state, save.tract$county and save.tract$tract are returned as
factors. In the last three statements, I need to save the actual value of
the factor, but, instead, the code is yielding the position of the factor.
How do I instead return the value of the factor?
By way of example, for Lon=-82.49574 and Lat=29.71495, the code
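The usual culprit is as.numeric() on a factor, which returns the internal level codes rather than the labels; converting to character first returns the values. A minimal sketch with made-up values:
f <- factor(c("12086", "12011", "12086"))
as.numeric(f)                # 2 1 2  -- level positions, not values
as.character(f)              # "12086" "12011" "12086" -- the labels
as.numeric(as.character(f))  # 12086 12011 12086 -- numeric values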
2010 Jun 11
1
ff package when reading .csv files
Hi
My aim is to read a large .csv file into R. I ran the following code and am
using R version 10.1 on Windows.
>library(ff)
> read.csv.ffdf(x = NULL, "file.csv", fileEncoding = "", nrows = -1,
                first.rows = NULL, next.rows = NULL, levels = NULL,
                appendLevels = TRUE, FUN = "read.table", transFUN = NULL,
                asffdf_args = list(), BATCHBYTES = getOption("ffbatchbytes"),
                VERBOSE = FALSE)
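Most of those arguments can be left at their defaults; a minimal sketch of the same read (first.rows and next.rows just control the chunk sizes ff reads at a time):
library(ff)
x <- read.csv.ffdf(file = "file.csv", header = TRUE,
                   first.rows = 10000, next.rows = 50000)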
2012 May 18
1
UTF-16 input and read.delim/scan
Hi all,
I am running 64-bit R 2.15.0 on windows 7. I am trying to use read.delim
to read from a file that has 2-byte unicode (CJK) characters.
Here is an example of the data (it is tab-delimited, in case the formatting gets mangled):
HITId HITTypeId Title
2Q69Z6KW4ZMAGKKFRT6Q4ONO6MJF68 2LVJ1LY58B72OP36GNBHH16YF7RS7Z 看看句子,写写想法
请看以下的句子,再回答问
So read.delim (code below) doesn't read it in correctly. It reads
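read.delim can be told about the encoding either via fileEncoding= or by handing it a connection; a sketch, assuming little-endian UTF-16 (what Windows tools usually write) and a placeholder file name "hits.txt":
x <- read.delim("hits.txt", fileEncoding = "UTF-16LE",
                stringsAsFactors = FALSE)
# Or, equivalently, with an explicit connection:
con <- file("hits.txt", encoding = "UTF-16LE")
x <- read.delim(con, stringsAsFactors = FALSE)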
2007 Nov 15
0
Error with read.delim & read.csv
Hi -
I'm reading in a tab delimited file that is causing issues with
read.delim. Specifically, for certain lines the last entry of the line is
misread and treated as the first entry of a new row (which is then padded
with NAs). For example:
tmp <- read.delim( "trouble.txt", header=F )
produces a data.frame, tmp where if I call tmp[,1],
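That symptom is typically caused by an unmatched quote or a '#' character inside a field; disabling both, and checking the per-line field counts, usually locates the trouble. A sketch:
tmp <- read.delim("trouble.txt", header = FALSE, quote = "", comment.char = "")
# Lines with an unexpected number of fields are the ones to inspect:
table(count.fields("trouble.txt", sep = "\t", quote = ""))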
2010 Jul 08
2
random sample from arrays
Hello R users,
I'm trying to extract random samples from a big array I have.
I have a data frame of over 40k rows and would like to produce around 50
random samples of around 200 rows each from it.
this is the matrix
ID xxx_1c xxx__2c xxx__3c xxx__4c xxx__5T xxx__6T xxx__7T xxx__8T
yyy_1c yyy_1c _2c
1 A_512 2.150295 2.681759 2.177138 2.142790 2.115344 2.013047
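Drawing 50 independent samples of about 200 rows each can be done by sampling row indices inside lapply; a minimal sketch, assuming the data frame is called dat:
set.seed(1)  # for reproducibility
samples <- lapply(1:50, function(i) dat[sample(nrow(dat), 200), ])
# samples[[1]] is the first 200-row sample, samples[[2]] the second, and so on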
2010 May 24
2
import data from a csv file
Hi all,
I have some trouble reading data from a csv file.
I used the command read.delim("clipboard") to read in the data.
> aalpha.data <- read.delim("clipboard")
> class(aalpha.data)
[1] "data.frame"
> dim(aalpha.data)
[1] 8 25
> colnames(aalpha.data)
[1] "X" "V1" "V2" "V3" "V4"
2013 Nov 18
1
Reading in csv data with ff package
I've spent some time trying to wrap my head around reading in large csv
files with the ff-package. I think I know how to do it, but am bumping
into some problems. I've tried to recreate the issues as best as I can
with a smaller example and maybe someone can help explain the problems.
The following code just creates a csv file with an integer column,
character column and logical column.
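A sketch of the kind of test file the post describes, then read back with read.csv.ffdf; note that ff has no plain character vectors, so the character column must come in as a factor (whether the logical column round-trips cleanly is exactly what the post is probing):
library(ff)
d <- data.frame(i = 1:5,
                ch = c("a", "b", "c", "d", "e"),
                lg = c(TRUE, FALSE, TRUE, FALSE, TRUE))
write.csv(d, "test.csv", row.names = FALSE)
x <- read.csv.ffdf(file = "test.csv", header = TRUE,
                   colClasses = c("integer", "factor", "logical"))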
2013 Oct 30
1
unique(1:3,nmax=1) freezes R
Dear all,
I was playing around with factor contrasts and found the nmax argument of the factor function. When using nmax=1, R froze completely, and I had to close it from the task manager. After some debugging, I found that the problem is actually in the unique function, where the internal unique is called:
.Internal(unique(x, incomparables, fromLast, nmax))
More generally, it looks like
2009 Jun 14
2
read.csv
If read.csv's colClasses= argument is NOT used then read.csv accepts
double quoted numerics:
1: > read.csv(stdin())
0: A,B
1: "1",1
2: "2",2
3:
A B
1 1 1
2 2 2
However, if colClasses is used then it seems that it does not:
> read.csv(stdin(), colClasses = "numeric")
0: A,B
1: "1",1
2: "2",2
3:
Error in scan(file, what, nmax, sep,
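A workaround when the file quotes its numeric fields is to read those columns as character (quotes are stripped at that stage) and convert afterwards; a minimal sketch:
txt <- 'A,B\n"1",1\n"2",2\n'
d <- read.csv(text = txt, colClasses = c("character", "numeric"))
d$A <- as.numeric(d$A)
str(d)  # both columns numeric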
2009 Oct 02
4
Can't access http://localhost:3000
Hello,
Pure newbie question.
After installing Ruby, Rails and all the gems I am following the
tutorial here
http://guides.rubyonrails.org/getting_started.html
I start a server with Mongrel and it seems to be working, but when I
try to access http://localhost:3000 I get a Firefox error:
Firefox can't establish a connection to the server at localhost:3000
I work under Vista with Ruby the
2013 Nov 04
1
A warning message generated from 'read.csv'
Hi,
I'm using R version 3.0.2. When I executed the following command
filedata <- read.csv(file, header=TRUE, colClasses="character")
I got the warning message:
In scan(file, what, nmax, sep, dec, quote, skip, nlines, ... :
EOF within quoted string
I'd like to know what this means and how I should fix the problem.
Thank you for your help.
Best,
Chia-Chieh Lin
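The warning means scan reached the end of the file while still inside a quoted field, usually because of an unmatched " somewhere in the data; turning quoting off and looking at the per-line field counts locates it. A sketch, reusing the same file object as above:
filedata <- read.csv(file, header = TRUE, colClasses = "character", quote = "")
# Rows whose field count differs from the rest point at the offending line(s):
table(count.fields(file, sep = ",", quote = ""))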
2008 Sep 04
1
read.table error
Dear all,
I have a tab-delimited text (.txt) file which I'm trying to read into R. The file is in column format: there are 3 columns and 259201 rows (including the column headers). I've been using the following command, but I receive an error each time which prevents the data from being read in:
> Jan <- read.table("JanuaryAvBurntArea.txt", header=TRUE)
Error in
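The truncated error is usually scan complaining that some line does not have 3 elements; count.fields pinpoints which lines, and fill=TRUE lets the read complete while investigating. A sketch:
n <- count.fields("JanuaryAvBurntArea.txt", sep = "\t")
table(n)       # how many lines have each field count
which(n != 3)  # lines that do not have exactly 3 fields
# Read permissively in the meantime (short rows are padded with NA):
Jan <- read.table("JanuaryAvBurntArea.txt", header = TRUE, sep = "\t", fill = TRUE)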
2013 Jul 19
0
Error read.csv
Hi,
Try loading just part of the file, for example the first 100,000 records,
to see whether the problem reproduces.
If it does not, keep increasing the number of records. It does not look like
the problem is with one particular record; in that case the error message is
more explicit and indicates the line where it occurs.
Another alternative, if you don't mind, is to attach the
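The piecewise loading suggested here maps onto the nrows= and skip= arguments of read.csv; a minimal sketch (the chunk size of 100,000 follows the suggestion above):
p1 <- read.csv("Usr1.csv", sep = ";", header = TRUE, nrows = 100000)
p2 <- read.csv("Usr1.csv", sep = ";", header = FALSE, skip = 100001,
               nrows = 100000, col.names = names(p1))
# Keep advancing skip= in 100,000-row steps until the error reappears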