similar to: linux batch question

Displaying 20 results from an estimated 10000 matches similar to: "linux batch question"

2008 Oct 29
1
FW: Re: linux batch question
Hi Phil: That's EXACTLY what it is. Thanks so much. It's nice to know that the R Gods don't hate me. I hope it's okay that I'm going to cc r-help in case this thread comes up in the future and also so that other people who might want to help know that it's solved. Thanks again. On Wed, Oct 29, 2008 at 5:01 PM, Phil Spector wrote: > Mark: > >> delete
2006 Jun 04
2
difference in behavior between batch and source
Hi : I am using R 2.2.0 on Windows XP and I have a REALLY long read.table statement because the col.names argument has 440 character strings in it. (I use Python to write the R code.) When I run the read.table statement inside an R program (the R program consists only of the read.table statement) using the source command in an interactive R session, everything works fine. But, if I take the same
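(For reference, a minimal sketch of the two ways of running such a script so the behaviour can be compared directly; the script name, data file name and generated column names below are all hypothetical.)
  ## readbig.R -- contains only the long read.table() call
  cols <- paste("V", 1:440, sep = "")                 # stand-in for the 440 real names
  dat  <- read.table("bigfile.txt", col.names = cols)
  ## interactive run:      source("readbig.R", echo = TRUE)
  ## non-interactive run:  R CMD BATCH readbig.R readbig.Rout   (from a shell prompt)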
2008 Sep 11
1
plot of all.effects object
All, I'm trying to plot an all.effects() object, as shown in the help for all.effects and also Crawley's R book (p.178, 2007). The data has a repeated measures structure, but I'm using all.effects for the simple lm() fit here. Below is a reproducible example that yields the error message. fm.ex = lm(dv ~ time.num*drug*X, data = dat.new) fm.effects = all.effects(fm.ex, xlevels =
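(A self-contained sketch along the same lines, with made-up data standing in for dat.new; allEffects() is the current name for all.effects() in the effects package.)
  library(effects)
  dat.new <- data.frame(dv       = rnorm(120),
                        time.num = rep(1:4, 30),
                        drug     = factor(rep(c("A", "B"), 60)),
                        X        = rnorm(120))
  fm.ex      <- lm(dv ~ time.num * drug * X, data = dat.new)
  fm.effects <- allEffects(fm.ex)   # effect estimates for every term in the model
  plot(fm.effects)                  # one panel per high-order term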
2007 Nov 01
2
unable to install package ff
Hi all, I've had one of my most miserable R weeks in memory. I'm trying to deal with huge datasets (>1GB each) but am running up against those pesky memory limits. The filehash and g.data packages are not very suitable for what I need. I haven't gotten into the SQL thing yet. Most recently I've been trying to install the new package ff (not yet on the CRAN repository). I
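(A hedged sketch of two ways to install a package that is not yet on CRAN; the tarball name below is hypothetical, and the R-Forge line only applies if the package is actually hosted there.)
  ## from a downloaded source tarball
  install.packages("ff_1.0.tar.gz", repos = NULL, type = "source")
  ## or from R-Forge, if the development version lives there
  install.packages("ff", repos = "http://R-Forge.R-project.org")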
2010 Jan 21
0
filehash does not install on FreeBSD
Trying to install package 'filehash' I get the following error on FreeBSD 9.0-CURRENT (amd64) with R version 2.11.0 (2010-01-15 r50990): ----------------------------------- R CMD INSTALL filehash_2.0-1.tar.gz * installing to library '/usr/local/lib/R/library' * installing *source* package 'filehash' ... ** libs gcc -std=gnu99 -I/usr/local/lib/R/include
2015 Apr 14
1
httpuv not installing on fedora 19
No, that's not it. The error is that you don't have the g++ binary installed. Undo that change and yum install gcc-c++. On Apr 14, 2015 8:31 AM, Mark Leeds <markleeds2 at gmail.com> wrote: > > Hi: I'm on fedora 19 ( I know. I'm behind : ) and I'm trying to install the > httpuv library > which depends on Rcpp. When I try to install it with dependencies =
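(After installing the compiler as suggested, a quick check from inside R before retrying; Sys.which() returns an empty string when g++ is not on the PATH.)
  Sys.which("g++")                                  # should now point at the g++ binary
  install.packages("httpuv", dependencies = TRUE)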
2018 Sep 22
2
installing tkrplot
Hi All: At the bottom of this email is my sessionInfo and below that there is a command that shows that tcltk is installed and working. My problem is that, when trying to install tkrplot, I get the following error: R CMD INSTALL -l . tkrplot_0.0-24.tar.gz * installing *source* package 'tkrplot' ... ** package 'tkrplot' successfully unpacked and MD5 sums checked configure:
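(For completeness, the usual commands for confirming that Tcl/Tk support is present; this only checks R's side of things, not whether the Tk development headers that tkrplot's configure script needs are installed.)
  capabilities("tcltk")                 # TRUE if this build of R includes Tcl/Tk support
  library(tcltk)                        # attaches cleanly when Tk can be initialised
  tclvalue(tcl("info", "patchlevel"))   # reports the Tcl version in use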
2008 Aug 28
0
Can the file locking in filehash be reused? (Was: Re: [R] [R-pkgs] filehash 2.0)
Hi (Roger), I saw the announcement of filehash v2.0 and the sentence "This development has led to better file locking for concurrent access and faster reading and writing of data in general" caught my attention. What kind of file locking do you refer to here? I am looking for a mechanism that can be used to lock files for reading and/or writing, and I'd love to have a cross
2008 Mar 08
1
Error message while trying to update packages: Error in gzfile(file, mode) : unable to open connection
Hello, I have just installed R 2.6.2 on a new computer running Windows XP and tried to perform 'update packages' via the menu option on the R console. Any advice on the following problem is much appreciated. Bob Below are the warning and error messages received. A search of the hard drive does not reveal any file including "RtmpgMMu03/libloc". >
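(One way to narrow this down from the R console before retrying the menu option; the error mentions a file under R's temporary directory, so checking that directory and the library paths is a reasonable first step.)
  tempdir()                      # the per-session temp directory the RtmpgMMu03/libloc file lives in
  .libPaths()                    # the libraries R will try to update; the first should be writable
  update.packages(ask = FALSE)   # retry outside the menu once the paths look sane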
2009 Mar 15
1
What is the best package for large data cleaning (not statistical analysis)?
Dear R helpers: I am a newbie to R and have a question related to cleaning large data frames in R. So far, I have been using SAS for data cleaning because my data sets are relatively large (handling multiple files, each of which can be as large as 5-10 GB). I am not a fan of SAS at all and am eager to move data cleaning tasks into R completely. It seems to me there are 3 options: using SQL, ff or
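(A sketch of the SQL route via the sqldf package, which filters inside SQLite so that only the cleaned subset is ever read into R; the file name, column name and WHERE clause are hypothetical.)
  library(sqldf)
  clean <- read.csv.sql("big_raw_file.csv",
                        sql = "select * from file where status = 'valid'")
  nrow(clean)   # only the rows that survived the filter are in memory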
2011 Nov 07
1
close but no cigar
Hi Everyone: It turns out that there's still a small (I hope) problem. I'm close, but that only counts in horseshoes and hand grenades. Here's my problem: When trying to load a package that I am writing, the load is looking for the packageDescription function in the utils package but not finding the utils package. I looked on CRAN and utils is not there, which makes me think that it
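(For what it's worth, utils is one of the base packages distributed with R itself, which is why it never shows up on CRAN; a quick way to confirm it is present and exporting packageDescription:)
  find("packageDescription")           # should report "package:utils"
  utils::packageDescription("utils")   # works without attaching anything extra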
2011 Jan 02
1
filehash for big data
Hi all, I am trying to use the filehash library to analyze a 5M by 20 matrix with both double and string data types. After consulting a few tutorials online, it seems as though one needs to first read the data into R, then create an R object, and then assign that object a location on my computer via filehash. It seems like the benefit of this is minimizing memory allocation when running
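(A minimal sketch of that workflow with the filehash package; the database name is hypothetical and a tiny matrix stands in for the 5M-by-20 data.)
  library(filehash)
  dbCreate("exampledb")                     # one-time creation of the on-disk database
  db <- dbInit("exampledb")                 # opens a connection; loads nothing into RAM
  dbInsert(db, "x", matrix(rnorm(20), 5))   # small stand-in for the real matrix
  head(dbFetch(db, "x"))                    # only the requested object is read back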
2008 Aug 28
0
filehash 2.0
I have just uploaded to CRAN version 2.0 of the 'filehash' package. This version contains a major rewriting of many of the internals (much rewritten in C) for the DB1 format, which is the default. This development has led to better file locking for concurrent access and faster reading and writing of data in general. In addition to rewriting the internals, I have added two modules for a
2012 Apr 27
1
TikzDevice
Dear R'ers, I have trouble installing tikzDevice on Ubuntu. When I use install.packages("tikzDevice"), it gives the error message: ERROR: dependency ‘filehash’ is not available for package ‘tikzDevice’ * removing ‘/usr/local/lib/R/site-library/tikzDevice’ Then I tried installing filehash, and I get the message: "package ‘filehash’ is not available (for R version 2.13.1)"
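(One thing to try is installing the dependency explicitly and then tikzDevice; a minimal sketch, assuming the chosen CRAN mirror carries both for your R version.)
  install.packages("filehash")                          # the missing dependency
  install.packages("tikzDevice", dependencies = TRUE)   # now finds filehash locally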
2015 Apr 14
1
httpuv not installing on fedora 19
Hi: I'm on fedora 19 ( I know. I'm behind : ) and I'm trying to install the httpuv library which depends on Rcpp. When I try to install it with dependencies = TRUE, I get the following error. ( I'm only showing the end of the install messages. Things go okay for a good while ). INSTALLATION MESSAGES #================================================================ make[1]:
2011 Sep 23
2
tikzDevice install problem
Hi everybody! I'm trying to install the tikzDevice package, and I keep on getting the > ERROR: dependency ‘filehash’ is not available for package ‘tikzDevice’ I tried install.packages('filehash') and I get > package ‘filehash’ is not available Does anybody have the same problem or any hint? Thank you, Helena [[alternative HTML version deleted]]
2006 Jul 07
1
computational speed question
I have a 250 row by 20,000 column data frame called temp and I do rowaverage <- function(x) rowMeans(temp[x], na.rm=TRUE); averages <- tapply(seq(temp), names(temp), rowaverage); averages <- do.call('cbind', averages). Is it okay that it's been running for 4 hours, or does this mean that something went wrong? I am on Windows XP and I did Ctrl-Alt-Delete and it seems like the process is
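(A vectorised sketch of the same computation, assuming temp is the 250 x 20,000 data frame; grouping the column indices once and letting rowMeans work on whole blocks of columns ought to be far faster than the tapply loop.)
  grp <- split(seq_along(temp), names(temp))    # column indices grouped by column name
  averages <- vapply(grp,
                     function(idx) rowMeans(temp[idx], na.rm = TRUE),
                     numeric(nrow(temp)))       # one column of row means per name group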
2011 Nov 06
2
still working on building R from source
Hi Everyone: Gavin's been so generous and patient that I figured I'd give him a break and send this snag to the list. When I do get this working, I will send a "How to build R from the tarball" instructions message to this list for posterity's sake (and for myself and anyone else who doesn't know how to build R from the tarball). So, here's my latest snag: Gavin
2012 May 04
2
Can't import this 4GB DATASET
Dear Experienced R Practitioners, I have a 4GB .txt file called "dataset.txt" and have attempted to use the ff, bigmemory, filehash and sqldf packages to import it, but have had no success. The readLines output of this data is: readLines("dataset.txt", n=20) [1] " "
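(A sketch of the ff route, which keeps the data on disk; sep and header below are guesses, and the blank lines that readLines is showing suggest checking the file's actual layout and separator before anything else.)
  library(ff)
  big <- read.table.ffdf(file = "dataset.txt", header = TRUE, sep = "\t")
  dim(big)   # rows and columns live in ff files on disk, not in RAM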