similar to: Error: cannot allocate vector of size X.0 Mb

Displaying 20 results from an estimated 600 matches similar to: "Error: cannot allocate vector of size X.0 Mb"

2010 Jul 19
3
"ACCTGMX" to "1223400" in R?
Hi, I am a newbie in R and was working on some DNA data represented as strings of A, C, T and G (plus wildcard characters like M and X). I use the Bioconductor package in R. Currently I need to convert a string of the form "ACCTGMX" to "1223400", i.e. A is replaced by 1, C with 2, T with 3, G with 4 and any other character with a 0. I checked with 'replace' and also with a
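A minimal base-R sketch of such a mapping (the function name dna2digits and the chartr()/gsub() approach are my suggestion, not from the thread):

```r
# Map A->1, C->2, T->3, G->4; any other character (M, X, ...) -> 0.
# chartr() translates the four known bases; gsub() zeroes the rest.
dna2digits <- function(x) {
  x <- chartr("ACTG", "1234", x)
  gsub("[^1-4]", "0", x)
}

dna2digits("ACCTGMX")  # "1223400"
```

Both chartr() and gsub() are vectorized, so this also works on a whole character vector of sequences at once.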
2010 Nov 04
4
how to work with long vectors
Hi, dear R community, I have one data set like this. What I want to do is to calculate the cumulative coverage. The following code works for a small data set (#rows = 100), but when fed the whole data set it is still running after 24 hours. Can someone give some suggestions for long vectors? id reads Contig79:1 4 Contig79:2 8 Contig79:3 13 Contig79:4 14 Contig79:5 17
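If "cumulative coverage" here is a running total of the reads column, the vectorized cumsum() avoids an explicit loop entirely; a sketch using the five example values from the post:

```r
# cumsum() is O(n) and handles vectors of millions of elements, so a
# loop that runs for 24 hours usually signals element-by-element
# growth (repeated c() or rbind()) rather than an inherently slow task.
reads   <- c(4, 8, 13, 14, 17)   # the example values above
cum_cov <- cumsum(reads)
cum_cov                          # 4 12 25 39 56
```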
2004 Feb 22
0
Network_access_denied and no group in domain
Was: RE: [Samba] samba 3.0 and freebsd 5.1 Hi Aaron, I deinstalled the 3.0.1 port and got the source tarball for 3.0.2a and installed from there. I also swapped out 3.0.1 for 3.0.2 on the domain controller when I discovered the second problem. I can now use smbclient to log into a file share on the member server, giving an " smb: \> " prompt but doing ls gives an error of:
2004 Feb 27
4
[OT] Fyodor terminates SCO nmap rights -- how about Samba?
-----BEGIN PGP SIGNED MESSAGE----- Hash: SHA1 As you all may know, Fyodor of nmap fame has terminated SCO's rights to distribute nmap with its products. See: http://www.smh.com.au/articles/2004/02/27/1077676955381.html I know this is off-topic, but I am interested in opinions on the subject of SCO using Samba in its products while they declare the GPL is unconstitutional and invalid.
2004 Feb 19
3
wbinfo error
We have set up Samba 3 and joined the server to the AD domain. The problem is that when I run wbinfo -u I get the error message "Error looking up users". If I use the syntax "wbinfo -a domain+user%password", it reports back that the password check succeeded, but "wbinfo -g" still does not let me list groups.
2013 Jun 27
3
Read a text file into R with .Call()
Hi, I want to read a text file into R with .Call(). So I define a NEW_CHARACTER() vector to store the characters read and use SET_STRING_ELT to fill the elements, e.g. PROTECT(qNames = NEW_CHARACTER(10000)); char *foo; // This foo holds the string I want. while(foo = readLine(FN)){ SET_STRING_ELT(qNames, i, mkChar(foo)); } In this way, I can get the desired characters from qNames. The only problem
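For comparison, at the R level the whole read is a single call; the .Call() route only pays off if the per-line processing itself moves to C. A sketch (the temp file and its contents are made up for illustration):

```r
# readLines() returns a character vector with one element per line,
# which is exactly the structure the C loop above builds by hand.
tmp <- tempfile()
writeLines(c("seq1", "seq2", "seq3"), tmp)
qNames <- readLines(tmp)
length(qNames)  # 3
```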
2003 Nov 20
1
samba 3.0.0 freebsd
Has anyone at all gotten Samba 3.0 to integrate into the FreeBSD 5.1 Name Service Switch? Are there patches available? Does anyone know where to get the FreeBSD NSS API so I can try to fix the code myself? I keep getting the following errors in my logs: NSSWITCH(nss_method_lookup): winbind, passwd, getpwnam_r, not found
2003 Nov 13
2
file permissions on home directories and admin user copying files to it
We want to copy files with the group in the admin list of the [homes] share. The problem is that the copied files are then owned by root. I know this is normal Unix behavior. However, we want the copied files to be owned by the user of the home share. I read the Samba HOWTO section "Users Cannot Write to a Public Share". Although I want to set the owner on the home shares and not on a
2008 Oct 01
1
Error: cannot allocate vector of size 117.3 Mb
Dear R users, I am using the randomForest package. While using this package, I got the message "Error: cannot allocate vector of size 117.3 Mb". I had this problem earlier too but could not manage it. Is there any way to solve this problem or to increase the vector size? My data set is of 163 samples and 5546 variables. I tried through "?Memory" to solve
2010 Jun 18
1
Error: cannot allocate vector of size 31.8 Mb
Hi, I am getting the following error while trying to run an R script: Error: cannot allocate vector of size 31.8 Mb I tried setting memory.limit(), vsize, etc. but could not make it run. My computer has the following configuration: OS: Windows 7; Processor: Intel Core 2 Duo; RAM: 4 GB. Thanks in advance, Harsh Yadav
2008 Sep 05
2
Error: cannot allocate vector of size 117.3 Mb
Hi R users, I am doing multiscale bootstrapping for clustering through the pvclust package. I have a large data set (182 observations and 5546 variables). When I tried the bootstrapping, I got the message "Bootstrap (r = 0.5)............Error: cannot allocate vector of size 117.3 Mb". I am a new R user and could not understand what the problem is, and also don't know the easiest
2010 Aug 31
1
cannot allocate vector of size 381.5 Mb
Hi, I read some posts from the mailing list on the same problem, but it seems that I still cannot solve it. I only want to generate some simulated data. #Generate 2500 observations - it works without errors > coords<-as.matrix(expand.grid(seq(0,100,length.out=50), seq(0,100,length.out=50))) #SimData is a user-written function > SimBinData<-SimData(n=2500,coords=coords,
2008 Jul 20
2
Error: cannot allocate vector of size 216.0 Mb
Please, I have a 2 GB computer and a huge time series to embed, and I tried increasing memory.limit() and memory.size(max=TRUE), but nothing helped. Just before the command: > memory.size(max=TRUE) [1] 13.4375 > memory.limit() [1] 1535.875 > gc() used (Mb) gc trigger (Mb) max used (Mb) Ncells 209552 5.6 407500 10.9 350000 9.4 Vcells 125966 1.0 786432 6.0 496686 3.8
2010 Aug 05
3
Error: cannot allocate vector of size xxx Mb
I am dealing with very large data frames, artificially created with the following code, that are combined using rbind. a <- rnorm(5000000) b <- rnorm(5000000) c <- rnorm(5000000) d <- rnorm(5000000) first <- data.frame(one=a, two=b, three=c, four=d) second <- data.frame(one=d, two=c, three=b, four=a) rbind(first, second) which results in the following error for each of the
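A quick back-of-the-envelope for that example (plain arithmetic, not from the thread): each column is 5e6 doubles at 8 bytes, and rbind() must allocate the full 10-million-row result while both inputs are still alive.

```r
# Size of one column and one data frame, in MiB:
mb_per_col <- 5e6 * 8 / 2^20   # ~38.1 MB per numeric column
mb_per_df  <- 4 * mb_per_col   # ~152.6 MB per data frame
# rbind() needs a third object of roughly twice that (~305 MB)
# on top of first, second, and the temporaries a, b, c, d.
round(c(mb_per_col, mb_per_df, 2 * mb_per_df), 1)
```

Removing a, b, c, d with rm() and calling gc() before the rbind() frees about half of that working set.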
2010 Aug 31
2
Error: cannot allocate vector of size 198.4 Mb
Hi all, I have a problem with R memory space. I am getting "Error: cannot allocate vector of size 198.4 Mb". I've tried: > memory.limit(size=2047); [1] 2047 > memory.size(max=TRUE); [1] 12.75 > library('RODBC'); > Channel<-odbcConnectAccess('c:/test.MDB'); # input data: 15 cols, 2000000
2009 Jul 01
3
"Error: cannot allocate vector of size 332.3 Mb"
Dear R-helpers, I am running R version 2.9.1 on a Mac Quad with 32 GB of RAM running Mac OS X version 10.5.6. With over 20 GB of RAM "free" (according to the Activity Monitor) the following happens. > x <- matrix(rep(0, 6600^2), ncol = 6600) # So far so good. But I need 3 matrices of this size. > y <- matrix(rep(0, 6600^2), ncol = 6600) R(3219) malloc: ***
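The size in the error message matches exactly one such matrix, which is worth checking (plain arithmetic, not from the thread):

```r
# A 6600 x 6600 matrix of doubles is 6600^2 elements at 8 bytes each;
# R reports allocation sizes in MiB (2^20 bytes).
bytes <- 6600^2 * 8
mb    <- bytes / 2^20
round(mb, 1)   # 332.3 -- the size in the error message
```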
2011 Jul 31
2
memory problem; Error: cannot allocate vector of size 915.5 Mb
Dear all, I am trying to do some matrix operations (whose size I think is smaller than what R allows), but the operations are not feasible when they run in one session, while each is feasible if run separately; each operation is totally independent of the others. When I run the code in one session, the error that appears is: Error: cannot allocate vector of size 915.5 Mb R(16467,0xa0421540)
2013 Mar 13
5
Copying a 900 mb file to Windows !!!
Hi, I am writing a Puppet manifest to install a service pack on Windows. What I observe is that when the exe file is within 50-70 MB, the transfer to Windows happens without any issues. But the current service pack "windows6.1-KB976932-X64.exe" is around 900 MB. My manifest is as follows: file { 'c:/temp/windows6.1-KB976932-X64.exe': ensure =>
2012 Jul 24
4
ERROR : cannot allocate vector of size (in MB & GB)
Hi, here in R I need to load a huge file (.csv); its size is 200 MB [it may be more than 1 GB sometimes]. When I tried to load it into a variable it took too much time, and after that, when I do cbind by groups, I get an error like this: "Error: cannot allocate vector of size 82.4 Mb". My requirement is to split the data from the huge .csv file into a number of small csv files. Here I will give
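A minimal chunked-split sketch (the function name split_csv, the chunk size, and the output naming are my own choices, not from the post): read a fixed number of lines at a time and write each chunk to its own file, so the full 200 MB is never held in memory at once.

```r
# Split a CSV into numbered chunk files, repeating the header in each.
# Only `chunk` lines are in memory at any moment.
split_csv <- function(infile, chunk = 100000L, prefix = "chunk") {
  con <- file(infile, "r")
  on.exit(close(con))
  header <- readLines(con, n = 1)
  i <- 0L
  repeat {
    lines <- readLines(con, n = chunk)
    if (length(lines) == 0L) break
    i <- i + 1L
    writeLines(c(header, lines), sprintf("%s_%03d.csv", prefix, i))
  }
  i  # number of chunk files written
}
```

Each chunk file can then be processed independently with read.csv(), which sidesteps the single large allocation that triggers the error.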
2015 Aug 11
0
rsync stuck at +- 50 MB/s, cp and scp are +- 200 MB/s
The problem is usually encryption. Try the arcfour cipher or apply the HPN patches to ssh (http://www.psc.edu/index.php/hpn-ssh). -- Eero 2015-08-11 12:37 GMT+03:00 Götz Reinicke - IT Koordinator < goetz.reinicke at filmakademie.de>: > Hi, > > I have two servers, connected to the LAN by 10Gb, with 10Gb and DAS > hardware RAID. > > Each system can read and write locally or to the