Displaying 20 results from an estimated 3000 matches similar to: "Question about tables in bigtabulate"
2017 Jul 03
2
R memory limits on table(x, y) (and bigtabulate)
I have two character vectors x and y that have the following characteristics:
length(x) # same as
length(y) # 872099
length(unique(x)) # 47740
length(unique(y)) # 52478
I need to crosstabulate them, which would lead to a table with
47740*52478 # 2505299720
cells, which is more than
2^31 # 2147483648
cells, which seems to be R's limit because I am getting the error message
Error in
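A hedged sketch of one possible workaround (not from the thread): only the observed (x, y) pairs need to be counted, so a sparse cross-tabulation via xtabs(..., sparse = TRUE) avoids allocating the full dense table. x and y are the poster's vectors, and the Matrix package is assumed to be installed.
library(Matrix)
# At most 872099 of the 47740 x 52478 cells can be non-zero, so a sparse
# contingency table stores only those.
tab <- xtabs(~ x + y, data = data.frame(x, y), sparse = TRUE)
dim(tab)   # 47740 52478 -- the dimensions are fine; the dense cell count is what blows up
sum(tab)   # 872099, one count per observation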
2017 Jul 03
0
R memory limits on table(x, y) (and bigtabulate)
Sorry, don't know enough to give you trustworthy answers, but I can
say that crashes due to (or linked to) packages should usually be
reported to the package maintainer, who can be found with the maintainer()
function (see ?maintainer). That person may not monitor this list.
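For example, a hedged one-liner (assuming the package in question is bigtabulate):
maintainer("bigtabulate")   # returns the maintainer's name and email address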
Cheers,
Bert
Bert Gunter
"The trouble with having an open mind is that people keep coming along
and sticking things into it."
2019 Jul 19
1
difficulty with sanitizer using bigmemory
Dear all,
bigKRLS, which has been on CRAN for a couple of years, had to be pulled
recently due to what seems to be a sanitizer issue stemming from its use of
bigmemory. bigKRLS works fine (we've used it ourselves on many different
platforms and have had over 15,000 downloads without an end user reporting
difficulties because of this issue). Unfortunately, we have been unable to
reproduce the
2011 May 05
0
problem with cor() using bigmemory
Hello, I have a rather large set of data I need to analyze. Currently I need to work with a 200000 by 200000 matrix. I'm using the bigmemory package, but so far I can only allocate a 66000 by 66000 matrix; when I increase those values I get the following error:
> AdjMat <- big.matrix(nrow=68000,ncol=68000)
Cannot allocate memory
BigMatrix.cpp line 225
Error in big.matrix(nrow = 68000,
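A hedged sketch of one way around the RAM limit (assuming enough disk space; the file names are placeholders): create the matrix file-backed instead of in memory. Note the storage cost: 200000 x 200000 doubles is roughly 320 GB on disk.
library(bigmemory)
AdjMat <- filebacked.big.matrix(nrow = 200000, ncol = 200000, type = "double",
                                backingfile = "AdjMat.bin",      # placeholder file name
                                backingpath = ".",
                                descriptorfile = "AdjMat.desc")  # lets other sessions attach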
2011 Jun 24
1
Installation of bigmemory fails
Hello All,
I tried to intall the bigmemory package from a CRAN mirror site and
received the following output while installing. Any idea what's going
on and how to fix it? The system details are provided below.
--------------------- begin error messages -----------------------
* installing *source* package 'bigmemory' ...
checking for Sun Studio compiler...no
checking for
2010 May 10
0
bigmemory 4.2.3
The long-promised revision to bigmemory has arrived, with package
4.2.3 now on CRAN. The mutexes (locks) have been extracted and will
be available through package synchronicity (on R-Forge, soon to appear
on CRAN). Initial versions of packages biganalytics and bigtabulate
are on CRAN, and new versions which resolve the warnings and have
streamlined CRAN-friendly configurations will appear
2012 Feb 21
1
tapply for enormous (>2^31 row) matrices
Hi all,
SETUP:
I have pairwise data on 22 chromosomes. Data matrix X for a given
chromosome looks like this:
1 13 58 1.12
6 142 56 1.11
18 307 64 3.13
22 320 58 0.72
Where column 1 is person ID 1, column 2 is person ID 2, column 3 can
be ignored, and column 4 is how much chromosomal sharing those two
individuals have in some small portion of the chromosome. There are
9000 individual people, and
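One hedged approach (my own sketch, not from the thread): aggregate each chromosome's matrix separately, so every chunk stays under the 2^31-row limit, then combine the 22 partial summaries. The column and object names here are made up for illustration, and data.table is assumed to be acceptable.
library(data.table)
# Summarise one chromosome's matrix X (columns: id1, id2, ignored, sharing).
summarise_chrom <- function(X) {
  DT <- data.table(id1 = X[, 1], id2 = X[, 2], sharing = X[, 4])
  DT[, .(total = sum(sharing), n = .N), by = .(id1, id2)]
}
# chrom_matrices is a hypothetical list holding the 22 per-chromosome matrices.
partials <- rbindlist(lapply(chrom_matrices, summarise_chrom))
overall  <- partials[, .(total = sum(total), n = sum(n)), by = .(id1, id2)]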
2012 May 05
2
looking for advice on the bigmemory framework with C++ and Java interoperability
I work with problems that have rather large data requirements -- typically
a bunch of multi-gigabyte arrays. Given how liberally R uses memory, the
only way for me to work with R has been to use big matrices from the bigmemory
package. One thing that is still missing is interoperability of big matrices
with C++ and possibly Java. What I mean by that is an API that would allow one
to read and write filebacked
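A hedged R-side sketch of the mechanism that already enables cross-process sharing (and, in principle, cross-language access): a file-backed big.matrix is a flat binary backing file plus a descriptor, and any process that reads the descriptor or memory-maps the backing file can attach to it. bigmemory ships C++ headers under inst/include, so a C++ program can attach the same way; the file names below are placeholders.
library(bigmemory)
# Process A: create a file-backed matrix and write a descriptor file.
x <- filebacked.big.matrix(nrow = 1000, ncol = 1000, type = "double",
                           backingfile = "x.bin", backingpath = ".",
                           descriptorfile = "x.desc")
x[1, 1] <- 42
flush(x)                      # push pending writes to the backing file
# Process B (R here; a C++ program could use bigmemory's headers instead):
y <- attach.big.matrix("x.desc")
y[1, 1]                       # 42 -- both handles map the same backing file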
2013 Apr 29
2
bigmemory and R 3.0
Dear helpers,
Does anyone have information on the status of bigmemory and R 3.0? Will it
just take time for the devs to re-code for the new environment? Or is there
an alternative for this new version?
Thanks
Ben Caldwell
2001 Nov 05
1
Why doesn't outer work?
Hello
I'm a population ecologist and use R for all my stats and modelling.
Recently I have been using R to numerically solve integral projection
models. This involves constructing several large matrices. The current code
by Easterling (Size-specific sensitivity: Applying a new structured
population model. Ecology, 2000, 81, 694-708) uses nested loops to construct
the matrices. To speed up the
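A minimal vectorized sketch (a toy kernel of my own, not Easterling's code): outer() evaluates the kernel on every pair of mesh points in one call, but the function it is handed must itself be vectorized, because outer calls it once with two full-length vectors rather than element by element.
mesh <- seq(0, 10, length.out = 400)   # size mesh points
h    <- mesh[2] - mesh[1]              # mesh width
# Toy, vectorized growth kernel: density of moving from size x to size y.
kern <- function(y, x) dnorm(y, mean = 0.9 * x + 0.4, sd = 0.8)
K <- h * outer(mesh, mesh, kern)       # K[i, j] = h * kern(mesh[i], mesh[j]); no loops
dim(K)                                 # 400 400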
2013 Jan 14
1
ginv / LAPACK-SVD causes R to segfault on a large matrix.
Dear R-help list members,
I am hoping to get your help in reproducing a problem I am having that is
only reproducible on a large-memory machine. Whenever I run the following
lines, I get the segfault listed below:
*** caught segfault ***
address 0x7f092cc46e40, cause 'invalid permissions'
Traceback:
1: La.svd(x, nu, nv)
2: svd(X)
3: ginv(bigmatrix)
Here is the code that I run:
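A hedged workaround sketch of my own (not the poster's code, and not a fix for the segfault): when an approximate pseudoinverse is acceptable, a truncated SVD from the irlba package avoids handing the full matrix to LAPACK's La.svd. The choice of 50 singular vectors is an assumption for illustration.
library(irlba)
k <- 50                                           # assumed number of singular vectors
s <- irlba(bigmatrix, nv = k)                     # bigmatrix is the poster's large matrix
ginv_approx <- s$v %*% diag(1 / s$d) %*% t(s$u)   # Moore-Penrose from the truncated SVD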
2005 Mar 07
1
Faster way of binding multiple rows of data than rbind?
Hi all,
I have a vector that contains the row numbers of data taken from several
filtering operations performed on a large data frame (20,000 rows x 500 cols).
In order to output this subset of data, I've been looping through the vector
containing the row numbers (keepRows).
output <- data.frame(row.names = rownames(bigMatrix))
for(i in keepRows)
{
output <- rbind(output,
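A hedged alternative sketch (using the poster's object names): growing a data frame with rbind() inside a loop copies the accumulated result on every iteration; a single indexed subset avoids that entirely.
# One subset operation instead of length(keepRows) rbind() calls.
output <- bigMatrix[keepRows, , drop = FALSE]
# If pieces really must be built one row at a time, collect them in a list and bind once.
pieces <- lapply(keepRows, function(i) bigMatrix[i, , drop = FALSE])
output <- do.call(rbind, pieces)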
2011 Mar 02
2
clustering problem
Hi,
I have a gene expression experiment with 20 samples and 25000 genes each.
I'd like to perform clustering on these. It turned out to be much faster
when I transpose the underlying matrix with t(matrix). Unfortunately, I am then
no longer able to use cutree to access individual clusters. In general
I do something like this:
hc <- hclust(dist(USArrests), "ave")
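A hedged sketch with toy data in place of the expression matrix: cutree operates on the hclust object, so it works the same after transposing; what changes is only whether rows or columns (here, the 20 samples) are being clustered.
mat <- matrix(rnorm(250 * 20), nrow = 250, ncol = 20)   # toy stand-in: 250 genes x 20 samples
hc_samples <- hclust(dist(t(mat)), "ave")   # rows of t(mat) are the samples
groups     <- cutree(hc_samples, k = 4)     # cluster membership, named by sample
table(groups)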
2011 Nov 30
1
Error adding Bigmemory package
I am trying to install the bigmemory package but I get the following error
message:
In file included from bigmemory.cpp:14:0:
../inst/include/bigmemory/isna.hpp: In function 'bool neginf(double)':
../inst/include/bigmemory/isna.hpp:22:57: error: 'isinf' was not declared in
this scope
make: *** [bigmemory.o] Error 1
ERROR: compilation failed for package 'bigmemory'
* removing
2011 Jan 16
1
Memory issues
Hi,
I have read several threads about memory issues in R and I can't seem to
find a solution to my problem.
I am running a sort of LASSO regression on several subsets of a big dataset.
For some subsets it works well, and for some bigger subsets it does not
work, with errors of type "cannot allocate vector of size 1.6Gb". The error
occurs at this line of the code:
example <-
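A hedged sketch of one common way to shrink the footprint (assuming the design matrix is the large allocation and that glmnet is an acceptable LASSO fitter; df and y are placeholders for the poster's data): build the design matrix in sparse form, which glmnet accepts directly.
library(Matrix)
library(glmnet)
X   <- sparse.model.matrix(~ . - 1, data = df)   # sparse design matrix (dgCMatrix)
fit <- glmnet(X, y, alpha = 1)                   # alpha = 1 gives the LASSO penalty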
2014 Jul 02
0
How do I call a C++ function (for k-means) within R?
I am trying to call a C++ k-means function within R and I am struggling. I
know that the code below is used to call a C++ function for gbm, but how do I
do the same for k-means?
gbm.obj <- .Call("gbm",
Y=as.double(y),
Offset=as.double(offset),
X=as.double(x),
X.order=as.integer(x.order),
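A hedged alternative sketch: rather than hand-writing a .Call() interface, the Rcpp package can compile and register a small C++ function in one step. The function below is a toy (per-cluster means for fixed assignments), not a full k-means.
library(Rcpp)
cppFunction('
NumericVector clusterMeans(NumericVector x, IntegerVector cl, int k) {
  // Mean of x within each of k clusters; cl holds 1-based cluster labels.
  std::vector<double> sums(k, 0.0);
  std::vector<int> counts(k, 0);
  for (int i = 0; i < x.size(); ++i) {
    sums[cl[i] - 1] += x[i];
    counts[cl[i] - 1] += 1;
  }
  NumericVector out(k);
  for (int j = 0; j < k; ++j)
    out[j] = counts[j] > 0 ? sums[j] / counts[j] : NA_REAL;
  return out;
}')
clusterMeans(c(1, 2, 10, 11), c(1L, 1L, 2L, 2L), 2L)   # 1.5 10.5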
2011 Sep 12
5
Hourly data with zoo
I have dates stored as numerics and hourly data ranging from 0 to 2300 hours in a data frame.
d <- rep(20110101,24)
h <- seq(from = 0, to = 2300, by = 100)
df <- data.frame(LST_DATE = d, LST_TIME = h, data = rnorm(24, 0, 1))
S <- chron(dates. = as.character(df$LST_DATE), times. =
paste(as.character(df$LST_TIME/100), ":0:0", sep = ""),
format =
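A hedged base-R sketch of the same conversion using the df defined above (sidestepping chron, and assuming UTC is acceptable): paste the numeric date and the 0-2300 hour code into one string, parse it with as.POSIXct, and hand the result to zoo.
stamp <- sprintf("%08d %04d", df$LST_DATE, df$LST_TIME)        # "20110101 0000", ...
tms   <- as.POSIXct(stamp, format = "%Y%m%d %H%M", tz = "UTC")
library(zoo)
z <- zoo(df$data, order.by = tms)   # hourly zoo series indexed by POSIXct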
2012 Sep 13
0
bigmatrix and irlba
Hello, good morning. I have one question: does anybody know how to calculate the SVD of a matrix from the bigmatrix library with the irlba library? Thanks.
2010 Mar 11
2
ANNOUNCE--Rdsm package, a threads-like environment for R
My long-promised Rdsm package is now on CRAN. Some of you may recall
that I made a prototype available on my own Web page last July. This is
the official version, much evolved since I released the prototype.
The CRAN description states:
Provides a threads-like programming environment for R, usable both on
a multicore machine and across a network of multiple machines. The
package