Displaying 14 results from an estimated 14 matches for "bigmatrix".
2012 Sep 13
0
bigmatrix and irlba
Hello, good morning. I have one question: does anybody know how to calculate the SVD of a matrix from the bigmatrix library with the irlba library? Thanks
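No answer appears in the results, so here is only a sketch, assuming the big.matrix is small enough to be extracted into RAM once (sizes below are illustrative); for a matrix too large to pull into memory, irlba has historically offered a custom matrix-multiply hook (the mult argument), which is not shown here.
## minimal sketch: extract the big.matrix as an ordinary matrix, then call irlba()
library(bigmemory)
library(irlba)
X <- big.matrix(nrow = 1000, ncol = 50, type = "double")   # illustrative sizes
X[,] <- matrix(rnorm(1000 * 50), nrow = 1000)
s <- irlba(X[,], nv = 5)   # X[,] returns a regular R matrix; 5 largest singular triplets
s$d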
2019 Jul 19
1
difficulty with sanitizer using bigmemory
...ng-UBSAN/bigKRLS>
gcc-UBSAN <https://www.stats.ox.ac.uk/pub/bdr/memtests/gcc-UBSAN/bigKRLS>
Clicking through, you find that
> test_check("bigKRLS")
gauss_kernel.cpp:38:40: runtime error: member call on address
0x6120001f2c40 which does not point to an object of type 'BigMatrix'
0x6120001f2c40: note: object is of type 'FileBackedBigMatrix'
[details omitted]
SUMMARY: UndefinedBehaviorSanitizer: undefined-behavior
gauss_kernel.cpp:38:40 in
/data/gannet/ripley/R/test-clang/bigmemory/include/bigmemory/BigMatrix.h:41:28:
runtime error: member access within addr...
2001 Nov 05
1
Why doesn't outer work?
...functions. One of these works just as expected and the other almost works!
Having spent 2 days trying to work out what's going wrong, I have decided to
ask for help. The R code is given below: the first part sets various
parameters, then there are several functions. The main functions are called
bigmatrix and new.bigmatrix, which construct the matrices; these should give
the same answers, as all I've done is replace the nested loops with calls to
outer. The Bmatrix calculation is correct, but the Pmatrix calculation
gets some of the answers wrong. The final lines of code test the functions.
Any...
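The poster's functions are cut off in this snippet, but a frequent cause of outer() "almost working" is a FUN that is not vectorized; a minimal illustration (not the poster's code):
## outer() calls FUN once with two expanded full-length vectors,
## not once per (i, j) pair, so FUN must be vectorized over both arguments.
f <- function(i, j) { cat("lengths:", length(i), length(j), "\n"); i + j }
outer(1:3, 1:4, f)                        # FUN is called a single time with length-12 vectors
g <- function(i, j) if (i > j) i else j   # scalar-only logic, like a loop body
outer(1:3, 1:4, Vectorize(g))             # Vectorize() makes it behave like the nested loops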
2005 Mar 07
1
Faster way of binding multiple rows of data than rbind?
...t contains the row numbers of data taken from several
filtering operations performed on a large data frame (20,000 rows x 500 cols).
In order to output this subset of data, I've been looping through the vector
containing the row numbers (keepRows).
output <- data.frame(row.names = rownames(bigMatrix))
for (i in keepRows) {
    output <- rbind(output, bigMatrix[i, ])
}
As you may guess, doing all of these rbinds takes a LOT of time, so I'm
wondering if there's a workaround where I can maybe use an intermediate
matrix-like object to store the loop output, and then coerce it back to a...
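The usual fix, sketched here assuming bigMatrix is an ordinary matrix or data frame as in the post: subset all the kept rows in a single indexing operation instead of growing the result with rbind() inside the loop.
## one vectorized subset instead of repeated rbind() calls
output <- bigMatrix[keepRows, , drop = FALSE]
## or, if a loop must stay: fill a preallocated list, then bind once at the end
rows <- vector("list", length(keepRows))
for (k in seq_along(keepRows)) {
    rows[[k]] <- bigMatrix[keepRows[k], ]
}
output <- do.call(rbind, rows)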
2011 Jan 16
1
Memory issues
...I am running a sort of LASSO regression on several subsets of a big dataset.
For some subsets it works well, and for some bigger subsets it does not
work, with errors of type "cannot allocate vector of size 1.6Gb". The error
occurs at this line of the code:
example <- cv.glmnet(x=bigmatrix, y=price, nfolds=3)
It also depends on the number of variables that were included in
"bigmatrix".
I tried R and R64 on Mac and R on a PC, but recently moved to a
faster virtual machine on Linux, thinking I would avoid any memory issues. It
was better but still had some limits,...
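One common mitigation, sketched here on the assumption that bigmatrix contains many zeros (for example because it was built from factor dummies): glmnet accepts sparse matrices, so storing the predictors sparsely can cut the memory each fold needs.
## sketch: hand cv.glmnet() a sparse matrix instead of a dense one
library(Matrix)
library(glmnet)
x_sparse <- Matrix(bigmatrix, sparse = TRUE)   # or build it with sparse.model.matrix()
example  <- cv.glmnet(x = x_sparse, y = price, nfolds = 3)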
2013 Jan 14
1
ginv / LAPACK-SVD causes R to segfault on a large matrix.
...help in reproducing a problem I am having that is
only reproducible on a large-memory machine. Whenever I run the following
lines, I get the segfault listed below:
*** caught segfault ***
address 0x7f092cc46e40, cause 'invalid permissions'
Traceback:
1: La.svd(x, nu, nv)
2: svd(X)
3: ginv(bigmatrix)
Here is the code that I run:
require(MASS)
l <- 30000
w <- 30000
x <- rpois(l * w, 0.00126)
bigmatrix <- matrix(x, nrow = l, ncol = w)
inverted <- ginv(bigmatrix)
I have tried this both with OMP_NUM_THREADS=1 and greater than 1, and the
result is always a segfault. The maximum memory used is around 40G. I am
running t...
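The reported ~40G peak is consistent with what a dense LAPACK SVD of this size needs; a quick check of the arithmetic:
## rough memory arithmetic (doubles, 8 bytes per cell)
30000 * 30000 * 8 / 2^30   # ~6.7 GiB for one copy of the 30000 x 30000 matrix
## svd()/La.svd() as called by ginv() also allocates U and V of the same size,
## plus the input copy and LAPACK workspace, so several tens of GiB in total.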
2007 Dec 08
2
NAMESPACE choices for exporting S4 methods
We are building a package, and want to create S4 methods for both head and
mean for our own BigMatrix class. Following the recommendation in "Writing
R Extensions" we use exportMethods instead of export in NAMESPACE (this is
described as being "clearer"). This works for head, but not for mean.
Obviously we importFrom(utils, head), but don't need to do this for mean,
which...
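A sketch of the NAMESPACE setup the post describes (the class name BigMatrix is theirs; the exact directives are my guess at their package, not quoted from it):
exportClasses(BigMatrix)
exportMethods(head, mean)
importFrom(utils, head)
# head() is an S3 generic in utils, hence the importFrom(); mean() lives in
# base, which is available without an explicit import.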
2005 Jun 14
5
load ing and saving R objects
Does anyone know a way to do the following:
Save a large number of R objects to a file (as save() does) but then
read back only a small named subset of them. As far as I can see,
load() reads back everything.
The context is:
I have an application which will generate a large number of large
matrices (approx 15000 matrices each of dimension 2000*30). I can
generate these matrices using an
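No replies are shown here, but one standard approach, sketched with illustrative file names and counts: write one file per object with saveRDS() so any named subset can be read back later with readRDS() without touching the rest.
## sketch: one .rds file per matrix, so later reads can be selective
dir.create("mats", showWarnings = FALSE)
n_mats <- 20                                   # illustrative; the post has ~15000
for (i in seq_len(n_mats)) {
    m <- matrix(rnorm(2000 * 30), nrow = 2000) # stand-in for a generated matrix
    saveRDS(m, file.path("mats", sprintf("mat%05d.rds", i)))
}
## later: read back only a small named subset
wanted <- sprintf("mat%05d.rds", c(3, 7, 19))
subset_mats <- lapply(file.path("mats", wanted), readRDS)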
2012 Feb 29
0
Question about tables in bigtabulate
...big.matrix(nrow = 100, ncol = 10)
test[, 1:3]  <- sample(150)
test[, 4:6]  <- sample(100)
test[, 7:10] <- sample(100)
## so we have a sample big memory matrix. It's not file backed but will do
## for testing.
## the result we want is one that you would get if you could run table()
## on the bigmatrix
## that's emulated in this example by coercing the bigmatrix to a matrix.
## in the real application that is not possible, because of RAM limits
P <- table(as.matrix(test))
## the package bigtabulate has a version of table called bigtable.
## you can run table on an individual column...
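The snippet stops before any bigtable() call, so the following is only a sketch (the ccols argument name is my assumption about bigtabulate's interface): tabulate one column at a time and then add the per-column counts by value, which emulates table(as.matrix(test)) without the coercion.
## sketch: per-column bigtable() counts, combined by matching names
library(bigtabulate)
per_col <- lapply(seq_len(ncol(test)), function(j) bigtable(test, ccols = j))
combined <- Reduce(function(a, b) {
    vals <- union(names(a), names(b))
    out  <- setNames(numeric(length(vals)), vals)
    out[names(a)] <- a
    out[names(b)] <- out[names(b)] + b
    out
}, per_col)
## combined should match table(as.matrix(test)) up to the ordering of names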
2009 Mar 16
2
FW: Select a random subset of rows out of matrix
Dear all,
I have a large dataset (N=100,000 with 89 variables per subject). This dataset is stored in a 100,000 x 89 matrix where each row describes one individual and each column one variable.
What is the easiest way of selecting a subset of, let's say, 1,000 individuals out of that whole matrix?
Thanks,
Michael
Michael Haenlein
Associate Professor of Marketing
ESCP-EAP European School of
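A minimal sketch of the usual answer (dat stands in for the 100,000 x 89 matrix, which is not named in the post): sample row indices without replacement and subset once.
## sketch: draw 1,000 random rows from the matrix
set.seed(1)                           # only for reproducibility
idx <- sample(nrow(dat), 1000)        # 1,000 distinct row numbers
sampled <- dat[idx, ]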
2006 May 11
4
data input strategy - lots of csv files
Good morning,
I currently have 63 .csv files, most of which have lines that look like
01/06/05,23445
Though some files have two numbers beside each date. There are
missing values, and currently the longest file has 318 rows.
(merge() is losing the head and doing runaway memory allocation - but
that's another question - I'm still trying to pin that issue down and
make a small repeatable
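A sketch of one way to combine the files (the header-less date,value layout is assumed from the example line above, and the derived column names are invented for illustration): read each file into a list, label its value columns by file name, and merge everything on the date.
## sketch: read all 63 files, then merge them on the date column
files <- list.files(pattern = "\\.csv$")
dats <- lapply(files, function(f) {
    d <- read.csv(f, header = FALSE)                 # e.g. 01/06/05,23445
    names(d) <- c("date",
                  paste0(tools::file_path_sans_ext(basename(f)),
                         "_", seq_len(ncol(d) - 1))) # handles one or two value columns
    d
})
merged <- Reduce(function(a, b) merge(a, b, by = "date", all = TRUE), dats)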
2011 Jun 24
1
Installation of bigmemory fails
...----------------------
* installing *source* package 'bigmemory' ...
checking for Sun Studio compiler...no
checking for Darwin...yes
** libs
g++45 -I/usr/local/lib/R/include -I../inst/include -fpic -O2 -fno-strict-aliasing -pipe -Wl,-rpath=/usr/local/lib/gcc45 -c BigMatrix.cpp -o BigMatrix.o
g++45 -I/usr/local/lib/R/include -I../inst/include -fpic -O2 -fno-strict-aliasing -pipe -Wl,-rpath=/usr/local/lib/gcc45 -c SharedCounter.cpp -o SharedCounter.o
g++45 -I/usr/local/lib/R/include -I../inst/include -fpic -O2 -fno-strict-aliasing -pipe -Wl,-rpath=/usr/local/lib/gcc45 -c b...
2011 May 05
0
problem with cor() using bigmemory
...eed to analyze; currently I need to work with a 200000 by 200000 matrix. I'm using the package bigmemory, but so far I can only allocate a 66000 by 66000 matrix; when I increase those values I get the following error:
> AdjMat <- big.matrix(nrow=68000,ncol=68000)
Cannot allocate memory
BigMatrix.cpp line 225
Error in big.matrix(nrow = 68000, ncol = 68000) :
Error: memory could not be allocated for instance of type big.matrix
As a part of my analysis I need to calculate the correlation coefficient, but when I try to do that on a "smaller" matrix I get this other error. ...
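The allocation failure is expected from the sizes involved; a quick arithmetic check (doubles, 8 bytes per cell), followed by a sketch of the usual file-backed fallback (which still needs roughly that much disk space):
## rough sizes for a dense double matrix
68000^2  * 8 / 2^30    # ~34.5 GiB -- the allocation that already fails in RAM
200000^2 * 8 / 2^30    # ~298 GiB  -- the full 200000 x 200000 target
## sketch: back the matrix with a file instead of RAM
library(bigmemory)
AdjMat <- filebacked.big.matrix(nrow = 200000, ncol = 200000, type = "double",
                                backingfile = "AdjMat.bin",
                                descriptorfile = "AdjMat.desc")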
2012 May 05
2
looking for advice on the bigmemory framework with C++ and Java interoperability
I work with problems that have rather large data requirements -- typically
a bunch of multi-gigabyte arrays. Given how generous R is with using memory, the
only way for me to work with R has been to use bigmatrices from the bigmemory
package. One thing that is missing a bit is interoperability of bigmatrices
with C++ and possibly Java. What I mean by that is an API that would allow
reading and writing file-backed
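A small sketch of the starting point for such interoperability (the flat, column-major layout of the backing file is my understanding of bigmemory's format, not something stated in the post): create a file-backed big.matrix in R; the same backing file can then be attached from another R process, or mapped from C++ via bigmemory's BigMatrix.h headers mentioned in the sanitizer report above.
## sketch: a file-backed big.matrix that other processes could share
library(bigmemory)
X <- filebacked.big.matrix(nrow = 1000, ncol = 100, type = "double",
                           backingfile = "X.bin", descriptorfile = "X.desc")
X[1, 1] <- 42
## another R process can attach to the same backing file via the descriptor;
## a C++ or Java reader would need the layout recorded in X.desc
Y <- attach.big.matrix("X.desc")
Y[1, 1]   # 42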