Hi,
I am working on a computer with a 64-bit OS and 7.8 GB of usable memory (RAM). The
memory quota allocated by my administrator is 12 GB.
Now, I use R's 'spdep' package to run a hedonic pricing model,
using the function errorsarlm and the following data:
1) A spatial weights matrix of 1.7 MB, converted from a .gwt file to a listw object
(by means of the nb2listw function). It is in fact a k = 4 nearest neighbor
matrix for 85684 regions (the number of observations); a short sketch of how I build it
follows the summary:
Characteristics of weights list object:
Neighbour list object:
Number of regions: 85684
Number of nonzero links: 342736
Percentage nonzero weights: 0.004668316
Average number of links: 4
Non-symmetric neighbours list
Link number distribution:
4
85684
Weights style: W
Weights constants summary:
n nn S0 S1 S2
W 85684 7341747856 85684 34664.44 377277.6
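Roughly, I build the weights object like this (the file name here is only a
placeholder, not my actual file):

library(spdep)
nb <- read.gwt2nb("knn4.gwt")     # the k = 4 nearest neighbor .gwt file; region.id may need to be supplied
lw <- nb2listw(nb, style = "W")   # row-standardised weights, i.e. 0.25 per neighbor
summary(lw)                       # prints the characteristics shown above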
2) A CSV data set of 246.3 MB containing all my variables. Of the 177
variables in this data set, I use 80 in the errorsarlm model; each
variable has 85684 observations.
When I run a simple linear regression (lm) on the 80 variables, I have no
problems. But when I run the errorsarlm model, I immediately get the following
message:
'Error in matrix(0, nrow = n, ncol = n) : too many elements specified'
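The two calls look roughly like this (the file name and the formula are
placeholders; the real formula has about 80 explanatory variables):

dat <- read.csv("hedonic_data.csv")             # 246.3 MB, 177 variables, 85684 rows
f   <- log_price ~ x1 + x2 + x3                 # placeholder formula
ols <- lm(f, data = dat)                        # runs without problems
sem <- errorsarlm(f, data = dat, listw = lw)    # fails immediately with the error above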
What I don't know is whether errorsarlm keeps the weights as a sparse matrix
(weights of 0.25 for each of the 4 neighbors, and zeros for the remaining 85680
regions) or builds the full dense matrix. If it builds the dense matrix, then I
understand that I would need much more memory. In that case, is there a way
around it? A quick trial with the packages 'ff' and 'biglm' did not resolve
anything, and the 'bigmemory' package is not available for my R version (the
most recent one).
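For what it is worth, my rough estimate of the storage the two representations
would need is:

n <- 85684
n^2 * 8 / 2^30       # dense n x n matrix of doubles: about 55 GiB
342736 * 8 / 2^20    # only the nonzero weights: about 2.6 MiB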
Some direction would be highly appreciated.
Regards,
Diana