You would need 2 TB (500,000 x 500,000 entries x 8 bytes = 2,000,000,000,000
bytes) just to store a single copy of your data. You probably need to rescale
your problem. Even if you had the memory, the computation would take a very
long time.
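For reference, the arithmetic behind that 2 TB figure is easy to check in R itself. This is a back-of-the-envelope sketch assuming the matrix is stored as 8-byte doubles (the default numeric type in R):

```r
## Storage needed for a full 500,000 x 500,000 matrix of doubles.
n <- 5e5                        # 500,000 rows
full_bytes <- n * n * 8         # 8 bytes per double
full_bytes                      # 2e+12 bytes, i.e. 2 TB

## dist() stores only the lower triangle, n*(n-1)/2 values,
## but even that is still about 1 TB:
dist_bytes <- n * (n - 1) / 2 * 8
dist_bytes                      # ~1e+12 bytes
```

So even the more compact "dist" object that R's dist() produces is far beyond what --max-mem-size can help with on a 32-bit Windows machine, which is capped at roughly 2-3 GB of address space.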
On 4/18/07, Hong Su An <anhong at msu.edu> wrote:
> Dear All:
>
> Please help me increase the memory available to R.
>
> I am trying to make a Euclidean distance matrix.
> The number of rows in the data is 500,000. Therefore, the dimension of the
> Euclidean distance matrix is 500,000*500,000.
>
> When I run the data in R, R could not make the distance matrix because of a
> memory allocation problem.
>
> In order to increase the memory, I read the FAQ and followed the
> instructions below:
>
> You may also set the amount of available memory manually. Close R, then
> right-click on your R program icon (the icon on your desktop or in your
> programs directory). Select ``Properties'', and then select the ``Shortcut''
> tab. Look for the ``Target'' field and, after the closing quotes around the
> location of the R executable, add
>
> --max-mem-size=500M
>
> This does not work.
> I have tried other computers; it also does not work.
>
>
>
> When I add --max-mem-size=3Gb to the Target field, there is an error like
> the one below:
>
> "The name "C:\Documents and Settings\Hong Su An\My
> Documents\R\R-2.4.1\bin\Rgui.exe"--max-mem-size=1024M specified in the
> Target box is not valid"
>
> I use R 2.4.1 on Windows XP with SP2 and 4 GB RAM.
>
> Have a nice day.
>
> Hong Su An.
> anhong at msu.edu
>
> ______________________________________________
> R-help at stat.math.ethz.ch mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide
> http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.
>
--
Jim Holtman
Cincinnati, OH
+1 513 646 9390
What is the problem you are trying to solve?