1998 Mar 09
R-beta: read.table and large datasets
I find that read.table cannot handle large datasets. Suppose data is a
40000 x 6 dataset
R -v 100
x_read.table("data") gives
Error: memory exhausted
but
x_as.data.frame(matrix(scan("data"),byrow=T,ncol=6))
works fine.
read.table is less typing, I can include the variable names in the first
line, and in Splus it executes faster. Is there a fix for read.table on the
way?
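[For readers revisiting this thread: a sketch of the scan-based workaround above in modern R syntax, with `<-` in place of the old underscore assignment. The filename "data" and the 6-column layout are taken from the message; the header-handling lines are an added assumption, since the poster mentions keeping variable names in the first line.]

```r
## Workaround from the message above, modern syntax.
## Assumes "data" is a whitespace-separated file with 6 numeric
## columns and (optionally) a header line of variable names.

## The failing call was effectively:
# x <- read.table("data")          # "Error: memory exhausted" in R-beta

## scan() reads the numbers directly; matrix() reshapes them row-wise:
nms <- scan("data", what = "", nlines = 1)   # variable names from line 1
x <- as.data.frame(matrix(scan("data", skip = 1),
                          byrow = TRUE, ncol = 6))
names(x) <- nms
```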
-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-....
2002 Aug 30
The attached function works fine with R 1.3.0 but gives problems with R 1.5.1 (PR#1964)
...are?
And an example to reproduce them?
On Fri, 30 Aug 2002 stakb@nus.edu.sg wrote:
> Full_Name: Kaushik Bhattacharyya
> Version: 1.5.1
> OS: Solaris
> Submission from: (NULL) (137.132.3.10)
>
> Main R-function used:
>
> pp1
>
> function(X)
> {
>     x_as.matrix(X)
>     trial_function(a)
>     {
>         clusproj(x,a)
>     }
>     test.nlm_nlm(trial,rep(1,ncol(X)))
>     theta_test.nlm$estimate
>     theta_theta/sqrt(sum(theta^2))
>     cluster.index_round(1/test.nlm$min,6)
>     # print the results
>     #####...