Displaying 20 results from an estimated 50000 matches similar to: "R-beta: memory"
1997 Dec 08
3
R-alpha: Bug in tapply in the Windows version of September
The function tapply is not working in the Windows version of R
(Version 0.50 Beta (Sept 29, 1997))
In
tapply <- function (x, INDEX, FUN=NULL, simplify=TRUE, ...)
...
The part:
if (simplify && all(unlist(lapply(ans, length)) == 1)) {
ans <- unlist(ans, recursive = FALSE)
names(ans) <- namelist[[1]]
return(ans)
}
should be replaced by
if (simplify
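For reference, a tiny illustration (not from the original report) of what the corrected branch should return: with simplify=TRUE and a scalar-valued FUN, tapply() gives a vector named after the grouping levels.

x <- c(1, 2, 3, 4, 5, 6)
g <- c("a", "a", "b", "b", "c", "c")
tapply(x, g, mean)   # named vector: a=1.5, b=3.5, c=5.5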
2004 Jun 30
1
GLM problem
Hi, I am a student, so don't be surprised if my question seems very simple to you...
I have a dataframe with 6 qualitative variables divided into 33 modalities, 2 quantitative variables and 78 lines. I use a glm to find out which variables have interactions... I would like to know if it's normal that one (the first in alphabetical order) of the modalities of each qualitative variable doesn't
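A sketch of the likely explanation (an assumption about what the question is driving at): with R's default treatment contrasts, the alphabetically first level of each factor is the reference level, so it gets no coefficient of its own and is absorbed into the intercept.

set.seed(1)
f <- factor(rep(c("a", "b", "c"), 10))   # made-up factor with 3 modalities
y <- rnorm(30)
coef(glm(y ~ f))   # shows (Intercept), fb, fc -- no separate term for "a"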
1998 Mar 02
0
R-beta: "out of virtual memory"
Hi,
We just ran up against a "Memory exhausted" error trying to manipulate a
7Mb data file on R 0.61.1 FreeBSD-2.2.5, so I checked the manpage and
discovered the "-v" option. However, I can only run "R -v 11". Anything
higher gives the following:
> R -v 12
R : Copyright 1998, Robert Gentleman and Ross Ihaka
Version 0.61.1 Alpha (January 12, 1998)
[...startup
1997 Oct 17
1
R-beta: memory problem vith "dist" on W95
Using Rseptbeta for Windows 95 I encountered this problem:
> library(mva)
> data(quakes)
> dist(quakes)
Error: memory exhausted
I'm using a Pentium 133 with 32 MB of RAM!
What must I do?
Thanks, and excuse my English!
Andrea Rossetti, rossetti at stat.unipg.it
_______________________________________________________
Statistica & Informatica per la Gestione delle
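A workaround sketch (an assumption, not from the thread): dist() on all 1000 rows of quakes builds 1000*999/2 = 499500 distances at once, which overwhelmed a 32 MB machine of that era; clustering a random subsample is one way around it. (In modern R, dist() lives in the stats package, so library(mva) is no longer needed.)

data(quakes)
set.seed(1)
sub <- quakes[sample(nrow(quakes), 200), ]   # 200*199/2 distances instead
d <- dist(sub)
hc <- hclust(d)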
2009 Jul 10
1
generalized linear model (glm) and "stepAIC"
Hi,
I'm a very new user of R and I hope I'm not being too "basic" (I tried
to find the answer to my questions by other means but was not able to).
I have 12 response variables (species growth rates) and two
environmental factors that I want to test to find out a possible
relation.
The sample size is quite small: (7<n<12, depending on each species-case).
I performed a
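A minimal sketch of the usual approach (made-up data, not the poster's species): fit a glm and hand it to stepAIC() from the MASS package for stepwise selection.

library(MASS)
set.seed(1)
d <- data.frame(growth = rnorm(10), temp = rnorm(10), ph = rnorm(10))
fit <- glm(growth ~ temp + ph, data = d)
stepAIC(fit, direction = "both")   # AIC-based stepwise selection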
1998 Jun 30
0
R-beta: stable distribution and stable glm package
I have just uploaded the package "stable-0.1.tgz" to the contrib section
of CRAN.
It enables one to compute the density ('dstable'), the distribution
('pstable'), the quantile ('qstable') and the hazard ('hstable')
functions of a stable variate.
'stable.mode' computes the mode of a stable distribution.
The procedure 'stableglm' also enables
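A hypothetical usage sketch based only on the function names in the announcement; the argument names and defaults are assumptions, so consult the package's own documentation.

library(stable)
dstable(0)     # density of a stable variate at 0
pstable(1)     # distribution function at 1
qstable(0.5)   # median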
1997 Nov 27
2
R-beta: Memory Management in R-0.50-a4
Dear R users
we're having a problem reading a largish data file using
read.table(). The file consists of 175000 lines of 4
floating pt numbers. Here's what happens:
> dat_read.table('sst.dat')
Error: memory exhausted
(This is line 358 of src/main/memory.c).
Cutting down the file to around 15000 lines allows
read.table() to work OK.
I edited the memory limits in Platform.h
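A workaround sketch in the same spirit as other posts in this listing: scan() the 4-column file straight into a matrix, which sidesteps read.table()'s per-field overhead.

dat <- as.data.frame(matrix(scan("sst.dat"), byrow = TRUE, ncol = 4))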
1998 Jun 03
1
R-beta: offset and glm again
I guess I understand it now (although it is surprising to me).
The following is a valid model formula
fred ~ wilma + offset(barney)
that sets the model offset to barney.
Given that this works, it would seem that one could remove the offset
argument from the glm call (and document the offset feature somewhere).
Too bad that one can't set weights the same way.
The anova bug when offsets
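A small sketch of the formula-offset feature described above (the variable names follow the post; the data are made up):

set.seed(1)
wilma  <- rnorm(20)
barney <- rnorm(20)
fred   <- 2 * wilma + barney + rnorm(20)
fit <- glm(fred ~ wilma + offset(barney))
coef(fit)   # barney's coefficient is fixed at 1 by the offset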
2008 Sep 03
1
many correlations
I have one hundred and six independent variables that I would like to
perform a correlation analysis on. Is there any way to get only the
values whose absolute value is 0.6 or greater?
thanks
--
Stephen Sefick
Research Scientist
Southeastern Natural Sciences Academy
Let's not spend our time and resources thinking about things that are
so little or so large that all they really do for us is
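A minimal sketch of one way to do it (made-up data standing in for the 106 variables): compute the full correlation matrix, then blank out everything below the threshold.

set.seed(1)
m <- matrix(rnorm(600), ncol = 6)   # stand-in for the real variables
r <- cor(m)
r[abs(r) < 0.6] <- NA               # keep only |r| >= 0.6
r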
1998 Apr 04
2
R-beta: standard-errors-glm
I have a small problem. I am running glm() in R-0.61.0 on Redhat 4.2.
I want to get the standard errors from the output. If I do
out <- glm(....)
summary(out)
I get the coefficients printed as well as their correlation matrix. If I do
out$coefficients I get the coefficients
out$fitted gives me the fitted values
I can then assign the fitted values or the value of the estimated
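The usual answer, sketched with made-up data: the standard errors sit in the coefficient matrix that summary() returns.

set.seed(1)
x <- rnorm(20)
y <- x + rnorm(20)
out <- glm(y ~ x)
summary(out)$coefficients[, "Std. Error"]   # standard errors by name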
1998 May 07
2
R-beta: 0.61.3: Problems on DEC Unix 4.0
Hi,
I had some trouble compiling the new R version on my Alpha:
make[2]: Entering directory `/usr2/local/R/src/graphics'
cc -ieee_with_inexact -O -Olimit 2000 -I/usr/local/include -I../include -c
gdevice.c
cc -ieee_with_inexact -O -Olimit 2000 -I/usr/local/include -I../include -c
graphics.c
cc: Error: graphics.c, line 808: An unexpected newline character is present in a
string literal.
2005 Feb 22
3
Reproducing SAS GLM in R
Hi,
I'm still trying to figure out that GLM procedure in SAS.
Let's start with the simple example:
PROC GLM;
MODEL col1 col3 col5 col7 col9 col11 col13 col15 col17 col19 col21 col23
=/nouni;
repeated roi 6, ord 2/nom mean;
TITLE 'ABDERUS lat ACC 300-500';
That's the same setup that I had in my last email. I have three factors:
facSubj, facCond and facRoi. I had this pretty
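A rough R analogue, sketched under the assumption that facSubj is the subject factor and facCond and facRoi are within-subject factors; the response and data frame names here are hypothetical.

fit <- aov(acc ~ facCond * facRoi + Error(facSubj/(facCond * facRoi)),
           data = dat)   # "acc" and "dat" are made-up names
summary(fit)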
2001 Aug 28
1
Error: vector memory exhausted (limit reached?)
While running R 1.3.0 on a Win95 machine,
a process has produced the following error:
Error: vector memory exhausted (limit reached?)
In addition: Warning message:
Reached total allocation of 47Mb: see help(memory.size)
Lost warning messages
Now I cannot save the work (not too bad,
I had used save.image() just prior to launching
the process), but I cannot quit the program:
> save.image()
Error:
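A sketch of how to at least inspect the situation from the prompt on Windows builds of that era (see help(memory.size), as the warning suggests); the cap itself was raised with the --max-mem-size startup flag.

memory.size()             # Mb currently allocated
memory.size(max = TRUE)   # the most that has been allocated so far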
2009 Jun 11
1
standard error beta glm
Dear All,
The std. error of the estimated coefficients
obtained by the summary.lm function can be calculated
as:
y=rnorm(20)
x=y+rnorm(20)
fit <- lm(y ~ x)
summary(fit)
sqrt(diag( sum(fit$resid^2)/fit$df.resid * solve(t(model.matrix(fit)) %*% model.matrix(fit)) ))
Is it possible to calculate the Std. Error for a glm in the same way as for lm, using
cov(beta-hat) = phi * solve(t(X) %*% W-hat %*% X)
in R? What is W-hat and
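A sketch of the glm analogue (an assumption about what is being asked): W-hat is the diagonal matrix of the final IWLS weights, available as fit$weights, and phi is the dispersion reported by summary(); together they reproduce summary(fit)'s Std. Errors.

y <- rnorm(20)
x <- y + rnorm(20)
fit <- glm(y ~ x)                 # gaussian family, for illustration
X <- model.matrix(fit)
W <- diag(fit$weights)            # hat W: final IWLS weights
phi <- summary(fit)$dispersion    # phi: dispersion estimate
sqrt(diag(phi * solve(t(X) %*% W %*% X)))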
1998 Mar 09
2
R-beta: read.table and large datasets
I find that read.table cannot handle large datasets. Suppose data is a
40000 x 6 dataset
R -v 100
x_read.table("data") gives
Error: memory exhausted
but
x_as.data.frame(matrix(scan("data"),byrow=T,ncol=6))
works fine.
read.table is less typing, I can include the variable names in the first
line, and in S-PLUS it executes faster. Is there a fix for read.table on
the way?
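To keep the variable names from the first line while still using scan(), something like this should work (a sketch, assuming a 6-column whitespace-separated file with a header, as in the post):

nms <- scan("data", what = "", nlines = 1)                       # header
x <- as.data.frame(matrix(scan("data", skip = 1), byrow = TRUE, ncol = 6))
names(x) <- nms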
1998 Nov 26
1
heap memory exhausted
Hi
I always get the following error message when I try to read
a big ASCII file:
> inzp<-read.data()
> Error: heap memory (1953 Kb) exhausted [needed 0 Kb more]
read.data() is a small function that reads the ascii-file.
When I cut the ASCII file down to a small one I don't have this
problem.
Can I extend this 'heap memory' for reading big data files? How?
I'm working with R on a
2014 Jun 24
1
Dsync replication one-way
Hi all,
I know that I can do a backup using doveadm:
doveadm backup -u user ssh backup.server doveadm dsync-server -u user
But is it possible to use the replicator service to do the same job? Or
can dsync with replicator only be used in a two-way environment?
Thanks,
Muriel
2020 Nov 02
2
Error: vector memory exhausted (limit reached?)
Hello
I have a question about the error: vector memory exhausted (limit reached?). I am running R version 4.0.3 (2020-10-10) in RStudio.
I have a MacBook Air (13-inch, 2017). I am trying to open a dataset file "data_dta" through the Import Dataset "From Stata" button.
I already did this with other datasets, and it works fine. Now, I want to work with a bigger dataset of 4.08 GB. When I try to
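A commonly suggested remedy for this error on macOS (an assumption about this particular case): raise the vector-memory cap by setting R_MAX_VSIZE in ~/.Renviron and restarting R.

Sys.getenv("R_MAX_VSIZE")   # empty if the cap has not been set explicitly
# then add a line such as the following to ~/.Renviron and restart R
# (16Gb is only an example value):
#   R_MAX_VSIZE=16Gb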
1998 Aug 28
0
R-beta: R-0.62.3 is released
I have just put R-0.62.3.tgz and R-0.62.2-0.62.3.diff.gz into the FTP
area at Auckland. As usual, do not fetch it from there unless
absolutely urgent, because of the NZ Internet billing system. The
files should get mirrored to the main CRAN site in Vienna tonight and
the rest of CRAN within days.
[And, may I add, the NZ connection is slower than a sloth in a tarpit.
I had turnaround times of up