similar to: ancova help

Displaying 20 results from an estimated 100 matches similar to: "ancova help"

2006 Feb 19
2
possible rails -> postgresql bug
Hi I have a problem accessing an array field in a Postgresql database. Here is the table definition.

     View "neil.flashing_codes"
   Column    |         Type          | Modifiers
-------------+-----------------------+-----------
 code        | character varying(10) |
 description | text                  |
 folds       | integer[]             |
View definition: SELECT
2003 Jan 20
1
make check for R-1.6.2 on IBM AIX
Dear all, The 'make check' step fails for the package mva on IBM AIX. The tail of the Rout log file looks like: > for(factors in 2:4) print(update(Harman23.FA, factors = factors)) Call: factanal(factors = factors, covmat = Harman23.cor) Uniquenesses: height arm.span forearm lower.leg weight 0.170 0.107 0.166
2004 Nov 09
3
no doubt a dumb question, but..
Yes, I am a newbie at R, but it is not the complex commands in R that have me baffled, but simple data commands. For example, why does something like: > plot(Girth ~ Height) *not* work after a command that allegedly loads the data: > data(trees) with the error message: Error in eval(expr, envir, enclos) : Object "Girth" not found but does work after the command: >
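For what it is worth, a minimal sketch of the two usual ways around this: data() only loads the data frame, it does not put its columns on the search path, so plot() has to be told where to look.

data(trees)
## Girth and Height live inside the data frame, so point plot() at it
plot(Girth ~ Height, data = trees)
## or evaluate the call in the data frame's environment
with(trees, plot(Height, Girth))
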
2012 Jul 12
1
Caret: Use timingSamps leads to error
I want to use the caret package and found out about the timingSamps option, which reports the time needed to predict results. But as soon as I set a value for this option, the whole model generation fails. Check this example: ------------------------- library(caret) tc=trainControl(method='LGOCV', timingSamps=10) tcWithout=trainControl(method='LGOCV')
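A minimal sketch of how timingSamps is meant to be used; the 'lm' method and the trees data are placeholders of my own, and whether the reported error appears will depend on the caret version.

library(caret)
data(trees)
## timingSamps = 10 asks train() to time prediction on 10 rows
tc  <- trainControl(method = "LGOCV", timingSamps = 10)
fit <- train(Volume ~ Girth + Height, data = trees,
             method = "lm", trControl = tc)
fit$times   ## includes the timing of that extra prediction step
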
2007 Feb 12
0
predict on biglm class
Hi Everyone, I often use the 'safe prediction' feature available through glm(). Now I'm in a situation where I must use biglm:::bigglm. ## begin example library(splines) library(biglm) ff <- log(Volume)~ns(log(Girth), df=5) fit.glm <- glm(ff, data=trees) fit.biglm <- bigglm(ff, data=trees) predict(fit.glm, newdata=data.frame(Girth=2:5)) ## -1.3161465 -0.2975659
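One workaround, an assumption on my part rather than the list's answer: freeze the spline basis yourself, since bigglm does not carry out glm's safe prediction, and build the predictions by hand from the coefficients.

library(splines)
library(biglm)
data(trees)
## fix the ns() knots explicitly so the same basis is used at fit and
## prediction time (bigglm does not store it the way glm does)
kn <- quantile(log(trees$Girth), c(0.2, 0.4, 0.6, 0.8))
bk <- range(log(trees$Girth))
ff <- log(Volume) ~ ns(log(Girth), knots = kn, Boundary.knots = bk)
fit.biglm <- bigglm(ff, data = trees)
## predictions for new girths from the design matrix and coefficients
nd <- data.frame(Girth = 2:5)
X  <- model.matrix(delete.response(terms(ff)), data = nd)
drop(X %*% coef(fit.biglm))
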
2012 May 15
1
caret: Error when using rpart and CV != LOOCV
Hi, I got the following problem when trying to build an rpart model and using everything but LOOCV. Originally, I wanted to use k-fold partitioning, but every partitioning except LOOCV throws the following warning: ---- Warning message: In nominalTrainWorkflow(dat = trainData, info = trainInfo, method = method, : There were missing values in resampled performance measures. ----- Below are some
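A small sketch of the situation, with the trees data purely as a stand-in: with k-fold CV on a small data set an rpart fit can produce constant predictions within a fold, which makes that fold's Rsquared NA and triggers exactly this warning.

library(caret)
library(rpart)
data(trees)
set.seed(1)
tc  <- trainControl(method = "cv", number = 10)
fit <- train(Volume ~ Girth + Height, data = trees,
             method = "rpart", trControl = tc)
fit$resample   ## any NA Rsquared rows here are what the warning refers to
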
2012 Jun 09
1
caret: compare linear models of different degree
I want to use the caret package to train linear models. I want to compare these models when using different degrees (aka degrees of interaction). This is possible for the 'earth' method (using the '.degree' parameter) but I found no possibility of customizing the degree for the 'lm' method. This might be due to the fact that the basic 'lm' function does not support
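In case it helps, a sketch of one workaround of my own (not a caret feature): since 'lm' has no tunable degree, encode the interaction degree in the formula and compare the resulting fits with resamples().

library(caret)
data(trees)
tc <- trainControl(method = "cv", number = 5)
set.seed(1)
fit1 <- train(Volume ~ Girth + Height, data = trees,
              method = "lm", trControl = tc)
set.seed(1)   # same folds for a fair comparison
fit2 <- train(Volume ~ (Girth + Height)^2, data = trees,
              method = "lm", trControl = tc)
summary(resamples(list(degree1 = fit1, degree2 = fit2)))
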
2013 Nov 09
0
Standard errors in regression models with interactions terms
In a rather simple regression, I'd like to ask the question, for high trees, whether it makes a difference (for volume) whether a tree is thick. If my interpretation is correct, for low trees, i.e. for which trees$isHigh == FALSE, the answer is yes. The problem is how to "merge" the standard errors. Code follows. data(trees) trees$isHigh <- trees$Height > 76 trees$isThick
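The snippet is cut off, so here is a sketch of what I take "merging" to mean: the effect of being thick among high trees is the sum of two coefficients, and its standard error comes from the full covariance matrix via vcov(), not from adding the individual standard errors. The Girth cutoff of 13 is made up for illustration.

data(trees)
trees$isHigh  <- trees$Height > 76
trees$isThick <- trees$Girth  > 13   # illustrative cutoff
fit <- lm(Volume ~ isHigh * isThick, data = trees)
b <- coef(fit)
V <- vcov(fit)
## effect of 'thick' among high trees = beta_thick + beta_interaction
est <- unname(b["isThickTRUE"] + b["isHighTRUE:isThickTRUE"])
se  <- sqrt(V["isThickTRUE", "isThickTRUE"] +
            V["isHighTRUE:isThickTRUE", "isHighTRUE:isThickTRUE"] +
            2 * V["isThickTRUE", "isHighTRUE:isThickTRUE"])
c(estimate = est, std.error = se)
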
2007 May 10
0
New package "earth"
The "earth" package is now available on CRAN. Earth builds models using Friedman's MARS. Earth's principal advantages over the existing function mda::mars are that it is much faster and provides plotting and printing methods. The general purpose model plotting function "plotmo" may also be useful to people who are not interested in earth itself. Example: > a
2007 Oct 23
0
Residuals from biglm package
Hi all, first of all, I'm not an expert on R, I'm still learning, so sorry if this is a stupid question... I have a large dataset that is too big for my computer's memory, and I found the biglm package quite useful. Now everything is working perfectly. But if I want the residuals, how can I do it? Let's say that we are running the example: > data(trees)>
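A sketch of one way to get them, as an assumption on my part: biglm keeps only the sufficient statistics, not the data, so residuals have to be recomputed from the coefficients, chunk by chunk if the data do not fit in memory.

library(biglm)
data(trees)
fit <- biglm(Volume ~ Girth + Height, data = trees)
## recompute fitted values from the design matrix and the coefficients;
## for a truly large data set, do this one chunk of rows at a time
X   <- model.matrix(Volume ~ Girth + Height, data = trees)
res <- trees$Volume - drop(X %*% coef(fit))
head(res)
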
2007 Aug 25
3
fill circles
Hi all, I'm an R newbie, I did this script to create a scatterplot using the "trees" data set from the "datasets" package: library('datasets') with(trees, { plot(Height, Volume, pch=3, xlab="Height", ylab="Volume") symbols(Height, Volume, circles=Girth/12, fg="grey", inches=FALSE, add=FALSE) } ) I'd like to use the column Named
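The snippet is cut off, but for filled circles the relevant piece is the bg argument of symbols(); fg only sets the border. A sketch, with add = TRUE so the circles overlay the existing points rather than starting a new plot:

library(datasets)
with(trees, {
  plot(Height, Volume, pch = 3, xlab = "Height", ylab = "Volume")
  ## bg fills the circles, fg colours only the outline
  symbols(Height, Volume, circles = Girth/12,
          fg = "grey", bg = "lightblue", inches = FALSE, add = TRUE)
})
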
2008 Oct 23
2
map points from scatterplot3d onto 2d fitted plane
Dear R helpers, I have a 3D scatter plot that I have generated from scatterplot3d (which looks great- thanks!) and I can see that the points in my graph fall in a plane. Following the example 5 from 3D scatter plot (below) I have fitted a regression plane. Now what I would like to do is a rotation so that my new co-ordinate system is about the fitted plane (by finding the normal to the plane
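A sketch of the first step, based on my own reading of the question and assuming the plane comes from lm(): the coordinate of each point along the plane's normal is the lm residual rescaled by the length of the normal vector.

data(trees)
fit <- lm(Volume ~ Height + Girth, data = trees)
b   <- coef(fit)
## the fitted plane is  b[2]*Height + b[3]*Girth - Volume + b[1] = 0,
## with (unnormalised) normal vector (b[2], b[3], -1)
len <- sqrt(b[2]^2 + b[3]^2 + 1)
n   <- c(b[2], b[3], -1) / len
## signed distance of each observation from the plane along that normal
d   <- residuals(fit) / len
## the two in-plane coordinates come from projecting the points onto any
## pair of orthonormal vectors perpendicular to n
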
2008 Jul 24
1
Problem with scatterplot3d example
I tried to run the following example from section 4.1.4 of the "Scatterplot3d - an R package for Visualizing Multivariate Data" vignette and got an error on the part that plots the regression plane: > library(scatterplot3d) > data(trees) > s3d <- scatterplot3d(trees, type = "h", color = "blue", + angle = 55, scale.y = 0.7, pch = 16, main = "Adding
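For reference, a sketch of what that part of the vignette is doing (the plot title here is a placeholder of mine): scatterplot3d() returns a list of drawing functions, and its plane3d() element adds the plane of a fitted lm to the current plot.

library(scatterplot3d)
data(trees)
s3d <- scatterplot3d(trees, type = "h", color = "blue",
                     angle = 55, scale.y = 0.7, pch = 16,
                     main = "Adding a regression plane")
## fit Volume on Girth and Height, then draw the regression plane
my.lm <- lm(trees$Volume ~ trees$Girth + trees$Height)
s3d$plane3d(my.lm)
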
2006 Feb 27
1
4D stacked column chart, Excel -> R
Hi All. I'd like to program a 4-dimensional chart in R. Actually I wanted to solve the problem in Excel because I had the data there. Here is a link to my actual problem description (there are some chart pictures as well).... http://www.mrexcel.com/board2/viewtopic.php?t=187336&highlight=stacked+column Because I still couldn't solve the problem there, I came to R. The chart should be
2008 Jul 14
0
rgl.snapshot on linux produces colored lines only
I'm having a problem with rgl.snapshot. If I run the following code *once* and *once only*, all is well. data(trees) attach(trees) plot3d(Height, Girth, Volume, type='s') rgl.snapshot("/home/user/pic.png", fmt="png", top=TRUE ) And the pic.png looks right. But if I try to rerun rgl.snapshot anytime again in this session, the file pic.png still outputs but is
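Not a confirmed fix, only an assumption: since rgl.snapshot() grabs the on-screen buffer, it may help to raise the device window and give it time to redraw before each capture. A sketch:

library(rgl)
data(trees)
with(trees, plot3d(Height, Girth, Volume, type = "s"))
## make sure the rgl window is on top and fully redrawn before grabbing
## the frame buffer, then snapshot as before
rgl.bringtotop()
Sys.sleep(1)
rgl.snapshot("pic.png", fmt = "png", top = TRUE)
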
2008 Feb 20
0
igraph package, version 0.5
igraph is a package for graphs and networks. It has a C core and uses a simple and fast graph representation allowing millions of vertices and edges. NEW FEATURES: - We use the ARPACK library for graph related eigenvalue problems, like Page Rank calculation, Kleinberg's hub and authority scores, eigenvector centrality, etc. There is also a generic interface if someone wants to use
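A small sketch of the ARPACK-based Page Rank calculation mentioned above, using function names from the 0.x API; the graph itself is an arbitrary example of mine.

library(igraph)
## random directed graph; page.rank() uses the ARPACK eigensolver
g  <- erdos.renyi.game(100, 1/100, directed = TRUE)
pr <- page.rank(g)$vector
head(sort(pr, decreasing = TRUE))
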
2002 Feb 28
1
get deviance from glm() for given parameter values
Dear all, I would like to get glm() return its results (at least the deviance) for some given parameter values (ie without actually fitting the model). I tried to set `maxit = 0' but this does not work, eg: > glm(y ~ x, start = c(1, 1), maxit = 0) Error in glm.control(...) : maximum number of iterations must be > 0 Any idea? Thanks in advance. Emmanuel Paradis
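One way to get the deviance at fixed coefficients without fitting anything, offered as my own suggestion and shown for the default gaussian family (where the deviance is just the residual sum of squares): evaluate the family's deviance residuals directly.

x <- 1:20
y <- 2 + 3 * x + rnorm(20)
beta <- c(1, 1)                      # the given parameter values
eta  <- drop(cbind(1, x) %*% beta)   # linear predictor at those values
fam  <- gaussian()
mu   <- fam$linkinv(eta)             # identity link, so mu == eta
## deviance = sum of the family's deviance residuals
sum(fam$dev.resids(y, mu, rep(1, length(y))))
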
2006 May 17
1
Re : Large database help
Thanks for doing this Thomas, I have been thinking about what it would take to do this, but if it were left to me, it would have taken a lot longer. Back in the 80's there was a statistical package called RUMMAGE that did all computations based on sufficient statistics and did not keep the actual data in memory. Memory for computers became cheap before datasets turned huge so there