similar to: Mismatches in predict(newdata)

Displaying 20 results from an estimated 10000 matches similar to: "Mismatches in predict(newdata)"

2009 Oct 08
0
predict.lm() out-of-sample predictions - problem with data classes
Hello! I'm still working on my problem, which also occurs with the predict.lm() function. When I provide newdata, a data.frame in which all variables are "numeric" according to str(), R tells me the following: ar1.xpred.test.pred <- predict(ar1.xpred.fitted, regdata.test, se.fit = FALSE) Error: variable 'lag(ret1)' was fitted with type "numeric" but type
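A minimal sketch of the usual workaround (my own illustration, not code from the thread; the column names are hypothetical): materialise the lagged regressor as an ordinary numeric column before fitting, so the test data can supply a column of exactly the same name and class.

    ## hypothetical illustration: precompute the lag as a plain numeric column
    set.seed(1)
    regdata <- data.frame(ret1 = rnorm(100))
    regdata$ret1_lag1 <- c(NA, head(regdata$ret1, -1))
    fit <- lm(ret1 ~ ret1_lag1, data = regdata)

    ## newdata then only needs a numeric column with the same name
    regdata.test <- data.frame(ret1_lag1 = rnorm(10))
    predict(fit, newdata = regdata.test, se.fit = FALSE)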
2006 May 30
0
(PR#8905) Recommended package nlme: bug in predict.lme when an independent variable is a polynomial
Many thanks for your very useful comments and suggestions. Renaud 2006/5/30, Prof Brian Ripley <ripley at stats.ox.ac.uk>: > On Tue, 30 May 2006, Prof Brian Ripley wrote: > > > This is not really a bug. See > > > > http://developer.r-project.org/model-fitting-functions.txt > > > > for how this is handled in other packages. All model-fitting in R used
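For reference, a small illustration of the mechanism that document describes (my own example, not from the thread): model-fitting functions record the fitted basis in the "predvars" attribute of the terms object, which is what predict() uses to rebuild, e.g., a poly() basis for newdata.

    fit <- lm(dist ~ poly(speed, 2), data = cars)
    ## the stored poly() call now carries coefs = list(alpha = ..., norm2 = ...)
    attr(terms(fit), "predvars")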
2010 Jan 18
2
Predict polynomial problem
I have a function that fits polynomial models for the orders in n: lmn <- function(d,n){ models=list() for(i in n){ models[[i]]=lm(y~poly(x,i),data=d) } return(models) } My data is: > d=data.frame(x=1:10,y=runif(10)) So first just do it for a cubic: > mmn = lmn(d,3) > predict(mmn[[3]]) 1 2 3 4 5 6 7 8
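One hedged sketch of a fix (my own variant, not the thread's answer): build the formula so the degree is stored as a literal number rather than the loop variable i, which predict(newdata = ...) would otherwise have to look up again later.

    lmn <- function(d, n) {
      models <- list()
      for (i in n) {
        f <- as.formula(paste0("y ~ poly(x, ", i, ")"))   # degree as a literal
        models[[i]] <- lm(f, data = d)
      }
      models
    }
    d <- data.frame(x = 1:10, y = runif(10))
    mmn <- lmn(d, 3)
    predict(mmn[[3]], newdata = data.frame(x = seq(1, 10, by = 0.5)))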
2018 Mar 31
1
Names of variables needed in newdata for predict.glm
all.vars works fine, EXCEPT it gives a bit too much. I only want the regression variables, but in the following example I also get "k", the variable holding the chosen knots. Any machinery to find only "real" regression variables? cheers, Bendix library( splines ) y <- rnorm(100) x <- rnorm(100) k <- -1:1 ml <- lm( y ~ bs(x,knots=k) ) mg <- glm( y ~
2018 Mar 08
0
Names of variables needed in newdata for predict.glm
Hi, here is an attempt: > names(mi$xlevels) [1] "f" > all.vars(mi$formula) [1] "D" "x" "f" "Y" > names(mx$xlevels) [1] "f" > all.vars(mx$formula) [1] "D" "x" "f" When the offset is specified outside the formula, it does not work... Marc On 07/03/2018 at 06:20, Bendix Carstensen wrote: > I would like
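A self-contained sketch of the approach in this reply (the data here are simulated by me; the names x, f, Y follow the thread): all.vars() on the formula lists every symbol it uses, and xlevels names the factor variables together with their levels.

    set.seed(1)
    dd <- data.frame(x = rnorm(100),
                     f = factor(sample(c("a", "b", "c"), 100, TRUE)),
                     Y = rpois(100, 5))
    mg <- glm(Y ~ x + f, family = poisson, data = dd)
    all.vars(formula(mg))   # "Y" "x" "f"
    mg$xlevels              # $f: "a" "b" "c" -- the factors and their levels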
2018 Mar 07
3
Names of variables needed in newdata for predict.glm
I would like to extract the names, modes [numeric/factor] and levels of variables needed in a data frame supplied as newdata= argument to predict.glm() Here is a small example illustrating my troubles; what I want from (both of) the glm objects is the vector c("x","f","Y") and an indication that f is a factor: library( splines ) dd <- data.frame( D =
2007 May 31
1
predict.nls - gives error but only on some nls objects
Dear list, I have encountered a problem with predict.nls (Windows XP, R 2.5.0), but I am not sure if it is a bug... On the nls man page, an example is: DNase1 <- subset(DNase, Run == 1) fm2DNase1 <- nls(density ~ 1/(1 + exp((xmid - log(conc))/scal)), data = DNase1, start = list(xmid = 0, scal = 1), alg = "plinear", trace =
2008 Feb 26
1
predict.rpart question
Dear All, I have a question regarding predict.rpart. I use rpart to build classification and regression trees, and I deal with data with a relatively large number of input variables (predictors). For example, I build an rpart model like this: rpartModel <- rpart(Y ~ X, method="class", minsplit=1, minbucket=nMinBucket, cp=nCp); and get the predictors used in building the model like
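A hedged sketch of one common way to list the predictors a fitted tree actually uses in its splits (my own example on the kyphosis data shipped with rpart, not the poster's model):

    library(rpart)
    fit <- rpart(Kyphosis ~ Age + Number + Start, data = kyphosis,
                 method = "class")
    ## split variables appear in fit$frame$var; leaves are marked "<leaf>"
    setdiff(unique(as.character(fit$frame$var)), "<leaf>")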
2002 Jan 12
2
Bug in predict(newdata=x) with poly() (PR#1258)
Bug in predict.lm & poly The predict function doesn't work when used with poly and newdata. For example, I'd expect the following code to work, and plot a fitted cubic to the nearly straight line: x <- 1:10 y <- x + rnorm(10)/100 plot(x,y) fit <- lm(y ~ poly(x,3)) newx <- seq(1,10,len=100) lines(newx,predict(fit,newdata=data.frame(x=newx))) However, the plotted
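In current R the example above works (predict() rebuilds the same orthogonal poly() basis for newdata via predvars). As a minimal alternative sketch, a raw polynomial side-steps the orthogonal basis entirely:

    x <- 1:10
    y <- x + rnorm(10) / 100
    fit <- lm(y ~ poly(x, 3, raw = TRUE))   # raw, not orthogonal, basis
    newx <- seq(1, 10, len = 100)
    plot(x, y)
    lines(newx, predict(fit, newdata = data.frame(x = newx)))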
2006 Mar 09
0
variable '%s' was fitted with class... in predict.nls()
I've tried to predict the values for a new data.frame using the predict.nls function and keep getting the error message: Error in if (sum(wrong) == 1) stop(gettextf("variable '%s' was fitted with class \"%s\" but class \"%s\" was supplied", : missing value where TRUE/FALSE needed I first thought that it was because there may have been something
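A generic, hedged illustration of what this message checks (simulated data, not the poster's): a newdata column whose class differs from the class it had at fit time triggers the check, and coercing the column to the original class usually resolves it.

    set.seed(1)
    d <- data.frame(conc = runif(20, 1, 10))
    d$y <- 2 * d$conc / (1 + d$conc) + rnorm(20, sd = 0.01)
    fit <- nls(y ~ a * conc / (b + conc), data = d, start = list(a = 2, b = 1))

    nd <- data.frame(conc = as.numeric(1:5))   # numeric, matching the fit;
    predict(fit, newdata = nd)                 # a factor column here would trip the class check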
2011 Apr 19
1
How to Extract Information from SIMEX Output
Below is a SIMEX object that was generated with the "simex" function from the "simex" package applied to a logistic regression fit. From this mountain of information I would like to extract all of the values summarized in this line: .. ..$ variance.jackknife: num [1:5, 1:4] 1.684 1.144 0.85 0.624 0.519 ... Can someone suggest how to go about doing this? I can extract the
2006 May 24
1
(PR#8877) predict.lm does not have a weights argument for
I am more than 'a little disappointed' that you expect a detailed explanation of the problems with your 'bug' report, especially as you did not provide any explanation of your own reasoning (nor did you provide any credentials or references). Note that 1) Your report did not make clear that this was only relevant to prediction intervals, which are not commonly used.
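For context, a hedged sketch of the point at issue (my own example): prediction intervals from a weighted fit need the weights of the new observations, which predict.lm() now accepts through its weights argument.

    fit <- lm(dist ~ speed, data = cars, weights = 1 / speed)
    predict(fit, newdata = data.frame(speed = c(10, 20)),
            interval = "prediction", weights = 1 / c(10, 20))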
2012 Nov 01
0
oblique.tree : the predict function asserts the dependent variable to be included in "newdata"
Dear R community, I have recently discovered the package oblique.tree, and it was a nice surprise for me, since I had already written my own classifier that uses the idea of oblique splits (splits by means of hyperplanes). So I am now interested in comparing the two classifiers. But what I do not seem to understand is why the function
2010 Dec 25
2
predict.lrm vs. predict.glm (with newdata)
Hi all I have run into a case where I don't understand why predict.lrm and predict.glm don't yield the same results. My data look like this: set.seed(1) library(Design); ilogit <- function(x) { 1/(1+exp(-x)) } ORDER <- factor(sample(c("mc-sc", "sc-mc"), 403, TRUE)) CONJ <- factor(sample(c("als", "bevor", "nachdem",
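A hedged sketch of how to compare the two on the same scale (using the newer rms package rather than the old Design package from the post, and simulated data): request probabilities from both fits.

    library(rms)
    set.seed(1)
    d <- data.frame(y = rbinom(200, 1, 0.5), x = rnorm(200))
    m.lrm <- lrm(y ~ x, data = d)
    m.glm <- glm(y ~ x, data = d, family = binomial)
    nd <- data.frame(x = c(-1, 0, 1))
    cbind(lrm = predict(m.lrm, newdata = nd, type = "fitted"),
          glm = predict(m.glm, newdata = nd, type = "response"))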
2007 Jan 22
0
[UNCLASSIFIED] predict.survreg() with frailty term and newdata
Dear All, I am attempting to make predictions based on a survreg() model with some censoring and a frailty term, as below: predict works fine on the original data, but not if I specify newdata. # a model with groups as fixed effect model1 <- survreg(Surv(y,cens)~ x1 + x2 + groups, dist = "gaussian") # and with groups as a random effect fr <- frailty(groups,
2009 Oct 07
0
error using predict() / "fRegression"-package
Hello! I'm puzzled by the following problem. It occurs while trying to predict responses in a test dataset using a linear model fitted with regFit from the Rmetrics "fRegression" package. All goes well when I call "predict" on the training dataset. However, a call using the test dataset returns an error message telling me that the latter dataset provides variables
2012 May 23
1
prcomp with previously scaled data: predict with 'newdata' wrong
Hello folks, it may be regarded as a user error to scale() your data prior to prcomp() instead of using its 'scale.' argument. However, it is something a user may well do, and it sounds legitimate, but in that case predict() with 'newdata' can give wrong results: x <- scale(USArrests) sol <- prcomp(x) all.equal(predict(sol), predict(sol, newdata=x)) ## [1]
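A minimal sketch of the safer pattern (my own note, not the thread's resolution): let prcomp() do the scaling itself, so predict() applies the same centring and scaling to newdata.

    sol <- prcomp(USArrests, scale. = TRUE)
    all.equal(predict(sol), predict(sol, newdata = USArrests))   # TRUE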
2012 Sep 04
1
predict rpart newdata - introduce only values variables used in the tree
Dear community, I have a tree which initially included 23 variables. I have since pruned this tree, and only 8 variables are involved. I'd like to predict while supplying in newdata only the values of those 8 variables. However, as the tree was built with all 23, it asks me for the other 15 values, even though it doesn't need them. Is there a way to supply only these 8 values?
2008 Apr 04
2
predict.glm & newdata
Hi all - I'm stumped by the following: mdl <- glm(resp ~ . , data = df, family=binomial, offset = ofst) WORKS yhat <- predict(mdl) WORKS yhat <- predict(mdl, newdata = df) FAILS Error in drop(X[, piv, drop = FALSE] %*% beta[piv]) : subscript out of bounds I've tried without the offset and with quoting binomial. The offset variable ofst IS in df. Previous postings indicate possible
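A hedged sketch of the usual advice for offsets with predict() (my own toy data, not a confirmed resolution of this post): put the offset inside the formula with offset(), so newdata carries it along consistently.

    set.seed(1)
    df <- data.frame(resp = rbinom(50, 1, 0.4),
                     x1   = rnorm(50),
                     ofst = log(runif(50, 1, 3)))
    mdl  <- glm(resp ~ x1 + offset(ofst), data = df, family = binomial)
    yhat <- predict(mdl, newdata = df, type = "link")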
2008 Aug 13
3
issue building dataframes with matrices.
Hello, Is this a bug or a feature? I am using R 2.7.1 on Apple OS X. > y <- matrix(1:3, nrow=3) # y is a single-column matrix > df <- data.frame(x=1:3, y=y) > sapply(df, data.class) x y "numeric" "numeric" > df$yy <- y > sapply(df, data.class) x y yy "numeric" "numeric" "matrix"
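A small sketch of the documented way to keep the matrix intact when the data frame is first built (which is what the later df$yy <- y assignment achieves):

    y <- matrix(1:3, nrow = 3)
    df <- data.frame(x = 1:3, y = I(y))   # I() protects the matrix column
    dim(df$y)                             # 3 1 -- still a one-column matrix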