similar to: residual plots

Displaying 20 results from an estimated 30000 matches similar to: "residual plots"

2004 Nov 02 (2 replies) - Problems with Durbin Watson and Partial Residual Plots
I am trying to evaluate a model by using the commands durbin.watson and cr.plot. However, I keep getting errors that I can't figure out. A description follows. Does anyone have a hint as to what may be wrong? 1) The Durbin-Watson test. In running the command I kept getting the message "residuals include missing values" when actually this was NOT the case. Example:
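A minimal sketch of the workflow this thread is about, using the current 'car' function names (durbinWatsonTest and crPlots, the newer equivalents of the durbin.watson and cr.plot calls in the post); the data frame here is made up, and rows with NAs are dropped before fitting so the residual vector contains no missing values:

library(car)

d   <- na.omit(data.frame(y = rnorm(100), x1 = rnorm(100), x2 = rnorm(100)))
fit <- lm(y ~ x1 + x2, data = d)
durbinWatsonTest(fit)   # Durbin-Watson test for autocorrelated errors
crPlots(fit)            # component + residual (partial residual) plots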
2014 Jun 20 (2 replies) - [PATCH] stream_encoder : Improve selection of residual accumulator width
On Fri, Jun 20, 2014 at 01:21:03PM +0400, lvqcl wrote:
> Miroslav Lichvar wrote:
> > +/*
> > + * This is used to avoid overflow with unusual signals in 32-bit
> > + * accumulator in the *precompute_partition_info_sums_* functions.
> > + */
> > +#define FLAC__MAX_EXTRA_RESIDUAL_BPS 4
> > + /* WATCHOUT: "+ bps +
2010 Jul 09 (1 reply) - Appropriate tests for logistic regression with a continuous predictor variable and Bernoulli response variable
I have data with a binary response variable, repcnd (pregnant or not), and one continuous predictor variable, svl (body size), as shown below. I did a Hosmer-Lemeshow test as a goodness-of-fit check (as suggested by a kind “R-helper” previously). To test whether the predictor (svl, or body size) has a significant effect on predicting whether or not a female snake is pregnant, I used the differences between
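A minimal sketch of the kind of model described (the data frame 'snakes' and its columns are hypothetical): fit the logistic regression with glm and test svl with a likelihood-ratio test, i.e. the difference in deviances between the null model and the fitted model:

fit0 <- glm(repcnd ~ 1,   family = binomial, data = snakes)
fit1 <- glm(repcnd ~ svl, family = binomial, data = snakes)
anova(fit0, fit1, test = "Chisq")   # deviance difference ~ chi-squared on 1 df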
2014 Jun 19 (7 replies) - [PATCH] stream_encoder : Improve selection of residual accumulator width
In the precompute_partition_info_sums_ function, instead of selecting a 64-bit accumulator when the signal bps is larger than 16, revert to the original approach based on partition size, but make room for a few extra bits to avoid overflow with unusual signals where the average residual magnitude may be larger than bps. It slightly improves the performance with standard encoding levels and 16-bit files
2006 Jul 26 (2 replies) - residual df in lmer and simulation results
Hello. Douglas Bates has explained in a previous posting to R why he does not output residual degrees of freedom, F values and probabilities in the mixed model (lmer) function: because the usual degrees of freedom (obs - fixed df -1) are not exact and are really only upper bounds. I am interpreting what he said but I am not a professional statistician, so I might be getting this wrong... Does
2011 Nov 23 (1 reply) - How to explain interaction variable in Linear regression?
Hello everyone, recently I ran into a problem with the interpretation of an *interaction variable* in linear regression; could anyone give me some help on how to explain it? The response variable Y is significantly correlated with the *interaction variable X*, which consists of the continuous predictor A and the categorical predictor B. The categorical predictor B has two levels, B1 (value=1) and B2 (value=0). The
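For reference, a hypothetical sketch of such a model and how its coefficients read under R's default treatment coding (with B2 = 0 as the reference level):

fit <- lm(Y ~ A * B, data = dat)   # expands to A + B + A:B
summary(fit)
# coef on A   : slope of A when B = B2 (the 0 level)
# coef on B   : difference in intercepts (B1 minus B2) at A = 0
# coef on A:B : difference in the slope of A between B1 and B2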
2011 Nov 15 (2 replies) - Models with ordered and unordered factors
Hello; I am having problems with the interpretation of models using ordered or unordered predictors. I am running models in lmer but I will try to give a simplified example data set using lm. Both in the example and in my real data set I use a predictor variable referring to 3 consecutive days of an experiment. It is a factor, and I thought it would be more correct to consider it ordered. Below
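A small sketch of the difference (with made-up data): an unordered factor uses treatment contrasts (each day compared with the baseline day), while an ordered factor uses orthogonal polynomial contrasts (.L linear, .Q quadratic). The fitted values are identical; only the parameterisation of the coefficients changes.

day.u <- factor(rep(1:3, each = 10))
day.o <- ordered(day.u)
y     <- rnorm(30)
coef(lm(y ~ day.u))   # intercept plus day 2 and day 3 differences from day 1
coef(lm(y ~ day.o))   # intercept plus linear (.L) and quadratic (.Q) terms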
2011 Nov 20 (3 replies) - logistic regression by glm
Hi, I use glm in R to do logistic regression, and treat both the response and the predictors as factors. In my first try:

Call:
glm(formula = as.factor(diagnostic) ~ as.factor(7161521) + as.factor(2281517),
    family = binomial())

Deviance Residuals:
    Min       1Q   Median       3Q      Max
-1.5370  -1.0431  -0.9416   1.3065   1.4331

Coefficients:
2006 Jan 15 (1 reply) - problems with glm
Dear R users, I am having some problems with glm. The first is an error message "subscript out of bounds". The second is the fact that reasonable starting values are not accepted by the function. To be more specific, here is an example:

> success <- c(13,12,11,14,14,11,13,11,12)
> failure <- c(0,0,0,0,0,0,0,2,2)
> predictor <- c(0,80*5^(0:7))
>
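Using the vectors quoted above, a hedged sketch of the usual way to set this up: a two-column (successes, failures) matrix response, with any starting values supplied on the coefficient (logit) scale, one value per coefficient:

success   <- c(13,12,11,14,14,11,13,11,12)
failure   <- c(0,0,0,0,0,0,0,2,2)
predictor <- c(0, 80*5^(0:7))
fit <- glm(cbind(success, failure) ~ predictor, family = binomial,
           start = c(3, 0))   # start = (intercept, slope) on the logit scale
summary(fit)
# rescaling the predictor (e.g. predictor/1e6 or log1p(predictor)) often helps
# numerically with such a wide range of values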
2010 Sep 08 (3 replies) - regression function for categorical predictor data
Hi, do you guys know which function in R handles multiple regression on categorical predictor data? i.e., 'lm' is used to handle continuous predictor data. thanks, karena
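lm() itself handles this: wrap the categorical predictors as factors and R builds the dummy (indicator) variables automatically. A small made-up example:

dat <- data.frame(y     = rnorm(60),
                  group = factor(rep(c("A", "B", "C"), 20)),
                  x     = rnorm(60))
fit <- lm(y ~ group + x, data = dat)   # same call as with continuous predictors
summary(fit)   # groupB, groupC are differences from the baseline level A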
2011 Jul 29 (4 replies) - finding a faster way to run lm on rows of predictor matrix
Hi, everyone. I need to run lm with the same response vector but with varying predictor vectors (i.e. regressing 1 response vector on each of 6,000 individual predictor vectors). After looking through the R archive, I found roughly 3 methods that have been suggested. Unfortunately, I need to run this task multiple times (~5,000 times) and would like to find a faster way than the existing methods. All three
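One common speed-up for this situation (a sketch with simulated data, not necessarily one of the three methods from the thread): for single-predictor regressions, the slopes, residual sums of squares and t statistics for all 6,000 predictors can be computed at once with matrix algebra, avoiding 6,000 separate lm() calls.

n.pred <- 6000; n.obs <- 100
X <- matrix(rnorm(n.pred * n.obs), nrow = n.pred)   # one predictor per row
y <- rnorm(n.obs)

Xc    <- X - rowMeans(X)                 # centre each predictor
yc    <- y - mean(y)
ssx   <- rowSums(Xc^2)
slope <- as.vector(Xc %*% yc) / ssx      # OLS slope of y on each predictor
sse   <- sum(yc^2) - slope^2 * ssx       # residual sum of squares per regression
tstat <- slope / sqrt(sse / ((n.obs - 2) * ssx))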
2012 Mar 03 (1 reply) - interpreting the output of a glm with an ordered categorical predictor.
Greetings. I'm a Master's student working on an analysis of herbivore damage on plants. I have tried running a glm with one categorical predictor (aphid abundance) and a binomial response (presence/absence of herbivore damage). My predictor has four categories: high, medium, low, and none. I used the "ordered" function to sort my categories for a glm. ah <-
2012 Jan 06 (0 replies) - plots for residual analysis
Hello List: I'm writing some R code to produce plots for residual analysis and diagnostics in linear regressions. An example of the plots produced is available for download at http://dl.dropbox.com/u/25445316/res_plots.png . Regarding the example plot, I'd like to point out that: 1) Trend lines based on lowess estimates are drawn as solid green curves, 2) "Extra"
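For comparison, a minimal version of one such panel (using the built-in 'cars' data rather than the plot linked above): residuals against fitted values with a green lowess trend line.

fit <- lm(dist ~ speed, data = cars)
r <- resid(fit); f <- fitted(fit)
plot(f, r, xlab = "Fitted values", ylab = "Residuals")
abline(h = 0, lty = 2)
lines(lowess(f, r), col = "green", lwd = 2)   # lowess trend line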
2012 Dec 06 (1 reply) - gamcheck doubts
Dear All, I am fitting negative binomial GAMs to scallop count data. I have two significant parameters that explain 43% of the deviance. The adjusted R-squared is 0.25. The gam.check function gives me the figure attached. In the plot of linear predictor vs. residuals there seem to be more negative residual values than positive. Is that telling me that the fit is underestimating the response? Can
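A sketch of the kind of fit described, using mgcv's negative binomial family (the data frame 'scallops' and its covariates are hypothetical); gam.check() then produces the residual diagnostics referred to above:

library(mgcv)
m <- gam(count ~ s(depth) + s(temp), family = nb(), data = scallops)
summary(m)     # deviance explained and adjusted r-squared
gam.check(m)   # includes the linear predictor vs. residuals plot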
2011 Sep 15 (1 reply) - MCMCglmm heteroscedasticity dependent on predictor
Hi, I have a dataset where the residual variance decreases with one of the predictors (population size). Currently, the full model looks like this:

prior <- list(R=list(V=1e-16, nu=-2), G1=list(V=diag(2), nu=2))
m <- MCMCglmm(response ~ poly(population size,2)*poly(other predictor,2) + time,
              random=~us(1+time):population, data=data, prior=prior)

Basically, it's a random regression with
2009 Sep 01 (1 reply) - understanding the output from gls
I'd like to compare two models that were fitted using gls; however, I'm having trouble interpreting the results of gls. If any of you could offer me some advice, I'd greatly appreciate it. Short explanation of the models: these two models have the same fixed-effects structure (two independent, linear effects) and differ only in that the second model includes a corExp structure for
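A sketch of how such a comparison is usually set up (variable names hypothetical): fit both gls models to the same data with the same fixed effects and compare them with anova(), which reports AIC, BIC and a likelihood-ratio test for the extra correlation parameters:

library(nlme)
m1 <- gls(y ~ x1 + x2, data = d)
m2 <- gls(y ~ x1 + x2, data = d,
          correlation = corExp(form = ~ xcoord + ycoord))
anova(m1, m2)   # valid here because the fixed effects are identical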
2012 Mar 16 (1 reply) - multivariate regression and lm()
Hello, I would like to perform a multivariate regression analysis to model the relationship between m responses Y1, ... Ym and a single set of predictor variables X1, ..., Xr. Each response is assumed to follow its own regression model, and the error terms in each model can be correlated. Based on my readings of the R help archives and R documentation, the function lm() should be able to
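Indeed, lm() accepts a matrix response and fits one regression per column against a shared design matrix; the residual correlations across responses can then be estimated from the fit. A made-up example:

n   <- 50
d   <- data.frame(X1 = rnorm(n), X2 = rnorm(n))
Y   <- cbind(Y1 = rnorm(n), Y2 = rnorm(n), Y3 = rnorm(n))
fit <- lm(Y ~ X1 + X2, data = d)   # returns an "mlm" object
coef(fit)                          # one column of coefficients per response
cor(resid(fit))                    # estimated error correlation between responses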
2012 Sep 05 (2 replies) - Improvement of Regression Model
Hello folks, I am in the learning phase of R. I have developed a regression model with six predictor variables. During development, I found that my data are not very linear, so maybe because of this the predictions of my model are not exact. Here is the summary of the model:

Call:
lm(formula = y ~ x_1 + x_2 + x_3 + x_4 + x_5 + x_6)

Residuals:
     Min       1Q   Median       3Q      Max
-125.302
2011 Feb 16 (1 reply) - retrieving partial residuals of gam fit (mgcv)
Dear list, does anybody know whether there is a way to easily retrieve the so-called "partial residuals" of a gam fit with the package mgcv? The partial residuals are the residuals you would get if you "left out" a particular predictor, and are the dots in the plots created by plot(gam.object, residuals=TRUE). residuals.gam() gives me whole-model residuals and
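One way to reconstruct them by hand, assuming this matches what plot.gam draws (the partial residuals for a smooth being that smooth's fitted contribution plus the working residuals); 'dat' and its covariates are hypothetical:

library(mgcv)
b  <- gam(y ~ s(x0) + s(x1), data = dat)
tm <- predict(b, type = "terms")                      # per-term contributions
pr <- tm[, "s(x0)"] + residuals(b, type = "working")  # partial residuals for s(x0)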
2010 Dec 03 (2 replies) - difference between linear model & scatterplot matrix
Dear R-users, I'm studying a DB, structured like this (just a little part of my dataset):

Site        Latitude    Longitude    Year  Tot-Prod  Total_Density  dmp
Dendoudi-1  15.441964   -13.540179   2005  3271.16   1007           16993.25
Dendoudi-2  15.397321   -13.611607