Displaying 20 results from an estimated 27 matches for "torvon".
2012 Nov 29
2
Confidence intervals for estimates of all independent variables in WLS regression
...rdized beta weights) of each predictor in a WLS regression:
m1 = lm(x ~ x1 + x2 + x3, weights = W, data = D)
SPSS offers that output by default, and I am not able to find a way to do
this in R. I read through predict.lm, but I do not find a way to get the
CIs for multiple independent variables.
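One way to get such intervals, sketched on made-up data (the variable and weight names follow the post, but the z-scoring approach is an assumption, not something the post specifies): refit the WLS model on standardized variables, so that confint() directly returns intervals for the standardized beta weights.

```r
# Toy data standing in for the poster's D and W (hypothetical values)
set.seed(1)
D <- data.frame(y = rnorm(100), x1 = rnorm(100),
                x2 = rnorm(100), x3 = rnorm(100))
W <- runif(100, 0.5, 2)

# Refit on z-scored variables: the slopes are then standardized betas,
# and confint() yields one interval per predictor
Dz <- as.data.frame(scale(D))
m1z <- lm(y ~ x1 + x2 + x3, weights = W, data = Dz)
confint(m1z, level = 0.95)
```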
Thank you
Torvon
2012 Oct 07
3
Robust regression for ordered data
...two
questions:
(1) Some sources say robust regression takes care of both non-normally
distributed errors and heteroscedasticity, while others say only of
non-normality. Which is true?
(2) Are there ways of using robust regressions with ordered data, or is
that only possible for metric DVs?
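For what it's worth, a sketch of both pieces on invented data: rlm() addresses heavy-tailed (non-normal) errors rather than heteroscedasticity, and for an ordered DV an ordinal model such as MASS::polr() is the usual route.

```r
library(MASS)   # rlm() for M-estimation, polr() for ordinal regression

set.seed(1)
d <- data.frame(x = rnorm(200))
d$y <- 2 * d$x + rt(200, df = 2)   # heavy-tailed (non-normal) errors

# rlm() downweights outlying residuals, so it guards against non-normal
# errors; it is not, by itself, a remedy for heteroscedasticity
fit_rob <- rlm(y ~ x, data = d)

# For an ordered DV, an ordinal model is the usual choice rather than
# treating the outcome as metric
d$y_ord <- cut(d$y, breaks = quantile(d$y, 0:4 / 4),
               include.lowest = TRUE, ordered_result = TRUE)
fit_ord <- polr(y_ord ~ x, data = d, Hess = TRUE)
```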
Thanks
Torvon
2013 Apr 08
1
qgraph: correlation matrix variable names
...variables.
In a correlation matrix, the first row and the first column usually have
variable names. We have so far been unable to read such a file into
qgraph, and have not found a way to manually assign names to variables in
qgraph.
Would you know of a solution to this problem?
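A sketch of one workaround (the file name is made up): read the matrix with row.names = 1 so the first column becomes row names, then hand those names to qgraph's labels argument.

```r
library(qgraph)

# Toy correlation matrix written to disk with variable names in the
# first column, then read back (the file name is hypothetical)
m <- cor(matrix(rnorm(300), ncol = 3,
                dimnames = list(NULL, c("anx", "dep", "som"))))
write.csv(m, "cors.csv")
cm <- as.matrix(read.csv("cors.csv", row.names = 1, check.names = FALSE))

# qgraph accepts node labels via its 'labels' argument
qgraph(cm, labels = rownames(cm), DoNotPlot = TRUE)
```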
Thank you,
Torvon
2013 Feb 12
1
Exact p-values in lm() - rounding problem
...I have the feeling that
p-values are truncated at the smallest printed value, "2e-16", because this
p-value appears very often.
Is that true, or just chance? If it is true, how do I obtain the "true"
unrounded p-values for these regressors?
m1 <- lm(y ~ x1 + x2 + x3 + x4 + x5, data = D)
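For context: summary.lm() only *prints* "< 2e-16" (roughly the machine epsilon); the stored p-values are not rounded, and they can also be recomputed from the t statistic. A sketch on toy data:

```r
set.seed(1)
D <- data.frame(x1 = rnorm(500))
D$y <- 5 * D$x1 + rnorm(500)   # very strong effect -> tiny p-value

m1 <- lm(y ~ x1, data = D)
ct <- summary(m1)$coefficients   # the print method shows "< 2e-16"

# The stored value in ct[, "Pr(>|t|)"] is exact; it can also be
# recomputed from the t statistic and the residual df
t_val <- ct["x1", "t value"]
p_exact <- 2 * pt(abs(t_val), df = m1$df.residual, lower.tail = FALSE)
```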
Thank you
Torvon
2013 Jan 23
1
Regression with 3 measurement points
Dear R Mailinglist,
I want to understand how predictors are associated with a dependent
variable in a regression. I have 3 measurement points. I am not interested
in the associations between the regressors and the outcome at each
measurement point separately; instead I would like to use the whole sample
in one regression, "pooling" the measurement points.
I cannot simply throw them
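One common way to pool the waves, sketched on invented wide-format data (whether a random intercept is appropriate depends on the design): reshape to long format, one row per person per measurement point, then fit a single model.

```r
library(lme4)

# Invented wide data: y and x measured at 3 points per person
set.seed(1)
wide <- data.frame(id = 1:100,
                   y.1 = rnorm(100), y.2 = rnorm(100), y.3 = rnorm(100),
                   x.1 = rnorm(100), x.2 = rnorm(100), x.3 = rnorm(100))

# One row per person per measurement point
long <- reshape(wide, direction = "long", idvar = "id", timevar = "wave",
                varying = list(c("y.1", "y.2", "y.3"),
                               c("x.1", "x.2", "x.3")),
                v.names = c("y", "x"))

# One pooled regression; the random intercept absorbs the dependence
# between a person's repeated measurements
fit <- lmer(y ~ x + (1 | id), data = long)
```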
2012 Nov 21
1
Regression: standardized coefficients & CI
I run 9 WLS regressions in R, with 7 predictors each.
What I want to do now is compare:
(1) The strength of predictors within each model (assuming all predictors
are significant). That is, I want to say whether x1 is stronger than x2,
and also say whether it is significantly stronger. I compare strength by
simply comparing standardized beta weights, correct? How do I compare if
one predictor is
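A sketch of one way to test such a difference formally (toy data; using car::linearHypothesis() is a suggestion, not something from the post): z-score the variables so the slopes are betas, then Wald-test the restriction that two slopes are equal.

```r
library(car)   # linearHypothesis() for the Wald test

set.seed(1)
D <- data.frame(y = rnorm(200), x1 = rnorm(200), x2 = rnorm(200))

# z-scoring all variables turns the slopes into standardized betas
Dz <- as.data.frame(scale(D))
mz <- lm(y ~ x1 + x2, data = Dz)

# "Is x1 significantly stronger than x2?" as a test of beta_x1 = beta_x2
linearHypothesis(mz, "x1 = x2")
```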
2012 Dec 07
1
Polychor() - why does it take that long?
Hello.
Using the polychor function
> polychor(data[, s1], data[, s2])
for polychoric correlations of two ordinal variables in R takes a long time
for N=7000 (20 minutes+) and significantly slows down my computer.
Now, I have a pretty old computer, but it takes about 20 seconds for MPLUS
to print out the complete polychoric correlation matrix for all 16
variables, while I am running the R function
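For comparison, the psych package's polychoric() estimates the whole correlation matrix in one pass with a fast two-step method, which is usually much quicker than looping over polychor() pairs (toy data below; N is reduced only to keep the example quick).

```r
library(psych)

# Toy ordinal items (the post has 16 items with N = 7000; N is
# reduced here purely for speed)
set.seed(1)
X <- as.data.frame(replicate(16, sample(1:5, 500, replace = TRUE)))

# polychoric() returns the full polychoric correlation matrix at once
pc <- polychoric(X)
pc$rho   # 16 x 16 matrix of polychoric correlations
```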
2012 Jul 05
4
Exclude missing values on only 1 variable
Hello,
I have many hundred variables in my longitudinal dataset and lots of
missings. In order to plot data I need to remove missings.
If I do
> data <- na.omit(data)
that will reduce my dataset to 2% of its original size ;)
So I only need to listwise delete missings on 3 variables (the ones I am
plotting).
data$variable1 <-na.omit(data$variable1)
does not work.
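A sketch of the usual fix (invented data and names): na.omit() on a single column returns a shorter vector, which is why the assignment fails; complete.cases() restricted to the three plotting variables keeps the rest of the data intact.

```r
# Toy data with scattered NAs (the names are made up)
data <- data.frame(variable1 = c(1, NA, 3, 4),
                   variable2 = c(1, 2, NA, 4),
                   variable3 = c(1, 2, 3, 4),
                   other     = c(NA, NA, NA, NA))

# Listwise deletion on just the three plotting variables
keep <- complete.cases(data[, c("variable1", "variable2", "variable3")])
plotdat <- data[keep, ]
nrow(plotdat)   # 2: only rows complete on those three columns remain
```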
Thank you
2012 Oct 22
1
glm.nb - theta, dispersion, and errors
...e symptom causes this problem:
1: In sqrt(1/i) : NaNs produced
How would you recommend dealing with these problems? Are the results
reliable even though these errors occur?
Again, I have to use a model that fits all 9 models overall best - maybe I
should default to maximum likelihood Poisson?
Thank you
Torvon
2012 Apr 24
1
Number of lines in analysis after removed missings
I have a dataset with plenty of variables and lots of missing data. As far
as I understand, R automatically removes subjects with missing values.
I'm trying to fit a mixed effects model, adding covariate by covariate. I
suspect that my sample gets smaller and smaller each time I add a
covariate, because more and more lines get deleted.
Is there a way of displaying how many subjects are
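One way to see this (toy data): nobs() reports how many rows actually entered a fit after na.action dropped the incomplete ones, and it also works for merMod objects from lme4.

```r
set.seed(1)
d <- data.frame(y = rnorm(50), a = rnorm(50), b = rnorm(50))
d$a[1:5] <- NA          # rows 1-5 incomplete on a
d$b[4:10] <- NA         # rows 4-10 incomplete on b

m <- lm(y ~ a + b, data = d)

# nobs() counts the rows that survived na.action; 40 here,
# since rows 1-10 each contain at least one NA
nobs(m)
```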
2012 Nov 09
1
Remove missings (quick question)
A colleague wrote the following syntax for me:
D = read.csv("x.csv")
## Convert -999 to NA
for (k in 1:dim(D)[2]) {
  I = which(D[, k] == -999)
  if (length(I) > 0) {
    D[I, k] = NA
  }
}
The dataset has many missing values. I am running several regressions on
this dataset, and want to ensure every regression has the same subjects.
Thus I want to drop subjects listwise for
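For reference, the loop can be collapsed into one vectorized assignment, and complete.cases() restricted to the regression variables gives the listwise deletion described (toy data; the column names are made up):

```r
# Stand-in for read.csv("x.csv"); -999 codes a missing value
D <- data.frame(a = c(1, -999, 3), b = c(-999, 2, 3), c = c(7, 8, 9))

# Vectorized equivalent of the column-by-column loop
D[D == -999] <- NA

# Listwise deletion restricted to the regression variables, so that
# every model is fit on the same subjects
vars <- c("a", "b")
D_reg <- D[complete.cases(D[, vars]), ]
```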
2012 Nov 24
0
robustbase error message
...owing error:
> Error in qr.default(x * sqrt(ret$weights)) :
> NA/NaN/Inf in foreign function call (arg 1)
Google wasn't helpful, maybe you can help me out. If this problem is not
obvious to someone who uses R regularly, please let me know what further
information to provide.
Thank you!
Torvon
2012 Dec 01
0
Relative strength of regression predictors (relaimpo vs. relimp)
...e right track
with the relimp and relaimpo packages?
(3) Is there a way to only use standardized betas? I would prefer that
because it would enable me to use standardized confidence intervals to
reason that x1 has a meaningfully larger influence on y than x2 (if the CIs
do not overlap).
Thank you!
Torvon
2012 May 06
2
Interaction plot between 2 continuous variables
I have two very strong fixed effects in a LMM (both continuous variables).
model <- lmer(y ~ time + x1 + x2 + (time | subject))
Once I fit an interaction of these variables, both main effects
disappear and I get a strong interaction effect.
model <- lmer(y ~ time + x1 * x2 + (time | subject))
I would like to plot this effect now, but have not been able to do so,
reading through ggplot2 and
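One standard display is a simple-slopes plot: predict over a grid of x1 at low / mean / high values of x2, using population-level predictions (re.form = NA). A sketch on invented data:

```r
library(lme4)
library(ggplot2)

# Toy longitudinal data with a genuine x1:x2 interaction
set.seed(1)
d <- expand.grid(subject = factor(1:40), time = 0:4)
d$x1 <- rnorm(nrow(d))
d$x2 <- rnorm(nrow(d))
d$y  <- 0.2 * d$time + d$x1 * d$x2 + rnorm(nrow(d))

model <- lmer(y ~ time + x1 * x2 + (time | subject), data = d)

# Predict over a grid: x1 varies, x2 fixed at -1 / 0 / +1 SD,
# random effects averaged out via re.form = NA
grid <- expand.grid(time = 2,
                    x1 = seq(-2, 2, length.out = 50),
                    x2 = c(-1, 0, 1))
grid$yhat <- predict(model, newdata = grid, re.form = NA)

ggplot(grid, aes(x1, yhat, colour = factor(x2))) + geom_line()
```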
2012 Oct 14
2
Poisson Regression: questions about tests of assumptions
I would like to test in R what regression fits my data best. My dependent
variable is a count, and has a lot of zeros.
And I would need some help to determine what model and family to use
(poisson or quasipoisson, or zero-inflated poisson regression), and how to
test the assumptions.
1) Poisson Regression: as far as I understand, the strong assumption is
that dependent variable mean = variance.
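A quick check of that assumption (toy overdispersed data): under a well-fitting Poisson model, the residual deviance divided by the residual degrees of freedom should be near 1; values well above 1 point to overdispersion, in which case quasipoisson, negative binomial, or zero-inflated models are worth considering.

```r
# Toy overdispersed counts with many zeros
set.seed(1)
d <- data.frame(x = rnorm(500))
d$y <- rnbinom(500, mu = exp(0.5 * d$x), size = 0.8)

m_pois <- glm(y ~ x, family = poisson, data = d)

# Rough dispersion diagnostic: residual deviance / residual df;
# near 1 supports mean = variance, well above 1 suggests overdispersion
ratio <- deviance(m_pois) / df.residual(m_pois)
ratio
```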
2012 Apr 12
0
Multivariate multilevel mixed effects model: interaction
Hello.
I am running a multivariate multilevel mixed effects model, and am trying
to understand what the interaction term tells me.
A very simplified version of the model looks like this:
model <- lmer (phq ~ -1 + as.factor(index_phq) * Neuro + ( -1 +
as.factor(index_phq)|UserID), data=data)
The phq variable is a categorical depression score on 9 depression items
(classified by the variable
2012 Apr 23
1
save model summary
Hello,
I'm working with RStudio, which does not display enough lines in the
console for me to read the full summary of my model (which is rather long
due to the covariance matrix). There seems to be no way around this, so I
guess I need to export the summary into a file in order to see it ...
I'm new to R, and googling "R save model summary" doesn't help, and
neither does help(save).
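Two base-R options, sketched on a stand-in model: capture.output() writes printed output straight to a file, and sink() redirects the console.

```r
# Toy model standing in for the long mixed-model summary
m <- lm(mpg ~ wt + hp, data = mtcars)

# capture.output() writes anything print() would show into a text file
capture.output(summary(m), file = "model_summary.txt")

# sink() is the alternative: redirect output, print, then restore
sink("model_summary2.txt")
print(summary(m))
sink()
```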
2012 Apr 26
0
Correlated random effects: comparison unconditional vs. conditional GLMMs
In a GLMM, one compares the conditional model including covariates with the
unconditional model to see whether the conditional model fits the data
better.
(1) For my unconditional model, a different random effects term fits better
(independent random effects) than for my conditional model (correlated
random effects). Is this very uncommon, and how can this be explained? Can
I compare these models
2012 Oct 13
1
WLS regression weights
Hello.
I am trying to follow a recommendation for dealing with a dependent
variable in a linear regression.
I read that, due to the positive trend in my dependent variable's
residual-vs-mean plot, I should
1) run a linear regression to estimate the standard deviations from this
trend, and
2) run a second linear regression and use 1 / variance as weight.
These might be terribly stupid
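A sketch of that two-stage procedure on invented heteroscedastic data (regressing the absolute OLS residuals on the fitted values is one common way to estimate the variance trend):

```r
# Toy data with residual variance increasing in the mean
set.seed(1)
d <- data.frame(x = runif(300, 1, 10))
d$y <- 2 * d$x + rnorm(300, sd = 0.5 * d$x)

# Stage 1: OLS fit, then model the spread of the residuals as a
# function of the fitted values to estimate the variance trend
ols  <- lm(y ~ x, data = d)
vfit <- lm(abs(residuals(ols)) ~ fitted(ols))
sd_hat <- pmax(fitted(vfit), 1e-6)   # estimated residual SDs, kept > 0

# Stage 2: refit with weights = 1 / variance
wls <- lm(y ~ x, data = d, weights = 1 / sd_hat^2)
```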
2012 Oct 21
0
R^2 in Poisson via pR2() function: skeptical about R^2 results
Hello.
I am running 9 Poisson regressions with 5 predictors each, using glm with
family=poisson.
The Poisson distribution fits better than linear regression on fit indices,
and also for theoretical reasons (e.g. the dependent variables are counts,
and their distribution is highly positively skewed).
I want to determine pseudo R^2 now. However, using the pR2() of the pscl
package offers drastically
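For reference, a minimal pR2() call on toy data: it returns several pseudo-R^2 measures at once (McFadden, "r2ML", "r2CU"), which answer different questions and routinely disagree, and that may be the source of the drastic differences.

```r
library(pscl)

# Toy count outcome for a Poisson regression
set.seed(1)
d <- data.frame(x = rnorm(300))
d$y <- rpois(300, lambda = exp(0.3 * d$x))

m <- glm(y ~ x, family = poisson, data = d)

# pR2() returns a named vector of log-likelihoods and pseudo-R^2s;
# the different measures are on different scales and rarely agree
pR2(m)
```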