similar to: how to ignore rows missing arguments of a function when creating a function?

Displaying 20 results from an estimated 4000 matches similar to: "how to ignore rows missing arguments of a function when creating a function?"

2013 Mar 30
1
vcovHC and arima() output
Dear all, how can I use vcovHC() to get robust/corrected standard errors from an arima() output? I ran an arima model with AR(1) and got the estimate, SE, z-value and p-value using coeftest(arima.output). However, I cannot use vcovHC(arima.output) to get corrected standard errors. It seems vcovHC works only with lm and plm objects? Is there another way I can get robust/corrected
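A hedged workaround, since vcovHC() has no method for "Arima" objects: re-express the AR(1) as a regression of the series on its own lag and run the usual sandwich/lmtest machinery on that lm fit. This is only a conditional least-squares approximation of the arima() ML fit, and the simulated series y is a stand-in for the poster's data.

library(sandwich)
library(lmtest)

set.seed(1)
y <- arima.sim(model = list(ar = 0.5), n = 200)   # stand-in series

## AR(1) rewritten as OLS of y_t on y_{t-1}
fit <- lm(y[-1] ~ y[-length(y)])
coeftest(fit, vcov = vcovHC(fit, type = "HC3"))
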
2013 Apr 05
1
white heteroskedasticity standard errors NLS
Hello Is there any function to calculate White's standard errors in R for an NLS regression? The sandwich and car packages do it, but they need an lm object to calculate the errors. Does anyone have an idea how to do it for an NLS object? Regards
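One hedged possibility, assuming no packaged method is available for nls objects: build the HC0 (White) covariance by hand from the gradient matrix of the nls fit, which plays the role of the design matrix. The mtcars model below is only a stand-in.

## White (HC0) standard errors for an nls fit, computed manually
fit <- nls(mpg ~ a * exp(b * wt), data = mtcars,
           start = list(a = 40, b = -0.3))

G <- fit$m$gradient()        # n x p gradient matrix at the estimates
e <- residuals(fit)
bread <- solve(crossprod(G))
meat  <- crossprod(G * e)    # G' diag(e^2) G
V_hc0 <- bread %*% meat %*% bread
sqrt(diag(V_hc0))            # White standard errors
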
2008 May 14
1
Negative Binomial Model
Hello, I am trying to run a negative binomial regression model in R and can't get the standard errors to match the output I get from the Stata nbreg command. I've tried a few different options but haven't had much luck. The closest I've found is: gamlss(formula, family = NBI, sigma.formula = ~ 1,data=dataframe) ...But this is still a little off most of the time and pretty far
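A hedged starting point: MASS::glm.nb() fits the same NB2 model as Stata's nbreg, with variance mu + mu^2/theta; small remaining differences in the standard errors usually come from the information matrix used (observed vs. expected) and from how theta is handled. The quine data set and formula stand in for the poster's model.

library(MASS)

fit <- glm.nb(Days ~ Sex + Age, data = quine)   # stand-in formula and data
summary(fit)$coefficients                       # compare against Stata's nbreg output
fit$theta                                       # dispersion (Stata reports alpha = 1/theta)
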
2011 Jul 25
1
predict() and heteroskedasticity-robust standard errors
Hello there, I have a linear regression model for which I estimated heteroskedasticity-robust (Huber-White) standard errors using the coeftest function in the lmtest-package. Now I would like to inspect the predicted values of the dependent variable for particular groups and include a confidence interval for this prediction. My question: is it possible to estimate confidence intervals for the
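A hedged sketch of one way to do it: compute the linear predictor for the new observations yourself and take the standard errors from x' V x with V the robust covariance, rather than from predict(). The mtcars fit is a stand-in for the poster's model.

library(sandwich)

fit    <- lm(mpg ~ wt + hp, data = mtcars)
newdat <- data.frame(wt = c(2.5, 3.5), hp = c(100, 150))

X    <- model.matrix(~ wt + hp, data = newdat)
pred <- drop(X %*% coef(fit))
se   <- sqrt(diag(X %*% vcovHC(fit, type = "HC1") %*% t(X)))  # robust SEs of the fitted means

cbind(fit = pred,
      lwr = pred - qt(0.975, df.residual(fit)) * se,
      upr = pred + qt(0.975, df.residual(fit)) * se)
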
2007 Nov 09
1
White's test again
Hi all, It seems that I can get White's (HC3) test using MASS. The syntax I used for the particular problem is anova(scireg3, white.adjust="hc3") where scireg3 is an object from the lm function. But, the anova summary table is all I get. I don't get the new estimates or standard errors correcting for heteroskedasticity. Is there a way to get that information? Thanks
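A hedged sketch of how to get the coefficient table itself with HC3 corrections (the white.adjust route only adjusts the ANOVA tests, as observed above): pass car::hccm() into lmtest::coeftest(). The mtcars model stands in for scireg3.

library(car)
library(lmtest)

scireg3 <- lm(mpg ~ wt + hp, data = mtcars)   # stand-in for the poster's lm object
coeftest(scireg3, vcov = hccm(scireg3, type = "hc3"))
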
2008 May 08
2
poisson regression with robust error variance ('eyestudy
Ted Harding said: > I can get the estimated RRs from > RRs <- exp(summary(GLM)$coef[,1]) > but do not see how to implement confidence intervals based > on "robust error variances" using the output in GLM. Thanks for the link to the data. Here's my best guess. If you use the following approach, with the HC0 type of robust standard errors in the
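A hedged sketch of the usual recipe, with InsectSprays standing in for the eyestudy data: take the HC0 covariance from sandwich, then exponentiate the coefficients and Wald limits to get RRs with robust confidence intervals.

library(sandwich)
library(lmtest)

GLM <- glm(count ~ spray, family = poisson, data = InsectSprays)  # stand-in model

ct  <- coeftest(GLM, vcov = vcovHC(GLM, type = "HC0"))
est <- ct[, "Estimate"]
se  <- ct[, "Std. Error"]

exp(cbind(RR = est, lwr = est - 1.96 * se, upr = est + 1.96 * se))
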
2017 Aug 03
0
Results of vcovCL (sandwich) and of cluster() in Stata
Hi, I'm trying to reproduce with R the results of this study: https://learn.gold.ac.uk/mod/resource/view.php?id=262406 More precisely, I want to reproduce the results of Table 6 (p. 280), which can also be seen here: http://picpaste.de/pics/table-robin-llKCOeWV.1501745645.png Let's take the first column: we have a coefficient of 0.097 and an SE of 0.026, which represents clustered robust
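A hedged sketch of the matching R call, assuming the study's data were loaded into a data frame: sandwich::vcovCL() with the cluster variable and the HC1 finite-sample correction is the analogue of Stata's vce(cluster id). The mtcars fit and the cyl cluster are stand-ins for the real data.

library(sandwich)
library(lmtest)

fit <- lm(mpg ~ wt + hp, data = mtcars)                 # stand-in regression
coeftest(fit, vcov = vcovCL(fit, cluster = ~ cyl, type = "HC1"))
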
2009 Mar 10
1
HAC corrected standard errors
Hi, I have a simple linear regression for which I want to obtain HAC corrected standard errors, since I have significant serial/auto correlation in my residuals, and also potential heteroskedasticity. Would anyone be able to direct me to the function that implements this in R? It's a basic question and I'm sure I'm missing something obvious here. I looked up this post:
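A hedged sketch: the sandwich package provides HAC estimators (vcovHAC, kernHAC, NeweyWest) that plug straight into lmtest::coeftest(). The toy regression below is only a placeholder.

library(sandwich)
library(lmtest)

fit <- lm(mpg ~ wt, data = mtcars)        # placeholder for the simple regression
coeftest(fit, vcov = vcovHAC(fit))        # HAC (Andrews) standard errors
coeftest(fit, vcov = NeweyWest(fit))      # Newey-West variant
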
2006 Jul 04
2
Robust standard errors in logistic regression
I am trying to get robust standard errors in a logistic regression. Is there any way to do it, either in car or in MASS? Thanks for the help, Celso
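A hedged sketch: neither car nor MASS is strictly needed here, since sandwich and lmtest handle glm objects directly. The mtcars logit is a stand-in model.

library(sandwich)
library(lmtest)

fit <- glm(am ~ wt + hp, family = binomial, data = mtcars)   # stand-in logit model
coeftest(fit, vcov = vcovHC(fit, type = "HC0"))              # Huber-White standard errors
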
2009 Jun 26
1
Heteroskedasticity and Autocorrelation in SemiPar package
Hi all, Does anyone know how to report heteroskedasticity and autocorrelation-consistent standard errors when using the "spm" command in SemiPar package? Suppose the original command is sp1<-spm(y~x1+x2+f(x3), random=~1,group=id) Any suggestion would be greatly appreciated. Thanks, Susan
2011 Sep 19
1
"could not find function" after import
I am trying to build a package (GWASTools, submitted to Bioconductor) that uses the "sandwich" package. I have references to "sandwich" in DESCRIPTION: Imports: methods, DBI, RSQLite, sandwich, survival, DNAcopy and NAMESPACE: import(sandwich) In the code itself is a call to vcovHC: Vhat <- vcovHC(mod, type="HC0") I have sandwich version 2.2-7 installed.
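A hedged sketch of an alternative setup to try: keep sandwich under Imports in DESCRIPTION but import only the symbol that is actually called, which makes the dependency explicit. Whether this resolves the "could not find function" error depends on details not shown in the snippet.

## DESCRIPTION (as in the post)
Imports: methods, DBI, RSQLite, sandwich, survival, DNAcopy

## NAMESPACE
importFrom(sandwich, vcovHC)
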
2010 May 14
1
Creating an S3 method when the generic function is defined in another (imported) package
Hi, In one of my packages (maxLik), I would like to add an S3 method, where the generic function (estfun) is defined in another package (sandwich). Everything works fine if my package "Depends" on the other package and I import the generic function "estfun" from the "sandwich" package and define the new method in the NAMESPACE file. However, I prefer not to load the
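A hedged sketch of the NAMESPACE/DESCRIPTION combination that avoids a Depends on sandwich: list it under Imports, import only the generic, and register the method. Names follow the maxLik example in the post.

## DESCRIPTION
Imports: sandwich

## NAMESPACE
importFrom(sandwich, estfun)
S3method(estfun, maxLik)
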
2011 Jan 01
2
robust standard error of an estimator
Hi, I would like to obtain the robust standard error of an estimator but I don't know how to do this. The code for my regression is the following: reg<-lm(fsn~lctot) But then what do I need to do? -- Charlène Lisa Cosandier
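A hedged sketch of one common follow-up, assuming fsn and lctot are in the workspace as in the post: feed a heteroskedasticity-consistent covariance from sandwich into lmtest::coeftest().

library(sandwich)
library(lmtest)

reg <- lm(fsn ~ lctot)                              # the poster's regression
coeftest(reg, vcov = vcovHC(reg, type = "HC1"))     # robust (Huber-White) standard errors
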
2010 May 10
2
Robust SE & Heteroskedasticity-consistent estimation
Hi, I'm using maxLik with the functions specified (L, its gradient & Hessian). Now I would like to determine some robust standard errors for my estimators. So I'm trying to use vcovHC, hccm or robcov, for example, but when I use one of them with my maxLik result, I get the following error message: Error in terms.default(object) : no terms component Are there some attributes
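A hedged sketch of a manual route, since vcovHC/hccm/robcov expect model objects with a terms component: assemble the sandwich covariance directly from the maximum-likelihood output. Here gradObs (the n x p matrix of observation-wise score contributions) and H (the Hessian at the optimum) are placeholders to be extracted from the maxLik result.

## robust (sandwich) covariance for an ML estimator, assembled by hand
bread    <- solve(-H)            # inverse of the negative Hessian
meat     <- crossprod(gradObs)   # sum of outer products of the scores
V_robust <- bread %*% meat %*% bread
sqrt(diag(V_robust))             # robust standard errors
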
2005 Jan 17
2
Omitting constant in ols() from Design
Hi! I need to run OLS regressions with Huber-White sandwich estimators and the corresponding standard errors, without an intercept. What I'm trying to do is create an ols object and then use the robcov() function, on the order of: f <- ols(depvar ~ ind1 + ind2, x=TRUE) robcov(f) However, when I go f <- ols(depvar ~ ind1 + ind2 -1, x=TRUE) I get the following error: Error in
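A hedged cross-check with base R rather than Design, assuming the variables sit in a data frame dat: lm() also drops the intercept with "- 1", and sandwich/lmtest then supply the Huber-White standard errors.

library(sandwich)
library(lmtest)

fit <- lm(depvar ~ ind1 + ind2 - 1, data = dat)     # dat is assumed to hold the variables
coeftest(fit, vcov = vcovHC(fit, type = "HC0"))     # Huber-White standard errors, no intercept
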
2005 Jun 02
1
glm with variance = mu+theta*mu^2?
How might you fit a generalized linear model (glm) with variance = mu+theta*mu^2 (where mu = mean of the exponential family random variable and theta is a parameter to be estimated)? This appears in Table 2.7 of Fahrmeir and Tutz (2001) Multivariate Statistical Modelling Based on Generalized Linear Models, 2nd ed. (Springer, p. 60), where they compare "log-linear model fits to
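A hedged sketch: this is the negative binomial variance function, so MASS covers both cases, although glm.nb() parameterizes the variance as mu + mu^2/theta (its theta is the reciprocal of the one above). The quine data set is only a stand-in.

library(MASS)

## theta fixed in advance: use the negative.binomial() family inside glm()
glm(Days ~ Sex + Age, family = negative.binomial(theta = 1.5), data = quine)

## theta estimated from the data: glm.nb() profiles it out
fit <- glm.nb(Days ~ Sex + Age, data = quine)
fit$theta
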
2010 Sep 23
1
Newey West and Singular Matrix + library(sandwich)
thank you, achim. I will try chol2inv. sandwich is a very nice package, but let me make some short suggestions. I am not a good econometrician, so I do not know what prewhitening is, and the vignette did not explain it. "?coeftest" did not work after I loaded the library. automatic bandwidth selection can be a good thing, but is not always. as to my own little function, I like the
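A hedged sketch of the points raised: coeftest() lives in lmtest rather than sandwich, prewhitening can be switched off, and the bandwidth can be fixed by hand when the automatic choice misbehaves. The toy fit is a placeholder.

library(sandwich)
library(lmtest)    # coeftest() is here, not in sandwich

fit <- lm(mpg ~ wt, data = mtcars)   # placeholder regression
coeftest(fit, vcov = NeweyWest(fit, lag = 4, prewhite = FALSE, adjust = TRUE))
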
2008 Sep 04
2
Correct for heteroscedasticity using car package
Dear all, Sorry if this is too obvious. I am trying to fit my multiple regression model using lm() Before starting model simplification using step() I checked whether the model presented heteroscedasticity with ncv.test() from the CAR package. It presents it. I want to correct for it, I used hccm() from the CAR package as well and got the Heteroscedasticity-Corrected Covariance Matrix. I am not
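A hedged sketch of what to do with the corrected matrix once hccm() has produced it: pass it to coeftest() for the coefficient table or to linearHypothesis() for Wald tests of individual terms. The mtcars fit is only a stand-in.

library(car)
library(lmtest)

fit <- lm(mpg ~ wt + hp, data = mtcars)    # stand-in for the poster's model
V   <- hccm(fit)                           # heteroscedasticity-corrected covariance (HC3 by default)
coeftest(fit, vcov = V)
linearHypothesis(fit, "hp = 0", vcov. = V)
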
2011 Feb 25
1
Question about foreach (with doSNOW), is that a bug?
Hi all, Within a foreach loop with doSNOW, we can't call functions that come from a non-default package. We need to load (require/library) the package once more within the foreach loop. Does anyone know why it happens like this? Is it caused by the snow package and something that happens when "snow" parallelizes the job? Other than loading the package once more within the foreach loop, is
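A hedged alternative to calling require() in the loop body: foreach() can load the package on each worker itself through its .packages argument. The sandwich example stands in for the poster's non-default package.

library(foreach)
library(doSNOW)

cl <- makeCluster(2)
registerDoSNOW(cl)

out <- foreach(i = 1:4, .combine = c, .packages = "sandwich") %dopar% {
  fit <- lm(mpg ~ wt, data = mtcars)
  sqrt(diag(vcovHC(fit, type = "HC0")))[2]   # vcovHC found because sandwich is loaded on the worker
}

stopCluster(cl)
out
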
2012 Feb 06
1
Simple lm/regression question
I am trying to use lm for a simple linear fit with weights. The results I get from IDL (which I am more familiar with) seem correct and intuitive, but the "lm" function in R gives outputs that seem strange to me. Unweighted case:

> x <- 1:4
> y <- (1:4)^2
> summary(lm(y ~ x))

Call:
lm(formula = y ~ x)

Residuals:
 1  2  3  4
 1 -1 -1  1

Coefficients:
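For the weighted case, a hedged sketch with made-up weights: lm() minimizes the weighted sum of squared residuals, so the coefficients only move away from the unweighted fit once the weights differ across observations.

x <- 1:4
y <- (1:4)^2
w <- c(1, 1, 2, 2)                    # made-up weights for illustration

summary(lm(y ~ x, weights = w))       # weighted least squares
summary(lm(y ~ x))                    # unweighted, for comparison
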