similar to: Is there an R command for testing the difference of two linear regressions?

Displaying 20 results from an estimated 7000 matches similar to: "Is there an R command for testing the difference of two linear regressions?"

2012 Mar 14
2
How to test the statistical significance of the difference of two univariate Linear Regression betas?
How to test the statistical significance of the difference of two univariate Linear Regression betas? Hi all, There are two samples of data: D1 and D2. On data D1 we do a univariate Linear Regression and get the coefficient beta1. On data D2 we do a univariate Linear Regression and get the coefficient beta2. How do I test the statistical significance of (beta1-beta2)? Could you please
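A common approach (a sketch, not from the thread itself; the column names y and x and the data frames D1, D2 are hypothetical) is a z-test on the difference of the two slopes using their standard errors, or equivalently a test of the group-by-x interaction in a pooled model:

  # Sketch, assuming D1 and D2 are data frames with columns y and x
  fit1 <- lm(y ~ x, data = D1)
  fit2 <- lm(y ~ x, data = D2)
  b1  <- coef(fit1)["x"];  b2  <- coef(fit2)["x"]
  se1 <- coef(summary(fit1))["x", "Std. Error"]
  se2 <- coef(summary(fit2))["x", "Std. Error"]
  z <- (b1 - b2) / sqrt(se1^2 + se2^2)    # approximate z statistic
  2 * pnorm(-abs(z))                      # two-sided p-value
  # Equivalent idea: pool the data and test the group-by-x interaction
  pooled <- rbind(transform(D1, g = "D1"), transform(D2, g = "D2"))
  summary(lm(y ~ x * g, data = pooled))   # look at the x:g coefficient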
2009 Aug 19
1
ridge regression
Dear all, I considered an ordinary ridge regression problem. I approached it in three different ways: 1. estimate beta without any standardization 2. estimate standardized beta (standardizing X and y) and then convert back 3. estimate beta using the lm.ridge() function X<-matrix(c(1,2,9,3,2,4,7,2,3,5,9,1),4,3) y<-t(as.matrix(cbind(2,3,4,5))) n<-nrow(X) p<-ncol(X) #Without
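For reference, a minimal sketch of the manual ridge estimator beta = (X'X + lambda*I)^(-1) X'y on centred data next to MASS::lm.ridge(); the lambda value is hypothetical. The two will generally not coincide, because lm.ridge() also scales the columns of X before penalising, which is exactly the discrepancy the post is about:

  library(MASS)                                   # for lm.ridge()
  X <- matrix(c(1,2,9,3,2,4,7,2,3,5,9,1), 4, 3)
  y <- c(2, 3, 4, 5)
  lambda <- 1                                     # hypothetical ridge constant
  # Manual ridge estimate on centred data (intercept not penalised)
  Xc <- scale(X, center = TRUE, scale = FALSE)
  yc <- y - mean(y)
  solve(t(Xc) %*% Xc + lambda * diag(ncol(Xc)), t(Xc) %*% yc)
  # lm.ridge() standardizes X internally before penalising
  coef(lm.ridge(y ~ X, lambda = lambda))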
2009 Aug 19
1
Ridge regression [Repost]
Dear all, For an ordinary ridge regression problem, I followed three different approaches: 1. estimate beta without any standardization 2. estimate standardized beta (standardizing X and y) and then again convert back 3. estimate beta using lm.ridge() function X<-matrix(c(1,2,9,3,2,4,7,2,3,5,9,1),4,3) y<-as.matrix(c(2,3,4,5)) n<-nrow(X) p<-ncol(X) #Without standardization
2008 Jul 07
5
question on lm or glm matrix of coefficients X test data terms
Hi, is there an easy way to get the calculated weights in a regression equation? For example, if my model has 2 variables, var1 and var2, with coefficients .05 and .6, how can I get the computed values for a test dataset for each coefficient? Given the data var1,var2 = 10,100, I want to get .5, 60 back in a vector. This is a one-row example, but I would want to get a matrix of multiplied-out coefficients
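One way to get the per-coefficient contributions for new data (a sketch; the thread is truncated, so the training values below are made up) is to multiply the model matrix of the test data column-wise by the coefficients, or to use predict(..., type = "terms"), which returns a related mean-centred breakdown:

  # Hypothetical training data and model
  train <- data.frame(var1 = c(1, 2, 3, 4), var2 = c(10, 30, 20, 40),
                      y = c(3, 8, 6, 11))
  fit <- lm(y ~ var1 + var2, data = train)
  # Test row from the post: var1 = 10, var2 = 100
  test <- data.frame(var1 = 10, var2 = 100)
  mm <- model.matrix(delete.response(terms(fit)), data = test)
  sweep(mm, 2, coef(fit), `*`)              # each column = coefficient * value
  predict(fit, newdata = test, type = "terms")   # mean-centred per-term contributions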
2004 Apr 21
2
Question on CAR appendix on NLS
The PDF file on the web, which is an appendix on nonlinear regression associated with the CAR book, is very nice. When I ran through the code presented there, I found something odd. The code fits a certain model in 3 ways: vanilla NLS (using numerical differentiation), analytical derivatives (where the user supplies the derivatives), and analytical derivatives (using automatic differentiation). The
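The variants can be illustrated with a tiny example (a sketch, not the model from the CAR appendix; the data are simulated): plain nls() differentiates numerically, while deriv() builds a function whose result carries a "gradient" attribute that nls() can pick up:

  set.seed(1)
  x <- 1:20
  y <- 5 * exp(-0.3 * x) + rnorm(20, sd = 0.05)
  # 1. Numerical derivatives
  nls(y ~ a * exp(-b * x), start = list(a = 4, b = 0.2))
  # 2. Automatic analytic derivatives via deriv()
  f <- deriv(~ a * exp(-b * x), c("a", "b"), function(a, b, x) {})
  nls(y ~ f(a, b, x), start = list(a = 4, b = 0.2))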
2007 May 14
1
Hierarchical models in R
Is there a way to do hierarchical (Bayesian) logistic regression in R, the way we do it in BUGS? For example, in BUGS we can have this model:
model {
  for (i in 1:N) {
    y[i] ~ dbin(p[i], n[i])
    logit(p[i]) <- beta0 + beta1*x1[i] + beta2*x2[i] + beta3*x3[i]
  }
  sd ~ dunif(0, 10)
  tau <- pow(sd, -2)
  beta0 ~ dnorm(0, 0.1)
  beta1 ~ dnorm(0, tau)
  beta2 ~ dnorm(0, tau)
  beta3 ~
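One way to fit essentially the same model from R (a sketch, assuming the rjags package and JAGS are installed; the data values are hypothetical) is to pass the BUGS model string to JAGS directly:

  library(rjags)
  model_string <- "
  model {
    for (i in 1:N) {
      y[i] ~ dbin(p[i], n[i])
      logit(p[i]) <- beta0 + beta1*x1[i] + beta2*x2[i] + beta3*x3[i]
    }
    sd ~ dunif(0, 10)
    tau <- pow(sd, -2)
    beta0 ~ dnorm(0, 0.1)
    beta1 ~ dnorm(0, tau)
    beta2 ~ dnorm(0, tau)
    beta3 ~ dnorm(0, tau)
  }"
  dat <- list(y = c(3, 5, 2, 6, 4, 1), n = rep(10, 6),
              x1 = c(0.1, 0.5, 0.9, 0.3, 0.7, 0.2),
              x2 = c(1, 0, 1, 0, 1, 0), x3 = c(2, 3, 1, 4, 2, 3), N = 6)
  jm <- jags.model(textConnection(model_string), data = dat, n.chains = 2)
  post <- coda.samples(jm, c("beta0", "beta1", "beta2", "beta3", "sd"), n.iter = 5000)
  summary(post)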
2013 Apr 03
3
Generating a bivariate joint t distribution in R
Hi, I conduct a panel data estimation and obtain estimators for two of the coefficients, beta1 and beta2. R tells me the mean and covariance of the distribution of (beta1, beta2). Now I would like to find the distribution of the quotient beta1/beta2, and one way to do it is to simulate via the joint distribution of (beta1, beta2), where both beta1 and beta2 follow a t distribution. How could we
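A sketch, assuming the joint distribution is multivariate t with the estimated location and scale matrix: mvtnorm::rmvt() generates draws and the quotient is then simulated directly. The location, scale matrix, and degrees of freedom below are hypothetical placeholders for the values R reports:

  library(mvtnorm)
  mu    <- c(1.2, 0.8)                                # estimated (beta1, beta2)
  Sigma <- matrix(c(0.04, 0.01, 0.01, 0.09), 2, 2)    # estimated covariance
  df    <- 30                                         # residual degrees of freedom
  draws <- rmvt(100000, sigma = Sigma, df = df, delta = mu)
  ratio <- draws[, 1] / draws[, 2]                    # simulated beta1/beta2
  quantile(ratio, c(0.025, 0.5, 0.975))               # e.g. a 95% interval for the quotient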
2006 Oct 17
4
if statement error
Hi List, I was not able to make this work. I know it is a simple one, sorry to bother you. Please give me some hints. Thanks! Jen if(length(real.d)>=30 && length(real.b)>=30 && beta1*beta2*theta1*theta2>0 ) { r <- 1; corr <- 1; } real.d and real.b are two vectors; beta1, beta2, theta1, and theta2 are constants. The error occurred like this: Error in if
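The error message is cut off, but with a condition like this the usual culprit is an NA (or a zero-length value) inside the if(). A sketch of a guarded version, with hypothetical values standing in for the objects in the post:

  real.d <- rnorm(40); real.b <- rnorm(35)
  beta1 <- 0.5; beta2 <- 2; theta1 <- 1; theta2 <- NA   # an NA here would break the original if()
  ok <- length(real.d) >= 30 &&
        length(real.b) >= 30 &&
        !anyNA(c(beta1, beta2, theta1, theta2)) &&
        beta1 * beta2 * theta1 * theta2 > 0
  if (isTRUE(ok)) {      # isTRUE() guards against NA and zero-length results
    r <- 1
    corr <- 1
  }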
2013 Mar 11
2
vertical lines in R plot
Dear All, May I seek your suggestions on a simple issue. I want to draw vertical lines at some positions in the following R plot. To be more specific, I wish to draw vertical lines at d=c(5.0,5.5,6), and they should go up to p=c(0.12,0.60,0.20). I haven't found any way to do it, though I have made several attempts. Please run the following commands first if you are interested!
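segments() adds partial vertical lines to an existing plot (abline(v = ...) would draw full-height lines instead). A sketch using the d and p values from the post, on a hypothetical base plot since the original commands are not shown:

  plot(c(4, 7), c(0, 1), type = "n", xlab = "d", ylab = "p")   # hypothetical existing plot
  d <- c(5.0, 5.5, 6)
  p <- c(0.12, 0.60, 0.20)
  segments(x0 = d, y0 = 0, x1 = d, y1 = p, lty = 2)   # vertical lines from the axis up to p
  points(d, p, pch = 16)                              # mark the endpoints (optional)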
2012 Oct 23
1
Minimizing Computational Time
Dear R-users, May I seek some suggestions from you. I have a long programme written in R with several 'for' loops inside. I would like to eliminate them in some elegant way (if there is one!) to reduce the computational time of the main programme. For instance, is there any smarter way to write the following programme so that it takes less time?
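The programme itself is truncated, so here is only a generic illustration of the usual cure: replace an element-by-element loop with vectorised arithmetic (or apply()/vapply()/outer() for loops that fill matrices):

  x <- runif(1e6)
  # Loop version
  out <- numeric(length(x))
  for (i in seq_along(x)) out[i] <- sqrt(x[i]) + 1
  # Vectorised version: same result, typically far faster
  out2 <- sqrt(x) + 1
  all.equal(out, out2)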
2004 Apr 16
5
Non-Linear Regression (Cobb-Douglas and C.E.S)
Dear all, For estimating the Cobb-Douglas production function [ Y = ALPHA * (L^(BETA1)) * (K^(BETA2)) ], I want to use the nls function (without linearizing it). But how can I get initial values? ------------------------------------ > options(prompt=" R> " ) R> Y <- c(59.6, 63.9, 73.5, 75.6, 77.3, 82.8, 83.6, 84.9, 90.3, 80.5, 73.5, 60.3, 58.2, 64.4, 75.4, 85, 92.7, 85.4,
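A common trick (a sketch; the L and K series below are made up because the post's data is truncated) is to log-linearise once with lm() only to obtain starting values, then hand those to nls() for the untransformed multiplicative fit:

  set.seed(42)
  L <- runif(25, 50, 150); K <- runif(25, 20, 80)          # hypothetical inputs
  Y <- 2.5 * L^0.6 * K^0.3 * exp(rnorm(25, sd = 0.05))
  # Step 1: OLS on the log-linearised model gives rough starting values
  cf <- coef(lm(log(Y) ~ log(L) + log(K)))
  start <- list(ALPHA = exp(cf[[1]]), BETA1 = cf[[2]], BETA2 = cf[[3]])
  # Step 2: nonlinear fit of the original form
  fit <- nls(Y ~ ALPHA * L^BETA1 * K^BETA2, start = start)
  summary(fit)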
2008 Dec 03
1
hypergeometric
Hi, I hope somebody can help me with how to use the hypergeometric function. I did read through the R documentation on the hypergeometric, but I am not really sure what it means. I would like to evaluate the hypergeometric function as follows: F((2*alpha+1)/2, (2*alpha+2)/2, alpha+1/2, betasq/etasq). I'm not sure which function should be used: phyper, qhyper, or dhyper. Where
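Worth noting: phyper/dhyper/qhyper are the hypergeometric distribution, whereas the expression F(a, b; c; z) in the post is the Gauss hypergeometric function 2F1, which is not in base R. A sketch using the contributed hypergeo package (gsl::hyperg_2F1 is another option); the parameter values are hypothetical:

  # install.packages("hypergeo")     # contributed package, not base R
  library(hypergeo)
  alpha <- 1.5; betasq <- 0.3; etasq <- 0.8     # hypothetical values
  val <- hypergeo((2*alpha + 1)/2, (2*alpha + 2)/2,
                  alpha + 1/2, betasq/etasq)
  Re(val)                                       # hypergeo() returns a complex number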
2023 Aug 20
1
Determining Starting Values for Model Parameters in Nonlinear Regression
The cautions people have given about starting values are worth heeding. That nlxb() does well in many cases is useful, but not foolproof. And John Fox has shown that the problem can be tackled very simply too. Best, JN On 2023-08-19 18:42, Paul Bernal wrote: > Thank you so much Dr. Nash, I truly appreciate your kind and valuable contribution. > > Cheers, > Paul > > El El
2000 Feb 14
2
Error in the inverse of a diagonal matrix?
I'm new to R, so maybe this issue has been raised before; I still have not been able to read the complete set of past messages sent to the list. I found some weird behaviour that I will explain with a simple example. Let's consider the following block of commands:
> x <- diag(c(1,4,10))
> x
     [,1] [,2] [,3]
[1,]    1    0    0
[2,]    0    4    0
[3,]    0    0   10
> invx <- x^-1
> invx
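The likely explanation: ^ is element-wise in R, so x^-1 takes the reciprocal of every entry and the zero off-diagonal elements become Inf rather than producing the matrix inverse. A short sketch of the distinction:

  x <- diag(c(1, 4, 10))
  x^-1                     # element-wise reciprocal: off-diagonal zeros become Inf
  solve(x)                 # true matrix inverse
  diag(1 / c(1, 4, 10))    # equivalent shortcut for a diagonal matrix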
2012 Oct 17
1
Random Forest for multiple categorical variables
Dear all, I have the following data set. V1 V2 V3 V4 V5 V6 V7 V8 V9 V10 alpha beta 1 11 1 11 1 11 1 11 1 11 alpha beta1 2 12 2 12 2 12 2 12 2 12 alpha beta1 3 13 3 13 3 13 3 13 3 13 alpha beta1 4 14 4 14 4 14 4 14 4 14 alpha beta1 5 15 5 15 5 15 5 15 5
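The post is cut off before the actual question, but for the record randomForest() handles categorical predictors (and a categorical response) natively as long as they are stored as factors. A minimal sketch with made-up data, not the data set from the post:

  library(randomForest)
  set.seed(1)
  df <- data.frame(V1 = factor(sample(c("alpha", "gamma"), 100, TRUE)),
                   V2 = factor(sample(paste0("beta", 1:3), 100, TRUE)),
                   V3 = rnorm(100),
                   y  = factor(sample(c("yes", "no"), 100, TRUE)))
  rf <- randomForest(y ~ ., data = df, ntree = 200)
  print(rf)           # confusion matrix; factor columns are handled natively
  importance(rf)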
2005 Nov 09
5
How to find statistics like that.
Hi there, Suppose mu is constant, and the error is normally distributed with mean 0 and fixed variance s. I need to find a statistic for the model Y_i = mu + beta1*I1_i + beta2*I2_i + beta3*I1_i*I2_i + error, where I_i is 1 if Y_i is from group A, and 0 if Y_i is from group B, such that: it is large when beta1 = beta2 = 0, and it is small when beta1 and/or beta2 is not equal to 0. How can I find it with R? Thank you very much
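The description matches the p-value of a joint (partial F) test of the two main effects: comparing nested lm() fits with anova() gives a value that tends to be large when beta1 = beta2 = 0 and small otherwise. A sketch with simulated data, since no data is given in the post:

  set.seed(1)
  I1 <- rbinom(100, 1, 0.5); I2 <- rbinom(100, 1, 0.5)
  Y  <- 2 + 0.5 * I1 * I2 + rnorm(100)      # here beta1 = beta2 = 0, beta3 = 0.5
  full    <- lm(Y ~ I1 * I2)                # mu, beta1, beta2, beta3
  reduced <- lm(Y ~ I1:I2)                  # drops the beta1 and beta2 main effects
  anova(reduced, full)                      # partial F-test of H0: beta1 = beta2 = 0
  # The p-value in the last column tends to be large under beta1 = beta2 = 0
  # and small when either main effect is nonzero.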
2009 Jul 12
2
Nonlinear Least Squares nls() programming help
Hi, I am trying to use the nls() function to closely approximate a vector of values, colC, and I'm running into trouble. I am not sure whether I am asking the program to do what I think it is doing, because the same minimization in Excel's Solver does not run into problems. If anyone can tell me what is going wrong, and why I'm getting a singular convergence(7) error, please tell me. I
2009 Jul 01
2
Difficulty in calculating MLE through NLM
Hi R-friends, Attached is the SAS XPORT file that I have imported into R using the following code: library(foreign) mydata<-read.xport("C:\\ctf.xpt") print(mydata) I am trying to maximize logL in order to find the Maximum Likelihood Estimates (MLE) of 5 parameters (alpha1, beta1, alpha2, beta2, p) using the nlm() function in R as follows. # Defining Log likelihood - In the function it is noted as
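The likelihood itself is truncated, so here is only a sketch of the nlm() pattern: write the negative log-likelihood (nlm() minimises) and transform parameters so the optimiser stays in a valid region. The two-component gamma mixture below is purely hypothetical and just matches the parameter names in the post:

  set.seed(1)
  x <- c(rgamma(150, shape = 2, rate = 1), rgamma(50, shape = 6, rate = 0.5))
  negll <- function(par) {
    alpha1 <- exp(par[1]); beta1 <- exp(par[2])   # exp() keeps shapes/rates positive
    alpha2 <- exp(par[3]); beta2 <- exp(par[4])
    p      <- plogis(par[5])                      # keep the mixing weight in (0, 1)
    dens <- p * dgamma(x, alpha1, beta1) + (1 - p) * dgamma(x, alpha2, beta2)
    -sum(log(dens))                               # nlm() minimises, so return -logL
  }
  fit <- nlm(negll, p = c(0, 0, 1.5, -0.5, 0), hessian = TRUE)
  exp(fit$estimate[1:4]); plogis(fit$estimate[5])   # back-transformed MLEs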
2012 Dec 04
1
WinBUGS from R
Hi, I am trying to convert a WinBUGS model into R code. Here is the WinBUGS code:
model {
  # model's likelihood
  for (i in 1:n) {
    time[i] ~ dnorm( mu[i], tau )   # stochastic component
    # link and linear predictor
    mu[i] <- beta0 + beta1 * cases[i] + beta2 * distance[i]
  }
  # prior distributions
  tau ~ dgamma( 0.01, 0.01 )
  beta0 ~ dnorm( 0.0, 1.0E-4)
  beta1 ~ dnorm( 0.0, 1.0E-4)
  beta2 ~ dnorm( 0.0, 1.0E-4)
  #
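With these flat-ish priors the model is an ordinary normal linear regression, so the quick non-Bayesian translation is lm(); for a fully Bayesian fit from R one can keep the BUGS code and run it through a package such as R2WinBUGS or rjags rather than rewriting it by hand. A sketch with hypothetical data standing in for time, cases, and distance:

  n <- 25
  cases <- rpois(n, 8); distance <- runif(n, 100, 800)
  time  <- 10 + 1.5 * cases + 0.01 * distance + rnorm(n, sd = 2)
  # Non-Bayesian equivalent of the likelihood part of the BUGS model
  summary(lm(time ~ cases + distance))
  # Bayesian route (sketch): keep the BUGS model string and call it via,
  # e.g., R2WinBUGS::bugs() or rjags::jags.model() from within R.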
2005 Feb 27
2
Help with constrained optimization
Dear all, I need advice on the following problem. I have to maximize two functions of the form f1(x)=f(y1,x,alpha1,beta1) and f2(x)=f(y2,x,alpha2,beta2); the maximization is with respect to alpha1, alpha2, beta1, beta2. I can maximize each function separately using nlm. The problem is that I have to add a constraint of the form g(alpha1)=g(alpha2). The total number of parameters is
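One standard way to handle an equality constraint that nlm()/optim() do not support directly is to fold it into a joint objective as a quadratic penalty (or to reparameterise so the constraint holds by construction). A sketch with hypothetical f and g, since the actual functions are not shown in the post:

  # Hypothetical objective pieces; the real f1, f2 and g are not given in the post.
  f <- function(y, x, alpha, beta) -sum((y - alpha - beta * x)^2)   # to be maximised
  g <- function(alpha) alpha^2                                      # hypothetical constraint function
  set.seed(1)
  x <- rnorm(50); y1 <- 1 + 2*x + rnorm(50); y2 <- -1 + 2*x + rnorm(50)
  rho <- 1e3                     # penalty weight; increase until the constraint holds
  obj <- function(par) {
    alpha1 <- par[1]; beta1 <- par[2]; alpha2 <- par[3]; beta2 <- par[4]
    -(f(y1, x, alpha1, beta1) + f(y2, x, alpha2, beta2)) +   # minimise the negative
      rho * (g(alpha1) - g(alpha2))^2                        # penalty for violating g(alpha1)=g(alpha2)
  }
  res <- optim(c(0, 1, 0, 1), obj, method = "BFGS")
  res$par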