Displaying 20 results from an estimated 509 matches for "ridging".
2009 Aug 01
2
Cox ridge regression
Hello,
I have questions regarding penalized Cox regression using the survival
package (functions coxph() and ridge()). I am using R 2.8.0 on Ubuntu
Linux with survival package version 2.35-4.
Question 1. Consider the following example from help(ridge):
> fit1 <- coxph(Surv(futime, fustat) ~ rx + ridge(age, ecog.ps, theta=1), ovarian)
As I understand, this builds a model in which `rx' is
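For reference, the help(ridge) example quoted above runs as follows: `rx` enters the model unpenalized, while `age` and `ecog.ps` share one ridge penalty with theta = 1.

```r
library(survival)

# rx is unpenalized; age and ecog.ps share a ridge penalty with theta = 1
fit1 <- coxph(Surv(futime, fustat) ~ rx + ridge(age, ecog.ps, theta = 1),
              data = ovarian)

coef(fit1)   # the penalized terms are shrunk toward zero
```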
2017 May 04
4
lm() gives different results to lm.ridge() and SPSS
...ate between including or deleting ".ridge" in the function call, and watch the coefficient for the interaction change.)
What seems slightly strange to me here is that I assumed that lm.ridge() just piggybacks on lm() anyway, so in the specific case where lambda=0 and there is no "ridging" to do, I'd expect exactly the same results.
Unfortunately there are 34,000 cases in the dataset, so a "minimal" reprex will not be easy to make, but I can share the data via Dropbox or something if that would help.
I appreciate that when there is strong collinearity then...
2017 May 04
2
lm() gives different results to lm.ridge() and SPSS
> ...".ridge" in the function call, and watch the coefficient for the interaction
> change.)
>
> What seems slightly strange to me here is that I assumed that lm.ridge() just
> piggybacks on lm() anyway, so in the specific case where lambda=0 and there
> is no "ridging" to do, I'd expect exactly the same results.
>
> Unfortunately there are 34,000 cases in the dataset, so a "minimal" reprex will
> not be easy to make, but I can share the data via Dropbox or something if that
> would help.
>
> I apprecia...
2005 Aug 24
1
lm.ridge
Hello, I posted this mail a few days ago but did it wrong; I hope it is
right now:
I have some doubts related to lm.ridge from the MASS package, which I
will show using the Longley example:
First: I think the coefficients from lm(Employed~.,data=longley) should
equal those from lm.ridge(Employed~.,data=longley, lambda=0). Why does
this not happen?
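A minimal version of that comparison, using the built-in longley data: with lambda = 0 there is no shrinkage, so coef() from lm.ridge() should agree with lm() up to numerical precision (lm.ridge centers and scales internally, then transforms the coefficients back to the original scale).

```r
library(MASS)

fit_ols   <- lm(Employed ~ ., data = longley)
fit_ridge <- lm.ridge(Employed ~ ., data = longley, lambda = 0)

# differences should be at numerical-noise level
max(abs(coef(fit_ols) - coef(fit_ridge)))
```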
2009 Mar 17
1
Likelihood of a ridge regression (lm.ridge)?
Dear all,
I want to get the likelihood (or AIC or BIC) of a ridge regression model
fitted with lm.ridge from the MASS package, yet I can't really find it. As
lm.ridge does not return a standard fit object, it doesn't work with
functions such as BIC (nlme package). Is there a way around this? I would
calculate it myself, but I'm not sure how to do that for a ridge regression.
Thank you in
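One common workaround (a sketch, not an lm.ridge feature): treat the ridge fit as a linear smoother, compute the Gaussian log-likelihood from the residuals, and use the effective degrees of freedom tr(H) = sum(d_i^2 / (d_i^2 + lambda)) in place of the parameter count. The helper ridge_aic below is hypothetical, and its scaling differs slightly from lm.ridge's internal divisor.

```r
library(MASS)

ridge_aic <- function(formula, data, lambda) {
  fit  <- lm.ridge(formula, data = data, lambda = lambda)
  X    <- model.matrix(formula, data)      # includes the intercept column
  y    <- model.response(model.frame(formula, data))
  yhat <- drop(X %*% coef(fit))            # coef() is on the original scale
  n    <- length(y)
  rss  <- sum((y - yhat)^2)
  d2   <- svd(scale(X[, -1]))$d^2          # spectrum of the centered/scaled design
  edf  <- sum(d2 / (d2 + lambda)) + 1      # +1 for the unpenalized intercept
  loglik <- -n / 2 * (log(2 * pi * rss / n) + 1)
  c(logLik = loglik, AIC = -2 * loglik + 2 * (edf + 1))
}

res <- ridge_aic(Employed ~ ., longley, lambda = 0.05)
res
```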
2011 Nov 24
3
How to deal with package conflicts
In my genridge package, I define a function ridge() for ridge regression,
creating objects of class 'ridge' that I intend to enhance.
In a documentation example, I want to use some functions from the car
package. However, that package requires survival, which also includes a
ridge() function, for coxph models. So, once I require(car)
my ridge() function is masked, which means I have to
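The usual workaround is explicit namespace qualification: `pkg::fun` resolves a name in a specific namespace regardless of attach order, so masking never bites. A sketch (the genridge line is commented out, since that package may not be installed here):

```r
library(survival)                # survival's ridge() joins the search path

ridge_cox <- survival::ridge     # always the coxph penalty version
# ridge_gen <- genridge::ridge   # hypothetical: genridge's version, if installed

environmentName(environment(ridge_cox))   # "survival"
```

R (>= 3.6) also offers conflictRules() to control which names a package may mask at attach time.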
2011 Aug 06
0
ridge regression - covariance matrices of ridge coefficients
For an application of ridge regression, I need to get the covariance
matrices of the estimated regression coefficients, in addition to the
coefficients themselves, for all values of the ridge constant, lambda.
I've studied the code in MASS:::lm.ridge, but don't see how to do this
because the code is vectorized using one svd calculation. The relevant
lines from lm.ridge, using X, Y are:
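One way around the vectorized code (a sketch under the usual fixed-lambda sampling model): for each lambda, Var(b) = sigma^2 (X'X + lambda I)^{-1} X'X (X'X + lambda I)^{-1}, which the SVD of the scaled design gives directly as sigma^2 V diag(d^2/(d^2+lambda)^2) V'. The helper cov_ridge is hypothetical, and its scaling differs slightly from lm.ridge's internal divisor.

```r
cov_ridge <- function(X, y, lambda) {
  Xs <- scale(X)                     # center and scale the predictors
  yc <- y - mean(y)
  s  <- svd(Xs)
  d2 <- s$d^2
  # ridge coefficients on the scaled predictors
  b  <- s$v %*% ((s$d / (d2 + lambda)) * crossprod(s$u, yc))
  # residual variance, using the effective degrees of freedom
  sigma2 <- sum((yc - Xs %*% b)^2) / (nrow(Xs) - sum(d2 / (d2 + lambda)))
  # sigma^2 V diag(d^2 / (d^2 + lambda)^2) V'
  sigma2 * s$v %*% (d2 / (d2 + lambda)^2 * t(s$v))
}

X <- as.matrix(longley[, -7])        # predictors; column 7 is Employed
y <- longley$Employed
V <- cov_ridge(X, y, lambda = 0.1)
```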
2013 Mar 31
1
Rock Ridge for core/fs/iso9660
Hi,
I now have a retriever of Rock Ridge names from ISO directory
records and their eventual Continuation Areas.
Further, I have a detector for SUSP and Rock Ridge signatures.
Both have been tested in libisofs by comparing their results with
the Rock Ridge info as perceived by the library.
50 ISO images tested; some bugs repaired. Now they are in sync.
(The macro case
2017 May 05
6
lm() gives different results to lm.ridge() and SPSS
...on call, and watch the coefficient for the interaction
>> change.)
>>
>> What seems slightly strange to me here is that I assumed that lm.ridge() just
>> piggybacks on lm() anyway, so in the specific case where lambda=0 and there
>> is no "ridging" to do, I'd expect exactly the same results.
>>
>> Unfortunately there are 34,000 cases in the dataset, so a "minimal" reprex will
>> not be easy to make, but I can share the data via Dropbox or something if that
>> would help.
>>
>...
2017 May 05
1
lm() gives different results to lm.ridge() and SPSS
...on call, and watch the coefficient for the interaction
>>> change.)
>>>
>>> What seems slightly strange to me here is that I assumed that lm.ridge() just
>>> piggybacks on lm() anyway, so in the specific case where lambda=0 and there
>>> is no "ridging" to do, I'd expect exactly the same results.
>>>
>>> Unfortunately there are 34,000 cases in the dataset, so a "minimal" reprex will
>>> not be easy to make, but I can share the data via Dropbox or something if that
>>> would help.
>>...
2007 Apr 12
1
Question on ridge regression with R
Hi,
I am working on a project about hospital efficiency. Due to the high
multicollinearity of the data, I want to fit the model using ridge
regression. However, I believe that the data from large hospitals
(indicated by the number of patients they treat a year) are more accurate
than those from small hospitals, and I want to put more weight on them.
How do I do this with lm.ridge?
I know I just need
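lm.ridge() itself has no weights argument, so one option is to solve the weighted ridge problem directly: b = (X'WX + lambda I)^{-1} X'Wy on centered, scaled predictors. A sketch with a hypothetical helper and stand-in weights (longley is used only as a placeholder dataset; `w` stands in for the per-hospital accuracy weights):

```r
weighted_ridge <- function(X, y, w, lambda) {
  Xs <- scale(X)                          # sketch: lm.ridge uses a slightly
                                          # different scaling divisor internally
  yc <- y - weighted.mean(y, w)
  A  <- crossprod(Xs, w * Xs) + diag(lambda, ncol(Xs))
  drop(solve(A, crossprod(Xs, w * yc)))   # coefficients on the scaled predictors
}

X <- as.matrix(longley[, -7])             # predictors; Employed is column 7
y <- longley$Employed
w <- longley$Population / mean(longley$Population)  # hypothetical case weights
b <- weighted_ridge(X, y, w, lambda = 1)
```

With unit weights and lambda = 0 this reduces to ordinary least squares on the scaled, centered data, which is a handy sanity check.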
2017 May 05
0
lm() gives different results to lm.ridge() and SPSS
...on call, and watch the coefficient for the interaction
>> change.)
>>
>> What seems slightly strange to me here is that I assumed that lm.ridge() just
>> piggybacks on lm() anyway, so in the specific case where lambda=0 and there
>> is no "ridging" to do, I'd expect exactly the same results.
>>
>> Unfortunately there are 34,000 cases in the dataset, so a "minimal" reprex will
>> not be easy to make, but I can share the data via Dropbox or something if that
>> would help.
>>
>...
2010 Dec 09
1
survival: ridge log-likelihood workaround
Dear all,
I need to calculate a likelihood ratio test for ridge regression. In February I reported a bug where coxph returns the unpenalized log-likelihood for the final beta estimates of a ridge coxph regression. In high-dimensional settings, ridge regression models usually fail for lower values of lambda. As a result, in such settings the ridge regressions have higher values of lambda (e.g.
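A possible workaround, assuming the penalty form theta/2 * sum(beta^2) documented for ridge(): reconstruct the penalized log-likelihood from the fitted coefficients. Whether fit$loglik[2] is already penalized may depend on the survival version, so treat this as a sketch to check against your installation.

```r
library(survival)

theta <- 1
fit <- coxph(Surv(futime, fustat) ~ rx + ridge(age, ecog.ps, theta = theta),
             data = ovarian)

b_pen      <- coef(fit)[grep("^ridge", names(coef(fit)))]  # penalized coefficients
penalty    <- theta / 2 * sum(b_pen^2)                     # theta/2 * sum(beta^2)
loglik_pen <- fit$loglik[2] - penalty                      # penalized log-likelihood
```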
2009 Jun 04
0
help needed with ridge regression and choice of lambda with lm.ridge!!!
Hi,
I'm a beginner in the field. I have to perform ridge regression with lm.ridge for many datasets, and I want to do it in an automatic way.
How can I automatically choose lambda?
As said, right now I'm using the lm.ridge function from MASS, which I find quite simple and fast, and I've seen that among the returned values there are the HKB estimate of the ridge constant and the L-W
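Those returned components make an automatic choice straightforward: fit over a lambda grid and take the GCV minimizer, or use the HKB / L-W estimates directly. A sketch on the built-in longley data:

```r
library(MASS)

lambdas <- seq(0, 10, by = 0.01)
fit <- lm.ridge(Employed ~ ., data = longley, lambda = lambdas)

lambda_gcv <- lambdas[which.min(fit$GCV)]  # grid point minimizing GCV
c(GCV = lambda_gcv, HKB = fit$kHKB, LW = fit$kLW)
```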
2017 May 05
0
lm() gives different results to lm.ridge() and SPSS
..." in the function call, and watch the coefficient for the interaction
>> change.)
>>
>> What seems slightly strange to me here is that I assumed that lm.ridge() just
>> piggybacks on lm() anyway, so in the specific case where lambda=0 and there
>> is no "ridging" to do, I'd expect exactly the same results.
>>
>> Unfortunately there are 34,000 cases in the dataset, so a "minimal" reprex will
>> not be easy to make, but I can share the data via Dropbox or something if that
>> would help.
>>
>> I ap...
2017 May 04
0
lm() gives different results to lm.ridge() and SPSS
...".ridge" in the function call, and watch the coefficient for the interaction
> change.)
>
>
>
> What seems slightly strange to me here is that I assumed that lm.ridge() just
> piggybacks on lm() anyway, so in the specific case where lambda=0 and there
> is no "ridging" to do, I'd expect exactly the same results.
>
>
> Unfortunately there are 34,000 cases in the dataset, so a "minimal" reprex will
> not be easy to make, but I can share the data via Dropbox or something if that
> would help.
>
>
>
> I appreciate t...
2010 Dec 02
0
survival - summary and score test for ridge coxph()
It seems to me that summary for ridge coxph() prints a summary but returns NULL. It is not a big issue, because one can calculate the statistics directly from a coxph.object. However, for some reason the score test is not calculated for ridge coxph(), i.e. neither the score nor the rscore component is included in the coxph object when ridge is specified. Please find the code below. I use R 2.9.2 with version 2.35-4
2017 May 05
1
lm() gives different results to lm.ridge() and SPSS
...>>>> change.)
>>>>
>>>> What seems slightly strange to me here is that I assumed that lm.ridge() just
>>>> piggybacks on lm() anyway, so in the specific case where lambda=0 and there
>>>> is no "ridging" to do, I'd expect exactly the same results.
>>>>
>>>> Unfortunately there are 34,000 cases in the dataset, so a "minimal" reprex will
>>>> not be easy to make, but I can share the data via Dropbox or
>>>> someth...
2010 Feb 16
1
survival - ratio likelihood for ridge coxph()
It seems to me that R returns the unpenalized log-likelihood for the likelihood ratio test when a ridge Cox proportional hazards model is fitted. Is this as expected?
In the example below, if I am not mistaken, fit$loglik[2] is the unpenalized log-likelihood for the final coefficient estimates. I would expect to get the penalized log-likelihood, and I would like to check whether this is as expected.
2017 May 05
0
lm() gives different results to lm.ridge() and SPSS
...ion
>>>> change.)
>>>>
>>>> What seems slightly strange to me here is that I assumed that lm.ridge() just
>>>> piggybacks on lm() anyway, so in the specific case where lambda=0 and there
>>>> is no "ridging" to do, I'd expect exactly the same results.
>>>>
>>>> Unfortunately there are 34,000 cases in the dataset, so a "minimal" reprex will
>>>> not be easy to make, but I can share the data via Dropbox or
>>>> something...