Displaying 20 results from an estimated 67 matches for "crossvalid".
2023 Oct 22
1
running crossvalidation many times MSE for Lasso regression
Dear R-experts,
Below is my R code with an error message. Can somebody help me fix this error?
I really appreciate your help.
Best,
############################################################
# MSE CROSSVALIDATION Lasso regression
library(glmnet)
x1=c(34,35,12,13,15,37,65,45,47,67,87,45,46,39,87,98,67,51,10,30,65,34,57,68,98,86,45,65,34,78,98,123,202,231,154,21,34,26,56,78,99,83,46,58,91)
x2=c(1,3,2,4,5,6,7,3,8,9,10,11,12,1,3,4,2,3,4,5,4,6,8,7,9,4,3,6,7,9,8,4,7,6,1,3,2,5,6,8,7,1,1,2,9)
y=c(2,6,5,4,...
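Not the poster's fix, but a minimal sketch of one way to run the cross-validation many times and average the MSE with cv.glmnet; it assumes x1, x2 and y are complete numeric vectors of the same length (y is truncated above).

library(glmnet)

x <- cbind(x1, x2)                 # glmnet expects a predictor matrix
set.seed(1)
mse <- replicate(100, {            # repeat the random CV split 100 times
  cv <- cv.glmnet(x, y, alpha = 1) # alpha = 1 gives the lasso penalty
  min(cv$cvm)                      # cross-validated MSE at the best lambda
})
mean(mse)                          # average MSE over the repeated runs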
2023 Oct 22
2
running crossvalidation many times MSE for Lasso regression
...a R-help
<r-help at r-project.org> wrote:
>
> Dear R-experts,
>
> Here below my R code with an error message. Can somebody help me to fix this error?
> Really appreciate your help.
>
> Best,
>
> ############################################################
> # MSE CROSSVALIDATION Lasso regression
>
> library(glmnet)
>
>
> x1=c(34,35,12,13,15,37,65,45,47,67,87,45,46,39,87,98,67,51,10,30,65,34,57,68,98,86,45,65,34,78,98,123,202,231,154,21,34,26,56,78,99,83,46,58,91)
> x2=c(1,3,2,4,5,6,7,3,8,9,10,11,12,1,3,4,2,3,4,5,4,6,8,7,9,4,3,6,7,9,8,4,7,6,1,3,2,5,6,...
2010 Jun 04
0
glmpath crossvalidation
Hi all,
I'm relatively new to using R, and have been trying to fit an L1
regularization path using coxpath from the glmpath library.
I'm interested in using a cross-validation framework, where I cross-validate
on a training set to select the lambda that achieves the lowest error, then
use that value of lambda on the entire training set, before applying it to a
test set. This seems to entail somehow using cv.coxpath, inspecting the
cv.error attribute, then using the corresponding lambda in coxpath.
H...
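A rough sketch of that workflow follows. The list layout and the argument names of cv.coxpath are assumptions to verify against ?cv.coxpath, and Xtrain/time_train/status_train are hypothetical placeholders for the poster's training data.

library(glmpath)

# Hypothetical survival training data in the list form coxpath expects: x, time, status.
train <- list(x = as.matrix(Xtrain), time = time_train, status = status_train)

cv   <- cv.coxpath(train, nfold = 5, plot.it = FALSE)  # CV error along the path
best <- cv$fraction[which.min(cv$cv.error)]            # path point with lowest CV error

fit  <- coxpath(train)                                 # refit on the entire training set
# 'best' can then be passed to predict.coxpath (via its s / mode arguments) to apply
# the selected model to a held-out test set; see ?predict.coxpath.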
2012 Mar 01
1
GLM with regularization
...new to R.
Do any packages have something like glm + regularization? So far the closest
thing I see is ridge regression in MASS, but I think I need something like
GLM, in particular binomial regularized versions of polynomial regression.
Also, I am not sure how some of the K-fold cross-validation helpers out
there (cv.glm) could be used to tune the regularization rate, as there seems
to be no way to apply them to data not used for training (or I am not seeing
a solution here, as training is completely separated from the
cross-validation error computation).
The example in cv.glm doesn't...
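glmnet is one option here. Below is a hedged sketch (not from the thread) of a binomial lasso fit on a polynomial expansion, tuned by K-fold CV on a training split and scored on the held-out rows; the mtcars columns are only stand-ins for real data.

library(glmnet)

# Stand-in data: a 0/1 response and one numeric predictor.
dat <- data.frame(y = mtcars$am, x = mtcars$hp)

set.seed(1)
idx <- sample(nrow(dat), floor(0.7 * nrow(dat)))     # training rows
Xtr <- poly(dat$x[idx],  degree = 3, raw = TRUE)     # polynomial design matrix (train)
Xte <- poly(dat$x[-idx], degree = 3, raw = TRUE)     # same expansion for held-out rows

cv <- cv.glmnet(Xtr, dat$y[idx], family = "binomial", nfolds = 5)  # CV picks lambda
p  <- predict(cv, newx = Xte, s = "lambda.min", type = "response")
mean((p > 0.5) != dat$y[-idx])                       # held-out misclassification rate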
2006 Feb 02
0
crossvalidation in svm regression in e1071 gives incorrect results (PR#8554)
Full_Name: Noel O'Boyle
Version: 2.1.0
OS: Debian GNU/Linux Sarge
Submission from: (NULL) (131.111.8.96)
(1) Description of error
The 10-fold CV option for the svm function in e1071 appears to give incorrect
results for the rmse.
The example code in (3) uses the example regression data in the svm
documentation. The rmse for internal prediction is 0.24. It is expected the
10-fold CV rmse
2006 Feb 02
0
crossvalidation in svm regression in e1071 gives incorre ct results (PR#8554)
1. This is _not_ a bug in R itself. Please don't use R's bug reporting
system for contributed packages.
2. This is _not_ a bug in svm() in `e1071'. I believe you forgot to take
sqrt.
3. You really should use the `tot.MSE' component rather than the mean of
the `MSE' component, but this is only a very small difference.
So, instead of spread[i] <- mean(mysvm$MSE), you
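Put together, the corrected computation this reply points at would look roughly like the following (with stand-in data rather than the report's example):

library(e1071)

mysvm <- svm(mpg ~ ., data = mtcars, cross = 10)  # stand-in regression with 10-fold CV
sqrt(mysvm$tot.MSE)                               # take sqrt of tot.MSE, not mean(MSE)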
2010 Jan 01
1
Questions about SVM
Hi everyone,
Can someone please help me with these questions?
1) If I use cross-validation with svm, do I have to use this equation to calculate the RMSE?
mymodel <- svm(myformula,data=mydata,cross=10)
sqrt(mean(mymodel$MSE))
But if I don’t use crossvalidation, I have to use the following to calculate RMSE:
mymodel <- svm(myformula,data=mydata)...
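For the second case there is no MSE component on the fitted object; one common approach, sketched here with stand-in data (the poster's myformula, mydata and response column would substitute in), is to compute the RMSE from predictions:

library(e1071)

mydata  <- mtcars                                  # stand-in for the poster's data
mymodel <- svm(mpg ~ ., data = mydata)             # no cross= argument
pred    <- predict(mymodel, mydata)                # in-sample predictions
sqrt(mean((pred - mydata$mpg)^2))                  # plain RMSE against the response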
2013 Nov 06
3
Nonnormal Residuals and GAMs
Greetings, My question is more algorithmic than practical. What I am
trying to determine is, are the GAM algorithms used in the mgcv package
affected by nonnormally-distributed residuals?
As I understand the theory of linear models the Gauss-Markov theorem
guarantees that least-squares regression is optimal over all unbiased
estimators iff the data meet the conditions linearity,
2009 Aug 21
1
LASSO: glmpath and cv.glmpath
...y
predicting Cancer or Noncancer. With a lasso model
fit.glm <- glmpath(x=as.matrix(X), y=target, family="binomial")
(target is 0, 1 <- Cancer non cancer, X the proteins, numerical in
expression), I get the following path (PICTURE 1).
One of these models is the best, according to its cross-validation
(PICTURE 2); the red line corresponds to the best cross-validation error. It's
produced by
cv <- cv.glmpath(x=as.matrix(X), y=unclass(T)-1, family="binomial", type
="response", plot.it=TRUE, se=TRUE)
abline(v= cv$fraction[max(which(cv$cv.error==min(cv$cv.error)))],
col=&quo...
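A possible continuation (not in the thread): take the fraction with the smallest CV error and predict from the fitted path at that point. cv, fit.glm and X are the objects defined above; the s/mode/type arguments follow predict.glmpath's documented interface, so verify them with ?predict.glmpath.

best.fraction <- cv$fraction[which.min(cv$cv.error)]   # path point with lowest CV error

pred <- predict(fit.glm, newx = as.matrix(X), s = best.fraction,
                mode = "norm.fraction", type = "response")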
2023 Oct 23
1
running crossvalidation many times MSE for Lasso regression
...> >> this error?
> >> >> Really appreciate your help.
> >> >>
> >> >> Best,
> >> >>
> >> >> ############################################################
> >> >> # MSE CROSSVALIDATION Lasso regression
> >> >>
> >> >> library(glmnet)
> >> >>
> >> >>
> >> >>
> >> x1=c(34,35,12,13,15,37,65,45,47,67,87,45,46,39,87,98,67,51,10,30,65,34,57,68,98,86,45,65,34,78,98,123,...
2023 Oct 23
2
running crossvalidation many times MSE for Lasso regression
...> >> this error?
> >> >> Really appreciate your help.
> >> >>
> >> >> Best,
> >> >>
> >> >> ############################################################
> >> >> # MSE CROSSVALIDATION Lasso regression
> >> >>
> >> >> library(glmnet)
> >> >>
> >> >>
> >> >>
> >> x1=c(34,35,12,13,15,37,65,45,47,67,87,45,46,39,87,98,67,51,10,30,65,34,57,68,98,86,45,65,34,78,98,123,...
2023 Oct 24
1
running crossvalidation many times MSE for Lasso regression
...> >> >> Really appreciate your help.
>> >> >>
>> >> >> Best,
>> >> >>
>> >> >> ############################################################
>> >> >> # MSE CROSSVALIDATION Lasso regression
>> >> >>
>> >> >> library(glmnet)
>> >> >>
>> >> >>
>> >> >>
>> >> x1=c(34,35,12,13,15,37,65,45,47,67,87,45,46,39,87,98,67,51,10,30,6...
2012 Aug 08
6
R versus SAS
I found this on CrossValidated:
"A medical statistician once told me, that they use SAS because if
they make mistakes due to software bugs and it comes to lawsuits, SAS
will recompensate them. R comes without warranty."
Kjetil
2013 May 12
1
Multinomial-Dirichlet using R
Hi:
I have asked this question on Cross-Validated, so this might be a cross-post,
but I haven't received any responses to it.
I am trying to see which distribution will best fit the data I am working
on. The dataset is as follows:
Site  Nausea  Headache  Abdominal Distension
   1      17         5                    10
   2      12
2019 May 25
3
Increasing number of observations worsen the regression model
...squared: 7.038e-05,    Adjusted R-squared: 3.705e-05
F-statistic: 2.112 on 1 and 29998 DF,  p-value: 0.1462
```
The strange thing is that the code works perfectly for N=200 or N=2000.
It's only for larger N that this happens (for example, N=20000). I
have tried asking on CrossValidated
<https://stats.stackexchange.com/questions/410050/increasing-number-of-observations-worsen-the-regression-model>
but the code works for them. Any help?
I am running R 3.6.0 on Kubuntu 19.04
Best regards
Raffaele
2006 Feb 16
1
reg cross validation in svm
Hi
My name is Karthikeyan.
I am using svm in R for my data set.
My data set contains 60 financial ratios as variables, and I want to classify
the observations into groups of good and bad.
I want to know how to do cross-validation for the svm.
First I do the modelling, then I predict and calculate the probabilities.
How can I do the cross-validation, and how can I plot the svm for these
variables?
Waiting for your reply,
thanking you,
Karthik
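One way to get both a cross-validated accuracy and a plot from e1071 is sketched below with stand-in two-class data; the poster's 60 financial ratios would replace the iris columns, and plot.svm can only display two predictors at a time.

library(e1071)

dat <- subset(iris, Species != "setosa")       # stand-in two-class data
dat$Species <- factor(dat$Species)             # two classes, analogous to good/bad

model <- svm(Species ~ ., data = dat, cross = 10)   # cross = 10 adds 10-fold CV to the fit
model$accuracies                                     # per-fold CV accuracy
model$tot.accuracy                                   # overall 10-fold CV accuracy

# plot.svm shows the decision regions for two chosen variables, holding the
# remaining ones fixed at the values given in 'slice'.
plot(model, dat, Petal.Length ~ Petal.Width,
     slice = list(Sepal.Length = 6, Sepal.Width = 3))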
2009 Jul 09
1
validate() in Design library
Hi, another question about validate() in the Design library. The argument "B" of this function is the number of repetitions for method="bootstrap", which is easy to understand; but for method="crossvalidation", B is the number of groups of omitted observations. This is confusing; I don't understand what it means. Let's say 5-fold cross-validation: all samples are divided into 5 groups with an equal number of samples, 4 groups will be used for training, and the model developed there will be te...
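For context, a sketch of how B is passed in the two cases, using lrm from the same modelling family; the data and formula are stand-ins, and Design's validate() lives on in its successor package rms.

library(rms)                                   # successor to the Design library

fit <- lrm(am ~ mpg + wt, data = mtcars, x = TRUE, y = TRUE)  # x, y kept for resampling

validate(fit, method = "boot",            B = 40)  # 40 bootstrap repetitions
validate(fit, method = "crossvalidation", B = 5)   # 5 groups of omitted observations (5-fold CV)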
2009 Oct 28
1
Data Partition Package
Hi, Users,
I am a new user. I am trying to partition data into training and test sets. Is
there any R package or function that can partition a dataset? Also, is there
any package that does cross-validation? Any help will be appreciated.
Best,
Pat
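A base-R sketch of one way to do the partition (no extra package needed); for cross-validation itself, cv.glm in the boot package is one option, as mentioned in other threads above.

dat <- mtcars                                            # stand-in for any data frame
set.seed(42)
train_idx <- sample(nrow(dat), floor(0.7 * nrow(dat)))   # 70% of rows for training
train <- dat[train_idx, ]
test  <- dat[-train_idx, ]

# A simple K-fold assignment for cross-validation, also in base R:
k <- 5
folds <- sample(rep(1:k, length.out = nrow(dat)))        # fold label for each row
# rows with folds == j form the held-out set in fold j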
2008 Jun 25
1
LDA on pre-assigned training and testing data sets
Dear r-help
I am trying to run LDA on a training data set, and test it on another data set with the same variables. I found examples using cross-validation, and using training and testing data sets set up with sample(), but not when they are pre-assigned.
Here is what I tried
# FIRST SET UP A DATAFRAME WITH ALL THE DATA AND CREATE NEW VARIABLES
traintest1 <- arnaudnognod1[arnaudnognod1$DISC_USE1 == 1.01|arnaudnognod1$DISC_USE1 == 1.03|arnaudn...
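For what it's worth, a minimal sketch of LDA with pre-assigned training and test sets using MASS; iris split in halves is only a stand-in for the poster's two data frames with the same variables.

library(MASS)

train <- iris[seq(1, nrow(iris), by = 2), ]   # stand-in pre-assigned training set
test  <- iris[seq(2, nrow(iris), by = 2), ]   # stand-in pre-assigned test set

fit  <- lda(Species ~ ., data = train)        # fit on the training set only
pred <- predict(fit, newdata = test)          # apply to the pre-assigned test set

table(predicted = pred$class, actual = test$Species)   # confusion matrix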
2011 Jul 04
3
modification of cross-validations in rpart
An embedded text encoded in an unknown character set was scrubbed...
Name: not available
URL : <https://stat.ethz.ch/pipermail/r-help/attachments/20110704/68ecf4d2/attachment.pl>