Displaying 20 results from an estimated 7000 matches similar to: "User defined function and nonlinear least-squares fit"
2008 Jul 23
1
Questions on weighted least squares
Hi all,
I have run into a problem with weighted least squares regression.
1. I simulated a Normal vector (sim1) with mean 425906 and standard deviation 40000.
2. I simulated a second Normal vector with conditional mean b1*sim1, where b1 is just a number I specified, and variance proportional to sim1. Precisely, the standard deviation is sqrt(sim1)*50.
3. Then I ran a WLS regression without the
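A minimal sketch of the simulation described in steps 1-3 (the sample size and the value of b1 below are arbitrary choices for illustration, not values from the original post):
set.seed(1)
n    <- 200
b1   <- 2                                                  # assumed slope, not from the post
sim1 <- rnorm(n, mean = 425906, sd = 40000)                # step 1
sim2 <- rnorm(n, mean = b1 * sim1, sd = sqrt(sim1) * 50)   # step 2: variance proportional to sim1
fit_wls <- lm(sim2 ~ sim1, weights = 1 / sim1)             # step 3: weights proportional to 1/variance
summary(fit_wls)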
2011 Apr 02
3
Plotting MDS (multidimensional scaling)
Hi,
I just encountered what I thought was strange behavior in MDS. However, it
turned out that the mistake was mine. The lesson learned from my mistake is
that one should plot on a square pane when plotting results of an MDS. Not
doing so can be very misleading. Follow the example of an equilateral
triangle below to see what I mean. I hope this helps others to avoid this
kind of headache.
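The triangle example itself is cut off in this excerpt; a minimal sketch of that kind of demonstration (not the poster's exact code), with the key detail being the equal-aspect plot:
pts <- rbind(c(0, 0), c(1, 0), c(0.5, sqrt(3) / 2))   # equilateral triangle, side length 1
d   <- dist(pts)                                      # all pairwise distances equal 1
mds <- cmdscale(d, k = 2)                             # classical MDS
plot(mds, asp = 1,                                    # asp = 1 keeps the geometry undistorted
     xlab = "Dimension 1", ylab = "Dimension 2")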
2017 Dec 20
1
Nonlinear regression
You also need to reply-all so the mailing list stays in the loop.
--
Sent from my phone. Please excuse my brevity.
On December 19, 2017 4:00:29 PM PST, Timothy Axberg <axbergtimothy at gmail.com> wrote:
>Sorry about that. Here is the code typed directly on the email.
>
>qe = (Qmax * Kl * ce) / (1 + Kl * ce)
>
>##The data
>ce <- c(15.17, 42.15, 69.12, 237.7, 419.77)
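The response values are cut off in this excerpt; a minimal sketch of how such a Langmuir-type curve could be fitted with nls() (the qe values and starting values below are made up for illustration):
ce <- c(15.17, 42.15, 69.12, 237.7, 419.77)
qe <- c(0.50, 0.80, 0.95, 1.10, 1.15)               # hypothetical response values
fit <- nls(qe ~ (Qmax * Kl * ce) / (1 + Kl * ce),
           start = list(Qmax = 1.2, Kl = 0.05))     # rough starting guesses
summary(fit)
plot(ce, qe)
curve(coef(fit)["Qmax"] * coef(fit)["Kl"] * x / (1 + coef(fit)["Kl"] * x), add = TRUE)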
2017 Dec 20
0
Nonlinear regression
Should I repost the question with reply-all?
On Tue, Dec 19, 2017 at 6:13 PM, Jeff Newmiller <jdnewmil at dcn.davis.ca.us>
wrote:
> You also need to reply-all so the mailing list stays in the loop.
> --
> Sent from my phone. Please excuse my brevity.
>
> On December 19, 2017 4:00:29 PM PST, Timothy Axberg <
> axbergtimothy at gmail.com> wrote:
> >Sorry about
2012 Nov 08
2
Comparing nonlinear, non-nested models
Dear R users,
Could somebody please help me to find a way of comparing nonlinear, non-nested
models in R, where the number of parameters is not necessarily different? Here
is a sample (growth rates, y, as a function of internal substrate
concentration, x):
x <- c(0.52, 1.21, 1.45, 1.64, 1.89, 2.14, 2.47, 3.20, 4.47, 5.31, 6.48)
y <- c(0.00, 0.35, 0.41, 0.49, 0.58, 0.61, 0.71, 0.83, 0.98,
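The y vector is cut off above; assuming the complete x and y vectors are available, non-nested models with the same number of parameters are commonly compared with information criteria rather than an F test. A minimal sketch (the two candidate model forms are illustrative assumptions):
fit_monod <- nls(y ~ a * x / (b + x),       start = list(a = 1, b = 1))
fit_expon <- nls(y ~ a * (1 - exp(-b * x)), start = list(a = 1, b = 0.5))
AIC(fit_monod, fit_expon)   # the smaller AIC/BIC indicates the better-supported model
BIC(fit_monod, fit_expon)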
2005 Oct 26
1
help with a self-starting function in nonlinear least squares regression.
Hello. I am having a problem setting up a self-starting function for
use in nonlinear regression (and eventually in the mixed model version).
The function is a non-rectangular hyperbola - called "NRhyperbola" -
which is used for fitting leaf photosynthetic rate to light intensity.
It has one independent variable (Irr) and four parameters (theta, Am,
alpha and Rd). I have created this
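A minimal sketch of how such a selfStart model might be written (the model function is the standard non-rectangular hyperbola; the crude starting-value rules and the placeholder names A and leafdata are assumptions, not the poster's code):
NRhyp <- function(Irr, theta, Am, alpha, Rd) {
  (alpha * Irr + Am - sqrt((alpha * Irr + Am)^2 - 4 * theta * alpha * Irr * Am)) /
    (2 * theta) - Rd
}
NRhypInit <- function(mCall, data, LHS, ...) {
  xy <- sortedXyData(mCall[["Irr"]], LHS, data)
  # very crude guesses: Am from the largest response, Rd from the smallest,
  # alpha from the slope between the two lowest light levels, theta fixed at 0.7
  Am    <- max(xy$y)
  Rd    <- max(0.5, -min(xy$y))
  alpha <- (xy$y[2] - xy$y[1]) / (xy$x[2] - xy$x[1])
  setNames(c(0.7, Am, alpha, Rd),
           nm = mCall[c("theta", "Am", "alpha", "Rd")])
}
SSNRhyp <- selfStart(NRhyp, initial = NRhypInit,
                     parameters = c("theta", "Am", "alpha", "Rd"))
# usage: nls(A ~ SSNRhyp(Irr, theta, Am, alpha, Rd), data = leafdata)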
2000 Feb 23
0
Lack of Fit test
> From: "Alan T. Arnholt" <arnholt at math.appstate.edu>
> To: Bill Venables <William.Venables at cmis.CSIRO.AU>
> Cc: r-help at stat.math.ethz.ch, arnholt at math.appstate.edu
> Subject: Re: [R] Lack of Fit test
> Date: Wed, 23 Feb 2000 09:40:21 -0500 (EST)
> X-Authentication: none
>
>
> I guess my question was not adequately stated when I sent
2012 May 28
0
GLMNET AUC vs. MSE
Hello -
I am using glmnet to generate a model for multiple cohorts i. For each i, I
run 5 separate models, each with a different x variable. I want to compare
the fit statistic for each i and x combination.
When I use auc, the output in some cases is < .5 (.49). In addition, if
I compare mean MSE (with upper and lower bounds) ... there is no difference
across my various x variables, but
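A minimal sketch of how the two cross-validation measures might be extracted from cv.glmnet for one cohort and one x variable set (the data below are simulated stand-ins, not the poster's):
library(glmnet)
set.seed(1)
x <- matrix(rnorm(500 * 10), ncol = 10)
y <- rbinom(500, 1, plogis(x[, 1] - 0.5 * x[, 2]))
cv_auc <- cv.glmnet(x, y, family = "binomial", type.measure = "auc")
cv_mse <- cv.glmnet(x, y, family = "binomial", type.measure = "mse")
max(cv_auc$cvm)   # best cross-validated AUC along the lambda path (higher is better)
min(cv_mse$cvm)   # best cross-validated MSE along the path (lower is better)
# an AUC below 0.5 at a given lambda means the model ranks cases worse than chance there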
2009 Feb 12
0
Comparing slopes in two linear models
Hi everyone,
I have a data frame (d), which has the results of mosquito trapping in
three different places.
I suspect that one of these places (Local=='Palm') is biased by low
numbers and will yield shallower slopes in the variance-mean regression over
the areas. I wonder if these slopes are different.
I've looked through the support list for methods for comparing slopes and
found the
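A standard way to test whether the variance-mean slopes differ among places is a single model with a place-by-mean interaction; a minimal sketch, assuming d has columns named mean, variance and Local:
fit_add <- lm(log(variance) ~ log(mean) + Local, data = d)   # common slope, separate intercepts
fit_int <- lm(log(variance) ~ log(mean) * Local, data = d)   # a separate slope for each place
anova(fit_add, fit_int)   # a small p-value indicates the slopes differ among places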
2011 Mar 25
2
A question on glmnet analysis
Hi,
I am trying to do logistic regression on data from 104 patients, with
one outcome (yes or no) and 15 variables (9 categorical factors
[yes or no] and 6 continuous variables). The number of yes outcomes is 25.
Twenty-five events and 15 variables means the events-per-variable ratio is well
below 10. Therefore, I tried to analyze the data with a penalized
regression method. I would like please some of the
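A minimal sketch of a lasso-penalized logistic fit with cv.glmnet for data of this shape (the simulated x and y below are placeholders, not the patient data):
library(glmnet)
set.seed(1)
n <- 104
x <- cbind(matrix(rbinom(n * 9, 1, 0.5), ncol = 9),   # 9 binary factors
           matrix(rnorm(n * 6), ncol = 6))            # 6 continuous variables
y <- rbinom(n, 1, 25 / 104)                           # roughly 25 events
cvfit <- cv.glmnet(x, y, family = "binomial", alpha = 1)   # alpha = 1 is the lasso penalty
coef(cvfit, s = "lambda.1se")   # coefficients retained at a conservative lambda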
2010 Jun 07
1
fit data with y = x^-1
Dear list,
I am getting weird results when fitting data with a 1/x polynomial. Suppose I have
the following data:
x <- c(1,2,3,4,5,6,7)
y <- c(100,20,4,2,1,.3,.1)
I may fit this with a linear model
fit1 = lm(y ~ I(x))
Getting plot out of this model I applied
library(polynom)
pol1 = polynomial(fit1$coefficients)
f1 = as.function(pol1)
plot(x,y)
lines(x, f1(x), col = 2)
Clearly, this model
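If a 1/x relationship is really what is wanted, the inverse term can go straight into the formula instead of a polynomial in x; a minimal sketch using the data above:
fit_inv <- lm(y ~ I(1 / x))     # fits y = a + b/x
plot(x, y)
curve(coef(fit_inv)[1] + coef(fit_inv)[2] / x, add = TRUE, col = 2)
# a power law y = a * x^b can also be checked on the log-log scale:
fit_pow <- lm(log(y) ~ log(x))
lines(x, exp(fitted(fit_pow)), col = 3)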
2007 Nov 23
1
intercept in lars fit
I am trying to extract coefficients from lars fit and can't find how to get
intercept. E.g.
library(lars)
y = rnorm(10)
x = matrix(runif(50),nrow=10)
X = data.frame(y,x)
fit1 = lars(as.matrix(X[,2:6]),as.matrix(X[,1]))
fit2 = lm(y~.,data=X)
Then, if I do:
> predict(fit1,s=1,mode='fraction',type='coefficients')$coef
X1 X2 X3 X4 X5
0.3447570
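lars() centres x and y internally and does not report an intercept; it can be recovered from the coefficients and the column means. A minimal sketch, continuing the code above:
beta <- predict(fit1, s = 1, mode = 'fraction', type = 'coefficients')$coef
intercept <- mean(X$y) - sum(colMeans(X[, 2:6]) * beta)
# at s = 1 (the full least-squares end of the path) this should essentially
# match the lm() intercept:
coef(fit2)["(Intercept)"]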
2008 Jan 05
1
Likelihood ratio test for proportional odds logistic regression
Hi,
I want to do a global likelihood ratio test for the proportional odds
logistic regression model and am unsure how to go about it. I am using
the polr() function in library(MASS).
1. Is the p-value from the likelihood ratio test obtained by
anova(fit1,fit2), where fit1 is the polr model with only the intercept
and fit2 is the full polr model (refer to example below)? So in the
case of the
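A minimal sketch of the global likelihood ratio test with polr, using the housing data from MASS as a stand-in (not the poster's data):
library(MASS)
fit1 <- polr(Sat ~ 1,                  data = housing, weights = Freq)  # intercept only
fit2 <- polr(Sat ~ Infl + Type + Cont, data = housing, weights = Freq)  # full model
anova(fit1, fit2)   # likelihood ratio (chi-square) test for the full model
# the same p-value computed by hand from the deviances:
pchisq(deviance(fit1) - deviance(fit2),
       df = fit2$edf - fit1$edf, lower.tail = FALSE)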
2020 Sep 30
0
2 KM curves on the same plot
Hi John,
Brilliant solution and the best sort - when you finally solve your
problem by yourself.
Jim
On Thu, Oct 1, 2020 at 2:52 AM array chip <arrayprofile at yahoo.com> wrote:
>
> Hi Jim,
>
> I found out why clip() does not work with lines(survfit.object)!
>
> If you look at the code of the function survival:::lines.survfit, in the middle of the code:
>
> do.clip <-
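For reference, a minimal sketch of putting two Kaplan-Meier curves on one plot with plot() plus lines() (using the lung data from the survival package as a stand-in):
library(survival)
fit_m <- survfit(Surv(time, status) ~ 1, data = subset(lung, sex == 1))
fit_f <- survfit(Surv(time, status) ~ 1, data = subset(lung, sex == 2))
plot(fit_m, conf.int = FALSE, col = 1, xlab = "Days", ylab = "Survival probability")
lines(fit_f, conf.int = FALSE, col = 2)
legend("topright", legend = c("male", "female"), col = 1:2, lty = 1)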
2004 Jun 11
1
comparing regression slopes
Dear List,
I used rlm to calculate two regression models for two data sets (rlm
due to two outlying values in one of the data sets). Now I want to
compare the two regression slopes. I came across some R-code of Spencer
Graves in reply to a similar problem:
http://www.mail-archive.com/r-help at stat.math.ethz.ch/msg06666.html
The code was:
> df1 <- data.frame(x=1:10, y=1:10+rnorm(10))
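With rlm the same comparison can be expressed as a single fit with a group-by-x interaction, testing the interaction coefficient directly; a sketch with made-up data (summary.rlm reports t values but no p-values, so a normal approximation is used here):
library(MASS)
set.seed(1)
df1  <- data.frame(x = 1:10, y = 1:10 + rnorm(10), g = "A")
df2  <- data.frame(x = 1:10, y = 2 * (1:10) + rnorm(10), g = "B")
both <- rbind(df1, df2)
fit  <- rlm(y ~ x * g, data = both)
tab  <- summary(fit)$coefficients
tval <- tab["x:gB", "t value"]              # tests whether the two slopes differ
2 * pnorm(abs(tval), lower.tail = FALSE)    # approximate two-sided p-value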
2011 Oct 06
1
anova.rq (quantreg) - Why does a different level of nesting change the P values?!
Hello dear R help members.
I am trying to understand anova.rq, and I am finding something which I
cannot explain (is it a bug?!):
The example is for when we have 3 nested models. I run the anova once on
the two models, and again on the three models. I expect that the p.value
for the comparison of model 1 and model 2 would remain the same, whether or
not I add a third model to be compared
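A minimal sketch of the nesting comparison described, using the engel data shipped with quantreg (the particular models are illustrative, not the poster's):
library(quantreg)
data(engel)
fit1 <- rq(foodexp ~ 1,                    tau = 0.5, data = engel)
fit2 <- rq(foodexp ~ income,               tau = 0.5, data = engel)
fit3 <- rq(foodexp ~ income + I(income^2), tau = 0.5, data = engel)
anova(fit1, fit2)         # p-value for model 1 vs model 2 on their own
anova(fit1, fit2, fit3)   # the same comparison inside the three-model sequence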
2009 Jul 28
2
A hiccup when using anova on gam() fits.
I stumbled across a mild glitch when trying to compare the
result of gam() fitting with the result of lm() fitting.
The following code demonstrates the problem:
library(gam)
x <- rep(1:10,10)
set.seed(42)
y <- rnorm(100)
fit1 <- lm(y~x)
fit2 <- gam(y~lo(x))
fit3 <- lm(y~factor(x))
print(anova(fit1,fit2)) # No worries.
print(anova(fit1,fit3)) # Likewise.
print(anova(fit2,fit3)) #
2011 Jun 29
0
Problem: Update of glm-object cannot find where the data object is located
Hi everybody,
I want to ask for your help explaining what is going on with the following
code:
> mydata <- data.frame(y=rbinom(100, 1, 0.5), x1=rnorm(100), x2=rnorm(100))
> glm.fit.method <- function(model, data, ...) { glm(formula=model, data=data, family="binomial", ...) }
> fit1 <- glm(y ~ x1 + x2, data=mydata, family=binomial())
> update(fit1, .~1)
Call: glm(formula =
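The usual explanation is that update() rebuilds the stored call and re-evaluates it, so whatever name appears in the call's data argument must be visible from where update() is called. One workaround, sketched below under that assumption (this is not the poster's code), is to have the wrapper build and evaluate a glm call that keeps the caller's data name:
glm.fit.method2 <- function(model, data, ...) {
  cl <- match.call()                             # captures data = mydata by name
  cl[[1L]] <- quote(glm)
  names(cl)[names(cl) == "model"] <- "formula"   # glm's argument is called 'formula'
  cl$family <- "binomial"
  eval(cl, parent.frame())                       # fit where 'mydata' is visible
}
fit2 <- glm.fit.method2(y ~ x1 + x2, data = mydata)
update(fit2, . ~ 1)   # works: the stored call says data = mydata, which update() can find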
2003 Dec 30
1
odd results from polr vs wilcoxon test
Dear R helpers,
I would like to ask why polr occasionally generates results that look very
odd.
I have been trying to compare the power of proportional odds logistic regression
with the Wilcoxon test. I generated random samples, applied both tests, and
extracted and compared the p-values, thus:-
library(MASS)
c1=rep(NA,100); c2=c1
for (run in 1:100)
{
dat=c(rbinom(20,12,0.65),rbinom(20,12,0.35))
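The loop body is cut off here; one way such a power comparison might be completed is sketched below (this continuation is an assumption, not the original code):
library(MASS)
set.seed(1)
c1 <- rep(NA, 100); c2 <- c1
grp <- factor(rep(1:2, each = 20))
for (run in 1:100) {
  dat  <- c(rbinom(20, 12, 0.65), rbinom(20, 12, 0.35))
  fit0 <- polr(ordered(dat) ~ 1)
  fit1 <- polr(ordered(dat) ~ grp)
  c1[run] <- pchisq(deviance(fit0) - deviance(fit1), df = 1, lower.tail = FALSE)
  c2[run] <- wilcox.test(dat ~ grp, exact = FALSE)$p.value
}
mean(c1 < 0.05); mean(c2 < 0.05)   # empirical power of each test at the 5% level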
2010 Sep 21
1
package gbm, predict.gbm with offset
Dear all,
the help file for predict.gbm states that "The predictions from gbm do not
include the offset term. The user may add the value of the offset to the
predicted value if desired." I am just not sure how exactly, especially for
a Poisson model, where I believe the offset is multiplicative?
For example:
library(MASS)
fit1 <- glm(Claims ~ District + Group + Age +
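The formula is cut off above; with a log-link Poisson model the offset is indeed multiplicative, so the exposure can be put back by multiplying the response-scale prediction. A sketch along those lines (the use of offset(log(Holders)) in the formula and the other gbm settings are assumptions for illustration, not a verified recipe):
library(MASS)   # Insurance data
library(gbm)
fit_gbm <- gbm(Claims ~ District + Group + Age + offset(log(Holders)),
               data = Insurance, distribution = "poisson", n.trees = 1000)
# predict.gbm leaves the offset out, so multiply by the exposure by hand:
pred_rate   <- predict(fit_gbm, newdata = Insurance, n.trees = 1000, type = "response")
pred_claims <- pred_rate * Insurance$Holders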