Displaying 5 results from an estimated 5 matches for "y_c".
2005 Apr 01
1
optim problem, nls regression
...r[3] must be identical.
I have 3 set of data (x1,y1), (x2,y2), (x3,y3).
x_a<-c(0,0.5,1,1.5,2,3,4,6)
y_a<-c(5.4,5,4.84,4.3,4,2,1.56,1.3)
x_b<-c(0,1,2,3,4,5,6,7,8,9,10,11,12)
y_b<-c(5.34,4.98,4.66,4.06,3,3.4,2.7,2.9,2.6,2.6,1.9,1.3,1.4)
x_c<-c(0,3,6,8,10,12,14,16,18,20,24,26,28,30)
y_c<-c(5.5,5.1,4.3,4,3.7,3.2,3.04,2.778,2.56,2.25,1.78,1.44,1.32,1.2)
x<-c(x_a,x_b,x_c)
y<-c(y_a,y_b,y_c)
long<-c(0,8,21,35)
Hence, the sum of squares is:
Sce= sum( sum((y- number[4]-(x/number[1])^number[7])^2)+
sum((y- number[5]-(x/number[2])^number[7])^2)+
sum((y- number[6]-(...
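One way to set this up (a sketch only: the parameter layout, the function name sce and the starting values below are my assumptions, reusing the x_a..y_c vectors above, with p[7] playing the role of the parameter that must be identical across the three data sets):
sce <- function(p) {
  sum((y_a - p[4] - (x_a / p[1])^p[7])^2) +
    sum((y_b - p[5] - (x_b / p[2])^p[7])^2) +
    sum((y_c - p[6] - (x_c / p[3])^p[7])^2)
}
start <- c(5, 10, 25, 1, 1, 1, 0.5)   # rough starting guesses (assumed)
fit <- optim(start, sce)              # default Nelder-Mead minimisation
fit$par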
2012 Jul 02
0
Fit circle with R
...ters
DET = xnew*xnew - xnew*Mz + Cov_xy;
Center = cbind(Mxz*(Myy-xnew)-Myz*Mxy , Myz*(Mxx-xnew)-Mxz*Mxy)/DET/2;
Par = cbind(Center+centroid , sqrt(Center[2]*Center[2]+Mz+2*xnew));
return(Par)
}
#EXAMPLE
library(plotrix)
# Create a Circle of radius=10, centre=5,5
R = 10; x_c = 5; y_c = 5;
thetas = seq(0,pi,(pi/64));
xs = x_c + R*cos(thetas)
ys = y_c + R*sin(thetas)
# Now add some random noise
mult = 0.5;
xs = xs+mult*rnorm(length(xs));
ys = ys+mult*rnorm(length(ys));
plot(xs,ys,pch=19,cex=0.5,col="red",xlim=c(-10,20),ylim=c(-10,20),asp=1)
# real circle
draw.circle(x_c,y...
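As a hypothetical cross-check (not from the original post), the noisy points xs, ys generated above can also be fitted geometrically by minimising the radial residuals with optim(); circ_rss is my own name:
circ_rss <- function(p) sum((sqrt((xs - p[1])^2 + (ys - p[2])^2) - p[3])^2)
r0  <- sqrt(mean((xs - mean(xs))^2 + (ys - mean(ys))^2))  # crude initial radius
fit <- optim(c(mean(xs), mean(ys), r0), circ_rss)
fit$par   # estimated (centre x, centre y, radius); should land roughly near (5, 5, 10)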
2002 Nov 07
2
Qualitative factors
Hi,
I have some doubts about how qualitative factors are coded in R. For
instance, I consider a response y, a quantitative variable x and a qualitative
factor m with 3 levels, generated as follows:
y_c(6,4,2.3,5,3.5,4,1.,8.5,4.3,5.6,2.3,4.1,2.5,8.4,7.4)
x_c(3,1,3,1,2,1,4,5,1,3,4,2,5,4,3)
m_gl(3,5)
lm(y~x+m)
Coefficients:
(Intercept) x m2 m3
3.96364 0.09818 0.44145 0.62291
In the literature, two usual implicit coding schemes are suggested: m1=0 o...
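For reference, the underscore in the lines above is the old S-style assignment operator, so y_c(...) reads y <- c(...). With current syntax, the dummy coding that lm() actually uses can be inspected directly; a small sketch with the same data:
y <- c(6,4,2.3,5,3.5,4,1,8.5,4.3,5.6,2.3,4.1,2.5,8.4,7.4)
x <- c(3,1,3,1,2,1,4,5,1,3,4,2,5,4,3)
m <- gl(3, 5)                 # factor with 3 levels, 5 observations each
contrasts(m)                  # default contr.treatment: level 1 is the baseline (m1 = 0)
head(model.matrix(~ x + m))   # the m2 and m3 indicator columns seen in the fit
lm(y ~ x + m)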
2005 Apr 08
0
TR: The results of your email commands
....60206,0.30103,0,0.4771213)
x_b<-c(0,1,2,3,4,5,6,7,8,9,10,11,12,13,14)
y_b<-c(5.3424227,4.9867717,4.6627578,4.0606978,3.07182,3.4771213,2.7993405,2.9395193,2.69897,2.6127839,1.9777236,1.3222193,1.4149733,0.7781513,1.322219)
x_c<-c(0,3,6,10,12,14,16,18,20,24,26,28,30,34)
y_c<-c(5.5185139,5.1461280,4.3617278,3.771513,3.20412,3.0413927,2.7781513,2.5682017,2.255272,1.7823917,1.447158,1.3222193,1.2787536,0.69897)
#number of kinetics
nb=3
#complete data set
x<-c(x_a,x_b,x_c)
y<-c(y_a,y_b,y_c)
#cumulative length
long<-...
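Presumably the truncated 'long' line holds cumulative lengths used to slice the pooled x and y back into the nb kinetics; a sketch of that idea (construction and names assumed, not taken from the post):
long <- cumsum(c(0, length(x_a), length(x_b), length(x_c)))   # cumulative lengths
kinetics <- lapply(seq_len(nb), function(i) {
  idx <- (long[i] + 1):long[i + 1]
  list(x = x[idx], y = y[idx])                                 # i-th data set
})
str(kinetics)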
2001 Nov 16
2
pearson residuals in glm for binomial response (PR#1175)
...Thank you.
Professor John T. Kent tel (direct) (44) 113-233-5103
Department of Statistics tel(secretary) (44) 113-233-5101
University of Leeds fax (44) 113-233-5090
Leeds LS2 9JT, England e-mail: J.T.Kent@leeds.ac.uk
Input:
x_1:4 # regressor variable
y_c(2,6,7,8) # response binomial counts
n_rep(10,4) # number of binomial trials
ym_cbind(y,n-y) # response variable as a matrix
glm1_glm(ym~x,binomial) # fit a generalized linear model
f_fitted(glm1)
rp1_(y-n*f)/sqrt(n*f*(1-f)) # direct calculation of pearson residuals
rp2_residuals(glm1,type="pea...
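For readers unused to the old underscore assignment, here is the same computation with current syntax (a sketch of the excerpt's code), so the hand-computed Pearson residuals can be compared with residuals() directly:
x  <- 1:4                       # regressor variable
y  <- c(2, 6, 7, 8)             # binomial successes
n  <- rep(10, 4)                # binomial trials
ym <- cbind(y, n - y)           # response as a two-column matrix
glm1 <- glm(ym ~ x, family = binomial)
f    <- fitted(glm1)            # fitted success probabilities
rp1  <- (y - n * f) / sqrt(n * f * (1 - f))   # Pearson residuals by hand
rp2  <- residuals(glm1, type = "pearson")
cbind(by_hand = as.numeric(rp1), from_residuals = as.numeric(rp2))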