Displaying 20 results from an estimated 20000 matches similar to: "determine length of bivariate polynomial"
2004 Dec 01
1
tuning SVMs
Hi
I am doing this sort of thing:
POLY:
> obj = best.tune(svm, similarity ~ ., data = training, kernel = "polynomial")
> summary(obj)
Call:
best.tune(svm, similarity ~ ., data = training, kernel = "polynomial")
Parameters:
SVM-Type: eps-regression
SVM-Kernel: polynomial
cost: 1
degree: 3
gamma: 0.04545455
coef.0: 0
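The parameters shown above (cost 1, degree 3, coef.0 0) are the defaults best.tune() reports when no ranges are supplied. A minimal sketch of searching over a grid instead, assuming the e1071 package and the poster's "training" data frame and "similarity" response (the grid values are my own choice):
library(e1071)
# tune.svm() cross-validates over the supplied grid and keeps the best model;
# the parameter names match the svm() arguments listed by summary() above
tuned <- tune.svm(similarity ~ ., data = training,
                  kernel = "polynomial",
                  degree = 2:4,
                  gamma  = 10^(-2:0),
                  cost   = 10^(-1:2))
summary(tuned)
tuned$best.model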
2006 Mar 29
2
bivariate case in local polynomial regression
Hi:
I am using the package "KernSmooth" to do local polynomial regression. However, it seems the function "locpoly" can only deal with a univariate covariate. I wonder whether there is any kernel smoothing package in R that can deal with bivariate covariates? I also checked the package "locfit", whose function "locfit" can indeed deal with the bivariate case. The
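One option for bivariate covariates is the locfit package, whose lp() term accepts several variables. A rough sketch on simulated data (the data frame and variable names here are invented for illustration):
library(locfit)
set.seed(1)
dat <- data.frame(x1 = runif(200), x2 = runif(200))
dat$z <- sin(2 * pi * dat$x1) + dat$x2^2 + rnorm(200, sd = 0.1)
# local polynomial (degree 2) smooth in two covariates
fit <- locfit(z ~ lp(x1, x2, deg = 2), data = dat)
plot(fit)   # contour plot of the fitted surface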
2008 Oct 10
1
Coefficients in a polynomial glm with family poisson/binomial
Dear R-users
When running a polynomial glm with one explanatory variable (for example Y ~ X + X^2), with a Poisson or binomial error distribution, the predicted values obtained from the predict() function differ considerably from those obtained by using the coefficients from the summary table "as is" in an equation of the form Y = intercept + Xcoef1 x X + Xcoef2 x X^2. The former are
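Two things usually explain this kind of discrepancy: predict() returns values on the response scale by default for lm-style hand checks (the linear predictor must be passed through the inverse link), and if the model was written with poly(X, 2) the reported coefficients belong to an orthogonal basis rather than to X and X^2. Note also that in an R formula the quadratic term has to be written I(X^2) or poly(X, 2). A small sketch with a Poisson model and made-up data:
set.seed(1)
X <- runif(100, 0, 2)
Y <- rpois(100, exp(0.5 + 0.8 * X - 0.3 * X^2))
fit <- glm(Y ~ X + I(X^2), family = poisson)
eta <- coef(fit)[1] + coef(fit)[2] * X + coef(fit)[3] * X^2         # linear predictor
all.equal(unname(eta), unname(predict(fit, type = "link")))         # TRUE
all.equal(unname(exp(eta)), unname(predict(fit, type = "response")))# TRUE: exp() is the inverse log link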
2005 Jun 14
2
ordinary polynomial coefficients from orthogonal polynomials?
How can ordinary polynomial coefficients be calculated
from an orthogonal polynomial fit?
I'm trying to do something like find a,b,c,d from
lm(billions ~ a+b*decade+c*decade^2+d*decade^3)
but that gives: "Error in eval(expr, envir, enclos) :
Object "a" not found"
> decade <- c(1950, 1960, 1970, 1980, 1990)
> billions <- c(3.5, 5, 7.5, 13, 40)
> #
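With the data shown above, one way to get coefficients on the original scale is simply to refit with raw powers; centring decade first keeps the design well conditioned. A sketch using only the posted data:
decade <- c(1950, 1960, 1970, 1980, 1990)
billions <- c(3.5, 5, 7.5, 13, 40)
fit_orth <- lm(billions ~ poly(decade, 3))        # orthogonal basis
d0 <- decade - 1970                               # centring keeps the raw powers well conditioned
fit_raw <- lm(billions ~ d0 + I(d0^2) + I(d0^3))  # a, b, c, d for the polynomial in (decade - 1970)
coef(fit_raw)
all.equal(fitted(fit_orth), fitted(fit_raw))      # same curve, different parameterisation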
2004 May 06
5
Orthogonal Polynomial Regression Parameter Estimation
Dear all,
Can anyone tell me how I can perform orthogonal polynomial regression parameter estimation in R?
--------------------------------------------
Here is an "Orthogonal Polynomial" Regression problem
collected from Draper, Smith(1981), page 269. Note
that only value of alpha0 (intercept term) and signs
of each estimate match with the result obtained from
coef(orth.fit). What
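R estimates orthogonal polynomial regressions directly with poly() inside lm(). As observed above, typically only the intercept and the signs line up with textbook tables, because poly() normalises its basis columns to unit length, whereas textbooks often tabulate integer-valued orthogonal polynomials. A sketch with placeholder data (not the textbook values):
x <- 1:10
y <- c(2.1, 2.9, 4.2, 5.1, 6.3, 7.0, 8.4, 9.1, 10.2, 11.5)   # placeholder data
orth.fit <- lm(y ~ poly(x, 2))   # orthogonal polynomial regression of degree 2
coef(orth.fit)                   # intercept plus coefficients for the orthonormal basis
head(poly(x, 2))                 # the basis itself, which sets the scale of the estimates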
2003 Apr 29
1
polynomial fitting
I'm trying to find a way to fit a polynomial of degree n in x and y to
a set of x, y, and z data that I have and obtain the coefficients for
the terms of the fitted polynomial. However, when I try to use the
surf.ls function I'm getting odd results.
> x <- seq(0, 10, length=50)
> y <- x
> f <- function (x, y) {x^2 + y}
> library(spatial)
> test <-
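A cross-check for odd surf.ls() results is to fit the same trend surface with lm() and polym(), whose raw = TRUE coefficients map directly to the x^i * y^j terms. One thing worth noting (hedged, since the post is cut off): with y set equal to x, all observations lie on a single line, which by itself can make any bivariate surface fit ill-determined. A sketch on a proper grid of invented points:
library(spatial)
g <- expand.grid(x = seq(0, 10, length = 10), y = seq(0, 10, length = 10))
g$z <- g$x^2 + g$y
fit1 <- surf.ls(2, g$x, g$y, g$z)                          # degree-2 trend surface, spatial package
fit2 <- lm(z ~ polym(x, y, degree = 2, raw = TRUE), data = g)
coef(fit2)   # coefficient names map to x^i * y^j terms, e.g. "1.0" is x, "0.2" is y^2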
2003 Jun 23
1
precision matrix for polynomial growth curves
What does the warning message
"1: Singular precision matrix in level -1, block 1" mean?
I get this warning 50+ times when I try to fit the following
model
lme( response ~ covariateA + poly(covariateB,3), ~poly(covariateB,3)|group )
It's not a small dataset - a set of up to 20 blood pressure
readings on just over 2000 people, and I don't get the error
message when I try to fit
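"Singular precision matrix" warnings usually indicate that the random-effects covariance is over-parameterised for the data: a full 4x4 covariance for the intercept plus a cubic in covariateB is a lot to estimate per group. A hedged sketch of the named-argument call and one common simplification, assuming a data frame "dat" with the variables above:
library(nlme)
fit_full <- lme(response ~ covariateA + poly(covariateB, 3),
                random = ~ poly(covariateB, 3) | group, data = dat)
# pdDiag() drops the covariances between the random effects, which often removes
# the singularity while keeping per-group curvature
fit_diag <- lme(response ~ covariateA + poly(covariateB, 3),
                random = list(group = pdDiag(~ poly(covariateB, 3))), data = dat)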
2009 Sep 28
2
Polynomial Fitting
Hello All,
This might seem elementary to everyone, but please bear with me. I've
just spent some time fitting poly functions to time series data in R
using lm() and predict(). I want to analyze the functions once I've
fit them to the various data I'm studying. However, after pulling the
first function into Octave (just by plotting the polynomial function
using fplot() over
2008 Jan 07
3
Polynomial fitting
I wonder how one can fit a 3rd-degree polynomial to some data in R?
Say the data is:
y <- c(15.51, 12.44, 31.5, 21.5, 17.89, 27.09, 15.02, 13.43, 18.18, 11.32)
x <- seq(3.75, 6, 0.25)
And the resulting polynomial coefficients are:
5.8007 -91.6339 472.1726 -774.2584
Thanks in advance!
--
Jonas Malmros
Stockholm University
Stockholm, Sweden
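With the data above, a cubic fit is one lm() call; raw = TRUE returns the coefficients in the usual power-basis form (lowest order first in the output):
y <- c(15.51, 12.44, 31.5, 21.5, 17.89, 27.09, 15.02, 13.43, 18.18, 11.32)
x <- seq(3.75, 6, 0.25)
fit <- lm(y ~ poly(x, 3, raw = TRUE))
coef(fit)                 # intercept, x, x^2, x^3 coefficients
plot(x, y)
lines(x, fitted(fit))     # quick visual check of the cubic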
2001 Jul 09
1
polynomial regression and poly
When doing polynomial regression I believe it is a good idea to use the poly
function to generate orthogonal polynomials. When doing this in S-Plus there
is a handy function (transform.poly, I think) to convert the coefficients
produced by regression with the poly function back to the original scale.
Has somebody written something similar for R?
Robert
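I am not aware of a packaged transform.poly equivalent, but one workaround is to regress the fitted values from the orthogonal fit on the raw powers: the fitted values lie exactly on a cubic, so this regression is exact and returns the original-scale coefficients. A sketch with made-up data:
set.seed(1)
x <- runif(30, 0, 5)
y <- 1 - 2 * x + 0.5 * x^3 + rnorm(30)
fit <- lm(y ~ poly(x, 3))                     # orthogonal polynomial fit
raw <- lm(fitted(fit) ~ x + I(x^2) + I(x^3))  # exact: fitted values lie on a cubic
coef(raw)                                     # coefficients on the original x scale
max(abs(residuals(raw)))                      # essentially zero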
2003 Jan 16
2
polynomial contrasts in R
In S-Plus, I can obtain polynomial contrasts for an ordered factor with
contr.poly(). The function also exists in R; however, it is limited to factors
whose levels are equally spaced. In S-Plus, one can obtain the contrasts
for a set of numeric values representing unequally spaced ordered factors.
Has anyone implemented this in R? I see that the S-Plus function calls
another function (poly.raw())
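Current R versions do accept unequal spacing: contr.poly() has a "scores" argument, so the S-Plus behaviour described above can be reproduced directly. A tiny sketch (the spacings are invented):
# polynomial contrasts for a 4-level ordered factor with unequally spaced levels
contr.poly(4, scores = c(1, 2, 4, 8))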
2011 Feb 02
2
unequally spaced factor levels orthogonal polynomial contrasts coefficients trend analysis
Hello [R]-help,
I am trying to find a package where you can do ANOVA-based trend analysis on grouped data using orthogonal polynomial contrast coefficients, for unequally spaced factor levels. The closest hit I've had is from this web site:
http://webcache.googleusercontent.com/search?q=cache:xN4K_KGuYGcJ:www.datavis.ca/sasmac/orpoly.html+Orthogonal+polynomial
but I
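Base R can do this kind of trend analysis without an extra package: build the polynomial contrasts from the actual level spacings with contr.poly(scores = ...), attach them to the factor, and split the factor's sum of squares in summary.aov(). A hedged sketch with invented data:
set.seed(1)
dose <- c(1, 2, 4, 8)                                 # unequally spaced levels
f    <- factor(rep(dose, each = 6))
y    <- 2 + 0.8 * log(rep(dose, each = 6)) + rnorm(24, sd = 0.3)
contrasts(f) <- contr.poly(4, scores = dose)          # orthogonal contrasts at these spacings
fit <- aov(y ~ f)
summary(fit, split = list(f = list(linear = 1, quadratic = 2, cubic = 3)))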
2011 Jul 07
1
Polynomial fitting
Hello,
I'm fairly familiar with R and use it every now and then for math-related tasks.
I have a simple non-polynomial function that I would like to approximate with a polynomial. I already looked into poly, but was unable to understand what to do with it. So my problem is this: I can generate virtually any number of data points and would like to find the coefficients a1, a2, ... up to a given
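If the target function can be evaluated anywhere, one straightforward recipe is to sample it on a grid, fit polynomials of increasing degree with lm(), and stop once the worst-case error on the grid is below a tolerance; coef() then gives a1, a2, .... A sketch using sin() as a stand-in for the real function:
f <- sin                                     # placeholder for the real non-polynomial function
x <- seq(0, pi, length.out = 500)
y <- f(x)
tol <- 1e-4
for (deg in 1:15) {
  fit <- lm(y ~ poly(x, deg, raw = TRUE))
  if (max(abs(residuals(fit))) < tol) break  # worst-case error on the grid
}
deg
coef(fit)                                    # intercept, then a1, a2, ..., a_deg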
2004 Dec 03
3
Computing the minimal polynomial or, at least, its degree
Hi,
I would like to know whether there exist algorithms to compute the
coefficients or, at least, the degree of the minimal polynomial of a square
matrix A (over the field of complex numbers)? I don't know whether this
would require symbolic computation. If not, has any of the algorithms been
implemented in R?
Thanks very much,
Ravi.
P.S. Just for the sake of completeness, a
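No symbolic computation is needed for the degree: the minimal polynomial of A has degree k, where k is the first power at which vec(A^k) becomes linearly dependent on vec(I), vec(A), ..., vec(A^(k-1)). A numeric sketch for real matrices (rank decisions use a tolerance, so it is only as reliable as the conditioning of A; the complex case would need more care):
minimal_poly_degree <- function(A, tol = 1e-8) {
  n <- nrow(A)
  P <- diag(n)
  K <- matrix(c(P), ncol = 1)              # columns are vec(A^0), vec(A^1), ...
  for (k in 1:n) {
    P <- P %*% A
    K <- cbind(K, c(P))
    if (qr(K, tol = tol)$rank < ncol(K))   # vec(A^k) now depends on the lower powers
      return(k)
  }
  n
}
minimal_poly_degree(diag(3))                         # 1: minimal polynomial is x - 1
minimal_poly_degree(matrix(c(2, 0, 1, 2), 2, 2))     # 2: Jordan block, (x - 2)^2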
2013 Apr 27
2
Polynomial Regression and NA coefficients in R
Hey all,
I'm performing polynomial regression. I'm simulating x values using runif() and y values using a deterministic function of x and rnorm().
When I perform polynomial regression like this:
fit_poly <- lm(y ~ poly(x,11,raw = TRUE))
I get some NA coefficients. I think this is due to the high correlation between, say, x and x^2 if x is distributed uniformly on the unit interval
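That diagnosis is essentially right: on the unit interval the raw powers x, x^2, ..., x^11 are nearly collinear, and whenever lm()'s pivoted QR decides the design is rank-deficient the aliased terms come back as NA. The default orthogonal basis avoids this. A sketch with simulated data of my own (whether NAs actually appear depends on the sample):
set.seed(1)
x <- runif(200)
y <- 3 + 2 * x - 5 * x^2 + rnorm(200, sd = 0.1)
coef(lm(y ~ poly(x, 11, raw = TRUE)))  # high raw powers on [0, 1]: some terms may be NA
coef(lm(y ~ poly(x, 11)))              # orthogonal basis keeps the design numerically full rank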
2009 Jan 11
4
How to get solution of following polynomial?
Hi, I want to find all roots of the following polynomial:
a <- c(-0.07, 0.17); b <- c(1, -4)
cc <- matrix(c(0.24, 0.00, -0.08, -0.31), 2)
d <- matrix(c(0, 0, -0.13, -0.37), 2)
e <- matrix(c(0.2, 0, -0.06, -0.34), 2)
A1 <- diag(2) + a %*% t(b) + cc; A2 <- -cc + d; A3 <- -d + e; A4 <- -e
fn <- function(z)
{
y <- diag(2) - A1*z - A2*z^2 - A3*z^3 - A4*z^4
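Assuming the goal is every z with det(I - A1 z - A2 z^2 - A3 z^3 - A4 z^4) = 0: that determinant is an ordinary polynomial in z of degree at most 8, so its coefficients can be recovered from 9 evaluations and handed to polyroot(). A sketch using the A1, ..., A4 defined above:
detfn <- function(z) det(diag(2) - A1 * z - A2 * z^2 - A3 * z^3 - A4 * z^4)
zs <- 0:8                                   # 9 points determine a degree-8 polynomial
V <- outer(zs, 0:8, `^`)                    # Vandermonde matrix
cf <- solve(V, sapply(zs, detfn))           # coefficients c0, c1, ..., c8
cf <- zapsmall(cf)
cf <- cf[seq_len(max(which(cf != 0)))]      # drop trailing zeros so polyroot sees the true degree
polyroot(cf)                                # all complex roots of det(.) = 0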
2010 Jan 18
2
Predict polynomial problem
I have a function that fits polynomial models for the orders in n:
lmn <- function(d, n) {
  models <- list()
  for (i in n) {
    models[[i]] <- lm(y ~ poly(x, i), data = d)
  }
  return(models)
}
My data is:
> d=data.frame(x=1:10,y=runif(10))
So first just do it for a cubic:
> mmn = lmn(d,3)
> predict(mmn[[3]])
1 2 3 4 5 6 7 8
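The snippet is cut off, but the usual problem with this construction shows up when predict() is given new data: the stored formula still refers to the loop variable i, whose value has moved on by prediction time. Splicing the current value of i into the call avoids that. A hedged sketch of one fix:
lmn <- function(d, n) {
  models <- list()
  for (i in n) {
    # bquote() substitutes the current value of i into the call, so the stored
    # formula reads e.g. poly(x, 3) rather than poly(x, i)
    models[[i]] <- eval(bquote(lm(y ~ poly(x, .(i)), data = d)))
  }
  models
}
mmn <- lmn(d, 3)
predict(mmn[[3]], newdata = data.frame(x = seq(1, 10, by = 0.5)))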
2008 Oct 15
4
a really simple question on polynomial multiplication
Dear R people:
Is there a way to perform simple polynomial multiplication; that is,
something like
(x - 3) * (x + 3) = x^2 - 9, please?
I looked at poly, polyroot, and expression. There used to be a
package that had this, maybe?
thanks,
Erin
--
Erin Hodgess
Associate Professor
Department of Computer and Mathematical Sciences
University of Houston - Downtown
mailto: erinm.hodgess at
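The polynom package (and, more recently, PolynomF) provides this kind of arithmetic: a polynomial is stored as its coefficient vector and the usual operators are overloaded. A sketch:
library(polynom)
p1 <- polynomial(c(-3, 1))   # coefficients in increasing order: -3 + x, i.e. x - 3
p2 <- polynomial(c( 3, 1))   # x + 3
p1 * p2                      # prints -9 + x^2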
2006 May 27
1
Recommended package nlme: bug in predict.lme when an independent variable is a polynomial (PR#8905)
Full_Name: Renaud Lancelot
Version: Version 2.3.0 (2006-04-24)
OS: MS Windows XP Pro SP2
Submission from: (NULL) (82.239.219.108)
I think there is a bug in predict.lme when a polynomial generated by poly() is
used as an explanatory variable and a new data.frame is used for predictions. I
guess this is related to *not* using, for predictions, the coefficients used in
constructing the orthogonal
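A workaround for the behaviour described is to build the orthogonal basis once on the estimation data and reuse its coefficients for the new data; predict() has a method for poly objects that does exactly this. A hedged sketch with hypothetical data frames train and new and covariate x:
basis <- poly(train$x, 3)            # orthogonalisation computed on the estimation data
newbasis <- predict(basis, new$x)    # the same basis evaluated at the new covariate values
# newbasis can then be combined with the estimated fixed effects, or the model can be
# refit with explicit basis columns so that predict.lme has nothing to recompute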
2008 Aug 19
1
Polynomial regression help
I have a simple X, Y data frame that I am trying to run regression analysis on. The linear regression looks great, but when I use lm(formula = y ~ poly(x, degree = 5)) I get the same coefficients. So, for example, if I use degree = 3 my formula would look like y = 4.2x^3 + 3.2x^2 + 2.1x + 1.0, and my degree-5 fit would look like y = 6.5x^5 + 5.4x^4 + 4.2x^3 + 3.2x^2 + 2.1x + 1.0, which doesn't make
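If the models were fit with the default poly(x, degree), what is described is actually expected: the basis columns are mutually orthogonal, so the lower-order coefficients do not change when higher-order terms are added, whereas coefficients of the familiar raw powers generally would. A sketch contrasting the two on simulated data:
set.seed(42)
x <- runif(50)
y <- 1 + 2 * x + 0.5 * x^3 + rnorm(50, sd = 0.2)
coef(lm(y ~ poly(x, 3)))              # orthogonal basis
coef(lm(y ~ poly(x, 5)))              # the first three slope coefficients are unchanged
coef(lm(y ~ poly(x, 3, raw = TRUE)))  # raw powers: these coefficients ...
coef(lm(y ~ poly(x, 5, raw = TRUE)))  # ... generally change when the degree changes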