search for: quartic

Displaying 20 results from an estimated 24 matches for "quartic".

2011 May 06
2
Confidence intervals and polynomial fits
...tercept)     8.7216693   8.9844958
dispersal     -37.5643922 -35.6652276
dispersal_sq   66.8121519  74.4534753
dispersal_cb  -74.5995820 -62.8377766
dispersal_qt   22.6592724  28.6509494
These tables show the problem: the 95% confidence limits for the quartic term are every bit as wide as the limits for the other terms. Since the quartic term coefficient gets multiplied by the fourth power of x, this means that the width of the confidence band starts out nice and narrow (when x is close to zero, where the width of the confidence band is pretty much jus...
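The effect described can be reproduced with a short simulation (all names and coefficients below are hypothetical, only loosely mirroring the post's quartic fit in dispersal):

```r
# hypothetical data; only the model shape follows the post
set.seed(1)
d <- data.frame(dispersal = runif(50, 0, 1))
d$y <- 9 - 36 * d$dispersal + 70 * d$dispersal^2 -
  68 * d$dispersal^3 + 25 * d$dispersal^4 + rnorm(50, sd = 0.1)
fit <- lm(y ~ poly(dispersal, 4, raw = TRUE), data = d)
confint(fit)                       # per-coefficient 95% limits
pr   <- predict(fit, interval = "confidence")
band <- pr[, "upr"] - pr[, "lwr"]  # pointwise width of the confidence band
```

Plotting `band` against `d$dispersal` then shows how the interval width changes across x.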
2010 Jan 08
0
solving cubic/quartic equations non-iteratively -- comparisons
Hi, I'm responding to a post about finding roots of a cubic or quartic equation non-iteratively. One obviously could create functions using the explicit algebraic solutions. One post on the subject noted that the square-roots in those solutions also require iteration, and one post claimed iterative solutions are more accurate than the explicit solutions. This post, h...
2010 Jan 05
4
solving cubic/quartic equations non-iteratively
To R-helpers, R offers the polyroot function for solving the mentioned equations iteratively. However, Dr Math and MathWorld (and other places) show in detail how to solve these equations non-iteratively. Do non-iterative implementations for R that solve these equations exist? Regards, Mads Jeppe
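For reference, the iterative route the poster mentions is one line in base R; a closed-form solver would need the Cardano/Ferrari formulas written out by hand:

```r
# roots of x^3 - 6x^2 + 11x - 6 = (x-1)(x-2)(x-3)
r <- polyroot(c(-6, 11, -6, 1))  # coefficients in increasing powers of x
sort(Re(r))                      # 1 2 3, with imaginary parts ~ 0
```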
2007 Dec 26
1
Cubic splines in package "mgcv"
R-users E-mail: r-help@r-project.org My understanding is that package "mgcv" is based on "Generalized Additive Models: An Introduction with R" (by Simon N. Wood). On page 126 of this book, eq(3.4) looks like a quartic equation with respect to "x", not a cubic equation. I am wondering if all routines which use cubic splines in mgcv are based on this quartic equation. In my humble opinion, the '^4' in the first term of the second line of this equation should be '^3'. K. Takezawa --...
2011 Aug 16
0
Cubic splines in package "mgcv"
re: Cubic splines in package "mgcv" I don't have access to Gu (2002) but clearly the function R(x,z) defined on p126 of Simon Wood's book is piecewise quartic, not piecewise cubic. Like Kunio Takezawa (below) I was puzzled by the word "cubic" on p126. As Simon Wood writes, this basis is not actually used by mgcv when specifying bs="cr". Maybe the point is that at the knot, this continuous function has continuous 1st and 2nd derivat...
2005 Sep 16
1
Question:manipulating spatial data using combination of Maptools and Splancs
Hi, I have a problem that concerns combining the packages Maptools and Splancs. I have 2 shapefiles that I want to manipulate (one of type point and one polygon). I import them into R using Maptools, but then I can't estimate a quartic kernel using Splancs. The package doesn't recognize the shapes (invalid points and poly argument). I don't know if this is an easy task, but I have read both packages' manuals and I can't find a viable solution. Thank you for your time.
2008 Jan 05
2
Behavior of ordered factors in glm
...riables: $ issuecat : Factor w/ 5 levels "0 - 39","40 - 49",..: 1 1 1 1... snip I then defined issuecat as ordered: > xxx$issuecat<-as.ordered(xxx$issuecat) When I include issuecat in a glm model, the result makes me think I have asked R for a linear+quadratic+cubic+quartic polynomial fit. The results are not terribly surprising under that interpretation, but I was hoping for only a linear term (which I was taught to call a "test of trend"), at least as a starting point. > age.mdl<-glm(actual~issuecat,data=xxx,family="poisson") > sum...
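What the poster is seeing is R's default polynomial contrasts for ordered factors. A small sketch (the post's data frame `xxx` is not reproduced here, so the glm call is shown only as a comment):

```r
# an ordered factor with 5 levels gets orthogonal polynomial contrasts
f <- ordered(rep(1:5, each = 4))
colnames(contrasts(f))  # ".L" ".Q" ".C" "^4": linear..quartic terms
# For a linear trend only, one option is the numeric scores instead:
# glm(actual ~ as.numeric(issuecat), data = xxx, family = "poisson")
```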
2009 Mar 30
1
Warning messages in Splancs package :: no non-missing arguments to min; returning Inf
...0=4, nx=100, ny=100), col=terrain.colors(10)) pointmap(auto.spp, col="red", add=TRUE) I would need to analyze the relationship between the two shapefiles, but I am receiving the following warning messages and a blank output:
Xrange is 1827.026 6796.202
Yrange is 1853.896 6832.343
Doing quartic kernel
Warning messages:
1: In min(x) : no non-missing arguments to min; returning Inf
2: In max(x) : no non-missing arguments to max; returning -Inf
Can someone help me with what I am doing wrong in the execution code? I am getting a blank graph. Best regards, D
1999 Dec 01
1
density(kernel = "cosine") .. the `wrong cosine' ..
...") to use the cosine from the literature. - provide the current "cosine" as kernel = "smoothcosine" {I'd like to keep the possibility of 1-initial-letter abbreviation} Enhancement (easy, I'll do that): - We further provide both Epanechnikov and "quartic" aka "biweight" additionally in any case. Martin Maechler <maechler@stat.math.ethz.ch> http://stat.ethz.ch/~maechler/ Seminar fuer Statistik, ETH-Zentrum LEO D10 Leonhardstr. 27 ETH (Federal Inst. Technology) 8092 Zurich SWITZERLAND phone: x-41-1-632-3408 fax: ...-1228...
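In current R, density() accepts both of these kernels directly, with the quartic under its "biweight" name:

```r
set.seed(1)
x <- rnorm(200)
d1 <- density(x, kernel = "biweight")      # quartic aka biweight
d2 <- density(x, kernel = "epanechnikov")
```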
2007 Mar 03
2
format of summary.lm for 2-way ANOVA
Hi, I am performing a two-way ANOVA (2 factors with 4 and 5 levels, respectively). If I'm interpreting the output of summary correctly, then the interaction between both factors is significant: ,---- | ## Two-way ANOVA with possible interaction: | > model1 <- aov(log(y) ~ xForce*xVel, data=mydataset) | | > summary(model1) | Df Sum Sq Mean Sq F value Pr(>F) |
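A self-contained version of the model shape in question (the data below are simulated; `xForce`, `xVel` and `y` merely mimic the post's variables):

```r
# balanced 4x5 two-way design with 3 replicates, simulated response
set.seed(1)
mydataset <- expand.grid(xForce = factor(1:4), xVel = factor(1:5),
                         rep = 1:3)
mydataset$y <- exp(rnorm(nrow(mydataset)))
model1 <- aov(log(y) ~ xForce * xVel, data = mydataset)
summary(model1)  # rows: xForce, xVel, xForce:xVel, Residuals
```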
2005 Dec 04
1
Understanding nonlinear optimization and Rosenbrock's banana valley function?
...c., I thought I'd ask you all for your thoughts. ROSENBROCK'S BANANA VALLEY FUNCTION? Beyond this, I wonder if someone could help me understand the lessons one should take from Rosenbrock's banana valley function: banana <- function(x){ 100*(x[2]-x[1]^2)^2+(1-x[1])^2 } This is a quartic in x[1] and a parabola in x[2] with a unique minimum at x[2]=x[1]=1. Over the range (-1, 2)x(-1,1), it looks like a long, curved, deep, narrow banana-shaped valley. It is a known hard problem in nonlinear regression, but these difficulties don't affect "nlm" or "nlminb" u...
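The function as the post gives it, handed straight to nlm() from the customary hard starting point:

```r
banana <- function(x) 100 * (x[2] - x[1]^2)^2 + (1 - x[1])^2
fit <- nlm(banana, c(-1.2, 1))  # classic starting point in the valley
fit$estimate                    # approximately c(1, 1)
```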
2005 Dec 15
0
bivariate kernel density estimates at point locations ( r ather than at grid locations)
...imation in R that returns the density estimates at the > observed point locations rather than at grid locations. I > have looked at > a number of different routines and they all seem to return > estimates at > grid locations. > > Any type of kernel is fine (i.e., Gaussian, Quartic, etc). > > Thank you for your help! > > Matt Strickland > U.S. Centers for Disease Control and Prevention > > ______________________________________________ > R-help at stat.math.ethz.ch mailing list > https://stat.ethz.ch/mailman/listinfo/r-help > PLEASE do read...
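One way to get density estimates at the observed locations without a grid is to evaluate the kernel sum directly. A minimal sketch, not any package's API (`kde_at_points`, `hx`, `hy` are made-up names; Gaussian rather than quartic for brevity):

```r
kde_at_points <- function(x, y, hx, hy) {
  # Gaussian product kernel, averaged over all points, at each (x_i, y_i)
  vapply(seq_along(x), function(i)
    mean(dnorm(x[i], mean = x, sd = hx) *
         dnorm(y[i], mean = y, sd = hy)),
    numeric(1))
}
set.seed(1)
px <- rnorm(100); py <- rnorm(100)
dens <- kde_at_points(px, py, hx = 0.5, hy = 0.5)
```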
2008 Oct 24
0
Weighted LSCV 2d kernel smoothing
...have been unable to find tools to conduct such analysis in R. My biggest problem is determining the LSCV value and finding a tool that does weighted smoothing, as there are obviously many different packages that will conduct 2d kernel smoothing (kde2d{MASS} - gaussian, (sp)kernel2d in {splancs} -- quartic) and even those that do weighted 1d (density{stats}). Crimestat will conduct the smoothing (weighted or not) if I can determine LSCV values, so getting past that step alone would be excellent. Hlscv{ks} would appear to calculate LSCV for non-weighted datasets once I can parse the example into workin...
2004 Feb 06
1
Savitzky-Golay smoothing -- an R implementation
...iltering method of Savitzky and Golay
# See Numerical Recipes, 1992, Chapter 14.8, for details.
#
# T      = vector of signals to be filtered
#          (the derivative is calculated for each ROW)
# fl     = filter length (for instance fl = 51..151)
# forder = filter order (2 = quadratic filter, 4 = quartic)
# dorder = derivative order (0 = smoothing, 1 = first derivative, etc.)
#
sav.gol <- function(T, fl, forder=4, dorder=0) {
    m <- length(T)
    dorder <- dorder + 1
    # -- calculate filter coefficients --
    fc <- (fl-1)/2    # index: window left and right...
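The coefficient step the snippet breaks off at can be written as an ordinary least-squares fit of a local polynomial. A minimal sketch (`sg_coef` is a made-up name, not the post's sav.gol):

```r
sg_coef <- function(fl, forder = 4, dorder = 0) {
  stopifnot(fl %% 2 == 1)                  # odd window length
  fc <- (fl - 1) / 2
  X <- outer(-fc:fc, 0:forder, "^")        # local polynomial design matrix
  coefs <- solve(t(X) %*% X, t(X))         # (X'X)^{-1} X'
  factorial(dorder) * coefs[dorder + 1, ]  # weights for the d-th derivative
}
w <- sg_coef(11, forder = 4, dorder = 0)
sum(w)  # smoothing weights reproduce constants, so they sum to 1
```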
2009 Jul 03
2
Confidence Limits for a Cross-Product Ratio
Data from Fisher's paper: Confidence Limits for a Cross-Product Ratio.
> y
     col1 col2
[1,]   10    3
[2,]    2   15
> fisher.test(y)
        Fisher's Exact Test for Count Data
data:  y
p-value = 0.0005367
alternative hypothesis: true odds ratio is not equal to 1
95 percent confidence interval:
   2.753438 300.682787
sample estimates:
odds ratio
  21.30533
The crude odds
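The excerpt breaks off at the crude odds ratio; from the table given it is just the cross-product, shown here next to the conditional MLE that fisher.test() reports:

```r
y <- matrix(c(10, 2, 3, 15), nrow = 2)  # the 2x2 table from the post
(10 * 15) / (3 * 2)                     # crude cross-product ratio: 25
fisher.test(y)$estimate                 # conditional MLE, about 21.3
```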
2011 Mar 10
1
Sample or Probability Weights in LM4, NLME (and PLM) package
...n-weighted random-effects regressions and I read about the varFunc objects, but I'm really stuck here. I would like to ask if you could help me briefly. How can I use the weighting function in the lme function for my purpose? My variables are id for each individual, year, age (and age squared up to age quartic) and (to begin with) constant weights for each individual. I would prefer to use yearly changing weights per individual to better capture attrition. Do my weights have to be constant in nlme for every individual (just as in xtreg in Stata)? My main variables: id year income age cross-sectional w...
2011 Dec 02
2
Unexplained behavior of level names when using ordered factors in lm?
Hello dear all, I am unable to understand why when I run the following three lines: set.seed(4254) > a <- data.frame(y = rnorm(40), x=ordered(sample(1:5, 40, T))) > summary(lm(y ~ x, a)) The output I get includes factor levels which are not relevant to what I am actually using: Call: > lm(formula = y ~ x, data = a) > Residuals: > Min 1Q Median 3Q Max >
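Running the post's three lines shows that the puzzling names are orthogonal-polynomial contrast labels (linear through quartic), not extra factor levels:

```r
set.seed(4254)
a <- data.frame(y = rnorm(40), x = ordered(sample(1:5, 40, TRUE)))
rownames(coef(summary(lm(y ~ x, a))))
# "(Intercept)" "x.L" "x.Q" "x.C" "x^4"
```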
2007 Feb 27
2
RDA and trend surface regression
Dear all, I'm performing RDA on plant presence/absence data, constrained by geographical locations. I'd like to constrain the RDA by the "extended matrix of geographical coordinates" (i.e. the matrix of geographical coordinates completed by adding all terms of a cubic trend surface regression). This is the command I use (package vegan): >rda(Helling ~
2013 Jan 03
1
interpreting results of regression using ordinal predictors in R
Dear friends, Being very new to this, I was wondering if I could get some pointers and guidance to interpreting the results of performing a linear regression with ordinal predictors in R. Here is a simple, toy example: y <- c(-0.11, -0.49, -1.10, 0.08, 0.31, -1.21, -0.05, -0.40, -0.01, -0.12, 0.55, 1.34, 1.00, -0.31, -0.73, -1.68, 0.38, 1.22, -1.11, -0.20) x <-
2012 Oct 22
4
Help with applying a function to all possible 2x2 submatrices
Hi all, I'm working with a large data set (on the order of 300x300) and trying to apply a function which compares the elements of all possible 2x2 submatrices. There are rc(r-1)(c-1)/4 such submatrices (choosing 2 of the r rows and 2 of the c columns), so obviously the naive method of looping through the rows and columns is computationally infeasible for my data set: for(i in 1:(nrow(data)-1)) { for(j in (i+1):nrow(data)) { for (m
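combn() plus apply() enumerates the row and column pairs without explicit loops. A small sketch (`f` below is a made-up stand-in for the poster's comparison function; note this still visits every submatrix, so it tidies the bookkeeping rather than the asymptotics):

```r
data <- matrix(1:12, nrow = 3)
rows <- combn(nrow(data), 2)  # all pairs of rows
cols <- combn(ncol(data), 2)  # all pairs of columns
f <- function(m) det(m)       # hypothetical 2x2 comparison function
res <- apply(rows, 2, function(r)
  apply(cols, 2, function(cc) f(data[r, cc])))
dim(res)  # choose(ncol, 2) x choose(nrow, 2)
```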