Displaying 20 results from an estimated 40000 matches similar to: "2d loess question"
2011 Oct 07
1
loess question
Hi All,
I am trying to use loess to smooth a 2D image and also obtain the
standard error for every pixel, but the standard errors I get do not
make sense. For example, running the following:
library(stats)
x <- array(1:100, dim = c(100, 100))          # x[i, j] = i (row coordinate)
y <- t(x)                                     # y[i, j] = j (column coordinate)
v <- exp(-((x-50)^2+(y-50)^2)/30^2)           # smooth Gaussian bump (the signal)
s <- v * 0.02                                 # noise sd proportional to the signal
g_noise <- rnorm(10000, mean = 0, sd = s)     # one noise value per pixel
f <- v + g_noise                              # noisy image to be smoothed
f.loess <-
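The call above is cut off; a minimal sketch of one way to continue (my guess,
not the poster's code): fit a 2-D loess to the pixel values and ask predict()
for pointwise standard errors on a coarser sub-grid, since se = TRUE over all
10,000 pixels is expensive.
d <- data.frame(x = as.vector(x), y = as.vector(y), f = as.vector(f))
f.loess <- loess(f ~ x + y, data = d, span = 0.1)
sub <- expand.grid(x = seq(1, 100, by = 5), y = seq(1, 100, by = 5))
p <- predict(f.loess, newdata = sub, se = TRUE)
se_img <- matrix(p$se.fit, nrow = 20)   # 20 x 20 grid of pointwise standard errors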
2012 May 03
1
cannot calculate standard estimate with predict on loess
Hi,
For some reason I have been unable to use the predict function when I
desire the standard error to be calculated too. For example, when I try
the following:
l <- loess(d ~ x + y, span = span, se = TRUE)
p <- predict(l, se = TRUE)
I get the following error message:
Error in vector("double", length) : vector size cannot be NA
In addition: Warning message:
In N * M1 : NAs produced by
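For what it's worth, a small self-contained check (toy data of my own, not the
poster's) showing the intended use of predict(..., se = TRUE) on a modest fit;
the "NAs produced by integer overflow" warning usually points at a fit or
prediction set too large for the internal standard-error work arrays:
set.seed(1)
dat <- data.frame(x = runif(500), y = runif(500))
dat$d <- sin(2 * pi * dat$x) + cos(2 * pi * dat$y) + rnorm(500, sd = 0.1)
l <- loess(d ~ x + y, data = dat, span = 0.3)
p <- predict(l, newdata = dat, se = TRUE)
str(p$se.fit)   # one standard error per prediction point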
2011 Jun 16
0
Update: Is there an implementation of loess with more than 3 parametric predictors or a trick to a similar effect?
Dear R developers!
Considering I got no response or comments in the general r-help forum
so far, perhaps my question is actually better suited for this list? I
have added some more hopefully relevant technical details to my
original post (edited below).
Any comments gratefully received!
Best regards,
David Kreil.
----------
Dear R experts,
I have a problem that is related to the question
2010 Nov 10
1
standardized/studentized residuals with loess
Hi all,
I'm trying to apply loess regression to my data and then use the fitted
model to get the standardized/studentized residuals. I understood that for
linear regression (lm) there are functions to do that:
fit1 <- lm(y ~ x)
stdres.fit1 <- rstandard(fit1)
studres.fit1 <- rstudent(fit1)
I was wondering if there is an equally simple way to get
the standardized/studentized residuals for a
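There is no rstandard()/rstudent() method for loess, as far as I know; a rough
workaround sketch is to scale the residuals by the fit's residual standard
error, fit$s, which ignores pointwise leverage and is therefore only approximate:
fit.lo <- loess(y ~ x)                       # assumes the same x and y as above
approx.stdres <- residuals(fit.lo) / fit.lo$s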
2011 Feb 07
1
tri-cube and gaussian weights in loess
From what I understand, loess in R uses the standard tri-cube weight function.
SAS/INSIGHT offers loess with Gaussian weights. Is there a function in R
that does the same?
Also, can anyone offer any references comparing properties between tri-cube
and Gaussian weights in LOESS?
Thanks. - André
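For reference, a small sketch of the two weight functions being compared (the
tri-cube kernel used by loess and a Gaussian kernel), plotted side by side:
tricube <- function(z) ifelse(abs(z) < 1, (1 - abs(z)^3)^3, 0)
gauss   <- function(z) exp(-z^2 / 2)
curve(tricube(x), -1.5, 1.5, ylab = "weight")
curve(gauss(x), -1.5, 1.5, add = TRUE, lty = 2)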
2011 Jun 17
0
the c implementation of loess
Hi All,
I am trying to trace the origin of the current loess implementation in
R. The reference mentions that Prof Ripley based it on the 1998 version
of dloess. When I look at dloess in http://www.netlib.org/a, the file
"changes" mentions dloess was made available in 1992 and that a memory
leak was plugged in 1996 with no mention of 1998. Is there another
version available?
2008 Jun 03
1
'asymmetric span' for 2D loess?
Hello,
I am interested in performing a 2D loess smooth on microarray data, i.e.
log2 ratios on a 2D grid, using different spans in the horizontal and
vertical directions (the immediate reason being that replicate spots are
laid out in the horizontal direction). Is it possible to do this in R?
Functions like loess(stats) seem to apply the same span for all
predictors, which carries over to
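A possible workaround sketch (not a documented feature): loess applies one span
to all predictors, but rescaling one coordinate before fitting, with
normalize = FALSE, stretches the neighbourhood in that direction. Here 'dat'
with columns row, col and M (the log2 ratio), and the stretch factor k = 4, are
my assumptions:
k <- 4    # smooth roughly 4x more widely along 'col'
fit <- loess(M ~ row + I(col / k), data = dat, span = 0.3, normalize = FALSE)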
2005 Nov 17
3
loess: choose span to minimize AIC?
Is there an R implementation of a scheme for automatic smoothing
parameter selection with loess, e.g., by minimizing one of the AIC/GCV
statistics discussed by Hurvich, Simonoff & Tsai (1998)?
Below is a function that calculates the relevant values of AICC,
AICC1 and GCV --- I think, because I had to guess from the names of the
components returned in a loess object.
I guess I could use
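A hedged sketch of the AICc criterion from Hurvich, Simonoff & Tsai (1998) in
one commonly used form, with the span then chosen by a one-dimensional search
(the cars data here is just a stand-in):
aicc.loess <- function(fit) {
  # AICc = log(sigma^2) + 1 + 2*(trace(L) + 1)/(n - trace(L) - 2)
  n      <- fit$n
  traceL <- fit$trace.hat
  sigma2 <- sum(residuals(fit)^2) / n
  log(sigma2) + 1 + 2 * (traceL + 1) / (n - traceL - 2)
}
best.span <- optimize(function(sp) aicc.loess(loess(dist ~ speed, cars, span = sp)),
                      interval = c(0.2, 0.9))$minimum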
2012 Mar 24
0
Loess CI
I am trying to (semi) calculate the confidence intervals for a loess smoother
(function: loess()), but have been thus far unsuccessful.
The CI for the loess predicted values, yhat, is apparently
yhat +- t * s * sqrt(w^2), where s is the residual sum of squares and w is the
weight function.
Correct me if I'm wrong, but R uses the tricube function (1 - abs(z)^3)^3,
where z = (x - xi)/h, where h
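For comparison, pointwise confidence bands can be obtained from
predict(..., se = TRUE); a sketch on the built-in cars data, using a t quantile
on the fit's residual degrees of freedom:
fit <- loess(dist ~ speed, data = cars)
p   <- predict(fit, se = TRUE)
tq  <- qt(0.975, p$df)
upper <- p$fit + tq * p$se.fit
lower <- p$fit - tq * p$se.fit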
2006 Jul 07
0
User Error (was LOESS (PR#9064))
Please do as we ask (repeatedly) and study the help page before posting.
'family' is a separate argument, not part of loess.control, as the help
page correctly documents. If you use
cars.lo2 <- loess(dist ~ speed, cars, family = "symmetric",
                  control = loess.control(surface = "direct", iterations = 20))
cars.lo2$pars$iterations
it prints *20*, as it is
2006 Jul 07
1
LOESS (PR#9064)
Hello,
I found a little BUG in loess <stats>. It does not receive the iterations
parameter.
It can be debugged in the following way:
THIS IS AN EXCERPT FROM THE CODE:
....
fit <- simpleLoess(y, x, w, span, degree, parametric, drop.square,
                   normalize, control$statistics, control$surface, control$cell,
                   iterations, control$trace.hat)
Replace argument iterations with
2010 Oct 26
2
anomalies with the loess() function
Hello Masters,
I run the loess() function to obtain locally weighted regressions, since
lowess() can't handle NAs, but it doesn't improve my situation much;
actually, the behaviour of loess() leaves me rather puzzled.
I attach my simple experiment below
#------SCRIPT----------------------------------------------
#I explore the functionalities of lowess() & loess()
#because I have
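The script above is cut off; on the NA point specifically, a small sketch with
toy data of my own showing that loess() can be pointed at na.exclude:
d <- data.frame(x = 1:20, y = c(rnorm(19), NA))        # one missing response
fit <- loess(y ~ x, data = d, na.action = na.exclude)
predict(fit, newdata = data.frame(x = 1:20))           # a prediction for every x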
2005 Aug 18
1
display of a loess fitted surface
Good morning,
I am Marta Colombo, a student at Politecnico, Milan. I am studying local regression models and I am using the loess function. My problem is that when I have a loess object I don't know how to display the fitted surface: in S, when you have a loess object, you can see it by writing plot(object), but in R this doesn't work. I'd also like to know if there is something like the
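One way to display a fitted loess surface, sketched here with toy data of my
own: evaluate it on a regular grid with predict() and pass the matrix to
contour() or persp():
set.seed(2)
dat <- data.frame(x = runif(200), y = runif(200))
dat$z <- sin(pi * dat$x) * cos(pi * dat$y) + rnorm(200, sd = 0.05)
fit  <- loess(z ~ x + y, data = dat)
gx   <- seq(min(dat$x), max(dat$x), length.out = 50)
gy   <- seq(min(dat$y), max(dat$y), length.out = 50)
zhat <- matrix(predict(fit, expand.grid(x = gx, y = gy)), nrow = 50)
contour(gx, gy, zhat)      # or persp(gx, gy, zhat) for a 3-D view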
2001 Feb 21
1
Gradient field from loess
I have a two-dimensional loess fit, and need to calculate the
gradient field from it. Even after looking at loess.c and loess.f,
I don't understand the meaning of the returned polynomial coefficients.
Or is the brute force method of using a tangential approx
to the fitted values the way to go?
Dieter Menne
---------------------------------------
Dr. Dieter Menne
Biomed Software
72074
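Along the "brute force" line mentioned above, a sketch of a central-difference
gradient taken from predict(); the fitted object 'fit' and predictor names x
and y are my assumptions, and surface = "direct" in loess.control() may give
cleaner differences than the default interpolated surface:
grad.loess <- function(fit, x, y, h = 1e-3) {
  dfx <- (predict(fit, data.frame(x = x + h, y = y)) -
          predict(fit, data.frame(x = x - h, y = y))) / (2 * h)
  dfy <- (predict(fit, data.frame(x = x, y = y + h)) -
          predict(fit, data.frame(x = x, y = y - h))) / (2 * h)
  cbind(dx = dfx, dy = dfy)
}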
2004 Apr 09
1
loess' robustness weights in loess
hi!
I want to change the "robustness weights" used by loess. These
are described on page 316 of Chambers and Hastie's "Statistical Models in S"
book as
r_i = B(e_i, 6m)
where B is Tukey's biweight function, e_i are the residuals, and m is the
median of the absolute residuals (their median distance from 0). I want to
change 6m to, say, 3m.
Is there a way to do this? I can't
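As far as I know the 6m rule is hard-coded in the underlying C/Fortran code, so
the change can only be imitated by hand; a sketch of a single hand-rolled
robustness iteration using B(e, 3m) as prior weights ('dat' with columns x and
y is my assumption):
biweight <- function(e, c) ifelse(abs(e) < c, (1 - (e / c)^2)^2, 0)  # Tukey's biweight
fit0 <- loess(y ~ x, data = dat)                  # initial (least-squares) fit
m    <- median(abs(residuals(fit0)))              # median absolute residual
rw   <- biweight(residuals(fit0), 3 * m)          # robustness weights with 3m, not 6m
fit1 <- loess(y ~ x, data = dat, weights = pmax(rw, 1e-6))  # refit; tiny floor avoids zero weights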
2012 Mar 10
1
How to fit a line through the "Mountain crest", i.e., through the highest density of points - in a "loess-like" fashion.
Hi,
I'm trying to normalize data by fitting a line through the highest density
of points (in a 2D plot).
In other words, if you visualize the data as a density plot, the fit I'm
trying to achieve is the line that goes through the "crest" of the mountain.
This is similar to, yet different from, what LOESS does. I've used loess
before, but it does not do exactly that, as it takes
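One hedged idea (this is not what loess does): estimate the mode of y within
x-bins and then fit a smooth through those modes; 'dat' with columns x and y,
and the bin count, are my assumptions:
mode1 <- function(v) if (length(v) < 2) median(v) else { d <- density(v); d$x[which.max(d$y)] }
bins  <- cut(dat$x, breaks = 50)
crest <- data.frame(x = tapply(dat$x, bins, median),
                    y = tapply(dat$y, bins, mode1))
crest <- crest[complete.cases(crest), ]            # drop empty bins
crest.fit <- loess(y ~ x, data = crest, span = 0.5)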
2012 Apr 30
1
Help with loess "Standard Error of the Residuals"
Dear All
I'm having trouble working out what exactly loess means by its "Standard Error of the Residuals", denoted s,
and in particular what happens when the weights argument is invoked.
For example, if the weights are all = 1, then s^2 is nearly sum sq res / (n - 1 - 'equiv num params').
If the weights are all k, then s is proportional to k.
If the weights are unequal, I
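A hedged numerical check of one reading of the help page (my understanding, not
a quote): s^2 as the weighted residual sum of squares divided by the equivalent
residual degrees of freedom stored in fit$one.delta:
fit <- loess(dist ~ speed, data = cars)
c(s = fit$s,
  guess = sqrt(sum(fit$weights * residuals(fit)^2) / fit$one.delta))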
2012 Feb 10
0
a) t-tests on loess splines; b) linear models, type II SS for unbalanced ANOVA
Dear all,
I have some questions regarding the validity and implementation of
statistical tests based on linear models and loess. I've searched the
R-help archives and found several informative threads related to my
questions, but there are still a few issues I'm not clear about. I'd be
grateful for guidance.
Background and data set:
I wish to compare the growth and metabolism
2009 Jul 01
2
getOptions("max.print") in R
I am typing the following on the command prompt:
>variab = read.csv(file.choose(), header=T)
>variab
It lists 900,000 (the total number of observations in "variab") minus
797,124 observations and then prints the following message:
[ reached getOption("max.print") -- omitted 797124 entries ]
Is there a way to see the entire data set, i.e. all 900,000 obs, and
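A simple sketch of two options: raise the limit (max.print counts individual
entries, i.e. rows times columns), or skip the console and write the data out
(the file name below is just a placeholder):
options(max.print = 5e6)
variab
write.csv(variab, "variab_full.csv", row.names = FALSE)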
2009 Jul 11
2
Heckman Selection Model/Inverse Mills Ratio
I have so far used the following command
glm(formula = s ~ age + gender + gemedu + gemhinc + es_gdppc +
imf_pop + estbbo_m, family = binomial(link = "probit"))
My questions are:
1. How do I discard the non-significant selection variables (one of the
seven variables above is non-significant) and calculate the Inverse Mills
Ratio from the significant variables?
2. I need the inverse
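On the inverse Mills ratio itself, a hedged sketch based on the probit step
above ('sel' is my name for the glm fit, an assumption):
xb  <- predict(sel, type = "link")     # linear predictor x'beta from the probit
imr <- dnorm(xb) / pnorm(xb)           # inverse Mills ratio for each observation
# imr can then enter the second-stage (outcome) regression as an extra regressor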