similar to: asymptotic convergence of savitzky-golay?

Displaying 20 results from an estimated 2000 matches similar to: "asymptotic convergence of savitzky-golay?"

2004 Feb 06
1
Savitzky-Golay smoothing -- an R implementation
As the request for the Savitzky-Golay algorithm in R has come up several times, I include here my implementation, based on code written for Matlab. Savitzky-Golay uses the pseudo-inverse pinv() of a matrix. There is a 'generalized inverse' ginv() in the MASS package, but I use a simpler form because I didn't want to 'require' MASS every time I apply Savitzky-Golay.
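A minimal sketch of the approach the post describes, not the poster's actual code: the pseudo-inverse of the local Vandermonde matrix gives the smoothing weights, and the fitted polynomial evaluated at the window centre is the smoothed value.

# Savitzky-Golay weights from the least-squares pseudo-inverse (a sketch).
# k = half window width, p = polynomial degree.
sav.gol.coef <- function(k, p) {
  z <- -k:k                               # positions within the window
  A <- outer(z, 0:p, "^")                 # Vandermonde design matrix
  solve(t(A) %*% A, t(A))[1, ]            # first row of (A'A)^{-1} A' = smoothing weights
}

sg.smooth <- function(y, k = 5, p = 2) {
  w <- sav.gol.coef(k, p)
  n <- length(y)
  out <- y                                # end points are left unsmoothed here
  for (i in (k + 1):(n - k)) out[i] <- sum(w * y[(i - k):(i + k)])
  out
}

y <- sin(seq(0, 4 * pi, length = 200)) + rnorm(200, sd = 0.2)
plot(y, col = "grey"); lines(sg.smooth(y), lwd = 2)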
2004 Feb 05
2
Savitzky-Golay smoothing for reflectance data
I got a question from a fellow PhD student who works with spectrum analysis in Excel; he now has lots of spectra that need to be smoothed, which would be nice to be able to do in batch. Is there an R package that can do Savitzky-Golay smoothing for reflectance spectral data, or a function that does something similar? -- Henrik Andersson
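One possible batch route, assuming the 'signal' package and a numeric matrix 'spectra' holding one reflectance spectrum per row (both names are placeholders, not from the thread):

library(signal)
spectra  <- matrix(rnorm(50 * 200), nrow = 50)               # placeholder spectra
smoothed <- t(apply(spectra, 1, sgolayfilt, p = 2, n = 11))  # p = degree, n = odd window length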
2011 Feb 22
2
Regarding Savitzky-Golay Smoothing Filter
Hi, when we use the sav_gol command in R, it shows an error which says "error in as.matrix". We've downloaded the necessary packages. Kindly help us with this issue. If there is any other function to perform Savitzky-Golay smoothing in R, please let me know. With regards, Reynolds
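One alternative worth trying, assuming the 'prospectr' package is installed (argument names may differ between versions); for a single numeric vector, signal::sgolayfilt() is another option:

library(prospectr)
X    <- matrix(rnorm(20 * 100), nrow = 20)      # placeholder spectra, one per row
X_sg <- savitzkyGolay(X, m = 0, p = 2, w = 11)  # m = derivative order, p = degree, w = odd window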
2003 Mar 21
1
Savitzky-Golay Derivative and Smoothing
If I'm not mistaken, that's a sort of local polynomial fit with even degree and fixed bandwidth (based on my own interpretation of the description in Numerical Recipes). You can do that with functions in the KernSmooth package. HTH, Andy
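A rough illustration of that local-polynomial view with KernSmooth, a sketch on simulated data rather than a drop-in Savitzky-Golay replacement:

library(KernSmooth)
x <- seq(0, 4 * pi, length = 400)
y <- sin(x) + rnorm(400, sd = 0.2)
h <- dpill(x, y)                                            # plug-in bandwidth
fit  <- locpoly(x, y, degree = 2, bandwidth = h)            # smoothed curve
dfit <- locpoly(x, y, drv = 1, degree = 2, bandwidth = h)   # first derivative estimate
plot(x, y, col = "grey"); lines(fit, lwd = 2)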
2004 Oct 04
2
Weighted Savitzky-Golay?
Hi, does anyone know how to use weights and generate error bounds for Savitzky-Golay? I have a (smallish) set of equally spaced points y, each with a known error, and would like to smooth them with S-G, but in a way that takes into account the errors they already have and constructs new error bounds around them that reflect both the errors they had at the beginning and the errors they get as a result
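One way to fold known measurement errors into the local fit; this is only a sketch of a weighted window regression (weights 1/sigma^2, standard error of the fitted centre value), not an established weighted Savitzky-Golay routine, and the reported se comes from each window's estimated residual scale rather than from the supplied sigmas directly.

wsg <- function(y, sigma, k = 5, p = 2) {
  n <- length(y)
  fit <- se <- rep(NA_real_, n)
  z <- -k:k
  for (i in (k + 1):(n - k)) {
    win <- (i - k):(i + k)
    m  <- lm(y[win] ~ poly(z, p, raw = TRUE), weights = 1 / sigma[win]^2)
    pr <- predict(m, newdata = data.frame(z = 0), se.fit = TRUE)
    fit[i] <- pr$fit                       # smoothed value at the window centre
    se[i]  <- pr$se.fit                    # its standard error
  }
  list(fit = fit, se = se)
}

y     <- sin(seq(0, 4 * pi, length = 200)) + rnorm(200, sd = 0.1)
sigma <- rep(0.1, 200)
res   <- wsg(y, sigma)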
2004 Jan 28
0
savitzky-golay derivatives?
Dear all, sorry if this is slightly off track as far as R is concerned, but I have been using the Savitzky-Golay filter to estimate some derivatives of interest. I am wondering, however, if anyone has seen anything in the literature (or has any ideas) on how these estimates perform asymptotically. Does anyone know what the rate of convergence is for these as the sample size increases?
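No pointer to the literature here, but a quick empirical check is easy to run. The sketch below (assuming signal::sgolayfilt, with the window length held fixed at 21 points, so the effective bandwidth shrinks as N grows) estimates the first derivative of a noisy sine and tracks how the interior error shrinks with N.

library(signal)
Ns  <- c(100, 200, 400, 800, 1600)
err <- sapply(Ns, function(N) {
  x  <- seq(0, 2 * pi, length = N)
  dx <- x[2] - x[1]
  y  <- sin(x) + rnorm(N, sd = 0.05)
  d  <- sgolayfilt(y, p = 3, n = 21, m = 1, ts = dx)   # first-derivative filter
  i  <- 30:(N - 30)                                    # drop boundary effects
  sqrt(mean((d[i] - cos(x[i]))^2))                     # RMSE against the true derivative
})
coef(lm(log(err) ~ log(Ns)))[2]                        # rough empirical rate exponent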
2009 Nov 18
0
Optimal parameters for Savitzky-Golay smoothing filter (loop)
Hi, I am running a Savitzky-Golay smoothing filter (http://tolstoy.newcastle.edu.au/R/help/04/02/0385.html) for variables in my dataset of dimension 272 x 90. I managed to run the code for individual variables in the dataset and then combine the results into a single dataset. My novice attempt at this task is shown below: csg <- NULL; for (i in 1:ncol(data.all)) {
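A compact way to smooth every column at once, shown here with signal::sgolayfilt() standing in for the sav.gol code from the linked post, and a placeholder in place of the real 272 x 90 data:

library(signal)
data.all <- as.data.frame(matrix(rnorm(272 * 90), nrow = 272))     # placeholder data
csg <- as.data.frame(lapply(data.all, sgolayfilt, p = 2, n = 11))  # one smoothed column per variable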
2008 Jan 28
1
Integer vs numeric
Hi list. I do not understand the philosophy behind numeric and integer: 1 is numeric (which I find surprising), 2 is numeric, but 1:2 is integer. Why is that? Christophe
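The distinction is easy to see directly: a bare literal such as 1 is stored as a double ("numeric"), while the L suffix and the colon operator produce integers, and both count as numeric for is.numeric().

typeof(1)        # "double"
typeof(1L)       # "integer"
typeof(1:2)      # "integer"
is.numeric(1:2)  # TRUE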
2008 Oct 08
1
Fw: MLE
I made one typo in my previous mail. May I ask one statistics-related question, please? I have one query on MLE itself. Its properties say that, for large sample sizes, it is normally distributed [i.e. asymptotically normal]. On the other hand, it is "Consistent" as well. My doubt is: how can these two asymptotic properties exist simultaneously? If it is consistent, then asymptotically it
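A small simulation may make the compatibility of the two properties concrete (using the sample mean, which is the MLE of a normal mean): the estimator itself collapses to the truth as N grows, while the rescaled error sqrt(N) * (theta.hat - theta) keeps a fixed normal spread.

set.seed(1)
theta <- 2
for (N in c(100, 1000, 10000)) {
  est <- replicate(2000, mean(rnorm(N, mean = theta)))   # 2000 MLEs at sample size N
  cat(sprintf("N = %6d   sd(theta.hat) = %.4f   sd(sqrt(N)*(theta.hat - theta)) = %.3f\n",
              N, sd(est), sd(sqrt(N) * (est - theta))))
}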
2010 Nov 23
1
Factor analysis and cfa with asymptotically distributed data
I have friendship data which is strongly skewed, so it doesn't make sense to use maximum likelihood methods for fa and cfa. But I couldn't find any function for doing a factor analysis with asymptotically distributed data. Only apca(), but there is no possibility to allow for factor correlations. The same problem exists with sem(): I couldn't get any solutions for my model because of the
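One option not mentioned in the thread, and purely a suggestion on my part: the 'lavaan' package fits CFA models with an asymptotically distribution-free estimator (estimator = "WLS") and lets factors correlate by default. A sketch with placeholder data and hypothetical indicator names x1..x6:

library(lavaan)
mydata <- as.data.frame(matrix(rnorm(200 * 6), ncol = 6,
                               dimnames = list(NULL, paste0("x", 1:6))))  # placeholder data
model <- '
  f1 =~ x1 + x2 + x3
  f2 =~ x4 + x5 + x6
'
fit <- cfa(model, data = mydata, estimator = "WLS")   # ADF-type estimation
summary(fit, standardized = TRUE)                     # includes the f1-f2 correlation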
2013 Jan 12
2
Getting the R squared value in asymptotic regression model
Please help with getting the R squared value in an asymptotic regression model. I use the code below: model1 <- nls(GN1 ~ SSasymp(nrate, a, b, c), data = data.1) and R produced the model coefficients without the R squared value. -- Ahmed M. Attia Research Assistant Dept. of Soil & Crop Sciences Texas A&M University ahmed.attia@ag.tamu.edu <ahmedatia@zu.edu.eg>
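summary.nls() does not report an R squared, but a pseudo R squared can be computed from the residuals of the fit in the post (interpret it with care for nonlinear models):

rss <- sum(residuals(model1)^2)                  # residual sum of squares
tss <- sum((data.1$GN1 - mean(data.1$GN1))^2)    # total sum of squares
pseudo.R2 <- 1 - rss / tss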
2004 Aug 04
0
Asymptotic Regression Model
Hi listers, I have some data (see attachment), and I fitted it to the asymptotic regression model with NLSstAsymptotic(xy). But I want to know the significance of these fits. How can I accomplish this using R? Can anyone suggest some theoretical reading on this subject? Thanks, Miguel -- Miguel Figueiredo IT student / Marine Biologist
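For the significance of the fitted parameters, the usual route is summary() and confint() on an nls() fit of the asymptotic model; a sketch with simulated data standing in for the attachment:

set.seed(1)
x <- 1:50
y <- 10 * (1 - exp(-0.1 * x)) + rnorm(50, sd = 0.5)
fit <- nls(y ~ SSasymp(x, Asym, R0, lrc))
summary(fit)   # t tests for each parameter
confint(fit)   # profile likelihood confidence intervals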
2012 Oct 03
2
[LLVMdev] Does LLVM optimize recursive call?
> From: "Journeyer J. Joh" <oosaprogrammer at gmail.com> > I have a simple question about LLVM. > > I learned that we need to use iterations rather than recursions in C programming. That is because recursion is expensive: it can easily consume all the stack given to a program, and the call/return consumes much more time. > > But I've read a
2014 Nov 28
5
[LLVMdev] [RFC] Removing BBVectorize?
Hi Everyone, I propose that we remove BBVectorize from trunk. Here's why: - It never made it from "interesting experiment" to "production quality" (it is not part of any in-tree optimization pipeline). - We now have an SLP vectorizer that we do use in production, and have for some time. - BBVectorize otherwise needs refactoring, and the implementation has lots of
2023 Dec 19
1
Partial matching performance in data frame rownames using [
Hi Hilmar and Ivan, I have used your code examples to write a blog post about this topic, which has figures that show the asymptotic time complexity of the various approaches, https://tdhock.github.io/blog/2023/df-partial-match/ The asymptotic complexity of partial matching appears to be quadratic O(N^2) whereas the other approaches are asymptotically faster: linear O(N) or log-linear O(N log N).
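A condensed version of the comparison, with hypothetical sizes (the blog post has the real measurements): character row indexing in [.data.frame goes through partial matching, while match() does an exact hash-based lookup.

N  <- 10000
df <- data.frame(x = rnorm(N), row.names = sprintf("row%06d", seq_len(N)))
q  <- sample(rownames(df), 1000)
system.time(df[q, , drop = FALSE])                        # partial matching of row names
system.time(df[match(q, rownames(df)), , drop = FALSE])   # exact matching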
2007 Jan 30
2
any implementations for adaptive modeling of time series?
Hello, my noisy time series represents a fading signal comprising long enough parts with a simple trend inside each part. The transition from one part to another is always non-smooth and very sharp/acute. In other words, I have a piecewise polynomial noisy curve asymptotically converging to a biased constant; the points between pieces are non-differentiable. I am looking for
2010 Jun 06
1
Robust Asymptotic Statistics (RobASt)
Hi all, other than http://www.stamats.de/F2006.r, are there other good, simple examples out there of using the ROptRegTS package (part of the RobASt project)? I'm hoping to plug it in for multivariate regression. Or is this not a good idea? I'm just trying to find out how it compares to rlm, lts, glm, etc. Hopefully this makes sense; I'm new to the world of statistics and R. Thanks! St0x
2006 Mar 05
1
glm gives t test sometimes, z test others. Why?
I just ran example(glm) and happened to notice that models based on the Gamma distribution give a t test, while the Poisson models give a z test. Why? Both are b/s.e., aren't they? I can't find documentation supporting the claim that the distribution is more like t in one case than another, except in the Gaussian case (where it really is t). Aren't all of the others approximations
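The difference comes from the dispersion parameter: poisson and binomial families have a fixed dispersion of 1, so summary.glm() reports z statistics, while gaussian, Gamma and quasi- families estimate the dispersion and get t statistics, as with lm(). A quick way to see both:

set.seed(1)
d <- data.frame(x = 1:20)
d$counts <- rpois(20, lambda = 5)
d$pos    <- d$counts + 1                               # strictly positive response for Gamma
summary(glm(counts ~ x, family = poisson, data = d))   # z value / Pr(>|z|)
summary(glm(pos ~ x, family = Gamma, data = d))        # t value / Pr(>|t|)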
2008 Oct 08
0
MLE
May I ask one statistics-related question, please? I have one query on MLE itself. Its properties say that, for large sample sizes, it is normally distributed [i.e. asymptotically normal]. On the other hand, it is "Efficient" as well. My doubt is: how can these two asymptotic properties exist simultaneously? If it is consistent, then asymptotically it should collapse to the "truth", i.e. for large
2012 Oct 03
3
[LLVMdev] Does LLVM optimize recursive call?
On Wed, Oct 3, 2012 at 10:15 AM, Matthieu Moy <Matthieu.Moy at grenoble-inp.fr> wrote: > Preston Briggs <preston.briggs at gmail.com> writes: >> Think about costs asymptotically; that's what matters. Calls and >> returns require constant time, just like addition and multiplication. > > Constant time, but not necessarily constant memory. > > Deep recursion