similar to: iterative regressions, adding a new line of data each time

Displaying 20 results from an estimated 6000 matches similar to: "iterative regressions, adding a new line of data each time"

2011 Dec 01
3
Assign name to object for each iteration in a loop.
Hi R-users, I'm trying to produce decompositions of a multiple time series, grouped by a factor (called "area"). I'm modifying the code in the STLperArea function of package ndvits, as this function only produces stl plots; it does not return the underlying data. I want to extract the trend component of each decomposition (x$time.series[, "trend"]), assign a name
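A minimal sketch of one way to do this without assign(): split by the grouping factor, run stl() per group, and keep each trend in a named list. The data, column names and monthly frequency below are made-up placeholders, not the ndvits internals:

    set.seed(1)
    dat <- data.frame(area  = rep(c("north", "south"), each = 48),
                      value = rnorm(96) + rep(sin((1:48) / 6), 2))

    trends <- lapply(split(dat$value, dat$area), function(v) {
      x <- stl(ts(v, frequency = 12), s.window = "periodic")
      x$time.series[, "trend"]          # keep just the trend component
    })
    # trends$north, trends$south, ... : one trend series per area, no assign() needed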
2009 Apr 20
1
R-Squared with biglm?
I've been working with a rather large data set (~10M rows), and while biglm works beautifully for generating coefficients, it does not report an R-squared. It does report RSS. Any idea on how one could coax an R-squared out of biglm? Thanks in advance for any help with this! Bryan Lim Lecturer Department of Finance University of Melbourne
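One way to coax an R-squared out, assuming your installed biglm exposes the RSS through deviance(): combine it with the total sum of squares of the response, which you can compute (or accumulate chunk by chunk) yourself. A small sketch on built-in data:

    library(biglm)
    fit <- biglm(mpg ~ wt + hp, data = mtcars)
    rss <- deviance(fit)                              # biglm's reported residual SS
    tss <- sum((mtcars$mpg - mean(mtcars$mpg))^2)     # total SS of the response
    r2  <- 1 - rss / tss
    r2
    # For out-of-memory data, accumulate n, sum(y) and sum(y^2) chunk by chunk
    # and use tss = sum(y^2) - n * mean(y)^2.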
2002 Oct 31
1
Re: gregmisc version 0.7.3 now available
Dear Greg, Thanks for the new release. The decomposition of the SSQ is just what I need! Regards, Martin. Martin Hoyle, School of Life and Environmental Sciences, University of Nottingham, University Park, Nottingham, NG7 2RD, UK Webpage: http://myprofile.cos.com/martinhoyle >>> gregory_r_warnes at groton.pfizer.com 10/30/02 07:16PM >>> Version 0.7.3 of the gregmisc package
2010 Feb 11
1
Blinder-Oaxaca decompositions
I'm looking for a routine in R to do Blinder-Oaxaca (and related) decompositions. A very nice one has been written (by Jann) for Stata (and I'm evaluating whether I can switch over to R). I'm having a hard time finding any reference in R documentation to this pretty ubiquitous tool (in labour economics) for decomposing differences between two groups into differences in means and
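For reference, the twofold decomposition itself is easy to do by hand from two group-wise lm() fits. The sketch below uses simulated wage data (all variable names are placeholders), with group B's coefficients as the reference:

    set.seed(42)
    dat <- data.frame(group = rep(c("A", "B"), each = 200),
                      educ  = rnorm(400, 12, 2),
                      exper = rnorm(400, 10, 5))
    dat$wage <- 10 + 1.5 * dat$educ + 0.3 * dat$exper +
                ifelse(dat$group == "A", 2, 0) + rnorm(400)

    fitA <- lm(wage ~ educ + exper, data = subset(dat, group == "A"))
    fitB <- lm(wage ~ educ + exper, data = subset(dat, group == "B"))
    xA <- colMeans(model.matrix(fitA))                # group means incl. intercept
    xB <- colMeans(model.matrix(fitB))

    gap         <- sum(xA * coef(fitA)) - sum(xB * coef(fitB))  # mean(wage_A) - mean(wage_B)
    explained   <- sum((xA - xB) * coef(fitB))        # differences in characteristics
    unexplained <- sum(xA * (coef(fitA) - coef(fitB)))# differences in coefficients
    c(gap = gap, explained = explained, unexplained = unexplained)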
2004 Nov 05
1
fast partial spectral decompositions.
Hello, I want to compute the top k eigenvalues+eigenvectors of a (large) real symmetric matrix. Since it doesn't look like any top-level R function does this, I'll call LAPACK from a C shlib and then use .Call. The only LAPACK function I see to do this in R_ext/Lapack.h is dsyevx. However, I know that in LAPACK dsyevr can also return a partial eigendecomposition. Why is dsyevr not
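For moderate sizes, the base-R baseline such a partial routine would replace is simply a full eigen() call from which the top k pairs are taken; a small sketch:

    set.seed(1)
    A <- crossprod(matrix(rnorm(100 * 100), 100))     # symmetric test matrix
    k <- 5
    e    <- eigen(A, symmetric = TRUE)                # full decomposition, values in decreasing order
    vals <- e$values[1:k]                             # top-k eigenvalues
    vecs <- e$vectors[, 1:k, drop = FALSE]            # corresponding eigenvectors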
2010 Jan 30
3
iterative regressions, adding a new line of data each time
Hi, I am pretty new to R. I'm trying to run a regression repeatedly, adding a new data point each time, and then storing the predicted Y values. For example, let's say I have 500 data points and I run the regression. I would then like to store the Y value, run the regression again using 501 data points, store the new Y, run the regression with 502 data points, store the Y, and so on. What I
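A sketch of that expanding-window loop on simulated data (the 500-point starting window is just the poster's example size):

    set.seed(1)
    n   <- 600
    dat <- data.frame(x = rnorm(n))
    dat$y <- 2 + 3 * dat$x + rnorm(n)

    start <- 500                                      # size of the first fit
    preds <- numeric(n - start)
    for (i in start:(n - 1)) {
      fit <- lm(y ~ x, data = dat[1:i, ])             # fit on the first i rows
      preds[i - start + 1] <- predict(fit, newdata = dat[i + 1, , drop = FALSE])
    }
    head(preds)                                       # one prediction per added row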
2009 Feb 19
1
Questions about biglm
Hello folks, I am very excited to have discovered R and have been exploring its capabilities. R's regression models are of great interest to me as my company is in the business of running thousands of linear regressions on large datasets. I am using biglm to run linear regressions on datasets that are as large as several GB's. I have been pleasantly surprised that biglm runs the
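For reference, the chunked workflow biglm is built around looks roughly like the sketch below; mtcars split into four pieces stands in for chunks read from disk or a database:

    library(biglm)
    chunks <- split(mtcars, rep(1:4, length.out = nrow(mtcars)))  # stand-in chunks
    fit <- biglm(mpg ~ wt + hp, data = chunks[[1]])
    for (ch in chunks[-1]) fit <- update(fit, ch)     # feed the remaining chunks
    summary(fit)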
2008 Aug 17
1
package building problem on windows
Hi, I'm trying to compile the package biglm, but when I build it with R CMD build biglm, it failed: C:\LOCAL\c-dutang\code\R\biglm2>R CMD build biglm * checking for file 'biglm/DESCRIPTION' ... OK * preparing 'biglm': * checking DESCRIPTION meta-information ...C:/DOCUME~1/c-dutang/Local: Can't open C:/DOCUME~1/c-dutang/Local: No such file or directory
2006 Aug 21
5
lean and mean lm/glm?
Hi All: I'm new to R and have a few questions about getting R to run efficiently with large datasets. I'm running R on Windows XP with 1 GB RAM (so about 600-700 MB after the usual Windows overhead). I have a dataset that has 4 million observations and about 20 variables. I want to run probit regressions on this data, but can't do this with more than about 500,000 observations before
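One memory-lean option, assuming the biglm package is acceptable, is bigglm(), which makes several passes over the data in chunks so that only a chunk plus the p x p cross-products are held at once. A sketch with simulated stand-in data:

    library(biglm)
    set.seed(1)
    n   <- 1e5                                        # stand-in for the poster's 4M rows
    dat <- data.frame(x1 = rnorm(n), x2 = rnorm(n))
    dat$y <- rbinom(n, 1, pnorm(-0.5 + 0.8 * dat$x1 - 0.4 * dat$x2))

    fit <- bigglm(y ~ x1 + x2, data = dat,
                  family = binomial(link = "probit"),
                  chunksize = 10000)
    summary(fit)
    # With plain glm(), model = FALSE, x = FALSE, y = FALSE at least shrinks the
    # returned object, though fitting still needs the full data in memory.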
2006 Dec 13
2
caching frequently used values
Hi, I am trying to find an elegant way to compute and store some frequently used matrices "on demand". The Matrix package already uses something like this for storing decompositions, but I don't know how to do it. The actual context is the following: A list has information about a basis of a B-spline space (nodes, order) and gridpoints at which the basis functions would be
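One idiomatic way to get compute-on-first-use semantics is a cache environment tucked inside the object, as in this sketch (the actual B-spline computation is replaced by a placeholder):

    make_basis <- function(nodes, order, grid) {
      cache <- new.env(parent = emptyenv())
      list(
        nodes = nodes, order = order, grid = grid,
        get = function(name, compute) {
          if (!exists(name, envir = cache))           # compute only on first request
            assign(name, compute(), envir = cache)
          get(name, envir = cache)
        }
      )
    }

    b <- make_basis(nodes = 0:10, order = 4, grid = seq(0, 10, by = 0.1))
    # first call computes and stores; later calls return the cached value
    design <- b$get("design", function() outer(b$grid, b$nodes, pmin))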
2010 Oct 31
1
biglm: how it handles large data set?
I am trying to figure out why 'biglm' can handle large data sets... According to the R documentation - "biglm creates a linear model object that uses only p^2 memory for p variables. It can be updated with more data using update. This allows linear regression on data sets larger than memory." After reading the source code below, I still could not figure out how 'update'
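Conceptually, each chunk only has to contribute its p x p cross-product and p x 1 response cross-product, which is why memory stays at order p^2. biglm itself maintains an incremental QR decomposition (Miller's AS 274) rather than the raw normal equations, but the sketch below shows the same idea in its simplest form:

    chunks <- split(mtcars, rep(1:4, length.out = nrow(mtcars)))
    p   <- 3                                          # intercept + wt + hp
    xtx <- matrix(0, p, p)                            # running X'X  (p x p)
    xty <- rep(0, p)                                  # running X'y  (p x 1)
    for (ch in chunks) {
      X   <- model.matrix(~ wt + hp, data = ch)
      xtx <- xtx + crossprod(X)
      xty <- xty + drop(crossprod(X, ch$mpg))
    }
    beta <- solve(xtx, xty)
    cbind(chunked = beta, full = coef(lm(mpg ~ wt + hp, data = mtcars)))  # should match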
2009 Mar 17
1
exporting s3 and s4 methods
If a package defined an S3 generic and an S4 generic for the same function (so as to add methods for S4 classes to the existing code), how do I set up the namespace to have them exported? With import(stats) exportMethods(bigglm) importClassesFrom(DBI) useDynLib(biglm) export(biglm) export(bigglm) in NAMESPACE, the S3 generic is not exported. > methods("bigglm") [1] bigglm.RODBC*
2011 Jul 25
1
biglm() and NeweyWest()
Dear all, I am working on a large dataset and need to use biglm() to perform OLS regressions. I have detected significant ARCH effects which I try to account for using the Newey-West correction. So far, I have worked with NeweyWest() in the sandwich package. NeweyWest() however seems to be unable to handle an object of class "biglm". Looking into the code, I figured out that
2010 Jun 15
1
help biglm.big.matrix; problem with weights
Hello colleagues, I have tried to use the package biglm. I want to specify a multivariate regression with a weight. I have imported a large dataset with library(bigmemory). I load library(biglm) and specify a regression with a weight. But every time I get an error message like "object not found" or "'weights' must be a formula" or "error in eval(expr, envir, enclos)". I
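The "'weights' must be a formula" message comes from biglm, whose weights argument expects a one-sided formula naming the weight column rather than a vector; presumably the same form is wanted through the big.matrix interface. A sketch of that syntax with an ordinary data frame:

    library(biglm)
    dat   <- mtcars
    dat$w <- runif(nrow(dat), 0.5, 1.5)               # hypothetical weight column
    fit <- biglm(mpg ~ wt + hp, data = dat, weights = ~ w)   # note: one-sided formula
    summary(fit)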
2017 Jan 02
1
varimax implementation in stats package
Hello, recently I was looking at the implementation of the "varimax" rotation procedure from the "stats" package and to me it looks quite different from the algorithm originally suggested by Kaiser in 1958. The R procedure iteratively uses singular value decompositions of some matrices whereas Kaiser proposed to iteratively compute rotation matrices between all pairs of
2009 Jul 03
2
bigglm() results different from glm()
Hi Sir, Thanks for making the package available to us. I am facing a few problems; perhaps you can give some hints: Problem-1: The model summary and residual deviance matched (in the mail below), but I didn't understand why the AIC is still different. > AIC(m1) [1] 532965 > AIC(m1big_longer) [1] 101442.9 Problem-2: the chunksize argument is there in bigglm but not in biglm; consequently,
2009 Jun 17
3
Matrix inversion-different answers from LAPACK and LINPACK
Hello. I am trying to invert a matrix, and I am finding that I can get different answers depending on whether I set LAPACK true or false using "qr". I had understood that LAPACK is, in general, more robust and faster than LINPACK, so I am confused as to why I am getting what seem to be invalid answers. The matrix is ostensibly the Hessian for a function I am optimizing. I want to get
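A quick way to see whether the discrepancy is a LAPACK/LINPACK issue or a conditioning issue is to invert a test matrix both ways and look at the condition number; a sketch (H is a stand-in for the actual Hessian):

    set.seed(1)
    H <- crossprod(matrix(rnorm(25), 5, 5))           # stand-in for the Hessian

    inv_linpack <- solve(qr(H, LAPACK = FALSE))
    inv_lapack  <- solve(qr(H, LAPACK = TRUE))
    max(abs(inv_linpack - inv_lapack))                # tiny for a well-conditioned matrix
    kappa(H)                                          # a huge condition number means the
                                                      # "inverse" is unreliable either way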
2011 Nov 15
1
getting R2 (goodness of fit) result after using biglm()
Hello. I had been struggling with running linear regression using lm(), primarily because my data has a few categorical variables with at least a thousand levels. I tried the biglm() function and it worked. My problem now is that I don't know how to get the R2 results. Could someone help? Thanks, sean
2007 Dec 05
2
converting factors to dummy variables
Hi all - I'm trying to find a way to create dummy variables from factors in a regression. I have been using biglm along the lines of ff <- log(Price) ~ factor(Colour):factor(Store) + factor(DummyVar):factor(Colour):factor(Store) lm1 <- biglm(ff, data=my.dataset) but because there are lots of colours (>100) and lots of stores (>250), I run into memory problems. Now, not every
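One way to keep the dummy expansion from exhausting memory is to build the design matrix sparsely with the Matrix package. The sketch below simulates stand-ins for Colour, Store, DummyVar and Price; note that biglm itself would still need a sparse-aware solver downstream, so this only addresses the storage side:

    library(Matrix)
    set.seed(1)
    n   <- 5000
    dat <- data.frame(Colour   = factor(sample(paste0("c", 1:100), n, TRUE)),
                      Store    = factor(sample(paste0("s", 1:250), n, TRUE)),
                      DummyVar = factor(sample(0:1, n, TRUE)),
                      Price    = rlnorm(n, 3, 0.5))

    X <- sparse.model.matrix(~ Colour:Store + DummyVar:Colour:Store, data = dat)
    dim(X)                                # very wide, but few non-zeros per row
    print(object.size(X), units = "Mb")   # compare with a dense model.matrix()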