similar to: Aggregate daily data into weekly sums

Displaying 20 results from an estimated 1000 matches similar to: "Aggregate daily data into weekly sums"

2006 Nov 15
1
dynamic aggregation of many variables
Hi, I have many variables, for example for 4 weeks, and want to do aggregations such as mean, standard deviation, etc. With the mean it works, but how can I calculate the standard deviation for the 4 weeks and for every ID? Many thanks & regards, Christian
    week1 <- grep("(_PRO_001)", names(dmx3), perl=T)
    week1table <- subset(dmx3, select=c(ID, week1))
    week2 <-
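A minimal sketch of one way to get both statistics per ID (the wide data frame and its column names are made-up stand-ins): reshape the weekly columns to long format, then aggregate().
    dmx3 <- data.frame(ID = 1:3, wk1 = rnorm(3), wk2 = rnorm(3),
                       wk3 = rnorm(3), wk4 = rnorm(3))
    long <- reshape(dmx3, direction = "long",
                    varying = list(c("wk1", "wk2", "wk3", "wk4")),
                    v.names = "value", timevar = "week", idvar = "ID")
    # mean and sd across the 4 weeks, for every ID
    aggregate(value ~ ID, data = long,
              FUN = function(v) c(mean = mean(v), sd = sd(v)))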
2000 Jan 04
0
Stepwise logistic discrimination - II
I apologise for writing again about the problem with using stepAIC + multinom, but I think the reason I had it in the first place may be a bug in either stepAIC or multinom. Just to repeat the problem: I have 126 variables and 99 cases. I don't know if the large number of variables could be the problem. Of course, the reason for doing a stepwise method is to reduce this
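For reference, a minimal sketch of the call being discussed (the data frame, its 10 predictors, and the 3-level response are invented; with 126 predictors on only 99 cases the full model is over-parameterised, which can itself derail stepwise selection).
    library(nnet)
    library(MASS)
    set.seed(1)
    d <- data.frame(y = factor(sample(letters[1:3], 99, replace = TRUE)),
                    matrix(rnorm(99 * 10), nrow = 99))   # predictors X1..X10
    full <- multinom(y ~ ., data = d, trace = FALSE)
    reduced <- stepAIC(full, direction = "backward", trace = FALSE)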
2008 Nov 25
2
Heat Maps
Dear List, Does there exist a function that produces a heat map like this one (image 3 of 4): http://www.tdameritrade.com/tradingtools/options360.html?a=HDY&referrer=http%3A%2F%2Fquery.nytimes.com%2Fsearch%2Fsitesearch%3Fquery%3Dheatmaptype%3Dnyt In addition to colors, the two other main features I am interested in are: 1. Proportionality in the size of the grid. 2. Mouse-over capability. I may
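For the colour-grid part only, base R's heatmap() is one starting point (a sketch with invented data); proportional cell sizes and mouse-over would need an interactive or treemap-style package instead.
    m <- matrix(rnorm(100), nrow = 10,
                dimnames = list(paste0("sector", 1:10), paste0("stock", 1:10)))
    heatmap(m, Rowv = NA, Colv = NA, scale = "none",
            col = colorRampPalette(c("red", "white", "green"))(64))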
2009 May 09
2
Histogram frequencies with a normal pdf curve overlay
Dear List, When I plot a histogram with 'freq=FALSE' and overlay the histogram with a normal pdf curve, everything looks as expected, as follows: x <- rnorm(1000) hist(x, freq=FALSE) curve(dnorm(x), add=TRUE, col="blue") What do I need to do if I want to show the frequencies (freq=TRUE) with the same normal pdf overlay, so that the plot would still look the same? Regards,
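One common way to keep freq=TRUE and still overlay the curve is to rescale the density by n times the bin width, e.g.:
    x <- rnorm(1000)
    h <- hist(x, freq = TRUE)
    scale_factor <- length(x) * diff(h$breaks)[1]   # n * binwidth
    curve(dnorm(x) * scale_factor, add = TRUE, col = "blue")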
2007 Sep 26
3
Scientific Notation
Dear List: Below is how I specify an axis: axis(2, at=c(0.00005, 0.0005)) R displays the numbers in scientific notation. What argument/parameter should I use to tell R to display the numbers as specified rather than in scientific notation? > version _ platform i386-pc-mingw32 arch i386 os mingw32 system i386, mingw32 status major
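One way to get the tick labels exactly as specified is to format them yourself (or raise options(scipen=)), e.g.:
    at <- c(0.00005, 0.0005)
    plot(1:10, seq(0, 0.0005, length.out = 10), yaxt = "n")
    axis(2, at = at, labels = format(at, scientific = FALSE))
    ## or, globally: options(scipen = 10)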
2008 Feb 02
2
Confidence Interval
I have a model as follows:
    x <- replicate(100, sum(rlnorm(rpois(1,5), 0, 1)))
    y <- quantile(x, 0.99)
How would one go about estimating the boundaries of a 95% confidence interval for y? Any pointers would be greatly appreciated. > version _ platform i386-pc-mingw32 arch i386 os mingw32 system i386, mingw32 status major 2 minor 5.1 year 2007 month 06 day 27 svn rev 42083 language R
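A rough sketch of one approach: repeat the whole simulation many times and take quantiles of the resulting estimates of the 99th percentile (a Monte Carlo interval for the estimator, not the only possibility).
    sim_q99 <- function(n = 100) {
      x <- replicate(n, sum(rlnorm(rpois(1, 5), 0, 1)))
      quantile(x, 0.99)
    }
    boot_q <- replicate(1000, sim_q99())
    quantile(boot_q, c(0.025, 0.975))   # approximate 95% interval for y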
2007 May 31
1
R keeps crashing when executing 'rlogspline'
Dear List, I have a simple model as follows: x <- rnorm(500) library(logspline) fit <- logspline(x) n <- 1000000 y <- replicate(n, sum(rlogspline(rpois(1,10), fit))) # last line The problem I keep getting is R crashes when doing the last line. It seems to be fine if n is small, but not if n is 1000000. The message I keep getting is: "R for Windows GUI front-end has
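One workaround sometimes suggested for very large replicate() runs is to build the result in smaller batches; this is only a sketch of that idea, not a confirmed fix for the crash.
    library(logspline)
    x   <- rnorm(500)
    fit <- logspline(x)
    n     <- 1e6
    batch <- 1e4
    y <- unlist(lapply(seq_len(n / batch), function(i)
      replicate(batch, sum(rlogspline(rpois(1, 10), fit)))))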
2007 May 04
2
Alternatives to unlist()
Given the following, one of the things I am trying to see is what % of draws are below a certain number: lambda <- 3 rate <- 5 n <- 5 set.seed(123) v <- replicate(n, rexp(rpois(1,lambda), rate)) vv <- unlist(v) cat("% of draws below 0.1:", round(length(subset(vv, vv < 0.1))/length(vv)*100,0), "%\n") In actuality, my lambda, rate, and n are 26, 10, 1000000,
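Because the draws are i.i.d., one alternative is to skip replicate()/unlist() entirely: sum the Poisson counts, draw one long rexp() vector, and take the mean of a logical for the proportion (a sketch, equivalent in distribution to the original).
    lambda <- 3; rate <- 5; n <- 5
    set.seed(123)
    vv <- rexp(sum(rpois(n, lambda)), rate)
    cat("% of draws below 0.1:", round(mean(vv < 0.1) * 100), "%\n")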
2005 Jun 16
1
identical results with PQL and Laplace options in lmer function (package lme4)
Dear R users, I encounter a problem when I perform a generalized linear mixed model (binary data) with the lmer function (package lme4), using R 2.1.0 on Windows XP and the latest versions of the packages "lme4" (0.96-1) and "Matrix" (0.96-2). Both options "PQL" and "Laplace" for the method argument of the lmer function gave me the same results (random and fixed effects
2007 Sep 07
1
How to obtain parameters of a mixture model of two lognormal distributions
Dear List, I have read that a lognormal mixture model having a pdf of the form f(x)=w1*f1(x)+(1-w1)*f2(x) fits most data sets quite well, where f1 and f2 are lognormal distributions. Any pointers on how to create a function that would produce the 5 parameters of f(x) would be greatly appreciated. > version _ platform i386-pc-mingw32 arch i386 os
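A minimal sketch of one way to obtain the five parameters (w1, meanlog1, sdlog1, meanlog2, sdlog2) by maximum likelihood with optim(); the data and starting values are invented, and fitting a two-component normal mixture to log(x) with a package such as mixtools or mclust is an alternative.
    set.seed(42)
    x <- c(rlnorm(300, 0, 0.5), rlnorm(200, 2, 0.8))
    negll <- function(p) {
      w1 <- plogis(p[1])                              # keep the weight in (0, 1)
      -sum(log(w1 * dlnorm(x, p[2], exp(p[3])) +
               (1 - w1) * dlnorm(x, p[4], exp(p[5]))))
    }
    fit <- optim(c(0, 0, log(0.5), 2, log(0.8)), negll)
    c(w1 = plogis(fit$par[1]),
      meanlog1 = fit$par[2], sdlog1 = exp(fit$par[3]),
      meanlog2 = fit$par[4], sdlog2 = exp(fit$par[5]))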
2007 Sep 26
2
Password-protect script files
Dear List, Is there any way to password-protect script files (either within R or otherwise)? R version 2.5.1 (2007-06-27), svn rev 42083, platform i386-pc-mingw32
2010 Apr 07
1
finding weekly average...
Hi All, I have time series data with two continuous variables (say Var1 and Var2) for 4 years (***not continuous, there are some breaks because of missing data***). Something like this:
    Date         Var1  Var2
    12/01/2004      7     0
    12/01/2004      0     0
    12/01/2004      0     7
    12/01/2004      7     0
    12/01/2004      0     7
    12/01/2004      0     7
    12/02/2004      0     0
    ...
I need to find the weekly average of Var1 and Var2, so that I end up with data like:
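A minimal sketch of one way to do this in base R (using the column names from the example): label each row with the start of its week via cut(), then average with aggregate().
    dat <- data.frame(Date = c("12/01/2004", "12/01/2004", "12/02/2004"),
                      Var1 = c(7, 0, 0), Var2 = c(0, 7, 0))
    dat$Date <- as.Date(dat$Date, format = "%m/%d/%Y")
    dat$Week <- as.Date(cut(dat$Date, breaks = "week"))   # Monday of each week
    aggregate(cbind(Var1, Var2) ~ Week, data = dat, FUN = mean)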
2007 Jul 19
2
df manipulation
I have multi-column data.frames with the first column giving an ordinal observation index (e.g. 1 4 7 9 11 13 etc.). I would like to fill in the missing observations (i.e. 2 3 5 6 8 etc.) with NAs. Thank you
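A minimal sketch of one way to do this (the data frame and column names are invented): merge against the complete index sequence so the gaps come back as NA rows.
    df   <- data.frame(idx = c(1, 4, 7, 9), value = c(10, 40, 70, 90))
    full <- data.frame(idx = seq(min(df$idx), max(df$idx)))
    merge(full, df, by = "idx", all.x = TRUE)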
2010 Feb 19
1
ggplot2 X axis levels
Hi all: I've done this before with factors but can't figure out how to do it with a continuous variable. I am trying to reorder the sequence of my weeks along the X axis: I want to start with week 27 to 52 and then 1 to 26. I guess I could use levels along with seq(), but it doesn't seem to work for me. Thanks for your help winter <- structure(list(week = c(27L, 28L, 29L, 30L, 31L, 32L,
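A minimal sketch of one workaround (the value column y and the data are stand-ins for the real winter data): treat week as a factor whose levels run 27..52 and then 1..26.
    library(ggplot2)
    winter <- data.frame(week = c(27:52, 1:26), y = rnorm(52))
    winter$week <- factor(winter$week, levels = c(27:52, 1:26))
    ggplot(winter, aes(x = week, y = y, group = 1)) + geom_line()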
2010 Jun 18
0
pcse package - is it OK to use it when my regression is weighted by each subgroup's mean
Hello! I just would like to make sure I am not doing something wrong. I am running an OLS regression. I have several subgroups in the data set (locations), and in each location I have weekly data for 2 years, on my DV and on all predictors. It looks like this:
    location   week   DV    Predictor1   Predictor2
    location1  week1  xxx   xxxxxxx      xxxxxxxxx
    location1  week2  xxx   xxxxxxx      xxxxxxxxx
    ...
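For reference, a sketch of how the setup might be coded (the data, the weight column, and the use of pcse()'s summary method are assumptions, and this does not settle whether weighting by the subgroup mean is statistically appropriate).
    library(pcse)
    d <- expand.grid(location = paste0("location", 1:5), week = 1:104)
    d$Predictor1 <- rnorm(nrow(d)); d$Predictor2 <- rnorm(nrow(d))
    d$DV <- 10 + d$Predictor1 - d$Predictor2 + rnorm(nrow(d))
    d$wt <- ave(d$DV, d$location, FUN = mean)     # per-location mean as weight
    fit <- lm(DV ~ Predictor1 + Predictor2, data = d, weights = wt)
    summary(pcse(fit, groupN = d$location, groupT = d$week))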
2007 Aug 20
1
Ask for functions to obtain partial R-square (squared partial correlation coefficients)
The partial R-square (or coefficient of partial determination, or squared partial correlation coefficient) measures the marginal contribution of one explanatory variable when all others are already included in a multiple linear regression model. The following link has very clear explanations of partial and semi-partial correlation: http://www.psy.jhu.edu/~ashelton/courses/stats315/week2.pdf In
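For two nested lm() fits, the squared partial correlation can be computed directly from the residual sums of squares; a minimal sketch with invented data:
    set.seed(1)
    d <- data.frame(y = rnorm(50), x1 = rnorm(50), x2 = rnorm(50))
    full    <- lm(y ~ x1 + x2, data = d)
    reduced <- lm(y ~ x2, data = d)
    sse_f <- sum(residuals(full)^2)
    sse_r <- sum(residuals(reduced)^2)
    (sse_r - sse_f) / sse_r        # partial R^2 for x1 given x2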
2013 Jan 04
2
Can you help me please
Hi Fares, You could try this:
    dat1 <- read.table(text="
    date        donation
    3jan2003    20235
    4jan2003    25655
    5jan2003    225860
    6jan2003    289658
    7jan2003    243889
    8jan2003    244338
    9jan2003    243889
    ", sep="", header=TRUE, stringsAsFactors=FALSE)
The post is not very specific as to what you need. I hope this works for you. library(xts)
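A possible continuation of the reply (assuming weekly sums are what is wanted, and an English locale for the month abbreviations):
    library(xts)
    dat1$date <- as.Date(dat1$date, format = "%d%b%Y")
    don <- xts(dat1$donation, order.by = dat1$date)
    apply.weekly(don, sum)   # weekly totals of the daily donations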
2007 Jan 26
2
how to create daily / weekly ts object?
Dear All, Monthly and quarterly ts objects are easy to understand, but I couldn't find an example in the R manual of how to create a daily or weekly ts object. Could you please shed some light on it? I really appreciate it.
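A minimal sketch of the usual options: ts() only needs a frequency (52 for weekly data, 7 for daily data with a weekly cycle), while calendar-aware daily series are often easier to handle with zoo.
    weekly <- ts(rnorm(104), start = c(2005, 1), frequency = 52)
    daily  <- ts(rnorm(365), start = c(1, 1), frequency = 7)    # day-of-week cycle
    library(zoo)
    z <- zoo(rnorm(365),
             order.by = seq(as.Date("2005-01-01"), by = "day", length.out = 365))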
2010 Aug 01
1
aggregating a daily zoo object to a weekly zoo object
Dear R People: I'm trying to convert a daily zoo object to a weekly zoo object: xdate <- seq(as.Date("2002-01-01"),as.Date("2010-07-10"),by="day") library(zoo) length(xdate) xt <- zoo(rnorm(3113),order=xdate) xdat2 <- seq(index(xt)[1],index(xt)[3113],by="week") xt.w <- aggregate(xt,by=xdat2,mean) Error: length(time(x)) ==
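A minimal sketch of one way around the error: aggregate.zoo() wants a 'by' value for every observation (or a function of the index), not a vector of target dates, so map each day to the start of its week first.
    library(zoo)
    xdate <- seq(as.Date("2002-01-01"), as.Date("2010-07-10"), by = "day")
    xt    <- zoo(rnorm(length(xdate)), order.by = xdate)
    xt.w  <- aggregate(xt, by = as.Date(cut(index(xt), "week")), FUN = mean)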
2010 Nov 26
0
Help on converting a daily zoo to a weekly zoo
Dear R-gurus: I want to convert a zoo object of daily stock prices to a weekly one, based on the following 4 steps. 1. No problem in creating weekly data using Wednesday obs.
    wed.index <- ( weekdays(index(daily)) == "Wed" )
    week.dat  <- daily[wed.index]
    > week.dat
               A000010 A000020 A000030 A000040 A000050 A000060
    2000-01-05    3925      NA    3950    6350   21717     836
    2000-01-12