similar to: reshape with function(x,y)?

Displaying 20 results from an estimated 700 matches similar to: "reshape with function(x,y)?"

2006 Mar 16
4
problem for wtd.quantile()
Dear R-users, I don't know if there is a problem in wtd.quantile (from library "Hmisc"):
--------------------------------
x <- c(1,2,3,4,5)
w <- c(0.5,0.4,0.3,0.2,0.1)
wtd.quantile(x,weights=w)
-------------------------------
The output is:
  0%  25%  50%  75% 100%
3.00 3.25 3.50 3.75 4.00
The version of R I am using is 2.1.0. Best, Jing
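A hedged note, not a confirmed diagnosis of the behaviour above: wtd.quantile() treats the weights as case (frequency) weights by default, so fractional weights that do not sum to the sample size can give surprising output; normwt=TRUE rescales them. A minimal sketch:

library(Hmisc)
x <- c(1, 2, 3, 4, 5)
w <- c(0.5, 0.4, 0.3, 0.2, 0.1)
wtd.quantile(x, weights = w)                 # weights used as raw case weights
wtd.quantile(x, weights = w, normwt = TRUE)  # weights rescaled to sum to length(x)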
2007 May 31
3
Problem with Weighted Variance in Hmisc
The function wtd.var(x,w) in Hmisc calculates the weighted variance of x, where w are the weights. I expected wtd.var(x,w) to equal var(x) when all of the weights are equal, but this does not appear to be the case. Can someone point out to me where I am going wrong here? Thanks. Tom La Bone
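A hedged illustration of what is likely going on: by default wtd.var() treats the weights as frequencies (so sum(w) plays the role of the sample size), and equal weights reproduce var(x) only when they are all 1 or when normwt=TRUE is set. A minimal sketch:

library(Hmisc)
x <- rnorm(10)
w <- rep(2, 10)               # equal weights, but not equal to 1
var(x)
wtd.var(x, w)                 # differs: w is counted as frequencies
wtd.var(x, w, normwt = TRUE)  # weights rescaled; matches var(x)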
2010 Mar 20
2
different forms of nls recommendations
Hello, Using this data: http://n4.nabble.com/file/n1676330/US_Final_Values.txt US_Final_Values.txt and the following code, I got the image at the end of this message:
US.final.values <- read.table("c:/tmp/US_Final_Values.txt", header=T, sep=" ")
US.nls.1 <- nls(US.final.values$ECe ~ a*US.final.values$WTD^b + c,
                data=US.final.values,
                start=list(a=2.75, b=-0.95, c=0.731),
                trace=TRUE)
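One alternative form worth trying (a sketch, assuming the file really has ECe and WTD columns): let data= supply the variables instead of the US.final.values$ prefix, which effectively bypasses the data argument:

US.final.values <- read.table("c:/tmp/US_Final_Values.txt", header = TRUE, sep = " ")
US.nls.1 <- nls(ECe ~ a * WTD^b + c,
                data  = US.final.values,
                start = list(a = 2.75, b = -0.95, c = 0.731),
                trace = TRUE)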
2009 Jun 23
3
subset POSIXct
Hi, I have a data frame with two columns: dt and tf. The dt column is datetime and the tf column is a temperature.
                   dt tf
1 2009-06-20 00:53:00 73
2 2009-06-20 01:08:00 73
3 2009-06-20 01:44:00 72
4 2009-06-20 01:53:00 71
5 2009-06-20 02:07:00 72
...
I need a subset of the rows where the minutes are 53. The hour is immaterial. I can not find a wildcard
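There is no wildcard as such, but the minute component can be extracted directly. A sketch, calling the data frame df here since the original name is not shown:

sub53 <- df[format(df$dt, "%M") == "53", ]
# or, via the POSIXlt components
sub53 <- subset(df, as.POSIXlt(dt)$min == 53)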
2011 Apr 20
2
survexp with weights
Hello, I probably have a syntax error in trying to generate an expected survival curve from a weighted Cox model, but I can't see it. I used the help sample code to generate a weighted model, with the addition of a "weights=albumin" argument (I only chose albumin because it had no missing values, not because of any real relevance). Below is my code with the resulting error
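For reference, a hedged sketch of the intended sequence on the pbc data; this is not the poster's exact code, and albumin is again only a placeholder weight:

library(survival)
fit <- coxph(Surv(time, status == 2) ~ age + edema, data = pbc, weights = albumin)
# expected survival for the cohort, using the weighted Cox fit as the rate table
es <- survexp(~ 1, data = pbc, ratetable = fit)
plot(es)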
2008 Jan 07
2
How should I improve the following R code?
I'm looking for a way to improve code that's proven to be inefficient. Suppose that a data source generates the following table every minute:
Index Count
------------
    0   234
    1   120
    7    11
   30     1
I save the tables in the following CSV format:
time,index,count
0,0:1:7:30,234:120:11:1
1,0:2:3:19,199:110:87:9
That is, each line represents a table, and I
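A sketch of one way to unpack that packed format into a long data frame without element-by-element loops (the file name tables.csv is made up for the example):

raw <- read.csv("tables.csv", stringsAsFactors = FALSE)
long <- do.call(rbind, lapply(seq_len(nrow(raw)), function(i) {
  data.frame(time  = raw$time[i],
             index = as.integer(strsplit(raw$index[i], ":")[[1]]),
             count = as.integer(strsplit(raw$count[i], ":")[[1]]))
}))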
2012 Jul 24
1
Function for ddply
Hello, all. I'm new to R and just beginning to learn to write functions. I know I'm out of my depth posting here, and I'm sure my issue is mundane. But here goes. I'm analyzing the American National Election Study (nes), looking at mean values of a numeric dep_var (environ.therm) across values of a factor (partyid3). I use ddply from plyr and wtd.mean from Hmisc. The nes requires a
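The usual pattern looks something like the sketch below; the weight column is called weight here only because the snippet is cut off before the real name appears:

library(plyr)
library(Hmisc)
out <- ddply(nes, .(partyid3), summarise,
             environ.mean = wtd.mean(environ.therm, weights = weight))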
2012 Apr 20
1
pasting a formula string with double quotes in it
Hello everyone, I have tried several ways of doing this and searched the documentation and help lists and I have been unable to find an answer or even whether it is possible to do it. I am pasting together a formula and I need to insert double quotes around the strings. Here's an example:
location <- c("AL", "AK", "MA", "PA")
v=2
test <-
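Two common ways to get literal double quotes into a pasted string are escaping them or single-quoting the outer string. A sketch (the surrounding formula is invented for illustration):

location <- c("AL", "AK", "MA", "PA")
v <- 2
test <- paste0("y ~ x + I(location == \"", location[v], "\")")
# or let sprintf splice the value inside a single-quoted template
test <- sprintf('y ~ x + I(location == "%s")', location[v])
form <- as.formula(test)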
2007 Jul 19
0
[LLVMdev] memory hog llvm-ld
Hi Holger, > Note that I did specify "-g", but not any "-Ox" switches. That > made the size of all *.o files together being 143 MB. LLVM represents debug info as explicit calls to intrinsics. This approach has many advantages, but a possible disadvantage is that it can significantly increase the size of the bitcode. I don't know if that explains your observations.
2010 Dec 30
2
optim and singularity
Hello, I was unable to find clues to my problem in ?optim. Using the data and code below, I get an error ("system is exactly singular") when a particular line of code is left in, but have found that 'optim' works when I comment it out. The line in question comes after the closeAllConnections() call and contains a call to "na.approx" from the zoo package.
2007 Jul 23
1
replacing double for loops with apply's
Hi, I am doing double for loops to calculate SDs with some weights and am wondering if I can get rid of the outer for loop as well. I made a simple example which is essentially what I am doing. Thanks for your help! -Young
#------------------------------------------------------
# wtd.var is from the Hmisc package
# you can replace the 3 lines inside the for loop as
# sdx[i,] =
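A generic sketch of the outer-loop removal (the real matrices are not shown in the snippet): apply() over rows gives the weighted SD of each row in one call.

library(Hmisc)
x <- matrix(rnorm(20), nrow = 4)  # stand-in data
w <- c(1, 2, 1, 3, 2)             # stand-in weights, one per column
sdx <- apply(x, 1, function(r) sqrt(wtd.var(r, weights = w)))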
2006 Jan 12
2
tapply and weighted means
I'm trying to compute weighted means on different groups but it only returns NA. If I use the following data.frame truc:
x y w
1 1 1
1 2 2
1 3 1
1 4 2
0 2 1
0 3 2
0 4 1
0 5 1
where x is a factor, and then use the command:
tapply(truc$y, list(truc$x), wtd.mean, weights=truc$w)
I just get NA. What's the problem? What can I do?
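The likely culprit: tapply() splits only y, so each group-level call still receives the full weights vector and the lengths no longer match. A sketch that keeps y and w paired within groups:

library(Hmisc)
sapply(split(truc, truc$x), function(d) wtd.mean(d$y, weights = d$w))
# or split both vectors in parallel
mapply(wtd.mean, split(truc$y, truc$x), split(truc$w, truc$x))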
2007 Jul 31
1
A complicated 'aggregate'
Hi, I have a financial (zoo) time series with prices and volumes (although I can get the coredata as a matrix). Due to the data source, some indices have multiple observations. I want to aggregate these according to a weighted average.
11:00:01 34 1000
11:00:01 35  500
11:00:01 35 1000
11:00:02 34  500
11:00:02 35  500
should become
11:00:01 34.6 2500
11:00:02 34.5 1000
I currently do this
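On a plain data frame with time, price and volume columns this reduces to a volume-weighted mean price plus a volume total per timestamp. A sketch (column names assumed, not taken from the poster's zoo object):

agg <- do.call(rbind, lapply(split(df, df$time), function(d)
  data.frame(time   = d$time[1],
             price  = weighted.mean(d$price, d$volume),
             volume = sum(d$volume))))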
2009 Nov 14
4
Weighted descriptives by levels of another variables
I've noticed that R has a number of very useful functions for obtaining descriptive statistics on groups of variables, including summary {stats}, describe {Hmisc}, and describe {psych}, but none that I have found is able to provide weighted descriptives of subsets of a data set (e.g. descriptives for both males and females for age, where accurate results require use of sampling
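A hedged sketch of the group-wise pattern with Hmisc's weighted helpers; the columns age, sex and wt are placeholders, not names from the post:

library(Hmisc)
by(dat, dat$sex, function(d)
  c(mean = wtd.mean(d$age, weights = d$wt),
    sd   = sqrt(wtd.var(d$age, weights = d$wt))))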
2009 Jan 19
1
conditional weighted quintiles
Dear All, I am an economist working on poverty and income inequality. I need descriptive statistics like the ratio of education expenditures between different income quintiles, where each household has a different weight. After a bit of Google searching I found the 'Hmisc' and 'quantreg' libraries for weighted quantiles. The problem is that these packages give me only weighted quintiles;
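One hedged route: form weighted income quintiles first, then compute weighted summaries within each group. In this sketch income, educ and hhwt are invented column names:

library(Hmisc)
brks  <- wtd.quantile(hh$income, weights = hh$hhwt, probs = seq(0, 1, 0.2))
hh$q5 <- cut(hh$income, breaks = unique(brks), include.lowest = TRUE, labels = FALSE)
tapply(seq_len(nrow(hh)), hh$q5,
       function(i) wtd.mean(hh$educ[i], weights = hh$hhwt[i]))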
2007 Jul 18
5
[LLVMdev] memory hog llvm-ld
I want to share a little LLVM experiment. I tried LLVM on one of my bigger Qt 3.x based projects. I used llvm from SVN trunk (r39999) and SVN llvm-gcc-4.01 (r370) and compiled every file with
/usr/src/llvm/dist/bin/g++ -c -pipe -g \
  -Wall -Wextra -Wno-sign-compare \
  ... lots of -Dxxxx ... \
  --emit-llvm -I/usr/share/qt3/mkspecs/default \
  -I. -I.. -I../../../include/qt3 -I.obj/ \
2017 Nov 24
2
number to volume weighted distribution
Hi Duncan, I tried Ecdf and/or wtd.quantile from Hmisc and it is working (probably).
Ecdf(x, q=.5)
Ecdf(x, weights=xw, col=2, add=T, q=.5)
wtd.quantile(x)
 0%  25%  50%  75% 100%
 10   10   10  100  300
wtd.quantile(x, weights=xw, type="i/n")
      0%      25%      50%      75%     100%
 10.0000 138.8667 192.5778 246.2889 300.0000
But could you please be more specific in this?
2012 Mar 06
1
How to eliminate for next loops in this script
I needed to compute a complicated cross tabulation to show weighted means and standard deviations, and the only method I could get to work uses a series of nested for-next loops. I know that there must be a better way to do so, but could use some assistance pointing the way. Here is my working, but inefficient, script:
library(Hmisc)
rm(list=ls())
load('NHTS.Rdata')
day.wt <-
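A hedged sketch of how nested loops for a two-way table of weighted means can collapse into a single tapply() over row indices; the data frame nhts, the factors f1 and f2 and the outcome y are placeholders, and day.wt is assumed to be the weight vector defined in the script above:

library(Hmisc)
wm <- tapply(seq_len(nrow(nhts)), list(nhts$f1, nhts$f2),
             function(i) wtd.mean(nhts$y[i], weights = day.wt[i]))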
2017 Nov 24
0
number to volume weighted distribution
Hi Petr, I think that Duncan suggests something like this:
x <- c(rep(10,20), rep(300,5), rep(100, 10))
tx <- table(x)
prop.x <- tx / sum(tx)
vx <- as.integer(names(tx))
prop.wx <- tx * vx / sum(tx * vx)
plot(ecdf(x))
plot(vx, cumsum(prop.x), ylim = 0:1)
plot(vx, cumsum(prop.wx), ylim = 0:1)
Best regards, Thierry
ir. Thierry Onkelinx, Statisticus / Statistician, Vlaamse
2006 Dec 29
1
Failure loading library into second R 2.3.1 session on Windows XP
Hi. I am using R 2.3.1 on Windows XP. I had installed a library package into my first session and wanted the same package in my second session, so I went out to the CRAN mirror and tried to install the package, and got the following message:
*********************************************************************
> utils:::menuInstallPkgs()
trying URL