Displaying 3 results from an estimated 3 matches for "xnew1".
2010 May 16 (2) - Box-Cox Transformation: Drastic differences when varying added constants
...the original variable was -.91!?
I guess there is something fundamental missing in my current thinking about
Box-Cox...
Best,
Holger
P.S. Here is what I did:
# Create a skewed variable X (mixture of two normals)
x1 = rnorm(120,0,.5)
x2 = rnorm(40,2.5,2)
X = c(x1,x2)
# Adding a small constant so that all values are positive
library(car)   # box.cox.powers() comes from the car package (superseded by powerTransform() in current versions)
Xnew1 = X + abs(min(X)) + .1
box.cox.powers(Xnew1)
Xtrans1 = Xnew1^.2682   # the value of the lambda estimate
# Adding a larger constant
Xnew2 = X + abs(min(X)) + 1
box.cox.powers(Xnew2)
Xtrans2 = Xnew2^(-.2543)   # the value of the lambda estimate
# Plotting it all
par(mfrow=c(3,2))
hist(X)
qqnorm(X)
qqline(X...
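A minimal illustration (not part of the original post) of what the question is observing: the one-parameter Box-Cox fit is not invariant to an additive shift of the data, so estimating lambda after adding .1 versus 1 can give quite different values. This sketch uses MASS::boxcox() on an intercept-only model instead of the older car::box.cox.powers(); the seed and the fine lambda grid are assumptions added here for reproducibility.
library(MASS)

set.seed(1)                          # seed added here; the post did not fix one
x1 <- rnorm(120, 0, .5)
x2 <- rnorm(40, 2.5, 2)
X  <- c(x1, x2)

Xnew1 <- X + abs(min(X)) + .1        # small added constant
Xnew2 <- X + abs(min(X)) + 1         # larger added constant

# Profile log-likelihood over a fine lambda grid for each shifted variable
bc1 <- boxcox(lm(Xnew1 ~ 1), lambda = seq(-2, 2, 0.01), plotit = FALSE)
bc2 <- boxcox(lm(Xnew2 ~ 1), lambda = seq(-2, 2, 0.01), plotit = FALSE)

bc1$x[which.max(bc1$y)]              # lambda estimate for the small shift
bc2$x[which.max(bc2$y)]              # lambda estimate for the larger shift (typically quite different)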
2003 Aug 29 (3) - Creating a new table from a set of constraints
Hi Everyone,
Here's a silly newbie question. How do I remove unwanted rows from an
R table? Say that I read my data as:
X <- read.table("mydata.txt")
and say that there are columns for age and gender. Call these X[5] and
X[10], respectively.
Here, X[5] is a column of positive integers and X[10] is binary-valued,
i.e., zero (for male) and one (for female).
Now, say that I
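A minimal sketch (not part of the original post) of the usual answer to this kind of question: build a logical condition on the columns and index the rows with it. The age cut-off of 18 below is an arbitrary example; note that X[5] is a one-column data frame, while X[[5]] (or X[, 5]) is the underlying vector.
X <- read.table("mydata.txt")

# Keep rows where age (column 5) is at least 18 and gender (column 10) is 1 (female);
# the threshold 18 is only for illustration.
keep <- X[[5]] >= 18 & X[[10]] == 1
Xsub <- X[keep, ]

# Removing the unwanted rows is just indexing with the negated condition:
Xdrop <- X[!keep, ]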
2012 Dec 25 (5) - aggregate / collapse big data frame efficiently
Hi,
I need to aggregate the rows of a data.frame by computing the mean of all rows that share the same level of one factor variable;
here is the sample code:
x <- data.frame(rep(letters,2), rnorm(52), rnorm(52), rnorm(52))
aggregate(x, list(x[,1]), mean)
Now my problem is that the actual data set is much bigger (120 rows and approximately 100,000 columns), and it takes very, very long
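A minimal sketch (not from this thread) of one common base-R speed-up for wide data like this: sum the rows by group with rowsum(), which runs in compiled code, and then divide by the group sizes to get the means; only the numeric columns are passed in. Column positions match the sample code above. The data.table package's grouped operations are another option often suggested for this task.
x <- data.frame(rep(letters, 2), rnorm(52), rnorm(52), rnorm(52))

g      <- x[, 1]                 # grouping variable (first column)
m      <- as.matrix(x[, -1])     # numeric columns only
sums   <- rowsum(m, group = g)   # per-group column sums, computed in C
counts <- as.vector(table(g))    # group sizes, in the same (sorted) order as rowsum()'s rows
means  <- sums / counts          # per-group column means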