Hi there,

I am new to R but very interested in applying it in my research. I have received help from this mailing list before and I really appreciate it. Here is another question, which I suspect is a common problem for several R packages.

Recently I have been using rpart in my research. My dataset has around 200 variables and over 50,000 observations. I could only load half of the data into R (I cannot even load the whole dataset), and even then fitting is really slow. I suspect the rpart implementation keeps everything memory-resident. Is there another implementation, or some way to optimize rpart, that would solve this problem?

Thanks in advance,
Weiwei