Displaying 10 results from an estimated 10 matches for "prnn".
2005 May 04
1
Difference between "tree" and "rpart"
In the help for rpart it says, "This differs from the tree function
mainly in its handling of surrogate variables." And it says that an
rpart object is a superset of a tree object. Both cite Breiman et al.
1984. Both call external code which looks like martian poetry to me.
I've seen posts in the archives where BDR, and other knowledgeable
folks, have said that rpart() is to be
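For readers comparing the two, a minimal rpart sketch on the built-in iris data (rpart ships with a standard R install; the data choice here is illustrative, not the poster's):

```r
library(rpart)

## classification tree on the built-in iris data
fit <- rpart(Species ~ ., data = iris, method = "class")
printcp(fit)   # cross-validated complexity-parameter table, used for pruning
```

tree() fits a comparable model but, as the post notes, without rpart's surrogate-split handling of missing values.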
2005 Jul 11
1
Projection Pursuit
Hello,
Just a quick question about ppr in library modreg.
I have looked at Ripley and Venables 2002 and it says that projection
pursuit works "by projecting X in M carefully chosen directions"
I want to know how it chooses the directions. I presume it moves around the
high-dimensional space of unit vectors, finding ones that separate the
response variables, but how?
I looked at the
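For a concrete starting point, the rock-data example from the ppr() help page shows the call; in current R, ppr() lives in the stats package (modreg was later folded into stats):

```r
## projection pursuit regression on the built-in rock data
fit <- ppr(log(perm) ~ area + peri + shape, data = rock,
           nterms = 2, max.terms = 5)
summary(fit)   # prints the fitted projection direction vectors
```

The directions themselves are found numerically, by optimizing a goodness-of-fit criterion over the projection vectors rather than by exhaustive search.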
2005 Apr 11
1
multi-class modeling
Hi,
Just wonder if someone could comment on using linear
discriminant analysis (LDA) vs. multinomial logistic
regression in multi-class classification/prediction
(nominal dependent variable, not ordinal)? What kind of
difference in results can I expect from the 2 methods,
which is better or more appropriate, or under what
condition should I use one instead of the other? And
is there other
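A side-by-side sketch on the built-in iris data (lda() from MASS, multinom() from nnet, both in a standard R install; iris is an assumed stand-in for the poster's data):

```r
library(MASS)   # lda()
library(nnet)   # multinom()

fit.lda <- lda(Species ~ ., data = iris)
fit.mnl <- multinom(Species ~ ., data = iris, trace = FALSE)

## cross-tabulate the two sets of in-sample predicted classes
table(lda = predict(fit.lda)$class, mnl = predict(fit.mnl))
```

Roughly, LDA assumes Gaussian classes with a common covariance, while multinomial logistic regression models the class probabilities directly; with well-behaved data their predictions often largely agree.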
2006 Jan 18
1
Canonical Variance Analysis by any other name?
I've been asked about "Canonical Variance Analysis" (CVA). I don't
see any reference to it searching the R site. Does it go by other
names?
Genstat describes it thus:
Canonical variates analysis operates on a within-group sums of squares
and products matrix, calculated from a set of variates and a factor that
specifies the grouping of units. It finds linear combinations of the
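Canonical variates analysis is, for practical purposes, what MASS::lda() computes: the canonical variate (discriminant) coefficients come back in the scaling component. A minimal sketch on the built-in iris data:

```r
library(MASS)

fit <- lda(Species ~ ., data = iris)
fit$scaling   # canonical variate coefficients (columns LD1, LD2)
```

So searching for "linear discriminant analysis" rather than "CVA" turns up the relevant R tools.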
2000 Mar 20
1
CART and the `tree' contrib package
Dear R people,
I was recently reading the book `Classification and Regression Trees' by
Breiman. This book talks about the CART program. Both Splus and R have
implementations of this. However, the book talks about the possibility of
extending the existing `standard' set of questions (for continuous
variables, these are of the form X < c where X is the variable, c some
const) to
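A minimal sketch of the standard axis-parallel splits that tree() fits (to my knowledge, neither tree nor rpart offers the linear-combination splits Breiman et al. discuss):

```r
library(tree)

## each split question is of the form X < c for a single variable X
fit <- tree(Species ~ ., data = iris)
fit   # prints the split question at each node
```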
2005 Jul 01
1
p-values for classification
Dear All,
I'm classifying some data with various methods (binary classification). I'm interpreting the results via a confusion matrix, from which I calculate the sensitivity and the FDR (false discovery rate). The classifiers are trained on 575 data points and my test set has 50 data points.
I'd like to calculate p-values for obtaining <= FDR and >= sensitivity for each classifier. I was thinking about
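One possible approach, assuming a null success rate is specified in advance (the 0.5 below is a made-up placeholder) and assuming the 50 test points split into 25 true positives: an exact one-sided binomial test per rate:

```r
## hypothetical counts: 22 of an assumed 25 test positives detected;
## null hypothesis is a chance-level sensitivity of 0.5
binom.test(22, 25, p = 0.5, alternative = "greater")
```

The same construction applies to the FDR, using the count of false discoveries among the predicted positives.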
2003 Aug 21
1
LDA in R: how to extract full equation, especially constant term
Hi,
Having dipped my toe into R a few times over the last year or two, in the
last few weeks I've been using it more and more; I'm now a thorough
convert. I've just joined the list, because although it's great, I do have
this problem...
I'm using linear discriminant analysis for binary classification, and am
happy with the classification performance using predict(). What
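A sketch of recovering the implied constant in the two-class case with MASS::lda(): project the group means onto the single discriminant direction and, with equal priors, take their midpoint as the threshold (a two-class subset of iris stands in for the poster's data):

```r
library(MASS)

d <- droplevels(subset(iris, Species != "virginica"))
fit <- lda(Species ~ ., data = d)

## full equation: assign to group 2 when x %*% fit$scaling > const
## (equal priors assumed; unequal priors shift the threshold)
const <- mean(fit$means %*% fit$scaling)
```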
2000 May 01
1
GAMs under R?
At 06:09 AM 5/1/00 +0100, Prof Brian D Ripley wrote:
>On Sun, 30 Apr 2000, Stephen R. Laniel wrote:
>
>> I was just now surprised to note that functions to do generalized additive
>> models don't appear to exist under R 1.0.0. In particular, the gam() and
>> loess() functions aren't there. Are they hidden somewhere and I just
>> haven't noticed?
>
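In current R this gap is long closed: gam() is provided by the recommended mgcv package and loess() is in stats. A minimal sketch on simulated data:

```r
library(mgcv)

set.seed(1)
d <- data.frame(x = runif(200))
d$y <- sin(2 * pi * d$x) + rnorm(200, sd = 0.3)

fit <- gam(y ~ s(x), data = d)   # smoothness chosen automatically
summary(fit)
```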
2004 Jan 27
8
distance between two matrices
Hi all,
Say I have a matrix A with dimension m x 2 and matrix B with
dimension n x 2. I would like to find the row in A that is closest to
each row in B. Here's an example (using a loop):
set.seed(1)
A <- matrix(runif(12), 6, 2) # 6 x 2
B <- matrix(runif(6), 3, 2) # 3 x 2
m <- vector("numeric", nrow(B))
for (j in 1:nrow(B)) {
  d <- (A[, 1] - B[j, 1])^2 + (A[, 2] - B[j, 2])^2
  m[j] <- which.min(d)  # index of the A row nearest B[j, ]
}
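The loop can also be vectorized: outer() builds the full nrow(A) x nrow(B) matrix of squared distances in one step, and which.min per column picks the nearest row of A for each row of B:

```r
set.seed(1)
A <- matrix(runif(12), 6, 2)  # 6 x 2
B <- matrix(runif(6), 3, 2)   # 3 x 2

## D[i, j] is the squared distance from A[i, ] to B[j, ]
D <- outer(A[, 1], B[, 1], "-")^2 + outer(A[, 2], B[, 2], "-")^2
m2 <- apply(D, 2, which.min)  # nearest A row for each row of B
```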
2008 Jan 08
3
GAM, GLM, Logit, infinite or missing values in 'x'
Hi,
I'm running gam (mgcv version 1.3-29) and glm (logit) (stats R 2.61) on
the same models/data, and I got error messages for the gam() model and
warnings for the glm() model.
R-help suggested that the glm() warning messages are due to the model
perfectly predicting binary output. Perhaps the model overfits the data? I
inspected my data and it was not immediately obvious to me (though I
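The "perfect prediction" (complete separation) situation the glm() warnings point to can be reproduced with a toy dataset (made-up data, not the poster's):

```r
## x completely separates y: glm() warns that fitted probabilities
## numerically 0 or 1 occurred, and the coefficient for x diverges
d <- data.frame(x = 1:10, y = rep(0:1, each = 5))
fit <- glm(y ~ x, family = binomial, data = d)
range(fitted(fit))   # essentially 0 and 1
```

Checking fitted() for probabilities pinned at 0 or 1, and coefficients with huge standard errors, is a quick diagnostic when the separation is not obvious from inspecting the raw data.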