similar to: Error in integrate

Displaying 20 results from an estimated 7000 matches similar to: "Error in integrate"

2006 Jul 21
0
[Fwd: Re: Parameterization puzzle]
Bother! This cold has made me accident-prone. I meant to hit Reply-all. Clarification below. -------- Original Message -------- Subject: Re: [R] Parameterization puzzle Date: Fri, 21 Jul 2006 19:10:03 +1200 From: Murray Jorgensen <maj at waikato.ac.nz> To: Prof Brian Ripley <ripley at stats.ox.ac.uk> References: <44C063E5.3020703 at waikato.ac.nz>
2006 Jun 05
1
Extracting Variance components
I can ask my question using an example from Chapter 1 of Pinheiro & Bates. > # 1.4 An Analysis of Covariance Model > > OrthoFem <- Orthodont[ Orthodont$Sex == "Female", ] > fm1OrthF <- + lme( distance ~ age, data = OrthoFem, random = ~ 1 | Subject ) > summary( fm1OrthF ) Linear mixed-effects model fit by REML Data: OrthoFem AIC BIC
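A minimal sketch of one way to pull the variance components out of an lme fit like the one above (assuming nlme's VarCorr; the "Variance" column name is taken from its printed output):
library(nlme)
OrthoFem <- Orthodont[Orthodont$Sex == "Female", ]
fm1OrthF <- lme(distance ~ age, data = OrthoFem, random = ~ 1 | Subject)
vc <- VarCorr(fm1OrthF)                  # matrix of variances and standard deviations
vc
as.numeric(vc[, "Variance"])             # random-intercept variance and residual variance
intervals(fm1OrthF, which = "var-cov")   # confidence intervals for the sd components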
2008 Mar 02
2
Recommended Packages
Having just updated to R 2.6.2 on my old Windows laptop I notice that the number of packages is growing exponentially and my usual approach of get-em-all may not be viable much longer. Has any thought been given to dividing "contributed" binaries into a recommended set, perhaps a couple of hundred, and the remainder? That way one could install the recommended ones routinely and add in
2006 May 02
0
Pasting data into scan() - oops!
I forgot to mention that I am using Windows XP. -------- Original Message -------- Subject: Pasting data into scan() Date: Tue, 02 May 2006 11:55:03 +1200 From: Murray Jorgensen <maj at stats.waikato.ac.nz> To: r-help at stat.math.ethz.ch The file TENSILE.DAT from the Hand et al "Handbook of Small Data Sets" looks like this: [...] -- Dr Murray Jorgensen
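On Windows (as above) one way to get pasted data into scan() is the clipboard pseudo-file; a sketch, not from the original message:
# copy the numbers in TENSILE.DAT to the clipboard, then:
x <- scan("clipboard")   # whitespace-separated numbers read from the Windows clipboard
# or call scan() with no file argument, paste the lines at the prompt,
# and finish with an empty line:
x <- scan()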
2006 Nov 13
2
A printing "macro"
I am exploring the result of clustering a large multivariate data set into a number of groups, represented, say, by a factor G. I wrote a function to see how categorical variables vary between groups: > ddisp <- function(dvar) { + csqt <- chisq.test(G,dvar) + print(csqt$statistic) + print(csqt$observed) + print(round(csqt$expected)) + round(csqt$residuals) + } > > x
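A self-contained sketch of a function along these lines (my own argument names and made-up data, not the poster's exact code):
ddisp <- function(group, dvar) {
  csqt <- chisq.test(group, dvar)    # chi-squared test of the cross-tabulation
  print(csqt$statistic)              # X-squared
  print(csqt$observed)               # observed counts
  print(round(csqt$expected))        # expected counts under independence
  round(csqt$residuals, 2)           # Pearson residuals, the function's value
}
set.seed(1)
G <- factor(sample(1:3, 200, replace = TRUE))
v <- factor(sample(c("a", "b"), 200, replace = TRUE))
ddisp(G, v)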
2008 Oct 11
1
step() and stepAIC()
The birth weight example from ?stepAIC in package MASS runs well as indeed it should. However when I change stepAIC() calls to step() calls I get warning messages that I don't understand, although the output is similar. Warning messages: 1: In model.response(m, "numeric") : using type="numeric" with a factor response will be ignored (and three more the same.) Checked
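A sketch of the side-by-side comparison, assuming the bwt data frame is built as in the ?birthwt / ?stepAIC examples in MASS:
library(MASS)
example(birthwt)                                   # constructs bwt in the workspace
birthwt.glm <- glm(low ~ ., family = binomial, data = bwt)
fit.stepAIC <- stepAIC(birthwt.glm, trace = FALSE) # MASS::stepAIC
fit.step    <- step(birthwt.glm, trace = FALSE)    # stats::step, may warn as above
formula(fit.stepAIC)
formula(fit.step)                                  # compare the selected models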
2005 Sep 08
1
Coarsening Factors
It is not uncommon to want to coarsen a factor by grouping levels together. I have found one way to do this in R: > sites [1] F A A D A A B F C F A D E E D C F A E D F C E D E F F D B C Levels: A B C D E F > regions <- list(I = c("A","B","C"), II = "D", III = c("E","F")) > library(Epi) > region <-
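Base R can do the same regrouping without an extra package by assigning a named list to levels(); a sketch with the sites and regions from the post:
sites <- factor(c("F","A","A","D","A","A","B","F","C","F","A","D","E","E","D",
                  "C","F","A","E","D","F","C","E","D","E","F","F","D","B","C"))
region <- sites
levels(region) <- list(I = c("A","B","C"), II = "D", III = c("E","F"))
table(sites, region)   # check that the levels were grouped as intended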
2007 Dec 05
2
Dimension of a vector
Consider the following: > A <- 1:10 > A [1] 1 2 3 4 5 6 7 8 9 10 > dim(A) NULL > dim(A) <- c(2,5) > A [,1] [,2] [,3] [,4] [,5] [1,] 1 3 5 7 9 [2,] 2 4 6 8 10 > dim(A) [1] 2 5 > dim(A) <- 10 > A [1] 1 2 3 4 5 6 7 8 9 10 > dim(A) [1] 10 Would it not make sense to have dim(A) = length(A) for all vectors?
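A short illustration of the distinction the transcript turns on: a plain atomic vector has a NULL dim, and assigning one (even of length one) turns it into a one-dimensional array. A sketch, not from the thread:
A <- 1:10
is.array(A); dim(A)     # FALSE, NULL: a plain vector carries no dim attribute
dim(A) <- length(A)     # now a 1-d array of length 10
is.array(A); dim(A)     # TRUE, 10
dim(A) <- NULL          # drop the attribute to get a plain vector back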
2009 Nov 25
2
R or C++ on FreeNX servers
Hi all, I have just found out that the machine learning group in our Faculty has a lot of spare capacity on their FreeNX servers. I do not know a lot about these beasts but I understand that they are a free version of something produced by a firm called "NoMachine". They are designed for executing parallel algorithms and I thought that they might be of use in a project of mine
2010 Jun 14
2
Html help
I have just installed R 2.11.1 on my XP laptop. I like html help for browsing but text help for on-the-fly look-ups. I was a bit surprised when I was asked to choose between them during the installation. I chose text, thinking I could fix the html help later, which is what I am trying to do now. Now when I ask for html help my browser goes to 'http://-ip-number-/doc/html/index.html'
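The help style can also be switched after installation rather than by reinstalling; a sketch using the help_type option (documented in ?help, and settable in Rprofile.site to make it permanent):
options(help_type = "html")   # start the local help server and open pages in the browser
help(lm)
options(help_type = "text")   # switch back to text help on the fly
?lm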
2005 Apr 05
1
nlme & SASmixed in 2.0.1
I assigned a class the first problem in Pinheiro & Bates, which uses the data set PBIB from the SASmixed package. I have recently downloaded 2.0.1 and its associated packages. On trying library(SASmixed) data(PBIB) library(nlme) plot(PBIB) I get a warning message Warning message: replacing previous import: coef in: namespaceImportFrom(self, asNamespace(ns)) after library(nlme) and a
2008 Aug 24
1
Extracting formula from an lm object
I want to extract the part of the formula not including the response variable from an lm object. For example if the lm object ABx.lm was created by the call ABx.lm <- lm( y ~ A + B + x, ...) Then ABx.lm is saved as part of a workspace. I wish to extract "~ A + B + x". Later in my code I will fit another linear model of the form z ~ A + B + x for some other response variable z. I
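A sketch of some ways to reuse the right-hand side of a fitted model's formula for a new response, with a made-up data frame standing in for the saved workspace:
set.seed(1)
dat <- data.frame(y = rnorm(20), z = rnorm(20),
                  A = rnorm(20), B = rnorm(20), x = rnorm(20))
ABx.lm <- lm(y ~ A + B + x, data = dat)

formula(delete.response(terms(ABx.lm)))   # ~ A + B + x, response dropped
z.lm  <- update(ABx.lm, z ~ .)            # refit with z as the response
# or rebuild the formula explicitly:
z.lm2 <- lm(reformulate(attr(terms(ABx.lm), "term.labels"), response = "z"),
            data = dat)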
2006 Nov 13
1
stepAIC for overdispersed Poisson
I am wondering if stepAIC in the MASS library may be used for model selection in an overdispersed Poisson situation. What I thought of doing was to get an estimate of the overdispersion parameter phi from fitting a model with all or most of the available predictors (we have a large number of observations so this should not be problematical) and then use stepAIC with scale = phi. Should this
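A sketch of that two-stage recipe on made-up overdispersed counts; the key step is passing the Pearson-based dispersion estimate to stepAIC through its scale argument:
library(MASS)
set.seed(1)
d <- data.frame(x1 = rnorm(200), x2 = rnorm(200), x3 = rnorm(200))
d$y <- rnbinom(200, mu = exp(1 + 0.5 * d$x1), size = 2)   # overdispersed relative to Poisson

full <- glm(y ~ x1 + x2 + x3, family = poisson, data = d)
phi  <- sum(residuals(full, type = "pearson")^2) / df.residual(full)  # dispersion estimate
chosen <- stepAIC(full, scale = phi, trace = FALSE)
formula(chosen)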
2005 Dec 22
1
Huber location estimate
We have a choice when calculating the Huber location estimate: > set.seed(221205) > y <- 7 + 3*rt(30,1) > library(MASS) > huber(y)$mu [1] 5.9117 > coefficients(rlm(y~1)) (Intercept) 5.9204 I was surprised to get two different results. The function huber() works directly with the definition whereas rlm() uses iteratively reweighted least squares. My surprise is
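Part of the gap is the tuning constant: huber() defaults to k = 1.5 while rlm()'s Huber psi defaults to k = 1.345, and the two also handle the scale estimate differently. A sketch to see how much the constant accounts for, under those assumptions:
library(MASS)
set.seed(221205)
y <- 7 + 3 * rt(30, 1)
huber(y)$mu                    # k = 1.5, MAD scale held fixed
huber(y, k = 1.345)$mu         # same tuning constant as rlm's default
coefficients(rlm(y ~ 1))       # IWLS with re-estimated scale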
2006 Jul 16
1
princomp and eigen
Consider the following output [R2.2.0; Windows XP] > set.seed(160706) > X <- matrix(rnorm(40),nrow=10,ncol=4) > Xpc <- princomp(X,cor=FALSE) > summary(Xpc,loadings=TRUE, cutoff=0) Importance of components: Comp.1 Comp.2 Comp.3 Comp.4 Standard deviation 1.2268300 0.9690865 0.7918504 0.55295970 Proportion of Variance 0.4456907 0.2780929
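The correspondence can be checked directly: princomp() works from the covariance matrix with divisor n rather than n - 1, so its squared sdev values are the eigenvalues of that rescaled matrix. A sketch continuing the simulated X above:
set.seed(160706)
X   <- matrix(rnorm(40), nrow = 10, ncol = 4)
Xpc <- princomp(X, cor = FALSE)

n  <- nrow(X)
ev <- eigen(cov(X) * (n - 1) / n)$values   # covariance rescaled to divisor n
cbind(princomp = Xpc$sdev^2, eigen = ev)   # the two columns should agree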
2006 Sep 20
1
Pooled Covariance Matrix
I am in a discriminant analysis situation with a frame containing several variables and a grouping factor, if you like: set.seed(200906) exampledf <- as.data.frame(matrix(rnorm(50,5,2),nrow=10,ncol=5)) exampledf$Group <- factor(rep(c(1,2,3),c(3,3,4))) exampledf I'm sure there must be a simple way to get the within group pooled covariance matrix but I haven't found it yet. I
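One direct way to pool the within-group covariance matrices, weighting each by its degrees of freedom; a sketch using the example frame above:
set.seed(200906)
exampledf <- as.data.frame(matrix(rnorm(50, 5, 2), nrow = 10, ncol = 5))
exampledf$Group <- factor(rep(c(1, 2, 3), c(3, 3, 4)))

covs <- lapply(split(exampledf[, 1:5], exampledf$Group), cov)  # per-group covariances
df   <- table(exampledf$Group) - 1                             # n_i - 1 for each group
pooled <- Reduce(`+`, Map(`*`, covs, as.numeric(df))) / sum(df)
pooled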
2006 Apr 14
5
vector-factor operation
I found myself wanting to average a vector [vec] within each level of a factor [Fac], returning a vector of the same length as vec. After a while I realised that lm1 <- lm(vec ~ Fac) fitted(lm1) did what I want. But there must be another way to do this, and it would be good to be able to apply other functions than mean() in this way. Cheers, Murray -- Dr Murray Jorgensen
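The idiom being reached for is ave(), which returns a vector of the same length as its input and accepts any summary function; a sketch with made-up data:
set.seed(1)
vec <- rnorm(12)
Fac <- factor(rep(c("a", "b", "c"), each = 4))

ave(vec, Fac)                 # group means, one per element of vec
ave(vec, Fac, FUN = median)   # any other summary in place of mean
all.equal(ave(vec, Fac), unname(fitted(lm(vec ~ Fac))))   # matches the lm trick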
2011 Jan 05
4
Converting Fortran or C++ etc to R
I'm going to try my hand at converting some Fortran programs to R. Does anyone know of any good articles giving hints at such tasks? I will post a selective summary of my gleanings. Cheers, Murray -- Dr Murray Jorgensen http://www.stats.waikato.ac.nz/Staff/maj.html Department of Statistics, University of Waikato, Hamilton, New Zealand Email: maj at waikato.ac.nz
2007 Nov 01
0
Reading R-help Digests with Mozilla Thunderbird
This is somewhat off-topic but I think that an answer may help other users of R-help. Thunderbird tries to help in the display of messages by "greying out" quoted text. However when reading r-help in digest form it gets thoroughly confused and usually ends up greying out the fresh text in a message. [I'm using Windows XP]. Does anyone know how to turn off this Thunderbird
2005 Oct 06
0
R for teaching multivariate statistics (Summary)
Greetings all I promised a summary of the responses that I got to my question: "Next year I will be teaching a third year course in applied statistics about 1/3 of which is multivariate statistics. I would be interested in hearing experiences from those who have taught multivariate statistics using R. Especially I am interested in the textbook that you used or recommended." There