Displaying 20 results from an estimated 3000 matches similar to: "Plotting slowly"
2000 Nov 01
0
Loop elimination question
How about:
> dim
[1] 3 1 4 1 5
> N <- length(dim)                        # number of elements in dim
> one <- rep(c(3,4),N)                    # the values, alternating 3,4,3,4,...
> two <- c(rep(1,N),dim)                  # repeat counts: 1 for each 3, dim[i] for each 4
> three <- rep(1:N,rep(2,N))+N*rep(0:1,N) # index interleaving the two halves of 'two'
> rep(one,two[three])                     # i.e. for each d in dim: one 3, then d fours
[1] 3 4 4 4 3 4 3 4 4 4 4 3 4 3 4 4 4 4 4
----------------------
Bendix Carstensen
Senior Statistician
Steno Diabetes Centre
Niels Steensens Vej 2
DK-2820 Gentofte
Denmark
tel: +45 44 43 87 38
mob: +45 28
2000 Aug 20
1
Best subsets regression
Hi all,
has anyone written a function to do "best subsets" regression similar to
the Minitab command?
Regards,
Murray Jorgensen
Murray Jorgensen, Department of Statistics, U of Waikato, Hamilton, NZ
-----[+64-7-838-4773]---------------------------[maj at waikato.ac.nz]-----
"Doubt everything or believe everything:these are two equally convenient
strategies. With either we
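One possibility, assuming the leaps package is acceptable; the response y, data frame dat and model size limits below are purely illustrative:
library(leaps)                                    # exhaustive best-subsets search
fits <- regsubsets(y ~ ., data = dat, nbest = 2, nvmax = 8)
summary(fits)         # which variables enter the best models of each size
summary(fits)$adjr2   # adjusted R^2 for each reported subset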
1999 Jul 16
0
Randomization tests
Please excuse this request from a beginner.
I'm teaching an introductory module on experimental design and I want to
illustrate the idea of tests based on the randomization distribution.
If anyone has written any R code to do a randomization test in a particular
example I'd be very grateful if you could show me your code.
Murray Jorgensen
Murray Jorgensen, Department of Statistics,
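For reference, a minimal sketch of a two-sample randomization test of a difference in means; the data values and group sizes are made up purely for illustration:
set.seed(1)
x <- c(83, 90, 86, 92, 88)               # treatment responses (illustrative)
y <- c(81, 84, 79, 85, 82)               # control responses (illustrative)
obs <- mean(x) - mean(y)                 # observed test statistic
z  <- c(x, y)
nx <- length(x)
perm <- replicate(9999, {
  idx <- sample(length(z), nx)           # re-randomize units to "treatment"
  mean(z[idx]) - mean(z[-idx])
})
mean(c(abs(perm), abs(obs)) >= abs(obs)) # two-sided randomization p-value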
1999 Jun 28
0
Resources for teaching R
Greetings!
Soon I will begin teaching 2 courses to students who have never met R or
SPLUS before. I'm just wondering if list members are aware of any on-line
lecture notes about R or using R to teach statistics apart from the
Venables & Smith notes?
One drawback to R is its name, which is not very useful as a search string
in internet search engines!
Murray Jorgensen, Department of
2001 Mar 07
1
Windows: updating to R 1.2.2
Forgive me, but while I find the documentation for initially installing R
to be clear, I'm not so happy about updating.
I want to update from R 1.2.1 to R 1.2.2 (mainly so I can get the debugged
version of pictex() ).
Now I have downloaded the base zip files, say to Z:\Program Files and my
existing installation of R 1.2.1 is sitting in
Z:\Program Files\rw1021.
Running rwinst.exe I need to
2001 May 22
2
MASS data sets
I'm running R 1.2.2 under windows 98 on a Pentium 133 laptop.
I can't seem to retrieve the package MASS data sets:
> library(MASS)
> data(wtloss)
Warning message:
Data set `wtloss' not found in: data(wtloss)
> data(abbey)
Warning message:
Data set `abbey' not found in: data(abbey)
And yet all the .rda files for the MASS datasets are in
D:\Program
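A couple of diagnostic checks that may help narrow this down (offered as a sketch, not a diagnosis):
library(MASS)
data(package = "MASS")                  # list the data sets R can see in the installed MASS
system.file("data", package = "MASS")   # the data directory R is actually searching
find.package("MASS")                    # confirm which installed copy of MASS is in use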
2006 Jul 21
0
[Fwd: Re: Parameterization puzzle]
Bother! This cold has made me accident-prone. I meant to hit Reply-all.
Clarification below.
-------- Original Message --------
Subject: Re: [R] Parameterization puzzle
Date: Fri, 21 Jul 2006 19:10:03 +1200
From: Murray Jorgensen <maj at waikato.ac.nz>
To: Prof Brian Ripley <ripley at stats.ox.ac.uk>
References: <44C063E5.3020703 at waikato.ac.nz>
2006 May 02
0
Pasting data into scan() - oops!
I forgot to mention that I am using Windows XP.
-------- Original Message --------
Subject: Pasting data into scan()
Date: Tue, 02 May 2006 11:55:03 +1200
From: Murray Jorgensen <maj at stats.waikato.ac.nz>
To: r-help at stat.math.ethz.ch
The file TENSILE.DAT from the Hand et al "Handbook of Small Data Sets"
looks like this:
[...]
--
Dr Murray Jorgensen
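For reference, two common ways of getting pasted values into scan() on Windows (a sketch only):
x <- scan()             # paste the values at the console, then a blank line to finish
x <- scan("clipboard")  # Windows only: read numeric values straight from the clipboard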
2003 Jan 12
1
likelihood and score interval estimates for glms
G'day list!
I'm thinking about programming likelihood and score intervals for
generalized linear models in R based on the paper "On the computation of
likelihood ratio and score test based confidence intervals in
generalized linear models" by Juha Alho (1992) (Statistics in Medicine,
11, 923-930).
Being lazy, I thought that I would ask if anyone else on the list has
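For the likelihood-ratio half of that plan, profile-likelihood intervals for glm coefficients are already available via confint() (the profiling method is supplied by MASS); a quick sketch on the toy Poisson example from ?glm:
library(MASS)
counts    <- c(18, 17, 15, 20, 10, 20, 25, 13, 12)
outcome   <- gl(3, 1, 9)
treatment <- gl(3, 3)
fit <- glm(counts ~ outcome + treatment, family = poisson())
confint(fit)           # profile (likelihood-ratio based) intervals
confint.default(fit)   # Wald intervals, for comparison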
2006 Jun 05
1
Extracting Variance components
I can ask my question using an example from Chapter 1 of Pinheiro & Bates.
> # 1.4 An Analysis of Covariance Model
>
> OrthoFem <- Orthodont[ Orthodont$Sex == "Female", ]
> fm1OrthF <-
+ lme( distance ~ age, data = OrthoFem, random = ~ 1 | Subject )
> summary( fm1OrthF )
Linear mixed-effects model fit by REML
Data: OrthoFem
AIC BIC
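One way to extract the variance components from such a fit is nlme's VarCorr(); a self-contained sketch repeating the setup above:
library(nlme)
OrthoFem <- Orthodont[Orthodont$Sex == "Female", ]
fm1OrthF <- lme(distance ~ age, data = OrthoFem, random = ~ 1 | Subject)
VarCorr(fm1OrthF)                              # between-subject and residual variances
as.numeric(VarCorr(fm1OrthF)[, "Variance"])    # the same numbers as a numeric vector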
2000 Oct 11
0
Balanced incomplete block analysis
At 07:12 AM 10-10-00 +0100, Prof Brian D Ripley wrote:
>On Tue, 10 Oct 2000, Murray Jorgensen wrote:
>
>> Excuse me everyone, but I don't have to teach this very often!
>>
>> Has anyone got some R code for doing adjusted treatment means and the
>> recovery of inter-block information in the analysis of balanced incomplete
>> block designs?
>
>Do you
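For what it is worth, a sketch of one conventional route, with hypothetical column names yield, treat and block in a data frame bibd: an intra-block fixed-effects analysis, plus a mixed model with random blocks to recover the inter-block information:
intra <- aov(yield ~ block + treat, data = bibd)   # intra-block analysis (blocks fixed)
summary(intra)
library(nlme)
comb <- lme(yield ~ treat, random = ~ 1 | block, data = bibd)  # blocks random:
summary(comb)                                                  # recovers inter-block information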
2005 Sep 08
1
Coarsening Factors
It is not uncommon to want to coarsen a factor by grouping levels
together. I have found one way to do this in R:
> sites
[1] F A A D A A B F C F A D E E D C F A E D F C E D E F F D B C
Levels: A B C D E F
> regions <- list(I = c("A","B","C"), II = "D", III = c("E","F"))
> library(Epi)
> region <-
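Base R can do the same grouping without an extra package, by assigning a named list to levels():
region <- sites
levels(region) <- list(I = c("A","B","C"), II = "D", III = c("E","F"))
table(region)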
2002 Oct 28
2
Combining simulation results
In one saved workspace I have the results of a simulation experiment stored
as an array "resarray".
> dim(resarray)
[1] 10 6 500 3
In another workspace I have a similar array from another run of the
simulation.
I want to combine the two arrays into a single array of dimensions
10, 6, 1000, 3
What's the best way to do this?
Murray Jorgensen
Dr Murray Jorgensen
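One convenient answer, assuming the abind package is available; resarray1 and resarray2 stand for the arrays loaded from the two workspaces:
library(abind)
combined <- abind(resarray1, resarray2, along = 3)  # bind along the third (replication) dimension
dim(combined)                                       # should be 10 6 1000 3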
2008 Mar 02
2
Recommended Packages
Having just updated to R 2.6.2 on my old Windows laptop I notice that the
number of packages is growing exponentially and my usual approach of
get-em-all may not be viable much longer. Has any thought been given to
dividing "contributed" binaries into a recommended set, perhaps a couple
of hundred, and the remainder? That way one could install the recommended
ones routinely and add in
2007 Nov 01
0
Reading R-help Digests with Mozilla Thunderbird
This is somewhat off-topic but I think that an answer may help other
users of R-help.
Thunderbird tries to help in the display of messages by "greying out"
quoted text. However when reading r-help in digest form it gets
thoroughly confused and usually ends up greying out the fresh text in a
message. [I'm using Windows XP].
Does anyone know how to turn off this Thunderbird
2006 Nov 13
1
stepAIC for overdispersed Poisson
I am wondering if stepAIC in the MASS library may be used for model
selection in an overdispersed Poisson situation. What I thought of doing
was to get an estimate of the overdispersion parameter phi from fitting
a model with all or most of the available predictors (we have a large
number of observations so this should not be problematical) and then use
stepAIC with scale = phi. Should this
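A sketch of the procedure being described; whether stepAIC() really honours scale for glm fits is exactly the question asked, so this only illustrates the proposal, with a placeholder data frame dat and predictors:
library(MASS)
full <- glm(y ~ x1 + x2 + x3 + x4, family = poisson, data = dat)
phi  <- sum(residuals(full, type = "pearson")^2) / df.residual(full)  # dispersion estimate
sel  <- stepAIC(full, scale = phi)      # AIC comparisons computed with this fixed scale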
2005 Oct 06
0
R for teaching multivariate statistics (Summary)
Greetings all
I promised a summary of the responses that I got to my question:
"Next year I will be teaching a third year course in applied statistics
about 1/3 of which is multivariate statistics. I would be interested in
hearing experiences from those who have taught multivariate statistics
using R. Especially I am interested in the textbook that you used or
recommended."
There
2004 Jul 12
1
Nested source()s
I had an error message while running a macro from Yudi Pawitan's web site:
> source("ex2-13.r")
Error in parse(file, n, text, prompt) : syntax error on line 2
Inspecting ex2-13.r I found that the error was generated by another
source() command.
Clearly R does not like nested source()s, which is fair enough when you
think about it. Still it's something that you might want
2005 Dec 22
1
Huber location estimate
We have a choice when calculating the Huber location estimate:
> set.seed(221205)
> y <- 7 + 3*rt(30,1)
> library(MASS)
> huber(y)$mu
[1] 5.9117
> coefficients(rlm(y~1))
(Intercept)
5.9204
I was surprised to get two different results. The function huber() works
directly with the definition whereas rlm() uses iteratively reweighted
least squares.
My surprise is
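Part of the discrepancy is the default tuning constant: huber() uses k = 1.5 while rlm()'s psi.huber defaults to k = 1.345, and rlm() also re-estimates the scale during its iterations. A sketch of putting the two on a common constant (per ?rlm, extra arguments are passed on to the psi function):
library(MASS)
set.seed(221205)
y <- 7 + 3 * rt(30, 1)
huber(y, k = 1.345)$mu             # huber() at rlm()'s default tuning constant
coefficients(rlm(y ~ 1, k = 1.5))  # rlm() at huber()'s default constant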
2006 Jul 16
1
princomp and eigen
Consider the following output [R2.2.0; Windows XP]
> set.seed(160706)
> X <- matrix(rnorm(40),nrow=10,ncol=4)
> Xpc <- princomp(X,cor=FALSE)
> summary(Xpc,loadings=TRUE, cutoff=0)
Importance of components:
Comp.1 Comp.2 Comp.3 Comp.4
Standard deviation 1.2268300 0.9690865 0.7918504 0.55295970
Proportion of Variance 0.4456907 0.2780929
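For the comparison the subject line suggests: princomp() uses the divisor n (not n - 1) for the covariance matrix, so its output can be reproduced from eigen() like this:
set.seed(160706)
X <- matrix(rnorm(40), nrow = 10, ncol = 4)
n <- nrow(X)
ev <- eigen(cov(X) * (n - 1) / n, symmetric = TRUE)
sqrt(ev$values)   # matches princomp(X, cor = FALSE)$sdev
ev$vectors        # matches the loadings, up to sign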