Displaying 20 results from an estimated 400 matches similar to: "map data.frame() data after having linked them to a read.shape() object"
2011 Apr 06
2
glm predict on new data
I am aware this has been asked before, but I could not find a resolution.
I am fitting a logit model:
lg <- glm(y[1:200] ~ x[1:200,1],family=binomial)
Then I want to predict a new set
pred <- predict(lg,x[201:250,1],type="response")
But I get varying error messages or warnings about differing numbers of
rows. I have tried data/newdata and also wrapping in data.frame(), but cannot
get
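A minimal sketch of the usual fix, under the assumption that x is a matrix and y a vector as in the post: fit the model from a data frame and give predict() a newdata data frame whose column name matches the one used in the formula.
set.seed(1)                              ## made-up example data
x <- matrix(rnorm(250), ncol = 1)
y <- rbinom(250, 1, plogis(x[, 1]))
train <- data.frame(y = y[1:200], x1 = x[1:200, 1])
lg    <- glm(y ~ x1, family = binomial, data = train)
newd  <- data.frame(x1 = x[201:250, 1])  ## same column name as in the fit
pred  <- predict(lg, newdata = newd, type = "response")
length(pred)                             ## 50 predictions, one per new row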
2011 Nov 02
1
nproc parameter in efpFunctional
Hello all,
could anyone explain the exact meaning of the parameter nproc? Why do
different values of nproc give such different critical values, e.g.
meanL2BB$computeCritval(0.05,nproc=3)
[1] 0.9984853
meanL2BB$computeCritval(0.05,nproc=1)
[1] 0.4594827
The strucchange-package description gives "integer specifying for which
number of processes Brownian motions should be simulated" - do I need
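As far as I understand it, nproc is the number of component processes monitored jointly (e.g. the number of regression coefficients whose fluctuation processes enter the test). A rough Monte Carlo sketch (not the strucchange code) of why a meanL2-type critical value grows with nproc: the statistic is the time-average of the squared Brownian bridge summed over nproc independent components.
critval_sim <- function(alpha, nproc, nobs = 500, nrep = 5000) {
  stat <- replicate(nrep, {
    bb <- sapply(seq_len(nproc), function(i) {
      w <- cumsum(rnorm(nobs)) / sqrt(nobs)   ## approximate Brownian motion
      w - (seq_len(nobs) / nobs) * w[nobs]    ## turn it into a bridge
    })
    mean(rowSums(bb^2))                       ## time-average of squared norm
  })
  unname(quantile(stat, 1 - alpha))
}
set.seed(1)
critval_sim(0.05, nproc = 1)   ## roughly 0.46, cf. computeCritval above
critval_sim(0.05, nproc = 3)   ## roughly 1.0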
2003 May 22
7
extract half a matrix
Dear all,
I'm new to matrix operations in R. I couldn't find a solution to the
following problem among earlier help mails or in An Introduction to R, I guess
because the question is really basic.
I want to extract all above the diagonal, i.e. from
   1  2  3  4
1  0 26 49 49
2 26  0 44 40
3 49 44  0 21
4 49 40 21  0
I want
26
49
44
49
40
21
Thanks in advance!
Sincerely,
Tord
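A minimal sketch: upper.tri() indexes everything above the diagonal, and the extracted values come back in column-major order, which is exactly the order listed above.
m <- matrix(c( 0, 26, 49, 49,
              26,  0, 44, 40,
              49, 44,  0, 21,
              49, 40, 21,  0), nrow = 4, byrow = TRUE)
m[upper.tri(m)]          ## 26 49 44 49 40 21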
2007 Mar 19
5
order of values in vector
Dear all,
I would like to get the order of the values in a vector. I have tried
rank(), order() and searched the archive, though without success.
Here is an example of one attempt:
x <- c(20, 30, 50, 40, 60, 10)
cbind(sort.list(x),x)
x
[1,] 6 20
[2,] 1 30
[3,] 2 50
[4,] 4 40
[5,] 3 60
[6,] 5 10
but I was hoping to get this:
x
[1,] 2 20
[2,] 3 30
[3,] 5 50
[4,] 4 40
[5,] 6 60
[6,] 1 10
I'm
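A minimal sketch: rank() gives each element's position in the sorted vector, which is the pairing asked for above (order()/sort.list() instead give the permutation that sorts x, which is what the cbind() output shows).
x <- c(20, 30, 50, 40, 60, 10)
rank(x)                  ## 2 3 5 4 6 1
cbind(rank(x), x)        ## reproduces the wanted pairing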
2011 Nov 10
1
efpFunctional construction (strucchange package)
Hello,
to understand better how efpFunctional works, I'm trying to construct my own
functionals, but I have some questions about the already existing ones. With
maxBB it is clear:
functional = list(comp = function(x) max(abs(x)), time = max),
with rangeBB:
functional = list(time = function(x) max(x)-min(x), comp = max),
with meanL2BB, if I understood correctly:
functional =
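A sketch under the assumption that strucchange's efpFunctional() accepts a `functional` list with `comp` and `time` components as quoted above; it essentially rebuilds rangeBB and then asks the resulting object for a simulated critical value.
library(strucchange)
myRangeBB <- efpFunctional(
  functional = list(time = function(x) diff(range(x)), comp = max))
myRangeBB$computeCritval(0.05, nproc = 1)   ## simulated critical value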
2003 Dec 07
2
par(las = 1) not possible in polymap(), library(splancs)?
Dear all,
I want my PhD thesis, which I hand in tomorrow, to look even nicer:
Does polymap in the splancs library not allow horizontal plotting of
y-labels? I have tried
polymap(studyarea, xlab = "x (m)", ylab = "y (m)", las = 1)
but it doesn't change the labels. Maybe some function in library(spatstat)
supports las?
Thanks!
Sincerely,
Tord
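A minimal sketch of the usual workaround: set las globally with par() before the plot call instead of passing it to polymap() (the splancs package is assumed; the square polygon is just made-up study-area coordinates).
library(splancs)
studyarea <- cbind(x = c(0, 100, 100, 0), y = c(0, 0, 100, 100))
op <- par(las = 1)                 ## horizontal axis labels for all plots
polymap(studyarea, xlab = "x (m)", ylab = "y (m)")
par(op)                            ## restore the previous setting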
2003 Jan 22
3
Error when using polr() in MASS
Dear all,
I get an error message when I use polr() in MASS. These are my data:
   skugg grupp frekv
4      1   gr3     0
5      2   gr3     3
6      3   gr3     6
10     1   gr5     1
11     2   gr5    12
12     3   gr5     1
>
> summary(polr(skugg ~ grupp, weights=frekv, data= skugg.cpy1.dat))
Error in optim(start, fmin, gmin, method = "BFGS", hessian = Hess, ...) :
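A hedged sketch of two common fixes (the error text above is cut off, so this is a guess): make the response an ordered factor and drop the zero-weight row before calling polr(). Object and column names follow the post.
library(MASS)
dat <- subset(skugg.cpy1.dat, frekv > 0)      ## zero weights can upset optim()
dat$skugg <- factor(dat$skugg, levels = 1:3, ordered = TRUE)
summary(polr(skugg ~ grupp, weights = frekv, data = dat))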
2003 Feb 05
1
simplify a data frame
Dear all,
For the past three hours I have tried to simplify a data frame. I would be
really happy if someone could help solve this, I'm sure simple, problem.
I want to "aggregate" the data frame:
ObjektID BalteNummer Baltessegment
S.13 S.13.1 S.13.1.2
S.13 S.13.1 S.13.1.3
S.13 S.13.2 S.13.2.1
S.13 S.13.2 S.13.2.2
S.13 S.13.2 S.13.2.3
S.13 S.13.3 S.13.3.6
S.13 S.13.3 S.13.3.7
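The wanted result is cut off above, so this is only a guess at the intent: collapse to one row per belt, counting its distinct segments. Column names follow the post; the data frame is called d here.
aggregate(Baltessegment ~ ObjektID + BalteNummer, data = d,
          FUN = function(x) length(unique(x)))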
2003 Nov 23
4
remove 0 rows from a data frame
Dear all,
As part of a larger function, I am randomly removing rows from a data
frame. The number of removed rows is determined by a Poisson distribution
with a low mean. Sometimes, the random number is 0, and that's when the
problem starts:
My data frame:
> temp
occ x y dbh age
801 0 2977.196 3090.225 6 36.0
802 0 2951.892 3083.769 8 40.6
803 0 2919.111
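A minimal sketch of the usual pitfall and fix: when the index vector is empty, temp[-drop, ] drops all rows (because -integer(0) is still integer(0)), so only index negatively when there is something to remove.
n.remove <- rpois(1, lambda = 0.5)                 ## low-mean Poisson, may be 0
drop     <- sample(nrow(temp), min(n.remove, nrow(temp)))
if (length(drop) > 0) temp <- temp[-drop, ]        ## leave temp alone when 0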
2003 Jul 21
2
bold AND italic as font in text()
Dear all,
Is it possible to somehow plot text as italic AND bold? I tried font=c(2,3)
in text(), but it doesn't work; it seems like only the latter value is used.
Thanks in advance!
Sincerely,
Tord
-----------------------------------------------------------------------
Tord Snäll
Avd. f växtekologi, Evolutionsbiologiskt centrum, Uppsala universitet
Dept. of Plant Ecology, Evolutionary Biology
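A minimal sketch: font takes one code per string, and code 4 means bold italic (1 = plain, 2 = bold, 3 = italic), so use font = 4 rather than font = c(2, 3).
plot(1, type = "n")
text(1, 1, "bold and italic", font = 4)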
2002 Jan 04
4
line up a matrix
Dear all,
I am trying to rearrange my reference database (now in Excel!! :( ) so I can
import it into a reference manager program (RIS format).
My file basically looks like this [3,4] matrix:
rbind(c("a", "b", "c", "d"), c("e", "f", "g", "h"), c("i", "j", "k", "l"))
[,1] [,2] [,3]
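The target layout is cut off above, so this is only a guess: stack the rows of the matrix into a single column, one field per line, which is roughly the long one-field-per-line shape that RIS files use.
m <- rbind(c("a", "b", "c", "d"),
           c("e", "f", "g", "h"),
           c("i", "j", "k", "l"))
cbind(as.vector(t(m)))     ## "a" "b" "c" "d" "e" "f" ... as one column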
2006 Dec 27
3
counties in different colours using map()
Hi,
I would like to plot a map of US counties using different colors. map()
seems to be the function to use, e.g.
library(maps); map('usa'); map('county', 'colorado', add=T,fill = T,
col=c(1:5))
plots Colorado counties using colours 1 to 5.
However, I want each color to represent a certain value - a value to be
picked from a data frame.
This code should show a
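A sketch of one way to do this, assuming the maps package plus a made-up data frame vals with columns county (lower-case names) and value: fetch the polygon names in plotting order, look each value up, and convert it to a colour class.
library(maps)
nms  <- map("county", "colorado", namesonly = TRUE, plot = FALSE)
idx  <- match(nms, paste("colorado", vals$county, sep = ","))
cols <- heat.colors(5)[cut(vals$value[idx], 5)]    ## 5 colour classes
map("county", "colorado", fill = TRUE, col = cols)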
2003 Oct 08
2
binomial glm warnings revisited
Dear all,
Last autumn there was some discussion on the list of the warning
Warning message:
fitted probabilities numerically 0 or 1 occurred in: (if
(is.empty.model(mt)) glm.fit.null else glm.fit)(x = X, y = Y,
when fitting binomial GLMs with many 0s and few 1s.
Parts of replies:
"You should be able to tell which coefficients are infinite -- the
coefficients and their standard errors will
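A minimal sketch (made-up data) that reproduces the warning: with complete separation the maximum likelihood estimate does not exist, the fitted probabilities go to 0 or 1, and the affected coefficient and its standard error blow up.
x <- 1:8
y <- c(0, 0, 0, 0, 1, 1, 1, 1)           ## perfectly separated by x
fit <- glm(y ~ x, family = binomial)     ## warns: fitted probabilities 0 or 1
summary(fit)$coefficients                ## huge estimate and standard error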
2003 Oct 20
4
warning from return() in 1.8 but not in 1.7.0 (PR#4687)
To whom it may concern,
I get the following message when I run my function:
Warning message:
multi-argument returns are deprecated in: return(call.fn, repl, time, from,
to, last.year, occup.m, ant.occ.m,
> version
platform i386-pc-mingw32
arch i386
os mingw32
system i386, mingw32
status
major 1
minor 8.0
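A minimal sketch of the change the warning asks for: return a single object (a list) instead of several comma-separated values. The argument names are taken from the warning message; the surrounding function is made up.
my.fn <- function(call.fn, repl, time, from, to, last.year, occup.m, ant.occ.m) {
  ## ... body of the original function ...
  return(list(call.fn = call.fn, repl = repl, time = time, from = from,
              to = to, last.year = last.year,
              occup.m = occup.m, ant.occ.m = ant.occ.m))
}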
2003 Jul 31
2
how as.numeric() !-> factor
Dear all,
I have divided two vectors:
Np.occup97.98<- as.data.frame(cbind(site = levels(sums$site),
Np.occup97.98 = sums$Ant.Nptrad97.98/Ant.trad$Ant.trad97.98))
> Np.occup97.98
site Np.occup97.98
1 erken97 0.342592592592593
2 erken98 0.333333333333333
3 rormyran 0.48471615720524
4 valkror 0.286026200873362
However, at a later stage of the analysis I want
>
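A minimal sketch of the usual cause and fix: cbind() of a character and a numeric vector makes a character matrix, so as.data.frame() turns the numbers into a factor. Either build the data frame directly, or rescue the factor column with as.numeric(as.character(...)). Object names follow the post.
Np.occup97.98 <- data.frame(
  site          = levels(sums$site),
  Np.occup97.98 = sums$Ant.Nptrad97.98 / Ant.trad$Ant.trad97.98)
## or, for an existing factor column:
## as.numeric(as.character(Np.occup97.98$Np.occup97.98))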
2003 Jan 10
0
Thanks: Re: count levels per factor level
Dear Lockwood,
As you can see, I'm a beginner...
But thank you very much!
Sincerely,
Tord
Quoting "J.R. Lockwood" <lockwood at rand.org>:
> how about
>
> buskartant$buskartant <- sapply( group.list, function(x)
> sum(!is.na(unique(x))) )
>
> OR
>
> buskartant$buskartant <- sapply( group.list, function(x)
> length(unique(x[!is.na(x)])) )
2005 Sep 19
4
factor as seq() in for loop
Dear all,
I would like to use the values in vegaggr.BLMCMR02$colony
str(vegaggr.BLMCMR02)
`data.frame': 1678 obs. of 3 variables:
$ vegtype : Factor w/ 27 levels "2010","2020",..: 3 4 5 19 4 5 19 5
$ colony : Factor w/ 406 levels "0","1","10","100",..: 1 1 1 1 2 2 2
$ Totvegproparea: num 0.00055 0.03956 0.95705
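The question is cut off above, but the title suggests looping over the values of a factor; a minimal sketch (data frame name as in the post):
for (col in levels(vegaggr.BLMCMR02$colony)) {
  sub <- vegaggr.BLMCMR02[vegaggr.BLMCMR02$colony == col, ]
  ## ... do something with the rows of this colony ...
}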
2003 Jan 02
1
replace NA with factor class
Dear all,
I have a tree data matrix. For some trees I lack info about tree species,
but I want to set them to be spruce. For some reason the tree species names
on the remaining (non-NA) rows are changed into numbers (that I do not
recognise).
I guess that ifelse is not the correct function to use, but I have not
found any better one in my searches.
Thanks in advance!
Sincerely,
Tord
>
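A minimal sketch of the likely cause and fix: ifelse() on a factor returns the underlying integer codes, which is why the species names turn into numbers. Add the new level first (if missing) and assign directly; the data frame trees and column species are made-up names.
trees$species <- factor(trees$species,
                        levels = union(levels(trees$species), "spruce"))
trees$species[is.na(trees$species)] <- "spruce"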
2003 Jan 02
1
aggregate: "sum" not meaningful for factors
Dear all,
I am trying to summarise my data per category using aggregate, but for some
reason I get the error message "sum" not meaningful for factors even though
my vector is numeric. The data set is shown below.
Could someone please give a hint.
Thanks in advance!
Sincerely,
Tord
> names(test)
[1] "ObjektID" "tallstubbyta"
> is.factor(test$ObjektID);
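A hedged guess at the cause (the post is cut off): if the whole data frame is handed to aggregate(), sum() is also applied to the factor column ObjektID, which triggers exactly this error; aggregating only the numeric column avoids it. Column names follow the post.
aggregate(tallstubbyta ~ ObjektID, data = test, FUN = sum)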
2001 May 27
2
library for mixed GLM?
Dear all,
I am taking a course in GLM given by a devoted SAS user. He has given us a
homework assignment in which a mixed GLM with a logistic link and binomially
distributed observations should be fitted.
I know of the library nlme for mixed effect models but as I understand it,
one cannot choose between different links and distributions in the
functions provided there.
I have so far managed very well with
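A minimal sketch with glmmPQL() from MASS, which fits a mixed-effects GLM with a binomial family and logit link on top of nlme; the formula, grouping factor and data frame are made up for illustration.
library(MASS)
fit <- glmmPQL(occupied ~ treatment,       ## 0/1 response, hypothetical names
               random = ~ 1 | site,        ## random intercept per site
               family = binomial(link = "logit"),
               data   = mydata)
summary(fit)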