Displaying 20 results from an estimated 7000 matches similar to: "graphing plots of plots"
2008 Feb 07
2
a kinder view of Type III SS
A young colleague (Matthew Keller) who is an ardent fan of R is teaching me
much about R and discussions surrounding its use. He recently showed me
some of the sometimes heated discussions about Type I and Type III sums of
squares that have taken place over the years on this listserv. I'm presumptuous
enough to believe I might add a little clarity. I write this from the
perspective of someone old
2008 Jun 05
1
nls() newbie convergence problem
I'm sure this must be a nls() newbie question, but I'm stumped.
I'm trying to do the example from Draper
and Yang (1997). They give this snippet of S-Plus code:
# Specify the weight function:
weight <- function(y, x1, x2, b0, b1, b2)
{
  pred  <- b0 + b1*x1 + b2*x2
  parms <- abs(b1*b2)^(1/3)
  (y - pred)/parms
}
# Fit the model:
gmfit <- nls(~ weight(y, x1, x2, b0, b1, b2),
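The call above is cut off; below is a minimal, self-contained sketch of the same zero-left-hand-side nls() trick. The data, start values, and the completed call are assumptions for illustration, not the original Draper and Yang (1997) example, and convergence may still require tuning the start values.

# With no left-hand side, nls() minimises the sum of squares of the
# right-hand side, here the scaled residuals returned by weight().
set.seed(1)
x1 <- runif(20); x2 <- runif(20)
y  <- 1 + 2*x1 + 3*x2 + rnorm(20, sd = 0.1)      # made-up data
dat <- data.frame(y, x1, x2)
weight <- function(y, x1, x2, b0, b1, b2) {
  pred  <- b0 + b1*x1 + b2*x2
  parms <- abs(b1*b2)^(1/3)
  (y - pred)/parms
}
gmfit <- nls(~ weight(y, x1, x2, b0, b1, b2),
             data  = dat,
             start = list(b0 = 0.5, b1 = 1.5, b2 = 2.5))
summary(gmfit)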
2013 Jul 10
3
PCA and gglot2
Hi,
I have been trying, as well as looking for an answer, without success (a bit
strange, since it should be an easy problem), and therefore I would appreciate
your help:
My simple script is:
# Load data: 5 columns and 100 rows
data1<-read.csv("C:/?/MyPCA.csv")
pairs(data1[,1:4])
pca1 <- princomp(data1[,1:4], scores=TRUE, cor=TRUE)
biplot(pca1)
The biplot presents the data
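A minimal sketch of one way to continue from here with ggplot2, plotting the first two component scores; the random stand-in data below replace MyPCA.csv, which is not shown.

library(ggplot2)
data1 <- as.data.frame(matrix(rnorm(400), ncol = 4))   # stand-in for MyPCA.csv
pca1  <- princomp(data1[, 1:4], scores = TRUE, cor = TRUE)
scores <- as.data.frame(pca1$scores)                   # columns Comp.1 ... Comp.4
ggplot(scores, aes(x = Comp.1, y = Comp.2)) +
  geom_point() +
  labs(x = "PC1", y = "PC2")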
2010 Aug 03
2
subset based on column names and then subset based on the inverse (grep?, or...)
I would like to be able to grab the x and y columns out of a data frame and
then grab all of the columns whose names are not x or y. I am sure
that I am missing something easy.
ftbr_UTM_downstream <- (structure(list(site =
c("Jennie_Creek_Main_Stem", "Wolf_Pit_Creek_Main_Stem",
"Little_Rockfish_Main_Stem_North", "Big_Muddy_Creek_Main_Stem",
2011 Mar 31
2
Linear Model with curve fitting parameter?
I have a model Q=K*A*(R^r)*(S^s)
A, R, and S are data I have and K is a curve fitting parameter. I
have linearized as
log(Q)=log(K)+log(A)+r*log(R)+s*log(S)
I have taken the log of the data that I have, and this is the model
formula without the K part:
lm(Q ~ offset(A) + R + S, data = x)
What is the formula that I should use?
Thanks for all of your help. I can provide a subset of data if necessary.
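A minimal sketch, on simulated data since the original data are not shown, of how log(K) falls out as the intercept when log(A) is supplied as an offset:

set.seed(1)
A <- runif(50, 1, 10); R <- runif(50, 1, 5); S <- runif(50, 0.01, 0.1)
Q <- 2.5 * A * R^0.6 * S^0.3 * exp(rnorm(50, sd = 0.05))   # true K = 2.5
# log(Q) = log(K) + log(A) + r*log(R) + s*log(S); the offset fixes the
# coefficient of log(A) at 1, so the intercept estimates log(K).
fit <- lm(log(Q) ~ offset(log(A)) + log(R) + log(S))
exp(coef(fit)[1])   # estimate of K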
2013 May 01
3
Chron format question h:m not working
R 2.12.2 on Scientific Linux 6.4
#works
chron(times.="15:00:00", format=c(times="h:m:s"))
#doesn't work
chron(times.="15:00", format=c(times="h:m"))
From chron Manual:
The times format can be any permutation of "h", "m", and "s" separated
by any one non-special character. The default is "h:m:s".
what am I
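One hedged workaround, building only on the call above that does work: pad the value so it carries a seconds field and parses with the default "h:m:s" format.

library(chron)
x <- chron(times. = paste0("15:00", ":00"),   # "15:00:00"
           format = c(times = "h:m:s"))
x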
2010 Jul 14
2
qplot in ggplot2 not working any longer - (what did I do?)
This is the first time that I have tried to update packages after having
tinkered around with .Rprofile. I start R with R --vanilla and it
does not load my .Rprofile, but when I issue the command
update.packages(), R downloads the packages as expected, but then seems
to load .Rprofile before compiling the package sources. What am I
doing wrong?
kindest regards,
Stephen Sefick
see- Session info
2011 Apr 16
2
(no subject)
I have just upgraded to R 2.13 and have library(ggplot2) in my
.Rprofile (among other things). When I start R I get an error
message. Has something in the startup scripts changed? Is there a
better way to specify the library calls in .Rprofile? Thanks for all
of the help in advance.
Error:
Loading required package: grid
Loading required package: proto
Error in rename(x, .base_to_ggplot) :
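One pattern from ?Startup that may apply here, sketched below: .Rprofile is read before the default packages are attached, so ggplot2 can be added to options("defaultPackages") instead of being attached with library(). Whether this removes the rename() error above is an assumption.

# In ~/.Rprofile: attach ggplot2 along with the default packages at startup.
local({
  pkgs <- getOption("defaultPackages")
  options(defaultPackages = c(pkgs, "ggplot2"))
})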
2010 Sep 08
1
problem with max in a function
s <- 1.00
max(s)
returns 1
Is there any way that I can get it to return 1.00? I am using the
results of this max statement in a grep statement and it returns the
wrong numbers; I will provide more information and code if it would
make more sense in context.
--
Stephen Sefick
____________________________________
| Auburn University                                   |
| Department of
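A short sketch of one likely fix: 1 and 1.00 are the same double, and only the printed form differs, so the number can be formatted explicitly before building the pattern for grep().

s <- 1.00
max(s)                               # prints 1; the stored value equals 1.00
pattern <- sprintf("%.2f", max(s))   # "1.00" as text, two decimal places
grepl(pattern, c("depth 1.00 m", "depth 1 m"), fixed = TRUE)   # TRUE FALSE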
2012 Jul 18
1
ggplot2 qplot pch not working anymore
Is there a way to map a continuous variable to pch in qplot? I believe
this worked in a previous version. I need to specify certain values of a
shape for particular points so that multiple graphs all show the same
shapes for the same streams. I have gone to the original data and added
a pch column that I would like to use to specify the shapes to pch in
qplot. Any help would be greatly
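A minimal sketch of how this is usually handled in ggplot2: shapes are mapped from a discrete variable and pinned to fixed symbols with scale_shape_manual(); the stream names and symbol codes below are assumptions.

library(ggplot2)
df <- data.frame(x = 1:6, y = rnorm(6),
                 stream = rep(c("Jennie_Creek", "Wolf_Pit_Creek", "Big_Muddy_Creek"), 2))
qplot(x, y, data = df, shape = stream) +
  scale_shape_manual(values = c(Jennie_Creek = 16, Wolf_Pit_Creek = 17,
                                Big_Muddy_Creek = 15))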
2012 Nov 16
2
source file on startup question - why does an old version of a function show up? ggplot or R?
All,
1. I will try to make this clear and concise. Please let me know any
information that would be helpful in figuring out this problem (I don't
know what information is relevant to post). I am on Linux -- see below for
session information.
2. Problem:
working directory: home
an old version of a function is sourced into the R session and doesn't work
working directory: Desktop
the
2010 Jul 15
1
loess line predicting number where the line crosses zero twice
These data represent stream channel cross-sectional surveys. I would
like to be able to find the measurement on the tape (measure) where
the Bank Full Depth (bkf_depths) is 0. This will happen twice because
the channel has two sides. I thought fitting a loess line to these
data and then predicting the measurement number would do it. I was
wrong. Below is my failed attempt. My naive thought is
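A hedged sketch, on a simulated cross-section since the original data are cut off, of one way to locate the two zero crossings: fit the loess line, predict on a fine grid, and look for sign changes in the predicted depths.

measure    <- seq(0, 20, by = 0.5)
bkf_depths <- 4 - (measure - 10)^2 / 20          # stand-in channel shape
fit  <- loess(bkf_depths ~ measure)
grid <- seq(min(measure), max(measure), length.out = 1000)
pred <- predict(fit, newdata = data.frame(measure = grid))
grid[which(diff(sign(pred)) != 0)]   # approximate tape positions where depth is 0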
2005 May 31
2
Problem going back to a viewport with gridBase
I am setting up base plots -- one in viewport A and one in B. This part
works fine. But if I go back to A after having done B and add
horizontal lines it seems
to not use the correct coordinates. How do I tell it to resume using A's
coordinates? I am already using par(fig = gridFIG()) but it seems that that's
not enough to reestablish them. What happens is that when I go back to
2010 Jun 29
3
formating chron date times for printing
The dates were created with chron with this argument:
format=c(dates="Y/m/d", times="H:M:S"))
so I have the dates being displayed as
(10/06/22 12:00:00)
I would like to have them displayed as
"2010-06-22 12:00:00" or "%Y-%m-%d %H:%M:%S"
and then I can convert these for merging with another data frame
x <- (structure(c(14464, 14464.0104166667,
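A hedged sketch of one route: a chron value is days since the package's default origin (1970-01-01), so the underlying numeric can be converted to POSIXct and formatted with the strftime codes quoted above; if a non-default origin was used, the origin below must change.

x <- c(14464, 14464.0104166667)      # days (with fractional time) since 1970-01-01
format(as.POSIXct(x * 86400, origin = "1970-01-01", tz = "GMT"),
       "%Y-%m-%d %H:%M:%S")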
2010 Jun 29
1
How to allocate more memories to R?
When I use this, I get the following warning when opening R from
that shortcut:
"-max-mem-size=2048MB:too large and taken as 2047M"
Why am I getting this? I have 3GB of RAM installed and am using Vista.
Thanks
2010 Jul 23
1
MISSING VALUE IN R
I have a DF
ID VALUE
100 120
101 100
102 100
103
104
105
....
When I calculate the sum of the values, it returns NA. Should I populate
the blank values with 0?
Thanks,
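A small sketch of the usual answer: the blank cells come in as NA, and sum() propagates NA unless na.rm = TRUE is set, so recoding the blanks to 0 is not required just to get the total.

df <- data.frame(ID = 100:105, VALUE = c(120, 100, 100, NA, NA, NA))
sum(df$VALUE)                 # NA, because missing values propagate
sum(df$VALUE, na.rm = TRUE)   # 320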
2010 Jul 23
1
na.rm=TRUE
POS = sum(x[-1][x[-1] > 0], na.rm = TRUE)
Is this the correct syntax?
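A quick worked check on assumed data, suggesting the syntax does what it appears to: drop the first element, keep the positive values, and ignore NA in the sum.

x <- c(10, -2, NA, 5, 3)                      # assumed example vector
POS <- sum(x[-1][x[-1] > 0], na.rm = TRUE)    # x[-1] is -2 NA 5 3; positives kept, NA dropped
POS                                           # 8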
2010 Sep 03
2
Package wavelets
Hi useRs,
Does anybody work with wavelets in R?
Please, I need some help!
Regards,
Marize Simões
2010 Dec 15
1
frestimate functionality
By any chance, is there an R package that contains functionality similar to
the frestimate function in MATLAB/Simulink?
Here is the URL for a description of the frestimate functionality:
http://www.mathworks.com/help/toolbox/slcontrol/ug/frestimate.html
Thank you again for any feedback.