Displaying 20 results from an estimated 10000 matches similar to: "subset() multiple arguments"
2008 Oct 10
2
ggplot adding points
I would like to do the equivalent of the following lattice plot in ggplot.
What am I missing?
library(lattice)  # xyplot() comes from lattice
River.Mile <- c(202, 198, 190, 185, 179, 148, 119, 61)
TSS <- c(1:8)
DOC <- seq(2, by = 0.6, length.out = 8)
z <- data.frame(River.Mile, TSS, DOC)
xyplot(TSS + DOC ~ River.Mile, data = z, auto.key = TRUE)
thanks
--
Stephen Sefick
Research Scientist
Southeastern Natural Sciences Academy
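A minimal ggplot2 sketch of the same plot, assuming the reshape2 package is available for melt() (column names follow the z data frame above):
library(ggplot2)
library(reshape2)                                 # assumption: reshape2 for melt()
zl <- melt(z, id.vars = "River.Mile")             # long form: one row per series value
ggplot(zl, aes(x = River.Mile, y = value, colour = variable)) +
  geom_point()                                    # TSS and DOC as points, keyed by colour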
2008 Oct 13
1
ggplot faceting like lattice | variable
I would like to be able to reproduce the xyplot below in ggplot. I read in
the archive that Hadley was working on this for the next release, but
I cannot find the documentation (Aug. 23rd).
River.Mile <- c(215, 202, 198, 190, 185, 179, 148, 119, 61)
Cu <- rnorm(9)
Fe <- rnorm(9)
Mg <- rnorm(9)
Ti <- rnorm(9)
Ir <- rnorm(9)
r <- data.frame(River.Mile, Cu, Fe, Mg, Ti, Ir)
z <-
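A hedged sketch of lattice-style `| variable` faceting in ggplot2, assuming reshape2 for melt() and the r data frame above:
library(ggplot2)
library(reshape2)                           # assumption: reshape2 for melt()
rl <- melt(r, id.vars = "River.Mile")       # stack Cu, Fe, Mg, Ti, Ir into one column
ggplot(rl, aes(x = River.Mile, y = value)) +
  geom_point() +
  facet_wrap(~ variable, scales = "free_y") # one panel per element, like lattice's `| variable`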
2008 Jan 31
2
Box Plot With Groups being numbers
I would like to summarize values that are repeated measures at a
certain river mile with box plots, i.e. the data matrix looks like this:
123   124   125   # river mile
0.5   0.6   0.7
0.4   0.5   0.6
...   ...   ...   # values
I would like to make a boxplot with the river miles naming the
different box plots. How do you suppress the "X" that gets prepended (X123)?
Stephen
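A small sketch of one way to keep the numeric names, assuming the values sit in a data frame with one column per river mile:
rm.values <- data.frame(`123` = c(0.5, 0.4), `124` = c(0.6, 0.5), `125` = c(0.7, 0.6),
                        check.names = FALSE)             # check.names = FALSE keeps "123" instead of "X123"
boxplot(rm.values, xlab = "River mile", ylab = "Value")  # one box per column, labelled by column name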
2008 Feb 12
4
summary statistics
Below is my data frame. I would like to compute summary statistics
for mgl for each river mile (mean, median, mode). My apologies in
advance: I would like to get something like the SAS printout from PROC
UNIVARIATE. I have performed an ANOVA and a Tukey LSD, and now I would
just like the summary statistics.
thanks
stephen
RM mgl
1 215 0.9285714
2 215 0.7352941
3 215 1.6455696
4 215
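A sketch of per-river-mile summaries, assuming the data frame above is called d (note there is no built-in statistical mode() in R):
aggregate(mgl ~ RM, data = d, FUN = summary)   # min, quartiles, mean, max for each river mile
tapply(d$mgl, d$RM, median)                    # or a single statistic at a time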
2008 Oct 23
1
Reversing xlim qplot
I would like to be able to reverse the xlim in qplot.
This is the code that I am using:
qplot(a[,"River.Mile"], a[,26]
,ylab=colnames(a)[26], xlab="RiverMile", xlim=rev(c(60,
216)))+geom_smooth()+scale_x_continuous(breaks=c(215,202,198,190,185,179,148,119,61),
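In current ggplot2 the usual way to flip an axis is scale_x_reverse(), which also takes the breaks; a sketch assuming the data frame a above:
library(ggplot2)
qplot(River.Mile, a[, 26], data = a, ylab = colnames(a)[26], xlab = "River mile") +
  geom_smooth() +
  scale_x_reverse(breaks = c(215, 202, 198, 190, 185, 179, 148, 119, 61))  # reversed axis with custom breaks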
2008 Apr 29
1
merging multiple data frames with different numbers of rows
merge() can only merge two objects at a time; I would like to merge more than
two objects at a time.
s.d <- structure(list(RiverMile = c(202L, 198L, 190L, 185L, 179L, 148L,
119L, 61L)), .Names = "RiverMile", row.names = c(NA, -8L), class =
"data.frame")
#s.d is all of the river miles that can occur in all of the data frames that
I want to put together
feb06 <-
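A sketch of merging any number of data frames in one call with Reduce(), assuming feb06, mar06, ... are the monthly data frames hinted at above:
all.months <- Reduce(function(x, y) merge(x, y, by = "RiverMile", all = TRUE),
                     list(s.d, feb06, mar06))   # merge() is applied pairwise across the list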
2008 Apr 29
1
data management (subsetting and recombining)
This is an example of two months of data from a twenty-four-month data set
that I would like to apply this to. These data are subsets of the same
stations through time, but different stations were included on different
sampling dates. I would like to subset these data and then put them
together as one big matrix with RiverMile as the by column. What is the
easiest way to proceed, as this is a
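An alternative sketch that skips repeated merge() calls: if the monthly subsets are stacked in one long data frame, base reshape() can pivot them wide with RiverMile as the id (long.data and its column names are assumptions):
wide <- reshape(long.data, idvar = "RiverMile", timevar = "Date", direction = "wide")
# river miles skipped on a given sampling date simply come through as NA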
2009 Jan 07
1
Replace Function (How to replace numbers in a data frame with a specific number)
taxa <- (structure(list(Date = structure(c(4L, 4L, 4L, 4L, 4L, 4L, 4L,
4L, 4L, 4L, 4L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L), .Label = c("2006/04",
"2006/05", "2006/07", "2006/10", "2006/12", "2007/02", "2007/04",
"2007/06", "2007/08", "2007/10", "2007/12", "2008/01"), class =
2010 May 18
2
Function that is giving me a headache - any help appreciated (automatic read)
Note: the whole function is below; I am sure I am doing something silly.
When I call it like USGS(input="precipitation") it chokes on the following
part, which runs fine outside of the function:
precip.1 <- subset(DF, precipitation!="NA")
b <- ddply(precip.1$precipitation, .(precip.1$gauge_name), cumsum)
DF.precip <- precip.1
DF.precip$precipitation <- b$.data
days=7
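For comparison, a sketch of the idiomatic plyr call: ddply() expects a data frame (not a vector) as its first argument, and missing values are better tested with is.na() than compared to the string "NA" (DF's columns are taken from the snippet above):
library(plyr)
precip.1 <- subset(DF, !is.na(precipitation))
b <- ddply(precip.1, .(gauge_name), transform,
           cum_precip = cumsum(precipitation))   # running total within each gauge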
2008 Mar 11
1
stacked graphs
I would like to reproduce a figure found in H.B.N. Hynes, "The Ecology of
Running Waters" (first edition reprint, copyright 2001), page 79. This
is a graph with plots of insect abundance through time lined up in 3D
as you proceed down river.
any help is appreciated
Stephen
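A rough sketch of one way to fake the stacked, offset look in base graphics (the abundance matrix ab is made up; rows are time points, columns are stations down river):
ab <- replicate(5, abs(rnorm(50)))                      # fake abundance traces, one column per station
matplot(ab + rep(seq(0, 8, by = 2), each = nrow(ab)),   # constant vertical offset per station
        type = "l", lty = 1, yaxt = "n",
        xlab = "Time", ylab = "Station (offset abundance)")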
2009 Oct 06
1
ggplot2 applying a function based on facet
Look at the bottom of the message for my question
#here is a little function that I wrote
USGS <- function(input="discharge", days=7){
library(chron)
library(gsubfn)
#021973269 is the Waynesboro Gauge on the Savannah River Proper (SRS)
#02102908 is the Flat Creek Gauge (ftbrfcms)
#02133500 is the Drowning Creek (ftbrbmcm)
#02341800 is the Upatoi Creek Near Columbus (ftbn)
#02342500 is
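One common pattern for applying a function per facet is to pre-compute the per-panel values and add them as their own layer; a sketch assuming DF has datetime, stage, and gauge_name columns (names are assumptions):
library(ggplot2)
library(plyr)
meds <- ddply(DF, .(gauge_name), summarise, med = median(stage, na.rm = TRUE))
ggplot(DF, aes(datetime, stage)) +
  geom_line() +
  geom_hline(data = meds, aes(yintercept = med), linetype = 2) +  # one line per facet
  facet_wrap(~ gauge_name, scales = "free_y")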
2008 May 01
4
Making a map in R?
Does anyone know of a package to make a map from GIS data, and/or would it
be easier in one of the free GIS programs? I would like to make a map of
the Savannah River area with our sampling locations.
thanks
stephen
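A minimal sketch with the maps package (the point coordinates here are invented placeholders, not real sampling sites):
library(maps)
map("state", regions = c("georgia", "south carolina"))         # the Savannah River runs along the GA/SC line
points(c(-81.8, -81.6), c(33.4, 33.2), pch = 19, col = "red")  # hypothetical sampling locations
title("Savannah River sampling locations")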
2010 Aug 03
2
subset based on column names and then subset based on the inverse (grep?, or...)
I would like to be able to grab the x and y columns out of a data frame and
then grab all of the columns that are not x or y. I am sure
that I am missing something easy.
ftbr_UTM_downstream <- (structure(list(site =
c("Jennie_Creek_Main_Stem", "Wolf_Pit_Creek_Main_Stem",
"Little_Rockfish_Main_Stem_North", "Big_Muddy_Creek_Main_Stem",
2005 Mar 10
2
NoMethodError in Event_type#create
I am new to Ruby on Rails, so excuse me if my question seems pretty obvious.
I am trying to validate the uniqueness of a field:
class EventType < ActiveRecord::Base
belongs_to :sport
validates_uniqueness_of :event_type
end
When I run it, I get this error message:
Showing /event_type/new.rhtml where line #27 raised: undefined method
`each' for nil:NilClass
<select
2012 Apr 05
1
"too large for hashing"
Hello,
I'm doing some analysis on a rather large data set. In this case,
some simple commands are failing. For example, this one:
> x$eventtype <- factor(x$eventtype)
Error in unique.default(x) : length 1093574297 is too large for hashing
...I think this is a bug, because "hashing" should not be required for the
"factor" function. Am I right? The whole column
2008 May 08
0
RSEIS could you help
I have dissolved oxygen traces that are continuous (fifteen-minute intervals) for
two years (save for a couple of days, weeks, or minutes here and there, depending
on the prerogative of the river). These traces are spaced out by river mile. I
have figured out how to prepare the data following the sunspot example, but I
cannot figure out how to get multiple traces into the prepSEIS function, and
this is the warning that I
2009 Dec 18
1
linear contrasts for trends in an anova
Hi everybody,
I'm trying to construct contrasts for an ANOVA to determine if there is a significant trend in the means of my groups.
In the following example, based on the type of 2x3 ANOVA I'm trying to perform, does the linear polynomial contrast generated by contr.poly allow me to test for a linear trend across groups?
doi=data.frame(
Group=c(
rep(1, 5), rep(2, 5), rep(3, 5),
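A sketch of how the polynomial contrasts are usually checked, assuming a response column y in doi (the excerpt is truncated before the response):
doi$Group <- factor(doi$Group)
contrasts(doi$Group) <- contr.poly(3)   # adds .L (linear) and .Q (quadratic) contrasts
fit <- aov(y ~ Group, data = doi)       # y is an assumed response column
summary.lm(fit)                         # the Group.L row tests the linear trend across groups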
2008 Aug 26
1
processing subset lists and then plot(density())
d <- structure(list(Site = structure(c(8L, 12L, 7L, 6L, 11L, 5L, 10L,
4L, 3L, 2L, 1L, 9L, 8L, 12L, 7L, 6L, 11L, 5L, 10L, 4L, 3L, 2L,
1L, 9L, 8L, 12L, 7L, 6L, 11L, 5L, 10L, 4L, 3L, 2L, 1L, 9L, 8L,
12L, 7L, 6L, 11L, 5L, 10L, 4L, 3L, 2L, 1L, 9L, 8L, 12L, 7L, 6L,
11L, 5L, 10L, 4L, 3L, 2L, 1L, 9L, 8L, 12L, 7L, 6L, 11L, 5L, 10L,
4L, 3L, 2L, 1L, 9L, 8L, 12L, 7L, 6L, 11L, 5L, 10L, 4L, 3L, 2L,
1L, 9L,
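A sketch of one way to get a density per site, assuming a numeric column named value in d (the excerpt is truncated):
dens.by.site <- lapply(split(d$value, d$Site), density, na.rm = TRUE)
plot(dens.by.site[[1]], main = "Density by site")
invisible(lapply(dens.by.site[-1], lines))   # overlay the remaining sites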
2012 Jul 24
2
Convert Package Interest?
I am thinking about submitting a package to CRAN that contains some
units conversion functions that I use on a regular basis. Would this be
helpful to the community, or would it be better to keep this as a
personal package? I don't want to clutter CRAN.
many thanks,
--
Stephen Sefick
**************************************************
Auburn University
Biological Sciences
331 Funchess
2009 Aug 21
5
splitting a string up
x <- "1041281__2009_08_20_.lev"
I would like to split this string up and extract only the leading numbers,
1041281,
to use as a label for a data column in a bigger for-loop function that
reads in data.
regards,
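Two small ways to pull the leading digits out of x:
sub("^(\\d+)__.*$", "\\1", x)      # keep only the leading run of digits -> "1041281"
strsplit(x, "__")[[1]][1]          # or split on the double underscore and take the first piece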
--
Stephen Sefick