2006 Feb 13
1
Turning control back over to the terminal
I'm invoking R from within a shell script like this
R --no-save --no-restore --gui=none > `hostname` 2>&1 <<BYE
# various commands here
BYE
I would like to regain control from the invoking terminal at some point.
I tried source(stdin()) but got a syntax error, presumably because stdin
here is the shell here-document snippet (the part between <<BYE and BYE).
Is there some way to
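
One workaround (just a sketch, not necessarily what the list suggested): have the batch commands save their state as the last step of the here-document, and then start an ordinary interactive R session by hand that loads it. The file name below is made up.

save.image("batch-state.RData")   # hypothetical name; last command inside the here-document
# then, in a fresh interactive R session started at the terminal:
load("batch-state.RData")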
2006 Nov 15
1
tail recursion in R
Scheme is apparently clever enough to turn certain recursive function
calls into non-recursive evaluations.
Does R do anything like that? I could find no reference to it in the
language manual.
What I'm wondering is whether there are desirable ways to express
recursion in R.
Thanks.
--
Ross Boylan wk: (415) 514-8146
185 Berry St #5700
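
For what it's worth, R does not do tail-call elimination, so deep tail recursion still grows the call stack; the usual advice is to rewrite it as a loop. A minimal sketch:

# Tail-recursive form: still consumes one stack frame per call in R.
fact_rec <- function(n, acc = 1) {
  if (n <= 1) return(acc)
  fact_rec(n - 1, acc * n)
}

# Equivalent iterative form: constant stack depth.
fact_iter <- function(n) {
  acc <- 1
  while (n > 1) {
    acc <- acc * n
    n <- n - 1
  }
  acc
}
fact_iter(10)  # 3628800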
2006 May 18
3
S4 classes and C
Is there any good source of information on how S4 classes (and methods)
work from C?
E.g., for reading
how to read a slot value
how to invoke a method
how to test if you have an S4 object
For writing, how to make a new instance of an S4 object.
I've found scattered hints in the archive, including a link to a talk on
this subject "I am using C code to create an S4 object based on
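
For reference, a sketch pairing the R-level operations with (in comments) what I believe are the C-level counterparts declared in Rinternals.h; the class name is made up and invoking a method from C is not covered.

setClass("demo", representation(a = "numeric"))  # made-up class for illustration
x <- new("demo", a = 1)   # C: R_do_new_object(R_do_MAKE_CLASS("demo"))
slot(x, "a")              # C: R_do_slot(x, install("a"))
slot(x, "a") <- 5         # C: R_do_slot_assign(x, install("a"), value)
is(x, "demo")             # C: IS_S4_OBJECT(x) only tests that x is *an* S4 object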
2007 Jun 05
1
ggplot aspect ratio
Is there a way to control the aspect ratio of plots using ggplot?
Specifically, I'm using the formula=a~b argument to produce a grid of
plots, but the overall width of the result seems to vary for reasons
that are obscure to me.
This affects not only the appearance of the plots but the amount of
space available for the title (which seems to be right justified
relative to the right edge of the
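
The ggplot API has changed a great deal since this was posted; with current ggplot2 the panel shape is pinned with theme(aspect.ratio = ...) and the grid of plots comes from facet_grid(). A sketch using the built-in mpg data:

library(ggplot2)
ggplot(mpg, aes(displ, hwy)) +
  geom_point() +
  facet_grid(drv ~ cyl) +     # replaces the old formula= faceting argument
  theme(aspect.ratio = 1)     # square panels regardless of device size
# coord_fixed(ratio = 1) instead fixes the aspect ratio in data units.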
2006 Dec 07
2
making a grid of points
I'd like to evaluate a function at each point on a 2 or 3-D grid. Is
there some function that already does this, or generates the grid of
points?
My search has led me to the grid and lattice packages, and I found a
reference to the sp package (e.g., SpatialGrid) for this. There are
things in there that might be relevant, but at first blush many of them
are embedded in other concepts (grobs,
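
Base R covers this without grid or lattice: expand.grid() generates the points and outer() evaluates a two-argument function on the full grid. A sketch with a made-up function:

g <- expand.grid(x = seq(0, 1, by = 0.25),
                 y = seq(0, 1, by = 0.25))
g$z <- with(g, sin(x) * cos(y))     # evaluate f at every grid point

z <- outer(seq(0, 1, by = 0.25), seq(0, 1, by = 0.25),
           function(x, y) sin(x) * cos(y))   # same values, as a matrix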
2007 Feb 16
1
pinning down symbol values (Scoping/Promises) question
I would like to define a function using symbols, but freeze the symbols
at their current values at the time of definition. Both symbols
referring to the global scope and symbols referring to arguments are at
issue. Consider this (R 2.4.0):
> k1 <- 5
> k
[1] 100
> a <- function(z) function() z+k
> a1 <- a(k1)
> k1 <- 2
> k <- 3
> a1()
[1] 5
> k <- 10
>
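
A sketch of one way to pin the values at definition time: force() evaluates the argument promise immediately, and copying the global into a local variable freezes it inside the closure.

k1 <- 5
k <- 100
a <- function(z) {
  force(z)            # pin z to the value passed in, right now
  k_frozen <- k       # snapshot the current global k
  function() z + k_frozen
}
a1 <- a(k1)
k1 <- 2; k <- 3
a1()   # still 105: the values in effect when a(k1) was called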
2007 Sep 12
1
"could not find function" in R CMD check
During R CMD check I get this:
** building package indices ...
Error in eval(expr, envir, enclos) : could not find function
"readingError"
Execution halted
ERROR: installing package indices failed
The check aborts there. readingError is a function I just added; for
reference
setClass("readingError", contains="matrix")
readingError <- function(...)
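
(For context only, a hypothetical completion of that constructor; the real body is cut off in the preview above.)

setClass("readingError", contains = "matrix")
readingError <- function(...) new("readingError", matrix(...))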
2007 Nov 15
1
Why is model.matrix creating 2 columns for boolean?
I have a data frame "reading" that includes a logical variable "OLT"
along with the response variable "Reading" and the predictor "True"
(both are numeric variables; "True" here means the true value, not a logical).
When I suppress the intercept, model.matrix gives me OLTTRUE and
OLTFALSE columns. Why? Can I do anything to prevent it?
> r <-
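
A small reproduction with made-up data (variable names follow the post): with the intercept suppressed, the logical OLT is treated as a factor and, as the first factor in the formula, gets full dummy coding, hence both columns.

reading <- data.frame(Reading = rnorm(6),
                      True    = rnorm(6),
                      OLT     = rep(c(TRUE, FALSE), 3))
model.matrix(Reading ~ True + OLT - 1, data = reading)
# columns: True, OLTFALSE, OLTTRUE
model.matrix(Reading ~ True + as.numeric(OLT) - 1, data = reading)
# coding the logical as 0/1 yourself gives a single column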
2006 May 18
1
Recommended style with calculator and persistent data
I have some calculations that require persistent state. For example,
they retain most of the data across calls with different parameters.
They retain parameters across calls with different subsets of the cases
(this is for distributed computation). They retain early analysis of
the problem to speed later computations.
I've created an S4 object, and the stylized code looks like this
calc
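
One common R idiom for this kind of persistent state is a closure over a local environment; a sketch only, with expensive_setup() and run_step() as trivial stand-ins for the real computations:

expensive_setup <- function(data) sum(data)            # stand-in for the costly setup
run_step <- function(setup, params) setup * params     # stand-in for one calculation

make_calculator <- function(data) {
  setup <- NULL                          # retained across calls
  function(params) {
    if (is.null(setup))
      setup <<- expensive_setup(data)    # done once, reused by later calls
    run_step(setup, params)
  }
}
calc <- make_calculator(1:10)
calc(2)   # computes the setup, returns 110
calc(3)   # reuses the cached setup, returns 165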
2006 Jan 02
2
checkpointing
I would like to checkpoint some of my calculations in R, specifically
those using optim. As far as I can tell, R doesn't have this facility,
and there seems to have been little discussion of it.
Checkpointing means saving enough of the current state that work can
resume where it left off if, to take my own example, the system
crashes after 8 days of calculation.
My thought is that
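
A sketch of a poor man's checkpoint for optim(): wrap the objective function so it periodically writes the latest parameter vector to disk, then restart from that file after a crash. The objective name is a stand-in.

make_objective <- function(f, file = "optim-checkpoint.rds") {
  calls <- 0
  function(par) {
    calls <<- calls + 1
    if (calls %% 100 == 0) saveRDS(par, file)   # checkpoint the latest parameters
    f(par)
  }
}
# fit <- optim(start, make_objective(negloglik))    # negloglik is a stand-in
# after a crash: start <- readRDS("optim-checkpoint.rds") and rerun optim()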
2006 Feb 07
0
S4 documentation
1. promptClass generated a file that included
\section{Methods}{
No methods defined with class "mspathDistributedCalculator" in the
signature.
}
Yet there are such methods. Is this a not-yet-working feature, or is
something funny going on (maybe I have definitions in the library and in
the global workspace...)?
2. Is the \code{\link{myS4class-class}} the proper way to
cross-reference a
2007 Mar 29
0
S4 generic surprise
I discovered the following behavior when source'ing the same file
repeatedly as I edited it. My generic stopped acting like a generic. I
can't tell from the docs what, if any, behavior is expected in this
case. R 2.4.0
> foo <- function(object) 3
> isGeneric("foo")
[1] FALSE
> setMethod("foo", "matrix", function(object) 4)
Creating a new
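
A guard along these lines keeps re-sourcing the file from clobbering the generic with the plain function (a sketch, not necessarily the explanation given in the thread):

if (!isGeneric("foo")) {
  foo <- function(object) 3   # default behaviour
  setGeneric("foo")           # promote it to a generic explicitly, once
}
setMethod("foo", "matrix", function(object) 4)
isGeneric("foo")   # stays TRUE however many times the file is sourced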
2006 Jan 31
2
an unpleasant interaction of environments and generic functions
I've run into an unpleasant oddity involving the interaction of
environments and generic functions. I want to check my diagnosis, and
see if there is a good way to avoid the problem.
Problem:
A library defines
"foo" <- function(object) 1
setMethod("foo", c("matrix"), function(object) 30)
After loading the library
foo(0) is 1
foo(matrix()) is 30
foo is a
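
When dispatch stops behaving as expected, a few introspection calls help pin down which foo is visible and where it came from (a diagnostic sketch using the names from the post):

find("foo")                  # which attached environments define a foo?
isGeneric("foo")             # is the visible foo a generic?
showMethods("foo")           # which methods are registered, and where?
getMethod("foo", "matrix")
environment(foo)             # where the visible definition lives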
2007 Jan 16
1
Problems with checking documentation vs data, and a proposal
I have a single data file inputs.RData that contains 3 objects. I
generated an Rd page for each object using prompt().
When I run R CMD check I get
* checking for code/documentation mismatches ... WARNING
Warning in utils::data(list = al, envir = data_env) :
data set 'gold' not found
(gold is one of the objects).
This appears to be coming from the codocData function defined in
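
One way to sidestep the warning, assuming the goal is for data(gold) and friends to work, is to store each documented object in its own file under data/; the object names other than gold are placeholders.

load("inputs.RData")                 # contains gold plus two other objects
save(gold, file = "data/gold.rda")
# save(obj2, file = "data/obj2.rda")   # repeat for the other two objects
# save(obj3, file = "data/obj3.rda")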
2005 Dec 09
3
external pointers
I have some C data I want to pass back to R opaquely, and then back to
C. I understand external pointers are the way to do so.
I'm trying to find how they interact with garbage collection and object
lifetime, and what I need to do so that the memory lives until the
calling R process ends.
Could anyone give me some pointers? I haven't found much documentation.
An earlier message
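
An R-level sketch of the lifetime question: the pointer stays alive as long as some R object refers to it, and reg.finalizer() attaches cleanup to its collection. (The C-level pieces are R_MakeExternalPtr / R_ExternalPtrAddr, R_RegisterCFinalizerEx for finalizers, and R_PreserveObject to pin an object for the whole session.) The C entry point below is hypothetical.

handle <- .Call("my_make_handle")   # hypothetical; returns an external pointer
reg.finalizer(handle, function(p) message("handle finalized"), onexit = TRUE)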
2005 Nov 22
1
Customizing the package build process
I've made a package for which R CMD build isn't producing very
satisfactory results. I'll get to the details in a moment.
I wonder if it would make sense to have my own makefiles (which already
exist and are doing quite a lot) produce the .tar.gz file ordinarily
produced by R CMD build. As far as I can tell, R CMD build basically
tars up the project directory after running some
2006 Oct 02
1
documentation duplication and proposed automatic tools
I've been looking at documenting S4 classes and methods, though I have a
feeling many of these issues apply to S3 as well.
My impression is that the documentation system requires or recommends
creating basically the same information in several places. I'd like to
explain that, see if I'm correct, and suggest that a more automated
framework might make life easier.
PROBLEM
Consider a
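
For reference, the skeleton generators under discussion; each writes a separate Rd file that then has to be kept in sync by hand ("track" is a made-up class name):

setClass("track", representation(x = "numeric"))   # made-up class
promptClass("track")       # skeleton Rd for the class itself
promptMethods("show")      # skeleton covering the methods of one generic
prompt(summary)            # skeleton for an ordinary function or object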
2006 Sep 26
3
S4 accessors
I have a small S4 class for which I've written a page grouping many of
the accessors and replacement functions together. I would be interested
in people's comments on the approach I've taken.
The code has a couple of decisions for which I could imagine
alternatives. First, even simple get/set operations on class elements
are wrapped in functions. I suppose I could just use myinstance at
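
A sketch of the get/set pattern being described, for a made-up class "reading" with a single slot "olt":

setClass("reading", representation(olt = "numeric"))

setGeneric("olt", function(object) standardGeneric("olt"))
setMethod("olt", "reading", function(object) object@olt)

setGeneric("olt<-", function(object, value) standardGeneric("olt<-"))
setReplaceMethod("olt", "reading", function(object, value) {
  object@olt <- value
  object
})

r <- new("reading", olt = c(1, 2, 3))
olt(r)                # accessor
olt(r) <- c(4, 5, 6)  # replacement function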
2005 Nov 23
2
Makefiles and other customization
Writing R Extensions mentions that a package developer can provide a
Makefile, but gives very little information about what should be in it.
It says there must be a clean target, and later on there's mention of
$(SHLIB): $(OBJECTS)
	$(SHLIB_LINK) -o $@ $(OBJECTS) $(ALL_LIBS)
(in the F95 discussion).
What should a Makefile provide, and what can it assume? In other words,
2007 Jan 02
4
Am I missing something about debugging?
I would like to be able to trace execution into calls below the current
function, or to follow execution as calls return. This is roughly the
distinction between "step" and "next" in many debuggers.
I would also like to be able to switch to a location further up the call
stack than the location at which I enter the debugger, to see the
context of the current operations.
Are
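
For reference, the standard tools (f and g are stand-ins for your own functions):

f <- function(x) g(x) + 1    # stand-ins for your own functions
g <- function(x) x^2
debug(f)                     # browse f() statement by statement on its next call
trace(g, tracer = browser)   # drop into the browser whenever g() is called
options(error = recover)     # after an error, choose any frame on the call stack
f(2)                         # steps through f(), then browses inside g()
# inside the browser: 'n' runs the next statement, 'c' continues, 'Q' quits;
# sys.calls() and sys.frames() show the calls and frames above the current one.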