I often want to develop functions whilst manipulating data, but I don't want to end up with a .RData full of functions and data. It might be that I have functions that are re-usable but not worth sticking in a package.

So I've tried to come up with a paradigm for function development that more closely follows the way Matlab and Python do it (partly inspired by a confused Matlab convert over on R-help).

My requirements were these:

* .R files as the master source for R functions
* Don't see the functions in ls()
* After editing the .R files, make it easy to update the definitions visible to R (unlike rebuilding and reloading a package).

So I wrote these two in a few minutes:

loadDir <- function(dir){
  e <- attach(NULL, name = dir)        # new, empty environment on the search path
  assign("__path__", dir, envir = e)   # remember where the sources live
  reloadDir(e)
  e
}

reloadDir <- function(e){
  path <- get("__path__", e)
  # "\\.R$", not ".R$": the dot must be escaped to match a literal "."
  files <- list.files(path, "\\.R$", full.names = TRUE,
                      recursive = TRUE, ignore.case = TRUE)
  for(f in files){
    sys.source(f, envir = e)
  }
}

Usage is something like:

lib1 <- loadDir("/foo/bar/baz/lib1/")

It creates a new environment on the search path and sources any .R file it finds in that directory into the environment. If you edit anything in that directory, just do reloadDir(lib1) and the updated definitions are loaded. It's like Python's "import foo" and "reload(foo)".

Sourcing everything on any change seems a bit wasteful, but until R objects have timestamps I can't think of a better way. Hmm, maybe my environment could keep a __timestamp__ object... Okay, this is getting less simple now...

So anyway, have I done anything wrong or stupid here, or is it a useful paradigm that seems so obvious someone else has probably done it (better)?

Barry
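A minimal sketch of the __timestamp__ idea floated above (my guess at how it might look, not code from the post; the name reloadDirStamped is hypothetical): stash the time of the last load in the environment and re-source only files whose modification time is newer.

reloadDirStamped <- function(e){
  path <- get("__path__", e)
  # First load: no stamp yet, so treat every file as changed
  last <- if (exists("__timestamp__", envir = e, inherits = FALSE))
    get("__timestamp__", envir = e)
  else
    -Inf
  files <- list.files(path, "\\.R$", full.names = TRUE,
                      recursive = TRUE, ignore.case = TRUE)
  changed <- files[file.info(files)$mtime > last]   # mtime is POSIXct
  for (f in changed) {
    sys.source(f, envir = e)
  }
  assign("__timestamp__", Sys.time(), envir = e)
}

loadDir() would call this in place of reloadDir(). Note that a file deleted from the directory leaves its stale definitions behind, the same as in the original.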
That's nifty. Perhaps it could look into /foo/bar/baz/lib1/*/R, in which case one could simply place source packages in /foo/bar/baz/lib1.

In fact, it would be nice if R had some built-in way of running the code in source packages, possibly with degraded functionality, to ease development; i.e. if one added /foo/bar/baz/lib1 to .libPaths() and xx were a source package in /foo/bar/baz/lib1, then one could call library(xx) and use xx's functions directly, possibly with degraded functionality, e.g. no help files.

On Fri, Aug 21, 2009 at 8:03 AM, Barry Rowlingson <b.rowlingson at lancaster.ac.uk> wrote:
> [original post quoted in full; snipped]
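A minimal sketch of Gabor's layout idea, reusing loadDir() from the first post (the name loadSrcPkgs and the assumption that each subdirectory of lib is a source package with an R/ directory are mine, not from the thread):

loadSrcPkgs <- function(lib){
  entries <- list.files(lib, full.names = TRUE)
  pkgs <- entries[file.info(entries)$isdir]   # keep only subdirectories
  envs <- list()
  for (p in pkgs) {
    rdir <- file.path(p, "R")                 # source-package layout: pkg/R/*.R
    if (file.exists(rdir))
      envs[[basename(p)]] <- loadDir(rdir)
  }
  envs
}

Each package gets its own attached environment, so reloadDir() still works per package; help files, NAMESPACE handling and compiled code are all ignored, which is the "degraded functionality" trade-off.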
On Fri, Aug 21, 2009 at 6:03 AM, Barry Rowlingson <b.rowlingson at lancaster.ac.uk> wrote:
> loadDir <- function(dir){
>   e <- attach(NULL, name = dir)
>   assign("__path__", dir, envir = e)
>   reloadDir(e)
>   e
> }
>
> reloadDir <- function(e){
>   path <- get("__path__", e)
>   files <- list.files(path, "\\.R$", full.names = TRUE,
>                       recursive = TRUE, ignore.case = TRUE)
>   for(f in files){
>     sys.source(f, envir = e)
>   }
> }

Rather than using __path__, why not just use chdir = TRUE in sys.source() and rely on the usual R working directory semantics?

> Sourcing everything on any change seems a bit wasteful, but until R
> objects have timestamps I can't think of a better way.

That's what I do for all of my packages during development. You really need a huge amount of code before it starts to take a noticeable amount of time.

Hadley

--
http://had.co.nz/
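A minimal sketch of Hadley's suggestion as I read it (my interpretation, not his code): drop the "__path__" marker, recover the directory from the name that attach() already records on the environment, and pass chdir = TRUE so that relative paths inside the sourced files resolve against each file's own directory.

loadDir <- function(dir){
  e <- attach(NULL, name = dir)   # attach() stores dir as the "name" attribute
  reloadDir(e)
  e
}

reloadDir <- function(e){
  path <- attr(e, "name")         # recover the directory from the environment itself
  files <- list.files(path, "\\.R$", full.names = TRUE,
                      recursive = TRUE, ignore.case = TRUE)
  for (f in files) {
    sys.source(f, envir = e, chdir = TRUE)   # cd to the file's directory while sourcing
  }
}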