Displaying 20 results from an estimated 100 matches similar to: "Plans for tighter integration of reference classes in R."
2007 Oct 06
1
override variables
Hi,
I am trying to allow certain nodes to override variables in inherited
classes. Here's an example:
node 'my.server.com' inherits webserver {}
node 'other.server.com' inherits webserver { $var = "newvalue" }
node webserver {
  $var = "value"
  include snmp, dns, classx
}
In this example classx ends up using
2001 Oct 02
2
Avoiding deep copies
Is it correct that there is no way of avoiding deep copying
of data structures? Or asked from a different perspective,
is it true that there are no pointers? :)
(not that I am a fan of pointers; they just let me
decide when to do a deep copy on my own when the
memory manager doesn't do it for me :) )
I was considering writing code in R that would need
the internal representation of complex graph
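A minimal sketch of the usual answer: environments are the one R structure with reference semantics, so a graph can be built from them without deep copies (the field names value and neighbors below are invented for illustration):
node <- new.env()
node$value <- 1
node$neighbors <- list()
alias <- node     # no deep copy: both names point at the same environment
alias$value <- 99
node$value        # 99 -- the change is visible through either reference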
2015 Jun 23
3
Plans to improve reference classes?
Couple of requests:
1) Is there any example or writeup on the difficulties of extending
reference classes across packages? Just so I can fully understand the
issues.
2) In what sorts of situations does the performance of reference
classes cause problems? Sure, it's an order of magnitude slower than
constructing a simple environment, but those timings are in
microseconds, so one would need a
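A rough way to see the per-instance overhead in question (a sketch only; the Counter class is invented here and timings are machine-dependent):
library(methods)
Counter <- setRefClass("Counter", fields = list(n = "numeric"))
# Compare constructing 10,000 reference class instances ...
system.time(for (i in 1:10000) Counter$new(n = 0))
# ... against 10,000 bare environments
system.time(for (i in 1:10000) { e <- new.env(); e$n <- 0 })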
2013 May 01
1
Size of a refClass instance
I'm using refClass for a complex multi-directional tree structure with possibly hundreds of thousands of nodes. The refClass design is very impressive and I'd love to use it, but I've found that refClass instances are very large and slow to create. For example, below are a RefClass and a normal S4 class. The RefClass requires about 4KB per instance vs 500B for the S4 class --
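A sketch of the kind of comparison described (minimal invented classes; exact sizes vary by R version, and note that object.size() does not follow the environment inside a refClass instance, so it can understate the true footprint):
library(methods)
NodeS4  <- setClass("NodeS4", representation(value = "numeric"))
NodeRef <- setRefClass("NodeRef", fields = list(value = "numeric"))
object.size(NodeS4(value = 1))      # plain S4: a few hundred bytes
object.size(NodeRef$new(value = 1)) # refClass wrapper; the fields live in
                                    # an environment that is not counted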
2005 Feb 24
1
Do environments make copies?
I am using environments to avoid making copies (by keeping references).
But it seems like there is a hidden copy going on somewhere - for
example in the code fragment below, I am creating a reference to "y"
(of size 500MB) and storing the reference in object "data". But when I
save "data" and then restore it in another R session, gc() claims it is
using twice the
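A sketch of what is most likely happening (object names follow the post, sizes shrunk for illustration): save() does not store a pointer, it serializes the environment's contents, so the restored session materializes its own copy of y.
y <- rnorm(1e6)            # stand-in for the 500MB object
e <- new.env()
e$y <- y                   # "reference" to y held in an environment
data <- list(ref = e)
save(data, file = "data.RData")
# load("data.RData") in another session recreates the environment and
# its contents from the file -- a full copy, independent of any other
# copy of y that session also holds.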
2015 Jun 23
0
Plans to improve reference classes?
> 1) Is there any example or writeup on the difficulties of extending
> reference classes across packages? Just so I can fully understand the
> issues.
Here's a simple example:
library(scales)
library(methods)
MyRange <- setRefClass("MyRange", contains = "DiscreteRange")
a_range <- MyRange()
a_range$train(1:10)
# Error in a_range$train(1:10) : could not
2013 Oct 16
1
Internally accessing ref class methods with .self$x is different from .self[['x']]
When a reference class method is accessed with .self$x, it behaves
differently from .self[['x']]. The former copies the function
to the object's environment (with some attributes attached), while the
latter just returns NULL (unless the method has already been accessed
once with .self$x). Is this how it's supposed to work?
Here's an example that illustrates this:
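(The example itself is cut off in the archive; the sketch below, with invented class and method names, reproduces the behavior described.)
library(methods)
Demo <- setRefClass("Demo",
  methods = list(
    helper = function() 42,
    probe  = function() {
      print(is.null(.self[['helper']]))  # TRUE: not yet cached via $
      .self$helper                       # $ installs the method in the
                                         # object's environment
      print(is.null(.self[['helper']]))  # FALSE afterwards
    }
  )
)
d <- Demo$new()
d$probe()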
2018 Sep 28
0
About current Icecast development -- developer edition
Good afternoon,
earlier today I wrote a mail to the user mailing list about the current
status of development. As pointed out by Mr. Rücker, it seems
to be good to also write to this list with a bit more technical
detail.
Please note that this mail describes the current master branch ("2.5.x").
We recommend using the current stable branch ("2.4.x") for production.
2015 Sep 29
1
making object.size() more meaningful on environments?
Hi Gabe,
On 09/29/2015 02:51 PM, Gabriel Becker wrote:
> Herve,
>
> The problem then would be that for A a refClass whose fields take up N
> bytes (in the sense that you mean), if we do
>
> B <- A
>
> A and B would look like they BOTH take up N bytes, for a total of 2N,
> whereas AFAIK R would only be using ~ N + 2*56 bytes, right?
Yes, but that's still a *much*
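A sketch of the sharing being described (class and field names invented):
library(methods)
Big <- setRefClass("Big", fields = list(x = "numeric"))
A <- Big$new(x = runif(1e6))
B <- A        # copies only the small S4 wrapper (~56 bytes of overhead);
              # the field environment holding x is shared, not duplicated
B$x[1] <- 0
A$x[1]        # 0 -- both names see the same underlying data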
2004 Sep 10
0
[Flac-users] settings for tighter compression than -8?
On Sun, Apr 06, 2003 at 04:40:42PM -0500, David W. Tamkin wrote:
> If processing time is not a big factor -- say, I could put up with four
> to six times the duration of compressing at -8 -- what command-line
> settings could one use to get even more compression?
>
> I have a case where the FLACs encoded at -8 are about 653.3 MB, but the
> set comes with artwork whose jpegs
2004 Sep 10
2
[Flac-users] Re: settings for tighter compression than -8?
Miroslav Lichvar wrote:
> Ok, you need 0.04% improvement, that should not be a problem.
Perhaps a little more than that, since the sizes I listed were after
stripping out the padding and all metadata blocks except SEEKTABLE and
STREAMINFO.
> Try flac --lax -e -p -l 32 -r 10 --no-padding
> and if it is not enough, increase -r up to 16.
Thank you. I'll do that. What, though,
2004 Sep 10
0
[Flac-users] Re: settings for tighter compression than -8?
On Sat, Apr 12, 2003 at 05:49:06PM -0500, David W. Tamkin wrote:
> Early this past week, Miroslav Lichvar suggested for me:
>
> >Ok, you need 0.04% improvement, that should not be a problem. Try
> >flac --lax -e -p -l 32 -r 10 --no-padding
>
> Thank you again, Miroslav. I tried that, and it took almost two full
> days (surprisingly, Windows ME stayed up that long
2012 Apr 18
0
Tighter Puppet Dashboard and MCollective integration
I was just thinking about the problem of host groups while trying to set
up our Puppet infrastructure properly, and came to the realization that
making better use of MCollective in Puppet Dashboard would allow more
cloud-like scaling of infrastructure services.
Here is the concept:
Right now in Puppet Dashboard, groups can have nodes, facts (parameters),
and classes.
What I am suggesting would
2004 Sep 10
3
[Flac-users] settings for tighter compression than -8?
If processing time is not a big factor -- say, I could put up with four
to six times the duration of compressing at -8 -- what command-line
settings could one use to get even more compression?
I have a case where the FLACs encoded at -8 are about 653.3 MB, but the
set comes with artwork whose jpegs are 50.5 MB (I tried zipping the
jpegs, realizing it would do very little, and the zip file
2004 Sep 10
5
[Flac-users] Re: settings for tighter compression than -8?
Early this past week, Miroslav Lichvar suggested for me:
> Ok, you need 0.04% improvement, that should not be a problem. Try
> flac --lax -e -p -l 32 -r 10 --no-padding
Thank you again, Miroslav. I tried that, and it took almost two full
days (surprisingly, Windows ME stayed up that long without crashing) to
re-encode the entire set on my 266-MHz machine. After all, in the help
file
2019 Aug 27
3
[nbdkit PATCH 0/2] RFC: tighter filter versions
This is not intended for v1.14. In fact, we may decide that the
second patch is too gross, although the first one still seems like a
useful improvement in isolation.
I will also point out that all our filters are in-tree, and set the
user-controlled field .version to the current release string. We
could replace the second patch with a simpler one that just checks
._api_version as an int (as
2011 Mar 10
1
Testing for a reference class object
Hi all,
I've constructed the following function to test whether or not an
object was created from a reference class:
isRefClassObject <- function(x) {
  isS4(x) &&
    is.environment(attr(x, '.xData')) &&
    exists('.refClassDef', attr(x, '.xData'))
}
but I'm unsure if it's a complete test or if there's a better way to
test. Regardless, it would be
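If memory serves, every reference class instance extends the virtual class "envRefClass", so a shorter test may suffice (worth verifying against your version of methods):
library(methods)
isRefClassObject2 <- function(x) is(x, "envRefClass")
Acc <- setRefClass("Acc", fields = list(n = "numeric"))
isRefClassObject2(Acc$new(n = 1))  # TRUE
isRefClassObject2(new.env())       # FALSE
isRefClassObject2(42)              # FALSE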
2015 Jul 03
1
Are downstream dependencies rebuilt when a package is updated on CRAN?
I was wondering: are the downstream dependencies of a package rebuilt
when a package is updated on CRAN? (I'm referring to the binary
packages, of course.)
The reason I ask is that there are cases where this can cause
problems. Suppose that when pkgB is built, it calls
pkgA::makeClosure(), which returns a closure that refers to a function
in pkgA. Suppose this code is in pkgA 1.0:
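(The code itself is truncated in the archive; the sketch below, with invented names, illustrates the failure mode being described.)
# Hypothetical pkgA 1.0 (names invented for illustration):
helper <- function(x) x + 1         # internal function in pkgA
makeClosure <- function() {
  # the returned closure resolves helper() in pkgA's namespace at call time
  function(x) helper(x)
}
# When pkgB is built, the closure returned by pkgA::makeClosure() is
# serialized into pkgB's binary. If pkgA 2.0 later removes or renames
# helper(), the stale closure inside the pkgB binary breaks until pkgB
# is rebuilt against the new pkgA.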
2015 Sep 29
3
making object.size() more meaningful on environments?
Hi,
Currently object.size() is not very useful on environments as it always
returns 56 bytes, no matter how big the environment is:
env1 <- new.env()
object.size(env1) # 56 bytes
env2 <- new.env(hash=TRUE, size=75000000L)
object.size(env2) # 56 bytes
env3 <- list2env(list(a=runif(25000000), L=LETTERS))
object.size(env3) # 56 bytes
This makes it pretty useless on
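One possible stopgap until object.size() learns about environments (a sketch; it ignores sharing between objects and the environment's own overhead, so it is only an approximation):
envSize <- function(e) {
  sum(vapply(ls(e, all.names = TRUE),
             function(nm) as.numeric(object.size(get(nm, envir = e))),
             numeric(1)))
}
envSize(env3)   # ~200 MB for 'a' (25e6 doubles) plus a little for 'L'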