Similar to: Request: Increasing MAX_NUM_DLLS in Rdynload.c

Displaying 20 results from an estimated 1000 matches similar to: "Request: Increasing MAX_NUM_DLLS in Rdynload.c"

2016 Dec 20
2
Request: Increasing MAX_NUM_DLLS in Rdynload.c
Thanks Henrik, this is very helpful! I will try this out on our tests and see if gcDLLs() has a positive effect. mlr currently has tests broken down by learner type, such as classification, regression, forecasting, clustering, etc. There are 83 classifiers alone, so even when loading and unloading across learner types we can still hit the MAX_NUM_DLLS error, meaning we'll have to break them
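For context, a minimal sketch of how to watch how close a session is to the limit, using only base R; the package name is illustrative:

    length(getLoadedDLLs())          # DLLs loaded now; capped at MAX_NUM_DLLS
                                     # = 100 in the R versions discussed here
    unloadNamespace("randomForest")  # illustrative package name
    length(getLoadedDLLs())          # often unchanged: the namespace is gone,
                                     # but its DLL can stay loaded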
2016 Dec 20
2
Request: Increasing MAX_NUM_DLLS in Rdynload.c
On 20 December 2016 at 17:40, Martin Maechler wrote:
| >>>>> Steve Bronder <sbronder at stevebronder.com>
| >>>>>     on Tue, 20 Dec 2016 01:34:31 -0500 writes:
|
|     > Thanks Henrik, this is very helpful! I will try this out on our tests and
|     > see if gcDLLs() has a positive effect.
|
|     > mlr currently has tests broken down by learner type
2016 Dec 20
0
Request: Increasing MAX_NUM_DLLS in Rdynload.c
>>>>> Steve Bronder <sbronder at stevebronder.com>
>>>>>     on Tue, 20 Dec 2016 01:34:31 -0500 writes:

    > Thanks Henrik, this is very helpful! I will try this out on our tests and
    > see if gcDLLs() has a positive effect.

    > mlr currently has tests broken down by learner type such as classification,
    > regression, forecasting,
2016 Dec 20
0
Request: Increasing MAX_NUM_DLLS in Rdynload.c
Hi, Dirk:

On 12/20/2016 10:56 AM, Dirk Eddelbuettel wrote:
> On 20 December 2016 at 17:40, Martin Maechler wrote:
> | >>>>> Steve Bronder <sbronder at stevebronder.com>
> | >>>>>     on Tue, 20 Dec 2016 01:34:31 -0500 writes:
> |
> |     > Thanks Henrik, this is very helpful! I will try this out on our tests and
> |     > see if gcDLLs()
2016 Dec 20
0
Request: Increasing MAX_NUM_DLLS in Rdynload.c
One reason for hitting the MAX_NUM_DLLS (= 100) limit is that some packages don't unload their DLLs when they are unloaded themselves. In other words, there may be left-over DLLs just sitting there doing nothing but occupying space. You can remove these using: R.utils::gcDLLs() Maybe that will help you get through your tests (as long as you're unloading packages). gcDLLs() will
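A minimal sketch of putting this into practice; R.utils::gcDLLs() is the function Henrik names, while the bookkeeping around it is illustrative:

    before <- length(getLoadedDLLs())
    R.utils::gcDLLs()                 # unload DLLs whose namespace is gone
    after <- length(getLoadedDLLs())
    message("freed ", before - after, " DLL slot(s) of MAX_NUM_DLLS = 100")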
2016 Dec 20
2
Request: Increasing MAX_NUM_DLLS in Rdynload.c
On Tue, Dec 20, 2016 at 7:04 AM, Henrik Bengtsson <henrik.bengtsson at gmail.com> wrote:
> One reason for hitting the MAX_NUM_DLLS (= 100) limit is that some
> packages don't unload their DLLs when they are unloaded themselves.

I am surprised by this. Why does R not do this automatically? What is the case for keeping the DLL loaded after the package has been unloaded? What
2016 Dec 21
2
Request: Increasing MAX_NUM_DLLS in Rdynload.c
On Tue, Dec 20, 2016 at 7:39 AM, Karl Millar <kmillar at google.com> wrote:
> It's not always clear when it's safe to remove the DLL.
>
> The main problem that I'm aware of is that native objects with
> finalizers might still exist (created by R_RegisterCFinalizer etc).
> Even if there are no live references to such objects (which would be
> hard to verify), it
2016 Feb 27
1
Unable to Install Packages from Binaries on Windows for R 3.2.3
Removing 'type=binary' worked for me:

    install.packages('httr', repos = "https://cran.rstudio.com/")

But I get an error when I select binary as type:

    install.packages('httr', type = 'binary', repos = "https://cran.rstudio.com/")
    Error in install.packages : type 'binary' is not supported on this platform.
2016 Mar 24
3
summary( prcomp(*, tol = .) ) -- and 'rank.'
I agree with Kasper, this is a 'big' issue. Does your method of taking only n PCs reduce the load on memory? The new addition to the summary looks like a good idea, but Proportion of Variance as you describe it may be confusing to new users. Am I correct in saying that Proportion of Variance describes the amount of variance with respect to the number of components the user chooses to show? So
2016 Mar 25
2
summary( prcomp(*, tol = .) ) -- and 'rank.'
> On 25 Mar 2016, at 10:41 am, peter dalgaard <pdalgd at gmail.com> wrote:
>
> As I see it, the display showing the first p << n PCs adding up to 100% of the variance is plainly wrong.
>
> I suspect it comes about via a mental short-circuit: If we try to control p using a tolerance, then that amounts to saying that the remaining PCs are effectively zero-variance, but
2015 May 05
3
Why is the diag function so slow (for extraction)?
Looks like the c(x)[...] bit used to be as.matrix(x)[...]. Not sure why the change was made many years ago, but this was before names were handled explicitly. It would definitely be better to not force the duplicate, at least in the case where we are sure c() and [ would not dispatch.

Best,
luke

On Mon, 4 May 2015, peter dalgaard wrote:
>
>> On 04 May 2015, at 19:59 , franknarf
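For reference, a sketch of the extraction under discussion; the index-matrix form avoids the full copy that c(x) forces (sizes here are illustrative, and timings will vary):

    n <- 5000L
    m <- matrix(rnorm(n * n), n, n)
    d1 <- diag(m)               # goes through c(m)[...], duplicating m
    d2 <- m[cbind(1:n, 1:n)]    # index-matrix subscript: no full copy
    identical(d1, d2)           # TRUE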
2016 Dec 21
0
Request: Increasing MAX_NUM_DLLS in Rdynload.c
It does, but you'd still be relying on the R code ensuring that all of these objects are dead prior to unloading the DLL, otherwise they'll survive the GC. Maybe if the package counted how many such objects exist, it could work out when it's safe to remove the DLL. I'm not sure that it can be done automatically. What could be done is to keep the DLL loaded, but remove it from
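An R-level analogue of the hazard being discussed; reg.finalizer() stands in here for R_RegisterCFinalizer(), since with a native finalizer, unloading the DLL before collection would leave a dangling pointer:

    e <- new.env()
    reg.finalizer(e, function(x) message("finalizer ran"))
    rm(e)   # no live references remain, but the finalizer has not run yet
    gc()    # only now does the finalizer fire; a DLL unloaded in between
            # would have taken the finalizer's code with it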
2015 May 12
2
Why is the diag function so slow (for extraction)?
>>>>> Steve Bronder <sbronder at stevebronder.com>
>>>>>     on Thu, 7 May 2015 11:49:49 -0400 writes:

    > Is it possible to replace c() with .subset()?

It would be possible, but I think "entirely" wrong. .subset() is documented to be an internal function not to be used "lightly" and, more to the point, it is documented to *NOT*
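A tiny illustration of the dispatch difference at stake; the class and method are made up for the example:

    x <- structure(1:10, class = "myclass")
    `[.myclass` <- function(x, i) stop("S3 method dispatched")
    .subset(x, 2)   # returns 2L: bypasses the S3 method entirely
    try(x[2])       # errors: `[` dispatches to `[.myclass`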
2016 Dec 20
0
Request: Increasing MAX_NUM_DLLS in Rdynload.c
It's not always clear when it's safe to remove the DLL. The main problem that I'm aware of is that native objects with finalizers might still exist (created by R_RegisterCFinalizer etc). Even if there are no live references to such objects (which would be hard to verify), it still wouldn't be safe to unload the DLL until a full garbage collection has been done. If the DLL is
2016 Mar 24
3
summary( prcomp(*, tol = .) ) -- and 'rank.'
Following from the R-help thread of March 22 on "Memory usage in prcomp", I've started looking into adding an optional 'rank.' argument to prcomp, allowing one to get only a few PCs more efficiently instead of the full p PCs, say when p = 1000 and you know you only want 5 PCs. (https://stat.ethz.ch/pipermail/r-help/2016-March/437228.html) As it was mentioned, we already
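A sketch of the interface being proposed; the 'rank.' argument did land in later versions of prcomp, and the data here are illustrative:

    X <- matrix(rnorm(100 * 1000), nrow = 100)  # n = 100, p = 1000
    pc <- prcomp(X, rank. = 5)   # keep only the first 5 PCs
    dim(pc$rotation)             # 1000 x 5 rather than 1000 x 100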
2015 May 13
1
Why is the diag function so slow (for extraction)?
As kindly pointed out to me (oh my decaying gray matter), is.object() is better suited for this test;

    $ svn diff src/library/base/R/diag.R
    Index: src/library/base/R/diag.R
    ===================================================================
    --- src/library/base/R/diag.R (revision 68345)
    +++ src/library/base/R/diag.R (working copy)
    @@ -23,9 +23,11 @@
         stop("'nrow' or
2016 Feb 27
3
Unable to Install Packages from Binaries on Windows for R 3.2.3
Installing packages from binaries on Windows seems broken when using mirrors that are up to date with CRAN:

    install.packages('httr', type = 'binary', repos = "https://cran.rstudio.com/")

Changing repos to the Kansas CRAN mirror installs the package as expected, but that could be because the KS mirror has not yet synced. Someone pointed out that the PACKAGES.gz
2020 Apr 02
5
Rtools and R 4.0.0?
Hello, Has a decision been made yet as to whether R 4.0.0 on Windows is going to be built using the new gcc8 toolchain (described at https://cran.r-project.org/bin/windows/testing/rtools40.html)? From the sidelines, I can see that the toolchain is being used to build and test packages on CRAN; if there are any remaining issues that I can help to try and run down (either in R or any CRAN
2016 Mar 25
0
summary( prcomp(*, tol = .) ) -- and 'rank.'
As I see it, the display showing the first p << n PCs adding up to 100% of the variance is plainly wrong. I suspect it comes about via a mental short-circuit: If we try to control p using a tolerance, then that amounts to saying that the remaining PCs are effectively zero-variance, but that is (usually) not the intention at all. The common case is that the remainder terms have a roughly
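A small numeric sketch of the distinction (illustrative data; the point is the denominator used for 'Proportion of Variance'):

    X <- matrix(rnorm(100 * 10), 100, 10)
    v <- prcomp(X)$sdev^2
    v[1:3] / sum(v)        # share of *total* variance: does not reach 100%
    v[1:3] / sum(v[1:3])   # renormalized over 3 PCs: always sums to 100%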
2016 Mar 25
0
summary( prcomp(*, tol = .) ) -- and 'rank.'
> On 25 Mar 2016, at 10:08 , Jari Oksanen <jari.oksanen at oulu.fi> wrote:
>
>> On 25 Mar 2016, at 10:41 am, peter dalgaard <pdalgd at gmail.com> wrote:
>>
>> As I see it, the display showing the first p << n PCs adding up to 100% of the variance is plainly wrong.
>>
>> I suspect it comes about via a mental short-circuit: If we