similar to: Unable to Install Packages from Binaries on Windows for R 3.2.3

Displaying 20 results from an estimated 5000 matches similar to: "Unable to Install Packages from Binaries on Windows for R 3.2.3"

2016 Feb 27
1
Unable to Install Packages from Binaries on Windows for R 3.2.3
Removing 'type=binary' worked for me. install.packages( 'httr', repos = "https://cran.rstudio.com/" ) But I get an error when I set type = 'binary' --- install.packages( 'httr', type = 'binary', repos = "https://cran.rstudio.com/" ) Error in install.packages : type 'binary' is not supported on this platform.
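A minimal sketch of the workaround described above (package name and repo URL are taken from the thread): drop type = 'binary', or ask for type = "both" so install.packages() can fall back to source when the mirror offers no binary.

    # Let install.packages() pick a binary when available and source otherwise,
    # instead of forcing type = "binary".
    install.packages("httr", repos = "https://cran.rstudio.com/")
    install.packages("httr", type = "both", repos = "https://cran.rstudio.com/")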
2016 Feb 27
0
Unable to Install Packages from Binaries on Windows for R 3.2.3
On 27 Feb 2016, at 05:22, Ramnath Vaidyanathan <ramnath.vaidya at gmail.com> wrote: Installing packages from binaries on Windows seems broken when using mirrors that are up to date with CRAN: install.packages( 'httr', type = 'binary', repos = "https://cran.rstudio.com/" ). Changing repos to the Kansas
2016 Mar 24
3
summary( prcomp(*, tol = .) ) -- and 'rank.'
I agree with Kasper, this is a 'big' issue. Does your method of taking only n PCs reduce the load on memory? The new addition to the summary looks like a good idea, but Proportion of Variance as you describe it may be confusing to new users. Am I correct in saying Proportion of variance describes the amount of variance with respect to the number of components the user chooses to show? So
2016 Mar 25
2
summary( prcomp(*, tol = .) ) -- and 'rank.'
On 25 Mar 2016, at 10:41 am, peter dalgaard <pdalgd at gmail.com> wrote: As I see it, the display showing the first p << n PCs adding up to 100% of the variance is plainly wrong. I suspect it comes about via a mental short-circuit: If we try to control p using a tolerance, then that amounts to saying that the remaining PCs are effectively zero-variance, but
2016 Dec 20
2
Request: Increasing MAX_NUM_DLLS in Rdynload.c
Thanks Henrik, this is very helpful! I will try this out on our tests and see if gcDLLs() has a positive effect. mlr currently has tests broken down by learner type, such as classification, regression, forecasting, clustering, etc. There are 83 classifiers alone, so even when loading and unloading across learner types we can still hit the MAX_NUM_DLLS error, meaning we'll have to break them
2011 Apr 18
4
(no subject)
Hi, from which CRAN mirror can I get the package ‘LPP2005REC’? Ram
2016 Dec 20
3
Request: Increasing MAX_NUM_DLLS in Rdynload.c
This is a request to increase MAX_NUM_DLLS in Rdynload.c from 100 to 500. On line 131 of Rdynload.c, changing #define MAX_NUM_DLLS 100 to #define MAX_NUM_DLLS 500. In development of the mlr package, there have been several episodes in the past where we have had to break up unit tests because of the "maximum number of DLLs reached" error. This error has been an inconvenience that
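A minimal sketch for diagnosing this limit from an R session (my own illustration, not part of the request above): count the DLLs currently loaded and see which packages own them. Detaching a package does not necessarily unload its DLL, which is what helpers like the gcDLLs() mentioned in this thread try to address.

    # How close is the session to the MAX_NUM_DLLS limit (100 at the time)?
    dlls <- getLoadedDLLs()
    length(dlls)   # approaching 100 triggers "maximum number of DLLs reached"
    names(dlls)    # which packages the loaded DLLs belong to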
2016 Dec 20
2
Request: Increasing MAX_NUM_DLLS in Rdynload.c
On 20 December 2016 at 17:40, Martin Maechler wrote, quoting Steve Bronder <sbronder at stevebronder.com> (Tue, 20 Dec 2016 01:34:31 -0500): Thanks Henrik, this is very helpful! I will try this out on our tests and see if gcDLLs() has a positive effect. mlr currently has tests broken down by learner type
2011 May 23
6
What are the common Standard Statistical methods used for the analysis of a dataset
Hi, does anybody know what the common standard statistical methods used for the analysis of a dataset are, and which of these methods give similar results? Ram
2018 Jan 02
4
httr::content without message
Hi All: I am using httr to download files from a service, in this case a .csv file. When I use httr::content on the result, I get a message. Since this will be in a package, I want to suppress the message, but haven't figured out how to do so. The following should reproduce the result: myURL <-
2011 Apr 18
3
(no subject)
Hi, I just wanted to know how we can find the package of a dataset, e.g. how can I find the package in which the dataset *iris* is present? Ram
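A minimal sketch of one way to answer this (my suggestion; the thread's replies are not shown here): data() lists datasets together with the package that provides each one, and find() reports where an object on the search path comes from.

    # List datasets from every installed package, with their package names
    data(package = .packages(all.available = TRUE))
    # iris lives in the datasets package, which is attached by default
    find("iris")   # "package:datasets"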
2015 May 05
3
Why is the diag function so slow (for extraction)?
Looks like the c(x)[...] bit used to be as.matrix(x)[...]. Not sure why the change was made many years ago, but this was before names were handled explicitly. It would definitely be better to not force the duplicate, at least in the case where we are sure c() and [ would not dispatch. Best, luke. On Mon, 4 May 2015, peter dalgaard wrote: On 04 May 2015, at 19:59, franknarf
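A minimal sketch (my own illustration, not from the thread) of why the copy matters: extracting the diagonal by direct two-column indexing avoids the full duplicate that the c(x) call inside diag() forces.

    n <- 3000L
    m <- matrix(rnorm(n * n), n, n)
    system.time(d1 <- diag(m))                            # copies the whole matrix
    system.time(d2 <- m[cbind(seq_len(n), seq_len(n))])   # indexes the diagonal only
    identical(d1, d2)                                     # TRUE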
2018 Jan 02
1
httr::content without message
Thanks to all who replied. I had just looked through the httr code and sure enough, for a .csv MIME type it calls readr::read_csv(). The httr::content docs suggest not using automatic parsing in a package, but rather determining the MIME type and parsing yourself, and Ben's suggestion also works if I do: junk <- readr::read_csv(r1$content, col_types = cols()) Perfect. Using httr rather than
2018 Jan 02
0
httr::content without message
Ahoy! That's a message generated by the readr::read_table() function (or its friends). You can suppress it in a number of ways, but this should work, as httr::content() will pass through arguments, like col_types = cols(), to the file reader. junk <- httr::content(r1, col_types = cols()) See more here... https://blog.rstudio.com/2016/08/05/readr-1-0-0/
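A minimal, self-contained sketch of the suggestion above (the URL is hypothetical; r1 stands in for the response object from the thread): passing col_types = cols() through httr::content() to readr silences the column-specification message.

    library(httr)
    library(readr)
    r1 <- GET("https://example.com/data.csv")     # hypothetical CSV endpoint
    junk <- content(r1, col_types = cols())       # parsed quietly by readr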
2015 Jul 09
4
R CMD build failure
I have a local package 'dart' that imports "httr". It has routines that access central patient data such as birth date, so it is heavily used locally but of no interest to anyone else. The httr package (and 300 others) are in a shared directory, referenced by everyone in the biostatistics group by adding this location to .libPaths in their default .Rprofile.
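A minimal sketch of that .Rprofile arrangement (the path is hypothetical, not from the post): prepend the group's shared package directory to the library search path.

    # In each user's ~/.Rprofile
    local({
      shared <- "/shared/R/library"   # hypothetical shared location
      if (dir.exists(shared)) .libPaths(c(shared, .libPaths()))
    })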
2018 Oct 23
2
Suggestion: Make CRAN source URLs immutable
Hello, I hope this is the right list to send this suggestion to. I was wondering if it might be possible to have CRAN store the most current version of a package's source tarball at a location that does not change. As an example, the source tarball for `httr` is stored at https://cran.r-project.org/src/contrib/httr_1.3.1.tar.gz. However, once the next version of `httr` is released, the URL for
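A minimal sketch of the usual workaround (my own illustration; the Archive path reflects standard CRAN layout, not something proposed in the post): try the current src/contrib URL first, then fall back to the CRAN Archive directory where superseded tarballs are kept.

    pkg <- "httr"; ver <- "1.3.1"
    tarball <- sprintf("%s_%s.tar.gz", pkg, ver)
    urls <- c(
      sprintf("https://cran.r-project.org/src/contrib/%s", tarball),
      sprintf("https://cran.r-project.org/src/contrib/Archive/%s/%s", pkg, tarball)
    )
    for (u in urls) {
      ok <- tryCatch(download.file(u, destfile = tarball, mode = "wb") == 0,
                     error = function(e) FALSE, warning = function(w) FALSE)
      if (ok) break   # stop at the first URL that downloads successfully
    }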
2013 May 08
1
Dependencies of Imports not attached?
Encountered an error in scripting, which can be reproduced using Rscript as follows: $ Rscript -e "library(httr); handle('http://cran.r-project.org')" Error in getCurlHandle(cookiefile = cookie_path, .defaults = list()) : could not find function "getClass" Calls: handle -> getCurlHandle or by starting R without the methods package attached: $
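One common workaround (my suggestion; the snippet is cut off before any fix appears) was to attach methods explicitly, since older versions of Rscript did not load it by default:

    $ Rscript -e "library(methods); library(httr); handle('http://cran.r-project.org')"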
2011 May 14
1
(no subject)
Can anyone tell me where I can find small standalone R programs that do things like scatterplots, mean/median, kernel density estimation, etc.? Ram
2016 Feb 08
3
something wrong in package submission procedure/website
Yesterday I uploaded the new rockchalk_1.8.97. Then I received an email saying that I needed to confirm the submission. Here's the message: Dear Paul E. Johnson, Someone has submitted the package rockchalk to CRAN. You are receiving this email to confirm the submission as the maintainer of this package. To confirm the submission to CRAN, follow or copy & paste the following link into your
2016 Mar 24
3
summary( prcomp(*, tol = .) ) -- and 'rank.'
Following from the R-help thread of March 22 on "Memory usage in prcomp", I've started looking into adding an optional 'rank.' argument to prcomp, allowing one to more efficiently get only a few PCs instead of the full p PCs, say when p = 1000 and you know you only want 5 PCs (https://stat.ethz.ch/pipermail/r-help/2016-March/437228.html). As it was mentioned, we already
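A minimal sketch of the kind of usage being proposed (rank. did become a prcomp() argument in later R releases; the exact behaviour under discussion in this 2016 thread may differ):

    set.seed(1)
    x <- matrix(rnorm(100 * 50), nrow = 100)   # toy data: 100 obs, 50 variables
    pc <- prcomp(x, rank. = 5)                 # keep only the first 5 PCs
    dim(pc$rotation)                           # 50 x 5 instead of 50 x 50
    summary(pc)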