similar to: R 1.6.0 benchmark with and without optimized ATLAS

Displaying 13 results from an estimated 13 matches similar to: "R 1.6.0 benchmark with and without optimized ATLAS"

2001 Feb 16
1
Sub_scribe and a question
Dear all, I am trying to get an estimate of the intercept for a linear model. In this case I know the slope of the model; can anyone tell me how to constrain the formula in lm() so that it estimates only the intercept, not the slope? Many thanks in advance. Sincerely, Liqing Zhang Dept. of Eco. Evol. Biol. Univ. of CA, Irvine email: lzhang at uci.edu
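A minimal sketch of one standard answer, using offset() in the lm() formula to hold the known slope fixed (the vectors x, y and the slope b below are illustrative, not from the post):

    ## fix the known slope b so lm() estimates only the intercept
    set.seed(1)
    x <- rnorm(50)
    y <- 1.5 + 2 * x + rnorm(50)       # true intercept 1.5, slope 2
    b <- 2                             # the slope assumed known
    fit <- lm(y ~ 1 + offset(b * x))   # offset() excludes b*x from estimation
    coef(fit)                          # a single coefficient: the intercept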
2015 Nov 23
0
MKL Acceleration encouraging; need adjust package builds?
Hi Paul, We've been through this process ourselves for the Revolution R Open project. There are a number of pitfalls to avoid, but you can take a look at how we achieved it in the build scripts at: https://github.com/RevolutionAnalytics/RRO There are also some very useful notes in the R Installation guide: https://cran.r-project.org/doc/manuals/r-release/R-admin.html#BLAS Most packages do
2015 Nov 23
3
MKL Acceleration encouraging; need adjust package builds?
Dear R-devel: The cluster administrators at KU got enthusiastic about testing R-3.2.2 with Intel MKL when I asked for some BLAS integration. Below I forward a performance report, which is encouraging, and I thought you would like to know the numbers. It appears to my untrained eye that there are some extraordinary speedups on Cholesky decomposition, determinants, and matrix inversion. They had
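A rough sketch of the kind of timings such a report is built on; the matrix size and the three operations below are illustrative, and the absolute numbers depend entirely on which BLAS R is linked against:

    ## time the operations the report singles out, on a random SPD matrix
    n <- 2000
    A <- crossprod(matrix(rnorm(n * n), n))  # symmetric positive-definite
    system.time(chol(A))                     # Cholesky decomposition
    system.time(determinant(A))              # (log-)determinant
    system.time(solve(A))                    # matrix inversion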
2001 Apr 27
2
Benchmarking R, why sort() is so slow?
Hello everybody, I am making a modified version of Stephan Steinhaus' benchmark test for number crunching, v. 2 (see http://www.scinetificweb.com/ncrunch/ncrunch.pdf for the original version), comparing several functions of some math/stat software. R is not performing badly at all... except for the sorting of a 1,100,000-element random vector (test #3), where it is the worst of all (see cell F3 in
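A minimal sketch of test #3 as described, timing the sort of a 1,100,000-element random vector; the method = "radix" comparison is an addition available in recent R, not part of the original benchmark:

    ## time sorting 1,100,000 random doubles
    x <- runif(1100000)
    system.time(sort(x))                    # default ("auto") method
    system.time(sort(x, method = "radix"))  # radix sort, in recent R versions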
2003 Jul 26
0
R benchmark, mobile Pentium III, 1.13 GHz
Hi Jason, I suppose you installed the Matrix library and it is working on your computer? If yes, maybe det.Matrix() was removed or renamed in the Matrix library you have (I cannot check this for the latest version, because I am away from the office until August 1st), but I will do that next week. In the meantime, you can replace 'det.Matrix' with 'det.default', and it should run.
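A minimal sketch of the suggested workaround: for a plain numeric matrix, base R's det() (which dispatches to det.default) does the same job as the missing det.Matrix():

    ## illustrative size; replaces a det.Matrix(m) call in the benchmark
    m <- matrix(rnorm(500 * 500), nrow = 500)
    d <- det(m)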
2011 Jul 21
2
revolution-mkl package not functioning correctly with R 2.13
It appears that the revolution-mkl package, available via the multiverse Ubuntu repository for the purpose of adding multi-threaded numeric libraries to R, is not functioning correctly with R 2.13 (at least the version found on CRAN). Testing with "R-benchmark-25.R" (found at http://r.research.att.com/benchmarks/), I get the following error after installing the revolution-mkl package
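A hedged sketch of how such a run is reproduced, fetching and sourcing the benchmark script named above (the URL is the one given in the post and may no longer resolve):

    ## download and run the R benchmark 2.5 script
    download.file("http://r.research.att.com/benchmarks/R-benchmark-25.R",
                  destfile = "R-benchmark-25.R")
    source("R-benchmark-25.R")   # runs the timed tests and prints the totals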
2007 Dec 12
3
combine variables to matrix
I just got stuck on a quite simple question. I've just read an ASCII table from a plain text file with read.table(). It's a 1200x1200 table. R has assigned a variable for each column: V1, V2, V3, V4, ... For small data sets data <- read.table("data.txt"); data.matrix <- cbind(V1, V2, V3); works. But how can I put together 1200 columns? I've searched the R mailing
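Two equivalent sketches of the usual answer, converting the whole data frame instead of naming 1200 columns (the file name is the poster's; the rest is illustrative):

    data <- read.table("data.txt")   # 1200 x 1200 data frame
    m1 <- as.matrix(data)            # simplest: convert the whole frame
    m2 <- do.call(cbind, data)       # or cbind() all columns at once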
2010 Aug 08
2
paperclip save to disk and s3
I have a standard Paperclip setup that saves a file to my disk. In addition I would also like the file saved to my Amazon S3 bucket. [code]
after_save :copy_to_s3

def copy_to_s3
  has_attached_file :photo,
    :storage => :s3,
    :s3_credentials => "#{RAILS_ROOT}/config/s3.yml",
    :styles => { :thumb => "100x100#", :small => "750x750>"
2004 Aug 06
1
Webcasting Rates
On Thu, 21 Feb 2002, Jack Moffitt wrote: > > Anyone (Jack?) see the story on /. yesterday about the webcasting rates? > > Won't that kill all Internet radio under a huge burden of cost? > > Are you saying it wasn't already dying? :) > > Those aren't set in stone yet, but similar rates and terms probably soon will be. Remember, they are retroactive to 1998 or so.
2004 Aug 06
2
Webcasting Rates
Anyone (Jack?) see the story on /. yesterday about the webcasting rates? Won't that kill all Internet radio under a huge burden of cost?
2004 Apr 22
1
New version of benchmark comparing R with other software
Hello, Thanks to Douglas Bates, there is now a new benchmark suite (version 2.3) which is compatible with R 1.9.0 and the recent Matrix library (0.8-1 or above). You can find it at http://www.sciviews.org/other/benchmark.htm. It compares R 1.9.0 under Windows with S-PLUS 6.5, Matlab 6.0, O-Matrix 5.6, Octave 2.1.42, Scilab 2.7, and Ox 3.30. In short, R in its version 1.9.0 and with the new Matrix
2007 Oct 27
1
[non-statistics question]methodological problem
Good afternoon! As mentioned in the subject, my question concerns the methodology that accompanies survey design as much as the statistics involved. So, I have the following data: a <- data.frame(id_hh=1:5, strata=c(1,1,2,2,1), Nhstrata=c(100,100,200,200,100), Nrmemb=c(2,4,2,5,4)); a$ocmemb1 <- c("wk","jl","st","jl","st")
1999 Dec 29
1
Large data files
Dear R and S-Plus users: Currently I am using: at work, "S-Plus 2000 Pro" on a PC (Pentium II/350MHz, 256 MB RAM, running Win NT); at home, "R" on my Mac PowerBook G3/292MHz, 128 MB RAM, running LinuxPPC. Currently, at home I am trying to import a table (nrow=302500, ncol=6), which I have to do column by column because of memory problems. I have to use the columns partially,
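A hedged sketch of two ways to cut the memory cost of such an import; the file name and column types below are illustrative, not from the post:

    ## declaring types and row count up front avoids type-guessing overhead
    dat <- read.table("big.txt",
                      colClasses = rep("numeric", 6),
                      nrows = 302500)
    ## or read one numeric column with scan(), skipping the other five
    col3 <- scan("big.txt",
                 what = list(NULL, NULL, 0, NULL, NULL, NULL))[[3]]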