Emmanuel Levy
2008-Nov-18 23:03 UTC
[R] Mathematica now working with Nvidia GPUs --> any plan for R?
Dear All,

I just read an announcement saying that Mathematica is launching a version that works with Nvidia GPUs. It is claimed that this would make it ~10-100x faster!
http://www.physorg.com/news146247669.html

I was wondering whether you are aware of any development going in this direction with R?

Thanks for sharing your thoughts,

Best wishes,

Emmanuel
Prof Brian Ripley
2008-Nov-19 06:56 UTC
[R] Mathematica now working with Nvidia GPUs --> any plan for R?
On Tue, 18 Nov 2008, Emmanuel Levy wrote:

> Dear All,
>
> I just read an announcement saying that Mathematica is launching a
> version working with Nvidia GPUs. It is claimed that it'd make it
> ~10-100x faster!
> http://www.physorg.com/news146247669.html

Well, lots of things are 'claimed' in marketing (and Wolfram is not shy about claiming). I think you need lots of GPUs, as well as the right problem.

> I was wondering if you are aware of any development going into this
> direction with R?

It seems so, as users have asked about using CUDA in R packages. Parallelization is not at all easy, but there is work on making R better able to use multi-core CPUs, which are expected to become far more common than tens of GPUs.

> Thanks for sharing your thoughts,
>
> Best wishes,
>
> Emmanuel

PS: R-devel is the list on which to discuss the development of R.

--
Brian D. Ripley,                  ripley at stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel: +44 1865 272861 (self)
1 South Parks Road,                    +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax: +44 1865 272595
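[Editor's note: as an illustration of the multi-core use of R mentioned above, here is a minimal sketch. It is not from the original thread, and it assumes a modern R with the bundled 'parallel' package, which appeared after this 2008 exchange; at the time, add-on packages such as 'snow' and 'multicore' played a similar role.]

## Spread an embarrassingly parallel job over several CPU cores.
## mclapply() forks worker processes on Unix-alikes; on Windows use
## mc.cores = 1 or a cluster-based function such as parLapply().
library(parallel)

slow_task <- function(i) {
  ## Stand-in for an expensive, independent computation
  mean(rnorm(1e6, mean = i))
}

n_cores <- max(1L, detectCores() - 1L)
results <- mclapply(1:8, slow_task, mc.cores = n_cores)
str(results)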