Oliver LYTTELTON
2006-Jan-23 20:24 UTC
[Rd] Master's project to coerce linux nvidia drivers to run generalised linear models
Hi,

I am working with a friend on a master's project. Our laboratory does a lot of statistical analysis using the R stats package, and we also have a lot of under-utilised nvidia cards sitting in the back of our networked linux machines. Our idea is to coerce the linux nvidia driver into running some of our statistical analysis for us. Our first thought was to code up a version of glm() specifically to run on the nvidia cards...

Thinking that this might be of use to the broader community, we thought we would ask for feedback before starting. Any ideas?

Thanks,

Olly
Marc Schwartz (via MN)
2006-Jan-23 21:47 UTC
[Rd] Master's project to coerce linux nvidia drivers to run generalised linear models
On Mon, 2006-01-23 at 15:24 -0500, Oliver LYTTELTON wrote:

> Hi,
>
> I am working with a friend on a master's project. Our laboratory does a
> lot of statistical analysis using the R stats package and we also have a
> lot of under-utilised nvidia cards sitting in the back of our networked
> linux machines. Our idea is to coerce the linux nvidia driver to run
> some of our statistical analysis for us. Our first thought was to
> specifically code up a version of glm() to run on the nvidia cards...
>
> Thinking that this might be of use to the broader community we thought
> we might ask for feedback before starting?
>
> Any ideas...
>
> Thanks,
>
> Olly

Well, I'll bite. My first reaction to this was: why?

Then I did some Googling and found the following article:

http://www.apcmag.com/apc/v3.nsf/0/5F125BA4653309A3CA25705A0005AD27

And also noted the GPU Gems 2 site here:

http://developer.nvidia.com/object/gpu_gems_2_home.html

So, my new-found perspective is: why not?

Best wishes for success, especially since I have a certain affinity for McGill...

HTH,

Marc Schwartz
Gabor Grothendieck
2006-Jan-23 22:09 UTC
[Rd] Master's project to coerce linux nvidia drivers to run generalised linear models
I wonder if it would make more sense to get a relatively low-level package to run on it, so that all packages that use that low-level package would benefit. The Matrix package and the functions runmean and sum.exact in package caTools are some things that come to mind. Others may have other ideas along these lines.

On 1/23/06, Oliver LYTTELTON <oliver at bic.mni.mcgill.ca> wrote:

> Hi,
>
> I am working with a friend on a master's project. Our laboratory does a
> lot of statistical analysis using the R stats package and we also have a
> lot of under-utilised nvidia cards sitting in the back of our networked
> linux machines. Our idea is to coerce the linux nvidia driver to run
> some of our statistical analysis for us. Our first thought was to
> specifically code up a version of glm() to run on the nvidia cards...
>
> Thinking that this might be of use to the broader community we thought
> we might ask for feedback before starting?
>
> Any ideas...
>
> Thanks,
>
> Olly
>
> ______________________________________________
> R-devel at r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-devel
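[Editorial sketch of the low-level approach suggested above. The function name rgpu_matmul is hypothetical, and the GPU offload itself is omitted; this only shows the shape such a primitive would take if exposed through R's .C() interface, where all arguments arrive as pointers and matrices are column-major doubles, exactly as R stores them.]

```c
/* Hypothetical low-level primitive for an R package, callable via .C():
 *   C <- A %*% B, with A m-by-k, B k-by-n, C m-by-m... rather, m-by-n.
 * All matrices are column-major double arrays, R's native layout.
 * In a GPU-backed build, the triple loop below is what would be
 * replaced by a call into the nvidia driver; this portable CPU
 * version would serve as the fallback path. */
void rgpu_matmul(const double *a, const double *b, double *c,
                 const int *mp, const int *kp, const int *np)
{
    int m = *mp, k = *kp, n = *np;
    for (int j = 0; j < n; j++) {          /* over columns of C   */
        for (int i = 0; i < m; i++) {      /* over rows of C      */
            double s = 0.0;
            for (int l = 0; l < k; l++)    /* inner dot product   */
                s += a[i + l * m] * b[l + j * k];  /* column-major */
            c[i + j * m] = s;
        }
    }
}
```

From R it would be invoked along the lines of `.C("rgpu_matmul", A, B, C, nrow(A), ncol(A), ncol(B))` (again hypothetical); the point is that a single primitive like this, once GPU-backed, benefits every package built on top of it, rather than accelerating glm() alone.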