Displaying 5 results from an estimated 5 matches for "embarrassingly_parallel".
2009 Nov 22 (2 replies): [LLVMdev] -O0 compile time speed (was: Go)
..., though inter-unit optimizations can take much longer, the
benefits are worthwhile. Multiple threads/processes with a message
passing interface in between them would be a start, but compiling a
unix kernel that way would be tricky memory-wise. ;)
cheers,
--renato
[1] http://en.wikipedia.org/wiki/Embarrassingly_parallel
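The workload Renato describes is the textbook embarrassingly parallel case: each translation unit compiles independently, so the workers need no shared state at all. A minimal sketch of that model in Python, assuming a POSIX cc on the PATH (the source file names are hypothetical):

    import subprocess
    from concurrent.futures import ThreadPoolExecutor

    # Hypothetical translation units; each one compiles with no
    # communication with the others, which is what makes the job
    # embarrassingly parallel.
    sources = ["main.c", "parser.c", "codegen.c"]

    def compile_unit(src: str) -> int:
        obj = src.replace(".c", ".o")
        # One compiler process per unit; assumes cc is on the PATH.
        return subprocess.run(["cc", "-c", src, "-o", obj]).returncode

    # Threads are enough here: the real work runs in child
    # processes, and Python only waits on them.
    with ThreadPoolExecutor(max_workers=4) as pool:
        codes = list(pool.map(compile_unit, sources))

    print("ok" if all(c == 0 for c in codes) else "some units failed")

This is essentially what make -j4 already does for per-unit compiles; it is the inter-unit optimizations mentioned above that resist this decomposition.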
2009 Nov 22 (0 replies): [LLVMdev] -O0 compile time speed (was: Go)
On Saturday 21 November 2009 14:27:15 Chris Lattner wrote:
> On Nov 19, 2009, at 1:04 PM, Bob Wilson wrote:
> >> I've tested it and LLVM is indeed 2x slower to compile, although it
> >> generates
> >> code that is 2x faster to run...
> >>
> >>> Compared to a compiler in the same category as PCC, whose pinnacle of
> >>> optimization
2008 Nov 24 (3 replies): increasing memory limit in Windows Server 2008 64-bit
Hello,
I'm working with a very large dataset in R on a computer running 64-bit
Windows Server 2008 Standard with 32GB of RAM. According to the R for
Windows FAQ, the maximum value allowed for max-mem-size is 4095MB. Is it
possible to run R with a higher memory limit on this system? I've tried
changing memory.limit() in the R console but it claims the system has a
4-GB address limit,
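The 4-GB figure is characteristic of the address space of a 32-bit process, so no max-mem-size setting can raise it within such a build. Independent of the build, the standard workaround for data that outgrows memory is to stream it in chunks. A minimal sketch of that pattern in Python (the file name and column name are placeholders), summing a column while holding only one row at a time:

    import csv

    PATH = "big_dataset.csv"  # hypothetical input file

    def column_sum(path: str, column: str) -> float:
        total = 0.0
        with open(path, newline="") as f:
            # DictReader yields one row at a time, so peak memory
            # stays near the size of a row, not of the whole file.
            for row in csv.DictReader(f):
                total += float(row[column])
        return total

    print(column_sum(PATH, "value"))  # "value" is a placeholder column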
2009 Nov 21 (2 replies): [LLVMdev] -O0 compile time speed (was: Go)
On Nov 19, 2009, at 1:04 PM, Bob Wilson wrote:
>> I've tested it and LLVM is indeed 2x slower to compile, although it
>> generates
>> code that is 2x faster to run...
>>
>>> Compared to a compiler in the same category as PCC, whose pinnacle of
>>> optimization is doing register allocation? I'm not surprised at all.
>>
>> What else
2009 Mar 09 (5 replies): Help
...amazed how quickly you can get up and running.
>
> As suggested at the start of this email... "it depends"...
>
> Best Regards,
> Sean O'Riordain
> Dublin
>
> [1] http://cran.r-project.org/web/packages/biglm/index.html
> [2] http://en.wikipedia.org/wiki/Embarrassingly_parallel
>
>
> iwalters wrote:
> >
> > I'm currently working with very large datasets that consist of
> > 1,000,000+ rows. Is it at all possible to use R for datasets this
> > size, or should I rather consider C++/Java?
> --
>...
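For the million-plus-row question above, the biglm package referenced at [1] fits linear models incrementally, one chunk at a time, so the full dataset never has to be in memory. A minimal sketch of the same idea in Python with NumPy (all names and the demo data are made up): accumulate the normal-equation terms X'X and X'y chunk by chunk, then solve once at the end.

    import numpy as np

    def fit_incremental(chunks, n_features):
        # Summing X'X and X'y over row-chunks yields exactly the same
        # normal equations as fitting on the full data, so only one
        # chunk needs to be resident at a time.
        xtx = np.zeros((n_features, n_features))
        xty = np.zeros(n_features)
        for X, y in chunks:
            xtx += X.T @ X
            xty += X.T @ y
        return np.linalg.solve(xtx, xty)

    # Demo with synthetic data; in practice each chunk would be read
    # from disk (the in-memory arrays here are only for illustration).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(1_000_000, 3))
    y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(scale=0.1, size=1_000_000)
    chunks = ((X[i:i + 1000], y[i:i + 1000]) for i in range(0, len(X), 1000))
    print(fit_incremental(chunks, 3))  # approx [1.0, -2.0, 0.5]

So a million rows is not by itself a reason to leave R; the constraint is the memory strategy rather than the row count, which is the point of the "it depends" reply quoted above.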