
Displaying 6 results from an estimated 6 matches for "126mb".

2007 Dec 14
3
[LLVMdev] Adding ClamAV to the llvm testsuite (long)
...l disk I/O time out of the benchmark, like rerun 3 times automatically or something like that? 5. Library dependencies It needs zlib, all the rest is optional (bzip2, gmp, ....). I think I can reasonably assume zlib is available on all systems where the testsuite is run. 6. Sample output on using 126Mb of data as input: $ make TEST=nightly report .... Program | GCCAS Bytecode LLC compile LLC-BETA compile JIT codegen | GCC CBE LLC LLC-BETA JIT | GCC/CBE GCC/LLC GCC/LLC-BETA LLC/LLC-BETA clamscan | 7.0729 2074308 * * * | 17.48 17.55 18.81 *...
2009 Apr 24
2
"Old method" bootloader failing with large ramdisk
I'm trying to boot a PV guest using the "old method" of passing kernel= and ramdisk= and it appears to work fine with a "small" initrd but not with a "large" one. (Small is 4MB, large is 154MB.) I'm sure both of the initrd's are properly gzip'ed etc. Unpacked, the large one approaches 400M. By doing some kernel startup debugging,
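The post above mentions verifying that both initrds are properly gzip'ed; a minimal sketch of such a check, using a stand-in file rather than a real initrd (the filename is illustrative, not one from the thread):

```shell
# Create a stand-in gzip file; a real check would point at the actual initrd path.
printf 'hello' | gzip -c > initrd-test.img.gz

# gzip -t exits 0 only if the stream decompresses cleanly.
gzip -t initrd-test.img.gz && echo "gzip OK"

# gzip -l reports compressed vs. uncompressed size, useful for seeing
# how large an initrd actually unpacks.
gzip -l initrd-test.img.gz

rm -f initrd-test.img.gz
```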
2007 Dec 17
0
[LLVMdev] Adding ClamAV to the llvm testsuite (long)
...ke rerun 3 times automatically or something like > that? > > 5. Library dependencies > It needs zlib, all the rest is optional (bzip2, gmp, ....). I think I > can reasonably assume zlib is available on all systems where the > testsuite is run. > > 6. Sample output on using 126Mb of data as input: > > $ make TEST=nightly report > .... > Program | GCCAS Bytecode LLC compile LLC-BETA compile JIT codegen | > GCC CBE LLC LLC-BETA JIT | GCC/CBE GCC/LLC GCC/LLC-BETA > LLC/LLC-BETA > clamscan | 7.0729 2074308 * * *...
2006 Apr 24
6
Handling large dataset & dataframe
Hi, I have a dataset consisting of 350,000 rows and 266 columns. Out of 266 columns 250 are dummy variable columns. I am trying to read this data set into R dataframe object but unable to do it due to memory size limitations (object size created is too large to handle in R). Is there a way to handle such a large dataset in R. My PC has 1GB of RAM, and 55 GB harddisk space running
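A common workaround for the memory limit described above is to trim the file before it ever reaches R, since most of the 266 columns are dummy variables. A hedged shell sketch (filenames and the column range are illustrative placeholders):

```shell
# Stand-in for the wide CSV; the real file would have 266 columns and 350,000 rows.
printf 'a,b,c,d\n1,2,3,4\n' > wide.csv

# Keep only the columns actually needed (here, the first two), so the
# object later created in R is a fraction of the original size.
cut -d, -f1-2 wide.csv > narrow.csv
cat narrow.csv

rm -f wide.csv narrow.csv
```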
2007 Jan 11
4
Help understanding some benchmark results
...and 245MB/sec read, while a ZFS raidz using the same disks returns about 120MB/sec write, but 420MB/sec read. * 16-disk RAID10 on Linux returns 165MB/sec and 440MB/sec write and read, while a ZFS pool with 8 mirrored disks returns 140MB/sec write and 410MB/sec read. * 16-disk RAID6 on Linux returns 126MB/sec write, 162MB/sec read, while a 16-disk raidz2 returns 80MB/sec write and 142MB/sec read. The biggest problem I am having understanding "why is it so", is because I was under the impression with ZFS's CoW, etc, that writing (*especially* writes like this, to a raidz array) sh...
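Raw sequential figures like those quoted above are often measured with dd; a minimal sketch (the output path and size are placeholders, deliberately tiny):

```shell
# Write 8 MiB sequentially and let dd report the elapsed time and rate on
# its final status line. Note that without flushing, a run this small mostly
# measures the page cache; on GNU dd, adding conv=fsync would include the
# final flush in the timing, which matters at realistic benchmark sizes.
dd if=/dev/zero of=ddtest.bin bs=1M count=8 2>&1 | tail -n 1

rm -f ddtest.bin
```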
2007 Dec 18
3
[LLVMdev] Adding ClamAV to the llvm testsuite (long)
...or something like >> that? >> >> 5. Library dependencies >> It needs zlib, all the rest is optional (bzip2, gmp, ....). I think I >> can reasonably assume zlib is available on all systems where the >> testsuite is run. >> >> 6. Sample output on using 126Mb of data as input: >> >> $ make TEST=nightly report >> .... >> Program | GCCAS Bytecode LLC compile LLC-BETA compile JIT codegen | >> GCC CBE LLC LLC-BETA JIT | GCC/CBE GCC/LLC GCC/LLC-BETA >> LLC/LLC-BETA >> clamscan | 7.0729 2074308 *...