Displaying 20 results from an estimated 39 matches for "1.6gb".
2018 May 13
0
A Fresh Start with LLVM
On 5/13/18, Bruce Hoult via llvm-dev <llvm-dev at lists.llvm.org> wrote:
> Yes, it's not bad. You can actually reduce the size of the .git
> directory to 597 MB by running "git repack -a -d -f --depth=250
> --window=250". This takes less than 5 minutes on a 16 core Xeon.
You can also svn checkout any GitHub branch if that's something that
you might need.
2018 May 14
2
A Fresh Start with LLVM
On Mon, May 14, 2018 at 3:02 AM, Carsten Mattner via llvm-dev <
llvm-dev at lists.llvm.org> wrote:
> On 5/13/18, Bruce Hoult via llvm-dev <llvm-dev at lists.llvm.org> wrote:
> > Yes, it's not bad. You can actually reduce the size of the .git
> > directory to 597 MB by running "git repack -a -d -f --depth=250
> > --window=250". This takes less than 5
2018 May 13
2
A Fresh Start with LLVM
Yes, it's not bad. You can actually reduce the size of the .git directory
to 597 MB by running "git repack -a -d -f --depth=250 --window=250". This
takes less than 5 minutes on a 16 core Xeon. Unfortunately I've never found
a way to get such a nicely packed repo into github such that it checks out
for others as nicely as it was when I uploaded it :-(
On Mon, May 14, 2018 at
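The repack recipe quoted above can be sketched end-to-end. This demo runs it on a throwaway repository so it is self-contained; on a real LLVM clone you would run only the `git repack` line from inside the checkout (and it would take the several minutes the poster mentions, not an instant):

```shell
#!/bin/sh
# Demo of the aggressive repack from the thread above, on a throwaway
# repository. On a real clone, run only the `git repack` line.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q .
git -c user.email=demo@example.com -c user.name=demo \
    commit -q --allow-empty -m 'initial commit'
# -a: repack all objects into one pack; -d: delete redundant old packs;
# -f: recompute deltas from scratch; large --depth/--window trade CPU
# time for better delta compression.
git repack -a -d -f --depth=250 --window=250
du -sh .git   # inspect the resulting size
```

The large `--depth`/`--window` values only pay off on repositories with long histories; on this toy repository the effect is trivial.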
2010 Feb 08
1
release memory
I am a new R user with the latest Ubuntu release.
My executions consume a large amount of memory (up to 1.6 GB). When I try to
release the memory using "rm(list=ls())", R still occupies 1.6GB. (I also
tried to use rm on the specific arrays which I know to be large).
What could be the reason for that?
BTW I am using the randomForest package.
Thanks
--
View this message in context:
2011 Dec 14
0
Temporary use of disk space when deploying KVM with qcow2 ?
Hello,
I'm using libvirt to deploy a series of 7 KVM guests (in qcow2 format)
sequentially. The base image of the qcow2 is an ubuntu server of around
1.6GB.
The environment where I am doing this is a Live USB Ubuntu with a
persistence file (so that changes made remain).
So, the problem:
* If the persistence file (i.e. free disk space in the live ubuntu) is up
to around 1.6GB, the qemu process of
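The failure mode described above (the deployment dying when the persistence file is smaller than the base image) suggests checking free space before starting. A minimal sketch under that assumption, using a small stand-in file in place of the real ~1.6GB base image:

```shell
#!/bin/sh
# Verify the filesystem has at least as much free space as the qcow2
# base image before deploying. BASE is a small stand-in here;
# substitute the real ~1.6GB base image path.
set -e
BASE=$(mktemp)
dd if=/dev/zero of="$BASE" bs=1024 count=16 2>/dev/null
need_kb=$(du -k "$BASE" | cut -f1)
avail_kb=$(df -Pk "$(dirname "$BASE")" | awk 'NR==2 {print $4}')
if [ "$avail_kb" -ge "$need_kb" ]; then
    echo "ok: need ${need_kb}K, have ${avail_kb}K"
else
    echo "not enough free space for deployment" >&2
    exit 1
fi
```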
2006 Sep 09
2
Not enough memory to run installer
I'm trying to install a game using Wine (Company of
Heroes Multiplayer Beta), but when I try to execute
the installer, I get the error:
wine: could not load
L"Z:\\home\\bratch\\CoHBeta_1_19_0.exe": Not enough
memory
Somebody on #winehq suggested I move it to
~/.wine/drive_c/, but this gave the same error.
The installer is 1.6GB, and I have tried to run it
with varying amounts of
2018 May 14
0
A Fresh Start with LLVM
On 5/14/18, Bruce Hoult <bruce at hoult.org> wrote:
> On Mon, May 14, 2018 at 3:02 AM, Carsten Mattner via llvm-dev <
> llvm-dev at lists.llvm.org> wrote:
>
>> On 5/13/18, Bruce Hoult via llvm-dev <llvm-dev at lists.llvm.org> wrote:
>> > Yes, it's not bad. You can actually reduce the size of the .git
>> > directory to 597 MB by running
2011 Jan 16
1
Memory issues
Hi,
I have read several threads about memory issues in R and I can't seem to
find a solution to my problem.
I am running a sort of LASSO regression on several subsets of a big dataset.
For some subsets it works well, and for some bigger subsets it does not
work, with errors of type "cannot allocate vector of size 1.6Gb". The error
occurs at this line of the code:
example <-
2010 May 16
1
syslinux can't read the configuration file on USB
Hi,
I have the following weird problem with syslinux, and I was hoping
that someone on this list might be able to help:
I am using syslinux 3.86 on CentOS-5.4 64bit.
I'm creating a custom installation image for my distribution
(customized centos). I first create a file with the image, and later
on I write it to a USB disk. This procedure has been working quite
well for a very long time.
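The create-then-write procedure described above can be sketched with `dd`. Both files below are tiny stand-ins so the demo is self-contained; on real hardware the copy target would be the USB device node (e.g. `/dev/sdX`, which you must double-check before writing):

```shell
#!/bin/sh
# Sketch of the image-then-write-to-USB procedure. Both files are tiny
# stand-ins; on real hardware the target would be the USB device node.
set -e
IMG=$(mktemp)      # stand-in for the custom installation image
TARGET=$(mktemp)   # stand-in for the USB device (e.g. /dev/sdX)
dd if=/dev/zero of="$IMG" bs=1024 count=64 2>/dev/null
dd if="$IMG" of="$TARGET" bs=4M conv=fsync 2>/dev/null   # the write step
cmp "$IMG" "$TARGET" && echo "image written and verified"
```

Verifying the written bytes with `cmp` (or a checksum) catches the silent write failures that often show up later as "can't read the configuration file".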
2007 Sep 26
1
modifying large R objects in place
I have a C function, which performs a transformation
of a large integer matrix. The matrix may be of size 1.6GB,
so I can have only one copy in RAM and have to modify it
in place. This is possible using .Call and works fine. For
debugging, I need two copies of a smaller matrix and modify only
one of them. This may also be done, for example, by
A <- some integer matrix
B <- A +
2018 Mar 14
2
truncation/rounding bug with write.csv
To my surprise, I can confirm on Windows 10 using R 3.4.3. As tail is not
recognized by Windows cmd, I replaced it with:
system('powershell -nologo "& {Get-Content -Path temp.csv -Tail 1}"')
The last line shows only 7 digits after the decimal, whereas the first have
15 digits after the decimal. I agree with Dirk though, 1.6Gb csv files are
not the best way to work with
2006 Apr 01
3
CentOS 4.3 occasionally locking up accessing IDE drive
For those who haven't seen my several previous postings about problems
with this (now not quite so) new PC, I have an ASUS P5N32-SLI Deluxe
motherboard. The boot drive and primary filesystems are on an SATA
disk and I'm having no problem with that. However, I recently plugged
in a couple of IDE drives salvaged from my old PCs and I'm running
into trouble with one of those.
The drive
2009 Nov 20
1
R 2.10 'memory leak'? on OS X
Dear R users,
I am running R 2.10.0 on OS X 10.5.8.
I had been running 2.10 successfully for about a week (and have used
previous R versions for 2+ years on the same computer) until 2 days ago it
failed to start up for me. Now when I try to start R, the application tries
to initialize for several minutes, then crashes. Looking at the activity
monitor, my memory usage goes from having about 1.6Gb
2008 Feb 13
3
isolinux not booting - old 486 with SCSI CD writer
I am trying to install Debian Linux on an old Intel Classic R+ computer that
uses an internal Yamaha SCSI CD writer model CRW8424S connected to an Adaptec
ISA SCSI card (I think it's a 1542CP). The CD writer is the only device
connected to the SCSI card. The computer has one hard drive connected to the
on-board IDE interface, a 1.44MB 3 1/2 inch floppy and a 1.2MB 5 1/4 inch floppy.
The hard
2009 Nov 20
2
R 2.10 memory leak on OS X
Dear R users,
I am running R 2.10.0 on OS X 10.5.8.
I had been running 2.10 successfully for about a week (and have used
previous R versions for 2+ years on the same computer) until 2 days ago it
failed to start up for me. Now when I try to start R, the application tries
to initialize for several minutes, then crashes. Looking at the activity
monitor, my memory usage goes from having about 1.6Gb
2018 Mar 14
2
truncation/rounding bug with write.csv
I don't see the issue here. It would be helpful if people would report
their sessionInfo() when reporting whether or not they see this issue.
Mine is
> sessionInfo()
R version 3.4.3 (2017-11-30)
Platform: x86_64-pc-linux-gnu (64-bit)
Running under: Arch Linux
Matrix products: default
BLAS/LAPACK: /usr/lib/libopenblas_haswellp-r0.2.20.so
locale:
[1] LC_CTYPE=en_US.UTF-8
2006 Apr 20
0
Happy story
2003 Feb 20
1
Fast Cygwin binaries ?
I have tried using the Cygwin rsync binaries, but found them so slow as to
be unusable. After 1-1/2 hours, it was still at 100% CPU trying to sync two
1.6GB files. (It finally finished moving one, but was still thinking about
the second.) Using scp, it takes less than an hour to move the two files.
These binaries have big sections that never change, so I would expect better
performance.
I read
2013 May 12
0
Glusterfs with Infiniband tips
Hello guys,
I was wondering if someone could share their glusterfs volume and system settings if you are running glusterfs with infiniband networking. In particular I am interested in using the glusterfs + infiniband + kvm for virtualisation. However, any other implementation would also be useful for me.
I've tried various versions of glusterfs (versions 3.2, 3.3 and 3.4beta) over the past
2008 Mar 21
1
Memory Problem
Dear all,
I am having a memory problem when analyzing a rather large data set with
nested factors in R.
The model is of the form X~A*B*(C/D/F) A,B,C,D,F being the independent
variables some of which are nested.
The problem occurs when using aov but also when using glm or lme.
In particular I get the following response,
Error: cannot allocate vector of size 1.6 Gb
R(311,0xa000d000) malloc: ***