similar to: Out of Memory problem with an old app

Displaying 20 results from an estimated 100000 matches similar to: "Out of Memory problem with an old app"

2008 Mar 03
0
reducing RODBC odbcQuery memory use?
1. Can I avoid having RODBC use so much memory (35 times the data size or more) making a data.frame & then .rda file via sqlQuery/save? 2. If not, is there some more appropriate way from within R to pull large data sets (2-5GB) into .rda files from SQL? [R] reducing RODBC odbcQuery memory use? From: WILLIE, JILL <JILWIL_at_SAFECO.com> Date: Thu 25 Jan 2007 - 22:27:02 GMT
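As a hedged sketch (not from the thread) of one way to keep the peak footprint down with RODBC: run the query once with odbcQuery() and pull the rows back in chunks with sqlGetResults() before saving to .rda. The DSN "mydsn", the table "bigtable" and the chunk size are assumptions for illustration.

    ## Illustrative sketch: fetch a large table in chunks so only one
    ## chunk plus the accumulated result is held in memory at a time.
    library(RODBC)

    ch <- odbcConnect("mydsn")                  # hypothetical DSN
    odbcQuery(ch, "SELECT * FROM bigtable")     # run the query without fetching

    chunks <- list()
    repeat {
      part <- sqlGetResults(ch, max = 100000)   # fetch up to 100k rows at a time
      if (!is.data.frame(part) || nrow(part) == 0) break
      chunks[[length(chunks) + 1L]] <- part
      if (nrow(part) < 100000) break            # last, short chunk
    }
    bigdata <- do.call(rbind, chunks)

    save(bigdata, file = "bigdata.rda")
    odbcClose(ch)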
2007 Jan 26
0
FW: reducing RODBC odbcQuery memory use?
New to R; sorry if either of these is an inappropriate list for a question like the one below; please let me know if this is a general help question. Jill Willie, Open Seas, Safeco Insurance, jilwil at safeco.com -----Original Message----- From: WILLIE, JILL Sent: Thursday, January 25, 2007 2:27 PM To: r-help at stat.math.ethz.ch Subject: reducing RODBC odbcQuery memory use? Basic
2011 Apr 23
4
A question about memory ballooning
Hi all, How can I automatically manage memory ballooning under a KVM host (C5.6 and future C6)? For example, if I define a KVM guest to boot with 512MB of RAM and I have configured 1GB as the maximum memory for this guest, how can I allocate this memory when the guest needs it? And the opposite question: can balloon memory be deallocated? And is it possible to do this automatically or
2003 Sep 13
4
Large memory issues on 4-STABLE
Hey All I have a dual Xeon box used by our students for data crunching. It has 4GB of RAM. After initial installation everything went well until someone found they couldn't allocate more than 512MB of RAM per process. After some poking around I found some things to adjust in the kernel conf file: options MAXDSIZ="(2000*1024*1024)" options
2008 Mar 11
8
Can I define how much free memory is reported to an app?
Hi Sorry if this is a simple question, I haven't been able to find this out by myself! I'm trying to run an old application that I wrote under Visual BASIC 4.0 many years ago, however I am encountering the same problem under Wine that I find under modern XP: while trying to open the database I receive a 2004: Out of Memory error. It appears that this problem is actually due to an
2009 Feb 26
1
Controlling Amount of Memory Available
Hi, I've got a program that I'd like to run under WINE. I've used it under Windows for years but, after adding extra RAM to my PC, it insists there now isn't enough memory for it to run. Is there a way of specifying how much memory is available (e.g. 512MB, rather than the 2GB the PC has) in the hope that it will run correctly again? Many thanks, Graham
2008 Feb 14
1
Analysis with spatstat and Kcross() requires too much memory
Hi I am running an analysis with Kcross from the package spatstat and I am getting a message that R cannot allocate enough memory for a vector of 900MB. R seems to be running towards the 2GB limit per process. The dataset is not too big (ca. 3000 points) but the mask for the points is extremely irregular (a buffer around roads which have been sampled) and I can do the analysis if I use a
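A minimal hedged sketch of the kind of call involved (synthetic data, not the poster's road-buffer mask): a bivariate point pattern on a binary-mask window analysed with Kcross(). The resolution of the binary mask is one factor that can drive memory use, so a coarser mask is one way to keep the footprint down.

    library(spatstat)

    set.seed(1)
    ## irregular window approximated by a coarse 100 x 100 logical mask
    m <- matrix(runif(100 * 100) > 0.3, 100, 100)
    W <- owin(xrange = c(0, 1), yrange = c(0, 1), mask = m)

    ## 3000 random points in the window, with two types
    X <- runifpoint(3000, win = W)
    marks(X) <- factor(sample(c("case", "control"), 3000, replace = TRUE))

    K <- Kcross(X, "case", "control", correction = "border")
    plot(K)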
2006 Jun 27
1
Memory available to 32-bit R app on 64-bit machine
I want to get a 64-bit machine/OS system so I can put 16GB of RAM in it. At first I assumed that I would have to use the 64-bit version of R to make use of the 16GB of RAM, which would mean that I would use the Linux version of R. But I have heard many posters say they run the 32-bit version of R on a 64-bit machine/OS. So my questions: in 64-bit Windows, how much memory would be available to
2014 Jun 02
0
Re: [long] major problems on fs; e2fsck running out of memory
Unfortunately, there has been a huge number of bug fixes for ext4's online resize since 2.6.32 and 1.42.11. It's quite possible that you hit one of them. > The 51.8% seems very suspicious to me. A few weeks ago, I did an online > resize2fs, and the original filesystem was about 52% the size of the new > one (from 2.7TB to 5.3TB). The resize2fs didn't report any errors, and
2020 Jun 05
0
[PATCH RFC v4 00/13] virtio-mem: paravirtualized memory
On 05.06.20 10:55, Alex Shi wrote: > > > On 2020/1/9 9:48 PM, David Hildenbrand wrote: >> Ping, >> >> I'd love to get some feedback on >> >> a) The remaining MM bits from MM folks (especially, patch #6 and #8). >> b) The general virtio infrastructure (esp. uapi in patch #2) from virtio >> folks. >> >> I'm planning to send a proper
2005 Feb 23
0
Memory error in Mac OS X Aqua GUI v1.01 with cluster package functions
It's trying to allocate about 850 MB. And that's just the "object that broke the camel's back". You probably really are out of memory. You could increase swap space and cross your fingers, but probably daisy creates the 10481 x 10481 distance matrix, which would be about 800 MB since each entry is 8 bytes. It may even create multiple copies. You might try increasing RAM to 4
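A quick back-of-the-envelope check of the figure quoted above, as plain R arithmetic:

    n <- 10481
    n * n * 8 / 2^20            # ~838 MiB for a full n x n matrix of doubles
    n * (n - 1) / 2 * 8 / 2^20  # ~419 MiB for a lower-triangle "dist"-style object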
2009 Feb 18
2
Running out of memory when importing SPSS files
Hello R-help, I am trying to import a large dataset from SPSS into R. The SPSS file is in .SAV format and is about 1GB in size. I use read.spss to import the file and get an error saying that I have run out of memory. I am on a Mac OS X 10.5 system with 4GB of RAM. Monitoring the R process tells me that R runs out of memory when reaching about 3GB of RAM, so I suppose the remaining 1GB is used up
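A hedged illustration (the file name is hypothetical) of trimming the import itself: read.spss() from the foreign package can skip the data.frame conversion and the value-label expansion, both of which can add temporary copies during the import.

    library(foreign)

    dat <- read.spss("bigfile.sav",
                     to.data.frame    = FALSE,  # keep a plain list of columns
                     use.value.labels = FALSE)  # keep coded values, not factor labels
    print(object.size(dat), units = "auto")     # inspect the in-memory footprint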
2006 Aug 23
0
Some Problems with vt on new intel S5000PSL Board (chipset i5000P)
Hi, I have the job of setting up two (identical) servers with Xen, so that one or more HVM-enabled Windows guests (2003 Server) run on each of these servers. The first thing I noticed was that the complete system freezes when the hypervisor starts dom0 if I use the Xen packages that are officially available for Debian unstable. These packages are based on xen 3.0.2 (testing changeset:
2005 Sep 09
2
A question on R memory management in .Fortran() calls under Windows
Dear R community, I have a question on how R manages memory allocation in .Fortran() calls under Windows. In brief, it is apparently not possible to allocate large matrices inside a Fortran subroutine unless you pass them as arguments. If you do not act in this way, RGUI crashes with a stack overflow error, and acting on memory through vsize, nsize, ppsize and memory.limit does not help at all.
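To make the described work-around concrete, a hypothetical sketch (the subroutine name "fillmat" and the DLL are invented for illustration): the matrix is allocated on the R heap and passed to Fortran as an argument, rather than declared as a large local array inside the subroutine, where it would land on the limited Windows stack.

    n   <- 5000L
    big <- matrix(0, n, n)              # ~190 MB, allocated by R rather than Fortran

    dyn.load("fillmat.dll")             # hypothetical compiled Fortran code
    out <- .Fortran("fillmat",
                    a = as.double(big),
                    n = as.integer(n))
    result <- matrix(out$a, n, n)       # reshape the returned argument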
2010 May 05
0
[LLVMdev] Another bad binutils?
Hi Mike-M, Thanks for the help. It seems I'll have to just download the precompiled binaries since I only have 1 Gig in the entire system I'm using. --Sam ----- Original Message ---- > From: mike-m <mikem.llvm at gmail.com> > To: Samuel Crow <samuraileumas at yahoo.com> > Sent: Wed, May 5, 2010 3:36:34 PM > Subject: Re: [LLVMdev] Another bad binutils? >
2011 Nov 22
1
Recovering data from old corrupted file system
I have a multi-device file system that got corrupted ages ago (as I recall, one of the drives stopped responding, causing btrfs to panic). I am hoping to recover some of the data. For what it's worth, here is the dmesg output from trying to mount the file system on a 3.0 kernel: device label Media devid 6 transid 816153 /dev/sdq device label Media devid 7 transid 816153
2005 Jan 24
3
MySQL benchmark results on Xen and NFS
I am trying to set up MySQL on Xen. I need to set it up on NFS so I can migrate the domain when needed. I am getting unexpected results and wanted to get some feedback. Here is my setup: NFS server: dual 2.0 GHz AMD Opteron, 2GB of RAM, Broadcom GB NIC, standard Fedora Core 3 (FC3), MySQL w/ TPC-C medium databases on the NFS share, XenU filesystem on the NFS share. Xen machine
2005 Jan 31
1
[LLVMdev] Question about Global Variable
Hi, Sorry for bothering you guys again. I ran into a problem while trying to recover a global variable's initial value. What I did is the following: ConstantArray *Cstr = dyn_cast<ConstantArray>(gI->getInitializer()); // the above instruction enables me to get the content of the initial string of a global variable, like char a[10] = "test global"; And then I make some change for
2013 Jul 14
1
unmapped memory core dump with pure R program?
dear R developers---I am running a pure R program on the stock binary debian (ubuntu) 64-bit linux distribution, 3.0.1. for identification, 20abb3a1d917bce52a10dd40cb47b82b /usr/lib/R/bin/exec/R 58ebc91f143f752610c8dbe22182f3f3 /usr/lib/libR.so my R program loads 5 big matrices (about 1GB each) and rbind's them, all on a 16GB machine. alas, in one particular run, I am getting a
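As a hedged aside (the file names and block dimensions below are invented): one way to avoid the rbind() peak, where all the inputs and the bound copy coexist in memory, is to pre-allocate the stacked matrix once and copy each block into it as it is read.

    files <- sprintf("block%d.rds", 1:5)   # hypothetical, one ~1GB matrix per file
    nr <- 20000L; nc <- 6000L              # per-block dimensions, assumed known and equal

    big <- matrix(NA_real_, nr * length(files), nc)
    for (i in seq_along(files)) {
      blk <- readRDS(files[i])
      big[((i - 1L) * nr + 1L):(i * nr), ] <- blk
      rm(blk); gc()                        # free each block before reading the next
    }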
2014 Jun 02
5
Re: [long] major problems on fs; e2fsck running out of memory
Hi Bodo and Ted, Thank you both for your responses; they confirm what I thought might be the case. Knowing that, I can try to proceed with your suggestions. I do have some follow-up questions for you: On Sun, Jun 01, 2014 at 09:05:09PM -0400, Theodore Ts'o wrote: > Unfortunately, there has been a huge number of bug fixes for ext4's > online resize since 2.6.32 and 1.42.11.