similar to: Memory issues in R

Displaying 20 results from an estimated 60 matches similar to: "Memory issues in R"

2009 Apr 25
5
Out of memory issue
Hi all, I am trying to run some plots on data, but when loading the CSV data file R is stopping and I am getting an out-of-memory error. Any way to tweak this somehow to get it to run? Using WinXP with 4 GB RAM. Tnx Bruce
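
A minimal sketch of one common workaround, not the poster's actual data: the file name, column types and row count below are hypothetical. Declaring colClasses and nrows up front lets read.csv skip type guessing and over-allocation, which is often enough to fit a large file into a 4 GB Windows session.

    cls <- c("numeric", "numeric", "character")   # hypothetical column types
    dat <- read.csv("data.csv",                   # hypothetical file name
                    colClasses = cls,             # skip per-column type guessing
                    nrows = 500000,               # a slight over-estimate is fine
                    comment.char = "")            # disables comment scanning
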
2002 Dec 18
3
summary stats including NA's into new dataframe
List, I am trying to extract summary statistics from a data frame with several variables (and NAs) into a data frame with the columns: Variablename (i.e. the colnames of the original data), mean, stdev, max, min, Valid N, Missing Values. Extracting the statistics is straightforward using stack and aggregate. However, I haven't succeeded in obtaining the number of Missing Values. I can extract
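
A minimal sketch on a made-up data frame (df and its columns are illustrative): sapply over the columns yields one value per variable, including the count of NAs that the stack/aggregate route was not providing.

    df <- data.frame(a = c(1, 2, NA, 4), b = c(NA, NA, 3, 8))
    out <- data.frame(
      Variablename  = names(df),
      mean          = sapply(df, mean, na.rm = TRUE),
      stdev         = sapply(df, sd,   na.rm = TRUE),
      max           = sapply(df, max,  na.rm = TRUE),
      min           = sapply(df, min,  na.rm = TRUE),
      ValidN        = sapply(df, function(x) sum(!is.na(x))),
      MissingValues = sapply(df, function(x) sum(is.na(x)))
    )
    out   # one row per original column, Missing Values included
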
2010 Aug 19
0
2d kriging with anisotropy on an irregular network (RandomFields Package)
Dear List, I am using the RandomFields package, and I have a problem when 2d-kriging, with anisotropy, some values from an irregular network. It works well when: - 2d-kriging, without any anisotropy, data from an irregular network; - 2d-kriging, with and without anisotropy, data from a regular network; - 3d-kriging, with and without anisotropy, data from a regular network. Here is
2010 Nov 15
3
how normal is this temporary power loss for a single UPS?
Hi. Occasionally I get emails from my NUT system saying that one of my UPSes lost power for about a minute. Losing power: Date: 15. nov 2010 17.40.25 CET. Regaining power: Date: 15. nov 2010 17.41.34 CET, 69 seconds later. But why? None of the 3 other UPSes reports any power loss. What is the problem? So I made some scripts that log the input, output and frequency at the time of power loss,
2003 Sep 11
2
(structured) programming style
I find that because R functions are call by value, and because there are no pointer or reference types (a la C++), I am making fairly heavy use of lexical scoping to modify variables. E.g.

    outer <- function() {
      m <- matrix(0, 2, 2)
      inner <- function() {
        m[2,2] <<- 3
        ...
      }
    }

I am not too pleased with this, as it violates basic rules of structured programming, namely
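
A minimal sketch of one alternative (function and variable names are illustrative): passing an environment around makes the mutation explicit, since environments are the one base R object with reference semantics.

    new_state <- function() {
      e <- new.env()
      e$m <- matrix(0, 2, 2)
      e
    }
    set_cell <- function(state, i, j, value) {
      state$m[i, j] <- value   # visible to the caller: `state` is shared, not copied
      invisible(state)
    }
    s <- new_state()
    set_cell(s, 2, 2, 3)
    s$m[2, 2]   # 3
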
2003 Jul 01
1
Warning message in scatter.smooth (modreg)
Dear list, In using the scatter.smooth() function (modreg) on a small data set (100 obs) the following warning was produced:

    > scatter.smooth(Na, S)
    Warning message: k-d tree limited by memory. ncmax= 200

I haven't used scatter.smooth much but when I have, I haven't seen this message before. gc() returns

    > gc()
             used (Mb) gc trigger (Mb)
    Ncells 417693 11.2     667722
2015 Feb 23
2
Call for testing: OpenSSH 6.8
On Mon, Feb 23, 2015 at 5:27 PM, Kevin Brott <kevin.brott at gmail.com> wrote: > Just as an FYI - the whole sys/queue.h issue is impacting HP-UX 11.23 and > 11.31 as well - so we'll see how the latest fixes flush out. > > And, not to play the fool overmuch - but is there a quick howto on how > you're expecting we get the git clone pulls into a buildable state? When I
2003 Jun 02
1
'methods' and environments.
Hi, I have quite some trouble with the package methods. "Environments" in R are a convenient way to emulate pointers (and avoid copies of large objects, or of large collections of objects). So far, so good, but the package methods is becoming more (and more) problematic to work with. Up to version R-1.7.0, slots that were environments were still references to an environment, but I
2002 Apr 29
1
Garbage collection: RW1041
Have searched through the archives but have been unable to find any related issues - hopefully I'm not bringing up an old topic. Am using RW1041 on a Windows NT machine with 1 GB of memory. Have a function doit() that reads in a chunk of data using readBin, performs a regression, saves out coeffs and then returns. When using Rgui with the default memory limit of 256 MB I'm able to
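
A minimal sketch for the Windows builds of that era (memory.limit() was Windows-only and has been removed from current R): the ceiling can be queried and raised rather than accepting the 256 MB default.

    memory.limit()             # current ceiling in MB
    memory.limit(size = 512)   # request a 512 MB ceiling; equivalently, start
                               # Rgui with the command-line flag --max-mem-size=512M
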
2003 Dec 06
7
Windows Memory Issues
Hi all, I am currently building an application based on R 1.7.1 (+ compiled C/C++ code + MySQL + VB). I am building this application to work on 2 different platforms (Windows XP Professional (500 MB memory) and Windows NT 4.0 with Service Pack 6 (1 GB memory)). This is a very memory-intensive application performing sophisticated operations on "large" matrices (typically 5000x1500
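
A back-of-the-envelope sketch of why matrices of that shape hurt on a 500 MB machine: each full copy made during matrix algebra costs tens of megabytes.

    5000 * 1500 * 8 / 2^20     # one double matrix of that size: about 57 MB
    m <- matrix(0, 5000, 1500)
    object.size(m)             # roughly 60,000,000 bytes, confirming the estimate
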
2006 Oct 16
1
Problems with newhidups and Smart-UPS 750
Hello, I encountered a problem with my APC Smart-UPS 750, connected via USB, using newhidups. Without any reason, nut/upsmon shut down my PC:

    Oct 13 12:57:08 echo kernel: usb 2-9.1.1: usbfs: USBDEVFS_CONTROL failed cmd newhidups rqt 161 rq 1 len 8 ret -71
    Oct 13 12:57:12 echo upsmon[5060]: UPS smart750@localhost on battery
    Oct 13 12:57:12 echo upsmon[5060]: UPS smart750@localhost
2007 May 18
1
AIX testers needed
Per the request to test the latest tarball referenced below, I have built R on AIX 5.3. There is a memory issue, please see 3) below.
1) Build with the --enable-BLAS-shlib option. Builds and passes "make check".
2) GNU libiconv was installed; R configured *without* the --without-iconv option. Builds and passes "make check".
3) Memory issue: a)
2004 Aug 18
1
Memory Problems in R
Hello everyone - I have a couple of questions about memory management of large objects. Thanks in advance for your response. I'm running R version 1.9.1 on Solaris 8, compiled as a 32-bit app. My system has 12.0 GB of memory, with usually ~11 GB free. I checked system limits using ulimit, and there is nothing set that would limit the maximum amount of memory for a process (with the
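
A short arithmetic sketch of the likely ceiling here: for a 32-bit build the address space, not the 12 GB of physical RAM, is the binding limit.

    2^32 / 2^30      # 4 GB: the theoretical ceiling for any 32-bit process
    5e8 * 8 / 2^30   # a 500-million-element double vector alone needs ~3.7 GB
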
2012 May 02
1
DKIM Pass - Fail
Hi Dear Community Friends, for a few days now I have been trying to figure out why DKIM is sometimes working and sometimes not. Any assistance would be very much appreciated. The server IP has never been blacklisted, and the MX, PTR, SPF and DKIM records are all available in DNS. Why does it pass at Gmail but fail at Yahoo? Gmail: dkim=pass header.i=@digital-infotech.net Yahoo: domainkeys=neutral (no sig);
2003 Oct 28
1
Loading a "sub-package"
Hi Folks, The inspiration for this query is described below, but it prompts a general question: If one wants to use only one or a few functions from a library, is there a way to load only these, without loading the library, short of going into the package source and extracting what is needed (including of course any auxiliary functions and compiled code they may depend on)? What prompted this
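
A minimal sketch of one option in current R (the package and function are chosen only for illustration): the :: operator gives access to a single exported function without attaching the package to the search path, although its namespace is still loaded behind the scenes.

    fit <- MASS::rlm(stack.loss ~ ., data = stackloss)   # MASS is not attached
    summary(fit)
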
2004 Jan 14
2
R internal data types
I am trying to figure out R data types and/or storage mode. For example:

    > # From a clean workspace
    > gc()
             used (Mb) gc trigger (Mb)
    Ncells 415227 11.1     597831   16
    Vcells 103533  0.8     786432    6
    > x <- seq(0,100000,1)
    > is.integer(x)
    [1] FALSE
    > is.double(x)
    [1] TRUE
    > object.size(x)
    [1] 800036
    > gc()
             used (Mb) gc trigger (Mb)
    Ncells 415247
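
A small illustration of the distinction being asked about, assuming current R behaviour: seq() with an explicit by argument returns a double vector (8 bytes per element), while the colon operator returns an integer vector (4 bytes per element).

    x <- seq(0, 100000, 1)
    y <- 0:100000
    typeof(x)        # "double"
    typeof(y)        # "integer"
    object.size(x)   # roughly 800 kB
    object.size(y)   # roughly 400 kB
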
2008 Feb 04
2
nut with Belkin "Active Battery Backup" UPS (BU30 series)
Hi, I bought a Belkin Active Battery Backup UPS (http://www.belkin.com/uk/activebattery/) yesterday with a USB port and hoped it would work with nut. The device will show up in /proc/bus/usb/devices as a Cypress Semiconductor USB to Serial. It creates /dev/hiddev0 (and this disappears when the UPS is unplugged). I've tried using belkin, belkinunv, usbhid-ups and megatec_usb. I've tried
2011 Sep 14
2
External pointers and an apparent memory leak
I'm using external pointers and seemingly leaking memory. My determination of a memory leak is that the R process continually creeps up in memory as seen by top while the usage as reported by gc() stays flat. I have isolated the C code:

    void h5R_allocate_finalizer(SEXP eptr) {
        Rprintf("Calling the finalizer\n");
        void* vector = R_ExternalPtrAddr(eptr);
        free(vector);
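
A minimal diagnostic sketch of the gap described above (the /proc path is Linux-specific): memory obtained with malloc() in C code lives outside R's heap, so gc() cannot see it, while the process footprint reported by top does include it.

    gc()                                    # reports only R's own Ncells/Vcells
    status <- readLines("/proc/self/status")
    status[grep("^VmRSS", status)]          # resident set size of the whole process
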
2001 Jul 24
3
Memory/data -last time I promise
I've seen several posts over the past 2-3 weeks about memory issues. I've tried to carefully follow the suggestions, but remain baffled as to why I can't load data into R. I hope that in revisiting this issue I don't exasperate the list. The setting: 1 GB RAM, Linux machine; 10 Stata files of approximately 14 MB each. File contents appear at the end of this boorishly long
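
A minimal sketch (file name is hypothetical): read.dta() from the foreign package reads Stata files of that vintage, and object.size() makes visible how much a 14 MB .dta file expands once it becomes an R data frame.

    library(foreign)
    d <- read.dta("survey1.dta")   # hypothetical file name
    object.size(d)                 # typically several times the on-disk size
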