similar to: [Fwd: Re: Memory leak in R v1.5.1?] - resolved

Displaying 20 results from an estimated 1000 matches similar to: "[Fwd: Re: Memory leak in R v1.5.1?] - resolved"

2002 Aug 06
2
Memory leak in R v1.5.1?
Hi, I am trying to minimize a rather complex function of 5 parameters with gafit and nlm. Besides some problems with both optimization algorithms (with respect to consistently generating similar results), I tried to run this optimization about a hundred times for two further parameters. Unfortunately, as the log below shows, during that batch process R starts to eat up all my RAM,
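A minimal sketch of the kind of batch loop described above, assuming a stand-in objective function (the poster's real 5-parameter function and the gafit call are not shown); printing gc() between runs is one way to see whether memory use keeps growing:

    # hypothetical 5-parameter objective standing in for the poster's function
    obj <- function(p) sum((p - 1:5)^2)
    for (i in 1:100) {
        fit <- nlm(obj, p = rnorm(5))     # one optimisation run
        if (i %% 10 == 0) print(gc())     # report heap usage every 10 runs
    }

If the Vcells figures reported by gc() climb steadily across iterations even though nothing is accumulated in the loop, that points at a leak rather than at the script.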
2003 May 06
2
capi + bri ?
Hello, I have some problems with my BRI/capi setup. I manage to call in to the system (console output below).
----------------
    -- Executing Dial("CAPI[contr1/16453]", "SIP/BYEXTENSION@janm|10") in new stack
    -- Called s@janm
    -- SIP/janm-63f5 is ringing
    -- SIP/janm-63f5 is ringing
    -- SIP/janm-63f5 is ringing
----------------
But I can't make outgoing calls from
2003 Oct 14
1
Outgoing CallerID
Hello, does anyone know how to set the outgoing CallerID properly when using Snom200/SIP/CAPI/BRI? The following doesn't work:
    exten => _0.,1,SetCallerID,526910
    exten => _0.,2,Dial,CAPI/526980:${EXTEN:1}
Asterisk writes:
    *CLI> -- Executing SetCallerID("SIP/226-ada0", "526910") in new stack
          -- Executing Dial("SIP/226-ada0",
2003 Jun 26
2
No busy detection
I have some problems with busy detection and SIP. When I'm making a phone call (outgoing or internal) and someone else calls me, the phone (Snom200) rings and leaves the first caller waiting in the background (no matter whether I called someone or someone called me). It doesn't hang up the first call, but the second one overrides the first. Is there anyone that has experienced the
2001 Feb 28
2
(off topic) Re: Notepad
At 21:57 28/02/01 +0100, Peter Dalgaard BSA wrote: >Jim Lemon <bitwrit at ozemail.com.au> writes: > >> 3) The usual number of responses spent a lot of time dissing NotePad and >> advertising their favorite editor. As various contributors noted, >> NotePad actually does most of the things that some people said it >> doesn't. Positive advice (like the fact
2007 Oct 28
1
tree problem
I am trying to use tree to partition a data set. The data set has 3924 observations. Partitioning seems to work for small subsets of the data, but when I use the entire data set, no partitioning occurs. The variables are: RESP respondent to a survey (0 = not a respondent, 1 = respondent) AGE_P Age (continuous) ORIGIN_I Hispanic Ethnicity (1 = Hispanic, 2 = non-Hispanic) RACRECI2 Race
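A minimal sketch, assuming the tree package and a hypothetical data frame surveydata holding the variables named above; the default stopping rule (mindev = 0.01) often refuses to split at all on large, unbalanced data, so relaxing it is the usual first experiment:

    library(tree)
    # 'surveydata' is a hypothetical data frame containing the variables from the post
    fit <- tree(factor(RESP) ~ AGE_P + factor(ORIGIN_I) + factor(RACRECI2),
                data = surveydata,
                control = tree.control(nobs = nrow(surveydata),
                                       mincut = 5, minsize = 10, mindev = 0.001))
    summary(fit)   # check whether any splits were actually made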
2013 Aug 21
2
[PATCH 1/3] Rationalise whitespace to 4 space indentation with no trailing spaces
RHSrvAny.c was using a mixture of 4 space indentation, and tabs with a width of 4. This commit rationalises the whitespace to use only 4 space indentation, and removes trailing whitespace.
---
 RHSrvAny/RHSrvAny.c | 537 ++++++++++++++++++++++++++--------------------------
 RHSrvAny/RHSrvAny.h |   1 -
 RHSrvAny/resource.h |   2 +-
 3 files changed, 269 insertions(+), 271 deletions(-)
diff --git
2005 Jul 07
2
r: LOOPING
hi all, I know that one should try to limit the amount of looping in R programs. I have supplied some code below. I am interested in seeing how the code could be rewritten if we don't use the loops. A brief overview of what is done in the code: 1. the input
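As a generic illustration of the loop-removal idiom being asked about (the poster's actual code is not reproduced here), an element-wise computation written with an explicit for-loop and its vectorised equivalent:

    x <- rnorm(1e5)
    out <- numeric(length(x))
    for (i in seq_along(x)) out[i] <- x[i]^2 + 1   # explicit loop
    out2 <- x^2 + 1                                # vectorised, no loop
    stopifnot(isTRUE(all.equal(out, out2)))        # same result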
1999 Apr 27
2
Memory management
Dear all, I don't get it: First of all, the help doesn't say what the memory limits of R are. Say, what's the max heap size, for instance? Secondly, I invoke R with the following commands each time:
    rgui --vsize 30M --nsize 1000K
    rgui --vsize 30M --nsize 2000K
    rgui --vsize 30M --nsize 3000K
    rgui --vsize 30M --nsize 4000K
I try to create an 8000x8000 matrix by issuing
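The arithmetic behind the failure is worth spelling out: an 8000 x 8000 numeric matrix alone needs far more vector heap than the 30M passed via --vsize, so raising --nsize cannot help. A quick check in R:

    cells <- 8000 * 8000     # elements in the matrix
    cells * 8 / 1024^2       # doubles are 8 bytes: about 488 Mb of vsize needed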
2013 Aug 29
5
[PATCH 1/6] Rationalise whitespace to 4 space indentation with no trailing spaces
RHSrvAny.c was using a mixture of 4 space indentation, and tabs with a width of 4. This commit rationalises the whitespace to use only 4 space indentation, and removes trailing whitespace. --- RHSrvAny/RHSrvAny.c | 537 ++++++++++++++++++++++++++-------------------------- RHSrvAny/RHSrvAny.h | 1 - RHSrvAny/resource.h | 2 +- 3 files changed, 269 insertions(+), 271 deletions(-) diff --git
2010 Jan 19
2
Server hanging despite efforts to correct memory limits
My group is working with datasets between 100 MB and 1 GB in size, using multiple logins. From the documentation, it appears that vsize is limited to 2^30-1, which tends to prove too restrictive for our use. When we drop that restriction (set vsize = NA) we end up hanging the server, which requires a restart. Is there any way to increase the memory limits on R while keeping our jobs from
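One hedged middle ground, assuming the mem.limits() interface this R version still exposes, is to raise the vector-heap ceiling to an explicit value instead of removing it entirely with NA, so a single runaway job cannot take the whole server down:

    mem.limits()                  # report the current Ncells / Vcells ceilings
    mem.limits(vsize = 2^30 - 1)  # raise the vector-heap cap to the documented maximum rather than unlimited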
2001 Aug 22
1
Huge workspace cannot be opened
Hi everyone, I have a problem that some people may have already encountered, but I did not find the solution yet. As I use R to simulate several arrays of data, my workspace is now 35 Mb big and I cannot launch R with it. An "xdr real data read error" occurred and R tells me to delete .RData or increase memory. I WON'T delete this file, and changing the max-nsize to 40600k did not
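A hedged note on which knob matters here: the 35 Mb of simulated arrays live on the vector heap, so it is the vsize limit rather than nsize (cons cells) that has to grow before the workspace can be restored. Once a session is running, the split can be inspected with:

    gc()   # the Vcells row shows how much vector heap the restored arrays occupy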
2013 Aug 07
2
Override master service settings with spaces
I'd like to override one setting for a master service in conf.d/10-master.conf. Unfortunately, said setting contains spaces, and I do not know how to escape them properly. Here's what I've tried so far. (Note: This is just the easiest/silliest test case I could come up with; not the actual setting or service I want to overwrite.) conf.d/10-master.conf: service quota-status {
2017 Nov 22
2
function pointers?
We have a project that calls for the creation of a list of many distribution objects. Distributions can be of various types, with various parameters, but we ran into some problems. I started testing on a simple list of rnorm-based objects. I was a little surprised at the RAM storage requirements; here's an example:
    N <- 10000
    closureList <- vector("list", N)
    nsize = sample(x
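A minimal sketch of the comparison the post leads into (the truncated sample() call is not reconstructed): a list of closures, each dragging its defining environment along, versus one shared function plus a table of parameters; serialising both makes the storage difference visible:

    N <- 1000
    closureList <- lapply(seq_len(N), function(i) {
        n <- sample(50:200, 1)
        function() rnorm(n)            # each closure keeps its own environment
    })
    paramTable <- data.frame(n = sample(50:200, N, replace = TRUE))
    length(serialize(closureList, NULL))   # bytes including the captured environments
    length(serialize(paramTable, NULL))    # bytes for the plain parameter table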
2005 Dec 20
1
Problems in batch mode
Dear R-users, I am trying to run some simulations in batch mode. In an older version of the program, I used
    rterm --vsize=100M --nsize=5000K --restore --save < input file > output file
however, in the new version R 2.2.0, the parameters vsize and nsize are ignored. I can use the command memory.limit to increase memory, but I am not sure if this corresponds to vsize and nsize.
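A hedged sketch of the replacement being asked about: on Windows builds of this era the overall allocation cap is controlled by memory.limit() (or the --max-mem-size startup option) rather than by the separate vsize/nsize heaps:

    memory.limit()             # current cap in Mb (Windows builds only)
    memory.limit(size = 2000)  # request roughly 2 Gb, if the OS can provide it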
2006 Mar 03
0
Memory problem
Hi list, I am analysing a large dataset using random coefficient (using nlme) and fixed effects (using the lm function) models. I have a problem with my R version 2.2.1 due to memory allocation difficulties. When I try to expand the memory I get the following error message:
    > R --min-vsize=10000000 --max-vsize=1000000000 --min-nsize=500 --max-nsize=10000000
    Error: target of assignment expands
2010 Jan 18
0
R jobs keep hanging linux server despite mem.limits modifcations
My group is working with datasets between 100 MB and 1 GB in size, using multiple logins. From the documentation, it appears that vsize is limited to 2^30-1, which tends to prove too restrictive for our use. When we drop that restriction (set vsize = NA) we end up hanging the server, which requires a restart. Is there any way to increase the memory limits on R while keeping our jobs from
2002 Oct 28
1
Nonlinear time series
Dear R People: Is there code for nonlinear time series available, please? I'm looking for something that could also provide a model for forecasts. This is for R V1.5.1 on a PC. Thank you very much in advance! Sincerely, Erin Hodgess mailto: hodgess at uhddx01.dt.uh.edu
2009 Nov 30
1
allocating vector memory > 1 GByte on Windows XP / Vista / 7
Let me begin by stating that I read all the help files and FAQs on the subject matter (there aren't more than about a dozen) but either did not find solutions or found them not to work. Here is the issue. I am trying to run a spatial regression on a medium-sized dataset. Part of the functions in the spdep package I use require me to allocate a vector of 1.1 Gb (mine is not a spatial SIG
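A back-of-the-envelope check, assuming the usual 32-bit Windows constraints: memory.limit() bounds the total, but a single 1.1 Gb block also has to fit in contiguous address space, which is often the real obstacle:

    memory.limit()            # overall cap in Mb on Windows
    memory.size(max = TRUE)   # most memory obtained from the OS so far, in Mb
    1.1 * 1024^3 / 8          # about 1.5e8 doubles have to fit in one contiguous block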
2000 Nov 09
3
maximum of nsize=20000k ??
Dear R-ers, somehow it is not possible to increase nsize to more than 20000k. When I specify e.g.
    > R --vsize=10M --nsize=21000K
the result is:
               free   total (Mb)
    Ncells    99658  350000  6.7
    Vcells  1219173 1310720 10.0
Maybe I have overlooked something... Marcus -- Marcus Eger | E-Mail: eger.m at gmx.de (NEW)
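A hedged bit of arithmetic on the request, using the ratio visible in the output above (350000 cells reported as 6.7 Mb, i.e. roughly 20 bytes per cons cell): --nsize=21000K amounts to asking for several hundred megabytes up front for the cell arena alone.

    cells <- 21000 * 1000
    cells * 20 / 1024^2   # about 400 Mb requested just for Ncells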