Displaying 5 results from an estimated 5 matches for "vheap_free".
2011 Oct 05 · 1 · Moderating consequences of garbage collection when in C
Index: src/main/memory.c
===================================================================
--- src/main/memory.c (revision 57169)
+++ src/main/memory.c (working copy)
@@ -2503,6 +2503,17 @@
     R_gc_internal(0);
 }
 
+void R_gc_needed(R_size_t size_needed)
+{
+    if (FORCE_GC || NO_FREE_NODES() || VHEAP_FREE() < size_needed) {
+        R_gc_internal(size_needed);
+        if (NO_FREE_NODES())
+            mem_err_cons();
+        if (VHEAP_FREE() < size_needed)
+            mem_err_heap(0);
+    }
+}
+
 static void R_gc_full(R_size_t size_needed)
 {
     num_old_gens_to_collect = NUM_OLD_GENERATIONS;
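The patch lets C-level code request a collection up front, before entering a stretch of work that must not be interrupted by GC, rather than having allocVector trigger one mid-computation. A minimal sketch of a caller, assuming the patch above is applied: R_gc_needed is not part of R's public API, alloc_reserved is a hypothetical helper, and R_size_t comes from R's internal Defn.h. Note that size_needed is compared against VHEAP_FREE(), so it is measured in vector cells (8-byte VECRECs), not bytes.

#include <Defn.h>   /* internal header: R_size_t, SEXP, allocVector */

void R_gc_needed(R_size_t size_needed);   /* added by the patch above */

/* Hypothetical helper: reserve room for a length-n double vector,
   then allocate it.  A REALSXP of length n needs roughly n Vcells
   (one 8-byte VECREC per double), the unit VHEAP_FREE() works in. */
SEXP alloc_reserved(int n)
{
    R_gc_needed((R_size_t) n);       /* collect now if n Vcells are not free */
    return allocVector(REALSXP, n);  /* its own GC check should now pass
                                        without a second collection */
}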
2010 Jan 07 · 1 · Segfault in GetNewPage, memory.c.
> (gdb)
> 1928 switch (type) {
> (gdb)
> 1978 if (length <= 0)
> (gdb)
> 1984 size = PTR2VEC(length);
> (gdb)
> 2000 if (size <= NodeClassSize[1]) {
> (gdb)
> 2017 old_R_VSize = R_VSize;
> (gdb)
> 2020 if (FORCE_GC || NO_FREE_NODES() || VHEAP_FREE() < alloc_size) {
> (gdb)
> 2017 old_R_VSize = R_VSize;
> (gdb)
> 2020 if (FORCE_GC || NO_FREE_NODES() || VHEAP_FREE() < alloc_size) {
> (gdb)
> 2028 if (size > 0) {
> (gdb)
> 2029 if (node_class < NUM_SMALL_NODE_CLASSES) {
> (gdb)
> 2030...
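The trace is stepping through allocVector in src/main/memory.c. Note that source lines 2017 and 2020 appear twice: the first collection evidently did not free enough, so the gate was re-tested before the allocation proceeded at line 2028. A self-contained toy model of that gate (illustrative only; the names mirror the trace, but the bodies are stand-ins for R_gc_internal and the real heap accounting, and the FORCE_GC / NO_FREE_NODES conditions are omitted):

#include <stdio.h>
#include <stddef.h>

static size_t vheap_free = 100;          /* free Vcells (stand-in)      */

static void gc_internal(size_t needed)   /* stand-in for R_gc_internal  */
{
    vheap_free += 50;                    /* pretend GC reclaimed space  */
    printf("gc: need %zu Vcells, %zu now free\n", needed, vheap_free);
}

static int try_alloc(size_t alloc_size)
{
    if (vheap_free < alloc_size) {       /* the line-2020 style check   */
        gc_internal(alloc_size);
        if (vheap_free < alloc_size)     /* still short after GC?       */
            return 0;                    /* -> "cannot allocate" error  */
    }
    vheap_free -= alloc_size;
    return 1;
}

int main(void)
{
    printf("alloc 120: %s\n", try_alloc(120) ? "ok" : "fail");
    printf("alloc 200: %s\n", try_alloc(200) ? "ok" : "fail");
    return 0;
}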
2006 Nov 06 · 2 · gc()$Vcells < 0 (PR#9345)
Full_Name: Don Maszle
Version: 2.3.0
OS: x86_64-unknown-linux-gnu
Submission from: (NULL) (206.86.87.3)
# On our new 32 GB x86_64 machine
R : Copyright 2006, The R Foundation for Statistical Computing
Version 2.3.0 (2006-04-24)
ISBN 3-900051-07-0
R is free software and comes with ABSOLUTELY NO WARRANTY.
You are welcome to redistribute it under certain conditions.
Type 'license()' or 'licence()' for distribution details.
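A negative Vcells report on a 32 GB machine is consistent with a signed 32-bit counter wrapping: 2^31 Vcells at 8 bytes each is 16 GB of vector heap, well within reach of that box. An illustration of the suspected wraparound (an assumption about the cause of PR#9345, not code from memory.c):

#include <stdio.h>

/* Illustration only: narrowing a Vcell count to a signed 32-bit int
 * past 2^31 cells (16 GB of vector heap) yields a negative value,
 * matching a negative gc()$Vcells on a 32 GB machine. */
int main(void)
{
    long long vcells = 3000000000LL;   /* ~22 GB worth of 8-byte Vcells */
    int reported = (int) vcells;       /* implementation-defined wrap   */
    printf("true: %lld  reported: %d\n", vcells, reported);
    return 0;
}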
2005 Feb 19 · 2 · Memory Fragmentation in R
I have a data set of roughly 700MB which during processing grows up to
2G (I'm using a 4G linux box). After the work is done I clean up (rm())
and the state is returned to 700MB. Yet I find I cannot run the same
routine again, as it claims to not be able to allocate memory even though
gcinfo() claims there is 1.1G left.
At the start of the second time
===============================
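The snippet cuts off here, but the symptom described (a large allocation failing while over a gigabyte is reported free) is the classic fragmentation pattern. A toy C demonstration (illustrative only, not R code): total free memory can be ample yet unusable for one large contiguous request, especially in a cramped address space like the 4G box described.

#include <stdio.h>
#include <stdlib.h>

/* Allocate many 1 MB blocks, then free every other one: ~512 MB is
 * free in total, but split into 1 MB holes.  On a constrained heap
 * (e.g. a 32-bit process) a single 256 MB request can then fail even
 * though far more than 256 MB is nominally free. */
#define N 1024
int main(void)
{
    char *blocks[N];
    for (int i = 0; i < N; i++)
        blocks[i] = malloc(1 << 20);        /* 1 MB each             */
    for (int i = 0; i < N; i += 2) {
        free(blocks[i]);                    /* punch 1 MB holes      */
        blocks[i] = NULL;
    }
    char *big = malloc(256u << 20);         /* one contiguous 256 MB */
    printf("256 MB after fragmentation: %s\n", big ? "ok" : "failed");
    free(big);
    for (int i = 1; i < N; i += 2)
        free(blocks[i]);
    return 0;
}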