search for: r_genheap

Displaying 5 results from an estimated 5 matches for "r_genheap".

2010 Jan 07
1
Segfault in GetNewPage, memory.c.
...(node_class, s);
> (gdb)
>
> Program received signal SIGSEGV, Segmentation fault.
> GetNewPage (node_class=1) at memory.c:657
> 657         SNAP_NODE(s, base);
> (gdb)

So CLASS_GET_FREE_NODE is #defined in memory.c as:

> #define CLASS_GET_FREE_NODE(c,s) do { \
>     SEXP __n__ = R_GenHeap[c].Free; \
>     if (__n__ == R_GenHeap[c].New) { \
>         GetNewPage(c); \
>         __n__ = R_GenHeap[c].Free; \
>     } \
>     R_GenHeap[c].Free = NEXT_NODE(__n__); \
>     R_NodesInUse++; \
>     (s) = __n__; \
> } while (0)

and here we have a call to GetNewPage.

> yziquel at sel...
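For readers without the R sources at hand, here is a minimal sketch of the pattern that macro implements: each node class keeps a free list, and when the Free pointer runs out a fresh page is carved into nodes and spliced onto the list. Everything below (toy_node, toy_heap, get_new_page, get_free_node, the singly linked list) is invented scaffolding for illustration; R's real allocator uses circular doubly linked lists and several node classes.

/* Minimal sketch of a paged free-list allocator, loosely modelled on the
 * CLASS_GET_FREE_NODE / GetNewPage pattern quoted above.  All names here
 * are invented for illustration. */
#include <stdio.h>
#include <stdlib.h>

#define NODES_PER_PAGE 8

typedef struct toy_node {
    struct toy_node *next;   /* next node on the free list */
    double payload;          /* stand-in for SEXP contents */
} toy_node;

typedef struct toy_page {
    struct toy_page *next;               /* pages are chained together */
    toy_node nodes[NODES_PER_PAGE];
} toy_page;

typedef struct {
    toy_page *pages;     /* all pages allocated for this node class */
    toy_node *free;      /* head of the free list */
    long nodes_in_use;
} toy_heap;

/* Like GetNewPage: malloc a page, carve it into nodes, splice them
 * onto the free list. */
static void get_new_page(toy_heap *h)
{
    toy_page *page = malloc(sizeof *page);
    if (page == NULL) {
        fprintf(stderr, "cannot allocate page\n");
        exit(EXIT_FAILURE);
    }
    page->next = h->pages;
    h->pages = page;
    for (int i = 0; i < NODES_PER_PAGE; i++) {
        page->nodes[i].next = h->free;
        h->free = &page->nodes[i];
    }
}

/* Like CLASS_GET_FREE_NODE: take a node off the free list, refilling
 * it from a new page when it runs dry. */
static toy_node *get_free_node(toy_heap *h)
{
    if (h->free == NULL)
        get_new_page(h);
    toy_node *n = h->free;
    h->free = n->next;
    h->nodes_in_use++;
    return n;
}

int main(void)
{
    toy_heap heap = { NULL, NULL, 0 };
    for (int i = 0; i < 20; i++) {       /* forces several page allocations */
        toy_node *n = get_free_node(&heap);
        n->payload = i;
    }
    printf("nodes in use: %ld\n", heap.nodes_in_use);
    return 0;
}

In a layout like this, a crash inside the page-allocation path often indicates that the list links it walks were corrupted earlier, not that the allocation itself is at fault.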
2001 Feb 20
2
segfault
...(s); in the following section from src/main/memory.c

#ifndef EXPEL_OLD_TO_NEW
    /* scan nodes in uncollected old generations with old-to-new pointers */
    for (gen = num_old_gens_to_collect; gen < NUM_OLD_GENERATIONS; gen++)
        for (i = 0; i < NUM_NODE_CLASSES; i++)
            for (s = NEXT_NODE(R_GenHeap[i].OldToNew[gen]);
                 s != R_GenHeap[i].OldToNew[gen];
                 s = NEXT_NODE(s))
                FORWARD_CHILDREN(s);
#endif

I'm using

> version
         _
platform sparc-sun-solaris2.6
arch     sparc
os       solaris2.6
system   sparc, solaris2.6
status   Patched
major    1
minor    2.1
year     2001
month...
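The loop quoted above is a sentinel-based walk of a circular doubly linked list: R_GenHeap[i].OldToNew[gen] is the sentinel, and the iteration follows NEXT_NODE until it comes back around to it. A minimal, self-contained sketch of that traversal pattern (ring_node, ring_init, ring_insert and NEXT are invented names, not R's code, where the links live in the SEXP header):

/* Sketch of the sentinel-based circular-list walk used in the loop above.
 * All identifiers are invented for illustration. */
#include <stdio.h>

typedef struct ring_node {
    struct ring_node *next, *prev;
    int payload;
} ring_node;

#define NEXT(n) ((n)->next)

/* An empty ring is a sentinel whose links point at itself. */
static void ring_init(ring_node *sentinel)
{
    sentinel->next = sentinel->prev = sentinel;
}

/* Insert a node right after the sentinel. */
static void ring_insert(ring_node *sentinel, ring_node *n)
{
    n->next = sentinel->next;
    n->prev = sentinel;
    sentinel->next->prev = n;
    sentinel->next = n;
}

int main(void)
{
    ring_node sentinel, a = { 0 }, b = { 0 };
    a.payload = 1;
    b.payload = 2;
    ring_init(&sentinel);
    ring_insert(&sentinel, &a);
    ring_insert(&sentinel, &b);

    /* Same shape as: for (s = NEXT_NODE(OldToNew[gen]);
     *                     s != OldToNew[gen]; s = NEXT_NODE(s)) ... */
    for (ring_node *s = NEXT(&sentinel); s != &sentinel; s = NEXT(s))
        printf("visiting node with payload %d\n", s->payload);

    return 0;
}

If any next pointer in such a ring is clobbered, the walk dereferences garbage before it ever returns to the sentinel, which is consistent with a segfault at FORWARD_CHILDREN(s).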
2011 Aug 14
0
Improved version of Rprofmem
..., R_len_t);
 static void R_ReportNewPage();
-#endif
 extern SEXP framenames;
@@ -790,9 +797,7 @@
     if (page == NULL)
         mem_err_malloc((R_size_t) R_PAGE_SIZE);
     }
-#ifdef R_MEMORY_PROFILING
-    R_ReportNewPage();
-#endif
+    if (R_IsMemReporting) R_ReportNewPage();
     page->next = R_GenHeap[node_class].pages;
     R_GenHeap[node_class].pages = page;
     R_GenHeap[node_class].PageCount++;
@@ -2312,6 +2317,13 @@
     }
 }
+    if (R_IsMemReporting) {
+        if (!R_MemPagesReporting
+            || size > 0 && node_class >= NUM_SMALL_NODE_CLASSES)
+            R_Rep...
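The hunks above replace a compile-time guard (#ifdef R_MEMORY_PROFILING) with a runtime flag (R_IsMemReporting), so page and allocation reporting can be turned on without rebuilding R. A stripped-down sketch of that change, in which only the identifiers R_IsMemReporting and R_ReportNewPage come from the patch and the rest is invented scaffolding:

/* Before: reporting compiled in or out.  After: a runtime flag, as in the
 * patch above. */
#include <stdio.h>
#include <stdbool.h>

static bool R_IsMemReporting = false;   /* toggled at run time instead of build time */

static void R_ReportNewPage(void)
{
    fprintf(stderr, "new page allocated\n");
}

static void get_new_page(void)
{
    /* ... allocate and link the page ... */

    /* old style: #ifdef R_MEMORY_PROFILING / R_ReportNewPage(); / #endif
     * new style: one runtime test, no special build needed */
    if (R_IsMemReporting)
        R_ReportNewPage();
}

int main(void)
{
    get_new_page();                 /* silent: reporting off */
    R_IsMemReporting = true;
    get_new_page();                 /* now reports the page allocation */
    return 0;
}

The trade-off is an extra branch on every page allocation in exchange for not needing a profiling build; the added R_MemPagesReporting test in the last hunk appears to further restrict at run time which allocations get reported.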
2005 Feb 19
2
Memory Fragmentation in R
I have a data set of roughly 700MB which during processing grows up to 2G (I'm using a 4G Linux box). After the work is done I clean up with rm() and memory use returns to 700MB. Yet I find I cannot run the same routine again: it claims it cannot allocate memory, even though gcinfo() claims there is 1.1G left. At the start of the second time ===============================
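The symptom described here (an allocation failing even though gc reports over a gigabyte free) is the classic signature of address-space fragmentation: there is enough total free memory, but no single contiguous block large enough for the request. A toy illustration of the idea, unrelated to R's actual allocator:

/* Toy illustration of fragmentation: a 16-slot arena where half the slots
 * are free, yet no run of 4 contiguous free slots exists, so a 4-slot
 * request fails even though 8 slots are free in total. */
#include <stdio.h>
#include <stdbool.h>

#define SLOTS 16

int main(void)
{
    bool used[SLOTS];
    for (int i = 0; i < SLOTS; i++)
        used[i] = (i % 2 == 0);      /* every other slot still in use */

    int total_free = 0, longest_run = 0, run = 0;
    for (int i = 0; i < SLOTS; i++) {
        if (!used[i]) {
            total_free++;
            run++;
            if (run > longest_run)
                longest_run = run;
        } else {
            run = 0;
        }
    }

    int request = 4;
    printf("free slots: %d, largest contiguous run: %d\n",
           total_free, longest_run);
    printf("request for %d contiguous slots %s\n", request,
           longest_run >= request ? "succeeds"
                                  : "fails despite enough total free space");
    return 0;
}

In a 32-bit process the same effect shows up at the level of the process address space, which is why restarting the R session is the usual workaround.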
2005 Feb 19
2
Memory Fragmentation in R
I have a data set of roughly 700MB which during processing grows up to 2G (I'm using a 4G Linux box). After the work is done I clean up with rm() and memory use returns to 700MB. Yet I find I cannot run the same routine again: it claims it cannot allocate memory, even though gcinfo() claims there is 1.1G left. At the start of the second time ===============================