I have an R function that uses a shared library written in C. The memory allocation is done with calloc and free. (I tried using R_alloc; although it compiled, R would crash. Is there a FAQ with more information on R_alloc? I couldn't find an answer in the documentation.) There are no memory leaks in my C code (I have triple-checked). The code also uses the coxph function from survival5.

The function is called repeatedly as part of a simulation, and I noticed that on i386 Linux the memory usage keeps growing, so I can only run 75 iterations before I start to run out of memory. When I quit R, the memory is freed. However, when I run the same code recompiled on a sparc64, the memory growth is not nearly as bad (if it happens at all), and I can run the simulation for as many iterations as I need without this memory problem.

My questions:

1. Is this due to a difference between the Linux ports (or a 64- vs 32-bit issue)?
2. Is it due to R?
3. Would using R_alloc fix this situation?

Info:
R version 1.0.0 (IIRC) on both machines
RedHat Linux 6.1 on i386
RedHat Linux 6.2 on sparc64

Thanks for your help,
Chris
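P.S. To make question 3 concrete, here is a minimal sketch of how I understand R_alloc is supposed to be used in a .C() entry point. The function name and the doubling logic are invented for illustration; my real code is more involved:

#include <R.h>  /* declares R_alloc() via R_ext/Memory.h */

/* Toy example: compute 2*x using scratch space.  As I understand
 * it, memory obtained from R_alloc is released by R itself when
 * the .C() call returns, so it must never be handed to free() --
 * doing that is one way I could imagine getting a crash.
 */
void double_values(double *x, int *n, double *out)
{
    double *scratch;
    int i;

    /* R_alloc(nelem, eltsize) returns a char *, so cast it */
    scratch = (double *) R_alloc(*n, sizeof(double));

    for (i = 0; i < *n; i++)
        scratch[i] = 2.0 * x[i];
    for (i = 0; i < *n; i++)
        out[i] = scratch[i];

    /* note: no free(scratch) here */
}

Compiled with R CMD SHLIB and loaded with dyn.load, I would expect to call it like this:

.C("double_values", as.double(x), as.integer(length(x)), out = double(length(x)))$out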