Dear R listers,

I have developed a C function that is called from R through the ".C" interface. After dyn.load, the function executes properly and I get the expected results. However, after executing my function, R becomes unstable and crashes (segmentation fault, then exit) whenever I do ANYTHING involving a relatively large object (creating a new one, or even just typing the name of an existing one).

I am running R 2.4.0 on a Linux machine with 1 GB of RAM. Below is an example session, so you can get an idea of what is happening:

--------------------
dyn.load("my_C_module.so")
res <- .C("my_C_function", .....)  # the call succeeds and res is fine
dyn.unload("my_C_module.so")       # I know this isn't strictly necessary

# Here R is still running, but when I execute:
m <- matrix(0, 1000, 100)          # I try to create a new object and R crashes

*** caught segfault ***
address 0x10, cause 'memory not mapped'

Traceback:
 1: matrix(0, 1000, 100)

Possible actions:
1: abort (with core dump)
2: normal R exit
3: exit R without saving workspace
4: exit R saving workspace
--------------------

Although I tell R to abort and give me a core dump, it does not succeed in producing one.

I would be grateful if anyone could tell me what in my C function might make R behave this way. Thank you very much in advance, and apologies for this long email.

Xavier Solé
There are many packages around that use the .C and .Call interfaces with really huge objects and are still perfectly stable. The error is almost certainly in your own C code, and it is most likely connected to memory allocation/deallocation or array indexing, but without having the code here no one will be able to help further.

Oleg

Sole Acha, Xavi wrote:
> [original message quoted above]

______________________________________________
R-devel at r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel

--
Dr Oleg Sklyar * EBI/EMBL, Cambridge CB10 1SD, England * +44-1223-494466
Please do as I suggested in my reply to your message to R-help! And do read the posting guide and do not send HTML mail.

On Wed, 21 Feb 2007, Sole Acha, Xavi wrote:
> [original message quoted above]

--
Brian D. Ripley, ripley at stats.ox.ac.uk
Professor of Applied Statistics, http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel: +44 1865 272861 (self)
1 South Parks Road,                   +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax: +44 1865 272595