Corak, Robert (US - Newton)
2016-Aug-05 12:28 UTC
[R] Segmentation Fault on Unix box with nloptr, works on Windows
I have an R script that produces a segmentation fault depending on the size
of the dataset. It only happens on our Unix installation of R Server; I can
run the exact same script against the exact same data on a Windows server
successfully.
The segmentation fault occurs when I call nloptr. The data I am passing in
has roughly 1,400 records. I have print_level set to 2 for the nloptr call.
Trying different record counts, I narrowed it down to the point where I would
either:
1) 1395 records: fault immediately after calling the nloptr function
2) 1394 records: start the optimization inside nloptr, then fault after 1
iteration
3) 1393 records: get the optimization to iterate around 130 times before
faulting
4) 1392 records: get the optimization to succeed
I am running the script from the command line using "Rscript myscript.R".
I have a tryCatch around the call, but the script just crashes with no
additional information and the error is never caught.
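(My understanding is that tryCatch can only intercept conditions signalled at the R level, so this is probably expected behaviour rather than a bug in my handler. A quick illustration:)

```r
# tryCatch handles conditions signalled at the R level:
res <- tryCatch(stop("R-level error"), error = function(e) "caught")
print(res)  # "caught"

# But a segfault inside compiled C code (as in the nloptr call) terminates
# the whole R process before any handler can run, so nothing is caught.
```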
I would assume that this is memory related, but there appear to be plenty of
memory resources available (at least within the JVM). I also tried calling:
options(java.parameters = "-Xmx8192m")
but it didn't seem to help.
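(For what it's worth, my understanding is that java.parameters is only read when rJava starts the JVM, so it has to be set before library(rJava) is loaded; and nloptr calls compiled C code from the NLopt library rather than Java, so the JVM heap size shouldn't affect it either way. A sketch of the ordering, under that assumption:)

```r
# Assumption: java.parameters is read when rJava initializes the JVM,
# so it must be set BEFORE library(rJava) is loaded to take effect.
options(java.parameters = "-Xmx8192m")
library(rJava)   # JVM starts here, picking up the -Xmx setting

# nloptr runs native C code (NLopt) and never touches the JVM, so the
# -Xmx setting would not influence the segfault in any case.
library(nloptr)
```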
These are the libraries I am installing:
* library(rJava)
* library(RJDBC)
* library(RCurl)
* library(stringr)
* library(nloptr)
* library(gsubfn)
Below is the setup snippet for the nloptr call:
# 4) Pick algorithm to be used
local_opts <- list(
  "algorithm" = "NLOPT_LD_MMA",
  "xtol_rel"  = 1.0e-7
)
opts <- list(
  "algorithm"   = "NLOPT_LD_AUGLAG",
  "xtol_rel"    = 1.0e-7,
  "maxeval"     = 1000,
  "print_level" = 2,
  "local_opts"  = local_opts
)
# 5) Do optimization
optRes <- nloptr(
  x0          = x0,
  eval_f      = eval_f,
  lb          = lb,
  ub          = ub,
  eval_g_ineq = eval_g_ineq,
  opts        = opts
)
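(In case it helps with diagnosis: since the fault only appears on the Unix box and scales with problem size, two things I plan to check are the per-process stack limit, which can differ between Unix and Windows, and a debugger backtrace. A sketch, assuming gdb is installed:)

```shell
# Check the per-process stack limit on the Unix box; a low value here
# (vs. effectively unlimited elsewhere) can matter for native code that
# allocates working arrays on the stack.
ulimit -s

# Run R under gdb to get a backtrace at the point of the fault
# (at the (gdb) prompt, type "run", and after the crash, "bt"):
# R -d gdb -f myscript.R
```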
Does anyone have any thoughts or ideas on what might be happening here?
Thanks
Rob