Harold,

Obviously the bottleneck is your objective function fn(). I sped your function up by a factor of about 2.4 by using `outer' instead of sapply. I think it can be sped up much more, but I couldn't figure out how without spending a lot of time. I am sure someone on this list can come up with a cleverer way to program the objective function.

pl3 <- function(dat, Q, startVal = NULL, ...){
    ## gauss.quad.prob() below needs library(statmod)
    if(!is.null(startVal) && length(startVal) != 2 * ncol(dat)){
        stop("Length of argument startVal not equal to the number of parameters estimated")
    }
    if(is.null(startVal)){
        p <- colMeans(dat)
        startVal <- c(rep(1, ncol(dat)),          # discriminations a
                      as.vector(log((1 - p)/p)))  # difficulties b
    }
    qq  <- gauss.quad.prob(Q, dist = "normal", mu = 0, sigma = 1)
    nds <- qq$nodes
    wts <- qq$weights
    Q   <- length(qq$nodes)
    dat <- as.matrix(dat)
    nItems <- ncol(dat)
    rr1 <- numeric(nrow(dat))

    fn <- function(params){
        a <- params[1:nItems]
        b <- params[(nItems + 1):(2 * nItems)]
        for(j in 1:length(rr1)){
            rr1[j] <- sum(wts * exp(colSums(outer(dat[j, ], nds,
                function(x, y) dbinom(x, 1,
                    1/(1 + exp(-1.7 * a * (y - b))), log = TRUE)))))
        }
        -sum(log(rr1))
    }

    #opt <- optim(startVal, fn, method = "BFGS", hessian = TRUE)
    opt <- nlminb(startVal, fn, control = list(trace = 1, rel.tol = 1e-06, iter.max = 50))
    #opt <- spg(startVal, fn)   # from package BB
    opt
    #list("coefficients" = opt$par, "LogLik" = -opt$value,
    #     "Std.Error" = sqrt(diag(solve(opt$hessian))))
}

Hope this is helpful,

Ravi
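
P.S. One more thought after sending this: the j-loop inside fn can be eliminated entirely by computing the whole person-by-node likelihood matrix at once. Below is an untested sketch; it assumes nds, wts, Q, nItems, and dat are defined as in pl3 above, and it should reproduce fn's value up to floating-point error, since it computes exactly the same sums via two matrix products.

fnVec <- function(params){
    a <- params[1:nItems]
    b <- params[(nItems + 1):(2 * nItems)]
    ## item-by-node response probabilities:
    ## P[i, q] = 1/(1 + exp(-1.7 * a[i] * (nds[q] - b[i])))
    P <- plogis(1.7 * a * (matrix(nds, nItems, Q, byrow = TRUE) - b))
    ## person-by-node log-likelihoods, logL[j, q] =
    ##   sum_i dat[j, i] * log P[i, q] + (1 - dat[j, i]) * log(1 - P[i, q])
    logL <- dat %*% log(P) + (1 - dat) %*% log(1 - P)
    ## each person's marginal likelihood is a weighted sum over the nodes
    -sum(log(exp(logL) %*% wts))
}

The per-iteration cost is then just two n x Q matrix products, which R hands off to BLAS, so this should scale much better than the row-by-row outer() calls.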
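
For anyone who wants to time the two variants, here is a self-contained way to simulate data and call pl3; the sample size and item parameters are invented purely for illustration.

library(statmod)                         # for gauss.quad.prob
set.seed(123)
n <- 1000; k <- 20
theta <- rnorm(n)                        # latent abilities
aTrue <- runif(k, 0.8, 1.6)              # true discriminations
bTrue <- rnorm(k)                        # true difficulties
## n x k response probabilities under the 2PL model with the 1.7 scaling
prob <- plogis(1.7 * sweep(outer(theta, bTrue, "-"), 2, aTrue, "*"))
dat  <- matrix(rbinom(n * k, 1, prob), n, k)
fit  <- pl3(dat, Q = 21)
fit$par        # estimated c(a, b)
fit$objective  # minimized negative log-likelihood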