Dear Experts,

This is more of a general stats question; I tried asking in other places but had no luck with answers (except one that suggested numerical instead of analytical optimization).

The likelihood below is a mixture of two negative binomial distributions:

P*f(N; x1, x2, E) + (1-P)*f(N; x3, x4, E)

N and E are vectors of the same length. I would like to find the first derivatives with respect to x1, x2, x3, x4 and P, so that I can set them to zero and compute a Bayesian MLE by iteration, following David Draper's "Bayesian Modeling, Inference and Prediction". Does anyone have a "quick and dirty" way, so I can avoid drowning in rather complicated math?

Here is a transcription of the per-observation likelihood in the S language:

P     * exp(lgamma(x1 + N) - lgamma(x1) - lfactorial(N) -
            x1 * log(1 + E/x2) - N * log(1 + x2/E)) +
(1-P) * exp(lgamma(x3 + N) - lgamma(x3) - lfactorial(N) -
            x3 * log(1 + E/x4) - N * log(1 + x4/E))
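For what it's worth, one possible "quick and dirty" route is sketched below (untested; the parameter and data values in the example call are placeholders): R's deriv() can build the per-observation gradient symbolically from the log-likelihood, so the calculus does not have to be done by hand. lfactorial(N) is written as lgamma(N + 1) only because lfactorial is not in deriv()'s derivatives table.

## Per-observation log-likelihood of the two-component negative binomial mixture
loglik.expr <- expression(
  log( P     * exp(lgamma(x1 + N) - lgamma(x1) - lgamma(N + 1) -
                   x1 * log(1 + E/x2) - N * log(1 + x2/E)) +
       (1-P) * exp(lgamma(x3 + N) - lgamma(x3) - lgamma(N + 1) -
                   x3 * log(1 + E/x4) - N * log(1 + x4/E)) )
)

## deriv() returns a function whose value carries a "gradient" attribute:
## one row per observation, one column per parameter.
score.fun <- deriv(loglik.expr, c("x1", "x2", "x3", "x4", "P"),
                   function.arg = c("x1", "x2", "x3", "x4", "P", "N", "E"))

## Summing the per-observation gradients gives the score vector, i.e. the
## first derivatives of the log-likelihood that would be set to zero.
out   <- score.fun(x1 = 2, x2 = 1, x3 = 5, x4 = 3, P = 0.4,
                   N = c(0, 1, 4, 2), E = c(1.0, 2.0, 1.5, 3.0))
score <- colSums(attr(out, "gradient"))

A thin wrapper that returns colSums(attr(score.fun(...), "gradient")) could equally be passed as the gr argument of optim(), should the purely numerical route suggested elsewhere turn out to be easier.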