Hello all,

This is the error message that I get.

> hyp.res <- nls(log(y)~log(pdf.hyperb(theta,X)), data=dataModel,
+                start=list(theta=thetaE0),
+                trace=TRUE)
45.54325 :  0.1000000 1.3862944 -4.5577142 0.0005503
3.728302 :  0.0583857346 0.4757772859 -4.9156128701 0.0005563154
1.584317 :  0.0194149477 0.3444648833 -4.9365149150 0.0004105426
1.569333 :  0.0139310639 0.3824648048 -4.9024001228 0.0004089738
1.569311 :  0.0137155342 0.3888648619 -4.8979817546 0.0004137501
1.569311 :  0.0136895846 0.3893564152 -4.8976182201 0.0004141057
1.569311 :  0.0136876315 0.3894059947 -4.8975821760 0.0004141343

> hyp.res.S <- summary(hyp.res)
> hyp.res.S

Formula: log(y) ~ log(pdf.hyperb(theta, X))

Parameters:
         Estimate  Std. Error t value Pr(>|t|)
theta1  0.0136876   0.0359964   0.380    0.705
theta2  0.3894060   0.3079860   1.264    0.211
theta3 -4.8975822   0.2219928 -22.062   <2e-16 ***
theta4  0.0004141   0.0005457   0.759    0.451
---
Signif. codes:  0 `***' 0.001 `**' 0.01 `*' 0.05 `.' 0.1 ` ' 1

Residual standard error: 0.1542 on 66 degrees of freedom

Correlation of Parameter Estimates:
        theta1    theta2    theta3
theta2 -0.02168
theta3 -0.02029  0.997736
theta4 -0.97182 -0.008054 -0.008952

> pr1 <- profile(hyp.res)
1.825584 :  0.3894059947 -4.8975821760 0.0004141343
[... many more trace lines from profile()'s internal re-fits omitted ...]
1.875413 :  0.0003182105 1.8305924268 0.0006075744
1.875413 :  0.000300086 1.830592703 0.000608001
Error in prof$getProfile() : step factor 0.000488281 reduced below
`minFactor' of 0.000976563

Why is there an error on profile, which should only evaluate the
objective function for different parameter values?

Thanks a lot.

Adrian
Adrian Dragulescu asks:

> Hello all,
>
> This is the error message that I get.
>
> hyp.res <- nls(log(y)~log(pdf.hyperb(theta,X)), data=dataModel,
> + start=list(theta=thetaE0),
> + trace=TRUE)
> 45.54325 :  0.1000000 1.3862944 -4.5577142 0.0005503
> 3.728302 :  0.0583857346 0.4757772859 -4.9156128701 0.0005563154
> 1.584317 :  0.0194149477 0.3444648833 -4.9365149150 0.0004105426
> 1.569333 :  0.0139310639 0.3824648048 -4.9024001228 0.0004089738
> 1.569311 :  0.0137155342 0.3888648619 -4.8979817546 0.0004137501
> 1.569311 :  0.0136895846 0.3893564152 -4.8976182201 0.0004141057
> 1.569311 :  0.0136876315 0.3894059947 -4.8975821760 0.0004141343
>
> hyp.res.S <- summary(hyp.res)
> > hyp.res.S
>
> Formula: log(y) ~ log(pdf.hyperb(theta, X))
>
> Parameters:
>          Estimate  Std. Error t value Pr(>|t|)
> theta1  0.0136876   0.0359964   0.380    0.705
> theta2  0.3894060   0.3079860   1.264    0.211
> theta3 -4.8975822   0.2219928 -22.062   <2e-16 ***
> theta4  0.0004141   0.0005457   0.759    0.451
> ---
> Signif. codes:  0 `***' 0.001 `**' 0.01 `*' 0.05 `.' 0.1 ` ' 1
>
> Residual standard error: 0.1542 on 66 degrees of freedom
>
> Correlation of Parameter Estimates:
>         theta1    theta2    theta3
> theta2 -0.02168
> theta3 -0.02029  0.997736
> theta4 -0.97182 -0.008054 -0.008952
>
> pr1 <- profile(hyp.res)
> 1.825584 :  0.3894059947 -4.8975821760 0.0004141343
> ...
> 1.875413 :  0.000300086 1.830592703 0.000608001
> Error in prof$getProfile() : step factor 0.000488281 reduced below
> `minFactor' of 0.000976563
>
> Why is there an error on profile, which should only evaluate the
> objective function for different parameter values?

profile() doesn't merely evaluate the objective function at different
parameter values.  It holds one of the parameters constant and optimises
the objective function with respect to all the others.  This is repeated
for a sequence of values passing through the MLE, and the same is done
for each parameter in turn.

If you fix a parameter at a very unrealistic value (as can happen), it is
not surprising that the optimisation with respect to all the others runs
into trouble.  This is a computationally difficult area, and I am more
surprised by the cases that work without a hitch than by the ones that
don't.  You can expect to have to tweak things a bit in many cases.

Bill Venables
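To make the re-optimisation Bill describes concrete, here is a rough R
sketch of one "slice" of profiling, written against the objects from
Adrian's post (pdf.hyperb, dataModel, hyp.res).  The focus on theta3, the
grid width, and the nls.control() settings are illustrative assumptions,
not details from the original thread.

## Pin theta3 at a grid of values around its estimate and re-fit the
## remaining three parameters at each grid point, as profile() does.
est  <- coef(hyp.res)                        # theta1..theta4 from the full fit
se3  <- summary(hyp.res)$coefficients[3, 2]  # std. error of theta3
grid <- est[3] + seq(-3, 3, length.out = 13) * se3

rss <- sapply(grid, function(th3) {
    ## theta3 is held fixed at th3; only the other three parameters are free
    fit <- try(nls(log(y) ~ log(pdf.hyperb(c(th[1], th[2], th3, th[3]), X)),
                   data    = dataModel,
                   start   = list(th = unname(est[-3])),
                   control = nls.control(maxiter = 200, minFactor = 1e-10)),
               silent = TRUE)
    if (inherits(fit, "try-error")) NA else deviance(fit)  # residual SS
})

cbind(theta3 = grid, RSS = rss)  # NA marks grid points where the re-fit failed

Grid points where the constrained re-fit fails (NA in the sketch) are
essentially the situation in which profile() stops with the "step factor
reduced below `minFactor'" message.  Loosening the convergence settings,
as with nls.control() above, or profiling only the better-behaved
parameters via the `which` argument of profile() (see ?profile.nls), is
the sort of tweaking mentioned above.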