Park, Kyong H Mr. RDECOM
2005-Jun-10 11:20 UTC
[R] Replies to the question about robustness of segmented regression
My thanks to Roger Koenker, Achim Zeileis and Vito Muggeo for their
informative answers. Listed below are the replies I received, followed by
the question I posted.
Kyong
1. Roger Koenker:
You might try rqss() in the quantreg package. It gives piecewise linear
fits for a nonparametric form of median regression, using total variation
of the derivative of the fitted function as a penalty term. A tuning
parameter (lambda) controls the number of distinct segments. More details
are available in the vignette for the quantreg package.
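For illustration, a minimal sketch of that approach in R (the data frame
d and the variables vel and z are placeholder names, not taken from the
original post, and the lambda value is arbitrary):

library(quantreg)

## Piecewise-linear median regression: the total-variation penalty on the
## derivative of the fit is requested through qss(), and lambda controls
## how many distinct linear segments the fitted function has (a larger
## lambda gives a smoother fit with fewer segments).
fit <- rqss(vel ~ qss(z, lambda = 1), tau = 0.5, data = d)
plot(fit)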
2. Achim Zeileis:
When you keep the number of break points fixed, then there is a unique
solution to the problem of fitting a segmented regression: the solution
which maximizes the likelihood (or for linear models equivalently
minimizes the RSS). Vito's segmented package gives an iterative method
which can be shown to converge to this unique solution. If empirically
you find different solutions with different starting values, you can
always compare them using the RSS or log-likelihood and choose the one
which fits better (because the other one can't be the optimal solution).
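As a rough sketch of that comparison (object and variable names are
illustrative only, not taken from the original post):

library(segmented)

## Fit the same broken-line model from two sets of starting break points
## and keep whichever solution fits better.
lm0  <- lm(vel ~ z, data = d)
fit1 <- segmented(lm0, seg.Z = ~z, psi = list(z = c(1.2, 1.5)))
fit2 <- segmented(lm0, seg.Z = ~z, psi = list(z = c(1.5, 1.7)))

c(logLik(fit1), logLik(fit2))               # prefer the larger log-likelihood
c(sum(resid(fit1)^2), sum(resid(fit2)^2))   # equivalently, the smaller RSS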
The function breakpoints() in package strucchange computes (as
opposed to approximates) the unique solution for a fully segmented model
instead of a broken line trend.
Another nonparametric solution using quantreg was already pointed out by
Roger.
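A minimal sketch of the strucchange approach (again with the placeholder
names d, vel and z, an illustrative number of breaks, and observations
assumed to be ordered by z):

library(strucchange)

## Exhaustive search for the RSS-optimal break points of a fully segmented
## model, i.e. both intercept and slope may change in every segment.
bp <- breakpoints(vel ~ z, breaks = 2, data = d)
summary(bp)   # optimal partitions and RSS/BIC for 0, 1 and 2 breaks
coef(bp)      # segment-specific intercepts and slopes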
3. Vito Muggeo:
In addition to Achim's valuable comments:
As Achim said, you can try different starting values to assess how the
final solution depends on them. Then select one having the best logLik
(or the minimum RSS).
Everybody dealing with nonlinear models knows that the logLik may not be
concave. This is particularly true for broken-line models, so different
starting values (psi0) can sometimes lead to different solutions
(segmented performs "just" an iterative estimating algorithm). This
sensitivity depends on your data: the more clear-cut the relationship,
the more stable the algorithm, i.e. the more independent of psi0 the
estimates are. Of course, here a grid search over the starting values can
fix the problem.
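One way to carry out such a grid search (a sketch only; the data frame d,
the variables vel and z, and the candidate grids are all placeholders):

library(segmented)

lm0 <- lm(vel ~ z, data = d)

## Try a small grid of starting break-point pairs and keep the fit with
## the lowest RSS (equivalently, the highest log-likelihood).
grid <- expand.grid(psi1 = seq(1.1, 1.5, by = 0.1),
                    psi2 = seq(1.5, 1.8, by = 0.1))
grid <- grid[grid$psi1 < grid$psi2, ]

fits <- lapply(seq_len(nrow(grid)), function(i) {
  ## segmented() may fail to converge from poor starting values,
  ## so wrap the call in try() and discard failures.
  try(segmented(lm0, seg.Z = ~z,
                psi = list(z = c(grid$psi1[i], grid$psi2[i]))),
      silent = TRUE)
})
fits <- Filter(function(f) !inherits(f, "try-error"), fits)
best <- fits[[which.min(sapply(fits, function(f) sum(resid(f)^2)))]]
summary(best)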
Furthermore, comparing the results with a visual inspection of a
(possibly smoothed) scatterplot can suggest different locations for psi.
Usually the visual location of changepoints in a plot is based on local
fitting, while segmented, in its standard usage, performs global fitting
(on each side of the range of the explanatory variable Z). Influential
points at the extremes of the Z range may therefore affect the slopes and
hence the breakpoint locations.
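For example, a quick local, visual check (placeholder names as before):

## Smooth the scatterplot and eyeball where the slope appears to change,
## then compare those locations with the break points estimated by the
## global segmented fit.
plot(vel ~ z, data = d)
lines(lowess(d$z, d$vel), col = "red")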
the question:
Hello, R users,
I applied the segmented regression method contributed by Muggeo and got
different slope estimates depending on the initial break points. The results
are listed below, and I'd like to know what a reasonable approach to handling
this kind of problem is. I think applying various initial break points is
certainly not an efficient approach. Are there any other methods to deal with
segmented regression? From a graph, the V shapes are clearer at break points
1.2 and 1.5 than at 1.5 and 1.7. I appreciate your help.
Result1:
Initial break points are 1.2 and 1.5. The estimated break points and slopes:
Estimated Break-Point(s):
            Est.  St.Err
Mean.Vel   1.285 0.05258
           1.652 0.01247

             Est.   St.Err.   t value  CI(95%).l CI(95%).u
slope1  0.4248705 0.3027957  1.403159 -0.1685982  1.018339
slope2  2.3281445 0.3079903  7.559149  1.7244946  2.931794
slope3  9.5425516 0.7554035 12.632390  8.0619879 11.023115
Adjusted R-squared: 0.9924.
Result2:
Initial break points are 1.5 and 1.7. The estimated break points and slopes:
Estimated Break-Point(s):
            Est.  St.Err
Mean.Vel   1.412 0.02195
           1.699 0.01001

              Est.   St.Err.   t value  CI(95%).l CI(95%).u
slope1   0.7300483 0.1381587  5.284129  0.4592623  1.000834
slope2   3.4479466 0.2442530 14.116289  2.9692194  3.926674
slope3  12.5000000 1.7783840  7.028853  9.0144314 15.985569
Adjusted R-squared: 0.995.
