I ran into a problem and had to modify the source's convergence criterion. The currently implemented criterion operates in absolute terms on the squared residuals. Real data sets with noise may have a best-fit curve with an arbitrarily large error, so convergence in those cases will never be recognized.
When the curve can be fit exactly, I've hit the inverse problem: a purely relative criterion fails to recognize convergence because the ratio of the change in error to the error itself has a lower bound. A combination of the relative and absolute approaches seems to work well in practice. Not an exact science in any case.
https://github.com/mljs/levenberg-marquardt/blob/master/src/index.js#L106
I modified the convergence criterion to use the relative change from the previous error, as described here: https://en.wikipedia.org/wiki/Non-linear_least_squares#Convergence_criteria.
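To illustrate the combined approach, here is a minimal sketch (not the library's actual code; the function and parameter names are hypothetical): the fit is considered converged when either the absolute error or the relative change in error drops below its tolerance.

```javascript
// Hedged sketch of a combined convergence test; hypothetical names.
// `error` is the current sum of squared residuals, `previousError` the
// value from the prior iteration. The absolute check handles exact fits
// (where the relative change stalls); the relative check handles noisy
// data (where the absolute error stays arbitrarily large).
function hasConverged(error, previousError, errorTolerance, relativeTolerance) {
  if (error <= errorTolerance) return true; // absolute criterion
  const relativeChange = Math.abs(previousError - error) / previousError;
  return relativeChange <= relativeTolerance; // relative criterion
}

// Noisy data: large residual error, but the fit has stalled → converged.
console.log(hasConverged(512.3, 512.300001, 1e-3, 1e-6)); // true
// Error still dropping substantially → keep iterating.
console.log(hasConverged(400.0, 512.3, 1e-3, 1e-6)); // false
```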