geex provides an extensible API for estimating parameters and their covariance from a set of estimating functions (M-estimation). M-estimation theory has a long history (see the M-estimation bibliography). For an excellent introduction, see the primer by L.A. Stefanski and D.D. Boos, "The Calculus of M-estimation" (The American Statistician (2002), 56(1), 29-38).
M-estimation encompasses a broad swath of statistical estimators and ideas including:
- the empirical "sandwich" variance estimator
- generalized estimating equations (GEE)
- many maximum likelihood estimators
- robust regression
- and many more
geex can implement all of these using a user-defined estimating function. If you can specify a set of unbiased estimating equations, geex does the rest.
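As a rough illustration, here is a minimal sketch of what an estimating function and the call to m_estimate() can look like. It estimates the mean and variance of a single variable; the geexex example data set, the argument names, and the accessor functions shown follow my recollection of the package's introductory vignette, and the function name mean_var_estfun is purely illustrative. See vignette('00_geex_intro') for the authoritative version.

```r
library(geex)

# Estimating functions for the mean (theta[1]) and variance (theta[2]) of Y1.
# The outer function receives one unit's data; the inner function returns that
# unit's vector of estimating equation contributions evaluated at theta.
mean_var_estfun <- function(data){
  Y1 <- data$Y1
  function(theta){
    c(Y1 - theta[1],
      (Y1 - theta[1])^2 - theta[2])
  }
}

# geexex is an example data set shipped with geex (an assumption to verify).
results <- m_estimate(
  estFUN = mean_var_estfun,
  data   = geexex,
  root_control = setup_root_control(start = c(1, 1)))

coef(results)  # point estimates (roots of the estimating equations)
vcov(results)  # empirical sandwich covariance estimate
```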
The goals of geex are simply:
- To minimize the translational distance between a set of estimating functions and R code;
- To return numerically accurate point and covariance estimates from a set of unbiased estimating functions.
geex does not necessarily aim to be fast or precise. Such goals are left to the user to implement or confirm.
To install the current version:
devtools::install_github("bsaul/geex")
Start with the examples in the package introduction (also accessible in R by running vignette('00_geex_intro')).
Please review the contributing guidelines. If you have bug reports, feature requests, or other ideas for geex, please file an issue or contact @bsaul.
If you use geex in a project, please cite the Journal of Statistical Software paper. BibTeX entry:
@Article{,
  title = {The Calculus of M-Estimation in {R} with {geex}},
  author = {Bradley C. Saul and Michael G. Hudgens},
  journal = {Journal of Statistical Software},
  year = {2020},
  volume = {92},
  number = {2},
  pages = {1--15},
  doi = {10.18637/jss.v092.i02},
}
Need help using geex or writing your estimating function? Feel free to contact @bsaul.
You can find examples of past help requests in the geex-help repository.