
Find better Poisson approximation in ML backends #267

Closed
matthewfeickert opened this issue Sep 17, 2018 · 1 comment · Fixed by #268
Labels: feat/enhancement (New feature or request), research (experimental stuff)

Comments

@matthewfeickert (Member)

Description

As I forgot to note in Issue #266, the different backends use different approximations of the Poisson distribution. The NumPy backend uses a Gamma-function-based approximation, while the TensorFlow, PyTorch, and MXNet backends use a Normal approximation.

If possible, it would be good to find an approximation better than the Normal one and make it consistent across backends. I forgot about this difference, so it won't be obvious to new users either.
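For illustration only (this is not pyhf's backend code, and the function names are hypothetical), a minimal NumPy/SciPy sketch of the two approximations:

```python
import numpy as np
from scipy.special import gammaln

def poisson_logpdf_gamma(n, lam):
    """Continuous Gamma-function extension of the Poisson pmf:
    log p(n | lam) = n * log(lam) - lam - log(Gamma(n + 1))
    """
    return n * np.log(lam) - lam - gammaln(n + 1.0)

def poisson_logpdf_normal(n, lam):
    """Normal approximation: Poisson(lam) ~ Normal(mu=lam, sigma=sqrt(lam))."""
    return -0.5 * np.log(2 * np.pi * lam) - (n - lam) ** 2 / (2 * lam)

# The two agree well for large lam but differ noticeably for small lam,
# which is where cross-backend consistency matters most.
for lam in (2.0, 50.0):
    print(lam, poisson_logpdf_gamma(lam, lam), poisson_logpdf_normal(lam, lam))
```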

@matthewfeickert added the feat/enhancement and research labels on Sep 17, 2018
@lukasheinrich (Contributor)

Many backends support pdf distributions. The only possible issue is that the Poisson is nominally a discrete pdf, but IIRC most backends already use a continuous approximation of it anyway.
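As a sketch of that idea (an illustration, not the actual backend implementation), a continuous Gamma-function-based Poisson, matching what the NumPy backend does, can be written in PyTorch with `torch.lgamma`:

```python
import torch

def poisson_logpdf(n, lam):
    # Continuous Gamma-function-based Poisson, valid for non-integer n:
    # log p(n | lam) = n * log(lam) - lam - lgamma(n + 1)
    n = torch.as_tensor(n, dtype=torch.float64)
    lam = torch.as_tensor(lam, dtype=torch.float64)
    return n * torch.log(lam) - lam - torch.lgamma(n + 1.0)

# Sanity check against the exact pmf: Poisson(5).pmf(5) is about 0.1755
print(poisson_logpdf(5.0, 5.0).exp())
```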
