Welcome to PySpecials

PySpecials is a Python library that provides well-established numerical method implementations for selected special functions.

Its primary purpose is to benchmark new numerical method implementations for these mathematical functions, using time-tested algorithms as references to assess their accuracy and execution time.

PySpecials uses the NumPy C-API to seamlessly bridge the gap between Python and numerical method implementations written in Fortran, C, or C++. In particular, NumPy universal functions (UFuncs) allow PySpecials to perform element-wise operations on NumPy arrays in a highly optimized and efficient manner, supporting vectorization and broadcasting.
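For instance, NumPy UFuncs follow NumPy's standard broadcasting rules. The short snippet below illustrates this with NumPy's built-in np.add; the UFuncs that PySpecials registers broadcast the same way, as the usage example further below shows for ps.ibeta:

>>> import numpy as np
>>> row = np.array([1.0, 2.0, 3.0])   # shape (3,)
>>> col = np.array([[10.0], [20.0]])  # shape (2, 1)
>>> np.add(row, col)                  # result is broadcast to shape (2, 3)
array([[11., 12., 13.],
       [21., 22., 23.]])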

What are special functions?

Special functions are particular mathematical functions, about which many useful properties are known, that play a fundamental role in various scientific and industrial disciplines. They find extensive applications in physics, engineering, chemistry, computer science, and statistics, and are prized for their ability to provide closed-form solutions to complex problems in these fields.

Here are some examples of special function applications in AI:

  • The Gaussian Error Linear Unit (GELU) [2], a high-performing neural network activation function, is defined in terms of the Gauss error function (see the sketch after this list).

  • Using numerical methods for Bessel, incomplete beta, and incomplete gamma functions, we can implicitly differentiate [1] cumulative distribution functions that are expressed in terms of these special functions, and then train probabilistic models with, for instance, von Mises, gamma, and beta latent variables.
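To make the first item above concrete, here is a minimal sketch of GELU written in terms of the Gauss error function, following the exact formulation in [2]; the helper name gelu is only illustrative and is not part of PySpecials:

>>> import math
>>> def gelu(x):
...     # GELU(x) = x * Phi(x), where Phi is the standard normal CDF,
...     # expressed here via the Gauss error function erf.
...     return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))
...
>>> round(gelu(1.0), 6)
0.841345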

Why the focus on special functions?

Beyond the practical importance of special functions in scientific and industrial applications, finding accurate and efficient ways to work with them can be an enjoyable brain-teaser for those who love math and computer science.

Installation

Currently, the only way to install PySpecials is to build it from source with Meson. This requires Python 3.11 or newer.

We strongly recommend using a virtual environment to isolate PySpecials' dependencies from your system's environment.

First, clone the repository and install the build tools:

git clone https://github.com/leandrolcampos/pyspecials.git
cd pyspecials
python -m pip install -r build_requirements.txt

Then install PySpecials:

python -m pip install .

If you want to test PySpecials, you also need to install the R language on your system. PySpecials is tested with R version 4.3.1. After cloning the repository and installing the build tools and R, run the following commands to install PySpecials with its test dependencies and run the test suite:

python -m pip install ".[test]"
pytest

Example Usage

The following code snippet shows how to compute the regularized incomplete beta function for given broadcastable NumPy arrays:

>>> import numpy as np
>>> import pyspecials as ps
>>> a = np.array([0.10, 1.00, 10.0])
>>> b = np.array([[0.30], [3.00], [30.0]])
>>> x = np.array([0.01, 0.50, 0.99])
>>> ps.ibeta(a, b, x)
array([[0.49207421, 0.1877476 , 0.45864671],
       [0.72743419, 0.875     , 0.99979438],
       [0.90766467, 1.        , 1.        ]])
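PySpecials also provides ibetac (listed in the table below), the complement of ibeta. Assuming the arrays a, b, and x defined in the snippet above, the two functions should sum to one up to floating-point rounding:

>>> np.allclose(ps.ibeta(a, b, x) + ps.ibetac(a, b, x), 1.0)
True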

Some Implementations Available

Beta Functions

Function                Description                                                    Reference
ibeta(a, b, x[, out])   Regularized incomplete beta function                           ACM TOMS 708
ibetac(a, b, x[, out])  Complement of the regularized incomplete beta function         ACM TOMS 708
lbeta(a, b[, out])      Natural logarithm of the absolute value of the beta function   ACM TOMS 708
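As a rough sketch of the benchmarking workflow described at the top of this page, the snippet below compares ps.ibeta against scipy.special.betainc, SciPy's implementation of the regularized incomplete beta function, on random inputs. SciPy is not a PySpecials dependency, so this assumes it is installed separately, and the reported difference will depend on your platform:

import numpy as np
import pyspecials as ps
from scipy.special import betainc  # SciPy's regularized incomplete beta

rng = np.random.default_rng(seed=0)
a = rng.uniform(0.1, 10.0, size=1_000)
b = rng.uniform(0.1, 10.0, size=1_000)
x = rng.uniform(0.0, 1.0, size=1_000)

# Maximum absolute difference between the two implementations.
max_abs_diff = np.max(np.abs(ps.ibeta(a, b, x) - betainc(a, b, x)))
print(f"max abs diff: {max_abs_diff:.3e}")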

Contributing

We are not accepting pull requests at this time. However, you can still contribute by opening a GitHub issue to report bugs or suggest features.

References

[1] Figurnov, Mikhail, Shakir Mohamed, and Andriy Mnih. "Implicit reparameterization gradients." Advances in Neural Information Processing Systems 31 (2018).

[2] Hendrycks, Dan, and Kevin Gimpel. "Gaussian error linear units (GELUs)." arXiv preprint arXiv:1606.08415 (2016).