From b53a5788668129ccfaeb3953c533314cbe055834 Mon Sep 17 00:00:00 2001 From: Christopher Fonnesbeck Date: Thu, 6 Jul 2017 16:48:56 -0500 Subject: [PATCH 01/12] Draft intro page --- docs/source/intro.rst | 122 ++++++++++++++++++++++++++++++++++++++++++ 1 file changed, 122 insertions(+) create mode 100644 docs/source/intro.rst diff --git a/docs/source/intro.rst b/docs/source/intro.rst new file mode 100644 index 0000000000..4ec3e353f2 --- /dev/null +++ b/docs/source/intro.rst @@ -0,0 +1,122 @@ +************ +Introduction +************ + +:Date: 2017-07-06 +:Authors: Chris Fonnesbeck +:Contact: fonnesbeck@gmail.com +:Web site: http://github.com/pymc-devs/pymc3 + + +Purpose +======= + +PyMC3 is a probabilistic programming module for Python that allows users to fit Bayesian models using a variety of numerical methods, most notably, Markov chain Monte Carlo (MCMC) and variational inference (VI). Its flexibility and extensibility make it applicable to a large suite of problems. Along with core model specification and fitting functionality, PyMC3 includes methods for summarizing output, plotting, goodness-of-fit and model diagnostics. + + + +Features +======== + +PyMC3 strives to make Bayesian modeling as simple and painless as possible, thereby allowing users to focus on their scientific problem, rather than on the methods used to solve it. Here is a partial list of its features: + +* Fits Bayesian statistical models using modern techniques, including MCMC and VI. + +* Includes a large suite of well-documented statistical distributions. + +* Uses Theano as the computational backend, allowing for fast expression evaluation, automatic gradient calculation, and GPU computing. + +* Includes a module for Gaussian process modeling. + +* Creates summaries including tables and plots. + +* Several convergence diagnostics are available. + +* Extensible: easily incorporates custom step methods and unusual probability + distributions. 
+ +* Bayesian models can be embedded in larger programs, and results can be analyzed + with the full power of Python. + + +What's new in version 3 +======================= + +The third major version of PyMC has benefitted from being re-written from scratch. Substantial improvements in the user interface and performance have resulted from this. While PyMC2 relied on Fortran extensions (via f2py) for most of the computational heavy-lifting, PyMC3 leverages Theano, a library from the Montréal Institute for Learning Algorithms (MILA), for array-based expression evaluation, to perform its computation. What this provides, above all else, is fast automatic differentiation, which is at the heart of the gradient-based sampling and optimization methods currently providing inference for probabilistic programming. + +Most notably, the PyMC3 provides: + +* New flexible object model and syntax (not backward-compatible with PyMC2). + +* Gradient-based MCMC methods, including Hamiltonian Monte Carlo (HMC), the No U-turn Sampler (NUTS), and Stein Variational Gradient Descent. + +* Variational inference methods, including automatic differentiation variational inference (ADVI) and operator variational inference (OPVI). + +* An interface for easy formula-based specification of generalized linear models (GLM). + +* New elliptical slice sampler method. + +* Specialized distributions for representing time series. + +* A library of Jupyter notebooks that provide case studies and fully developed usage examples. + +* Much more! + +While the addition of Theano adds a level of complexity to the development of PyMC, fundamentally altering how the underlying computation is performed, we have worked hard to maintain the elegant simplicity of the original PyMC model specification syntax. + +Usage Overview +============== + +First, import the PyMC3 functions and classes you will need for building your model. 
You can import the entire module via `import pymc3 as pm`, or just bring in what you need:: + + from pymc3 import Model, Normal, invlogit, Binomial, sample, traceplot + import numpy as np + +Models are defined using a context manager (`with` statement). The model is specified declaratively inside the context manager, instantiating model variables and transforming them as necessary. Here is an example of a model for a bioassay experiment:: + + with Model() as bioassay_model: + + # Prior distributions for latent variables + alpha = Normal('alpha', 0, sd=100) + beta = Normal('beta', 0, sd=100) + + # Linear combinations of parameters + theta = invlogit(alpha + beta*dose) + + # Model likelihood + deaths = Binomial('deaths', n=n, p=theta, observed=np.array([0, 1, 3, 5])) + +Save this file, then from a python shell (or another file in the same directory), call:: + + with bioassay_model: + # Draw samples + trace = sample(1000, njobs=2) + # Plot two parameters + traceplot(trace, varnames=['alpha', 'beta']) + +This example will generate 1000 posterior samples on each of two cores, preceded by 500 tuning samples (the default number). The sample is returned as arrays inside of a `MultiTrace` object, which is then passed to a plotting function. + + +History +======= +
+PyMC began development in 2003, as an effort to generalize the process of
+building Metropolis-Hastings samplers, with an aim to making Markov chain Monte
+Carlo (MCMC) more accessible to applied scientists.
+The choice to develop PyMC as a python module, rather than a standalone
+application, allowed the use of MCMC methods in a larger modeling framework. By
+2005, PyMC was reliable enough for version 1.0 to be released to the public. A
+small group of regular users, most associated with the University of Georgia,
+provided much of the feedback necessary for the refinement of PyMC to a usable
+state.
+
+In 2006, David Huard and Anand Patil joined Chris Fonnesbeck on the development
+team for PyMC 2.0. 
This iteration of the software strives for more flexibility,
+better performance and a better end-user experience than any previous version
+of PyMC. PyMC 2.2 was released in April 2012. It contained numerous bugfixes and
+optimizations, as well as a few new features, including improved output
+plotting, csv table output, improved imputation syntax, and posterior
+predictive check plots. PyMC 2.3 was released on October 31, 2013. It included
+Python 3 compatibility, improved summary plots, and some important bug fixes.
+
+In 2011, John Salvatier began thinking about implementing gradient-based MCMC samplers, and developed the `mcex` package to experiment with his ideas. The following year, John was invited by the team to re-engineer PyMC to accommodate Hamiltonian Monte Carlo sampling. This led to the adoption of Theano as the computational back end, and marked the beginning of PyMC3's development. The first alpha version of PyMC3 was released in June 2015. Over the following 2 years, the core development team grew to 12 members, and the first release, PyMC3 3.0, was launched in January 2017. From 30ee412a3b81f6631a4821239df4d27931705740 Mon Sep 17 00:00:00 2001 From: Christopher Fonnesbeck Date: Thu, 6 Jul 2017 17:25:09 -0500 Subject: [PATCH 02/12] Fixed header in intro --- docs/source/intro.rst | 7 ++----- 1 file changed, 2 insertions(+), 5 deletions(-) diff --git a/docs/source/intro.rst b/docs/source/intro.rst index 4ec3e353f2..a86550c3bf 100644 --- a/docs/source/intro.rst +++ b/docs/source/intro.rst @@ -1,12 +1,9 @@ +.. 
_intro: + ************ Introduction ************ -:Date: 2017-07-06 -:Authors: Chris Fonnesbeck -:Contact: fonnesbeck@gmail.com -:Web site: http://github.com/pymc-devs/pymc3 - Purpose ======= From 02244bc27b8bae1c8b661bc6b84d9948be958d3a Mon Sep 17 00:00:00 2001 From: Christopher Fonnesbeck Date: Thu, 6 Jul 2017 19:22:25 -0500 Subject: [PATCH 03/12] Updated text with suggested edits --- docs/source/intro.rst | 21 +++++++++++++-------- 1 file changed, 13 insertions(+), 8 deletions(-) diff --git a/docs/source/intro.rst b/docs/source/intro.rst index a86550c3bf..508a5c09ec 100644 --- a/docs/source/intro.rst +++ b/docs/source/intro.rst @@ -8,26 +8,26 @@ Introduction Purpose ======= -PyMC3 is a probabilistic programming module for Python that allows users to fit Bayesian models using a variety of numerical methods, most notably, Markov chain Monte Carlo (MCMC) and variational inference (VI). Its flexibility and extensibility make it applicable to a large suite of problems. Along with core model specification and fitting functionality, PyMC3 includes methods for summarizing output, plotting, goodness-of-fit and model diagnostics. +PyMC3 is a probabilistic programming module for Python that allows users to fit Bayesian models using a variety of numerical methods, most notably Markov chain Monte Carlo (MCMC) and variational inference (VI). Its flexibility and extensibility make it applicable to a large suite of problems. Along with core model specification and fitting functionality, PyMC3 includes functionality for summarizing output and for model diagnostics. Features ======== -PyMC3 strives to make Bayesian modeling as simple and painless as possible, thereby allowing users to focus on their scientific problem, rather than on the methods used to solve it. Here is a partial list of its features: +PyMC3 strives to make Bayesian modeling as simple and painless as possible, allowing users to focus on their scientific problem, rather than on the methods used to solve it. 
Here is a partial list of its features: -* Fits Bayesian statistical models using modern techniques, including MCMC and VI. +* Modern methods for fitting Bayesian models, including MCMC and VI. * Includes a large suite of well-documented statistical distributions. * Uses Theano as the computational backend, allowing for fast expression evaluation, automatic gradient calculation, and GPU computing. -* Includes a module for Gaussian process modeling. +* Built-in support for Gaussian process modeling. -* Creates summaries including tables and plots. +* Model summarization and plotting. -* Several convergence diagnostics are available. +* Model checking and convergence detection. * Extensible: easily incorporates custom step methods and unusual probability distributions. @@ -51,7 +51,7 @@ Most notably, the PyMC3 provides: * An interface for easy formula-based specification of generalized linear models (GLM). -* New elliptical slice sampler method. +* Elliptical slice sampling. * Specialized distributions for representing time series. @@ -71,6 +71,10 @@ First, import the PyMC3 functions and classes you will need for building your mo Models are defined using a context manager (`with` statement). The model is specified declaratively inside the context manager, instantiating model variables and transforming them as necessary. Here is an example of a model for a bioassay experiment:: + # Data + n = 5 + y = np.array([0, 1, 3, 5]) + with Model() as bioassay_model: # Prior distributions for latent variables @@ -81,11 +85,12 @@ Models are defined using a context manager (`with` statement). 
The model is spec theta = invlogit(alpha + beta*dose) # Model likelihood - deaths = Binomial('deaths', n=n, p=theta, observed=np.array([0, 1, 3, 5])) + deaths = Binomial('deaths', n=n, p=theta, observed=y) Save this file, then from a python shell (or another file in the same directory), call:: with bioassay_model: + # Draw samples trace = sample(1000, njobs=2) # Plot two parameters From 89993d6da8fc6778548994a9ec07169e430435ec Mon Sep 17 00:00:00 2001 From: Christopher Fonnesbeck Date: Thu, 6 Jul 2017 19:23:06 -0500 Subject: [PATCH 04/12] Converted tabs to spaces throughout --- docs/source/intro.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/source/intro.rst b/docs/source/intro.rst index 508a5c09ec..8100826efd 100644 --- a/docs/source/intro.rst +++ b/docs/source/intro.rst @@ -89,7 +89,7 @@ Models are defined using a context manager (`with` statement). The model is spec Save this file, then from a python shell (or another file in the same directory), call:: - with bioassay_model: + with bioassay_model: # Draw samples trace = sample(1000, njobs=2) From 1723b42e6dabdddc74d7d66e8a4fb0cda5d94be6 Mon Sep 17 00:00:00 2001 From: Christopher Fonnesbeck Date: Thu, 6 Jul 2017 19:24:57 -0500 Subject: [PATCH 05/12] Reorganized sections in intro --- docs/source/intro.rst | 51 ++++++++++++++++++++++--------------------- 1 file changed, 26 insertions(+), 25 deletions(-) diff --git a/docs/source/intro.rst b/docs/source/intro.rst index 8100826efd..7e22e46834 100644 --- a/docs/source/intro.rst +++ b/docs/source/intro.rst @@ -61,6 +61,32 @@ Most notably, the PyMC3 provides: While the addition of Theano adds a level of complexity to the development of PyMC, fundamentally altering how the underlying computation is performed, we have worked hard to maintain the elegant simplicity of the original PyMC model specification syntax. 
+
+History
+=======
+
+PyMC began development in 2003, as an effort to generalize the process of
+building Metropolis-Hastings samplers, with an aim to making Markov chain Monte
+Carlo (MCMC) more accessible to applied scientists.
+The choice to develop PyMC as a python module, rather than a standalone
+application, allowed the use of MCMC methods in a larger modeling framework. By
+2005, PyMC was reliable enough for version 1.0 to be released to the public. A
+small group of regular users, most associated with the University of Georgia,
+provided much of the feedback necessary for the refinement of PyMC to a usable
+state.
+
+In 2006, David Huard and Anand Patil joined Chris Fonnesbeck on the development
+team for PyMC 2.0. This iteration of the software strives for more flexibility,
+better performance and a better end-user experience than any previous version
+of PyMC. PyMC 2.2 was released in April 2012. It contained numerous bugfixes and
+optimizations, as well as a few new features, including improved output
+plotting, csv table output, improved imputation syntax, and posterior
+predictive check plots. PyMC 2.3 was released on October 31, 2013. It included
+Python 3 compatibility, improved summary plots, and some important bug fixes.
+
+In 2011, John Salvatier began thinking about implementing gradient-based MCMC samplers, and developed the `mcex` package to experiment with his ideas. The following year, John was invited by the team to re-engineer PyMC to accommodate Hamiltonian Monte Carlo sampling. This led to the adoption of Theano as the computational back end, and marked the beginning of PyMC3's development. The first alpha version of PyMC3 was released in June 2015. Over the following 2 years, the core development team grew to 12 members, and the first release, PyMC3 3.0, was launched in January 2017. 
+ + Usage Overview ============== @@ -97,28 +123,3 @@ Save this file, then from a python shell (or another file in the same directory) traceplot(trace, varnames=['alpha', 'beta']) This example will generate 1000 posterior samples on each of two cores, preceded by 500 tuning samples (the default number). The sample is returned as arrays inside of a `MultiTrace` object, which is then passed to a plotting function. -
-
-History
-=======
-
-PyMC began development in 2003, as an effort to generalize the process of
-building Metropolis-Hastings samplers, with an aim to making Markov chain Monte
-Carlo (MCMC) more accessible to applied scientists.
-The choice to develop PyMC as a python module, rather than a standalone
-application, allowed the use of MCMC methods in a larger modeling framework. By
-2005, PyMC was reliable enough for version 1.0 to be released to the public. A
-small group of regular users, most associated with the University of Georgia,
-provided much of the feedback necessary for the refinement of PyMC to a usable
-state.
-
-In 2006, David Huard and Anand Patil joined Chris Fonnesbeck on the development
-team for PyMC 2.0. This iteration of the software strives for more flexibility,
-better performance and a better end-user experience than any previous version
-of PyMC. PyMC 2.2 was released in April 2012. It contained numerous bugfixes and
-optimizations, as well as a few new features, including improved output
-plotting, csv table output, improved imputation syntax, and posterior
-predictive check plots. PyMC 2.3 was released on October 31, 2013. It included
-Python 3 compatibility, improved summary plots, and some important bug fixes.
-
-In 2011, John Salvatier began thinking about implementing gradient-based MCMC samplers, and developed the `mcex` package to experiment with his ideas. The following year, John was invited by the team to re-engineer PyMC to accommodate Hamiltonian Monte Carlo sampling. 
This led to the adoption of Theano as the computational back end, and marked the beginning of PyMC3's development. The first alpha version of PyMC3 was released in June 2015. Over the following 2 years, the core development team grew to 12 members, and the first release, PyMC3 3.0, was launched in January 2017. From 5dea60273018087e1307973bce9b330bed6783c5 Mon Sep 17 00:00:00 2001 From: Christopher Fonnesbeck Date: Thu, 6 Jul 2017 19:26:41 -0500 Subject: [PATCH 06/12] Minor edits to intro --- docs/source/intro.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/source/intro.rst b/docs/source/intro.rst index 7e22e46834..51fec419ab 100644 --- a/docs/source/intro.rst +++ b/docs/source/intro.rst @@ -41,7 +41,7 @@ What's new in version 3 The third major version of PyMC has benefitted from being re-written from scratch. Substantial improvements in the user interface and performance have resulted from this. While PyMC2 relied on Fortran extensions (via f2py) for most of the computational heavy-lifting, PyMC3 leverages Theano, a library from the Montréal Institute for Learning Algorithms (MILA), for array-based expression evaluation, to perform its computation. What this provides, above all else, is fast automatic differentiation, which is at the heart of the gradient-based sampling and optimization methods currently providing inference for probabilistic programming. -Most notably, the PyMC3 provides: +Major changes from previous versions: * New flexible object model and syntax (not backward-compatible with PyMC2). 
From e4fd502779c9fea2aa1bb625775cb27c66552e05 Mon Sep 17 00:00:00 2001 From: Christopher Fonnesbeck Date: Thu, 6 Jul 2017 19:33:09 -0500 Subject: [PATCH 07/12] Fixed some bad rst --- docs/source/intro.rst | 10 ++++++---- 1 file changed, 6 insertions(+), 4 deletions(-) diff --git a/docs/source/intro.rst b/docs/source/intro.rst index 51fec419ab..10a97b8a78 100644 --- a/docs/source/intro.rst +++ b/docs/source/intro.rst @@ -84,18 +84,20 @@ plotting, csv table output, improved imputation syntax, and posterior predictive check plots. PyMC 2.3 was released on October 31, 2013. It included Python 3 compatibility, improved summary plots, and some important bug fixes. -In 2011, John Salvatier began thinking about implementing gradient-based MCMC samplers, and developed the `mcex` package to experiment with his ideas. The following year, John was invited by the team to re-engineer PyMC to accommodate Hamiltonian Monte Carlo sampling. This led to the adoption of Theano as the computational back end, and marked the beginning of PyMC3's development. The first alpha version of PyMC3 was released in June 2015. Over the following 2 years, the core development team grew to 12 members, and the first release, PyMC3 3.0, was launched in January 2017. +In 2011, John Salvatier began thinking about implementing gradient-based MCMC samplers, and developed the ``mcex`` package to experiment with his ideas. The following year, John was invited by the team to re-engineer PyMC to accommodate Hamiltonian Monte Carlo sampling. This led to the adoption of Theano as the computational back end, and marked the beginning of PyMC3's development. The first alpha version of PyMC3 was released in June 2015. Over the following 2 years, the core development team grew to 12 members, and the first release, PyMC3 3.0, was launched in January 2017. Usage Overview ============== -First, import the PyMC3 functions and classes you will need for building your model. 
You can import the entire module via `import pymc3 as pm`, or just bring in what you need:: +For a detailed overview of building models in PyMC3, please read the appropriate sections in the rest of the documentation. For a flavor of what PyMC3 models look like, here is a quick example. + +First, import the PyMC3 functions and classes you will need for building your model. You can import the entire module via ``import pymc3 as pm``, or just bring in what you need:: from pymc3 import Model, Normal, invlogit, Binomial, sample, traceplot import numpy as np -Models are defined using a context manager (`with` statement). The model is specified declaratively inside the context manager, instantiating model variables and transforming them as necessary. Here is an example of a model for a bioassay experiment:: +Models are defined using a context manager (``with`` statement). The model is specified declaratively inside the context manager, instantiating model variables and transforming them as necessary. Here is an example of a model for a bioassay experiment:: # Data n = 5 @@ -122,4 +124,4 @@ Save this file, then from a python shell (or another file in the same directory) # Plot two parameters traceplot(trace, varnames=['alpha', 'beta']) -This example will generate 1000 posterior samples on each of two cores, preceded by 500 tuning samples (the default number). The sample is returned as arrays inside of a `MultiTrace` object, which is then passed to a plotting function. +This example will generate 1000 posterior samples on each of two cores, preceded by 500 tuning samples (the default number). The sample is returned as arrays inside of a ``MultiTrace`` object, which is then passed to a plotting function. 
From a44fa9780c49ae5d30d742b336060e23e57d781f Mon Sep 17 00:00:00 2001 From: Christopher Fonnesbeck Date: Thu, 6 Jul 2017 19:35:23 -0500 Subject: [PATCH 08/12] Added all data to example --- docs/source/intro.rst | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/docs/source/intro.rst b/docs/source/intro.rst index 10a97b8a78..13ebe28b77 100644 --- a/docs/source/intro.rst +++ b/docs/source/intro.rst @@ -100,8 +100,9 @@ First, import the PyMC3 functions and classes you will need for building your mo Models are defined using a context manager (``with`` statement). The model is specified declaratively inside the context manager, instantiating model variables and transforming them as necessary. Here is an example of a model for a bioassay experiment:: # Data - n = 5 + n = np.ones(4)*5 y = np.array([0, 1, 3, 5]) + dose = np.array([-.86,-.3,-.05,.73]) with Model() as bioassay_model: From 69160db26b5f23de8cd63a302df543db78527139 Mon Sep 17 00:00:00 2001 From: Christopher Fonnesbeck Date: Thu, 6 Jul 2017 21:12:35 -0500 Subject: [PATCH 09/12] Added image with example in intro --- docs/source/images/forestplot.png | Bin 0 -> 13433 bytes docs/source/intro.rst | 25 +++++++++++++++++++++---- 2 files changed, 21 insertions(+), 4 deletions(-) create mode 100644 docs/source/images/forestplot.png diff --git a/docs/source/images/forestplot.png b/docs/source/images/forestplot.png new file mode 100644 index 0000000000000000000000000000000000000000..760a7e8ab2fdfb221c5c46fdd9405899474deddc
Binary files /dev/null and b/docs/source/images/forestplot.png differ
z&G8QmF@i()Jf=N{o<8!Sw5qXvPkk)mc5E*-8S_{_?UbSFW@dS?e!N^{72P z#*ASfp5$5OJjSe%Sh2m7-LcJE+f(LJ2s16X^I36dqal=CEJWMcgiCdAm=DS2fk|u~Sy#Pn}*=#QV_OJMHoVLP}v*$A>9A z%*-;?K8Hf+mZtl9*6+N3|DJck{4T*6U0xP?`l~$!=53p63mu8*BV}_nRO3B`*|53f zR8(fnU9h&YO$@llq*T=9i-{%~IqM|Y@cD81^ zbM&fB%=jEfThjM|<+H8$tU-Y9ckO3T|sPTsiXj32dtDj6A>GYraQ zJ1Xx6$MnSC{{>y9PKCBvXGV)3&zF>BnF!OB=|1P}wULf=HC6*d!!+3_{`I~YcdW}y znVOu)RvvsMd3Dth{nCAq^WZ@2{rmWXGc%ULo3qu#T@KT|p9SHhI)}c0zkSJSpfWIP zB&%eaV`oWc#~@P)^Wo!1t{RHKprG4uWXeg-)5Q}kM~)pcbl;q-Jz`N`Uw?_7p1y`c zUn9?0kH&pf&!(QaF!cTVYk`rGcYN6mo1%pbS7*N_Tl5ttLl_xWdg7fE5a`j6G%-mH z*bQm0q!HK9!+JH6L9>O)w0w2E!nA!N_7v$wb|eB^;iX}e5uolUK3 zGRrPA^kpBeyhi{RkNq4!f48`}c<7fhmk8#Y1?mA@ZtJ#fx%!d|^YyF|aC;~NYW1}p zzwT{=2vJg1RgKHq-r9%|*`5o4ciK<)nh+kO(52)uyaD>Bg)&4t z##iZcnRPbo*}FF^G?c=oNoXZ2MTC2T+%u-}$^VNqBAJ}KgH?mAV<;3kLCFSPh zcB?^fvN(t(6WnofGKcFr(<)EQlB0ujOzDxG)V9*%cOAg;rm$oc-Vm=WSGI z%_6?%pk(li7ik$8lU}T(KHlD@kVMF{>fgVYPmu6Cdi*%HWV%>-V`Zk$V&YS7f@-El z(pgFW(Lgz2h3nUS;}a5=YN#yF)6t1P*mp?SoOx(y$ay5nfz!18JeNu9cdwr6>T2@C zvoV(!_-uw#^lSaQ^GpJ1TxTx_2L+wKZ~+a`sqy-567t&FS*xXqF1dH_-eD5k+m(}& zlEz>eFq`y6n71h$9yW7ou9jA<)}K1xX~D41V`qy}#Jv>Rys?)msR?N`jsti}$;qbG z-Xz3CM9s82tCHx}M`UL?o#t=T)6p5vd@1WLambeNr8td-i(Yis#s+TXmNXTQh4M&a zR0md$c58hiyVDthI(6lXiyY*eR(L(9>#EfjobCGNLQ|#z=}F#LIpH;N*d7es(8$O{ zw^??_{7^kICu9Lz8!KJWLN1J%TYV1w9Y^E@bF(6rk zlKW1I;m2?|MLpsJ2ii7fDzf3Ksjh9Ccjx7@IE*%5e*XOV`gF-m#>5pdu}ph3`tkBi ze|CX+PZ}g^*vYonJrq1P#rufq&YcTS&nm9lTpbKZ&dY0Wh!N>uU0VyYpX$C*U%Iv2 zGaR$CiCu%mef;+Bsf?>@=~#Du`UPfYxSL9vns*=xnrRf=F<~j0Wc&HJo)en>%-tRl;>vam*k)D-mjz^}41w zV^9K>u2^dsWx8iJG(dR${WT6r|C8)Bn?`gqj!bGIrT&6bs*;va0ekI+l8Qc@ZZN)&`)Glas??55m5E`xf|xRL_#WI-J3ylJD>JAVD4N!bD z>-X;6JN#xJEvHNbHy2F+_DZ)^k9pyZdUkxrVG1aG!%Z=)`o6dH^^p>Tjd9oD>9Y;o z+}g@Cuqd6AAfaFr4bi;&n2>}r@sP--{6trFVfiE#1x3n8w2LexB#jF9GNcB`gma=H z_OhXz#nVf_7){SER}7#LvYjbgQPuES+u2V8V@uiT*nc)4XYp9-Q0snegFSQR%uyPe zEVx1)**ZE8A3p5PHwz)9=1(khSxIwQneKGM$H$jdmoxb1$Pu}s$lzd=v6tBNYJd2O z`V?dtw4#N@G&*EwJUu%oCMFvusUAQLFPxm5Bo63{zj}b{&ZnFzH%ULrB;;OPy?Pas 
z$kXqt^ka#e{^c@o56tRTt0O3TP`I7u!uFm%DA4jyA_3_`z=O-ElK0kabN>A#{GUBHHZi?43=|PKz=2?yRmX z4~7k%``icFaD8TH+p^E9nnWW<_nz?9q7tOw>m-z%@=i`pOXiVJ+4k@8H)~7wQ_WC6 zo!ylVWt{2UIf=6pzVhj+nX=9?otZb8`1v&i9Hy0HgxzF|wxBv7kV@NWh#V@0N|&}v zCgf$?Y7(B&&qgtswoxQh+{#c^WxI3NGa&$pLlzbmI+CQ1m#qy)1ZQo8rrEQ}joh9{ zHKkUu-u&*XAK;M2)4#dt`tI#pW&Q&*s~;{^xJ}cgdhBd7va)tb(z+#PX_c4`)diE2 zj&T^&9Yq#kGu)s6xPqLLGEJ}AJBqW@qBA3`z_LHfp#Fu)+WZiufPDsy$EGzvg8OKO zO1^J1GuiMaWjNnXSU@`D4jlw4Sd_0BnuVpwlJb3l_)dqK3<*)A3qaxcz?%o;F9g(P zuomL`4s%JUtB1leGiKaBMhk5u)^E>NAL_2~sNiC>E3K-!;zz|JYgxAR0^(OiRn>6I z=!T+V{L7awrMkvOMzC;cjkh<~!iC+|_xBSlI+qcXl4i{cRSw!s_ukl9trclcQ8;^5 zLPE9pIur@maU*K(I7mzD(Y>h|Hw&k|>Zf`NMF#mBAI8SU9*M4Rj1n;GFL#5ao#Q6o z+1xBoa)LwLvV1KWs#z+4GrRHj55;av;~kVj&Us62jhmaBZ8TdVoSe!8ga?~F@d!9A zdU^9L`?YTtS_n-oki?EVFO4SwW*7q;+wQghh;7T&<74qpS@_fK);Bg{tUhHLMLP|9 ztwieYT;DFxf7Ess-+NGa?fVhHh04ffEVvWqyfU2$WB1wjm}C6io8=%jCOR|Qru#~Q zqocF#5$sDd?yT=fCSNs;KViz^HDSJF)TsQ* z@MC+Ki$mG=`b1qrL%TnLtT3RT1tzuQPG*c%h?-<;^J^ICE zrUPDnYRs6E!>scn^Np9{{jLjskgjhg@nivBGh3egyuRFHxzuaZWb(D@Ar!>T6J6FU zc>soC9o;Vtp1Q9M9VaIzPqpkXm%4tvJ@)d0MZlm4=F@m(=m$0P9wW845uDiiXhHxo zB~^;c&k_%?mNtiT&O-%8P%HMN-*q4qOxdOJu;?)etu`vl(l|IN!1>=Y3aLe}Dq&?i z04yfv<>lGDEQgA%y)e?$1$(~;D;z86v1n_P6Ul_kqO3j*KMAg=sGxC~Wd9h!BR5mJ z(0HLD`TkBU0Z}br;P;eKua`P++6@J(jl)CvxA;9gJlwXHx+vLnJt4DU+4XBuq(fK$ zIT$?D$CGM>ikJteMrg5FtkVvvf16#G78YvoI838cNlwVFRb#v(9cwqyIW`<&*?BfV zFdi^&s!>z4^Tt%+5){pJ2o{DiOF)c^?~d!FI4ulQa@>COZE%pwuz><@kH%T+KnA%^ z=EB~1xKs4!&M}@j6EAfsh(Rto1M))(tls(ib4gA>4pIUYDw%_t6#TX+>q`?@K#A#a zrICS#<{TFRz)Uz)Y(!a=&gdb1a+H8SI|P z+z!*4aL#feI_|I6S7x#k6BCvEze55Z8w%0sBv;dngB)%Q8Df3l!6DgI6rRI|46q*>@;R|O{AZtDAgB+fCyM;@_DKpvJ2!9(#jK~Zwu=G1lde)D1L z6IQ?zet{KaV@1W<^8*>lj~<_O9k_H*cDh#B?bPVPR=*P9n=j52%;;8FIbWP0ir0%w zw4D(7LLZ)Bf2nr19c4J=)d{+Gm{z$L`V_*`2iyALmyoBRM6sol0I7fa$nyuOC!#=D z|07TH^UTaD&d$Yjsd6#G#6$yQ1n{&E(I-MfJOY$ww}FHGzNm=@jX10r`#F*f_pbAQK(pMT_O+~ehfA1CLn)b|B)Oow3JxA0gc0~uaVkU*Iek&vj!%KCS}0%otCB!mxSZaASM zmIPpo)U|8!SFYSACMJ$#*#|r3J4-D}<;xCfDGxVuGErDSh~A!N?`O{rXWn>KV+zBq 
zoZlI)Bw6&w{n0DAP=+8O{`o1!!Di;dm+ z8RGRE27t8pz%g}m_^}QU5JZR(uY|vV=+f*B%0N zQG_S_FGtS8(%BGpi5}gSEOQpw0pZ7g9yq)cgsbMakB?;jdf;btVIm=?YxCj~6U7&X zKL*m#GcYi*u{D2v0>79%`|qKj!W|PV_x~JxCl6A>HqyZ(p?hJ~rl2VN7_2()O;eMc zkn>{BuYp5R1Hz8g3HWwheLdhQdNjZ*hCdf``r?mQ;1|3ZoIB6W-HAJ?-538mML690 z`ars?;^K!1;0$BL_uL_P1tB47E{fa#T_Fq?vb&r7QwVD&1Rw>R94dO<#1g+2DFoDs zue_ez)B-<*v+P@Yeh6$7czLIxCIcH)rGl>VX@=-Gv$?GSyeoRcyYszO-IVyB#|^YTS`|Q%z0FE|}ZATUjQ;$>JzE`31P9AO=UXx;OrffjX*>A2te6n#RV)<`)*)0Dr1zYrD4!KCd{P#RPQ zhohrfTNTUZ>zJGYEGTPy&ecpC_;hg~^JY?2RYHXi>CjaYRLY~K1Ikz_hI^a9-%Z<7 zo)=j3rOnRH5(kO`?`tn{uq*?nN6CK6i*Bq-$3q2D1}GxQR2D^-O-xLRrwV%VY(`}B zP1+pR1~t;;G|b`2lP9e=Ez6PsX_Qm)tbLbtxL%0}By~*L&*(BW@-=`2 zGh9@Ps>KURTT%Bs397QGzdv<={ks27(6XaHjnJ+7VjU;fS^QupOcd(th<^y>h>Z?0#p$T%kxo>E$>31UVr%)vaf0mSL4 z%F4!BOVBTM@#aglX(s_CE0$L;Aw8Lbk$fA#*%kqmR#&)0*&kb;ssb#{xT<0az zttm^7w)`$_CUo704=t?NalWX1cH@y-MCyjasg@$2!$gl=9MEk>dG3S?m(4dl9czoq zI+%Y?2gn<82jE&sBV<{4WAf63YawLMzrp^QfSY_=VYw7Hjk1--T zH*VZm2ib=UVn(V$Ni7i6FtZxp6X5_qPXs96{Wsdbj{qlf_~Il9LQ*?~ZpPyy4C4Uw zM5d^XK87dF4c4XuTJ0{hj1gWPAb|P{W9R_gq~^5}g;Q)rP$iTw^0TL1!>;Z)M)$#NWc_-Kv7KMqen!ybm9Af|AB~~VCWQrV|J%~))Vz> z*Qa}6$(j7biBu?iPG#EIdRQm#*^~R4ll5y6oHeVm3yRNra(j7uGcqvL;>s5(QeJ2Ev6zCwb45kPAPfd7#hKl8idS;89e`2W34eJDF|nelX&MexidVwpa7++D z$XdCPn=#xFhDkJOOD6j}z<_W47KghBjtiY<`rYy%FaYgBV<6X)Lqo&rDh#pwKOizJ zcYAvq>PG0_XA<<{1tZ*POgo6Vq=Y~RT>pVb8jujr$J-rDuX_M4ERB%!ZwU0DKI}Gt z+Z+-|>veT?7`k`0wMYI~Oy%KYxR~jIPS#Oq`TYF?)KI`X&=_DqkN%(vyQpy&TH!Wu zcawiMCXF9-O{m%n!Iw>0APyM4Osm6}~qj%|x>OKs5QDo#v90uk+ zh}`F&@eGI_@O{v;XR*-kf=a-<3)=)dxheF4zM;ZeK9$4})?$CDU*MMJuWSRrVF8&5UKz87N z9yq-7@ZrO;kd^c&=+LnM5X{1YNW1g~yvn1NdVzsy8K zBW-cf7Oe=nA}<9eBnk9?7iss#c$++kiHUJu9M$_ZaM*l*CbXcC5GE!jY0ObtTDd;IT+90fq@}*`Tfo zyIq8CjWXxAKPd{Kp!4p(RAp?Oa`*0CTvqwU2ITDDgkXy@KY#u_!Z82&=O1Nx`2kx# z)BuCWDhhS<5aIQ4s^g za`$S$@gm8ZhK2s-Ue$O~xbZcwtrZv5r&p{j|BdY+SwH2(GQck2&dG6^L1a30HO zx(fiN?%lgbiZCBt+CS45vRqJqK~uW%_n81wXl-kwN4u>reg!o2Tj|^#>_jq&ySw{% 
zOM=g@K~&B5Gr7rYYiDI-Ttq9nx(fcen1gyh^CUPHa&vS4xC{dX(4^eM>_rOg^jY6#M5GtS)?G{AXP5V2=DmOYyGZ+K*4H1 z@<>-~0vQ{r`HjBqIGf%dH6HWhiOj+`ig$BKm8Hg}qeMYI2=B$~%|nm3z8hmi4YYGx z1o1oA7Zalz@U!T<3q{szo+MDi&?~PF|ieV*42H%uEN0CA2#5fPcB;4b)OQ^M~gpW^G+uRC5o zACbac(xAY=1kh@+^_Q6)?kvxAW-97^ZRr>Dna4vF?VZ6(k;O*sX@%y$d2f+|evNO7 zmP20>sG6hx0@KaF2~&G4E3^vx-Almc0IlhZ(AgD#dYrWrdWuHoeZ?vWLs)1M$*$NO zV(l(>E1|m}D2T|5&d{AAKXN1iRCGjNY)wQXY^j#k!&RHRH^jO#Kq;NM7G=9Ix5L{Q zGxPNz?TB`*zodX$t%sD0pk%4jc+_?lM0WI}HkP+HX1b78X_9nE z8YsHmuZ#l1IZbZ$VL{-AU5tQRn3|WzDHFCnAV+L=FRuvf%LWG+><28oxdT-O?Oa zfJ&5{##VK97O+p33>%u&blj2%_K_mA2$N^jM56_L2K&OnQY8Dqi1cXa@~Kh#Z1)Xi z&|+(dVrKwjuUL5p*ko&!)JdWSzeT3+&#%1^Ma&{1q64s|yU02L#3ZI(`!oeX&|Gj} zZw!Q6Wl**AVFU#G6&0=CkxEnD1(4YG)M}4z?EH?$nHYzcmpm`3S6+_A<3SkEiYJZ% zT@F5X57%}<$TPp)cp{KPI1!|QLGDh;V{s7;P(cys&ch84a6TY?Am~dcfr6gqwz16R zj!$G&{%_ExwG!!qx(IYIQDkdO4r%B^FXaSe#x$)Gd$2VGS(GiQB7X%@e0^;s25Fao z)G%5+k)>4sf;ku(j8G=+CO;`aUq=BXHb5td(2tP-3WWtBBKe20r!3Qm8N>;&spNKx z+EHH5t3oT9 zko05<^yJe4SYi@^#A+f#K>t4uxQi*YPE5f-(c0RIG!Kxo3+FO>0Wem{?WU7IlzJw1 zb_HOu4B}s)xRu|x!BG?={9MpIo5M>L)j#*T!f$9R0X=DFcZK&w!qReb*nF6v!sZ0* zW{dPeRI!YeRnD=C>bif)`+(EnRv`F0CFml%7dV&xLMdS0(CJeK%?FbR?LjLxI63OOWJ@IK+F#9KXG`0cNhOQH3JPGI)K`{DlBX_(ieby zi$NOqe+Hl4u^f^&Z{8rS@?Q)6E^$L}8Y813;BNp_%F4>$ch~vff$y#o-g5c!Whoh% zzc)6#zkR#)F_KRS!)e~lc?)*nFI*cMDBx(5iR5kh^}u(f;k(zV0M&6?ei1|~-o5)0 zpMZcm3(94s@c&TCVY&VNI*<%s{i>AjQVJLY)W9SxRvNRuI9B~8_V+H$F7@KLP6hqr z_5{g8zXguoML0Roste*En?owZB!cDb&&5zw!Oi|5$%Ek#tR8>Aig384rLYun@jZWn zC-F)c1$tIwvvbcU*8sL=E`vn^TI}{;XsA%LwE17nct;izsUAI4C`R4>xNSc&7<}_9OZbmck z_@O#$DlYWc2I1aF6HuE=&A`F&vU4)8xg`QqtjxGWCv*8(DX=ANRWWi z{R|ux+InHT{szP};^KaT@L0uxYIsE;Zd8X8N-O&|1` z1Nh?onm;w?moM8u@i&n3I4aJHSFHv@!WA_=~X1@*SbA#Wuv0 zxdcKmP@~Wy3Jhz>pyO$Q zTMHDyP7X--?fYn5V_~MI@1Bsa_wQ`!fWPAHn>1?tezkU2$%CCh@f$TMO3=9$AxZqb-_!t}!VMZ4Q>!r~5pB4d{ zWC*s0P7b)S5||rcv{f>QCuZ!K@t7#zn7#y^Ch{yQQ1+omnbz3asIF2l21u0$(*SR4 zR8lb-Da?Ld*9&f zQ+k;FTf-;T%KgkRbrskuRshqX<};RLZY}b`8^}IX!e&QdKw2S<4m`LFykp zGK+ 
diff --git a/docs/source/intro.rst b/docs/source/intro.rst index 13ebe28b77..f68da8e013 100644 --- a/docs/source/intro.rst +++ b/docs/source/intro.rst @@ -94,10 +94,12 @@ For a detailed overview of building models in PyMC3, please read the appropriate First, import the PyMC3 functions and classes you will need for building your model. You can import the entire module via ``import pymc3 as pm``, or just bring in what you need:: - from pymc3 import Model, Normal, invlogit, Binomial, sample, traceplot + from pymc3 import Model, Normal, invlogit, Binomial, sample, forestplot import numpy as np -Models are defined using a context manager (``with`` statement). The model is specified declaratively inside the context manager, instantiating model variables and transforming them as necessary. Here is an example of a model for a bioassay experiment:: +Models are defined using a context manager (``with`` statement). The model is specified declaratively inside the context manager, instantiating model variables and transforming them as necessary. 
Here is an example of a model for a bioassay experiment::
+Models are defined using a context manager (``with`` statement). The model is specified declaratively inside the context manager, instantiating model variables and transforming them as necessary. Here is an example of a model for a bioassay experiment.
+
+::
 
     # Data
     n = np.ones(4)*5
@@ -116,7 +118,9 @@ Models are defined using a context manager (``with`` statement). The model is sp
     # Model likelihood
     deaths = Binomial('deaths', n=n, p=theta, observed=y)
 
-Save this file, then from a python shell (or another file in the same directory), call::
+Save this file, then from a Python shell (or another file in the same directory), call.
+
+::
 
     with bioassay_model:
 
@@ -125,4 +129,17 @@ Save this file, then from a python shell (or another file in the same directory)
     # Plot two parameters
     traceplot(trace, varnames=['alpha', 'beta'])
 
-This example will generate 1000 posterior samples on each of two cores, preceded by 500 tuning samples (the default number). The sample is returned as arrays inside of a ``MultiTrace`` object, which is then passed to a plotting function.
+This example will generate 1000 posterior samples on each of two cores, preceded by 500 tuning samples (the default number).
+
+::
+
+    Auto-assigning NUTS sampler...
+    Initializing NUTS using ADVI...
+    Average Loss = 12.562:   6%|▌         | 11412/200000 [00:00<00:14, 12815.82it/s]
+    Convergence archived at 11900
+    Interrupted at 11,900 [5%]: Average Loss = 15.168
+    100%|██████████████████████████████████████| 1500/1500 [00:01<00:00, 787.56it/s]
+
+The sample is returned as arrays inside a ``MultiTrace`` object, which is then passed to a plotting function. The resulting graphic shows a forest plot of the random variables in the model, along with a convergence diagnostic (R-hat) that indicates our model has converged.
+
+..
image:: ./images/forestplot.png

From 83fcb7875af832e361460878fa752af3e2a8e230 Mon Sep 17 00:00:00 2001
From: Christopher Fonnesbeck
Date: Thu, 6 Jul 2017 21:14:03 -0500
Subject: [PATCH 10/12] Edited text in intro

---
 docs/source/intro.rst | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/source/intro.rst b/docs/source/intro.rst
index f68da8e013..74514aff82 100644
--- a/docs/source/intro.rst
+++ b/docs/source/intro.rst
@@ -129,7 +129,7 @@ Save this file, then from a Python shell (or another file in the same directory)
     # Plot two parameters
     traceplot(trace, varnames=['alpha', 'beta'])
 
-This example will generate 1000 posterior samples on each of two cores, preceded by 500 tuning samples (the default number).
+This example will generate 1000 posterior samples on each of two cores using the NUTS algorithm, preceded by 500 tuning samples (the default number). The sampler is also initialized using variational inference.
 
 ::
 

From 202dd8c646ce2472a7710f278c4538c6a6732fcd Mon Sep 17 00:00:00 2001
From: Christopher Fonnesbeck
Date: Thu, 6 Jul 2017 21:14:42 -0500
Subject: [PATCH 11/12] Fixed typo

---
 docs/source/intro.rst | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/source/intro.rst b/docs/source/intro.rst
index 74514aff82..b52d382e9f 100644
--- a/docs/source/intro.rst
+++ b/docs/source/intro.rst
@@ -127,7 +127,7 @@ Save this file, then from a Python shell (or another file in the same directory)
 
     # Draw samples
     trace = sample(1000, njobs=2)
     # Plot two parameters
-    traceplot(trace, varnames=['alpha', 'beta'])
+    forestplot(trace, varnames=['alpha', 'beta'])
 
 This example will generate 1000 posterior samples on each of two cores using the NUTS algorithm, preceded by 500 tuning samples (the default number). The sampler is also initialized using variational inference.
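The model documented in these patches sets ``theta = invlogit(alpha + beta * dose)`` with a binomial likelihood for the observed deaths. The arithmetic can be checked without PyMC3; below is a minimal pure-Python sketch of that likelihood. The data values are assumed from the classic bioassay dataset (the diffs above elide them), and the helper names are ours, not PyMC3's.

```python
import math

# Bioassay data: 4 groups of n=5 animals (values assumed from the classic
# dataset; the diffs above do not show them)
n = [5, 5, 5, 5]
y = [0, 1, 3, 5]                    # deaths per group
dose = [-0.86, -0.3, -0.05, 0.73]   # log-dose covariate

def invlogit(x):
    # Inverse-logit link mapping a real number to a probability in (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def log_binom(k, m, p):
    # Binomial log-pmf: log C(m, k) + k*log(p) + (m - k)*log(1 - p)
    return (math.lgamma(m + 1) - math.lgamma(k + 1) - math.lgamma(m - k + 1)
            + k * math.log(p) + (m - k) * math.log(1.0 - p))

def loglik(alpha, beta):
    # Model log-likelihood: theta_i = invlogit(alpha + beta * dose_i)
    return sum(log_binom(yi, mi, invlogit(alpha + beta * xi))
               for yi, mi, xi in zip(y, n, dose))

# A positive dose effect should fit this data far better than no effect
print(loglik(0.8, 7.7) > loglik(0.8, 0.0))  # prints True
```

Evaluating the likelihood at a strongly positive slope versus a flat one confirms the data favor a rising dose-response curve, which is what the fitted model in the intro recovers.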
From cf3b577e28f37ba42966b9f1a3e608b808273b42 Mon Sep 17 00:00:00 2001 From: Christopher Fonnesbeck Date: Thu, 6 Jul 2017 21:19:04 -0500 Subject: [PATCH 12/12] Added intro to index --- docs/source/index.rst | 1 + 1 file changed, 1 insertion(+) diff --git a/docs/source/index.rst b/docs/source/index.rst index 9b74d8dce3..4397941747 100644 --- a/docs/source/index.rst +++ b/docs/source/index.rst @@ -8,6 +8,7 @@ Contents: .. toctree:: :maxdepth: 3 + intro getting_started examples api
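The intro presents MCMC as one of PyMC3's core fitting methods; PyMC3 itself would use NUTS via ``sample``, as in the diffs above. As an illustration only, here is a toy random-walk Metropolis sketch targeting the same bioassay posterior under flat priors. The data values are again assumed from the classic dataset, and the sampler is a deliberately simple stand-in, not PyMC3's implementation.

```python
import math
import random

# Bioassay data (values assumed from the classic dataset)
n = [5, 5, 5, 5]
y = [0, 1, 3, 5]
dose = [-0.86, -0.3, -0.05, 0.73]

def log_post(alpha, beta):
    # Flat priors, so the log-posterior is just the binomial log-likelihood,
    # computed with a numerically stable log-invlogit
    total = 0.0
    for yi, mi, xi in zip(y, n, dose):
        z = alpha + beta * xi
        # log p for p = invlogit(z); note log(1 - p) = log(p) - z
        log_p = -math.log1p(math.exp(-z)) if z > 0 else z - math.log1p(math.exp(z))
        total += (math.lgamma(mi + 1) - math.lgamma(yi + 1) - math.lgamma(mi - yi + 1)
                  + yi * log_p + (mi - yi) * (log_p - z))
    return total

def metropolis(n_samples=2000, scale=0.5, seed=42):
    # Random-walk Metropolis: propose a Gaussian jump, accept with
    # probability min(1, posterior ratio)
    rng = random.Random(seed)
    alpha, beta = 0.0, 0.0
    lp = log_post(alpha, beta)
    trace = []
    for _ in range(n_samples):
        a_prop = alpha + rng.gauss(0.0, scale)
        b_prop = beta + rng.gauss(0.0, scale)
        lp_prop = log_post(a_prop, b_prop)
        if rng.random() < math.exp(min(0.0, lp_prop - lp)):
            alpha, beta, lp = a_prop, b_prop, lp_prop
        trace.append((alpha, beta))
    return trace

trace = metropolis()
beta_mean = sum(b for _, b in trace) / len(trace)
```

Started at (0, 0), the chain drifts to positive ``beta`` values, consistent with the positive dose effect the intro's forest plot summarizes; a real analysis would of course use PyMC3's gradient-based samplers and convergence diagnostics instead.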