When first setting up this environment, please follow the instructions below. The PyStan-related steps can be skipped if you don't want to run Stan models.
- Download and install Julia 1.4 from https://julialang.org/downloads/.
- Clone this repository into a directory on your system. Let's call the parent directory `xyz`.
- `cd` (change directory) into the cloned repository using `cd xyz/DynamicPPL_NeurIPS` in your command line interface.
- Run the Julia executable to start a new Julia session.
- Run `using Pkg; Pkg.add("DrWatson"); Pkg.activate("."); Pkg.instantiate()`.
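Assuming `git` and the `julia` binary are already installed and on your `PATH`, the setup steps above can be condensed into a short shell session (`xyz` is the placeholder parent directory used above):

```shell
# First-time setup sketch; assumes git and julia 1.4 are on PATH
cd xyz/DynamicPPL_NeurIPS

# Install DrWatson, then activate and instantiate this project's environment
julia -e 'using Pkg; Pkg.add("DrWatson"); Pkg.activate("."); Pkg.instantiate()'
```

`Pkg.instantiate()` downloads and precompiles all dependencies pinned in the repository's `Manifest.toml`, so the first run may take a while.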
There are two ways to install PyStan:
- Download and set up PyStan following the instructions at https://pystan.readthedocs.io/en/latest/getting_started.html#. Then set up `PyCall` by linking it against the Python version on your system that was used to install PyStan; instructions are provided at https://github.com/JuliaPy/PyCall.jl#specifying-the-python-version.
- Alternatively, in the running Julia session, execute `using Conda; Conda.add("pystan=2.19.1.1", channel="conda-forge")` and restart the Julia session.
You should now be ready to run the experiments. Try the following code to check that all the above steps were successful:

```julia
using DynamicPPL, Turing, PyCall
pystan = pyimport("pystan")
pystan.__version__ # this should give you "2.19.1.1"
```
The above steps are only needed the first time you set things up. Every subsequent time you want to run Julia code in this environment, follow the steps below:
- `cd` (change directory) into the cloned repository using `cd xyz/DynamicPPL_NeurIPS` in your command line interface.
- Run the Julia executable to start a new Julia session.
- Run `using DrWatson; @quickactivate "DynamicPPL_NeurIPS"`.
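The two steps above can also be combined into a single command (again assuming `julia` is on your `PATH` and `xyz` is the placeholder parent directory):

```shell
cd xyz/DynamicPPL_NeurIPS

# -i drops into an interactive REPL after activating the project environment
julia -i -e 'using DrWatson; @quickactivate "DynamicPPL_NeurIPS"'
```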
To run the benchmarks locally, use the bash script `run_locally.sh`. If you have access to a SLURM cluster system, you can instead submit each benchmark as a SLURM job by running `submit.sh`.
After all results have been computed, you can generate visualisations and LaTeX tables summarising the results by running `julia --project=. results.jl`.