Realtime economic monitoring.
This project was broken down into three components:
- Data Gathering
- Forecasting and Data Smoothing
- Data Visualization
Data gathering was broken into several components:
- Get raw tables from Statcan
- Structuring the data into fixed tables
- Storing resulting tables
The code for data gathering can be found under gathernomics (a blend of "gathering" and "economics" data).
The data gatherer's current output format is a CSV file with the following fields:
| Field     | Type     | Description                             |
|-----------|----------|-----------------------------------------|
| value     | Integer  | CAD value of the datum                  |
| indicator | String   | Economic indicator of the value         |
| category  | String   | Broader category of the indicator       |
| frequency | String   | Frequency at which new data is recorded |
| date      | ISO Date | Date of the datum (YYYY-MM-DD)          |
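As an illustration, rows in this format can be read with Python's standard csv module. The sample values below are hypothetical, not actual Statcan data:

```python
import csv
import io

# Hypothetical sample matching the CSV schema above; the values are made up.
sample = """value,indicator,category,frequency,date
1894904,GDP,National Accounts,Quarterly,2019-01-01
"""

# DictReader maps each row to the field names given in the header line.
rows = list(csv.DictReader(io.StringIO(sample)))
row = rows[0]
print(row["indicator"])   # economic indicator of the value
print(int(row["value"]))  # CAD value, stored as an integer
```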
A sample config file is provided in the project as config.json. The file provides the ability to add new tables and to indicate the source of their data.
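The exact schema is defined by config.json itself; a plausible entry for one table might look like the following (the field names and URL placeholder here are illustrative, not taken from the actual file):

```json
{
  "tables": [
    {
      "name": "gdp",
      "category": "National Accounts",
      "frequency": "Quarterly",
      "source": "https://www150.statcan.gc.ca/..."
    }
  ]
}
```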
Download Python requirements:

```sh
$ pip3 install psycopg --user
$ pip3 install psycopg-binary --user
$ pip3 install urllib3
```
Set up the environment:

```sh
$ # From the project root directory
$ export PYTHONPATH=`pwd`:$PYTHONPATH
```
Run the program:

```sh
$ # From the project root directory
$ python3 gathernomics
```
Remaining work for Data Gathering:
- Store the results of the table dumps in a PostgreSQL database
- Add filters to gather more granular capital economic data
- Read from Statcan's daily Delta File to fetch new data more efficiently
The PostgreSQL database backend could not be fully implemented, though a significant amount of the modeling work has been completed.
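As a sketch of what the storage step could look like, the code below maps one CSV row onto a parameterized INSERT statement. The table name (econ_data) and column layout are assumptions derived from the CSV schema above, not the project's actual (unfinished) data model:

```python
# Hypothetical PostgreSQL storage sketch for gathernomics CSV rows.
# Table and column names are assumptions, mirroring the CSV fields.
INSERT_SQL = (
    "INSERT INTO econ_data (value, indicator, category, frequency, date) "
    "VALUES (%s, %s, %s, %s, %s)"
)

def row_params(row):
    """Convert a CSV row dict into a parameter tuple for INSERT_SQL."""
    return (
        int(row["value"]),  # stored as an integer per the schema
        row["indicator"],
        row["category"],
        row["frequency"],
        row["date"],        # ISO date string (YYYY-MM-DD)
    )

# With a live database, rows would be written roughly like this:
#
#   import psycopg
#   with psycopg.connect("dbname=gathernomics") as conn:
#       with conn.cursor() as cur:
#           cur.executemany(INSERT_SQL, [row_params(r) for r in rows])
```

Keeping the SQL and the row-to-parameter mapping as plain data makes this piece testable without a running database.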