All contributions to this project are welcome. If you want to add or update sources and don't want to use GitHub, you can use the form here and the MobilityData team will add them for you.
If you want to use GitHub, we encourage contributors to propose changes as follows:
- Fork this project on GitHub
- Create a new branch, and
- Propose their changes by opening a new pull request. We encourage contributors to format pull request titles following the Conventional Commits specification.
Have you encountered an error? A critical step in troubleshooting is being able to reproduce the problem. You can create a bug ticket with reproduction steps and we will review it.
Note that adding or updating sources manually is possible, although not recommended: it increases the risk of introducing incorrect or invalid information into your branch and pull request. Using the operations ensures your JSON files are valid.
Note that your contribution must pass all of our tests, as implemented in this repository's CI workflows, to be merged into the main branch. To pass them, make sure your contribution conforms to the appropriate JSON schema and that the ID and direct download URL values contributed for a source are not already in the Mobility Database Catalogs.
To contribute data to the Mobility Database catalogs, please follow these steps:
1. Check our sources to see if any of them match the one you want to add or update.
2. Clone our repository on your local machine using the HTTPS protocol: `git clone https://github.com/MobilityData/mobility-database-catalogs.git`.
3. Create a new branch for your contribution with `git checkout -b $YOUR_NEW_BRANCH`. Note that you can list the existing branches with `git branch -a` to make sure your branch name is not already used.
4. Set up the Python environment following the instructions in our README.md.
5. Contribute data.
6. Test your contribution.
7. Add your contribution files to the git staging area with `git add $YOUR_FILES`, where `$YOUR_FILES` is a list of files or the directory containing your modifications.
8. Commit your contribution, including a message explaining it, with `git commit -m "$YOUR_COMMIT_MESSAGE"`.
9. Push your contribution to your branch on the origin repository with `git push origin $YOUR_NEW_BRANCH`.
10. Go to the repository's pull requests page and open a draft pull request with your branch. If you are adding a GTFS Schedule source, your pull request must include the string "[SOURCES]" at the end of its title, e.g. "feat: Add Montreal GTFS Source [SOURCES]".
11. Modify your contribution as many times as needed following steps 4 to 8.
12. When your contribution is ready, convert your pull request from draft to ready for review and request a review from a team member at MobilityData. Note that if you need to modify your contribution after this step, you will be asked to convert your pull request back to draft.
13. Once your pull request is ready for review and all the checks have passed, we will approve and merge it.
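Taken together, the git commands from steps 2 to 9 look roughly like the sketch below; the branch name, staged path, and commit message are illustrative placeholders, not values from the repository.

```sh
git clone https://github.com/MobilityData/mobility-database-catalogs.git
cd mobility-database-catalogs
git checkout -b add-example-gtfs-source   # hypothetical branch name
# ...set up the Python environment, contribute data, and test it...
git add path/to/your/modified/files       # replace with the files you actually changed
git commit -m "feat: Add Example GTFS Source [SOURCES]"
git push origin add-example-gtfs-source
```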
The easiest way to add a GTFS Schedule source is to use the operation `tools.operations.add_gtfs_schedule_source` through the Python interpreter or in your scripts. This operation makes sure the information provided is correct and will pass our tests. Provide the information about your source in the operation function to add your source.
If your GTFS Schedule source requires API authentication, please use the form instead of the PR process so we can generate our own API credentials for the source. Sources with `authentication_type=1` or `authentication_type=2` can't be added from forks, but those with `authentication_type=0` can.
```python
>>> add_gtfs_schedule_source(
        provider=$YOUR_SOURCE_PROVIDER_NAME,
        country_code=$YOUR_SOURCE_COUNTRY_CODE,
        direct_download_url=$YOUR_SOURCE_DIRECT_DOWNLOAD_URL,
        authentication_type=$OPTIONAL_AUTHENTICATION_TYPE,
        authentication_info_url=$CONDITIONALLY_REQUIRED_AUTHENTICATION_INFO_URL,
        api_key_parameter_name=$CONDITIONALLY_REQUIRED_API_KEY_PARAMETER_NAME,
        api_key_parameter_value=$NOT_STORED_API_KEY_PARAMETER_VALUE,
        subdivision_name=$OPTIONAL_SUBDIVISION_NAME,
        municipality=$OPTIONAL_MUNICIPALITY,
        license_url=$OPTIONAL_LICENSE_URL,
        name=$OPTIONAL_SOURCE_NAME,
        features=[$OPTIONAL_FEATURE_ARRAY],
        status=$OPTIONAL_STATUS,
        feed_contact_email=$OPTIONAL_FEED_CONTACT_EMAIL,
        redirects=[{
            "id": $OPTIONAL_REDIRECT_ID,
            "comment": $OPTIONAL_REDIRECT_COMMENT
        }],
    )
```
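As an illustration only (the provider, URL, and other values below are made up and do not refer to a real source), a call could look like:

```python
>>> add_gtfs_schedule_source(
        provider="Example Transit Agency",
        country_code="CA",
        direct_download_url="https://example.com/gtfs/schedule.zip",
        subdivision_name="Ontario",
        municipality="Example City",
        license_url="https://example.com/license",
    )
```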
The easiest way to add a GTFS Realtime source is to use the operation `tools.operations.add_gtfs_realtime_source` through the Python interpreter or in your scripts. Provide the information about your source in the operation function to add your source.
```python
>>> add_gtfs_realtime_source(
        entity_type=[$YOUR_SOURCE_ARRAY_OF_ENTITY_TYPES],
        provider=$YOUR_SOURCE_PROVIDER_NAME,
        direct_download_url=$YOUR_SOURCE_DIRECT_DOWNLOAD_URL,
        authentication_type=$YOUR_SOURCE_AUTHENTICATION_TYPE,
        authentication_info_url=$CONDITIONALLY_REQUIRED_AUTHENTICATION_INFO_URL,
        api_key_parameter_name=$CONDITIONALLY_REQUIRED_API_KEY_PARAMETER_NAME,
        license_url=$OPTIONAL_LICENSE_URL,
        name=$OPTIONAL_SOURCE_NAME,
        static_reference=[$OPTIONAL_ARRAY_OF_STATIC_REFERENCE_NUMERICAL_IDS],
        note=$OPTIONAL_SOURCE_NOTE,
    )
```
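For illustration (all values below are hypothetical, and the entity type strings must match the values allowed by the project's schemas), a call could look like:

```python
>>> add_gtfs_realtime_source(
        entity_type=["vp"],          # e.g. vehicle positions; use the codes defined in the schemas
        provider="Example Transit Agency",
        direct_download_url="https://example.com/gtfs-rt/vehicle-positions",
        authentication_type=0,
        static_reference=[123],      # hypothetical ID of the matching GTFS Schedule source
    )
```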
The easiest way to update a GTFS Schedule source is to use the operation `tools.operations.update_gtfs_schedule_source` through the Python interpreter or in your scripts.
The default value for every parameter is `None`. Note that once a parameter value is added, it cannot be set to `None` again. `mdb_source_id` and `data_type` cannot be updated; all other parameters can.
If your GTFS Schedule source requires API authentication, please use the form instead of the PR process to update the source. Sources with `authentication_type=1` or `authentication_type=2` can't be updated from forks, but those with `authentication_type=0` can.
```python
>>> update_gtfs_schedule_source(
        mdb_source_id=$YOUR_SOURCE_NUMERICAL_ID,
        provider=$OPTIONAL_SOURCE_PROVIDER_NAME,
        name=$OPTIONAL_SOURCE_NAME,
        country_code=$OPTIONAL_SOURCE_COUNTRY_CODE,
        subdivision_name=$OPTIONAL_SOURCE_SUBDIVISION_NAME,
        municipality=$OPTIONAL_SOURCE_MUNICIPALITY,
        direct_download_url=$OPTIONAL_SOURCE_DIRECT_DOWNLOAD_URL,
        authentication_type=$OPTIONAL_AUTHENTICATION_TYPE,
        authentication_info_url=$CONDITIONALLY_REQUIRED_AUTHENTICATION_INFO_URL,
        api_key_parameter_name=$CONDITIONALLY_REQUIRED_API_KEY_PARAMETER_NAME,
        api_key_parameter_value=$NOT_STORED_API_KEY_PARAMETER_VALUE,
        license_url=$OPTIONAL_LICENSE_URL,
        features=[$OPTIONAL_FEATURE_ARRAY],
        status=$OPTIONAL_STATUS,
        feed_contact_email=$OPTIONAL_FEED_CONTACT_EMAIL,
        redirects=[{
            "id": $OPTIONAL_REDIRECT_ID,
            "comment": $OPTIONAL_REDIRECT_COMMENT
        }],
    )
```
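For example (the ID and values below are made up), to update only a source's provider name and license URL you would pass just those fields along with the source's numerical ID:

```python
>>> update_gtfs_schedule_source(
        mdb_source_id=999,           # hypothetical ID
        provider="Example Transit Agency",
        license_url="https://example.com/new-license",
    )
```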
The easiest way to update a GTFS Realtime source is to use the operation `tools.operations.update_gtfs_realtime_source` through the Python interpreter or in your scripts.
The default value for every parameter is `None`. Note that once a parameter value is added, it cannot be set to `None` again. `mdb_source_id` and `data_type` cannot be updated; all other parameters can.
```python
>>> update_gtfs_realtime_source(
        mdb_source_id=$YOUR_SOURCE_NUMERICAL_ID,
        entity_type=[$YOUR_SOURCE_ARRAY_OF_ENTITY_TYPES],
        provider=$YOUR_SOURCE_PROVIDER_NAME,
        direct_download_url=$YOUR_SOURCE_DIRECT_DOWNLOAD_URL,
        authentication_type=$YOUR_SOURCE_AUTHENTICATION_TYPE,
        authentication_info_url=$CONDITIONALLY_REQUIRED_AUTHENTICATION_INFO_URL,
        api_key_parameter_name=$CONDITIONALLY_REQUIRED_API_KEY_PARAMETER_NAME,
        license_url=$OPTIONAL_LICENSE_URL,
        name=$OPTIONAL_SOURCE_NAME,
        static_reference=[$OPTIONAL_ARRAY_OF_STATIC_REFERENCE_NUMERICAL_IDS],
        note=$OPTIONAL_SOURCE_NOTE,
    )
```
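Similarly, an update to a GTFS Realtime source (again with made-up values) might only change its download URL and add a note:

```python
>>> update_gtfs_realtime_source(
        mdb_source_id=999,           # hypothetical ID
        direct_download_url="https://example.com/gtfs-rt/trip-updates",
        note="Feed moved to a new endpoint",
    )
```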
You need to:
- Manually search for the JSON file you want in your fork. For example, finding `ca-ontario-toronto-transit-commission-gtfs-732.json` in the `schedule` folder.
- Modify the file name. For example, updating the file name to `ca-ontario-ttc-gtfs-732.json`.
- Modify the `latest.url` to match the new file name. For example, changing `https://storage.googleapis.com/storage/v1/b/mdb-latest/o/ca-ontario-toronto-transit-commission-gtfs-732.zip?alt=media` to `https://storage.googleapis.com/storage/v1/b/mdb-latest/o/ca-ontario-ttc-gtfs-732.zip?alt=media`.
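A minimal sketch of the rename step from the command line follows; the directory path is an assumption about the repository layout, and the file names are the examples above.

```sh
# Rename the source file in your fork (example names from the steps above)
git mv catalogs/sources/gtfs/schedule/ca-ontario-toronto-transit-commission-gtfs-732.json \
       catalogs/sources/gtfs/schedule/ca-ontario-ttc-gtfs-732.json
# Then edit the latest URL inside the renamed file so it references the new name.
```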
To contribute operations to the Mobility Database catalogs, it is suggested that you follow the steps below.
- Check our operations under `tools.operations` to see if the operation you want to add is not already implemented.
- Check our representations under `tools.representations` to see if there are methods that might be useful for your contribution.
- Check our helpers under `tools.helpers` to see if there are functions that might be useful for your contribution.
- Check our constants under `tools.constants` to see if there are constants that might be useful for your contribution.
- Contribute the operation under `tools.operations`. Note that the code added in `tools.operations` should only implement the business logic of the operation (filtering, attribute assignment, etc.). Any functions to create, verify, or extract data/metadata, or to import/export data, must be implemented in `tools.helpers` and called by the operation in `tools.operations` (a short sketch of this split follows the list).
- Contribute the representations needed for your operation under `tools.representations`.
- Contribute the helpers needed for your operation under `tools.helpers`.
- Contribute the constants needed for your operation under `tools.constants`.
- Contribute the unit tests for every function or operation you added, after running the tests locally.
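To make that split concrete, here is a rough, hypothetical sketch; none of the function names below exist in the project, they only illustrate keeping business logic in `tools.operations` while I/O lives in `tools.helpers`.

```python
import json

# tools/helpers.py (hypothetical helpers: file I/O only)
def load_source(path):
    """Read and parse a source JSON file."""
    with open(path, encoding="utf-8") as f:
        return json.load(f)

def save_source(source, path):
    """Write a source JSON file back to disk."""
    with open(path, "w", encoding="utf-8") as f:
        json.dump(source, f, indent=4, ensure_ascii=False)

# tools/operations.py (hypothetical operation: business logic only)
def set_source_status(path, status):
    """Assign the status attribute, delegating all I/O to the helpers."""
    source = load_source(path)
    source["status"] = status
    save_source(source, path)
    return source
```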
"Sticking to a single consistent and documented coding style for this project is important to ensure that code reviewers dedicate their attention to the functionality of the validation, as opposed to disagreements about the coding style (and avoid bike-shedding)."
This project uses the Black code formatter, which is compliant with PEP8, the style guide for Python code. A pre-commit hook file is provided with the repo so it is easy to apply Black before each commit. A CI workflow tests the code pushed in a pull request to make sure it is PEP8-compliant.
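A typical local setup looks like the following; `pre-commit` and `black` are the standard tools, but check the repository's configuration files for the exact versions and hooks it expects.

```sh
pip install pre-commit black   # if they are not already installed
pre-commit install             # registers the provided hook so Black runs before each commit
black .                        # or format the code manually at any time
```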
This project includes unit and integration tests in order to:
- Verify the implementation behaves as expected in tests as well as on real data
- Make sure any new code does not break existing code
Run the following command at the root of the project to run the Python tests:

```sh
$ pytest
```
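While iterating, you can also run a subset of the tests using standard pytest options; the paths and keywords below are only examples.

```sh
$ pytest tests/                # run everything under the tests directory (path assumed)
$ pytest -k "gtfs_schedule"    # run only tests whose names match a keyword
```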