A service where anyone can run evaluations to assess how well a resource complies with the FAIR principles, given the resource identifier (URI/URL).
An evaluation runs a collection of assessments against the resource to evaluate.
- Evaluations can be created by anyone without authentication. An evaluation takes the URI of the resource to evaluate, and a collection of assessments to run against this resource.
- Collections can be created through the API after authenticating with ORCID. A collection is a sorted list of assessments.
- Assessments are tests written in Python that can be part of a collection. Each assessment runs some tests against the resource to evaluate, records the results, and passes them to the next assessment in the collection. To create an assessment, add a Python file in the `backend/app/assessments` folder and send us a pull request (see below for more details).
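As an illustrative sketch of the "record results and pass them along" flow described above (this is NOT the actual base class or API from `backend/app/assessments` — the function name, signature, and result fields here are made up; check the existing files in that folder for the real structure):

```python
# Hypothetical assessment sketch: names and shapes are assumptions,
# not the repository's actual assessment API.
from urllib.parse import urlparse

def assess_uri_is_resolvable_scheme(subject_uri: str, previous_results: dict) -> dict:
    """Toy assessment: check the subject has an http(s) URI, record the
    result, and return the results dict for the next assessment to use."""
    parsed = urlparse(subject_uri)
    passed = parsed.scheme in ("http", "https") and bool(parsed.netloc)
    previous_results["uri_scheme_check"] = {
        "passed": passed,
        "comment": f"Scheme is '{parsed.scheme or 'missing'}'",
    }
    return previous_results

results = assess_uri_is_resolvable_scheme(
    "https://doi.org/10.1594/PANGAEA.908011", {}
)
```

Each assessment receiving the accumulated results dict is what lets later assessments in a collection reuse what earlier ones already fetched or computed.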
- Backend built with FastAPI, Pydantic, and MongoDB
- Frontend built with React and Material UI
- Poetry if you need to install new Python packages.
- Node.js (with `npm`) and `yarn` if you need to do frontend development.
Create a `.env` file with your development settings in the root folder of this repository (you can copy `.env.sample`):
```
ORCID_CLIENT_ID=APP-XXX
ORCID_CLIENT_SECRET=XXXX
FRONTEND_URL=http://localhost:19006
```
Start the stack for development locally with Docker Compose from the root folder of this repository:
```bash
docker-compose up -d
```
If the database is empty you'll need to register the initial Metrics Tests with the init script:
```bash
./init_metrics_tests.sh
```
Now you can open your browser and interact with these URLs:
- Automatic OpenAPI documentation with Swagger UI: http://localhost/docs
- GraphQL endpoint with Strawberry: http://localhost/graphql
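Strawberry serves a standard GraphQL-over-HTTP endpoint, so any GraphQL client works against it. A minimal sketch of a request (the `{ __typename }` query is valid against any GraphQL schema; the actual query fields depend on this project's schema, which you can explore interactively at the `/graphql` URL):

```python
import json
from urllib import request

# Standard GraphQL-over-HTTP POST body; { __typename } works on any schema.
body = json.dumps({"query": "{ __typename }"}).encode()
req = request.Request(
    "http://localhost/graphql",
    data=body,
    headers={"Content-Type": "application/json"},
)
# With the stack running, send it with:
#   response = request.urlopen(req)
#   print(response.read())
```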
To check the logs of a specific service, run:
```bash
docker-compose logs backend
```
To delete the volume and reset the database, run:
```bash
docker-compose down
docker volume rm fair-enough_mongodb-data
```
You can also run this script to reset the database and restart the `docker-compose` stack:

```bash
./reset_local_db.sh
```
If you need to completely reset the Python cache:

```bash
docker-compose down
sudo rm -rf **/__pycache__
docker-compose build --no-cache
```
Without Docker:

Dependencies are managed with Poetry; install it first if you don't have it. From `./backend/` you can install all the dependencies with:

```bash
poetry install
```
To add new dependencies, run:

```bash
poetry add my-package
```
With Docker:

If you don't have Poetry installed locally, or are facing issues with it, you can also add new packages in the running Docker container while `docker-compose` is up:

```bash
docker-compose exec backend poetry add my-package
```
During development, you can change Docker Compose settings that will only affect the local development environment in the file `docker-compose.override.yml`.
Make sure the database is reset:
```bash
docker-compose -f docker-compose.test.yml down
```
Run the tests:
```bash
docker-compose -f docker-compose.test.yml up
```
If your stack is already up and you just want to run the tests, you can use:
```bash
docker-compose exec backend /app/tests-start.sh
```
That `/app/tests-start.sh` script just calls `pytest` after making sure that the rest of the stack is running. If you need to pass extra arguments to `pytest`, you can pass them to that command and they will be forwarded. For example, to stop on the first error:

```bash
docker-compose exec backend bash /app/tests-start.sh -x
```
Start the stack with this command:
```bash
DOMAIN=backend sh ./scripts/test-local.sh
```
The `./backend` directory is mounted as a "host volume" inside the Docker container (set in the file `docker-compose.dev.volumes.yml`).
You can rerun the tests on live code:

```bash
docker-compose exec backend /app/tests-start.sh
```
Because the test scripts forward arguments to `pytest`, you can enable HTML test coverage report generation by passing `--cov-report=html`.
To run the local tests with coverage HTML reports:
```bash
DOMAIN=backend sh ./scripts/test-local.sh --cov-report=html
```
To run the tests in a running stack with coverage HTML reports:
```bash
docker-compose exec backend bash /app/tests-start.sh --cov-report=html
```
You will need to define the ORCID OAuth app ID and secret to enable login. You can add them to your `.bashrc` or `.zshrc` so they are set automatically every time you boot:

```bash
export ORCID_CLIENT_ID=APP-XXXX
export ORCID_CLIENT_SECRET=XXXX
```
After starting the backend with `docker-compose`, enter the `frontend/app` directory, install the NPM packages, and start the live server using the scripts in `package.json`:

```bash
cd frontend/app
yarn
yarn dev
```
Then open your browser at http://localhost:19006
Create a `.env` file with your production settings:

```
ORCID_CLIENT_ID=APP-XXX
ORCID_CLIENT_SECRET=XXXX
FRONTEND_URL=https://fair-enough.semanticscience.org
```
Deploy the app with production config:
```bash
docker-compose -f docker-compose.yml -f docker-compose.prod.yml up -d --build
```
If the database is empty you'll need to register the initial Metrics Tests with the init script:
```bash
./init_metrics_tests.sh https://api.fair-enough.semanticscience.org
```
To stop the stack in production:
```bash
docker-compose -f docker-compose.yml -f docker-compose.prod.yml down
```
There is a main `docker-compose.yml` file with all the configurations that apply to the whole stack; it is used automatically by `docker-compose`.
There is also a `docker-compose.override.yml` with overrides for development, for example to mount the source code as a volume. It is used automatically by `docker-compose` to apply overrides on top of `docker-compose.yml`.
These Docker Compose files use the `.env` file containing configurations to be injected as environment variables in the containers. They also use some additional configurations taken from environment variables set in the scripts before calling the `docker-compose` command.
It is all designed to support several "stages", like development, building, testing, and deployment. It also allows deploying to different environments like staging and production (and adding more environments is easy).
They are designed with minimal repetition of code and configuration, so that if you need to change something, you only have to change it in as few places as possible. That's why the files use environment variables that get auto-expanded: if, for example, you want to use a different domain, you can call the `docker-compose` command with a different `DOMAIN` environment variable instead of having to change the domain in several places inside the Docker Compose files.
Also, if you want another deployment environment, say `preprod`, you just have to change environment variables; you can keep using the same Docker Compose files.
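As an illustrative sketch of this variable auto-expansion (the service and variable names below are made up for the example, not copied from this repository's actual compose files):

```yaml
# Illustrative sketch only: names are examples, not the real compose config.
services:
  backend:
    environment:
      # DOMAIN is read from the shell environment (or .env) and expanded here,
      # so the same compose file serves every deployment environment.
      - API_HOST=api.${DOMAIN}
```

Running `DOMAIN=preprod.example.org docker-compose config` would then show the expanded values without changing any file.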
Livestream logs:
- https://fastapi.tiangolo.com/advanced/websockets/
- https://amittallapragada.github.io/docker/fastapi/python/2020/12/23/server-side-events.html
Project bootstrapped with https://github.com/tiangolo/full-stack-fastapi-postgresql
This work has been largely inspired by the FAIR Evaluator in Ruby.
Some code for the Metrics tests has been reused from F-UJI.