165 changes: 82 additions & 83 deletions Pipfile.lock

Large diffs are not rendered by default.

152 changes: 83 additions & 69 deletions README.md
Original file line number Diff line number Diff line change
@@ -8,27 +8,27 @@
## Table of Contents

- [Table of Contents](#table-of-contents)
- [Summary of the project](#summary-of-the-project)
- [Local development](#local-development)
- [Summary Of The Project](#summary-of-the-project)
- [Local Development](#local-development)
- [Dependencies](#dependencies)
- [Setup](#setup)
- [Updating Packages](#updating-packages)
- [Running Tests In Parallel](#running-tests-in-parallel)
- [Visual Studio Code Integration](#visual-studio-code-integration)
- [Debug from Visual Studio Code](#debug-from-visual-studio-code)
- [Run Tests From Within Visual Studio Code](#run-tests-from-within-visual-studio-code)
- [Cognito](#cognito)
- [Local Cognito](#local-cognito)
- [Importing Data from the BOD](#importing-data-from-the-bod)
- [Local Development](#local-development-1)
- [Running tests in parallel](#running-tests-in-parallel)
- [vs code Integration](#vs-code-integration)
- [Debug from vs code](#debug-from-vs-code)
- [Run tests from within vs code](#run-tests-from-within-vs-code)
- [Type Checking](#type-checking)
- [Mypy](#mypy)
- [Library types](#library-types)
- [Library Types](#library-types)

## Summary of the project
## Summary Of The Project

`service-control` provides and manages the verified permissions. TBC

## Local development
## Local Development

### Dependencies

@@ -58,79 +58,36 @@ You may want to do an initial sync of your database by applying the most recent
app/manage.py migrate
```

## Cognito
### Updating Packages

This project uses Amazon Cognito for user identity and access management. It uses a custom user
attribute, managed_by_service, to mark users managed by this service.
All packages used in production are pinned to a minor version. Automatically updating these packages
will use the latest minor (or patch) version. Packages used for development, on the other hand, are
not pinned unless they need to be used with a specific version of a production package
(for example, boto3-stubs for boto3).

To synchronize all local users with Cognito, run:
To update the packages to the latest minor or patch version, run:

```bash
app/manage.py cognito_sync
pipenv lock
pipenv sync --dev
```

### Local Cognito
To update packages to a new major release, modify the version in the Pipfile, then run the command
above.
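
For example, a pin in the Pipfile could look like this (package name and versions are purely illustrative, not taken from this repo):

```toml
[packages]
# Pinned to the 4.x line: `pipenv lock` picks up new minor/patch versions automatically.
# For a new major release, change the pin (e.g. to "~=5.0") and rerun the commands above.
django = "~=4.2"
```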

For locally testing the connection to Cognito, [cognito-local](https://github.com/jagregory/cognito-local) is used.
`cognito-local` stores all of its data as simple JSON files in its volume (`.volumes/cognito/db/`).

You can also use the AWS CLI together with `cognito-local` by specifying the local endpoint, for example:

```bash
aws --endpoint $COGNITO_ENDPOINT_URL cognito-idp list-users --user-pool-id $COGNITO_POOL_ID
```

## Importing Data from the BOD

The "Betriebsobjekte Datenbank" (BOD) is a central database for running and configuring the map
viewer and some of its services. It contains metadata and translations for the individual layers,
as well as configurations for displaying and serving the data through our services such as the
Web Map Service (WMS), the Web Map Tiling Service (WMTS), and our current API (mf-chsdi3/api3).

You can import a BOD dump and migrate its data:

```bash
make setup-bod
make import-bod file=dump.sql
app/manage.py bod_sync
```

To generate more BOD models, run:

```bash
app/manage.py inspectdb --database=bod
```

The BOD models are unmanaged, meaning Django does not manage any migrations for these models.
However, migrations are still needed during tests to set up the test BOD. To achieve this, it is
necessary to create migrations for the models and dynamically adjust the `managed` flag based on
whether the tests or the server is running (`django.conf.settings.TESTING`). Since these migrations
are only for testing purposes, the previous migration file can be removed and recreated:


```bash
rm app/bod/migrations/0001_initial.py
app/manage.py makemigrations bod
```

Afterward, the `managed` flag needs to be set to `django.conf.settings.TESTING` in both the models
and the migrations.

## Local Development

### Running tests in parallel
### Running Tests In Parallel

Run tests with, for example, 16 workers:

```bash
pytest -n 16
```

### vs code Integration
### Visual Studio Code Integration

There are several ways to debug this codebase from within Visual Studio Code.

#### Debug from vs code
#### Debug from Visual Studio Code

To debug the service from within Visual Studio Code, you need to create a launch configuration. Create
a folder `.vscode` in the root folder if it doesn't exist and add a file `launch.json` with this content
@@ -164,8 +121,7 @@ Alternatively, create the file via menu "Run" > "Add Configuration" by choosing
Now you can start the server with `make serve-debug`.
Startup will wait until the debugger is attached, which is most easily done by hitting F5.


#### Run tests from within vs code
#### Run Tests From Within Visual Studio Code

The unit tests can also be invoked directly inside Visual Studio Code (beaker icon).
To do this, you need to have the following settings either in
@@ -187,6 +143,64 @@ interpreter of your venv selected (`.venv/bin/python`).
You can change the Python interpreter via menu "Python: Select Interpreter"
in the Command Palette.

## Cognito

This project uses Amazon Cognito for user identity and access management. It uses a custom user
attribute, managed_by_service, to mark users managed by this service.

To synchronize all local users with Cognito, run:

```bash
app/manage.py cognito_sync
```

### Local Cognito

For locally testing the connection to Cognito, [cognito-local](https://github.com/jagregory/cognito-local) is used.
`cognito-local` stores all of its data as simple JSON files in its volume (`.volumes/cognito/db/`).

You can also use the AWS CLI together with `cognito-local` by specifying the local endpoint, for example:

```bash
aws --endpoint $COGNITO_ENDPOINT_URL cognito-idp list-users --user-pool-id $COGNITO_POOL_ID
```

## Importing Data from the BOD

The "Betriebsobjekte Datenbank" (BOD) is a central database for running and configuring the map
viewer and some of its services. It contains metadata and translations for the individual layers,
as well as configurations for displaying and serving the data through our services such as the
Web Map Service (WMS), the Web Map Tiling Service (WMTS), and our current API (mf-chsdi3/api3).

You can import a BOD dump and migrate its data:

```bash
make setup-bod
make import-bod file=dump.sql
app/manage.py bod_sync
```

To generate more BOD models, run:

```bash
app/manage.py inspectdb --database=bod
```

The BOD models are unmanaged, meaning Django does not manage any migrations for these models.
However, migrations are still needed during tests to set up the test BOD. To achieve this, it is
necessary to create migrations for the models and dynamically adjust the `managed` flag based on
whether the tests or the server is running (`django.conf.settings.TESTING`). Since these migrations
are only for testing purposes, the previous migration file can be removed and recreated:


```bash
rm app/bod/migrations/0001_initial.py
app/manage.py makemigrations bod
```

Afterward, the `managed` flag needs to be set to `django.conf.settings.TESTING` in both the models
and the migrations.
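
The toggle can be sketched as follows — a self-contained stand-in for illustration: the real code lives in the BOD models and migrations and reads `django.conf.settings.TESTING`, while here a plain `TESTING` flag and a placeholder class stand in:

```python
# Stand-in for django.conf.settings.TESTING (True while the test suite runs).
TESTING = True


class Layer:
    """Placeholder for a BOD model; the real one subclasses django.db.models.Model."""

    class Meta:
        # Unmanaged in normal operation, so Django never migrates the real BOD,
        # but managed during tests, so migrations can create the test BOD tables.
        managed = TESTING


print(Layer.Meta.managed)  # True while testing
```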

## Type Checking

### Mypy
@@ -199,7 +213,7 @@ make type-check

This will check all files in the repository.

### Library types
### Library Types

For type checking, the external library [mypy](https://mypy.readthedocs.io) is used. See the [type hints cheat sheet](https://mypy.readthedocs.io/en/stable/cheat_sheet_py3.html) for help on getting the types right.
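
As a minimal illustration of the kind of error this check reports (example code, not from this repo):

```python
# A correctly annotated function; mypy verifies that callers pass matching types.
def greeting(name: str, times: int) -> str:
    return ", ".join([f"Hello {name}"] * times)


print(greeting("world", 2))  # Hello world, Hello world

# A mistyped call such as greeting("world", "2") would fail the check:
#   error: Argument 2 to "greeting" has incompatible type "str"; expected "int"
```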

5 changes: 1 addition & 4 deletions app/access/admin.py
@@ -1,6 +1,5 @@
from django.contrib import admin
from django.db import models
from django.db.models import Model
from django.http import HttpRequest

from .models import User
@@ -12,6 +11,7 @@ class UserAdmin(admin.ModelAdmin):  # type:ignore[type-arg]

list_display = ('user_id', 'username', 'last_name', 'first_name', 'deleted_at', 'provider')
list_filter = ('deleted_at', ('provider', admin.RelatedOnlyFieldListFilter))
readonly_fields = ('user_id', 'deleted_at', 'created', 'updated')
actions = ('disable',)

@admin.action(description="Disable selected users")
@@ -25,6 +25,3 @@ def get_queryset(self, request: HttpRequest) -> models.QuerySet[User]:
def delete_queryset(self, request: HttpRequest, queryset: models.QuerySet[User]) -> None:
for user in queryset:
user.delete()

def get_readonly_fields(self, request: HttpRequest, obj: Model | None = None) -> list[str]:
return ['user_id', 'deleted_at']
28 changes: 28 additions & 0 deletions app/access/migrations/0005_user_created_user_updated.py
@@ -0,0 +1,28 @@
# Generated by Django 5.2.2 on 2025-06-10 10:10

import django.utils.timezone
from django.db import migrations
from django.db import models


class Migration(migrations.Migration):

dependencies = [
('access', '0004_alter_user_username'),
]

operations = [
migrations.AddField(
model_name='user',
name='created',
field=models.DateTimeField(
auto_now_add=True, default=django.utils.timezone.now, verbose_name='Created'
),
preserve_default=False,
),
migrations.AddField(
model_name='user',
name='updated',
field=models.DateTimeField(auto_now=True, verbose_name='Updated'),
),
]
2 changes: 2 additions & 0 deletions app/access/models.py
@@ -47,6 +47,8 @@ def __str__(self) -> str:

username = CustomSlugField(_(_context, "User name"), unique=True, max_length=100)
user_id = models.CharField(_(_context, "User ID"), unique=True, default=generate_short_id)
created = models.DateTimeField(_(_context, "Created"), auto_now_add=True)
updated = models.DateTimeField(_(_context, "Updated"), auto_now=True)
first_name = models.CharField(_(_context, "First name"))
last_name = models.CharField(_(_context, "Last name"))
email = models.EmailField(_(_context, "Email"))
3 changes: 3 additions & 0 deletions app/distributions/admin.py
@@ -15,6 +15,7 @@ class AttributionAdmin(admin.ModelAdmin):  # type:ignore[type-arg]

list_display = ('attribution_id', 'name_en', 'provider')
list_filter = (('provider', admin.RelatedOnlyFieldListFilter),)
readonly_fields = ('created', 'updated')


@admin.register(Dataset)
@@ -23,6 +24,7 @@ class DatasetAdmin(admin.ModelAdmin):  # type:ignore[type-arg]

list_display = ('dataset_id', 'title_en', 'provider')
list_filter = (('provider', admin.RelatedOnlyFieldListFilter),)
readonly_fields = ('created', 'updated')

def get_form(
self,
@@ -52,3 +54,4 @@ class PackageDistributionAdmin(admin.ModelAdmin):  # type:ignore[type-arg]
'managed_by_stac',
('dataset__provider', admin.RelatedOnlyFieldListFilter),
)
readonly_fields = ('created', 'updated')
62 changes: 62 additions & 0 deletions app/distributions/management/commands/stac_sync.py
@@ -7,8 +7,10 @@
from urllib.parse import urljoin

from bs4 import BeautifulSoup
from distributions.models import Attribution
from distributions.models import Dataset
from distributions.models import PackageDistribution
from provider.models import Provider
from pystac.collection import Collection
from pystac_client import Client
from requests import get
@@ -32,6 +34,7 @@ def __init__(self, command: CustomBaseCommand, options: dict['str', Any]):
self.url = options['url']
self.endpoint = options['endpoint']
self.default_dataset = options['default_dataset']
self.create_default_dataset = not options['no_create_default_dataset']
self.counts: dict[str, Counter] = {}

def increment_counter(self, model_name: str, operation: Operation, value: int = 1) -> None:
@@ -48,6 +51,59 @@ def clear_package_distributions(self) -> None:
model_name = model_class.split('.')[-1].lower()
self.increment_counter(model_name, 'cleared', count)

def ensure_default_dataset(self) -> None:
""" Create the given default dataset if required and not yet available.

This will create a provider and attribution with the same ID as the dataset.
"""

if (
not self.create_default_dataset or not self.default_dataset or
Dataset.objects.filter(dataset_id=self.default_dataset).first()
):
return

provider = Provider.objects.filter(provider_id=self.default_dataset).first()
if not provider:
self.print(f"Added provider '{self.default_dataset}' for default dataset")
provider = Provider.objects.create(
provider_id=self.default_dataset,
name_de="#Missing",
name_fr="#Missing",
name_en="#Missing",
acronym_de="#Missing",
acronym_fr="#Missing",
acronym_en="#Missing",
)

attribution = Attribution.objects.filter(attribution_id=self.default_dataset).first()
if not attribution:
self.print(f"Added attribution '{self.default_dataset}' for default dataset")
attribution = Attribution.objects.create(
attribution_id=self.default_dataset,
name_de="#Missing",
name_fr="#Missing",
name_en="#Missing",
description_de="#Missing",
description_fr="#Missing",
description_en="#Missing",
provider=provider
)

self.print(f"Added default dataset '{self.default_dataset}'")
Dataset.objects.create(
dataset_id=self.default_dataset,
title_de="#Missing",
title_fr="#Missing",
title_en="#Missing",
description_de="#Missing",
description_fr="#Missing",
description_en="#Missing",
geocat_id="#Missing",
provider=provider,
attribution=attribution
)

def update_package_distribution(
self, collection_id: str, managed_by_stac: bool
) -> Dataset | None:
@@ -184,6 +240,7 @@ def run(self) -> None:
self.clear_package_distributions()

# Import data
self.ensure_default_dataset()
self.import_package_distributions()

# Print counts
@@ -228,6 +285,11 @@ def add_arguments(self, parser: CommandParser) -> None:
default="",
help="Add packages with missing dataset to this dataset"
)
parser.add_argument(
"--no-create-default-dataset",
action="store_true",
help="Do not create the default dataset if needed"
)

def handle(self, *args: Any, **options: Any) -> None:
Handler(self, options).run()