System Integration

Prerequisites

  • The offline workload needs to be executed once
  • The new system should:
    • provide a Python client library
    • support bulk loading using a CSV format
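
The following is a minimal sketch of these two prerequisites in use, assuming a DB-API-style Python client. The module name new_system_client, the connection parameters, the table name sensors, and the COPY statement syntax are illustrative assumptions; the exact bulk-load command is system-specific.

```python
import new_system_client  # hypothetical: replace with your system's Python client library

# Open a connection through the system's Python client (used by the benchmark scripts).
conn = new_system_client.connect(host="localhost", port=5432, database="benchmark")
cur = conn.cursor()

# Bulk load one dataset file from CSV. The exact statement is system-specific;
# this mirrors what a systems/{system}/load.sh script would issue per file.
cur.execute("COPY sensors FROM '/path/to/datasets/data.csv' WITH (FORMAT csv, HEADER true)")
conn.commit()
conn.close()
```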

Integration steps

  • Create a folder with the name of your system under systems/ and install your database inside.

  • Install the Python client library inside the virtual environment (TSMvenv).

  • Load the datasets located under the datasets/ folder. The column names of the datasets are: time, id_station, and s0, s1, ..., s99. Examples of loading scripts are provided in systems/{system}/load.sh.

  • Create a file called queries.sql that implements the queries. Make sure to keep the variables <sid>, <stid>, and <timestamp> as placeholders (see example here). Each query should be added on a new line.

  • Create a script called launch.sh to launch the database (see example here).

  • Create a Python script called run_system.py to run the queries. The script should follow this template (a sketch of the main query loop is given after this list).

    • Note: The timestamp format should match the format expected by the target system (e.g., "YYYY-MM-DDTHH:mm:ss" for MonetDB, "YYYY-MM-DD HH:mm:ss" for QuestDB, etc.).
  • Add the name of your system's folder to config.py.

  • Execute the offline workload. The benchmark should report the runtime of the new system.

  • To execute the online workload, three additional scripts need to be added:

    • __init__.py: script that makes your folder usable as a Python module; it should follow this template. Replace "system" with "new_system_name".
    • start.py: script to launch your system that should follow this template.
    • add_data.py: script to add and delete data that should follow this template (a sketch is given below).
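
To illustrate the offline workload step, below is a minimal sketch of what a run_system.py built from the template might do: read queries.sql, substitute the <sid>, <stid>, and <timestamp> placeholders with concrete values, and time each query through the system's Python client. The module name new_system_client, the connection parameters, and the substituted values are assumptions for illustration; the actual template in this repository defines the exact structure.

```python
# Sketch of the run_system.py flow, assuming a DB-API-style client
# (new_system_client is hypothetical) and the placeholder convention of queries.sql.
import time
import new_system_client  # hypothetical: your system's Python client library

conn = new_system_client.connect(host="localhost", database="benchmark")
cur = conn.cursor()

# Example substitutions; the benchmark supplies the real values.
params = {
    "<sid>": "s0",                          # sensor column
    "<stid>": "1",                          # station id
    "<timestamp>": "2019-04-01T00:00:00",   # format depends on the system (see note above)
}

with open("queries.sql") as f:
    queries = [line.strip() for line in f if line.strip()]

for query in queries:
    for placeholder, value in params.items():
        query = query.replace(placeholder, value)
    start = time.time()
    cur.execute(query)
    cur.fetchall()
    print(f"{time.time() - start:.3f}s  {query}")

conn.close()
```

For the online workload, add_data.py inserts and deletes rows while queries are running. Below is a hedged sketch, again assuming a DB-API-style client and a table named sensors with the columns listed above (time, id_station, s0, ..., s99); the client module and SQL dialect are assumptions, and the real script should follow the provided template.

```python
# Sketch of add_data.py behaviour: insert a batch of new rows, then delete the oldest ones.
# Table and column layout follow the dataset description; the client module is hypothetical.
import new_system_client  # hypothetical: your system's Python client library

def add_and_delete(rows, cutoff_timestamp):
    conn = new_system_client.connect(host="localhost", database="benchmark")
    cur = conn.cursor()
    placeholders = ", ".join(["%s"] * len(rows[0]))  # paramstyle depends on the client
    cur.executemany(
        f"INSERT INTO sensors VALUES ({placeholders})",  # time, id_station, s0..s99
        rows,
    )
    cur.execute("DELETE FROM sensors WHERE time < %s", (cutoff_timestamp,))
    conn.commit()
    conn.close()
```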