As a modern automotive test engineer, I rely on automated solutions for the execution, reporting and evaluation of my test suites. The complexity of the systems under test, and with it the number of necessary tests, keeps growing tremendously. One of the tools that can help with these tasks is tracetronic test.guide. As a user of test.guide, I want a simple means to customize and structure my test reports.
This generator acts as a helper to create a test.guide-compatible test report. Specific Python classes reflecting the different elements of a test report (TestSuite, TestCase and so on) were designed in such a way that you can assemble your own testsuite from these objects. This facilitates the conversion of arbitrary test report formats into a .json which test.guide can handle. With this generator, it is no longer necessary to convert non-ATX formats into a test.guide .json by hand. Instead, you simply fill in the delivered Python classes, and the .json is generated for you. On top of this, early format checks are conducted, so you are notified right away if something is not compliant with the JSON schema. The currently supported test.guide JSON schema can be found here.
test.guide is a database application for the overview, analysis and follow-up processing of test procedures, which has been specially
developed for use in the automotive sector. It significantly facilitates the management of test resources. At the same time, it encourages
cross-role cooperation, thereby closing the gap between test execution and test management.
The tracetronic test.guide Report Generator project is part of
the Automotive DevOps Platform by tracetronic. With
the Automotive DevOps Platform, we go from the big picture to the details and unite all phases of vehicle software
testing – from planning the test scopes to summarizing the test results. At the same time, continuous monitoring across
all test phases always provides an overview of all activities – even with several thousand test executions per day and
in different test environments.
You can directly install the project from GitHub using pip:
# HTTP
pip install git+https://github.com/tracetronic/testguide_report-generator/
# SSH
pip install git+ssh://[email protected]/tracetronic/testguide_report-generator/
or by adding the testguide-report-generator to your dependency management file, such as requirements.txt or pyproject.toml.
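For example, a requirements.txt entry referencing the GitHub repository directly could look like this (a sketch using pip's direct URL reference syntax; the project name follows the wording above, and you may want to pin a tag or commit):

# requirements.txt (sketch): install straight from the GitHub repository
testguide-report-generator @ git+https://github.com/tracetronic/testguide_report-generator/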
The commands necessary to generate test.guide reports are collected, by way of example, in example_TestSuite.py. Run the example script to generate the .json and .zip files:
python example_TestSuite.py
The elements follow the hierarchy TestSuite --> TestCaseFolder --> TestCase --> TestStepFolder --> TestStep. So, instances of TestCase(Folder) are added to TestSuite, and instances of TestStep(Folder) are added to TestCase. At least one TestCase or TestStep has to be added to the respective folder (see Restrictions).
In the end, the report generator takes the assembled TestSuite and generates the report. The output is a .json report and a .zip file containing the generated test report along with possible testcase artifacts. The .zip file can be uploaded via the appropriate upload option in test.guide. The schema of the .json which test.guide expects can be found here.
A small example may look like this:
# import necessary classes for the TestSuite creation and the .json generator
from testguide_report_generator import TestSuite, TestCase, Verdict, Generator
def create_testsuite():
    # create the TestSuite object
    testsuite = TestSuite("All Tests", 1666698047000)

    # create the TestCase object
    testcase = TestCase("Test Brakes", 1666698047001, Verdict.FAILED)

    # add the TestCase to the TestSuite
    testsuite.add_testcase(testcase)

    # initialize the generator
    generator = Generator(testsuite)

    # execute the generator and export the result
    generator.export("output.json")


if __name__ == "__main__":
    create_testsuite()
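To illustrate the folder hierarchy described above, here is a small sketch that nests a TestCase into a TestCaseFolder and a TestStep into a TestStepFolder. The constructor arguments follow the overview table below; the add_* method names other than add_testcase, the top-level imports of the folder and step classes, and Verdict.PASSED are assumptions here, so please refer to example_TestSuite.py for the exact API.

# sketch of a nested testsuite; add_* methods other than add_testcase, the imports of the
# folder/step classes and Verdict.PASSED are assumed, see example_TestSuite.py for the exact API
from testguide_report_generator import (
    TestSuite, TestCaseFolder, TestCase, TestStepFolder, TestStep, Verdict, Generator
)

testsuite = TestSuite("Nightly Run", 1666698047000)

testcase = TestCase("Test ABS", 1666698047001, Verdict.PASSED)
step_folder = TestStepFolder("Precondition checks")
step_folder.add_teststep(TestStep("Ignition on", Verdict.PASSED))  # assumed method name
testcase.add_teststep_folder(step_folder)                          # assumed method name

testcase_folder = TestCaseFolder("Brakes")
testcase_folder.add_testcase(testcase)                             # assumed for folders
testsuite.add_testcase_folder(testcase_folder)                     # assumed method name

Generator(testsuite).export("nightly.json")  # yields the .json report and the .zip upload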
A more extensive example is given in example_TestSuite.py.
Class | Arguments | Description |
---|---|---|
TestStep | name of type string, verdict of type Verdict, (expected result of type string) | a fundamental teststep, is added to TestCase or TestStepFolder |
TestStepArtifact | filepath of type string, type of type TestStepArtifactType | artifact which gets attached directly to a teststep (such as plots) |
TestStepArtifactType | | the type of a teststep artifact (only used with TestStepArtifact) |
TestStepFolder | name of type string | contains teststeps or teststep folders, is added to TestCase |
TestCase | name of type string, timestamp of type int, verdict of type Verdict | a testcase, may contain teststeps or teststep folders, as well as further specific elements; is added to TestCaseFolder or TestSuite |
TestCaseFolder | name of type string | contains testcases or testcase folders, is added to TestSuite or TestCaseFolder |
TestSuite | name of type string, timestamp of type int | the testsuite, may contain TestCases or TestCaseFolders |
Verdict | | the verdict of the test object |
Artifact | filepath of type string | an optional artifact to an existing filepath, can be added to TestCase |
Parameter | name of type string, value of type string or int, direction of type Direction | a testcase parameter, can be added to TestCase |
Direction | | direction of a Parameter (only used with Parameter) |
Constant | key of type string, value of type string | a test constant, can be added to TestCase |
Attribute | key of type string, value of type string | a test attribute, can be added to TestCase |
Review | comment of type string, author of type string, timestamp of type int | a review, may contain further specific elements; is added to TestCase |
- (): arguments in parentheses are optional
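As a sketch of how the optional elements from the table might be attached to a TestCase: the constructor arguments are taken from the table, but the add_* method names, the top-level imports and the Direction member used here are assumptions; see example_TestSuite.py for the exact API.

# sketch: enriching a TestCase with optional elements from the table above;
# the add_* method names, the imports and Direction.IN are assumptions
from testguide_report_generator import (
    TestCase, Verdict, Artifact, Parameter, Direction, Constant, Attribute, Review
)

testcase = TestCase("Test Brakes", 1666698047001, Verdict.FAILED)
testcase.add_parameter(Parameter("target_speed", 120, Direction.IN))  # assumed method and member
testcase.add_constant(Constant("VEHICLE_VARIANT", "prototype"))       # assumed method name
testcase.add_attribute(Attribute("requirement", "REQ-42"))            # assumed method name
testcase.add_artifact(Artifact("logs/brake_test.log"))                # assumed method name, path must exist
testcase.add_review(Review("Failure caused by sensor noise.", "jane.doe", 1666698050000))  # assumed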
Please note that certain requirements for the creation of the test components need to be met in order to generate a valid .json. These include:
- at least one TestCase or TestCaseFolder within a TestSuite
- at least one TestCase within a TestCaseFolder
- at least one TestStep within a TestStepFolder
- names for TestSuite, TestCaseFolder, TestCase, TestStepFolder and TestStep between 1 and 120 characters
- Review comments between 10 and 10000 characters
- timestamps in milliseconds (epoch Unix time) for TestSuite and TestCase
A complete specification can be found in the schema.
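Since timestamps are expected in milliseconds, a simple way to produce one in Python, for example when creating a TestSuite from the current time, is:

import time

# current time in milliseconds since the Unix epoch, as expected by TestSuite and TestCase
timestamp_ms = int(time.time() * 1000)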
At the moment, no external contributions are intended and merge requests from forks will automatically be rejected! However, we do encourage you to file bugs and request features via the issue tracker.
The documentation of the project is formatted as reStructuredText. You can generate documentation pages from it with the help of tools such as Sphinx. All necessary files are located under docs/source. sphinx-apidoc is used to generate the referenced modules' .rst files. Use sphinx-build to generate the documentation in the desired format, e.g. HTML.
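For example, assuming the default directory layout and the package name used above (the paths are placeholders), the two steps could look roughly like this:

# regenerate the .rst files for the package modules into docs/source
sphinx-apidoc -o docs/source testguide_report_generator

# build the HTML documentation into docs/build
sphinx-build -b html docs/source docs/build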
If you have any questions, please contact us at [email protected] and mind our support page.
This project is licensed under the MIT license. More information can be found in the LICENSE file or within the LICENSES folder. Using the REUSE helper tool, you can run reuse spdx to get a bill of materials.