# Testability check

Part of the Multi-team Software Delivery Assessment (README)

Copyright © 2018-2021 Conflux Digital Ltd

Licensed under CC BY-SA 4.0

Permalink: SoftwareDeliveryAssessment.com

Based on material from the following books:

Purpose: Assess the approach to testing and testability within the software system. 

Method: Use the Spotify Squad Health Check approach to assess the team's answers to the following questions, and also capture the answers:

| Question | Tired (1) | Inspired (5) |
| --- | --- | --- |
| 1. Test-first (classes) - What proportion of the time do you write the test first for methods and classes? | We often do not have time to use a test-first approach | We use a test-first approach all the time - it's the only way to get good software! |
| 2. Test-first (features) - What proportion of the time do you write the test first for features and behaviour? | We often do not have time to use a test-first approach | We use a test-first approach all the time - it's the only way to get good software! |
| 3. Unit Test % - At what code coverage level do you deem your Unit Tests to have succeeded? | Our unit tests succeed with 10% coverage | Our unit tests succeed with 80% or greater coverage |
| 4. Feature Tests % - At what feature coverage level do you deem your Feature Tests (or Behaviour Tests) to have succeeded? | Our feature tests succeed with 10% coverage | Our feature tests succeed with 100% coverage |
| 5. Feature Coverage - What proportion of the features in your code is covered by a Feature Test (or Behaviour Test)? | Less than 50% of our features have corresponding feature tests | Every one of our features has at least one corresponding feature test |
| 6. Test Data - What proportion of your test data is generated from scripts and automatically injected into data stores? (see the test-data sketch after this table) | We have manual processes for setting up test data | All our test data is generated from scripts and injected into data stores as part of automated testing |
| 7. Deployment - What proportion of your deployment pipeline code has tests covering the behaviour of build and deployment? | We do not test our build and deployment code | We have tests (such as Deployment Verification Tests) for the key parts of our build and deployment scripts, and the code is modular and well-structured |
| 8. Testability - What proportion of your time is spent on making the software testable? | We do not spend time making our software testable | We refactor regularly to make our software more testable - every sprint or week |
| 9. CDCs/Pact/SemVer - How much do you use inter-team testing approaches such as Consumer-Driven Contracts (CDCs)/Pact/Semantic Versioning? (see the contract-testing sketch after this table) | We just use the latest versions of each component or package | We use CDCs / Pact to help test interface changes. We use Semantic Versioning to communicate the meaning of changes, including any breaking changes. We strive to make no breaking changes at all using the Tolerant Reader pattern. |
| 10. Other Code - How confident are you in the code from other teams in the organisation that you work with or consume (but not write)? | Code from other teams is really flaky and unpredictable | We are confident in using code from other teams due to our comprehensive automated test suites |
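
To make question 6 more concrete, here is a minimal sketch of scripted test data generation and injection. It assumes a hypothetical SQLite-backed `customers` table; the table name, fields, and database path are illustrative only and not part of the assessment.

```python
# Illustrative sketch for question 6: generate test data from a script and
# inject it into a data store before automated tests run.
# The schema and file path are assumptions made for this example.
import random
import sqlite3
import string


def generate_customers(count: int) -> list[tuple[str, str]]:
    """Generate fake customer rows for test runs."""
    rows = []
    for _ in range(count):
        name = "customer-" + "".join(random.choices(string.ascii_lowercase, k=6))
        email = f"{name}@example.test"
        rows.append((name, email))
    return rows


def inject_test_data(db_path: str, rows: list[tuple[str, str]]) -> None:
    """Inject the generated rows into the data store used by the tests."""
    with sqlite3.connect(db_path) as conn:
        conn.execute("CREATE TABLE IF NOT EXISTS customers (name TEXT, email TEXT)")
        conn.executemany("INSERT INTO customers VALUES (?, ?)", rows)


if __name__ == "__main__":
    inject_test_data("test.db", generate_customers(50))
```

A script like this would typically run as a setup step in the pipeline, so every test run starts from known, reproducible data rather than hand-maintained fixtures.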
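
For question 9, the following is a simplified, hand-rolled illustration of the consumer-driven contract idea; a real team would typically use Pact or a similar tool. The service names, endpoint, and response fields are assumptions made for this example.

```python
# Illustrative sketch for question 9: the consumer team publishes the
# interactions it relies on as a contract, and the provider's pipeline
# replays them to verify it still honours those expectations.
# All names and fields here are hypothetical.
CONSUMER_CONTRACT = {
    "consumer": "billing-ui",
    "provider": "invoice-service",
    "interactions": [
        {
            "request": {"method": "GET", "path": "/invoices/42"},
            "response_must_include": {"id": 42, "status": "PAID"},
        }
    ],
}


def verify_provider(contract: dict, call_provider) -> None:
    """Replay each recorded interaction against the provider (injected as
    `call_provider`) and check that the promised fields are still present."""
    for interaction in contract["interactions"]:
        req = interaction["request"]
        actual = call_provider(req["method"], req["path"])
        for key, expected in interaction["response_must_include"].items():
            assert actual.get(key) == expected, (
                f"Contract broken for {req['path']}: {key}={actual.get(key)!r}, "
                f"expected {expected!r}"
            )


if __name__ == "__main__":
    # A fake provider stands in for the deployed service in this sketch.
    def fake_provider(method: str, path: str) -> dict:
        return {"id": 42, "status": "PAID", "currency": "GBP"}

    verify_provider(CONSUMER_CONTRACT, fake_provider)
    print("All consumer expectations met")
```

The point of the pattern is that an interface change which breaks a consumer fails the provider's own build, instead of being discovered later during integration or in production.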