- for manual triggers, probably nothing extra needs to be done
- for defining schedules, ensure a proper error is returned when this is attempted (not allowed for this type of flow); see the schedule-rejection sketch after this list
- add unit tests
- connect the flow system and the compacting service to execute the `HardCompacting` flow
  - the compacting service should be configured with the options defined in the flow configuration, or with defaults if no configuration is active (see the configuration sketch at the end of this issue)
- propagate the compacting result status through the flow system up to the GraphQL API:
  - extend the flow summary/description/... for the UI so that enough information is conveyed
  - for compacting we are interested in how many blocks we had initially, how many we have in the end, and probably the total number of records (see the result sketch after this list)
  - add view API unit tests
- successful completion of a root compacting should initiate `ExecuteTransform` flows of derived datasets (see the chaining sketch after this list):
  - those datasets would fail for now
  - we need to ensure the error message is surfaced up to the API level
  - in the near future there will be another type of transformation, which resets previous results and re-calculates the derived dataset from scratch, but we are skipping this for now
- cover the flow system logic with solid integration tests (the "time diagram" type of tests already used for other flow system functions); a rough illustration follows below
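A minimal, standalone sketch of the schedule-rejection check mentioned above. The enum, error, and function names here are hypothetical and only illustrate the intent; they are not the actual flow system API.

```rust
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum DatasetFlowType {
    Ingest,
    ExecuteTransform,
    HardCompacting,
}

impl DatasetFlowType {
    /// Assumption: HardCompacting is manual-only for now.
    fn supports_schedules(self) -> bool {
        !matches!(self, DatasetFlowType::HardCompacting)
    }
}

#[derive(Debug, PartialEq, Eq)]
enum SetFlowConfigError {
    ScheduleNotSupportedForFlowType(DatasetFlowType),
}

fn set_schedule(flow_type: DatasetFlowType) -> Result<(), SetFlowConfigError> {
    if !flow_type.supports_schedules() {
        return Err(SetFlowConfigError::ScheduleNotSupportedForFlowType(flow_type));
    }
    // ... persist the schedule for flow types that allow it ...
    Ok(())
}

#[test]
fn schedule_is_rejected_for_hard_compacting() {
    assert!(set_schedule(DatasetFlowType::HardCompacting).is_err());
    assert!(set_schedule(DatasetFlowType::Ingest).is_ok());
}
```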
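A hypothetical sketch of the compacting outcome carried from the compacting service, through the flow system, up to the GraphQL view layer. All names are assumptions for illustration, not the actual API.

```rust
/// Outcome of a finished HardCompacting flow.
#[derive(Debug, Clone)]
struct CompactingResult {
    /// Number of metadata blocks before compacting
    old_num_blocks: usize,
    /// Number of metadata blocks after compacting
    new_num_blocks: usize,
    /// Total number of records in the compacted data
    num_records: u64,
}

/// UI-facing one-line description derived from the result.
fn describe(res: &CompactingResult) -> String {
    format!(
        "Compacted {} blocks into {} ({} records total)",
        res.old_num_blocks, res.new_num_blocks, res.num_records
    )
}

fn main() {
    let res = CompactingResult {
        old_num_blocks: 120,
        new_num_blocks: 4,
        num_records: 1_000_000,
    };
    println!("{}", describe(&res));
}
```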
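A hypothetical sketch of initiating "ExecuteTransform" flows for derived datasets once the root compacting completes successfully. The trait and type names are illustrative only; errors are kept as strings so the failures can be surfaced up to the API level.

```rust
#[derive(Debug, Clone, PartialEq, Eq)]
struct DatasetID(String);

trait DependencyGraph {
    /// Datasets directly derived from the given root dataset.
    fn downstream_of(&self, root: &DatasetID) -> Vec<DatasetID>;
}

trait FlowService {
    /// Schedules an ExecuteTransform flow; for now these are expected to
    /// fail for datasets derived from a hard-compacted root.
    fn trigger_execute_transform(&self, dataset: &DatasetID) -> Result<(), String>;
}

/// Called when a root HardCompacting flow finishes successfully; returns
/// the per-dataset outcomes so failures can be reported through the API.
fn on_root_compacting_succeeded(
    root: &DatasetID,
    deps: &dyn DependencyGraph,
    flows: &dyn FlowService,
) -> Vec<(DatasetID, Result<(), String>)> {
    deps.downstream_of(root)
        .into_iter()
        .map(|dataset| {
            let outcome = flows.trigger_execute_transform(&dataset);
            (dataset, outcome)
        })
        .collect()
}
```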
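A rough, invented illustration of the "time diagram" testing style: events are recorded against a simulated clock and the rendered timeline is compared to an expected snapshot. The notation below is made up for this sketch; the real flow system tests define their own harness and format.

```rust
use std::time::Duration;

#[derive(Default)]
struct Timeline {
    now: Duration,
    lines: Vec<String>,
}

impl Timeline {
    fn advance(&mut self, by: Duration) {
        self.now += by;
    }
    fn record(&mut self, event: &str) {
        self.lines.push(format!("{:>4}s: {}", self.now.as_secs(), event));
    }
    fn render(&self) -> String {
        self.lines.join("\n")
    }
}

#[test]
fn hard_compacting_triggers_downstream_transform() {
    let mut t = Timeline::default();
    t.record("HardCompacting(root) started");
    t.advance(Duration::from_secs(5));
    t.record("HardCompacting(root) succeeded");
    t.record("ExecuteTransform(derived) scheduled");

    let expected = [
        "   0s: HardCompacting(root) started",
        "   5s: HardCompacting(root) succeeded",
        "   5s: ExecuteTransform(derived) scheduled",
    ]
    .join("\n");

    assert_eq!(t.render(), expected);
}
```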
User requirements:

Technical requirements:

- new flow type `HardCompacting`, as we will have `SoftCompacting` at some future point
- compacting configuration options (`max_slice_size`, `max_slice_records`) for the `HardCompacting` flow
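A hypothetical sketch of the compacting configuration options named above. The field names mirror the option names from the requirements, but the struct, the fallback helper, and the default values are placeholders, not the project's actual API or defaults.

```rust
#[derive(Debug, Clone, Copy)]
struct CompactingConfig {
    /// Maximum size in bytes of a single compacted data slice
    max_slice_size: u64,
    /// Maximum number of records in a single compacted data slice
    max_slice_records: u64,
}

impl Default for CompactingConfig {
    fn default() -> Self {
        // Placeholder defaults applied when no flow configuration is active
        Self {
            max_slice_size: 300 * 1024 * 1024, // illustrative only
            max_slice_records: 10_000,         // illustrative only
        }
    }
}

/// The compacting service is configured from the flow configuration when one
/// is active, otherwise it falls back to the defaults.
fn effective_config(flow_config: Option<CompactingConfig>) -> CompactingConfig {
    flow_config.unwrap_or_default()
}
```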