How `resultQuality` documents the quality of a result is not specified in SOSA. The examples we have so far focus on nominal or static measurement errors. Example B.9 (DHT22 Description) makes use of example classes to record the `FairQuality` of results, which is underspecified.
Some use cases:
GPS units and other "smart" sensors will report confidence intervals, HDOP, Circular Error Probable, or RMS95 along with a position.
Water-quality data where, in order to assess the `resultQuality` (e.g. turbidity, suspended materials, …), extra observations are taken at the same time (e.g. weather).
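One possible shape for the GPS case is sketched below in Turtle: the receiver-reported uncertainty hangs off the observation via `sosa:hasResultQuality`. The `ex:` namespace, the `ex:PositionalAccuracy` class, and the property names on the quality node are illustrative assumptions for discussion, not part of SOSA.

```turtle
@prefix sosa: <http://www.w3.org/ns/sosa/> .
@prefix xsd:  <http://www.w3.org/2001/XMLSchema#> .
@prefix ex:   <http://example.org/ns#> .   # hypothetical namespace

ex:obs1 a sosa:Observation ;
    sosa:hasSimpleResult "POINT(2.35 48.85)" ;   # the position fix itself
    sosa:hasResultQuality ex:obs1-quality .

# Quality node carrying the receiver-reported uncertainty (illustrative class
# and properties; a real pattern might reuse DQV or ssn-system terms instead)
ex:obs1-quality a ex:PositionalAccuracy ;
    ex:circularErrorProbable "2.5"^^xsd:double ;   # metres, 50% confidence
    ex:hdop "0.9"^^xsd:double .
```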
Conclusions:
Documenting these patterns is a must-have, but we will surely land in a "that depends on your use case" situation.
It will be chaos if we don't provide canonical patterns: any number of complex result formats and no interoperability.
In some cases we'll advise attaching quality tags ("raw data", "expert validated", …) to the Observation via `hasResultQuality`.
It may be possible to reuse some of our previous structures to do this.
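The tagging idea could reuse SKOS, one of the structures we already lean on: each quality tag becomes a `skos:Concept` that `hasResultQuality` points at directly. The `ex:` mini-vocabulary below is a hypothetical sketch, not an agreed pattern.

```turtle
@prefix sosa: <http://www.w3.org/ns/sosa/> .
@prefix skos: <http://www.w3.org/2004/02/skos/core#> .
@prefix ex:   <http://example.org/ns#> .   # hypothetical namespace

# Hypothetical mini-vocabulary of quality tags
ex:rawData         a skos:Concept ; skos:prefLabel "raw data"@en .
ex:expertValidated a skos:Concept ; skos:prefLabel "expert validated"@en .

# An observation tagged with one of them
ex:obs2 a sosa:Observation ;
    sosa:hasResultQuality ex:expertValidated .
```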
(Related ticket: #138; filled out from email exchanges with @sgrellet and @rob-metalinkage.)