Commit b6930e5
Support other units than nm for datasets (#7783)

* WIP: adding unit setting to datasource settings
* WIP: backend: voxel size as object
* WIP: Adding unit to dataset scale [ci skip]
  - keep nm as the internal base unit
  - use the configured unit for 3d space / voxel space
  - WIP: postfix all relevant functions with Nm or Vx depending on their unit to make this as explicit as possible
* finish migrating to new dataset scale format [ci skip]
  - current state: broken; nothing is rendered
* adjust nml schema to backend changes
* Fix initial dataset loading with scale in micrometers
  - add a lot of debugging output for testing [ci skip]
* Fix td viewport camera init (calculate in datasource unit, not always in nm) [ci skip]
  - Don't ignore far plane calculation
* undo accidental semantic change
* support conversion between larger and smaller units in 1d-3d
  - Full support for uncommon units is missing for 2d and 3d
* format 1d length units in datasource unit to avoid precision loss
* WIP: Support formatting for 2d and 3d values without always converting to nm [ci skip]
* unify format function to work with multi-dimensional values, not only 1d, and fix lots of tests [ci skip]
* finish writing more tests for format utils
* include all unit formats in the datasource json validation in the frontend
* Add support to convert to cm
  - WIP: Fix tests now that cm is supported
* Add tests for uncommon units [ci skip]
* rename "datasourceUnit" to "unit" everywhere
* WIP: remove complex tests and add example-based tests [ci skip]
* add more format utils tests and fix some tests
* fix / remove todos introduced by this pr
* remove debugging console.logs
* fix datasets e2e test
* redo: use shorter decimals in scalebar
* WIP: voxel unit support backend
* fix vx distance calculation of shortest path between nodes
* improve previous vx length calculation fix
* use voxel size unit in more spots
* schema, evolutions
* snapshots
* rename LengthUnit to Unit in frontend
* further clean up of frontend code
* more code cleanup frontend
* update datasource json frontend validation
* add changelog entry
* log warning in case a mesh is loaded whose scale does not match the dataset's scale
* bump api version to 7, drop 1 and 2
* document that ad-hoc meshing uses voxel size factor only
* unused imports
* WIP: use ngff units during explore
* read ngff voxel size unit during explore
* implement backend pr feedback part 1
* update nml spec snapshot
* report unit in ngff streaming; change composeRequest to expect full voxel size with unit
* format
* unused import
* Fix uploading zipped tiff (should set needsConversion=true)
* changelog
* rephrase code comment
* WIP: apply feedback
* Add unit to params of wkw conversion job for dataset upload
* apply pr feedback
* fix adhoc mesh computation
* rename datasetScale to voxelSize in frontend, and remove the datasetScale/voxelSize uniform from shaders, as it was unused
* power-of-two check on all mag dimensions
* adapt voxelSize to nanometer for worker for now
* use term REST API in changelog after all
* use long lengthUnit names as default format
* snapshots
* migrate frontend to use long unit names except for format utils
* rename UnitShortMap to LongUnitToShortUnitMap
* fix flycam reducer tests

Co-authored-by: Florian M <[email protected]>
Co-authored-by: Florian M <[email protected]>
Parent commit: 514b8d5
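The core change across these commits is that a dataset's scale is no longer a bare nanometer vector but a factor plus a unit, kept in nanometer as the internal base unit. A minimal standalone sketch of that idea (class and map names are illustrative, not the actual `com.scalableminds.webknossos.datastore.models` API):

```scala
// Sketch of a voxel size that carries a unit; nanometer stays the base unit.
object VoxelSizeSketch {
  // Nanometers per unit for a few of the ome/ngff length units (assumed subset).
  val nanometersPerUnit: Map[String, Double] = Map(
    "nanometer" -> 1.0,
    "micrometer" -> 1e3,
    "millimeter" -> 1e6,
    "centimeter" -> 1e7
  )

  final case class Vec3Double(x: Double, y: Double, z: Double) {
    def *(s: Double): Vec3Double = Vec3Double(x * s, y * s, z * s)
  }

  final case class VoxelSize(factor: Vec3Double, unit: String) {
    // Convert the per-voxel scale factor to the internal nanometer base unit.
    def toNanometer: Vec3Double = factor * nanometersPerUnit(unit)
  }
}
```

For example, a voxel size of `(2, 2, 1)` in micrometers resolves to `(2000, 2000, 1000)` nm internally.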

110 files changed: +1469 −1127 lines

CHANGELOG.unreleased.md (+3)

@@ -16,6 +16,7 @@ For upgrade instructions, please check the [migration guide](MIGRATIONS.released
 - Added the option to set a default mapping for a dataset in the dataset view configuration. The default mapping is loaded when the dataset is opened and the user / url does not configure something else. [#7858](https://github.com/scalableminds/webknossos/pull/7858)
 - Uploading an annotation into a dataset that it was not created for now also works if the dataset is in a different organization. [#7816](https://github.com/scalableminds/webknossos/pull/7816)
 - When downloading + reuploading an annotation that is based on a segmentation layer with active mapping, that mapping is now still be selected after the reupload. [#7822](https://github.com/scalableminds/webknossos/pull/7822)
+- Added the ability to change the unit of the dataset voxel size to any supported unit of the [ome/ngff standard](https://github.com/ome/ngff/blob/39605eec64ceff481bb3a98f0adeaa330ab1ef26/latest/index.bs#L192). This allows users to upload and work with low-resolution datasets with a different base unit than nanometer. [#7783](https://github.com/scalableminds/webknossos/pull/7783)
 - In the Voxelytics workflow list, the name of the WEBKNOSSOS user who started the job is displayed. [#7794](https://github.com/scalableminds/webknossos/pull/7795)
 - Start an alignment job (aligns the section in a dataset) via the "AI Analysis" button. [#7820](https://github.com/scalableminds/webknossos/pull/7820)
 - Added additional validation for the animation job modal. Bounding boxes must be larger then zero. [#7883](https://github.com/scalableminds/webknossos/pull/7883)
@@ -37,6 +38,8 @@ For upgrade instructions, please check the [migration guide](MIGRATIONS.released
 - Fixed that dataset composition did not work when selecting only one dataset for composition. [#7889](https://github.com/scalableminds/webknossos/pull/7889)

 ### Removed
+- REST API versions 1 and 2 are no longer supported. Current is 7.
+data
 - If the datasource-properties.json file for a dataset is missing or contains errors, WEBKNOSSOS no longer attempts to guess its contents from the raw data. Exploring remote datasets will still create the file. [#7697](https://github.com/scalableminds/webknossos/pull/7697)

 ### Breaking Changes
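Several of the commit bullets concern formatting lengths in the dataset's own unit with shorter decimals, instead of always printing nm. A minimal sketch of such a display helper (the helper and unit table are invented for illustration, not the actual frontend format utils):

```scala
// Sketch: pick a readable display unit for a length given in nanometers.
object FormatSketch {
  // (short unit name, size of one unit in nanometers), ascending.
  private val units: List[(String, Double)] =
    List("nm" -> 1.0, "µm" -> 1e3, "mm" -> 1e6, "cm" -> 1e7, "m" -> 1e9)

  // Choose the largest unit not larger than the value, then print one decimal,
  // in the spirit of the commit's "use shorter decimals in scalebar" change.
  def formatLengthNm(lengthNm: Double): String = {
    val (name, size) = units.takeWhile(_._2 <= lengthNm).lastOption.getOrElse(units.head)
    f"${lengthNm / size}%.1f $name"
  }
}
```

For instance, 1234 nm would render as "1.2 µm" rather than a long nanometer figure.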

MIGRATIONS.unreleased.md (+1)

@@ -17,3 +17,4 @@ User-facing changes are documented in the [changelog](CHANGELOG.released.md).
 - [114-ai-models.sql](conf/evolutions/114-ai-models.sql)
 - [115-annotation-locked-by-user.sql](conf/evolutions/115-annotation-locked-by-user.sql)
 - [116-drop-overtimemailinglist.sql](conf/evolutions/116-drop-overtimemailinglist.sql)
+- [117-voxel-size-unit.sql](conf/evolutions/117-voxel-size-unit.sql)

app/controllers/AnnotationController.scala (+2 −2)

@@ -398,8 +398,8 @@ class AnnotationController @Inject()(
     allItems.grouped(batchSize).toList
   }

-  private def percent(done: Int, todo: Int) = {
-    val value = done.toDouble / todo.toDouble * 100
+  private def percent(done: Int, pending: Int) = {
+    val value = done.toDouble / pending.toDouble * 100
     f"$value%1.1f %%"
   }

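The renamed helper in this hunk uses Scala's `f` interpolator, where `%%` escapes a literal percent sign. Extracted as a standalone function it behaves like:

```scala
// Standalone copy of the percent helper from the diff above:
// formats done/pending as a one-decimal percentage string.
def percent(done: Int, pending: Int): String = {
  val value = done.toDouble / pending.toDouble * 100
  f"$value%1.1f %%" // "%1.1f" -> one decimal place, "%%" -> literal "%"
}
```

So `percent(1, 4)` yields "25.0 %".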
app/controllers/AnnotationIOController.scala (+3 −3)

@@ -424,7 +424,7 @@ class AnnotationIOController @Inject()(
           "temp",
           fetchedAnnotationLayers,
           Some(annotation),
-          dataset.scale,
+          dataset.voxelSize,
           None,
           organizationName,
           conf.Http.uri,
@@ -452,7 +452,7 @@ class AnnotationIOController @Inject()(
             volumeVersion,
             skipVolumeData,
             volumeDataZipFormat,
-            dataset.scale)
+            dataset.voxelSize)
       } ?~> "annotation.download.fetchVolumeLayer.failed"
       fetchedSkeletonLayers: List[FetchedAnnotationLayer] <- Fox.serialCombined(annotation.skeletonAnnotationLayers) {
         skeletonAnnotationLayer =>
@@ -464,7 +464,7 @@ class AnnotationIOController @Inject()(
           name,
           fetchedSkeletonLayers ::: fetchedVolumeLayers,
           Some(annotation),
-          dataset.scale,
+          dataset.voxelSize,
           None,
           organizationName,
           conf.Http.uri,

app/controllers/JobController.scala (+10 −4)

@@ -1,7 +1,7 @@
 package controllers

 import play.silhouette.api.Silhouette
-import com.scalableminds.util.geometry.Vec3Int
+import com.scalableminds.util.geometry.{BoundingBox, Vec3Double, Vec3Int}
 import com.scalableminds.util.accesscontext.GlobalAccessContext
 import com.scalableminds.util.tools.Fox
 import models.dataset.{DataStoreDAO, DatasetDAO, DatasetService}
@@ -19,7 +19,7 @@ import java.util.Date
 import javax.inject.Inject
 import scala.concurrent.ExecutionContext
 import com.scalableminds.util.enumeration.ExtendedEnumeration
-import com.scalableminds.util.geometry.BoundingBox
+import com.scalableminds.webknossos.datastore.models.{LengthUnit, VoxelSize}
 import models.team.PricingPlan

 object MovieResolutionSetting extends ExtendedEnumeration {
@@ -109,12 +109,18 @@ class JobController @Inject()(
   }

   // Note that the dataset has to be registered by reserveUpload via the datastore first.
-  def runConvertToWkwJob(organizationName: String, datasetName: String, scale: String): Action[AnyContent] =
+  def runConvertToWkwJob(organizationName: String,
+                         datasetName: String,
+                         scale: String,
+                         unit: Option[String]): Action[AnyContent] =
     sil.SecuredAction.async { implicit request =>
       log(Some(slackNotificationService.noticeFailedJobRequest)) {
         for {
           organization <- organizationDAO.findOneByName(organizationName) ?~> Messages("organization.notFound",
                                                                                        organizationName)
+          voxelSizeFactor <- Vec3Double.fromUriLiteral(scale).toFox
+          voxelSizeUnit <- Fox.runOptional(unit)(u => LengthUnit.fromString(u).toFox)
+          voxelSize = VoxelSize.fromFactorAndUnitWithDefault(voxelSizeFactor, voxelSizeUnit)
           _ <- bool2Fox(request.identity._organization == organization._id) ~> FORBIDDEN
           dataset <- datasetDAO.findOneByNameAndOrganization(datasetName, organization._id) ?~> Messages(
             "dataset.notFound",
@@ -124,7 +130,7 @@ class JobController @Inject()(
             "organization_name" -> organizationName,
             "organization_display_name" -> organization.displayName,
             "dataset_name" -> datasetName,
-            "scale" -> scale
+            "scale" -> voxelSize.toNanometer.toUriLiteral
           )
           job <- jobService.submitJob(command, commandArgs, request.identity, dataset._dataStore) ?~> "job.couldNotRunCubing"
           js <- jobService.publicWrites(job)

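The new controller code parses the scale URI literal, resolves the optional unit (defaulting to nanometer), and hands the worker a nanometer-based scale, since the commit keeps wkw conversion in nm for now. Stripped of the `Fox` effect wrappers, the flow can be sketched as follows; the real code uses `Vec3Double.fromUriLiteral`, `LengthUnit.fromString`, and `VoxelSize.fromFactorAndUnitWithDefault`, while the names below are simplified stand-ins:

```scala
// Standalone sketch of runConvertToWkwJob's argument handling.
object ConvertJobArgsSketch {
  final case class Vec3Double(x: Double, y: Double, z: Double)

  // Parse an "x,y,z" URI literal; None on malformed input.
  def fromUriLiteral(s: String): Option[Vec3Double] =
    s.split(",").map(_.trim.toDoubleOption) match {
      case Array(Some(x), Some(y), Some(z)) => Some(Vec3Double(x, y, z))
      case _                                => None
    }

  // Nanometers per unit for a few supported length units (assumed subset).
  private val nmPerUnit = Map("nanometer" -> 1.0, "micrometer" -> 1e3, "millimeter" -> 1e6)

  // Default to nanometer when no unit is given, then convert the factor to nm
  // and serialize it back to a URI literal for the worker's "scale" argument.
  def scaleArgInNanometer(scale: String, unit: Option[String]): Option[String] =
    for {
      factor <- fromUriLiteral(scale)
      nm <- nmPerUnit.get(unit.getOrElse("nanometer"))
    } yield s"${factor.x * nm},${factor.y * nm},${factor.z * nm}"
}
```

A scale of "2.0,2.0,1.0" with unit micrometer thus reaches the worker as "2000.0,2000.0,1000.0", while a missing unit leaves the values unscaled.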