Preparing assets for annotation

Registering your assets

Regardless of what you are trying to annotate, the most important step is to make your assets available to the framework. To do so, you will need to:

  1. Decide on a name for your asset, prepare your asset files, and put them somewhere that is web accessible.
    One way to do this is to put your assets under server/static/assets/<yourassetname>.

    Your assets are then accessible at http://localhost:8010/resources/assets/

    A logical way to organize your assets is by asset id. For example, for the semantic segmentation task you will need an input scan (.ply), an over-segmentation (.segs.json), and optionally a screenshot for previewing your asset (.png). These can be organized as follows:

       server/static/assets/<yourassetname>/<assetid>/<assetid>.ply        
       server/static/assets/<yourassetname>/<assetid>/<assetid>.segs.json        
       server/static/assets/<yourassetname>/<assetid>/<assetid>.png
    
  2. Prepare a metadata file describing your assets and an index listing them. The simplest index is a CSV file that lists your asset ids, along with any per-asset information you would like to store. If you plan to have many assets (> 10,000), you should use a database or Solr to index your assets.

    See server/static/data/nyuv2/nyuv2.json and server/static/data/nyuv2/nyuv2.csv for an example JSON metadata file and CSV listing of assets. Copy and modify them as needed for your assets.
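
    For example, a minimal CSV index might look like this (the ids and the description column are hypothetical; include whatever per-asset fields you need):

      id,description
      scene0001,kitchen scan
      scene0002,bedroom scan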

    Add your assets to server/static/data/assets-extra.json (used for web access to your assets) and ssc/data/assets.json (used for batch rendering - see step 3 below).
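
    The exact entry format should mirror the existing entries in those files; a hypothetical entry pointing at the metadata file for your assets might look like:

      { "name": "yourassetname", "metadata": "data/yourassetname/yourassetname.json" }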

  3. Render screenshots of your assets using ssc/render.js or any other external tool. We provide two scripts for rendering (see ssc for more details). You will need to cd into ssc and run npm install before the ssc scripts can be used.

    ssc/render.js fetches your assets remotely and requires that your assets be registered (as specified in step 2). Once your assets are registered, you can run:

      # Render specific asset
      NODE_BASE_URL=<path to your assets or server> ./render.js --source <yourassetname> --id <assetid>
      # Render all assets for given <yourassetname>
      NODE_BASE_URL=<path to your assets or server> ./render.js --source <yourassetname> --id all
      # Render assets with ids (one line each) specified in <ids_file>
      NODE_BASE_URL=<path to your assets or server> ./render.js --source <yourassetname> --ids_file <ids_file>
    

    ssc/render-file.js allows for rendering directly from a file (without registering your assets).
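
    For example (the --input flag is an assumption; run ./render-file.js --help to see the actual options):

      # Render a mesh directly from a local file (hypothetical path)
      ./render-file.js --input /path/to/<assetid>.ply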

    If you skip this step, your assets won't have preview pictures (e.g. the view in step 5 will show a blank space).

  4. Make your assets web accessible. You can either:

    • Symlink your data so it resides under server/static (see the sketch after this list), or
    • Update server/app/routes/proxy-rules.js so your assets are available on the server. This is only necessary if your assets are externally hosted; you can skip it if you already put your assets under server/static.
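
    For example, to symlink externally stored data into the static directory (the source path here is hypothetical):

      # run from the repository root
      ln -s /data/my-scans server/static/assets/yourassetname
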
  5. Create a view to browse your assets.

    An example view is at: server/proj/scannet/views/nyuv2-annotations.jade

    Run the getexamples.sh script in the repository root to download one example scene from NYUv2 that is hooked up to this view.

    The view can be accessed at: http://localhost:8010/scans/nyuv2

    Note that the above URL path is routed through the parameters in the server/proj/index.js file.

    The route is hooked up in server/proj/scannet/index.js:

       app.get('/nyuv2', function (req, res) { res.render('nyuv2-annotations', { baseUrl: config.baseUrl }); });
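
    A route for your own assets would follow the same pattern (the route and view names here are placeholders for whatever you create):

       // hypothetical route; adjust the path and view name to match your assets
       app.get('/yourassetname', function (req, res) { res.render('yourassetname-annotations', { baseUrl: config.baseUrl }); });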
    

    You can add a link to your view in the main page at server/static/html/index.html.
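
    For example, a link to the NYUv2 view above (adjust the path if your app is served under a base URL):

      <!-- link to the annotation view; the path matches the route above -->
      <a href="/scans/nyuv2">NYUv2 annotations</a>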

Example asset metadata files

Basic

Here is an example of a basic asset metadata file. It assumes the assets are placed under server/static/assets/yourassetname, and it includes the special per-asset fields segment-annotations-manual and scan-model-alignments, which are used to export annotations.

{
  "source": "yourassetname",
  "assetType": "scan",
  "rootPath": "${baseUrl}/assets/yourassetname",
  "screenShotPath": "${rootPath}/${id}/${id}.png",
  "hasThumbnail": false,
  "assetFields": ["segment-annotations-manual", "scan-model-alignments"],
  "formats": [
    { "format": "ply",
      "path": "${rootPath}/${id}/${id}.ply",
      "defaultUp": [ 0, 0, 1 ], "defaultFront": [ -1, 0, 0], "defaultUnit": 1,
      "materialSidedness": "Front",
      "useVertexColors": true,
      "computeNormals": true
    }
  ],
  "surfaces": {
    "format": "segmentGroups",
    "file":  "${rootPath}/${id}/${id}.segs.json"
  },
  "segment-annotations-manual": {
    "format": "segmentGroups",
    "files": {
      "annIds": "${baseUrl}/scans/segment-annotations/list?itemId=${fullId}&$columns=id,workerId,data&format=json&condition[$in]=manual",
      "segmentGroups": "${baseUrl}/scans/segment-annotations/aggregated?annId=${annId}",
      "segments": "${rootPath}/${id}/${id}.segs.json",
      "annotatedAssetIds": "${baseUrl}/scans/segment-annotations/list?$columns=itemId&format=json&condition[$in]=manual"
    }
  },
  "scan-model-alignments": {
    "files": {
      "annIds": "${baseUrl}/annotations/list?itemId=${fullId}&$columns=id,workerId,data&format=json&type=scan-model-align&condition[$in]=manual",
      "annotatedAssetIds": "${baseUrl}/annotations/list?$columns=itemId&format=json&type=scan-model-align&condition[$in]=manual"
    }
  }
}
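
In this file, fields such as ${rootPath}, ${id}, and ${fullId} are template variables that are filled in per asset. For a hypothetical asset with id scene0001, the paths above would resolve to, for example:

   screenshot: ${baseUrl}/assets/yourassetname/scene0001/scene0001.png
   mesh:       ${baseUrl}/assets/yourassetname/scene0001/scene0001.ply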

NYUv2

Here is an example asset metadata JSON file for mesh reconstructions of the NYUv2 dataset (${baseUrl} is the base URL path for the scene toolkit webapp). This example is provided with the code (server/static/data/scannet/nyuv2.json).

{
  "source": "nyuv2",
  "assetType": "scan",
  "rootPath": "${baseUrl}/nyuv2",
  "screenShotPath": "${rootPath}/${id}/${id}_vh_clean_2.png",
  "hasThumbnails": true,
  "formats": [
    { "format": "ply",
      "path": "${rootPath}/${id}/${id}_vh_clean_2.ply",
      "defaultUp": [ 0, 0, 1 ], "defaultFront": [ 0, -1, 0], "defaultUnit": 1,
      "useVertexColors": true,
      "computeNormals": true 
    }
  ],
  "surfaces": {
    "format": "segmentGroups",
    "files": {
      "segmentGroups": "${rootPath}/${id}/${id}.aggregation.json",
      "segments": "${rootPath}/${id}/${id}_vh_clean_2.0.010000.segs.json"
    }
  },
  "segment-annotations-raw": {
    "format": "segmentGroups",
    "files": {
      "annIds": "${baseUrl}/segment-annotations/list?itemId=${fullId}&$columns=id,workerId,data&format=json&condition[$in]=nyu-new-batch1",
      "segmentGroups": "${baseUrl}/segment-annotations/aggregated?annId=${annId}",
      "segments": "${rootPath}/${id}/${id}_vh_clean_2.0.010000.segs.json"
    }
  },
  "segment-annotations-clean": {
    "format": "segmentGroups",
    "files": {
      "annIds": "${baseUrl}/segment-annotations/list?itemId=${fullId}&$columns=id,workerId,data&$clean=true&status=cleaned&format=json&condition[$in]=nyu-new-batch1",
      "segmentGroups": "${baseUrl}/segment-annotations/aggregated?annId=${annId}&$clean=true&status=cleaned",
      "segments": "${rootPath}/${id}/${id}_vh_clean_2.0.010000.segs.json"
    }
  }
}

Preparing assets for specific annotation tasks

Please see Scan Annotation Pipeline for details on how to prepare your scans for annotation.
