
KHR_draco_mesh_compression #874

Merged: 50 commits, Feb 14, 2018
Changes shown are from 11 of the 50 commits.

Commits (50)
acfd558
Initial commit
fanzhanggoogle Mar 10, 2017
6472e25
Finished most of the Readme
fanzhanggoogle Mar 10, 2017
41432dc
Added json schema file
Mar 10, 2017
66dfde1
Finished the first version
Mar 10, 2017
eae186d
Fix style issue
Mar 10, 2017
3dff72b
Changed extension name
fanzhanggoogle Mar 13, 2017
54db399
Removed support for multiple compression libraries
fanzhanggoogle Mar 13, 2017
65e79d3
Added support for point cloud in the spec
fanzhanggoogle Mar 13, 2017
b132fa0
Fix
fanzhanggoogle Mar 13, 2017
7c48fb7
Refined alternative option
fanzhanggoogle Mar 13, 2017
007e570
Removed unused file
fanzhanggoogle Mar 13, 2017
7e7ff5d
Changes on lexaknyazev's comment
fanzhanggoogle Mar 14, 2017
ca7c8ee
Fixes
fanzhanggoogle Mar 14, 2017
ddf534f
Renaming
fanzhanggoogle Mar 14, 2017
1d4ab96
Fix
fanzhanggoogle Mar 14, 2017
251452d
Address Patrick's comments
fanzhanggoogle Mar 14, 2017
260534e
Added version property and description
fanzhanggoogle Mar 14, 2017
ea273be
Changed indicesCount to indexCount
fanzhanggoogle Mar 15, 2017
198792f
grammar fix
fanzhanggoogle Mar 15, 2017
5956b53
minor fix
fanzhanggoogle Mar 19, 2017
f60ff63
Removed indexCount and vertexCount
fanzhanggoogle Apr 19, 2017
61bd686
Removed attributes in the extension
fanzhanggoogle Apr 20, 2017
2c3a051
Define order of attributes
fanzhanggoogle Apr 24, 2017
3c14401
Fix
fanzhanggoogle Apr 24, 2017
a502855
minor fix
fanzhanggoogle Apr 24, 2017
fba95f4
Merge branch 'master' into KHR_mesh_compression
fanzhanggoogle Jul 27, 2017
c765c2f
Added draco extension to extension README
fanzhanggoogle Jul 27, 2017
0cd8f60
Fix for comments
fanzhanggoogle Aug 8, 2017
6102695
Add fallback for loaders not support the extension.
fanzhanggoogle Aug 9, 2017
63f07ee
Changed last paragraph about the fallback
fanzhanggoogle Aug 9, 2017
a8f4e06
Added support for gltf attribute types that are not defined in Draco
fanzhanggoogle Aug 10, 2017
d0d5fa8
Fixed comma
fanzhanggoogle Aug 10, 2017
1da7ae2
Fix duplicate description
fanzhanggoogle Aug 10, 2017
85673d4
fix nit and added 1) attributes in extension must be subset of attrib…
fanzhanggoogle Aug 10, 2017
99b131c
Removed version property, specify draco bitstream version for the ext…
fanzhanggoogle Aug 10, 2017
306e5bc
Refine conformance
fanzhanggoogle Aug 10, 2017
ae9c8d6
removed version in schema; removed alternative approach
fanzhanggoogle Aug 16, 2017
9039f52
Force to consistently use attribute id to get data
fanzhanggoogle Aug 16, 2017
0cdbd3c
Fix example
fanzhanggoogle Aug 29, 2017
d643d96
minor fix
fanzhanggoogle Sep 12, 2017
ae30e83
Removed point cloud.
fanzhanggoogle Oct 4, 2017
2cba5c2
typo
fanzhanggoogle Oct 9, 2017
e71f0d8
Removed unused image
fanzhanggoogle Oct 9, 2017
e3cce40
Wording fix
fanzhanggoogle Nov 1, 2017
66bfc6d
more wording fixes
fanzhanggoogle Nov 1, 2017
734b3cb
more fixes
fanzhanggoogle Nov 1, 2017
e42d251
Change Draft to Complete
fanzhanggoogle Nov 1, 2017
8e9a344
Change Conformance to Recommended Loader Process; change normative
fanzhanggoogle Nov 2, 2017
1eba79a
Make conformance normative
fanzhanggoogle Nov 2, 2017
cb478d9
Merge branch 'master' into KHR_mesh_compression
pjcozzi Feb 8, 2018
172 changes: 172 additions & 0 deletions extensions/Khronos/KHR_mesh_compression/README.md
@@ -0,0 +1,172 @@
# KHR_draco_geometry_compression
**Member:** This extension is proposed by Google, so should it use a GOOGLE (or EXT) prefix instead of KHR? Also, since it extends the mesh object, maybe it should be called `_mesh_draco_compression` to align with other extension names?

**Contributor Author:** We talked to Patrick and Neil, and the plan was to preferably propose it as a KHR extension and integrate it into glTF 2.0 if we can work something out before the upcoming 2.0 release. But we can start with GOOGLE and see how it goes, if that's better?

The Draco library can also compress point clouds; in that case the primitive simply will not have an indices accessor. Calling it "geometry" is better since it includes both meshes and point clouds.

**Member:** In glTF, point clouds are represented via the mesh object by using a different mesh.primitive.mode value, so they are meshes in glTF terms. I think the extension's name should reflect its formal relation to the core spec.

CC @pjcozzi

**Contributor Author:** I changed the name back to "mesh", removed geometryType from the extension, and added a description of the restriction to the README.


## Contributors

* Fan Zhang, Google, <mailto:[email protected]>
* Ondrej Stava, Google, <mailto:[email protected]>
* Frank Galligan, Google, <mailto:[email protected]>
* Kai Ninomiya, Google, <mailto:[email protected]>
* Patrick Cozzi, Cesium, [@pjcozzi](https://twitter.com/pjcozzi)

## Status

Draft

## Dependencies

Written against the glTF 2.0 spec.

## Overview

This extension defines a schema for using the Draco geometry compression library in the glTF format. It allows glTF to stream compressed geometry data instead of raw data.

The [conformance](#conformance) section specifies what an implementation must do when encountering this extension, and how the extension interacts with the attributes defined in the base specification.

## glTF Schema Updates

The Draco geometry compression library can be used with a `primitive` by adding an `extensions` property to the primitive and defining its `KHR_draco_geometry_compression` property.

The following picture shows the structure of the schema update.

**Figure 1**: Structure of the geometry compression extension.
![](figures/structure.png)

In general, the extension is used to point to the buffer that contains the compressed data. The major change is that the `accessor`s of an extended `primitive` no longer point to a `bufferView`. Instead, the `attributes` of the primitive use the decompressed data. This is valid because in glTF 2.0 a `bufferView` is not required in an `accessor`; normally, an accessor without a `bufferView` is used together with the `sparse` field to act as a sparse accessor. In this extension, the `bufferView` property is simply ignored.
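
For illustration only (the values are hypothetical), an accessor used by a compressed primitive could then look like this, with no `bufferView` at all:

```javascript
{
    "componentType" : 5126,
    "count" : 500,
    "type" : "VEC3",
    "max" : [1.0, 1.0, 1.0],
    "min" : [-1.0, -1.0, -1.0]
}
```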

Usage of the extension must be listed in `extensionsUsed` and `extensionsRequired`.

```javascript
"extensionsUsed" : [
"KHR_draco_geometry_compression"
]

"extensionsRequired" : [
"KHR_draco_geometry_compression"
]

```

The extension can then be used like this:

```javascript
"mesh" : {
    "primitives" : [
        {
            "attributes" : {
                "POSITION" : 11,
                "NORMAL" : 12,
                "TEXCOORD_0" : 13
            },
            "indices" : 10,
            "mode" : 4,
            "extensions" : {
                "KHR_draco_geometry_compression" : {
                    "buffer" : 10,
                    "byteOffset" : 1024,
                    "byteLength" : 10000,
```
**Member:** Why not use a `bufferView` object to address the Draco stream instead of "re-inventing" the same concept (`buffer`, `byteOffset`, and `byteLength`) here? Such a `bufferView` would have the `byteStride` and `target` properties undefined.

**Contributor Author:** We thought about that. The reason for not using `bufferView` was that we wanted to reduce the number of newly introduced layers. But I think it makes sense to use `bufferView` here. Changed here and in the JSON.

**Member:** Looks like `buffer`, `byteOffset`, and `byteLength` should be removed from the example.

**Contributor Author:** Done

"indicesCount" : 1000,
"vertexCount" : 500,
"geometryType" : "mesh",
"attributesOrder" : [
"POSITION",
"NORMAL",
"TEXCOORD_0"
],
"attributesComponentTypes" : [
"VEC3",
"VEC3",
"VEC2"
],
"attributesType" : [
5126,
5126,
5126
]
}
**Member:** attributesComponentTypes and attributesType must be switched to match the core spec definitions.

The attributesOrder, attributesComponentTypes, and attributesType arrays must be of the same size and must list elements in the same order, right? So why not combine these properties into one structure:

```javascript
{
    "attributes" : [
        {
            "semantic" : "POSITION",
            "type" : "VEC3",
            "componentType" : 5126
        },
        {
            "semantic" : "NORMAL",
            "type" : "VEC3",
            "componentType" : 5126
        },
        {
            "semantic" : "TEXCOORD_0",
            "type" : "VEC2",
            "componentType" : 5126
        }
    ]
}
```

`type: "VEC3"` and `componentType: 5126` could be made default, since we can expect them to be the most frequent values.

**Contributor Author:** Good idea! Changed.

```javascript
            }
        }
    ]
}
```
Each of these properties is explained in the following sections.
#### buffer
The `byteOffset` and `byteLength` properties define where the compressed data is located within the
`buffer`. The data should be passed to a mesh decoder and decompressed into a
mesh.
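
As a non-normative sketch, a loader could extract and decode the compressed data roughly like this; `decodeDracoMesh` is a hypothetical helper standing in for whatever Draco decoding entry point the implementation uses:

```javascript
// Sketch only: "extension" is the KHR_draco_geometry_compression object of a primitive,
// and buffers[extension.buffer] is assumed to be the already-loaded ArrayBuffer it refers to.
const compressed = new Uint8Array(
    buffers[extension.buffer],   // ArrayBuffer holding the glTF buffer data
    extension.byteOffset,        // start of the compressed stream
    extension.byteLength);       // length of the compressed stream

// Hand the bytes to a Draco decoder (hypothetical wrapper around the decoding API).
const decodedMesh = decodeDracoMesh(compressed);
```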

#### attributesOrder
The decompressed mesh needs to be written to memory for uploading to the GPU,
including the indices and attribute data. `attributesOrder` defines the
order of the attributes. For example, given the following
`attributesOrder`:

```javascript
"attributesOrder" : [
"POSITION",
"NORMAL",
"TEXCOORD_0"
]
```

then the attributes need to be written to the vertex buffer like this:

--------------------------------------------------
| POSITION | NORMAL | TEXCOORD_0 |
--------------------------------------------------
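
As a non-normative sketch, a loader could derive the byte offset of each attribute block from `attributesOrder`; `attributeByteLength` is a hypothetical helper returning the total byte size of one attribute's data (vertex count times element size):

```javascript
// Sketch only: lay out the decompressed attribute data as consecutive blocks,
// in exactly the order listed in attributesOrder.
let offset = 0;
const blockOffsets = {};
for (const semantic of extension.attributesOrder) {
    blockOffsets[semantic] = offset;
    offset += attributeByteLength(semantic); // hypothetical: vertexCount * bytes per element
}
// For ["POSITION", "NORMAL", "TEXCOORD_0"] this places POSITION at offset 0,
// NORMAL right after all positions, and TEXCOORD_0 right after all normals.
```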

#### indicesCount, vertexCount
`indicesCount` and `vertexCount` are reference values for verifying the decompression of the
mesh.
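
A loader might use them for a simple sanity check, roughly like this (the property names on the decoded mesh are hypothetical):

```javascript
// Sketch only: compare the decoded counts against the reference values in the extension.
if (decodedMesh.indexCount !== extension.indicesCount ||
    decodedMesh.vertexCount !== extension.vertexCount) {
    throw new Error("Draco decompression result does not match indicesCount/vertexCount.");
}
```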

#### geometryType
`geometryType` specifies whether the `primitive` is a mesh or a point cloud. If it is
a point cloud, there is no index data.

#### attributesComponentType, attributesType
`attributesComponentType` and `attributesType` duplicate properties of the
`accessor`s referenced by `attributes`. The purpose of having them in the extension is to
define how to write the decompressed mesh to a buffer for uploading to the GPU without
referring to the sibling `attributes` node.
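
As an illustrative sketch (following the JSON schema below, where `attributesComponentType` holds the numeric component types and `attributesType` holds the element types), the byte size of one element of attribute `i` can be computed like this:

```javascript
// Sketch only: byte size of one component, keyed by componentType.
const COMPONENT_BYTES = { 5120: 1, 5121: 1, 5122: 2, 5123: 2, 5125: 4, 5126: 4 };
// Number of components per element, keyed by type.
const TYPE_COMPONENTS = { SCALAR: 1, VEC2: 2, VEC3: 3, VEC4: 4, MAT2: 4, MAT3: 9, MAT4: 16 };

// e.g. a FLOAT (5126) VEC3 attribute is 4 * 3 = 12 bytes per element.
function elementByteLength(extension, i) {
    return COMPONENT_BYTES[extension.attributesComponentType[i]] *
           TYPE_COMPONENTS[extension.attributesType[i]];
}
```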

The extension currently does not support morph targets, i.e. the use of `targets` in a
`primitive`.

### JSON Schema

For full details on the `KHR_draco_geometry_compression` extension properties, see the schema:

* [extension property](schema/node.KHR_draco_geometry_compression.schema.json): the `KHR_draco_geometry_compression` extension object.

## Conformance

To process this extension, some changes need to be made to glTF loading:

* When a `primitive` with this extension is encountered for the first time, the extension must be processed first: get the data from the `buffer` referenced by the extension and decompress it into a geometry of a specific format, e.g. a Draco geometry.
* Write the indices and vertex data of the mesh to separate buffers. This allows uploading the data with different targets, including ARRAY_BUFFER and ELEMENT_ARRAY_BUFFER. When writing attributes to a buffer, the order should follow the `attributesOrder` property as described above.
* Then process the `attributes` and `indices` properties of the `primitive`. When handling an `accessor`, instead of getting the data through the regular `bufferView --> buffer` path, get the data from the buffers created in the previous step. A rough sketch of this flow is shown below.
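
The following is a rough, non-normative sketch of these steps; the helper functions (`decodeDracoMesh`, `writeIndices`, `writeAttributes`) are hypothetical placeholders for whatever decoding and buffer-writing machinery a loader already has:

```javascript
// Sketch only: process one primitive that carries the extension.
function processCompressedPrimitive(primitive, buffers) {
    const ext = primitive.extensions["KHR_draco_geometry_compression"];

    // 1. Decompress the Draco stream the first time this primitive is encountered.
    const compressed = new Uint8Array(buffers[ext.buffer], ext.byteOffset, ext.byteLength);
    const mesh = decodeDracoMesh(compressed); // hypothetical decoder wrapper

    // 2. Write indices and vertex data into separate buffers so they can be uploaded
    //    with different targets (ELEMENT_ARRAY_BUFFER vs. ARRAY_BUFFER); attributes
    //    are written in the order given by attributesOrder.
    const indexBuffer = writeIndices(mesh);                          // hypothetical
    const vertexBuffer = writeAttributes(mesh, ext.attributesOrder); // hypothetical

    // 3. Accessors of this primitive are later resolved against these buffers
    //    instead of following the usual bufferView -> buffer path.
    return { indexBuffer, vertexBuffer };
}
```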

This is fairly straightforward for top-down loading of a glTF file, e.g. the geometry data is only
decompressed when a `primitive` is met for the first time. However, for
bottom-up loading, loading an `accessor` before its `primitive` will not yield any data; it can only be handled when its parent `primitive` is processed. This is based on the consideration that
loading an `accessor` without knowing its parent `primitive` will rarely happen, and in glTF 2.0 it should be
easy enough to change a loader to ignore an `accessor` without a `bufferView`. But we are
definitely open to changing this if there are use cases that require
loading an `accessor` independently.

## Alternative Approach

We also considered other designs for adding geometry compression as an extension, but we do not think they are as good as the proposed one. We describe one here in case someone is interested.


The idea is to add an extension object, `decompressedBuffer`, to `bufferView` instead of `primitive`. See Figure 2 below for the structure of this alternative approach.
`decompressedBuffer` works like `bufferView` but points to compressed data. The extension basically declares that the data pointed to by the `bufferView` is compressed geometry data. For bottom-up loading, the process is to decompress the data when the `bufferView` is loaded and store it as the normal buffer of the `bufferView`, so after decompression the `bufferView` works the same with or without the extension. For top-down loading, when an `accessor` requests data through the `bufferView` for the first time, the loader needs to look at the decompressed mesh data of the `bufferView` instead of using the regular `bufferView` --> `buffer` path.
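
Purely as an illustration of this alternative (which was not adopted; all names and values here are hypothetical), a `bufferView` carrying such an extension might look roughly like this:

```javascript
// Hypothetical alternative design, not part of this proposal: the extension
// points at the compressed Draco stream, while the bufferView itself describes
// the decompressed data once it has been produced.
{
    "buffer" : 0,          // would no longer be used directly once decompressed
    "byteLength" : 36000,  // size of the decompressed data (made-up value)
    "extensions" : {
        "decompressedBuffer" : {
            "buffer" : 10,
            "byteOffset" : 1024,
            "byteLength" : 10000
        }
    }
}
```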


The advantage of this alternative approach is that it has less impact on the modularity of the layers of the standard glTF spec. For example, for bottom-up loading, it only requires loading `decompressedBuffer` between `buffer` and `bufferView` so that it can replace the data used by the `bufferView`. However, it changes the `bufferView` in that its `buffer` property is no longer used. Another disadvantage is that it is not good for high-level understanding and looks hacky. The proposed approach is much easier to understand, and the structure is clearer since the compression is applied to the mesh/primitive objects. But we are definitely open to discussing the choice we made.


**Figure 2**: Structure of the alternative extension approach.
![](figures/decompressed.png)
@@ -0,0 +1,73 @@
{
    "$schema" : "http://json-schema.org/draft-04/schema",
    "title" : "KHR_draco_geometry_compression extension",
    "type" : "object",
    "properties" : {
        "buffer" : {
            "allOf" : [ { "$ref" : "glTFid.schema.json" } ],
            "description" : "The index of the buffer."
        },
        "byteOffset" : {
            "type" : "integer",
            "description" : "The offset into the buffer in bytes.",
            "minimum" : 0
        },
        "byteLength" : {
            "type" : "integer",
            "description" : "The length of the compressed data in bytes.",
            "minimum" : 0
        },
        "indicesCount" : {
            "type" : "integer",
            "description" : "The number of indices of this primitive.",
            "minimum" : 0
        },
        "vertexCount" : {
            "type" : "integer",
            "description" : "The number of vertices of this primitive.",
            "minimum" : 0
        },
        "geometryType" : {
            "type" : "string",
            "description" : "The type of the geometry, mesh or point cloud.",
            "enum" : ["mesh", "pointcloud"],
**Member:** String-based enums should use uppercase.

**Contributor Author:** Done

        },
        "attributesOrder" : {
            "type" : "array",
            "description" : "The order of attributes in the decompressed buffer.",
            "items" : {
                "type" : "string"
            },
            "minItems" : 1,
            "maxItems" : 8,
            "gltf_detailedDescription" : "The decompressed mesh needs to be written to memory for uploading to the GPU, including indices and attribute data. `attributesOrder` is used to define the order of the attributes."
        },
        "attributesComponentType" : {
            "type" : "array",
            "description" : "The component type of each attribute that is encoded in the compressed mesh.",
            "items" : {
                "type" : "integer",
                "description" : "The datatype of components in the attribute.",
                "enum" : [5120, 5121, 5122, 5123, 5125, 5126],
                "gltf_enumNames" : ["BYTE", "UNSIGNED_BYTE", "SHORT", "UNSIGNED_SHORT", "UNSIGNED_INT", "FLOAT"]
            },
            "minItems" : 1,
            "maxItems" : 8,
            "gltf_detailedDescription" : "This is a duplicated property of each attribute that is compressed in the primitive. Used for defining how to write the decompressed mesh to a buffer correctly."
        },
        "attributesType" : {
            "type" : "array",
            "description" : "The type (scalar, vector, or matrix) of each attribute that is encoded in the compressed mesh.",
            "items" : {
                "type" : "string",
                "description" : "Specifies if the attribute is a scalar, vector, or matrix.",
                "enum" : ["SCALAR", "VEC2", "VEC3", "VEC4", "MAT2", "MAT3", "MAT4"]
            },
            "minItems" : 1,
            "maxItems" : 8,
            "gltf_detailedDescription" : "This is a duplicated property of each attribute that is compressed in the primitive. Used for defining how to write the decompressed mesh to a buffer correctly."
        }
    },
    "additionalProperties" : false,
    "required" : ["buffer", "byteOffset", "byteLength", "attributesOrder", "attributesComponentType", "attributesType"]
}