Updated Draco to 1.2.2
Fixed issues when parsing ill-formatted .obj files, plus various other bug fixes
Parent: a7c0d80a71
Commit: 441d5e05f7
@@ -5,8 +5,12 @@
 News
 =======
 
+### Version 1.2.2 release
+The latest version of Draco brings a number of small bug fixes
+* Fixed issues when parsing ill-formatted .obj files
+
 ### Version 1.2.1 release
-The latest version of Draco brings a number of enhancements to reduce decoder size and various other fixes
+The 1.2.1 version of Draco brings a number of enhancements to reduce decoder size and various other fixes
 * Javascript and WebAssembly decoder size reduced by 35%
 * Added specialized Javascript and Webassembly decoders for GLTF (size reduction about 50% compared to the previous version)
javascript/npm/draco3dgltf/README.md (new file, 56 lines)

<p align="center">
  <img src="https://github.com/google/draco/raw/master/docs/DracoLogo.jpeg" />
</p>

Description - glTF Draco Mesh Compression Extension
===================================================

[Draco] is a library for compressing and decompressing 3D geometric [meshes] and [point clouds]. It is intended to improve the storage and transmission of 3D graphics.
The [GL Transmission Format (glTF)](https://github.com/KhronosGroup/glTF) is an API-neutral runtime asset delivery format. glTF bridges the gap between 3D content creation tools and modern 3D applications by providing an efficient, extensible, interoperable format for the transmission and loading of 3D content.

This package is a build for encoding/decoding the [Draco mesh compression extension](https://github.com/KhronosGroup/glTF/pull/874) in the glTF specification. It can be used to compress the meshes in glTF assets or to decode the buffer data that belongs to a Draco mesh compression extension. For more detail, please read the extension spec.

TODO: Add glTF branch url.

NPM Package
===========

The code shows a simple example of using the Draco encoder and decoder with Node.js.
`draco_encoder_node.js` and `draco_decoder_node.js` are modified Javascript
encoding/decoding files that are compatible with Node.js.
`draco_nodejs_example.js` has the example code for usage.
Here we use a standalone Draco file as an example; when used with glTF assets, the
input is instead the buffer data contained in the asset's binary data.
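
For a quick orientation, here is a minimal sketch of loading the package and creating the encoder and decoder modules (these are the same calls used at the top of `draco_nodejs_example.js`):

~~~~~ javascript
// Load the package and create the encoder/decoder modules once; they can be
// reused for any number of encode/decode calls.
const draco3dgltf = require('draco3dgltf');
const decoderModule = draco3dgltf.createDecoderModule({});
const encoderModule = draco3dgltf.createEncoderModule({});
~~~~~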

How to run the code:

(1) Install the draco3dgltf package:

~~~~~ bash
$ npm install draco3dgltf
~~~~~

(2) Run the example code to test:

~~~~~ bash
$ cp node_modules/draco3dgltf/draco_nodejs_example.js .
$ cp node_modules/draco3dgltf/bunny.drc .
$ node draco_nodejs_example.js
~~~~~

The code loads the [Bunny] model; it first decodes it to a mesh
and then encodes it with different settings.

glTF Extension
==============

The above example shows how to decode compressed data from a binary file. To use it with glTF assets, the decoder should be applied to the data of the `bufferView` that belongs to a Draco extension. Please see the spec for detailed instructions on loading/exporting the Draco extension.
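
A rough sketch of that flow, assuming the bytes of the compressed `bufferView` have already been extracted into a typed array (the calls mirror `decodeDracoData()` in `draco_nodejs_example.js`; `decodeBufferView` is just an illustrative helper name):

~~~~~ javascript
// `bufferViewData` stands for the raw bytes of the bufferView referenced by
// the glTF Draco extension. The full example also checks
// decoder.GetEncodedGeometryType(buffer) before decoding.
function decodeBufferView(bufferViewData) {
  const buffer = new decoderModule.DecoderBuffer();
  buffer.Init(new Int8Array(bufferViewData), bufferViewData.byteLength);

  const decoder = new decoderModule.Decoder();
  const mesh = new decoderModule.Mesh();
  decoder.DecodeBufferToMesh(buffer, mesh);

  decoderModule.destroy(buffer);
  decoderModule.destroy(decoder);
  return mesh;  // Free it with decoderModule.destroy(mesh) when done.
}
~~~~~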

References
==========
[Draco]: https://github.com/google/draco
[Bunny]: https://graphics.stanford.edu/data/3Dscanrep/

Bunny model from Stanford's graphics department <https://graphics.stanford.edu/data/3Dscanrep/>
javascript/npm/draco3dgltf/bunny.drc (new binary file, not shown)
javascript/npm/draco3dgltf/draco3dgltf.js (new file, 11 lines)

/**
 * @fileoverview Main file for draco3d package.
 */

var createEncoderModule = require('./draco_encoder_gltf_nodejs');
var createDecoderModule = require('./draco_decoder_gltf_nodejs');

module.exports = {
  createEncoderModule,
  createDecoderModule
}
javascript/npm/draco3dgltf/draco_decoder_gltf_nodejs.js (new file, 26 lines; diff suppressed because one or more lines are too long)
javascript/npm/draco3dgltf/draco_encoder_gltf_nodejs.js (new file, 27 lines; diff suppressed because one or more lines are too long)
javascript/npm/draco3dgltf/draco_nodejs_example.js (new file, 150 lines)

// Copyright 2017 The Draco Authors.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//     http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
//
'use strict';

const fs = require('fs');
const assert = require('assert');
const draco3dgltf = require('draco3dgltf');
const decoderModule = draco3dgltf.createDecoderModule({});
const encoderModule = draco3dgltf.createEncoderModule({});

fs.readFile('./bunny.drc', function(err, data) {
  if (err) {
    return console.log(err);
  }
  console.log("Decoding file of size " + data.byteLength + " ..");
  // Decode mesh
  const decoder = new decoderModule.Decoder();
  const decodedGeometry = decodeDracoData(data, decoder);
  // Encode mesh
  encodeMeshToFile(decodedGeometry, decoder);

  decoderModule.destroy(decoder);
  decoderModule.destroy(decodedGeometry);
});

function decodeDracoData(rawBuffer, decoder) {
  const buffer = new decoderModule.DecoderBuffer();
  buffer.Init(new Int8Array(rawBuffer), rawBuffer.byteLength);
  const geometryType = decoder.GetEncodedGeometryType(buffer);

  let dracoGeometry;
  let status;
  if (geometryType === decoderModule.TRIANGULAR_MESH) {
    dracoGeometry = new decoderModule.Mesh();
    status = decoder.DecodeBufferToMesh(buffer, dracoGeometry);
  } else {
    const errorMsg = 'Error: Unknown geometry type.';
    console.error(errorMsg);
  }
  decoderModule.destroy(buffer);

  return dracoGeometry;
}

function encodeMeshToFile(mesh, decoder) {
  const encoder = new encoderModule.Encoder();
  const meshBuilder = new encoderModule.MeshBuilder();
  // Create a mesh object for storing mesh data.
  const newMesh = new encoderModule.Mesh();

  const numFaces = mesh.num_faces();
  const numIndices = numFaces * 3;
  const numPoints = mesh.num_points();
  const indices = new Uint32Array(numIndices);

  console.log("Number of faces " + numFaces);
  console.log("Number of vertices " + numPoints);

  // Add Faces to mesh
  const ia = new decoderModule.DracoInt32Array();
  for (let i = 0; i < numFaces; ++i) {
    decoder.GetFaceFromMesh(mesh, i, ia);
    const index = i * 3;
    indices[index] = ia.GetValue(0);
    indices[index + 1] = ia.GetValue(1);
    indices[index + 2] = ia.GetValue(2);
  }
  decoderModule.destroy(ia);
  meshBuilder.AddFacesToMesh(newMesh, numFaces, indices);

  const attrs = {POSITION: 3, NORMAL: 3, COLOR: 3, TEX_COORD: 2};

  Object.keys(attrs).forEach((attr) => {
    const stride = attrs[attr];
    const numValues = numPoints * stride;
    const decoderAttr = decoderModule[attr];
    const encoderAttr = encoderModule[attr];
    const attrId = decoder.GetAttributeId(mesh, decoderAttr);

    if (attrId < 0) {
      return;
    }

    console.log("Adding %s attribute", attr);

    const attribute = decoder.GetAttribute(mesh, attrId);
    const attributeData = new decoderModule.DracoFloat32Array();
    decoder.GetAttributeFloatForAllPoints(mesh, attribute, attributeData);

    assert(numValues === attributeData.size(), 'Wrong attribute size.');

    const attributeDataArray = new Float32Array(numValues);
    for (let i = 0; i < numValues; ++i) {
      attributeDataArray[i] = attributeData.GetValue(i);
    }

    decoderModule.destroy(attributeData);
    meshBuilder.AddFloatAttributeToMesh(newMesh, encoderAttr, numPoints,
        stride, attributeDataArray);
  });

  let encodedData = new encoderModule.DracoInt8Array();
  // Set encoding options.
  encoder.SetSpeedOptions(5, 5);
  encoder.SetAttributeQuantization(encoderModule.POSITION, 10);
  encoder.SetEncodingMethod(encoderModule.MESH_EDGEBREAKER_ENCODING);

  // Encoding.
  console.log("Encoding...");
  const encodedLen = encoder.EncodeMeshToDracoBuffer(newMesh,
      encodedData);
  encoderModule.destroy(newMesh);

  if (encodedLen > 0) {
    console.log("Encoded size is " + encodedLen);
  } else {
    console.log("Error: Encoding failed.");
  }
  // Copy encoded data to buffer.
  const outputBuffer = new ArrayBuffer(encodedLen);
  const outputData = new Int8Array(outputBuffer);
  for (let i = 0; i < encodedLen; ++i) {
    outputData[i] = encodedData.GetValue(i);
  }
  encoderModule.destroy(encodedData);
  encoderModule.destroy(encoder);
  encoderModule.destroy(meshBuilder);
  // Write to file. You can view the file using the webgl_loader_draco.html
  // example.
  fs.writeFile("bunny_10.drc", Buffer(outputBuffer), "binary", function(err) {
    if (err) {
      console.log(err);
    } else {
      console.log("The file was saved!");
    }
  });
}
javascript/npm/draco3dgltf/package.json (new file, 21 lines)

{
  "name": "draco3dgltf",
  "version": "1.0",
  "description": "This package contains a specific version of Draco 3D geometric compression library that is used for glTF Draco mesh compression extension.",
  "main": "draco3dgltf.js",
  "scripts": {
    "test": "nodejs draco_nodejs_example.js"
  },
  "keywords": [
    "geometry",
    "compression",
    "mesh",
    "point cloud"
  ],
  "author": "Google Draco Team",
  "license": "Apache-2.0",
  "repository": {
    "type": "git",
    "url": "git+https://github.com/google/draco.git"
  }
}

@@ -335,6 +335,9 @@ bool MeshEdgeBreakerDecoderImpl<TraversalDecoder>::DecodeConnectivity() {
     if (!DecodeVarint(&encoded_connectivity_size, decoder_->buffer()))
       return false;
   }
+  if (encoded_connectivity_size == 0 ||
+      encoded_connectivity_size > decoder_->buffer()->remaining_size())
+    return false;
   DecoderBuffer event_buffer;
   event_buffer.Init(
       decoder_->buffer()->data_head() + encoded_connectivity_size,
@@ -773,6 +776,8 @@ MeshEdgeBreakerDecoderImpl<TraversalDecoder>::DecodeHoleAndTopologySplitEvents(
       return -1;
   }
   if (num_topology_splits > 0) {
+    if (num_topology_splits > corner_table_->num_faces())
+      return -1;
 #ifdef DRACO_BACKWARDS_COMPATIBILITY_SUPPORTED
     if (decoder_->bitstream_version() < DRACO_BITSTREAM_VERSION(1, 2)) {
       for (uint32_t i = 0; i < num_topology_splits; ++i) {
@@ -105,8 +105,8 @@ enum EdgeFaceName : uint8_t { LEFT_FACE_EDGE = 0, RIGHT_FACE_EDGE = 1 };
 // exactly two occurrences of this event for every topological handle on the
 // traversed mesh and one occurrence for a hole.
 struct TopologySplitEventData {
-  int32_t split_symbol_id;
-  int32_t source_symbol_id;
+  uint32_t split_symbol_id;
+  uint32_t source_symbol_id;
   // We need to use uint32_t instead of EdgeFaceName because the most recent
   // version of gcc does not allow that when optimizations are turned on.
   uint32_t source_edge : 1;
@@ -71,6 +71,8 @@ bool FloatPointsTreeDecoder::DecodePointCloudKdTreeInternal(
     DecoderBuffer *buffer, std::vector<Point3ui> *qpoints) {
   if (!buffer->Decode(&qinfo_.quantization_bits))
     return false;
+  if (qinfo_.quantization_bits > 31)
+    return false;
   if (!buffer->Decode(&qinfo_.range))
     return false;
   if (!buffer->Decode(&num_points_))
@@ -18,7 +18,7 @@
 namespace draco {
 
 // Draco version is comprised of <major>.<minor>.<revision>.
-static const char kDracoVersion[] = "1.2.1";
+static const char kDracoVersion[] = "1.2.2";
 
 const char *Version() { return kDracoVersion; }
 
@@ -426,13 +426,14 @@ bool ObjDecoder::ParseMaterialLib(bool *error) {
   if (std::memcmp(&c[0], "mtllib", 6) != 0)
     return false;
   buffer()->Advance(6);
-  parser::SkipWhitespace(buffer());
+  DecoderBuffer line_buffer = parser::ParseLineIntoDecoderBuffer(buffer());
+  parser::SkipWhitespace(&line_buffer);
   material_file_name_.clear();
-  if (!parser::ParseString(buffer(), &material_file_name_)) {
+  if (!parser::ParseString(&line_buffer, &material_file_name_)) {
     *error = true;
     return true;
   }
-  parser::SkipLine(buffer());
+  parser::SkipLine(&line_buffer);
 
   if (material_file_name_.size() > 0) {
     if (!ParseMaterialFile(material_file_name_, error)) {
@@ -454,9 +455,10 @@ bool ObjDecoder::ParseMaterial(bool * /* error */) {
   if (std::memcmp(&c[0], "usemtl", 6) != 0)
     return false;
   buffer()->Advance(6);
-  parser::SkipWhitespace(buffer());
+  DecoderBuffer line_buffer = parser::ParseLineIntoDecoderBuffer(buffer());
+  parser::SkipWhitespace(&line_buffer);
   std::string mat_name;
-  parser::ParseLine(buffer(), &mat_name);
+  parser::ParseLine(&line_buffer, &mat_name);
   if (mat_name.length() == 0)
     return false;
   auto it = material_name_to_id_.find(mat_name);
@@ -479,10 +481,13 @@ bool ObjDecoder::ParseObject(bool *error) {
   if (std::memcmp(&c[0], "o ", 2) != 0)
     return false;
   buffer()->Advance(1);
-  parser::SkipWhitespace(buffer());
+  DecoderBuffer line_buffer = parser::ParseLineIntoDecoderBuffer(buffer());
+  parser::SkipWhitespace(&line_buffer);
   std::string obj_name;
-  if (!parser::ParseString(buffer(), &obj_name))
+  if (!parser::ParseString(&line_buffer, &obj_name))
     return false;
+  if (obj_name.length() == 0)
+    return true;  // Ignore empty name entries.
   auto it = obj_name_to_id_.find(obj_name);
   if (it == obj_name_to_id_.end()) {
     const int num_obj = obj_name_to_id_.size();
@@ -121,6 +121,17 @@ TEST_F(ObjDecoderTest, ComplexPolyOBJ) {
   ASSERT_EQ(mesh, nullptr);
 }
 
+TEST_F(ObjDecoderTest, EmptyNameOBJ) {
+  // Tests that we can load an obj file that has a sub-object defined with an
+  // empty name.
+  const std::string file_name = "empty_name.obj";
+  const std::unique_ptr<Mesh> mesh(DecodeObj<Mesh>(file_name));
+  ASSERT_NE(mesh, nullptr);
+  ASSERT_EQ(mesh->num_attributes(), 1);
+  // Three valid entries in the attribute are expected.
+  ASSERT_EQ(mesh->attribute(0)->size(), 3);
+}
+
 TEST_F(ObjDecoderTest, TestObjDecodingAll) {
   // test if we can read all obj that are currently in test folder.
   test_decoding("bunny_norm.obj");
@@ -206,6 +206,22 @@ void ParseLine(DecoderBuffer *buffer, std::string *out_string) {
   }
 }
 
+DecoderBuffer ParseLineIntoDecoderBuffer(DecoderBuffer *buffer) {
+  const char *const head = buffer->data_head();
+  char c;
+  while (buffer->Peek(&c)) {
+    // Skip the character.
+    buffer->Advance(1);
+    if (c == '\n')
+      break;  // End of the line reached.
+    if (c == '\r')
+      continue;  // Ignore extra line ending characters.
+  }
+  DecoderBuffer out_buffer;
+  out_buffer.Init(head, buffer->data_head() - head);
+  return out_buffer;
+}
+
 std::string ToLower(const std::string &str) {
   std::string out;
   std::transform(str.begin(), str.end(), std::back_inserter(out), tolower);
@@ -52,6 +52,9 @@ bool ParseString(DecoderBuffer *buffer, std::string *out_string);
 // Parses the entire line into the buffer (excluding the new line character).
 void ParseLine(DecoderBuffer *buffer, std::string *out_string);
 
+// Parses line and stores into a new decoder buffer.
+DecoderBuffer ParseLineIntoDecoderBuffer(DecoderBuffer *buffer);
+
 // Returns a string with all characters converted to lower case.
 std::string ToLower(const std::string &str);
 
testdata/empty_name.obj (new vendored file, 5 lines)

o
v -1.3264260292053223 -0.3829360008239746 -0.8749939799308777
v -1.3262399435043335 -0.3956040143966675 -0.8693050146102905
v -1.3313590288162231 -0.38378798961639404 -0.8686969876289368
f 1 2 3