Why JSON Rulez!

Wednesday, April 22, 2009 | 4:52 PM

One of the issues that came up as we developed O3D was what format to load for 3D content. There are a few standards out there, the most notable probably being Collada, and so originally we planned on loading Collada files directly in O3D. We quickly ran into limitations.

There's nothing wrong with Collada per se; it's just that it's designed to do one thing: exchange 3D data. If your application doesn't fit its model, you either have to work around it by squeezing your extra data into the nooks and crannies of the Collada spec, or you have to redesign your application to fit what Collada supports.

After a few months of working within the limitations of Collada, we realized that, just as other 3D APIs like OpenGL and DirectX do not define a file format, we shouldn't either. Instead, we should provide the pieces needed to load data into our API and leave file formats up to the individual application developer. This lets developers tailor the data they send over the net to match their application. A mapping application might only need to send line segments, street widths, and names. A 3D viewer might need triangles, materials, and a transform hierarchy. A game might need all that plus collision data, character data, A.I. data, etc.
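To make that tailoring concrete, here is a minimal sketch of what two application-specific JSON payloads might look like. Every field name here is hypothetical and illustrative only; none of this is part of any O3D or Collada specification.

```javascript
// Hypothetical, application-specific payloads. Each app defines
// exactly the structure it needs -- no fixed spec to squeeze into.

// A mapping application might send only line segments, widths, names:
const mapPayload = JSON.stringify({
  streets: [
    { name: "Main St", widthMeters: 12,
      segments: [[0, 0], [10, 0], [10, 25]] }
  ]
});

// A 3D viewer needs triangles, materials, and a transform hierarchy:
const viewerPayload = JSON.stringify({
  materials: [{ name: "brick", diffuse: [0.7, 0.3, 0.2, 1] }],
  transforms: [{ name: "root", children: ["wall"] }],
  shapes: [{ name: "wall", material: 0,
             triangles: [0, 1, 2, 0, 2, 3] }]
});

// Each application parses the structure it defined, nothing more:
const map = JSON.parse(mapPayload);
console.log(map.streets[0].name); // "Main St"
```

The point is simply that the wire format is whatever your application says it is, and JSON parses for free in the browser.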

Switching to JSON made all of that possible. Instead of a "black box" that takes a URL and magically loads the file with no control for the developer, we made the process completely transparent. Not only can you see exactly what is happening, but you can also change almost every part of it to suit your situation.

Some parts of the data seemed too large for JSON. Textures fit into this category, of course, but other elements such as vertex, animation, and skinning data can also get quite large. Those assets can still be 100% JSON if you like (depending on their size), but we also made it possible to load them from binary data and provided an API to read gzipped tar files, so you can package all of this into manageable, easy-to-download pieces.
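The split described above can be sketched as a small JSON manifest whose bulky buffers live as separate binary files packed into the same gzipped tar archive. The file names and fields below are hypothetical, not the actual O3D serialization format; they just illustrate the idea.

```javascript
// Hypothetical manifest: small structural data stays inline as JSON,
// while large buffers are referenced by URI and shipped as raw binary
// entries inside the same .tgz archive.
const sceneManifest = {
  textures: [
    { name: "bricks", uri: "textures/bricks.png" }  // binary, in the tar
  ],
  vertexBuffers: [
    { name: "wallVerts", uri: "buffers/wall.bin",   // raw floats, in the tar
      stride: 32, count: 4096 }
  ],
  // Small data can simply remain inline JSON:
  transforms: [{ name: "wall", translation: [0, 1.5, 0] }]
};

// A loader would fetch the archive once, then resolve each `uri`
// against the unpacked entries instead of issuing separate requests.
function binaryEntries(manifest) {
  return manifest.textures.concat(manifest.vertexBuffers)
                 .map(function (entry) { return entry.uri; });
}
```

One archive download then covers the manifest plus every binary asset it references, which is what makes the pieces "manageable" to serve.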

Having this process be open might sound like more work for each developer, but we provide a solution for this problem as well. We have a sample offline converter that takes a Collada file and converts it to a sample JSON format. We then have a sample library that reads that JSON format and recreates the Collada scene in O3D. With one command to convert your data and a couple of lines to load your scene, you'll find that it's very easy to use.

However, the offline converter and loading library are just that: samples. If you want different data, you are free to modify the converter or write your own. For example, if you want something different to happen at load time, the loader is written entirely in JavaScript (most of it is in serialization.js), which you can use as a starting point and modify to fit your needs. If we had chosen a fixed format, all of that flexibility would have been lost.

Given this flexibility, we're excited to see the myriad of applications that will come out of O3D. Make something cool and share it with us via the discussion groups and come meet us at Google I/O.

Posted by Gregg Tavares, software engineer, O3D Team.

4 comments:

themightyatom said...

I applaud your approach and the general decisions made so far. As a long-time web developer and 3D hobbyist, I think this is a good match with the real needs of developers. The technology (i.e. exposing hardware acceleration to JavaScript) should just be made to run in as many places as feasibly possible, and everything else should be definable and extendable by individual developers as well as communities. If not for the "beta" nature of the technology, I have several projects where it could be applied immediately. I hope adoption goes well and fast so we can use 3D when and where appropriate, without the barriers that exist today.

Will google be presenting at the Web3D symposium in Germany?

Metaverse One said...

I am more excited to see if Google will work with the existing 3D web open standard X3D. This is a great opportunity to work with those that have been developing 3D web standards for over a decade now and really do something that will evolve, not hype, the web.

Videometry.net said...

It will be interesting to see if the Web3D consortium embraces the O3D initiative and can even capitalise on it.

After attending a Web3D symposium, I've had my concerns that X3D has become entrenched in "backwards compatibility syndrome", evolving very little in the decade I've been tracking it (compared to, say, Flash), while computer graphics power and bandwidth have exploded exponentially. Because of O3D's open architecture, it could be made to show content developed in any format, as long as the community behind that format is motivated enough to write and maintain the appropriate parser library.
If that happens, I'm sure X3D/VRML will experience a comeback, as it clearly has its uses as a 3D geometry transfer format and can be generated by many existing 3D authoring tools.

I have myself fallen foul of one of the proprietary X3D plugin companies altering their API, which for me is a strong indicator that the implementation of X3D isn't actually standardized enough. In fact, it's impossible to build an application around one X3D plugin and have it work with another, unless you're merely displaying a very simple model. I don't want to bash the X3D community; their hearts are in the right place, it's just that the technology is arguably in the wrong place! I've argued that point many a time and gotten nowhere. Better luck, Google.

(btw, in the spirit of full disclosure, I used to be themightyatom, so I'm always likely to agree with him :)

Creativityforacause said...

Awesome, this is what I was waiting for... I will be writing an article about it on my website (http://animationnews.co.cc) and promoting it to as many 3D artists as possible.