Apple Watch – Real-Time Visualization – Unity 5

Apple Watch – Real-Time Visualization – Unity 5 from Winston Patrick Brathwaite on Vimeo.

As my focus has shifted recently from more traditional forms of rendering (setting up a visualization and clicking “render” once everything is in place) to real-time raytracing (often GPU-accelerated, persistently refining “progressive” renderers such as Octane and iray), I have often wondered whether true real-time renderers – otherwise known as game engines – are suitable platforms for compelling product visualization. With the general industry focus shifting heavily toward VR content, making such visualizations both visually compelling and capable of substantive interaction may prove worth looking into.

For the artist/developer of the visualization, there are really only two realistic options as far as a package to develop within: Unity and Unreal. Rightly or wrongly, I would consider Autodesk Stingray and CryEngine to be fringe options at best.

A simple search online yields various examples of visually compelling architectural visualization projects created within Unreal Engine 4 (UE4). For my purposes – strictly product visualization – these offer some basis for comparison, but not the fine detail that I deem important. A lot can be hidden when a scene is rife with objects: a visualization can become visually cluttered, and the fine detail essential to a single-focus product visualization can be lost or even intentionally covered up.

As this is purely an evaluation, all techniques and applications available to me will be looked into. My criteria for success are as follows:

Reasonable to develop – it must not require skills beyond my own or those I can readily acquire.

Visual quality that is sufficiently compelling – a moving target.

Interactive – interaction is crucial and is what truly differentiates this evaluation from any other form of product visualization. If valuable interaction cannot be achieved, the medium is essentially useless for my purposes.

Deployability – the ability to develop for many of today’s media consumption platforms (mobile devices, desktop computers, VR devices).

I am exploring this as a medium for verticals beyond consumer electronics (which will be displayed in this first example), extending to the automotive and aeronautical industries as well. Particularly for automotive, some evaluation is being done in tandem with this, and relevant progress may be uploaded as well.

Ultimately it is a battle of data: poly-counts, texture memory, script-compatibility, shaders/materials, and stability of everything together.

As mentioned, this first example uses small consumer electronics for the simple reason that, as a dataset (poly count, materials, and textures), it should be much smaller than something like a fully equipped vehicle. Once the hurdle of visual fidelity without compromising interaction is cleared, I intend to move on to larger devices, more devices in-scene, and other verticals.

This example features an Apple Watch Sport (primarily modeled in MoI 3.0, with supplemental modeling and scene setup handled in Autodesk Maya 2016), developed for real-time visualization in Unity 5. To keep processing to a minimum, this iteration features no real lights: a combination of a cube-map texture and an ambient-light gradient provides all of the illumination within the scene. I am also endeavoring to limit texture use (which may become unsustainable as an initiative for some products – understandably, some products will need textures). Details such as the label on the reverse side of the Watch’s case are modeled in; they are not textures. The trade-off in this case, however, is texture memory vs. poly count (further evaluation will show whether this is a concern at this scale).

As true raytracing is not feasible in real-time engines such as these, phenomena important to realistic imagery, such as ambient occlusion, must be dealt with in other ways. Ambient-occlusion post-effects exist and can add a great deal of visual acceptance to a scene, but they may not go far enough unless combined with other methods. The deeper shadows within this visualization (the shadow of the pedestal where it meets the ground, and the shadow of the watch where it meets the pedestal) are baked into textures (baked using The Foundry MODO 701). If you have been keeping count, that makes a total of three textures in the current visualization: the two shadow maps and the cube-map environment. The trade-off of using baked-in shadows, of course, is that neither the pedestal nor the watch can really move from its position without making the shadows visually incongruous; they would no longer line up.
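For anyone curious what the no-real-lights setup looks like in practice, the following is a minimal Unity 5 sketch of driving a scene entirely with an ambient gradient plus a custom reflection cube-map. The class name, field names, and gradient colors are my own illustrative assumptions, not the actual project values.

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Hypothetical setup script: illuminate the scene with a tri-color
// ambient gradient plus a cube-map reflection source, with no
// real-time lights anywhere in the scene.
public class NoLightsEnvironment : MonoBehaviour
{
    public Cubemap environmentCubemap; // assign the baked cube-map in the Inspector

    void Start()
    {
        // Ambient gradient (sky / equator / ground) supplies the diffuse illumination.
        RenderSettings.ambientMode = AmbientMode.Trilight;
        RenderSettings.ambientSkyColor     = new Color(0.85f, 0.87f, 0.90f);
        RenderSettings.ambientEquatorColor = new Color(0.55f, 0.55f, 0.58f);
        RenderSettings.ambientGroundColor  = new Color(0.25f, 0.25f, 0.27f);

        // The cube-map doubles as the reflection source for glossy materials
        // (watch case, glass) in place of any real lights.
        RenderSettings.defaultReflectionMode = DefaultReflectionMode.Custom;
        RenderSettings.customReflection = environmentCubemap;
    }
}
```

Because everything above is render-settings state rather than per-light computation, the per-frame cost stays essentially flat regardless of how the model is framed.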
Next steps are to prototype a compelling interaction beyond the freedom to orbit and zoom the camera. The current idea is to take advantage of the various case/band options available to essentially replicate a full customization experience (a pseudo-Apple Store experience) that would be suitable as a mobile application or web experience. The idea is that it would give the potential customer all of the customization allowed via the current web-store/app-store experience, with the added ability to fully view the item and their customization from almost any angle imaginable, or otherwise allowed by the visualization.
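As a rough sketch of how that band/case customization might be prototyped in Unity, the script below swaps materials on the band’s renderer in place. The field names, and the idea of wiring each option to a UI button, are my assumptions for illustration rather than the project’s actual implementation.

```csharp
using UnityEngine;

// Hypothetical customizer: each entry in bandMaterials represents one
// band option (e.g. the different Sport band colors). A UI button
// calls SelectBand with its index to restyle the model in place.
public class WatchCustomizer : MonoBehaviour
{
    public Renderer bandRenderer;     // the band's MeshRenderer
    public Material[] bandMaterials;  // one material per band option

    public void SelectBand(int index)
    {
        if (index < 0 || index >= bandMaterials.Length)
            return; // ignore out-of-range selections

        bandRenderer.material = bandMaterials[index];
    }
}
```

Swapping a material this way leaves the geometry, the camera, and the baked shadow textures untouched, which is exactly what makes it cheap enough for a mobile or web deployment.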

More to come, hopefully.


