Zalando recently held its annual Hack Week, during which teams can brainstorm and work on innovative ideas that bring real business value. This year I went for something totally geeky compared to last year, when I worked on a refugee-related project. This year's project is called VRify.
VRify’s purpose is to create an immersive conference call experience that allows you to communicate with your remote colleagues more efficiently. It brings geographically dispersed team members together by utilizing cutting-edge VR and AR technology to create virtual meetings with realistic 3D avatars and real-time audio. For the scope of this article, we will only focus on the 3D avatars, which we will refer to from now on as holograms.
So, how can somebody get started with 3D avatars?
First things first, we need to digitize people, and this is possible using 3D scanning. The scans produced will be the starting point for our holograms. Using a special sensor called Structure mounted on an iPad, we went about scanning people and creating 3D models of them. Apart from the hardware, we also needed some software to make it all work. After running some experiments with Skanect, we decided to go for itSeez3D for its ease of use and quality.
The scanning process looked like this:
… and as you can see, the results were stunning:
Once we have our 3D models, it’s almost time to take them into Unity, a game development platform that allows you to create 2D and 3D visuals. There was one issue, however: the imported model had some artifacts. To fix this, I had to use Meshlab, an open source system for processing and editing 3D triangular meshes, with tools for editing, cleaning, healing, inspecting, rendering, texturing and converting.
In Meshlab, open the .obj file:
And then do: Filters → Normals, Curvature and Orientation → Compute Face Normals, finishing off by re-exporting the .obj file. At this point, we are ready to import into Unity and have a perfect looking model.
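To give an idea of what the Compute Face Normals step actually does under the hood, here is a minimal Python sketch: for each triangular face, the normal is the normalized cross product of two of its edge vectors. The tiny OBJ parser and the sample mesh are illustrative assumptions, not part of Meshlab.

```python
def parse_obj(text):
    """Parse vertices and triangular faces from a Wavefront .obj string."""
    vertices, faces = [], []
    for line in text.splitlines():
        parts = line.split()
        if not parts:
            continue
        if parts[0] == "v":
            vertices.append(tuple(float(x) for x in parts[1:4]))
        elif parts[0] == "f":
            # Face entries may look like "1", "1/2" or "1/2/3"; keep only
            # the vertex index (OBJ indices are 1-based).
            faces.append(tuple(int(p.split("/")[0]) - 1 for p in parts[1:4]))
    return vertices, faces

def face_normal(a, b, c):
    """Unit normal of triangle (a, b, c): cross product of its edge vectors."""
    u = [b[i] - a[i] for i in range(3)]
    v = [c[i] - a[i] for i in range(3)]
    n = [u[1] * v[2] - u[2] * v[1],
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0]]
    length = sum(x * x for x in n) ** 0.5
    return tuple(x / length for x in n)

# A single triangle lying in the XY plane; its normal points along +Z.
obj_text = "v 0 0 0\nv 1 0 0\nv 0 1 0\nf 1 2 3\n"
verts, faces = parse_obj(obj_text)
normals = [face_normal(*(verts[i] for i in face)) for face in faces]
print(normals[0])  # → (0.0, 0.0, 1.0)
```

Consistent face normals are exactly what the Unity importer needs; without them, faces can render dark or invisible, which is the kind of artifact we saw.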
Now that we’ve created our model in Unity, we can start building our virtual experience. To make things a bit more impressive however, we will be achieving this with Augmented Reality and placing our 3D models into the real, physical world as holograms using Microsoft HoloLens.
HoloLens is an AR, or Mixed Reality, headset that allows you to pin and view virtual holograms or 2D UWP apps in real space. Out of the many cool things it can do, such as viewing your favorite TV shows or movies on a virtual screen, or playing Super Mario on the wall using an Xbox One S controller, we will use its power for the sake of productivity.
In order to develop applications for HoloLens, you first need to install the required tools and configure Unity and Visual Studio. There is no separate SDK for HoloLens; holographic app development uses Visual Studio 2015 Update 3 with the Windows 10 SDK (version 1511 or later). We managed to build our own HoloLens app called ARify, and to pin and project our previously scanned 3D model into the real world as a hologram. The result? Almost scary…
It turns out that we managed to bring the 3D scanning quality into HoloLens quite nicely. The resolution was adequate and the overall feeling quite realistic. The next step was to rig and animate the model, which we also managed to do.
We were successful in getting real-time audio working between two users of the app. We had begun looking into animating lips and audio-lip syncing as well.
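The real-time audio between two users boils down to shipping small audio frames between peers with minimal latency, for which UDP is the usual choice. Below is a minimal Python sketch of that transport pattern (the actual app would do this in C#/Unity); the port, peer address, and frame size are placeholder assumptions, and the "audio" here is just a frame of silence.

```python
# Minimal sketch of real-time audio transport between two peers over UDP.
# PEER_ADDR, the port, and FRAME_BYTES are illustrative placeholders.
import socket

FRAME_BYTES = 320                  # e.g. 10 ms of 16-bit mono audio at 16 kHz
PEER_ADDR = ("127.0.0.1", 50007)   # placeholder peer address

def make_socket(bind_port):
    """Create a UDP socket bound to a local port for receiving frames."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("127.0.0.1", bind_port))
    return sock

def send_frame(sock, seq, pcm_bytes, peer=PEER_ADDR):
    """Prefix each frame with a 2-byte sequence number so the receiver
    can detect loss and reordering, then send it to the peer."""
    sock.sendto(seq.to_bytes(2, "big") + pcm_bytes, peer)

def recv_frame(sock):
    """Return (sequence_number, pcm_bytes) for the next datagram."""
    data, _ = sock.recvfrom(2 + FRAME_BYTES)
    return int.from_bytes(data[:2], "big"), data[2:]

# Loopback demo: one process plays both peers.
rx = make_socket(50007)
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
silence = bytes(FRAME_BYTES)       # dummy audio payload
send_frame(tx, 1, silence)
seq, frame = recv_frame(rx)
print(seq, len(frame))
```

UDP trades reliability for latency, which is the right trade-off for voice: a lost 10 ms frame is better skipped than retransmitted late, and the sequence number lets the receiver conceal the gap.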
During Hack Week, we were able to take the first steps towards building a real-time holographic conference experience. The results surpassed our expectations, ignited our excitement, and definitely demonstrated how nicely 3D scanning and AR technologies play together. You can see more photos of the Hack Week event, including some more shots of our team’s efforts, over on the Zalando Technology Flickr page.