by Ryan Karpf / 7 comments

Today, the Google I/O conference is happening here in San Francisco. The talk Voiding your Warranty: Hacking Glass included the above video, which features our cofounder Ryan showing off his hacker skills with the Glass. Here’s the story behind the video.

Using an avatar as a proxy for communication has many benefits. Your avatar can always look good, be well lit, and appear in an interesting location. However, even the most immersive virtual worlds fall flat when trying to deliver the emotional data carried by real-world facial expressions and body language.

From video game controllers to sleep trackers, there is a good deal of experimentation being done with wearable sensor hardware right now. In addition to soldering together our own creations, we have been checking out work done by others as fast as we can, all with the goal of enabling rich emotional avatar communication.

As you can imagine, when we received our beautiful new Google Glass as part of the Explorer Program, we were eager to see if we could access its sensors and drive our avatar’s head movement (caveat: Google Ventures is one of our investors).

Being the only white guy with a beard here at High Fidelity, working with Glass fell to me 😉 This was a great exercise because it gave us an opportunity to abstract the input layer for multiple device support (we also got Oculus working! Stay tuned for that blog).

We had previously created an Android app that grabbed all the phone’s sensor data and sent it over UDP to a configurable port. Imagine holding your phone and being able to twist and move your avatar’s hand. Kinda like turning any phone (with sensors) into a Wii controller. Lo and behold, when we plugged our Glass in and tried to run the Android app from our IDE, Glass showed up as a device and it “just worked”. We could not edit the fields in the GUI on Glass, but we could see from the log that it was transmitting the data.
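The core idea (sensor readings serialized into small UDP datagrams, fired at a configurable port) can be sketched in a few lines. This is an illustrative Python sketch, not the actual app: the real wire format isn’t documented in this post, so the three-float yaw/pitch/roll layout and the port number here are assumptions.

```python
import socket
import struct

# Hypothetical packet layout: three little-endian 32-bit floats
# (yaw, pitch, roll in degrees). The real app's format may differ.

def pack_orientation(yaw, pitch, roll):
    """Serialize a head orientation into a compact UDP payload (12 bytes)."""
    return struct.pack("<3f", yaw, pitch, roll)

def unpack_orientation(payload):
    """Inverse of pack_orientation: bytes back to (yaw, pitch, roll)."""
    return struct.unpack("<3f", payload)

def send_orientation(sock, addr, yaw, pitch, roll):
    """Fire one orientation sample at the listening client; UDP means
    a dropped packet just gets superseded by the next sample."""
    sock.sendto(pack_orientation(yaw, pitch, roll), addr)

if __name__ == "__main__":
    PORT = 9999  # configurable, as in the app described above
    receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    receiver.bind(("127.0.0.1", PORT))

    sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    send_orientation(sender, ("127.0.0.1", PORT), 45.0, -10.0, 2.5)

    data, _ = receiver.recvfrom(64)
    print(unpack_orientation(data))
```

UDP is a natural fit here: head-tracking samples arrive many times a second and are each superseded almost immediately, so retransmitting a lost packet (as TCP would) only adds latency.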

For obvious reasons, Glass has some pretty aggressive energy-saving behavior, which made it tricky to keep the transmission alive. We ended up moving the sensor data transmission into a background service. To stop transmission, we just turn Glass off.

You can see in the video that we have a very low latency connection between human and avatar head movement using Glass!

Cristian Vorstius Kruijff on May 27, 2013

Love it. Doesn’t matter if its crude, it’s great stuff you’re developing avatar’s. In the future anyone can present him or herselve in any shape, form or impersonation. Mickey mouse meets Einstein? Keep up the good stuff!


raynbow on May 27, 2013

High fidelity… yet another ‘female free’ tech boys club company in process… the only white guy with a beard ? Is there even one white female in the entire company in anything resembling a key role … ? Be honest if you dare… and if one truly does exist… try posting a pic.


    Jeska on May 29, 2013

    Hi! I’m here, beard free!


Harlow Heslop on June 6, 2013

As a long time, dedicated Second Life resident, I am tremendously excited to see the progress of High Fidelity! I’ve been enjoying following the blog, and can’t wait to see what’s to come! :) Great work so far everybody!!


Nonna Hedges on February 8, 2014

this is all very inspiring and i cannot wait till its release. i’m dying to know if we will be able to be creative, build products, make clothing to sell, make our avitars customized like we do in secondlife.
thanks, nonna


Terry Beaubois on February 18, 2014

In google glass since January 2014
In Second Life since 2005,
In Silicon Valley since 1976
Let me know if I can help in any way.


Sun on March 26, 2014

Following the progress of high fidelity ,the development of avatars is awesome , being able to use a phone and moving avi sounds fantastic , keep up the good work





