Heads up: Our Ideas Factory has been refreshed, levelled up, and grown up into Alphero Intelligence. Some of our old posts are pretty cool tho'. Check this one out.

- Google Glass is back after eight years, and now it's all-in for enterprise use (rather than consumer).
- In these Covid times, Google Glass shows real potential for enabling new remote working and training models.
- We designed and built a prototype Google Glass app to showcase its potential.
A long long time ago, in 2012...
In 2012 the world was presented with a piece of technology that would change everything: Google Glass. The doodad targeted consumers and came with the promise of timely Facebook notifications, the latest breaking news, and joining Google Hangouts calls while skydiving. As it turned out, the world wasn’t quite ready to have mobile notifications blinking in the corner of its eye every other second. Google quietly took it off the market in 2015 to “reinvent” it, but not before we had an opportunity to build an innovation prototype for Westpac NZ in 2013.
The Google Glass reboot in 2020
And then the glasses came back with a new take. The hardware itself didn’t change much, but the target market did. Google shifted its focus to new use cases for industries and workers that need information, collaboration and training while keeping their hands free, and set about marketing the device to the Manufacturing, Logistics and Retail, and Healthcare sectors.
And then a pandemic happened
While those target sectors were already curious about the potential of Google Glass as a tool to improve efficiency and productivity, Covid-19 and the need for social distancing and new remote working (and training) models have accelerated real-world uptake.
We at the Ideas Factory got our hands on one just prior to lockdown, and after some initial “Oohs” and “Ahhs” we started to explore what we could do with the gadget.
Our 2020 prototype
The basics: what does a Google Glass do?
The only glass piece on the Google Glass is its small screen, which hangs just above your right eye. The screen works pretty much like any other screen and can show images, videos, lists and any text you want. The device also has a camera, a touchpad at temple level, and a microphone for voice commands.
The twist is that Google Glass comes with no apps out of the box. Zilch. You can put it on, fiddle with the clock and some settings, but if you want it to do anything you have to build it yourself. That’s all down to Google’s focus - they don’t want consumer apps, they want apps that bring value without distraction.
How did we produce our prototype?
We started some prototyping work just before going into lockdown, and given the impact of Covid, we focused on the “See what I see” feature.
“See what I see” is exactly what it sounds like. I wear the glasses and complete tasks as normal, hands free. You can remotely view what I am looking at, via another screen (laptop or tablet).
The glasses have a small speaker by the ear, so the remote viewer can give direction and advice, or just talk through what the wearer is doing. If any extra information would be useful, we can load it up on the Google Glass screen.
Considering user interaction patterns for Google Glass
The Google Glass presents some interesting new ways of thinking about screens and interaction, so it needed some storyboarding to make sure we were all on the same page. Once we had the idea for our prototype right, Fraser (an Alphero design extraordinaire) scamped out the experience and its different stages, showing the different actors and possibilities, for a simple video about remote coaching.
The prototype - remote coaching in how to make a cappuccino using our office coffee machine
The prototype we built allows multiple people to get into the call and follow the person wearing the Google Glass around. People on the call can also load images and videos to provide information to the person wearing Google Glass.
Given that we have a cafe-grade coffee machine in our office, we figured that remote coaching / training in how to make a great cappuccino was an easy-to-understand scenario that really tells the story of how Google Glass works.
Google Glass runs Android, so building apps just requires Kotlin - the same technology used to build an app for an Android phone, and a piece of cake for our developer Connor to get going with. Being the talented chap that he is, he also built an iPad app for the “coach” and set up the necessary server infrastructure to coordinate the communication. We took Fraser’s journey and produced ‘training material’ in the form of numbered images and animated GIFs that could be used as training prompts for our trainee coffee maker.
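To give a feel for what the Glass-side code involves, here’s a rough Kotlin sketch (not our actual prototype code) of an activity that grabs frames from the Glass camera using Android’s CameraX library. The class name and the sendFrameToCoach hook are placeholders for illustration - in a real build those frames would be encoded and streamed to the server that the coach’s iPad app connects to.

```kotlin
// Hypothetical sketch only - not the actual prototype code.
// Glass Enterprise Edition 2 runs Android, so standard libraries such as CameraX apply.
// SeeWhatISeeActivity and sendFrameToCoach() are made-up names for illustration.
import android.os.Bundle
import androidx.appcompat.app.AppCompatActivity
import androidx.camera.core.CameraSelector
import androidx.camera.core.ImageAnalysis
import androidx.camera.core.ImageProxy
import androidx.camera.lifecycle.ProcessCameraProvider
import androidx.core.content.ContextCompat

class SeeWhatISeeActivity : AppCompatActivity() {

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // Camera permission is assumed to have been granted already.
        val cameraProviderFuture = ProcessCameraProvider.getInstance(this)
        cameraProviderFuture.addListener({
            val cameraProvider = cameraProviderFuture.get()

            // Analyse frames from the headset camera - each frame is "what I see".
            val analysis = ImageAnalysis.Builder()
                .setBackpressureStrategy(ImageAnalysis.STRATEGY_KEEP_ONLY_LATEST)
                .build()
            analysis.setAnalyzer(ContextCompat.getMainExecutor(this)) { frame: ImageProxy ->
                sendFrameToCoach(frame) // placeholder: hand the frame to the streaming layer
                frame.close()           // release the buffer so the next frame can arrive
            }

            // The forward-facing Glass camera is addressed as the "back" camera here;
            // treat the selector as an assumption to verify on the device.
            cameraProvider.bindToLifecycle(
                this, CameraSelector.DEFAULT_BACK_CAMERA, analysis
            )
        }, ContextCompat.getMainExecutor(this))
    }

    private fun sendFrameToCoach(frame: ImageProxy) {
        // In a real build this would encode the frame and send it to the server
        // that the coach's iPad app connects to.
    }
}
```

The rest of the experience - audio, and pushing training images to the Glass screen - layers on top of a capture loop like this.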
In our demo video we have Emma and Tony showing the prototype in action. After some intense tongue-twisters and checking in on how they were feeling, they got into the “making some coffee” business.

We’re planning to expand on our prototype by adding features such as: allowing experts to draw on the live video to highlight areas of interest; enabling other participants to take screenshots and record videos for documentation purposes; and providing checklists and text-based information that the technician can interact with via voice command.
What are the best use cases for it?
Google is encouraging uses in areas that enhance people’s workflow by eliminating the need to interact with a touch device.
We see this being handy in areas such as maintenance work, where workers can go through steps and tick them off one by one, gather evidence, and request specialist assistance.
Remote training has major potential. Students could remotely join a tutor performing a task, ask them to repeat things or demonstrate from different angles, and follow along while asking questions and taking notes.
Healthcare can also benefit. With travel limited, district nurses or general practitioners in rural areas could wear a device and ask specialists to join the consultation, diagnosing a patient together. The same idea extends to other areas, such as council workers inspecting building work with a specialist engineer joining remotely.
FAQ
Do you need to sync to a phone or are they stand-alone?
The glasses are stand-alone and don’t need to be paired with any other mobile device.
Do you need internet access?
It depends on what the app does. If you’re storing long videos or lots of images and audio on the device, you might run into storage limits, but internet access is only required if the app needs to send data to, or collect it from, a server in real time.
What if you wear glasses?
We’re not gonna lie - it takes a bit of fiddling and it’s not the most comfortable, but you can mount them over your glasses. Google is aware of this and has a few alternative mounting options.
Do you need separate apps to do anything useful?
Yes. You will need to build your own apps. When you buy a Google Glass from a distributor it does nothing other than connect to wifi and let you update the clock. This is by design - Google wants enterprise customers developing custom applications for specific tasks or purposes.
Where can I buy them in NZ?
You can’t - distributors are only in the US at the moment, but they’re really prompt about shipping them over.
Can I buy one and use it to stream my whole life live on YouTube?
Yes, but you would need to build an app that does that and load it on the Glass. Plus, make sure you have wifi accessible the whole time.
How comfortable/intuitive is it?
It takes a few seconds to adjust to having a screen hanging just above your eye, but you get used to it quickly.
How do I interact with it?
You can use voice commands or the touchpad on the side of the temple. These interaction methods need to be considered when you design and build your app.
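Since those interaction methods shape the whole app, here’s a hedged Kotlin sketch of how touchpad input might be handled. As we understand it, Glass delivers touchpad input to your activity as generic motion events; Google’s own samples include a gesture detector tuned to the touchpad, but a standard Android GestureDetector is enough to show the idea. The step-navigation methods (confirmCurrentStep, nextStep, previousStep) are placeholders we’ve made up for this example.

```kotlin
// Hypothetical sketch only - not the actual prototype code.
// CoachingStepsActivity and the step methods are made-up names for illustration.
import android.os.Bundle
import android.view.GestureDetector
import android.view.MotionEvent
import androidx.appcompat.app.AppCompatActivity

class CoachingStepsActivity : AppCompatActivity() {

    private lateinit var gestureDetector: GestureDetector

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        gestureDetector = GestureDetector(this, object : GestureDetector.SimpleOnGestureListener() {
            // A tap on the touchpad: treat it as "confirm / tick off the current step".
            override fun onSingleTapUp(e: MotionEvent): Boolean {
                confirmCurrentStep()
                return true
            }

            // A horizontal swipe: move through the training steps.
            // Which direction means "next" is a design choice, not a Glass convention we're asserting.
            override fun onFling(
                e1: MotionEvent?, e2: MotionEvent,
                velocityX: Float, velocityY: Float
            ): Boolean {
                if (velocityX < 0) nextStep() else previousStep()
                return true
            }
        })
    }

    // Touchpad input arrives as generic motion events on Glass.
    override fun onGenericMotionEvent(event: MotionEvent): Boolean {
        return gestureDetector.onTouchEvent(event) || super.onGenericMotionEvent(event)
    }

    private fun confirmCurrentStep() { /* mark the step done and let the coach's app know */ }
    private fun nextStep() { /* show the next training image or GIF */ }
    private fun previousStep() { /* go back one step */ }
}
```

Tap-to-confirm and swipe-to-navigate map naturally onto hands-busy tasks like the coffee-making scenario above, and voice commands can be layered in alongside them.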