Chances are you’re reading this on a screen shining light directly into your eyeballs. Whether LED, LCD, or OLED, these are all emissive displays, which work by pushing photons straight to your retina. This can be harsh, and it’s different from how we naturally see the world, where light reflects off objects and into our eyes.
According to a recent study by Nielsen, the average American spends 10+ hours a day in front of a screen. The short-term effects range from eye fatigue to sleep problems, and we’re only beginning to understand the long-term effects. As technology continues to work its way into every corner of our lives, we will need to find more natural ways to interface with it.
Voice is one area that I’m particularly excited about (see blog post for Future of Sound event). However, just as illustrations can supplement speech between two people to convey ideas, the same is true of graphics + voice compute. For example, if I ask Alexa about the weather, it may be easier to see a simple graphic showing the five-day forecast than to hear it aloud. This idea of the power of graphics + voice is why I was so excited when I met the team at Lightform.
The founders are a group of technical PhDs and creative wizards who have created a hardware/software solution for projection mapping. Having previously worked at companies like Microsoft Research, Disney, Adobe, and IDEO, they come from diverse and highly capable backgrounds. When paired with a projector, Lightform allows users to map out a space and overlay light to create amazing results.
I’ll be the first to say it: I’m a sucker for a good demo! It not only helps me understand the product, but demonstrates that it works and that the founders can execute. Go figure, a bunch of ex-Disney guys didn’t disappoint! The most memorable demo came when Brett Jones, CEO of Lightform, asked Alexa to call an Uber.
I do this all the time, and my usual MO post-request is to pull out my phone and see where the Uber is. In this case, Lightform produced an unobtrusive UI element on the wall showing the pertinent information for the Uber request, including ETA, map, and car details. Brett explained that Alexa communicated the request to Lightform, which was paired to a small laser projector across the room. The software then pulled out the relevant details and determined the best location, size, and color for a UI element to appear in the environment.
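To make the flow Brett described concrete, here is a minimal sketch of what the hand-off might look like: a voice assistant forwards the ride request, and the projection software extracts the key details and picks a spot in the mapped scene for the UI element. All of the names, the data shapes, and the "largest free region" placement heuristic below are my own assumptions for illustration, not Lightform's actual API.

```python
# Hypothetical sketch: place a ride-status card in a projection-mapped scene.
# Names and the placement heuristic are assumptions, not Lightform's API.
from dataclasses import dataclass


@dataclass
class RideCard:
    eta_minutes: int
    car: str
    x: float      # normalized center position in the mapped scene (0..1)
    y: float
    scale: float  # relative size of the projected card


def place_ride_card(request: dict, free_regions: list) -> RideCard:
    """Pick the largest unoccupied mapped region and center the card in it."""
    region = max(free_regions, key=lambda r: r["w"] * r["h"])
    return RideCard(
        eta_minutes=request["eta_minutes"],
        car=request["car"],
        x=region["x"] + region["w"] / 2,
        y=region["y"] + region["h"] / 2,
        scale=min(region["w"], region["h"]),
    )


# A request as the voice assistant might relay it, plus two free wall regions:
card = place_ride_card(
    {"eta_minutes": 4, "car": "Toyota Prius"},
    [{"x": 0.1, "y": 0.2, "w": 0.3, "h": 0.2},
     {"x": 0.6, "y": 0.5, "w": 0.2, "h": 0.1}],
)
```

In this toy version the card lands centered in the larger of the two free regions; a real system would also weigh surface color and viewing angle, which is presumably part of what makes the actual product hard.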
Video projectors continue to increase in quality while decreasing in size and price, driven largely by advances in mobile phone technology. Brett and I discussed what the home of the future could look like: very few screens, with UI elements appearing on demand via Lightform technology so you can easily interact with your home.
I find myself using my Alexa more and more every day. The notion that Alexa, or any voice compute device, could supplement its output with unobtrusive, dynamic graphics using Lightform is incredibly exciting to me. Sometimes when I meet a founder or demo a product, I get the feeling I’m looking at the future. This was certainly one of those times!
(Lightform funding announcement here)