The pinnacle achievement in active eyewear was once simply making a pair of sunglasses that were light and wouldn’t shatter into your eyeballs. While that’s still important, technology has obviously progressed. So what is the future, exactly? Heads-up displays? Augmented reality? Cybernetic retinal implants?
At a recent event in Los Angeles, we caught up with Oakley’s Ryan Saylor, Senior Research and Development Engineer, and Ryan Calilung, Research and Design Engineer. Together they shed some light on where we might be headed.
(Ryan Saylor, left, and Ryan Calilung, right)
Gizmodo: At the event tonight there was a lot of talk of “innovation” and what you’re going to do next. You mentioned electronics and heads-up display-type stuff, but there weren’t any product announcements. So what’s on the horizon?
Ryan Calilung: If we’re talking about what’s on the horizon — like heads-up displays and all that kind of stuff — it’s been done. Sure, there will be a point when we can push that out to the masses, but I think what’s really interesting is the next step beyond that, moving past heads-up displays. I think there will be an integration of the electronics into the consumer. The problem with the current technology is that it’s super heavy-handed; it’s so obvious. Technology is dictating design and how customers live with it, and I want to see the next level, where all that stuff falls away and the experience is seamlessly integrated.
Ryan Saylor: Yeah, I think the customer lives with it right now. But what we’re looking at is: What’s the experience that the customer really wants? Maybe they don’t want to pull their cell phone out right now, but they don’t want to carry a dongle around 90% of the time, either. We think about the experience and how we can make the experience more seamless, less intrusive, and overall better.
Giz: Right, so… how are you going to do that?
RS: I don’t think we’ve answered that question! We did Thump 10 years ago and it might have been a little ahead of its time — it was at the same time Bluetooth was coming out — and I think where Thump was then is kind of the age we live in right now. If we had access to technologies that made our consumers’ lives better without burdening them with an interface, with bulky devices, and with things they didn’t really want to put on their face, we’d have a profitable market.
Our team is a software development team that is looking for those new technologies and figuring out how to package them into products that people will want to wear.
Giz: And that could be any type of form factor?
RS: Yeah, our current form factor is Airwave [ed. Airwave is the name of Oakley’s new snow goggles with an integrated heads-up display]. Airwave is a great learning experience for us. We’re currently learning everything our consumers like about it, and we’re learning some of the areas where we can make improvements, but I think it’s fair game. If you look at our products, we have eyewear, footwear, and everything in between. These are all areas where we can interact with our customers and make their lives better. Biometrics, information, data — it’s all in play.
RC: I think that we have an obvious advantage, especially as our generation gets older and more and more people begin wearing eyewear, prescription or not. It’s just there. It’s not like putting a watch on. I actually had to find my watch today because I don’t usually wear a watch anymore, and then you’ve got your phone in your pocket as well. I’m envisioning, in the not-too-far future, that you’re not necessarily carrying a phone, you’re not necessarily carrying a watch; it’s all just eyewear-mounted, because that’s really the thing that customers always need. The RX (the prescription), or sunglasses in the sun, or sports. There are a whole bunch of different applications for eyewear, and luckily for us that seems to be a natural place to mount all that stuff.
I think one thing that’s different about our approach versus Apple and other companies is that we’re not necessarily taking a technology and then packaging it. We’re asking, “What is the actual thing that we’re trying to solve?” It’s a really subtle differentiation in product development, in how you look at things. We’re doing a bunch of research right now to see how users interact, because I think that was the problem with Thump. It was so far ahead of its time. MP3 was the wild west; no one had figured it out. You had Napster, and Bluetooth was in its infancy. Honestly, it was a great product, but consumers deselected it. They decided that how they live with electronics isn’t how we thought they would. That was a mistake we learned from. So we’re doing a lot of setup on problem statements and mapping what the consumer really needs.
Giz: If we could talk about fantasy products for a minute, what needs to happen technologically for us to get to that next level? What are some of the barriers?
RC: I’ve been really interested in semi-solid liquid crystal that could be applied to lenses so that they could grow or change for different uses. Or imagine if your RX could transform itself. Everyone’s seen a liquid crystal watch or liquid crystal windows, so if that technology takes a couple of generational steps, the possibilities of what we could do are really endless. It would be legitimate science fiction eyewear.
Giz: How would that manifest itself exactly? Again, fantasy scenario: What form might that take?
RC: Imagine a prescription whose geometry is controlled by some magical, electronic box. You ordered a pair of our RX glasses, it came in a box, you put it on, and it was like a learning computer. You looked at several stock images and it fed that information back to the box and adjusted your prescription instantly. Now, in reality, you’re supposed to get a new prescription every year because our eyes aren’t stable. They get better or they get worse. But most people don’t go to the eye doctor every year, because they just don’t have the time. So now you would have these glasses that would automatically be able to readjust the prescription on the fly according to your changing eyes, so they would always be perfect. That’s just a small example.
RS: Yeah, I think the dynamic, reactive nature is something we’re going to capitalise on. I think the technology is going to drive us to a progressive lens. Even at this point, we’re able to track your vision and understand what you’re looking at and why. We’ll drive to the point where the eyewear adapts to what you’re doing. That’s probably fairly near-term, actually.
Giz: So if that’s the output, what’s the input? Are we talking cameras, voice-control type stuff?
RS: Certainly biometrics, definitely eye-tracking. Colour is a big one, since colour definitely affects your mood and your visual acuity. What you’re looking at, and what the light in your environment is like, all have a big impact on your vision and your behaviour.
RC: We’ve done a bunch of studies on pupil dilation in stressful situations. It’s a good indicator of what you’re doing, and especially for athletes it can be pretty amazing. Imagine predicting from someone’s eye whether they’re stressed, whether the light transmission coming in needs to be increased or reduced, or whether they need more or less contrast in their vision. All of these variables can be used to reduce the stress of that situation.
RS: Your pupil is basically an F-stop. If you can control your pupil dilation, you can control your visual acuity. So if you think about lenses in a dynamic world, and you’re always managing through a 3.2 millimetre pupil dilation, by adapting the lenses dynamically to your surroundings you can actually control acuity.
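To make the f-stop analogy concrete, here’s a rough back-of-the-envelope sketch (ours, not Oakley’s): treating the eye as a simple camera with a roughly 17 mm effective focal length, the pupil diameter sets an effective f-number, and a darker or lighter lens tint nudges the pupil wider or narrower, which in turn shifts depth of field and perceived sharpness. The numbers below are textbook approximations, not measurements.

```python
# Toy illustration of the "pupil as f-stop" analogy (not Oakley code).
# Assumes a simplified eye model with an effective focal length of ~17 mm.

EYE_FOCAL_LENGTH_MM = 17.0  # common approximation for the human eye

def effective_f_number(pupil_diameter_mm: float) -> float:
    """f-number = focal length / aperture (pupil) diameter."""
    return EYE_FOCAL_LENGTH_MM / pupil_diameter_mm

# The 3.2 mm pupil mentioned in the interview vs. a dilated 6 mm pupil:
for pupil_mm in (3.2, 6.0):
    print(f"pupil {pupil_mm} mm  ->  ~f/{effective_f_number(pupil_mm):.1f}")

# Approximate output:
#   pupil 3.2 mm  ->  ~f/5.3
#   pupil 6.0 mm  ->  ~f/2.8
# A smaller pupil (higher f-number) gives more depth of field and, up to a
# point, sharper perceived focus - which is the sense in which a lens that
# dynamically manages the light reaching the eye can indirectly manage acuity.
```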
Giz: Do you guys contemplate scenarios involving augmented reality-type stuff or Google Glass situations?
RC: I grew up in sport, and I was kind of the data-nerd. Where I come from, things like power, heart rate, and cadence — all of those metrics and data points are really important. That’s especially true when you look at sport performance at the highest levels, and you have Olympic athletes competing and winning by a tenth of a percentage point of performance. That sort of data management and that sort of feedback is really helpful for training. If you roll back down to the average weekend warrior, the gains that can be made at that level are huge with just a little data management.
RS: Yeah, that’s one component, and there’s the other side: if you think about people who go for a hike or a mountain bike ride to experience the environment or to get away from work, they have a different view of data and how intrusive the device can be. One of the ways we think about it is to let the consumer have the experience they want; let’s not necessarily drive the experience. With athletes we have pretty good footing on performance data and what they want, but that’s definitely not going to be for everyone, so we have to find a way to deliver the experience they’re looking for.
Giz: Anything else you want to add on this topic?
RS: We like to think of Airwave the same way Google approaches the first version of Glass. We’re not going to sell a million of them, but we get to interact with those consumers and we get to learn from them.
RC: It’s a learning tool. We knew it was going to be a loss leader, so we took a kind of “failed path” mentality: “Let’s make a mistake and learn from it, so we can make the next version even better.”
Giz: What kind of feedback have you been getting from consumers on Airwave?
RS: Some good personal-use patterns, some learnings on battery-life expectations. Nothing you haven’t heard before at this stage. But I think we’ve learned a lot about how people like to interact with the device, and the things they appreciate, like the Buddy Tracker feature. We’ve seen some very distinctive patterns in who’s buying the product and why we think they’re buying it. And, of course, that gives us access to those people, so we get to have a conversation with them.
RC: What’s been interesting to me is that these sports are especially social. We’re seeing that people are keeping track of their friends and sharing data and stats, and that’s really fascinating. When I was growing up it was like the Cobra Kai Dojo — you know, “Win win win!” It wasn’t so much about hanging out. The social aspect of sports and the social training is especially fascinating.
Giz: Do you think it will get to the point where it’s used in competitive team sports? Like, could a relay team of speed skaters set it up so they wouldn’t even have to look back for a baton exchange?
RS: That’s actually a scenario we’ve discussed. The same thing with mountain biking or road biking, knowing where your team is. Think about the Tour de France, especially with how strategic and tactical it is. We definitely hope that your team will show up on in-glass screens just like your friends do on the Buddy Tracker right now. You might even get a look at their hydration or heart rate, so you can get a picture of what your whole team is doing and even do some coaching in there.
RC: If I could dream for a second: I watched Oblivion recently. There’s a scene where he’s flying his ship, and he turns around and the display shows him what it looks like as if he were looking through his ship. You could apply that kind of stuff with augmented reality. For instance, riding in the rain: if you could cut all that visual interference out and just see what’s really there, I think that’s crazy cool. Or night skiing and things like that.
RS: People ask us a lot, “When am I going to get to see through stuff?” Through the rain, through the snow or clouds. And that’s a reality right now for the military, with thermal imaging and whatnot. But for us, since the semiconductor industry doubles in efficiency every two years, it will take a little while to miniaturise everything to the point where it looks good on your face. So a lot of what we’re talking about right now, we could do — but people aren’t going to wear it.
RC: And it’s going to be super heavy-handed. Thump was great for 2004, but we could make it smaller, leaner, and meaner right now. If you look at Google Glass right now you’re like, “Huh, nice heads-up display.” Especially if you get a white or orange one it’s pretty obnoxious, and sure early adopters like to show stuff like that off, but the average consumer is a little more stealth and covert than that.
Giz: Were you guys paying attention to what Flir was doing at CES, with their thermal imaging iPhone case?
RS: Yeah, absolutely. It’s coming! And that’s technology we already use when we’re testing our gear. Y’know, we’ll heat map your body and your face. So, absolutely, we went straight to the Flir booth.
RC: Yeah, we played with a bunch of military-spec stuff like night vision and tried to apply it to things like mountain biking or fishing at night. Let’s just say that the technology is not quite ready yet. The depth perception wasn’t there, and the field of view was poor. There’s this mountain bike park where we tested it, and as soon as I dropped in, within the first 50 feet I was like, “I’m going to die.” I had no depth perception at all. It was pretty comical. The technology isn’t there yet — but I think it will be, eventually.