Next big thing for virtual reality: lasers tracking your eyes
By Marco della Cava, USA TODAY 2:38 p.m. EDT May 2, 2016
SAN FRANCISCO – The next big leap for virtual and
augmented reality headsets is likely to be eye-tracking, where headset-mounted
laser beams aimed at eyeballs turn your peepers into a mouse.
A number of startups are working on this tech, aiming to
convince makers of VR gear such as the Oculus Rift and HTC Vive to
incorporate the feature in a next-generation device. They include SMI, Percept,
Eyematic, Fove and Eyefluence, which recently allowed USA TODAY to demo its
eye-tracking tech.
“Eye-tracking is almost guaranteed to be in
second-generation VR headsets,” says Will Mason, cofounder of virtual reality
media company UploadVR. “It’s an incredibly important piece of the VR puzzle.”
At present, making selections in VR or AR environments
typically involves moving the head so that your gaze lands on a clickable icon,
and then either pressing a button on a handheld remote or, in the case of Microsoft’s
HoloLens or Meta 2, reaching out with your hand to make a selection by
interacting with a hologram.
As shown in Eyefluence’s demonstration, all of that is
accomplished by simply casting your eyes on a given icon and then activating it
with another glance.
“The idea here is that anything you do with your finger
on a smartphone you can do with your eyes in VR or AR,” says Eyefluence CEO Jim
Marggraff, who cofounded the Milpitas, Calif.-based company in 2013 with another
entrepreneur, David Stiehr.
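Eyefluence hasn’t published the inner workings of its interaction model, but a generic gaze-selection loop gives a feel for what using your eyes as a mouse means in software. The Python sketch below is a hypothetical, dwell-based stand-in: the tile names, coordinates and timing threshold are invented for illustration, and Eyefluence’s own demo relies on a confirming glance rather than a fixed stare.

from dataclasses import dataclass

@dataclass
class Tile:
    name: str
    x: float      # tile center in normalized screen coordinates (0..1)
    y: float
    half: float   # half-width/height of the square hit area

def tile_under_gaze(tiles, gaze_x, gaze_y):
    """Return the tile the gaze point currently falls on, if any."""
    for tile in tiles:
        if abs(gaze_x - tile.x) <= tile.half and abs(gaze_y - tile.y) <= tile.half:
            return tile
    return None

def run_dwell_selection(gaze_samples, tiles, dwell_frames=30):
    """Activate a tile once the gaze has rested on it for dwell_frames
    consecutive samples (roughly half a second at 60 Hz)."""
    current, count = None, 0
    for gx, gy in gaze_samples:
        hit = tile_under_gaze(tiles, gx, gy)
        if hit is not None and hit is current:
            count += 1
            if count >= dwell_frames:
                return hit   # the "click," with no hands or head movement
        else:
            current, count = hit, (1 if hit else 0)
    return None

# Example: a gaze stream that drifts past one tile and settles on another.
tiles = [Tile("Photos", 0.25, 0.5, 0.1), Tile("Settings", 0.75, 0.5, 0.1)]
samples = [(0.3, 0.5)] * 10 + [(0.74, 0.51)] * 40
print(run_dwell_selection(samples, tiles))   # -> Tile(name='Settings', ...)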
“Computers made a big leap when they went from punchcards
to a keyboard, and then another from a keyboard to a mouse,” says Marggraff,
who invented the kid-focused LeapFrog LeapPad device. “We want to again change
the way we interface with data.”
EYE TECH NOT DUE FOR YEARS
As exciting as this may sound, the mainstreaming of
eye-tracking technology is still a ways off. Eyefluence execs say that although
they are in discussions with a variety of headset makers, their tech isn’t
likely to debut until 2017. Other companies remain largely in R&D mode, and
Fove has a waitlist for its Kickstarter-funded headset.
The challenges for eye-tracking are both technological
and financial. Creating hardware that consistently locks onto an infinite
variety of eyeballs presents one hurdle, while doing so with gear that is light
and consumes little power is another.
And while a number of companies in the space have managed
to land funding – Eyefluence has raised $21.6 million in two rounds led by
Intel Capital and Motorola Solutions – some tech-centric VCs are sitting on the
sidelines while they wait for the technology to mature and for headset makers
to make their moves.
“What eye-tracking will do will be powerful, but I’m not
sure how valuable it will be from an investment standpoint,” says Kobie Fuller
of Accel Partners. “Is there a multi-billion-dollar eye-tracking company out
there? I don’t know.”
Among the unknowns: whether the tech will be disseminated
through a licensing model or whether existing headset companies will develop it on
their own.
Still, once deployed, eye-tracking has the potential to
revolutionize the VR and AR experience, Fuller expects.
Specifically, eye-tracking will “greatly enhance
interpersonal connections” in VR, he says, by applying realistic eye movements
to avatars.
Facebook founder Mark Zuckerberg, who presciently bought
Oculus for $2 billion, is banking on VR taking social interactions to a new
level.
“The most exciting thing about eye-tracking is getting
rid of that 'uncanny valley' (where disbelief sets in) when it comes to
interacting through avatars,” says Fuller.
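One way to picture how tracked eye movements could carry over to an avatar is to convert the gaze direction the headset reports into rotation angles for the avatar’s eyes. The Python sketch below is a minimal illustration under assumed conventions; the coordinate frame and rotation limits are not taken from any particular engine or SDK.

import math

def gaze_to_eye_angles(gaze_dir):
    """Convert a tracked gaze direction (a vector in the wearer's head frame:
    +x right, +y up, +z forward) into yaw and pitch angles, in degrees, that
    could drive an avatar's eye joints."""
    x, y, z = gaze_dir
    yaw = math.degrees(math.atan2(x, z))                    # left/right
    pitch = math.degrees(math.atan2(y, math.hypot(x, z)))   # up/down
    clamp = lambda angle, limit: max(-limit, min(limit, angle))
    # Human eyes rotate only so far; clamp to a plausible range.
    return clamp(yaw, 35.0), clamp(pitch, 25.0)

print(gaze_to_eye_angles((0.2, 0.1, 0.97)))   # a slight glance right and up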
LESS COMPUTING POWER
There are a few other ways in which successful
eye-tracking tech could revolutionize AR and VR beyond just making such worlds
easy to navigate without joysticks, remotes or hand gestures.
First, by tracking the eyes, such tech can telegraph to
the VR device’s graphics processing unit, or GPU, that it needs to render at full
detail only the part of the scene where the eyes are looking at that moment.
That means less computing power would be needed.
Currently, a $700 Oculus headset requires a powerful computer to render its
images. Oculus’s developer kit with a suitable computer costs $2,000. “If you
can save on rendering power, that could significantly lower the barrier to
entry into this market for consumers,” says UploadVR’s Mason.
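The savings Mason describes come from what is generally called foveated rendering: shade the region around the gaze point at full detail and the periphery at reduced detail. The Python sketch below is a rough, assumed model of the idea; the radii, resolution tiers and cost estimate are illustrative numbers, not figures from any shipping headset.

import math

def shading_rate(px, py, gaze_x, gaze_y, fovea_radius=0.15, periphery_radius=0.45):
    """Return the fraction of full resolution at which to shade a screen
    location, based on its distance from the gaze point (all coordinates
    normalized to 0..1)."""
    d = math.hypot(px - gaze_x, py - gaze_y)
    if d <= fovea_radius:
        return 1.0    # full detail where the eye is actually looking
    if d <= periphery_radius:
        return 0.5    # half resolution in the near periphery
    return 0.25       # quarter resolution in the far periphery

# Rough estimate of the savings for a gaze fixed at screen center,
# treating shading cost as proportional to resolution squared.
full_cost, foveated_cost, steps = 0.0, 0.0, 64
for i in range(steps):
    for j in range(steps):
        x, y = (i + 0.5) / steps, (j + 0.5) / steps
        full_cost += 1.0
        foveated_cost += shading_rate(x, y, 0.5, 0.5) ** 2
print(f"foveated pass costs about {100 * foveated_cost / full_cost:.0f}% of a full render")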
And second, by not just tracking the eyeball but also
potentially analyzing a person’s mood and logging details about their gaze,
AR/VR headsets are in a position to deliver targeted content as well as give
third-party observers insights into the wearer’s state of mind and situational
awareness.
POLICE USE
The former use case would appeal to in-VR advertisers,
while the latter would come in handy for first responders.
“Police and paramedics are looking for an eyes-up,
hands-free paradigm, and eye-tracking can bring that,” says Paul Steinberg,
chief technology officer at Motorola Solutions, an investor in Eyefluence.
Steinberg sketches out a scene from what could be the
near future.
A police officer on patrol has suddenly unholstered his
gun. Via his augmented reality glasses with eye-tracking, colleagues at
headquarters are instantly fed information about his stress level, gleaned from
his pupil dilation.
They can then both advise the officer through a radio as
well as activate body cameras and other tech that he might have neglected to
turn on in his stressed state. What’s more, another officer on the scene can
instantly scan through a variety of command center video and data feeds through
an AR headset, flipping through the options by simply looking at each one.
“We would have to work with our (first responder)
customers to train them how to use this sort of tech of course, but the
potential is there,” says Steinberg. “But we’re not months away, we’re more
than that.”
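Steinberg’s scenario hinges on inferring stress from pupil dilation. The Python sketch below is a toy illustration only: it flags a reading that rises well above a rolling baseline, with a window size and threshold invented for the example, and a real system would have to control for lighting, motion and individual differences.

from collections import deque

class PupilStressMonitor:
    """Toy example: flag a possible stress response when pupil diameter
    climbs well above its recent rolling baseline."""
    def __init__(self, window=120, ratio_threshold=1.25):
        self.samples = deque(maxlen=window)   # recent diameters, in millimeters
        self.ratio_threshold = ratio_threshold

    def update(self, diameter_mm):
        baseline = sum(self.samples) / len(self.samples) if self.samples else None
        self.samples.append(diameter_mm)
        if baseline is None:
            return False
        return diameter_mm > baseline * self.ratio_threshold

monitor = PupilStressMonitor()
readings = [3.1] * 60 + [4.2] * 5   # a sudden dilation after a calm stretch
alerts = [monitor.update(d) for d in readings]
print(any(alerts))                  # -> True: the kind of flag headquarters might see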
DEMO SHOWS OFF EASE OF USE
An Eyefluence demo indicates that eye-tracking technology
isn't a half-baked dream.
Navigating between a dozen tiles inside a
first-generation Oculus headset proves as easy as shifting your gaze between
them. Making selections – the equivalent of clicking a mouse – is equally
intuitive. At no time does the head need to move, and hands remain at
your side.
After about 10 minutes in the demo, it feels antiquated
to pop on a VR headset and grab a remote to click through choices selected with
head movements.
Marggraff says Eyefluence's technical challenges included
making technology that could respond in both low and bright light, accounting for
different pupil sizes and keeping power consumption minimal.
But, he adds, his team remains convinced of the
inevitability of its product: “Just like when we started tapping and swiping on
our phones, we’re going to eventually need a better interface for AR and VR.”