Like many companies developing AR and VR tech, Snap wants to look into your eyes.
The company is seeking to patent tech that determines a user's "gaze direction" to generate content for a pair of AR glasses. Here's how it works: Snap's system first generates an "anchor point," or a point in the user's field of view where they're focusing. Once generated, the system identifies a surface within the user's field of view, and measures the distance between that surface and the anchor point.
The AR content is then generated based on that distance, and changes as the user moves closer to or farther from the surface. Taking this measurement into account allows for more accurate rendering.
For example, if you are wearing a pair of Snap's AR glasses, and you look at your kitchen table, the system would use the distance measurements and anchor point to accurately place an AR object.
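The kitchen-table scenario can be sketched in a few lines of code. This is purely illustrative: the function name, coordinate values, and the idea of scaling an object by its distance from the gaze anchor are our own assumptions, not details from Snap's patent filing.

```python
import math

def place_ar_object(anchor_point, surface_point, base_scale=1.0):
    """Place an AR object on a detected surface, scaled by its
    distance from the user's gaze anchor point (hypothetical
    sketch; all names and values are invented for illustration)."""
    # Distance between the gaze anchor and the detected surface.
    distance = math.dist(anchor_point, surface_point)
    # Nearer surfaces get larger renderings; farther ones shrink,
    # mimicking real-world perspective.
    scale = base_scale / max(distance, 0.1)
    return {"position": surface_point, "scale": scale, "distance": distance}

# A user at eye height gazing at a kitchen table roughly 2 meters away:
obj = place_ar_object(anchor_point=(0.0, 1.6, 0.0),
                      surface_point=(0.0, 0.75, 1.8))
print(round(obj["distance"], 2), round(obj["scale"], 2))  # → 1.99 0.5
```

In practice the distance would come from depth sensors or stereo cameras rather than hand-set coordinates, but the principle is the same: the farther the surface from the anchor point, the smaller and less detailed the rendered object.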
Snap's tech employs a concept called foveated rendering, or when rendering is done only at the point where a user is looking. By rendering content in this manner, Snap's tech aims to "reduce latency and increase efficiency in processing captured image data thereby also reducing power consumption in the capturing devices."
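Foveated rendering boils down to a simple rule: spend full rendering effort only where the eye is actually pointed. The toy function below captures that rule; the radius and quality values are invented for illustration and are not drawn from Snap's filing.

```python
def foveated_resolution(pixel, gaze, fovea_radius=50, periphery_scale=0.25):
    """Return a render-quality factor for a pixel: full quality inside
    the foveal radius around the gaze point, reduced quality outside.
    (Illustrative sketch only; radius and scale values are invented.)"""
    dx, dy = pixel[0] - gaze[0], pixel[1] - gaze[1]
    # Compare squared distances to avoid a square root per pixel.
    inside_fovea = dx * dx + dy * dy <= fovea_radius ** 2
    return 1.0 if inside_fovea else periphery_scale

gaze = (400, 300)
print(foveated_resolution((410, 305), gaze))   # near the gaze point → 1.0
print(foveated_resolution((100, 100), gaze))   # in the periphery → 0.25
```

Because the periphery is rendered at a fraction of full quality, the GPU processes far fewer pixels per frame, which is exactly the latency and power saving the patent describes.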
The company said that these renderings are based on tracking "head orientation and a relative position of a pupil or iris" with myriad sensors packed into the glasses to measure motion and eye movement. Snap said that its headwear may track users in a host of different ways, including facial tracking, hand tracking, biometric readings like heart rate or pupil dilation, and speech recognition for "particular hotwords."
Snap has been working on AR glasses for a hot minute. The company first debuted AR Spectacles in 2017, a launch that resulted in $40 million in losses from 300,000 unsold units, and has released several iterations since. The company also has sought plenty of headgear-related patents, including one for a prescription version of its AR glasses.
But Snap isn't the only one interested in tracking your eyeline. Meta has sought plenty of patents for gaze-based control of content, and offers eye-tracking within the Meta Quest Pro. Apple, meanwhile, has touted eye tracking and control as a big feature of its recently debuted Vision Pro headset.
Jake Maymar, VP of Innovation at The Glimpse Group, said that these tech firms see a lot of potential in users' eyes ... literally. He compared companies' newfound interest in gaze control to the shift from buttons to touch screens on cellphones: While the idea of a touchscreen once felt novel and outlandish to the average consumer, it's since become an embedded part of daily life. "It's just a new paradigm that I don't think we realize is going to be the paradigm of the way we interact with things," he said. "You don't realize how easy it is to use until you actually use it."
Another (potentially lucrative) reason that Snap and other companies are so interested in vision tracking: Hyper-targeted advertising, said Maymar. In an AR or VR experience, tracking where a person is looking can tell a company where it should place ads, how they're reacting to those ads and what content actually holds a person's attention.
"There's that saying that eyes are the windows to the soul," said Maymar. "By tracking those, you can actually gain a lot of information. If you're gaining that information, you'll be able to really create memorable experiences that have impact."
As a social media company, Snap makes the bulk of its revenue from digital advertising, a business model that's struggling amid a drop in demand for ads. Finding new ways to target consumers in future iterations of its headsets could provide an additional revenue stream.