
Meta Has a Sweet Vision for VR, But It’s a Long Way from Reality for the Rest of Us

The tech behind Starburst, Butterscotch, and Holocake sounds delightful but won’t be viable for years.

On Thursday last week, Mark Zuckerberg planted a flag in the VR sand. Speaking to reporters from a room decorated with decals of California, Montana, and Hawaii (all states where he owns homes), Zuckerberg led a show-and-tell of Meta’s VR research and prototyping. This, he said, is “what it’s going to take to build the next-generation displays for virtual and augmented reality.”

The names of the devices — Starburst, Butterscotch, Holocake — were as dreamy and delicious as their potential. The prototypes were built to address four challenges of the human visual system — focus, resolution, distortion, and high dynamic range — and achieve a level of visual realism that is indistinguishable from reality. Meta calls this the “visual Turing test.”

But like virtual reality itself, the devices aren’t yet as real as they appear. While the technology is promising, it won’t be viable for consumer use for years. And Meta provided no details on cost, timeline, or even the anticipated challenges of powering and cooling these devices.

That doesn’t seem to be the point. Several statements made during the presentation hinted at Meta’s desire to establish itself as being ahead of the competition. 

“I think we are the company that is the most serious and committed to basically looking at where VR and AR need to be 10 years from now, what are the problems that we need to solve, and just systematically working on every single one of them in order to make progress,” said Zuckerberg. And Reality Labs’ Director of Display Systems Research Douglas Lanman noted proudly, “Our team is certain that passing the visual Turing test is our destination and that nothing in physics appears to prevent us from getting there.”

So why share this technology with the press now, when it is years away from hitting the market? In response to that question, Meta Spokesperson Kate Mitchell told Mashable that the company wants policymakers, regulators, developers, and researchers to be aware of the developments coming down the pipeline. But the presentation was touted as a “media roundtable” and, to our knowledge, no policymakers, regulators, developers, or researchers were invited to it. Make of that what you will.

Now, on to the tech. Below are the takeaways from the 75-minute call, which included presentations about focus, resolution, distortion, and high dynamic range technology from Zuckerberg, Lanman, and Reality Labs’ Chief Scientist Michael Abrash, Research Program Manager Marina Zannoli, Research Scientist Manager Nathan Matsuda, and Research Scientist Andrew Maimone.

Solving for focus with Half Dome

Four iterations of the Half Dome prototype. Credit: Meta/Reality Labs

Reality Labs revisited 2017’s Half Dome varifocal technology to show how its research into focus has progressed. The tech works like autofocus on a camera: it adjusts focus based on the distance between your eyes and the object you’re looking at. Current VR systems have a fixed focus of about five feet in front of the wearer, which means objects closer than that remain blurry.
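
To make the autofocus analogy a bit more concrete, here is a minimal, purely illustrative Python sketch (not Meta’s implementation): it estimates how far away the wearer is looking from the angle between the two eyes’ lines of sight, then moves the focal plane to match, while a fixed-focus headset stays locked at roughly five feet. The interpupillary distance and five-foot figure below are ballpark assumptions.

```python
import math

# Illustrative toy model of fixed-focus vs. varifocal behavior (not Meta's code).
IPD_M = 0.063          # average interpupillary distance, in meters (assumption)
FIXED_FOCUS_M = 1.5    # roughly the five-foot fixed focus of today's headsets

def vergence_distance(vergence_deg: float) -> float:
    """Distance at which the two eyes' lines of sight converge."""
    half_angle = math.radians(vergence_deg) / 2
    return (IPD_M / 2) / math.tan(half_angle)

def varifocal_focus(vergence_deg: float) -> float:
    """Varifocal: the focal plane follows wherever the wearer is looking."""
    return vergence_distance(vergence_deg)

def fixed_focus(vergence_deg: float) -> float:
    """Fixed focus: the focal plane never moves, no matter where you look."""
    return FIXED_FOCUS_M

for angle in (1.0, 4.0, 12.0):   # small angle = far object, large angle = near
    d = vergence_distance(angle)
    print(f"looking ~{d:.2f} m away -> varifocal focuses at {varifocal_focus(angle):.2f} m, "
          f"fixed focus stays at {fixed_focus(angle):.2f} m")
```

In real hardware this depends on eye tracking and lenses or displays that can physically change focus, which is part of why it’s hard to get into a shipping product.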

Varifocal tech reduces fatigue and blurred vision, makes reading clearer, and, according to Meta’s user testing, is generally preferred over the fixed focus of current VR systems.

The tech sounds great, but it was pitched as “ready for primetime” in 2020 and has yet to be incorporated into a Meta headset. 

“It’s going to be awesome,” Zuckerberg said. “As hard as it is to build the first version of something, it can often be even harder to get it into a shipping product. But we’re optimistic that that will come soon.”

Solving for resolution with Butterscotch

The 20/20 clarity of Butterscotch compared to the Rift and Quest 2. Credit: Meta/Reality Labs

To explore how resolution affects the experience, Meta introduced a research prototype called Butterscotch. It delivers stunning clarity, with a resolution high enough that you can read the 20/20 line on an eye chart in VR.

The catch? The Reality Labs team says display panels that support that retinal resolution for the standard VR field of view don’t exist, so the team shrank the field of view to about half that of Quest 2.
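
As a rough, back-of-the-envelope illustration of that tradeoff (the panel and field-of-view numbers below are commonly cited ballpark figures, not Meta’s published specs): headset sharpness is usually counted in pixels per degree, and with a given panel, halving the field of view doubles the pixels per degree.

```python
# Back-of-the-envelope pixels-per-degree (PPD) math; figures are assumptions.
def pixels_per_degree(horizontal_pixels: int, horizontal_fov_deg: float) -> float:
    """Angular resolution: how many pixels cover each degree of the view."""
    return horizontal_pixels / horizontal_fov_deg

PANEL_WIDTH_PX = 1832     # Quest 2's per-eye horizontal resolution
FULL_FOV_DEG = 90.0       # rough Quest 2-style horizontal field of view (assumption)
RETINAL_PPD = 60.0        # common shorthand for "20/20" angular resolution

print(f"Full FOV:  {pixels_per_degree(PANEL_WIDTH_PX, FULL_FOV_DEG):.0f} PPD")
print(f"Half FOV:  {pixels_per_degree(PANEL_WIDTH_PX, FULL_FOV_DEG / 2):.0f} PPD")
print(f"Target:    {RETINAL_PPD:.0f} PPD for retinal resolution")
```

Concentrating the pixels into a narrower view is the lever the team pulled, since no panel yet packs enough pixels to hit retinal sharpness across the full field of view.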

Solving for high dynamic range (HDR) with Starburst

Zuckerberg holds the unwieldy Starburst prototype, which has two cooling fans on top. Credit: Meta/Reality Labs

To achieve significantly higher brightness levels, Meta created what they believe is the first HDR VR system. 

“Research has shown that the preferred number for peak brightness on a TV is 10,000 nits,” said Abrash, referring to the unit used to measure display brightness. Modern HDR TVs peak at several thousand nits. The maximum brightness of Quest 2, by comparison, is about 100 nits. Meta did not say how many nits Starburst can reach, but the prototype is so heavy that it requires two external handles to hold in place and is “wildly impractical,” Zuckerberg noted.
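
For a sense of scale, here is a tiny, illustrative comparison using the figures cited on the call; the HDR TV number is an assumed midpoint of “several thousand,” and Starburst’s actual peak brightness was not disclosed.

```python
# Rough brightness comparison in nits, based on the figures cited in the talk.
TARGET_NITS = 10_000           # Abrash's preferred peak brightness for a TV

displays = {
    "Quest 2": 100,            # about 100 nits
    "Modern HDR TV": 2_000,    # illustrative midpoint of "several thousand"
}

for name, nits in displays.items():
    print(f"{name}: {nits:,} nits -> {nits / TARGET_NITS:.0%} of the 10,000-nit target")
```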

Putting it all together with the Holocake 2 

A side profile of Holocake 2. It is a tethered device, though no wire is pictured. Credit: Meta/Reality Labs

All these developments come together in a fully functional headset prototype that Meta calls the Holocake 2. It’s sleek, closer to glasses than the hulking systems we currently use, and it can run any existing PC VR title or app.

The chunky curved lenses of the Quest 2 have been replaced by a holographic lens, which is significantly flatter but affects incoming light in the same way. The addition of polarized reflection technology, also known as “pancake optics,” reduces the distance needed between the display and the eye, which means the physical headset is slimmer. 

“To our knowledge, [Holocake 2] has the most compact optics of any Quest 2-class headset and [is] the first that has holographic optics,” Maimone said. 

Zuckerberg tries on the Holocake 2 prototype. Credit: Meta/Reality Labs

The light source needed to achieve that slim frame? Lasers (today’s VR headsets use only LEDs), but finding lasers with the needed “performance, size, and price” is a challenge, Abrash says.

“Honestly, as of today, the jury’s still out on a suitable laser source. But if that does prove tractable, there will be a clear path to sunglasses-like VR displays,” Abrash says.

Other challenges to getting Holocake into consumers’ hands? Meta says it’s tethered to a PC, so it’s not a standalone device like the Quest. And there was no mention of whether Holocake 2 could support varifocal or eye tracking.

The refractive lens and screen combo of Quest 2 (left), the pancake lens iteration (middle), and the holographic lens and laser combo of Holocake 2 (right). Credit: Meta/Reality Labs

The mirage of Mirror Lake

 

A mixed-reality concept called Mirror Lake, shown to press as a goggle-like rendering, combines the Holocake tech with multi-view eye-tracking, electronic varifocal modules, pass-through technology, and thin prescription attachments (to eliminate the need for contacts or eyeglasses). It also features a reverse pass-through display that “lets nearby people see a realistic digital representation of the headset wearer’s eyes and face.” 

It sounds incredible, but it doesn’t yet exist. “This Mirror Lake concept is promising,” said Maimone. “It will be a game-changer for the VR visual experience,” says Abrash. 

But we won’t see it for years.

So why did Meta organize this big presentation for the press? My bet is that they’re looking to get out ahead of a competitor like Apple, hoping to be able to point to these prototypes and say “we did it first” as others roll out similar technology.

By the way, we asked Meta Spokesperson Kate Mitchell what a Holocake would taste like. Her reply? “It tastes like the future.”

Source: Mashable

 
