Meta Butterscotch, Starburst, Holocake & Half Dome Prototypes Showcase
Author: Alex Dzyuba, Lucid Reality Labs Founder & CEO
Table of Contents
- Meta Half Dome Prototype
- Meta Butterscotch Prototype
- Meta Starburst Prototype
- Meta Holocake Prototype
- In Conclusion
Meta has yet again caused a stir among the tech community. This time, it was with the unveiling of a whole set of VR headset prototypes and a background sneak peek at some devices that looked like they could be AR glasses. Counting 24 devices (see image below), Meta representatives, including Mark Zuckerberg and Chief Scientist Michael Abrash, spoke about four leading names – Meta Half Dome, Meta Butterscotch, Meta Starburst & Meta Holocake – as well as a couple of previous iterations of these prototypes. While one could suspect this of being part of a grand promotional campaign for the upcoming launch of Project Cambria, Meta's newest and most potent headset, we decided to look more closely at the key announcements. A while back, we looked at Project Cambria in one of our previous articles, “XR Headsets to Look Forward to in 2022,” back when the headset itself and its capabilities were still a matter of speculation and rumor. Today, we still don't have a complete picture of Project Cambria's specs, aside from an estimated 2160×2160 per-eye resolution, a 90Hz refresh rate, and a couple more predictions, even though it seems rather apparent that Meta is planning to launch the VR headset at the end of 2022.
Nevertheless, if we look beyond Project Cambria and the potential marketing move, Meta's media roundtable behind-the-scenes series “Inside the Lab,” this time hosted by both Mark Zuckerberg and Michael Abrash from Meta Reality Labs, revealed quite a few exciting technologies – ones that, if developed into final products, could become the next significant VR headset breakthroughs of the decade. Combined with Adam Savage's extensive interview and his on-site, hands-on experience at Meta Reality Labs, we got a grand sneak peek of what is “cooking” inside Meta's technology research centers. It also sets our expectations for more technological revelations to take the stage in the upcoming years.
All in all, as some say, in their attempt to flex, Meta still uncovered quite a bit of their Virtual Reality (VR) technology, spoke a few words about Augmented Reality (AR), set the stage for some potential new features and capabilities, and dropped a couple of hints about their ongoing developments.
Meta Reality Labs claims to be working on advancing every aspect of hardware technology, from lenses and sensors to silicon and software. Even though some of the items they showed are yet to take final shape, there is a big chance they could move from concept to prototype and further into products in the foreseeable future.
To begin with, both the media roundtable and the interview revolved around the VR headset prototypes, mentioning some of the essential features an AR headset should also include. Most importantly, Meta highlighted a number of their fundamental research directions, which include display resolution, optical focus shifting – the varifocal technology – distortion, brightness, high dynamic range, and headset form factor. They also talked quite a bit about the critical milestones that need to be reached in technological development to pass the Visual Turing Test, which, according to Meta's Chief Scientist Michael Abrash, no one has passed yet.
MICHAEL ABRASH META CHIEF SCIENTIST
“The Visual Turing Test, which is the phrase we adopted along with other academic researchers, is a way to evaluate whether what’s displayed in VR can be distinguished from the real world. It’s a completely subjective test because what is important here is the human perception of what they see, the human experience, rather than technical measurements. And it’s a test that no VR technology can pass today. While VR already does create a strong sense of the presence of being in virtual places in a genuinely convincing way, it’s not yet at the level where anyone would wonder whether what they are looking at is real or virtual.”
While Meta concentrated their prototype demonstration efforts on the VR headsets and on solving fundamental technical challenges in their development, they did mention AR headsets within the Visual Turing Test subject. However, the potential timeframe Mark gave for AR hardware shipment was estimated only for the second half of this decade. Moreover, yet another breakthrough in display technology will be required to achieve the realism and visual fidelity needed to pass the Visual Turing Test for any of Meta's future AR and VR headsets.
As already mentioned, today there are still quite a few challenges and limitations in VR headset technology that need to be overcome – resolution, distortion, eye focus, and fatigue – before we even start approaching the Visual Turing Test. An ideal VR headset of the future would be a stand-alone device: compact, lightweight, capable of running for prolonged periods on a single battery charge, and with enough processing power to deliver photorealism and next-level immersion. The displays of these next-generation headsets would be sharp and stereoscopic (3D), with a wide field of view, more pixels, and focus capabilities. Mark talks about these displays and the vision for their potential use.
MARK ZUCKERBERG META FOUNDER & CEO
“Displays that match the full capacity of human vision will unlock some essential things. The first is a realistic sense of presence, the feeling of being with someone or in some place as if you are physically there. And given our focus on helping people connect, you can see why this is such a big deal.”
While the opportunities these new displays could deliver are nearly unlimited, Meta concentrates more on connecting users, giving birth to new forms of art and self-expression, the next step in creativity, richness of experience, and the feeling of being physically present in interactions, presumably in the Metaverse. But looking further than simply connecting users in photorealistic environments, such displays could hold grand potential for many businesses and institutions integrating the technology for education, upskilling, reskilling, and training. This is particularly essential for fields where photorealism will become the next standard benchmark, like MedTech and especially Healthcare.
But for now, going back, let’s look at the introduced VR headset prototypes, concepts, and experiments that could be the key milestones in coming closer to passing the Visual Turing Test. Before we get anywhere close to photorealism, quite a few fundamental challenges will need to be solved, considering how the human eye perceives visual information and how our brain processes and reconstructs it. Mark talks more about the general challenges of currently existing VR headsets, some of which have already been solved in the shown prototypes and some that still require work.
MARK ZUCKERBERG META FOUNDER & CEO
“You need stereoscopic displays to create 3D images; build a render of objects and focus your eyes at different distances, unlike a traditional screen or a display where you only need to focus at one distance. It would be best to have a display covering a much wider field of view than traditional displays. A retina-level resolution across that whole field of view requires way more pixels than on any traditional display. It would be best to have screens that can estimate the brightness and the dynamic range of the physical world, which requires ten times more brightness than we get on HD TVs today. It would be best to have realistic motion tracking that is low latency. You need to build a new graphics pipeline to power this type of display that can get the best performance out of CPUs and GPUs. Need enough power but must not drain battery or overheat or be combined in a device that can fit on one’s face.”
The Chief Scientist of Meta also talks quite a bit about one of the more complex issues, the need to reduce lens distortion, which will have to be done in software, at the pace of eye movement, and in real time. Michael Abrash also outlines the next-level challenges that will be set for later stages of VR hardware development. These include the vergence-accommodation conflict (VAC) – the issue where the brain receives conflicting information regarding the distance of objects in the virtual environment; chromatic aberration (CA), or so-called “fringes” of color that can occur on the boundaries between the light and dark borders of the image; ocular parallax, or minor depth-dependent image shifts that impact depth perception and realism; and pupil swim, which refers to the distortion associated with the pupil moving around the lens.
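To make the vergence-accommodation conflict a bit more concrete, here is a minimal back-of-the-envelope sketch (not Meta's code; the interpupillary distance and the fixed focal distance are assumptions chosen for illustration). It compares the distance the eyes converge on with the single focal plane a conventional headset display is set to, which is why nearby virtual objects are uncomfortable to focus on:

```python
import math

def vergence_angle_deg(object_distance_m: float, ipd_m: float = 0.063) -> float:
    """Angle between the two eyes' lines of sight when fixating an object.

    ipd_m: interpupillary distance; ~63 mm is an assumed average.
    """
    return math.degrees(2 * math.atan((ipd_m / 2) / object_distance_m))

# Many current headsets keep the optics focused at one fixed distance;
# 1.5 m is an illustrative assumption, not a measured value.
FIXED_FOCAL_DISTANCE_M = 1.5

for d in (0.25, 0.5, 1.5, 10.0):
    angle = vergence_angle_deg(d)
    mismatch_diopters = abs(1 / d - 1 / FIXED_FOCAL_DISTANCE_M)
    print(f"object at {d:>5.2f} m: vergence {angle:5.2f} deg, "
          f"focus mismatch {mismatch_diopters:.2f} D")
```

The larger the mismatch between where the eyes converge and where the optics force them to focus, the stronger the conflict the brain has to resolve.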
Finally, let’s talk about the VR headset prototypes that were introduced – the Half Dome for varifocal technology, the Starburst for high dynamic range, the Butterscotch for high resolution, and, of course, the Holocake that demonstrated the lightweight form factor.
Meta Half Dome Prototype
The Half Dome prototypes are a series of at least three devices demonstrating varifocal technology, introduced to show how autofocus could work in a VR headset. Half Dome 1 was built back in 2018 and is the fourth varifocal headset among Meta's prototypes, built largely on the basis of the Oculus Rift model. The headsets attempted to incorporate reliable eye-tracking, essential for any varifocal technology, with each generation experimenting with a different set of lenses and mechanical engineering to create a comfortable, low-vibration, low-noise dynamic focusing mechanism.
For these VR headset iterations, the concept was to change how the lenses are built into the headset. Instead of sitting in a fixed position, the optics move back and forth to change focus, replicating how a user's eyes naturally refocus when they look at objects close up and far away. Today's VR devices can't adjust and switch focus, which puts a significant strain on the users' eyes and causes discomfort and eye fatigue.
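As a rough sketch of how such an autofocus loop could work in principle (a hypothetical illustration, not Half Dome's actual pipeline – the gaze inputs, the smoothing factor, and the function names are all assumptions), the headset would estimate the depth the user's eyes converge on from eye tracking and then drive the focusing mechanism toward that distance:

```python
def estimate_vergence_depth(left_dir, right_dir, ipd_m=0.063):
    """Estimate the depth (m) at which the two gaze rays cross in the x-z plane.

    left_dir / right_dir: (x, z) gaze direction components, +z pointing forward.
    The conventions and the 63 mm interpupillary distance are assumptions.
    """
    slope_l = left_dir[0] / left_dir[1]    # horizontal drift per meter of depth
    slope_r = right_dir[0] / right_dir[1]
    convergence = slope_l - slope_r
    if convergence <= 1e-6:                # eyes parallel or diverging -> "far"
        return float("inf")
    return ipd_m / convergence

def update_focus(current_focus_m, target_focus_m, smoothing=0.2):
    """Nudge the (hypothetical) focal distance toward the vergence depth.

    The smoothing factor stands in for whatever low-vibration actuation a real
    varifocal mechanism would use; 0.2 is an arbitrary example value.
    """
    if target_focus_m == float("inf"):
        target_focus_m = 10.0              # treat "infinity" as a far plane
    return current_focus_m + smoothing * (target_focus_m - current_focus_m)

# Example: both eyes converge on a point roughly 0.5 m in front of the user.
depth = estimate_vergence_depth(left_dir=(0.063, 1.0), right_dir=(-0.063, 1.0))
new_focus = update_focus(current_focus_m=1.5, target_focus_m=depth)
print(round(depth, 3), round(new_focus, 3))   # ~0.5 m, and a focus moving toward it
```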
While working with these prototypes, distortion correction posed quite a challenge, as the distortion varies with eye movement and has to be corrected in software. Ultimately, Meta reached a point where they built Half Dome 3 using pancake lenses, balancing weight, comfort, field of view (FoV), and enough power to run an electronic focusing mechanism.
Having an autofocusing VR headset could be a significant game-changer for the content, usability, and capabilities of VR headsets. Technology that instantly adjusts to and replicates the mechanics of how human eyes work is something we would quickly grow accustomed to and come to expect in our AR and VR devices.
Meta Butterscotch Prototype
This prototype is aimed at display resolution and at pushing past the current ceiling of VR headset resolution. By matching the retina-level resolution of 60 pixels per degree (PPD), the display could remain comfortable over prolonged hours of use. At the same time, it has a significant limitation regarding rendering 3D objects and environments in real time. Today, the chips that power VR headsets cannot deliver enough processing power while staying compact and running off existing batteries without overheating or making the device form factor too bulky and uncomfortable to be head-worn.
At the same time, the software stack has yet to reach such a level of rendering and will require development in parallel with the hardware to take full advantage of retina-level resolution. While foveated rendering, which utilizes eye-tracking to reduce resolution outside of the eye's focus area, is considered a solution, it will require much further development, to the point where eye-tracking can predict where the user will be looking milliseconds before the eye moves.
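To give a sense of why foveated rendering is so attractive for a 60 PPD display, here is a back-of-the-envelope estimate of the pixel budget (all of the numbers – field of view, foveal region size, and peripheral resolution – are illustrative assumptions, not Meta's figures):

```python
# Back-of-the-envelope estimate of how much rendering work eye-tracked
# foveated rendering could save. All numbers are illustrative assumptions.
FOV_DEG = 110            # assumed field of view per eye, per axis
FULL_PPD = 60            # retina-level target discussed for Butterscotch
FOVEA_RADIUS_DEG = 15    # region rendered at full resolution (assumption)
PERIPHERY_PPD = 20       # reduced resolution outside the fovea (assumption)

full_pixels = (FOV_DEG * FULL_PPD) ** 2

fovea_pixels = (2 * FOVEA_RADIUS_DEG * FULL_PPD) ** 2
periphery_area_deg2 = FOV_DEG ** 2 - (2 * FOVEA_RADIUS_DEG) ** 2
periphery_pixels = periphery_area_deg2 * PERIPHERY_PPD ** 2

foveated_pixels = fovea_pixels + periphery_pixels
print(f"uniform 60 PPD: {full_pixels / 1e6:.1f} MPix per eye")
print(f"foveated:       {foveated_pixels / 1e6:.1f} MPix per eye")
print(f"savings:        {1 - foveated_pixels / full_pixels:.0%}")
```

Even with these rough assumptions, rendering the periphery at lower resolution cuts the per-eye pixel count by a large majority, which is exactly why accurate, predictive eye-tracking matters so much.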
Nevertheless, the Butterscotch prototype aims to recreate something close to 20/20 human eye resolution, which, according to Mark, works out to around 54.5 PPD and a total display resolution approaching 8K. Today, that can only be achieved with the prototype wired to a PC; the future would have to deliver the next generation of lightweight, powerful chips for it to operate as a stand-alone device.
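As a rough sanity check on those figures (using an assumed field of view, since the prototype's exact FoV was not stated in the roundtable), pixels per degree multiplied by field of view gives the pixel count a display needs along each axis:

```python
# pixels_per_axis = PPD * FoV (degrees). The 90-degree FoV below is an
# assumption used only to show the scale of the numbers; the prototype's
# actual field of view was not disclosed.
def pixels_needed(ppd: float, fov_deg: float) -> int:
    return round(ppd * fov_deg)

# Roughly today's headsets, Butterscotch's target, and the retina-level goal.
for ppd in (20, 54.5, 60):
    px = pixels_needed(ppd, fov_deg=90)
    print(f"{ppd:>5} PPD over 90 deg -> ~{px} px per axis, per eye")
```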
The other major challenge that would have to be solved for this prototype to start shaping into an actual product is dynamic distortion, which is so far difficult to overcome at such a high level of resolution. At this point, Meta places its bets on the advancement of the software rather than the mechanics inside the VR headset. What is interesting, though, is that while testing different lens options to achieve the desired resolution, Meta has written a light field portals package that allows simulating a variety of lenses without any restrictions or having to build an array of prototypes.
But to sum up, a retina level of resolution could, without a doubt, be a significant game changer for any experience, far beyond social interaction and communication.
Meta Starburst Prototype
This particular prototype is most definitely a curious find, demonstrating a significantly high dynamic range. Inside the prototype, Meta uses a backlight roughly 200 times brighter than in any currently existing VR headset, an equivalent of 11K nits.
This light level can unlock new possibilities for content, as it brings the next level of contrast between the light and dark ends of the spectrum, making it possible to build VR objects and environments with entirely different levels of depth from what we experience in VR today, and thus making the next level of realism and immersion possible.
What's interesting in this particular case is that the Starburst research team was not limited in any way by the form factor, hence the bulky exterior. This allowed them to demonstrate the possibilities of adding high visual contrast and the difference it could make in the visual perception of the experience.
Further research, especially in combination with retina-level resolution and accurate eye-tracking, could impact how the VR industry and virtual existence are perceived, making Meta's vision of a massively engaging Metaverse much more attainable.
Meta Holocake Prototype
Last but not least, the prototype demonstrated was the Holocake, where the form factor played the essential role, covering both visual comfort and effective weight. Today, this VR prototype works tethered to a PC but could become a stand-alone device of the next generation.
Communicated to have holographic optics and polarized reflection, this much thinner and lighter prototype could be the benchmark for many future VR and even AR devices. Generally speaking, immersive hardware would have to make a significant leap into the future to squeeze VR into something like a sunglasses frame. There will need to be a tremendous leap in displays, silicon, software, and many other technologies to make it happen. At this point, Meta is still looking for practical solutions to power the headset; however, looking at how the developments are going, we can expect it to become possible in the second half of the decade.
Still, the second generation of the prototype, Holocake 2, is planned to include varifocal technology and eye-tracking. Even though today it is still a concept, once the architecture is proven – if it can be – it would without doubt be a significant breakthrough in both AR and VR technology.
The introduced prototypes and the technology developed sound promising, like a potentially significant step forward for the next generation of immersive devices. While Project Cambria promises to feature pancake lenses delivering advanced levels of resolution, it seems that Meta is set on integrating them into further generations of both AR and VR devices.
While working on the optical stack, Meta has tried incorporating a spectrum of different lenses that could be activated in various combinations, as well as lenses that could be electronically actuated to move back and forth depending on the focus range. The vision is to have a lens that could be rapidly activated, eliminating the need to physically move the lens inside the headset, which is the main idea behind varifocal lens technology. Realistically, Meta expects it to take another five to six years, considering the milestones already achieved today.
Let's not forget that Meta also mentioned AR headsets; even though no actual prototypes were revealed, we could count on another big prototype revelation of augmented technology, displays, and hardware – what could be considered AR glasses or headsets – in the near future.
With all being said, Meta still plans to make their hardware affordable for most consumers, to have the technology as widespread as mobile phones are today. This would come hand in hand with their vision to unravel a new field of social interactions, collaboration, work, and social gaming. At the same time, they do not exclude another significant segment aimed at professional and enterprise use.
Thus, there would be two price points: one closer to a mobile phone, potentially around the current Quest 2 price of USD 299, and the second within the range of a professional PC that is still affordable for many – here, we can only assume it falls within the USD 1,000–2,000 range. The more advanced headsets would initially launch for the business and high-end segment before transitioning to the consumer level. One thing is clear: they will without doubt be competing with Apple, their long-running technological rival, and with the coming generations of Apple's much-anticipated MR headsets and AR glasses.
In Conclusion
So, looking at what we have learned from Meta, they currently have a couple of Quest generations in the works, as well as a potential AR headset. With all that, the big question today remains when we can expect a VR headset developer to build wearable technology that feels visually indistinguishable from a natural, real-world environment. One thing is sure: Meta is developing many of its devices to be smaller, lighter, and, importantly, affordable for the general public, in a bid to take over the Metaverse-connected hardware market.