How Eyetracking Works in Bigscreen Beyond 2e

Most VR headsets, like the Meta Quest 3, don't have eyetracking capability; it's only seen in high-end devices such as the Apple Vision Pro and the discontinued Meta Quest Pro. We're introducing breakthrough built-in eyetracking in the Bigscreen Beyond 2e. Eyetracking enables a variety of interesting use cases in VR games. For Social VR games like VRChat, eyetracking enables more natural interaction with more expressive body language cues in avatars. For performance-intensive VR games like iRacing and Microsoft Flight Simulator, eyetracking can enable foveated rendering to improve performance and help games run more easily on PCs.

Following our core design philosophy of "every gram matters," the Beyond 2e contains the world's smallest eyetracking sensor design, with an unbelievably tiny image sensor the size of a grain of sand. But eyetracking cameras are only one part of the story. Today, we're sharing details on how our eyetracking technology works, our development roadmap, how eyetracking works with VR games, and how to get started with the Beyond 2e Eyetracking Beta Program.

Beyond 2e's Native Eyetracking Technology

We've spent the past few years operating at the bleeding edge of camera design and computer vision research & development. Our team of CV researchers and software engineers is creating fascinating new technologies that quietly "just work" under the hood to deliver a great end user experience.

Eyetracking implementations typically do image processing – simple feature extraction – on raw camera feeds of an eye. This worked fine for many years, but it can be performance-intensive and inaccurate, and it fails in many common scenarios. For example, image processing will look for the iris by searching for an elliptical shape. However, in many cases, the human iris isn't perfectly elliptical! Naive approaches assume the eyeball, pupil, and iris are simple rigid shapes, so many challenging scenarios, like drooping eyelids and eyelashes, can break them. In reality, the eye is a dynamic, fluid mass: shapes deform, squish, and stretch as the eye saccades around.
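As a concrete illustration of the fragility described above, here is a minimal, purely illustrative sketch of naive pupil detection: threshold the darkest pixels and take their centroid. The threshold value and synthetic image are assumptions for demonstration only, not Bigscreen's code; any stray dark pixels from eyelashes or shadows would pull this centroid off target.

```python
def naive_pupil_center(gray, threshold=50):
    """Naive pupil detection via simple feature extraction: treat every
    pixel darker than `threshold` as pupil and return the centroid.
    This is the fragile approach described above -- it fails when
    eyelids droop or eyelashes add dark pixels outside the pupil."""
    pts = [(x, y) for y, row in enumerate(gray)
                  for x, v in enumerate(row) if v < threshold]
    if not pts:
        return None  # tracking loss: no dark pixels found
    cx = sum(x for x, _ in pts) / len(pts)
    cy = sum(y for _, y in pts) / len(pts)
    return cx, cy

# Synthetic 64x64 "eye image": bright background (200) with a dark pupil.
img = [[200] * 64 for _ in range(64)]
for y in range(64):
    for x in range(64):
        if (x - 40) ** 2 + (y - 24) ** 2 < 36:
            img[y][x] = 10

print(naive_pupil_center(img))  # centroid near (40.0, 24.0)
```

On this clean synthetic image the centroid is perfect; on a real frame with partial eyelid occlusion, the same code drifts badly, which is exactly why shape assumptions break down.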

Rather than just image processing, our eyetracking technology understands the 3D shape of the entire eye, including the eyelids, iris, and pupil. We achieve this by training an AI model of the human eye in 3D space. And we go one step further by generating a unique human eye model for each individual user, captured from a few images of the user's eye taken by the Beyond 2e's eyetracking cameras. These images are fed into a refinement pipeline, a computationally intensive process utilizing significant GPU computation and large amounts of data that exceed the capabilities of at-home PCs. The end result is a tiny, highly accurate model, fine-tuned to the unique user, that runs locally on their GPU in a privacy-centric manner.
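To illustrate the model-based idea, here is a toy sketch of how gaze can be read off a fitted 3D eye model: the gaze ray runs from the eyeball's rotation center through the pupil center. The coordinate conventions and geometry here are assumptions for illustration, not Bigscreen's actual model.

```python
import math

def gaze_direction(eyeball_center, pupil_center):
    """Given a fitted 3D eye model, gaze is the unit vector from the
    eyeball's rotation center through the pupil center. Illustrative
    only -- the geometry here is an assumption, not Bigscreen's model."""
    d = [p - c for p, c in zip(pupil_center, eyeball_center)]
    n = math.sqrt(sum(v * v for v in d))
    return [v / n for v in d]

# Eyeball at the origin; pupil offset forward (-z) and slightly right (+x),
# in meters -- a roughly eyeball-sized offset.
print(gaze_direction((0.0, 0.0, 0.0), (0.003, 0.0, -0.011)))
```

The point of the model-based approach is that the pupil center comes from the fitted 3D model rather than from a fragile 2D ellipse search, so occlusion by eyelids doesn't destroy the estimate.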

For those who need a generalized eyetracking solution without any cloud-based fine-tuning, we're actively developing generalized solutions. Our eyetracking software will offer multiple options, letting customers weigh performance, accuracy, features, shareability, and privacy to find the approach that best suits their needs.

Option 1: Eyetracking Beta Program

Bigscreen's native eyetracking software is currently in beta. To use it today, you'll need to join the Eyetracking Beta Program, which is available to all customers. You'll automatically receive a token via email when it's your turn to receive Eyetracking Beta access. We are slowly emailing tokens in batches each day. If you wish to use eyetracking right away, you can email customer support after receiving your Beyond 2e to get early access to the Beta Program. Our Beta Program enables us to refine the product experience and roll out new technology updates regularly. It also ensures our GPU training infrastructure can handle the demand for eyetracking as we ship Beyond 2e to customers.

When you receive a token via email, insert the token into the Bigscreen Beyond Eyetracking client. This lets you train your personal eyetracking model and utilize eyetracking features in games. For more information, read the Eyetracking Beta Program Setup Guide.

Later this year, we expect this to be a seamless UX with no beta signup or setup steps. Eyetracking will work out of the box in SteamVR content and OpenXR applications.

Option 2: 3rd-party Eyetracking Software

Instead of Beyond 2e's native eyetracking software, customers can opt to use 3rd-party, community-developed eyetracking solutions such as EyeTrackVR and Babble. We've enjoyed collaborating with developers in the community and supplying early access hardware so that the VR community has great eyetracking solutions to choose from.

Any VR developer can easily develop custom eyetracking solutions. In addition to the eyetracking data piped directly into OpenXR feeds by the 2e's eyetracking software, the Beyond 2e's raw camera outputs are open and accessible to any VR developer. This enables customers to develop custom eyetracking implementations and conduct scientific research.

How Beyond 2e works with VRChat

You can use Beyond 2e with VRChat using Bigscreen Beyond's native software and OSC or VRCFT. As explained in the Setup Guide, instead of Bigscreen's native software, you can also opt to use Beyond 2e with 3rd party software such as Babble and ETVR, along with OSC or VRCFT.
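For developers curious what the OSC side of this looks like, here is a stdlib-only sketch that encodes an OSC message with float arguments, the kind of packet that could be sent to VRChat's OSC input on UDP port 9000. The eye endpoint address shown is from VRChat's OSC eyetracking documentation; treat the address, port, and values as illustrative assumptions and verify them against the current docs before relying on them.

```python
import struct

def osc_message(address, *floats):
    """Encode a minimal OSC message (address pattern + float32 args).
    Pure stdlib; no python-osc dependency needed for a packet this simple."""
    def pad(b):
        # OSC strings are null-terminated, then padded to a 4-byte boundary.
        return b + b"\x00" * (4 - len(b) % 4)
    msg = pad(address.encode("ascii"))
    msg += pad(("," + "f" * len(floats)).encode("ascii"))  # type tag string
    for f in floats:
        msg += struct.pack(">f", f)  # OSC floats are big-endian float32
    return msg

# Example: eye pitch/yaw sent to VRChat's documented OSC eye endpoint.
# (Assumed address -- check VRChat's OSC docs for the current schema.)
packet = osc_message("/tracking/eye/CenterPitchYaw", 5.0, -12.5)

# To actually send it to a local VRChat instance:
# import socket
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.sendto(packet, ("127.0.0.1", 9000))
```

VRCFT-based setups work differently (they drive avatar parameters under `/avatar/parameters/` instead), but the wire format of each OSC message is the same.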

How Beyond 2e works with OpenXR applications

Beyond 2e's native eyetracking works with OpenXR-compliant applications out of the box, sending eyetracked data into OpenXR feeds for an application to consume. If you're developing your own OpenXR applications, eyetracking data will appear natively.

However, there are a few remaining bugs in SteamVR where OpenXR failures are not yet seamlessly handled. This means some OpenXR games today may experience errors during game startup, which can prevent eyetracking features from working correctly. We're actively working on improvements with partners to ensure this works smoothly in SteamVR, our software, and OpenXR content.

Software Roadmap

Unlike hardware development, which is locked in months or years in advance, our eyetracking software receives regular updates each week.

Today's Beyond 2e native eyetracking software performs simple gaze tracking: it tracks the position of your pupil and reports it to VR games. We wanted to ensure that this feature works out of the box, while we developed more advanced features in our human eye model. Pupil position is working today in the Eyetracking Beta Program.

You can expect new features such as pupil dilation, blinking, eyelids, and more in the coming weeks and months. Our internal focus is on advanced technology development (as explained above in the Technology section), and the end result is that customers can expect many more features in VR games.

As we further develop the underlying eyetracking models and resolve OpenXR bugs, we expect to release advanced performance-enhancing features such as Foveated Rendering and Quad Views. This means the Beyond 2e can enable performance-intensive games like iRacing, DCS, and Microsoft Flight Simulator 2024 to run more easily on PCs.
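To give a flavor of how gaze-driven foveation works, here is a toy sketch that picks a shading rate from a pixel region's distance to the gaze point: full resolution in the fovea, coarser shading in the periphery. The thresholds and rates are illustrative assumptions; real foveated rendering uses the GPU's variable-rate shading hardware with carefully tuned falloff curves.

```python
import math

def shading_rate(gaze_uv, region_uv, inner=0.10, outer=0.25):
    """Pick a coarse shading rate for a screen region based on its
    distance (in normalized 0..1 screen coordinates) from the gaze
    point. Toy fixed thresholds -- real systems tune these per-lens."""
    d = math.dist(gaze_uv, region_uv)
    if d < inner:
        return 1   # 1x1: full resolution in the fovea
    if d < outer:
        return 2   # 2x2: mid-periphery, quarter the shading work
    return 4       # 4x4: far periphery, 1/16th the shading work

print(shading_rate((0.5, 0.5), (0.52, 0.5)))  # near gaze -> 1
print(shading_rate((0.5, 0.5), (0.9, 0.5)))   # periphery -> 4
```

Because peripheral regions shade far fewer samples, the savings scale with how much of the screen sits outside the fovea, which is why eyetracked foveation helps demanding simulators the most.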

Looking to the future

Eyetracking is a fascinating technology. While it enables short-term features and performance improvements in VR games, it also has the potential to unlock radical long-term features. Varifocal lenses, realtime chromatic aberration correction, and pupil swim correction all rely on exceptional eyetracking. With continued research and development in eyetracking, we can one day create new products with even better visuals.
