Meta, the company formerly known as Facebook, held a presentation today where it announced its rebranding and showed off what it imagines the future of computing will look like. Amid all the fluff, there were some genuinely interesting tech demos. The first showed off the company's lifelike Codec Avatars and an environment for them to exist in, which Meta says was rendered in real time and reacted to real-world objects. Mark Zuckerberg also talked about the company's work on neural interfaces that let you control a computer just by moving your fingers.
We’ll discuss the demos in just a moment, but it’s really worth seeing them for yourself. You can check them out on the Facebook keynote archive or on Reuters’ YouTube stream if you prefer that interface.
What is the metaverse?
It depends on who you ask, but the term usually refers to an array of interconnected digital spaces: sometimes in VR, sometimes experienced through a social network, and sometimes including real-time reference points to the physical world. You can read more about it here.
During the presentation, Meta showed off the work it’s been doing on its Codec Avatars to give users better control over their eyes, facial expressions, hairstyles, and appearance. The company also showed off the ability to simulate how the avatar’s hair and skin could react to different lighting conditions and environments and even how it was working on interactive clothing.
The presenter made it very clear that this tech is “still very definitely research,” and it’s no wonder — there’s an immense amount of hardware required to make one of the avatars, and Meta’s likely using very high-powered computers to render it. But it’s at least exciting to see that Meta’s goal is to let us render ourselves with graphics equivalent to what cutting-edge video game engines are capable of.
The company also showed off its real-time environment rendering, which it said would eventually be a place for you to use your avatar to interact with others. The system also lets people interact with real-life objects, with the changes being reflected in the virtual world. While realistic environments are nothing new, being able to make changes to them in the real world and see those changes happen in a virtual one makes for a really cool demo (even if the tracking points added to the real-life objects are a bit distracting in how much they stand out).
As for the neural interface, Meta is relying on something called electromyography, or EMG, to turn the signals your brain sends to your hand into computer commands. The company first showed off the wristband it uses to do this earlier this year, and today's demo didn't reveal much that wasn't already around in 2019, when Meta bought a company called CTRL-Labs and acquired its tech. Still, it's good to see the work continuing.
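To give a rough sense of the idea behind EMG input, here's a deliberately simplified sketch. This is not Meta's pipeline; real EMG decoding relies on trained machine-learning models running over high-rate sensor streams, and the channel names, thresholds, and command mapping below are invented purely for illustration.

```python
# Illustrative sketch only: map simulated EMG channel activations to input
# commands. Channel names, the noise floor, and the command mapping are all
# hypothetical, not drawn from any real device.

def decode_gesture(channels):
    """Pick the most active muscle channel and map it to a command.

    `channels` maps a (hypothetical) sensor name to a mean activation
    level in arbitrary units; below the noise floor we report no input.
    """
    NOISE_FLOOR = 0.2
    # Hypothetical mapping from dominant channel to an input event.
    command_map = {
        "index_flexor": "click",
        "thumb_flexor": "select",
        "wrist_extensor": "scroll",
    }
    name, level = max(channels.items(), key=lambda kv: kv[1])
    if level < NOISE_FLOOR:
        return "idle"
    return command_map.get(name, "idle")

# A strong index-finger signal registers as a click; weak signals are ignored.
print(decode_gesture({"index_flexor": 0.9, "thumb_flexor": 0.1, "wrist_extensor": 0.05}))  # click
print(decode_gesture({"index_flexor": 0.05, "thumb_flexor": 0.1, "wrist_extensor": 0.02}))  # idle
```

The appeal of the approach is visible even in this toy version: because the signal is read at the wrist rather than in the brain, tiny (or even attempted) finger movements can be translated into discrete commands without a camera or a controller.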
Meta has said that it's pursuing EMG input instead of brain-reading devices, so this may have been a preview of how you'll interact with its future hardware. Compared to visions of people playing VR ping-pong for an audience of avatars, this felt like the most realistic thing Meta showed off Thursday. That doesn't mean the Oculus (er, Meta, apparently) Quest 3 will come with an EMG wristband, but in a sea of hype without much substance, the EMG demo and Meta's other actual research were a sight for sore eyes. Check them out if you're looking for a break from the rest of the Facebook / Meta news.