Over two thousand companies descended on Las Vegas last week for CES, the annual tech exhibition organized by the Consumer Technology Association. Given the way the last year has gone, you can probably guess the conference’s buzzword.
Hint: It was “metaverse.” Ina Fried, writing in Axios, joked that “Many CES observers suggested a drinking game in which keynote watchers took a shot every time the metaverse was mentioned—but that would have been a recipe for alcohol poisoning.” Nima Zeighami, who works in the immersive technology industry and attended the conference, posted an extremely entertaining and derisive tweet thread chronicling the uses of the word “metaverse” in various ads and branding exercises.
While some uses of the term bordered on meaningless, there were also several pieces of technology to get genuinely excited about. (My colleague Patrick Lucas Austin also has a more general roundup here.) And for many at CES, the buzzword was simply an entry point into more specific and technical dialogues. “The concept of the metaverse is starting to pivot from just a hot topic into a way to have more informed conversations about these technologies: the difference between AR (augmented reality) and VR (virtual reality), between digital twins and virtual objects,” says Chris Stavros, the founder of the AR/VR platform MakeSea, who attended CES and spoke on a panel about education in virtual spaces.
Here are the announcements out of CES that caught my eye, for better or worse:
A full decade after Google announced Google Glass (eyeglasses with a built-in smart display), smart glasses have yet to permeate mainstream culture. But the public’s lack of interest isn’t stopping many companies from developing their own prototypes. The concept makes a lot of sense in the abstract: since we spend so much time looking at screens, why wouldn’t we want to transpose some of that information onto what we see in the real world? On the other hand, smart glasses pose a bevy of privacy and security risks. They make it easier for people to be surveilled without their knowledge, and the technology could be hacked or abused by stalkers.
Regardless, the Chinese electronics maker TCL unveiled smart glasses that let you take and share photos, navigate with GPS directions projected into your field of vision, and set up a work display with multiple virtual monitors. Microsoft announced a partnership with Qualcomm to develop lightweight AR glasses. And smart contact lenses are coming, too: Mojo Vision is partnering with Adidas and other athletics-focused companies to develop contacts that provide real-time performance data, like your running pace or the upcoming turns on a ski slope. The company, however, is still awaiting FDA approval.
So when might you see people wearing smart glasses on the street? John Egan, the CEO of the tech analyst firm L’Atelier, told me last month he thinks it will still be quite a while. “If the utility of a product is very high, the aesthetic barrier is lowered, and vice versa,” he says. “Lensware and glassware have not reached the point where they have achieved an aesthetic value for which people will accept the low level of utility. That’s a chasm that it has to cross.”
VR (virtual reality) headsets, which completely cover your field of vision to transport you into 3D graphic worlds, have also been slow to take off. Personally, I’ve found wearing a headset disorienting and headache-inducing; I have a hard cap on how long I can wear the thing before my temples start throbbing. (It seems like many people are having better luck, though: Meta’s Oculus VR app was the most downloaded app at Christmas.)
And a couple of new prototypes announced at CES could make VR more mainstream. PlayStation’s VR2 promises “new sensory features” and eye tracking, which lets you swivel your eyes to look left and right instead of turning your head. Panasonic, in contrast, is going for lightness with its MeganeX headset, which weighs about half as much as Meta’s Oculus Quest 2.
To ease the physical disorientation of spending time in virtual worlds, companies are developing products that let you feel bodily sensations based on what’s happening inside the metaverse. The Spain-based startup Owo is hawking $450 haptic jackets meant to let users feel “a gunshot, the wind, someone grabbing your arm and even a hug from a loved one.” Shiftall, a subsidiary of Panasonic, has a bodysuit that makes you feel temperature changes via a sensor placed on the nape of the neck. These flourishes might seem trivial, but a recent study from the National Research Group found that a majority of consumers said a key draw of the metaverse would be its ability to “more closely resemble physical interactions.”
One of the main questions the general public had about NFTs last year was “how do you even look at them?” Well, Samsung thinks it has the answer: its new TV sets will be compatible with NFTs, so that you can view your Bored Apes and browse NFT marketplaces on a big screen. It’s not exactly high up on the list of things the world needs, but I bet plenty of newly wealthy NFT whales will buy them.
Few companies expressed as much enthusiasm at CES for metaverse-related developments as Hyundai Motor Company, which used lavish rhetoric to wax poetic about big, nebulous metaversian ideas. (I’d honestly love for someone to explain to me what “metamobility” is.) The company also announced a partnership with Unity to build digital twin factories; I wrote about the phenomenon in an earlier newsletter.
There were also a couple of metaverse-focused panels at CES, including “Learning in a Virtual World.” The panel may have lacked splashy announcements, but it gave a solid overview of the progress being made in the space. Stavros, the founder of MakeSea, was one of the panelists; he’s excited about how MakeSea is starting to be used across disciplines in K-12 education. “One kid is doing drone scanning, another one is looking at skeletons and MRI scans. We’ve got a student who’s focused on architecture and modeling, and another kid that was working on a robotics project,” he says. “They’re learning how to use this technology as a universal communication tool.”