CES 2024: How Immersive Tech Is Redefining the Limits of Human Connection

CES 2024 is well underway in Las Vegas, and we had the chance to attend an event hosted by Forbes and sponsored by Meta that explored the intersection of immersive technologies like augmented and virtual reality and the future of human connection.

The event began with opening remarks from Forbes Vetted Executive Editor Dave Johnson. Fittingly, his first experience with VR was at CES in 2013, when he demoed a Rift dev kit. At the time, Johnson joked with a journalist friend that he was going to buy a Rift, get it home, and never take it off again.

While the early promise of VR was palpable even then, it’s grown to encompass so many things, Johnson notes: gaming for consumers, training and simulation for the enterprise and governments, and improved situational awareness for the military. It’s truly a new kind of social platform, connecting people in interesting new ways.

“It’s immersive, it’s compelling, and now it’s combining with generative AI,” Johnson says. “We’re at a pivotal moment where technology isn’t just a tool—it’s a bridge, connecting us to each other like never before and expanding our experiences.”

Johnson then moderated a panel discussion with Menlo Ventures Partner Amy Wu and Meta’s Chief Operating Officer of Reality Labs Dan Reed. “I can’t believe I’m getting paid for what I do,” says Wu, who invests in consumer tech and gaming from seed through Series B financing. She does explain that folks in her position need to get comfortable with a certain degree of failure. “We’re in the business of trying to predict the future and find product market fit over years,” she says.

Reed, who grew up in Ann Arbor (go Wolverines), highlighted the fact that Meta changed the name of the company to reinforce its commitment to the metaverse, AR, VR, and the next computing platform. Needless to say, we’re all in.

And so is Wu, who points out that there are over three billion gamers in the world, with games making up a larger category of consumer spending than films and music combined. “Games have pushed forward a lot of technologies for years—think of NPC AI, for example,” she says. “VR games are a very obvious win.”

Johnson asked what types of games have found success in VR; notable examples so far have been short games with a strong mechanics hook, followed more recently by AAA experiences. Reed acknowledges that most VR games have been relatively short to date, though he asserts that we’re starting to see that shift with the launch of Asgard’s Wrath 2, which clocks in at 60+ hours (100+ hours for completionists).

Wu adds that comfort has been much improved across the board in VR over the years, particularly with the launch of Meta Quest 3. “And as games get longer,” she explains, “the economies can get deeper.”

Johnson then steered the conversation away from Quest 3, which “everyone knows,” and toward the Ray-Ban Meta smart glasses collection.

“There’s an entirely new computing platform coming,” Reed explains. “The smartphone is not the end of the line.” He points to future AR glasses as the form factor that will unlock all the connectivity and convenience of the smartphone and laptop combined while allowing us to remain in the moment rather than looking away at a screen. “Powered by AI, AR glasses will provide you with the information you have to drag out your phone to get to today,” he says.

And while the Quest line of VR headsets represents one path to AR glasses, Reed asserts that smart glasses are another path to get there, pointing to the integration of Meta AI and multimodal capabilities* in our current generation of Ray-Ban Meta smart glasses that will allow people to look at a sign in a foreign language and have it translated in real time, for example.

“This is a new category that’s great on its own but also a stepping stone on the road to AR glasses,” Reed says of Ray-Ban Meta smart glasses. And the utility of smart glasses could combine with safety features in the future. “You can imagine a world where AI has access to traffic patterns and real-time information to not only give you turn-by-turn directions but also alert you to obstacles and hazards in your path,” Reed explains.

This proved a logical segue to generative AI more broadly. As Johnson says, “2023 was the year of generative AI,” with many astounding advances that caught many people by surprise. “There’s potentially this crossing of paths between generative AI and AR, VR, MR, and UGC,” Johnson notes before asking: “Is that viable?”

“Absolutely,” responds Wu without missing a beat. Compelling AR and VR experiences require a lot of content, she explains, and great content can be really expensive to produce. That makes generative AI a promising way to lower the barrier to entry, though it’s debatable whether generative AI can produce truly great content: while these models can undoubtedly generate beautiful output, they still seem to struggle with spontaneity.

“If you believe these devices will become as ubiquitous as smartphones,” Reed says of VR headsets and future AR glasses, “then AI will power virtually everything in the future.” He points to shopping and commerce as just one area that generative AI could revolutionize, making experiences much more natural and participatory.

Johnson then brought up a utility company that used VR to train its employees in potentially hazardous situations, and asked for the panelists’ thoughts on the potential cost and time savings to be realized from enterprise VR adoption.

“We’re bullish on enterprise,” says Reed. He called to mind the early days of PC adoption, when flight simulators and games won over consumers while spreadsheets drove adoption in the workplace. He cited Pfizer, which cut training time by 40% using VR, and pointed to collaboration and creative design as strong use cases, with rapid VR prototyping delivering significant cost and time savings.

“The barrier thus far is that it hasn’t been easy to deploy these devices at scale,” Reed admits, though the launch of Quest for Business in Q4 2023 aims to address exactly that.

When asked by Johnson what they’re most excited about in the year to come, Wu—an avid user of Ray-Ban Meta smart glasses—points to the intersection of virtual and mixed reality and large language models (LLMs).

“We’ll have to get you into our beta test for multimodal,” Reed quips.

In a more serious vein, Reed calls out the incredible power of VR as an empathy machine, allowing us to approximate some sort of understanding of the lived experiences of others—all of which ties in with Meta’s mission to give people the power to build community and bring the world closer together. He also points to the power of AR and AI to provide real-time translation of foreign languages as another way to break down barriers and help people connect.

The event wrapped up with a closing keynote from technologist and author Shelly Palmer, who highlights the importance of “incremental betterment” and its prevalence at CES. While the improvements on display may not seem earth-shattering, they represent incremental progress that adds up in the long run.

If you happen to be at CES, Palmer suggests you don’t sleep on the Samsung exhibit, which features the world’s first micro-LED 4K display that’s completely transparent. He also gives kudos to the marketing team at SK for its Wonderland exhibit. “We’re in a world where you can build and personalize anything,” Palmer says.

Looking to the healthcare space, Palmer draws a clear distinction between health span and life span. He also sees a three-pronged approach being deployed across healthcare and wellness: a subscription-based app, some sort of physical kit, and consumable supplies you have to replenish the kit with. He also touches on the automotive industry, calling out notable absences while acknowledging the significant presence of electric vehicles and charging technology. Autonomous driving is out, though assisted driving is very much in—which Palmer sees as the key to unlocking truly autonomous vehicles in the future.

Perhaps Palmer’s most provocative assertion: “Technology is meaningless unless it changes the way people behave.” And as increasingly large numbers of people have gotten comfortable rocking a headset over the years—and as the close of 2023 saw new waves of people using MR to elevate their experience while doing household chores—we couldn’t agree more. []-)

*Meta AI features only available in the US in beta.