At its outset, the metaverse offered an exciting new concept that had until then existed only in sci-fi: a virtual world where we could be anyone and go anywhere we wanted.
We could walk around with wearables and interact in a realm overlaid with and enhanced by digital graphics, 3D models and other interactive features. There would be virtually no limit to what we could do.
But this idea never came to pass — at least not quickly enough for the impatient in society — and excitement for the metaverse was almost immediately doused by generative AI.
“People tend to focus on one thing at a time; generative AI was such a big shiny object, such a big disruption that it became the focus,” Gartner analyst Marty Resnick told VentureBeat. “The metaverse in the short term seemed to be a little bit disappointing.”
This has led some to say that the metaverse is not just passé, but “dead.”
But is it really? Experts say no — its definition and use cases are simply being reimagined beyond the ‘Everything Everywhere All at Once’ concept.
“Stop thinking about it within this virtual world context like VR,” Resnick said. “It’s more about new interactions between the physical and digital worlds.”
Meta, which changed its name to reflect its lofty ambitions and confidence in virtual worlds, may have lost $47 billion on its metaverse investment. Even so, the metaverse economy is expected to grow to $400 billion by 2030 (up from $48 billion in 2022), and the technology could generate up to $5 trillion in impact by the end of this decade.
Some predict the Apple Vision Pro — version 1.0 is expected as early as January — may well reinvigorate enthusiasm after a year dominated by gen AI.
Going forward, as opposed to pure VR — where users are immersed in worlds with no real rules — the metaverse will increasingly become part of the physical world through augmented reality (AR) and extended reality (XR), experts predict.
“The biggest opportunity for the metaverse is in the physical world, as opposed to digital ones,” said Resnick.
Ultimately, it’s not going to be one or the other; each will have its place. “We will go into the virtual world for certain experiences, and the virtual world will come to the physical world for certain experiences,” Resnick forecasted.
For the enterprise, the metaverse of the future can provide opportunities in augmented work, learning and development. For instance, users can interact through virtual offices and participate in collaborative digital onboarding.
Financial giants JP Morgan and Citibank have already launched these types of virtual onboarding and internship programs.
“A class of new employees coming in can get to know each other, collaborate and connect very quickly,” Resnick explained.
In this immersive space, for example, enterprises can present subjects like harassment and racism (or other ‘isms’) in ways that feel more real and don’t come off as stilted and scripted (such as in typical onboarding videos). This can be far more impactful and instill more empathy than traditional training and onboarding materials.
“It feels more like an earned learned experience than something you passively learned about or were informed about,” Bill Briggs, Deloitte CTO, told VentureBeat. “The retention and recall is just higher. Your brain is storing it in a different place.”
The metaverse also has great potential in industrial settings.
People can interact with machines to design, build and optimize manufacturing systems, experts say. Sensors, AI, XR, VR and digital twin technologies can provide simulations and real-world augmentation in operations, warehousing and logistics.
For instance, said Briggs: How can companies improve the flow of inventory? How can they approach potential machinery repair?
With spatial data and digital overlays, workers can see information from numerous systems integrated with “real time data, real-world controls,” he said. They can then fine tune production flow and run hundreds or even thousands of scenarios.
“They can have the ability to pivot with the future of their product and industry,” said Briggs.
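To make the "hundreds or even thousands of scenarios" idea concrete, here is a rough, purely illustrative Python sketch of a scenario sweep against a toy digital-twin throughput model. The model, parameter names and numbers are invented for illustration and are not drawn from the article or any specific industrial platform.

```python
# Hypothetical, highly simplified "digital twin" scenario sweep.
# The throughput model and all parameter values are invented for illustration only.
from itertools import product

def simulate_throughput(machines: int, shift_hours: float, defect_rate: float) -> float:
    """Toy model: units produced per day after discarding defects."""
    units_per_machine_hour = 12.0          # assumed nominal production rate
    produced = machines * shift_hours * units_per_machine_hour
    return produced * (1.0 - defect_rate)  # subtract defective units

# Sweep a grid of what-if scenarios (here 5 x 4 x 5 = 100 combinations).
machine_counts = [8, 10, 12, 14, 16]
shift_lengths = [6.0, 8.0, 10.0, 12.0]
defect_rates = [0.01, 0.02, 0.03, 0.05, 0.08]

best = max(
    product(machine_counts, shift_lengths, defect_rates),
    key=lambda scenario: simulate_throughput(*scenario),
)
print("Best scenario (machines, shift hours, defect rate):", best)
print("Projected daily output:", simulate_throughput(*best))
```

In a real deployment, the toy formula above would be replaced by a physics-based or data-driven twin fed with live sensor data, but the pattern of sweeping many candidate configurations and comparing outcomes is the same.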
Similarly, the metaverse can augment human workers. For instance, a manager who can see what a specific employee is seeing through interactive devices can help triage problems. Or training can take place on virtual versions of expensive, dangerous or difficult-to-replicate equipment.
“The conventional idea of the metaverse was this divorce from reality where we have a digital avatar to convene, communicate, cavort,” said Briggs. But the industrial metaverse “is seamlessly blending physical and digital.”
He added: “It’s the ability to shrink time and space, which is exciting.”
Challenges with technology, social acceptance
Still, there are significant challenges to overcome before the metaverse can reach its true potential.
For starters, VR and spatial computing technologies simply aren’t there yet. Users need headwear and displays that are like “typical glasses you wear everyday, as opposed to big cyberpunk glasses,” said Resnick.
Social buy-in is critical to the metaverse and will come about with the right hardware, he noted.
“If people are embarrassed to wear it, it’s not going to be accepted,” said Resnick.
Briggs agreed that “the idea of walking around with a computer strapped to your face is not exactly attractive.”
Similarly, users want to interact with something that looks and feels real — so graphics and overlays need to be responsive and on point. Some experts say Universal Scene Description (USD) is on its way to standardizing and democratizing tools to build virtual worlds, but that’s still in early stages of adoption.
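For readers unfamiliar with USD, the sketch below (not from the article) shows roughly what authoring a scene with Pixar's open-source pxr Python bindings looks like; the factory-floor naming and placeholder geometry are hypothetical and purely illustrative.

```python
# A minimal sketch of authoring a scene with Universal Scene Description (USD),
# using the open-source "pxr" Python bindings (e.g. via the usd-core package).
# The scene contents (a factory floor with a robot-arm placeholder) are hypothetical.
from pxr import Usd, UsdGeom

# Create a new USD stage (the container for the scene graph) as a plain-text .usda file.
stage = Usd.Stage.CreateNew("factory_floor.usda")

# Define a root transform and mark it as the default prim so downstream tools know where to start.
world = UsdGeom.Xform.Define(stage, "/World")
stage.SetDefaultPrim(world.GetPrim())

# Add a simple cube primitive standing in for a piece of equipment.
robot = UsdGeom.Cube.Define(stage, "/World/RobotArmPlaceholder")
robot.GetSizeAttr().Set(2.0)  # edge length in scene units

# Position the placeholder somewhere on the floor.
robot.AddTranslateOp().Set((5.0, 0.0, 3.0))

stage.GetRootLayer().Save()
```

Because USD layers like this can be composed, referenced and overridden across tools, it is one candidate for the kind of shared, standardized content pipeline the experts describe, though, as noted, adoption is still early.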
This lack of digital content is a bigger challenge than connectivity, devices, sensors and greenfield versus brownfield retrofitting issues, Briggs contended.
The metaverse needs “photo-realistic, physics based renderings of products, equipment, facility and operating processes that in most companies and industries just doesn’t exist,” he said. “It’s this gap of how to create digital content that spatial computing elements require.”
While gen AI may have — at least temporarily — pushed the metaverse off the world stage, the two will inevitably enhance one another going forward.
For instance, gen AI can help create and enhance digital assets including 3D components.
“I don’t think you can look at them separately,” said Resnick. “They work very well together and will continue to do so.”
He pointed out that the 'next big thing' will be facilitated by a combination of technologies, not any single one on its own.
Gen AI and the metaverse working together, he said, will enable “more hyper personalized environments that can be created and experienced by anyone. It’s that whole democratization piece.”
Reimagine (don’t recreate) the way things have been done
No doubt, there is great ambition and imagination when it comes to the metaverse, AI and other evolving, cutting-edge technologies.
But, Briggs pointed out, it’s critical that organizations get over the sci-fi idea and create clear strategies.
“It’s a delicate balance of harnessing the enthusiasm without letting it become so ambiguous that you spend a lot of money and a lot of time and get a less than tangible impact and result,” he said.
Enterprises must “bound the aspiration into some real, meaningful problem states and potential improvement.”
By identifying use cases and outcomes, organizations can “fundamentally re-imagine” processes and spur ingenuity and creativity. He emphasized that the worst scenario is simply replicating the traditional ways that things have been done.
As he put it: “You tech-enable inefficient processes, you’ve just weaponized inefficiency.”
Ultimately, he pointed out, the metaverse, gen AI and other technologies are moving in a way that is “more evolution than revolution,” and on a more predictable path than people think (even as there are significant breakthroughs in individual areas).
“The collision between these different tech advances and forces is where all the most exciting use cases are happening,” said Briggs. “No one technology is the hero of any story. It’s up to all of us to keep that in mind and also not be beholden by the way we’ve always thought about the world and done things.”