Enabling a Metaverse-Ready Network

By David Krauss, Network Architecture, Office of the CTO, Ciena.


Ready or not, the metaverse is coming. For customer-facing businesses in particular, it will have a real impact, because one of its applications will be to serve as a channel for customer service. Just as chat, social and digital channels have become contact centre must-haves, driven by younger generations used to interacting with friends in the same way, the metaverse will follow.

An immersive, interactive metaverse – whether being used as a customer service channel or for countless other reasons – requires a robust network. Today’s applications are already pushing the limits of available bandwidth and latency low enough to provide high quality experiences. So, what does a network need if it is going to support a metaverse-ready future?

Today’s metaverse

Bandwidth and latency together define a network measurement that can be referred to as “mean-time-to-cloud”: the time to transfer data to the cloud, the time to process that data, and the time to return the result to the user. Today’s networks can deliver latencies of tens of milliseconds, but only at relatively low bandwidths. Networks can also deliver large amounts of bandwidth, but they incur significantly more latency when doing so, because of the sheer amount of data being transferred.
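The interplay described above can be sketched as a simple model. This is an illustrative calculation only; the function name, parameters and example figures are assumptions for demonstration, not measurements from the article.

```python
def mean_time_to_cloud_ms(payload_mb: float, bandwidth_mbps: float,
                          processing_ms: float, rtt_ms: float) -> float:
    """Estimate "mean-time-to-cloud": time to transfer data to the cloud,
    process it there, and return the result to the user.

    Illustrative model only; real networks add queuing, protocol and
    retransmission overheads not captured here.
    """
    # Serialisation delay: megabytes -> megabits, divided by link rate.
    transfer_ms = (payload_mb * 8 / bandwidth_mbps) * 1000
    return transfer_ms + processing_ms + rtt_ms

# A hypothetical 5 MB frame over a 100 Mb/s link, with 5 ms of cloud
# processing and a 20 ms round trip:
print(mean_time_to_cloud_ms(5, 100, 5, 20))  # 425.0 ms
```

Note how the transfer term dominates at high payloads: raising bandwidth shrinks it, but the round trip and processing time set a floor, which is why both bandwidth and latency matter.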

In a virtual environment, when the lag between a user making a motion and the movement being displayed (known as “motion-to-photon” latency) reaches about 10ms, dizziness or motion sickness starts to kick in and ruins the immersive experience. To avoid this, and to deliver network performance capable of supporting a metaverse with movement as smooth as users need it to be, the network will require massive amounts of bandwidth, coupled with huge amounts of computation and storage accessible at ultra-low latencies. At that point, the transfer of data sits at or below the threshold of human or machine perception, that is, “instantaneous communications”.
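One way to reason about the 10ms figure is as a budget shared across every stage between motion and photon. The stage names and numbers below are hypothetical, chosen only to show how quickly the budget is consumed.

```python
# Hypothetical motion-to-photon latency budget for a cloud-rendered
# VR frame. Stage names and per-stage figures are illustrative, not
# measured values from the article.
BUDGET_MS = 10  # threshold cited in the article

stages_ms = {
    "sensor sampling":   2,  # reading head/controller motion
    "network transit":   3,  # round trip to the nearest edge site
    "render at edge":    3,  # GPU time to draw the frame
    "display scan-out":  2,  # pushing pixels to the headset panel
}

total_ms = sum(stages_ms.values())
verdict = "within budget" if total_ms <= BUDGET_MS else "over budget"
print(f"{total_ms} ms: {verdict}")  # 10 ms: within budget
```

With only 3ms left for network transit in this sketch, the compute serving the headset effectively has to sit within a few hundred kilometres of the user, which motivates the edge-placement argument that follows.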

Adaptive networks

Building a metaverse-ready network is complex. For example, users will want to access any metaverse application, with the highest possible quality of experience, from any device, anywhere, at any time. Consider a home or business connected to a nearby datacentre. Both endpoints are fixed, so a fibre connection (specifically, broadband connectivity through passive optical networking (PON)) can provide the high bandwidth and low latency needed for a positive metaverse experience. Mobile is different, requiring advanced wireless connectivity such as 5G, 5G-Advanced and eventually 6G. As a person moves, their connected datacentre must change, so that the application and its computation can shift to whichever datacentre is closest to them at any given moment. To preserve the required high bandwidth and low latency, the network must adapt and keep resources close by.
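The follow-the-user idea can be sketched in a few lines: as the user's position changes, the serving edge site is re-selected. The site names and coordinates below are entirely hypothetical, and straight-line distance stands in for what would really be a network-latency measurement.

```python
import math

# Hypothetical edge datacentres (latitude, longitude); names and
# positions are illustrative only.
EDGE_SITES = {
    "london-dc":     (51.51, -0.13),
    "slough-dc":     (51.51, -0.59),
    "manchester-dc": (53.48, -2.24),
}

def nearest_site(lat: float, lon: float) -> str:
    """Pick the edge site closest to the user's current position, so the
    application's compute can follow the user as they move.

    Uses straight-line distance as a stand-in for measured latency.
    """
    def distance(site: str) -> float:
        site_lat, site_lon = EDGE_SITES[site]
        return math.hypot(site_lat - lat, site_lon - lon)
    return min(EDGE_SITES, key=distance)

print(nearest_site(51.50, -0.12))  # user in central London -> london-dc
print(nearest_site(53.40, -2.20))  # user near Manchester -> manchester-dc
```

A production system would of course weigh live latency, load and capacity rather than geometry alone, but the principle is the same: the serving site is a function of where the user is right now.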

A metaverse-ready network will require improved network technology in three main areas:

· Faster and larger networks to deliver huge amounts of bandwidth to end-user applications. This will require a combination of both fibre and wireless networks, with the ability to deliver as much information through a given communications channel as is physically possible.

· Networks that place massive amounts of computation and storage as close as possible to the users and the applications that rely on them. This will reduce latency and keep mean-time-to-cloud in check.

· Smarter networks with automation and AI/ML to manage resources in a closed-loop manner with the ability to heal or reorganise the network quickly. This will require a self-optimising fabric where applications dictate to the network the level of performance and the resources needed, and then the network delivers them.
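The third bullet, where applications dictate the performance they need and the network delivers it, is essentially an intent-based admission loop. The sketch below shows the shape of that interaction; the class, function and thresholds are assumptions for illustration, not a real controller API.

```python
from dataclasses import dataclass

@dataclass
class Intent:
    """Performance an application declares to the network (illustrative)."""
    min_bandwidth_mbps: float
    max_latency_ms: float

def can_satisfy(intent: Intent,
                available_bandwidth_mbps: float,
                measured_latency_ms: float) -> bool:
    """Closed-loop check: report whether the current network state meets
    the declared intent. In a real self-optimising fabric, a negative
    answer would trigger the controller to re-route or re-place compute.
    """
    return (available_bandwidth_mbps >= intent.min_bandwidth_mbps
            and measured_latency_ms <= intent.max_latency_ms)

vr_session = Intent(min_bandwidth_mbps=500, max_latency_ms=10)
print(can_satisfy(vr_session, 800, 8))   # True: intent met
print(can_satisfy(vr_session, 800, 25))  # False: latency exceeds intent
```

The closed loop comes from running this check continuously against telemetry and feeding failures back into the network's optimisation logic, rather than configuring paths once and hoping conditions hold.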

Coming soon

With Web 2.0 users are connected across the network to where the application resides. With Web3, applications and functions are decentralised and don’t reside in one place. The role of the network begins to look like a fabric interconnecting these Web3 services. To a certain extent, the applications and their functions become embedded in the network infrastructure, allowing them to exist throughout the fabric.

Sooner than we realise, generative artificial intelligence will fundamentally shift the way we access and interact with information. For decades, we’ve formulated questions and queries in a way that search engines understand (“Ford Fiesta boot dimensions”). But soon we’ll be able to use natural language to work with data (“Will my new 65-inch TV fit in my car?”).

Imagine an intelligent avatar tool that can preserve your presence in the metaverse when you’re not online, interact with people or other avatars with your mannerisms and your knowledge, and then report back to you on what transpired. This tool might even be able to conduct autonomous decentralised finance (DeFi) transactions on your behalf, function as your proxy for a DAO (decentralised autonomous organisation), and, with the right training, even “create” for you independently.

Even with a network to connect the required computation and storage, how all this physically connects to our bodies will be important. Today we wear headsets, glasses or goggles that deliver photo-realistic images or augmented information to our eyes. These are bulky and uncomfortable, and can only be worn for short periods before fatigue sets in. Haptic feedback, via gloves or suits, is rudimentary at best. Scents can be created with a squirt of mixed liquids integrated into a VR headset, but this just adds complexity to an already non-ideal solution. What will be needed is a way to interface directly with our brains: a non-invasive brain-computer interface (BCI).

Innovators have already begun exploring BCI techniques, but they currently require placing sensors in or near the brain. The accuracy of simply trying to understand the meaning of brain signals is based on the resolution of information gathered, which could require thousands to millions of sensors. Today’s challenge of collecting, computing, and storing large amounts of data at the metro city level will likely evolve into a scaling issue at the “body area network” level as well. Now add to this the next step after interpreting brain signals: inducing signals within the brain to produce a multi-sensory effect.


Realising the future of the metaverse

Thirty years ago, who could have predicted the profound impact the Internet and mobile communications would have on the world today? Similarly, how can we predict the impact that the metaverse will have on the world 30 years from now?

The metaverse will be the evolution of the internet. The sheer number of internet applications shows that IP-based innovation can produce incredible results, and the metaverse will be no different. But this will only be possible with a metaverse-ready network.

If businesses want to provide immersive and interactive services and experiences in the metaverse in the future, they will need their communications service providers (CSPs) to deliver an adaptive, technologically advanced network that supports substantial bandwidth at very low latency. Businesses should look for service providers that work with leaders in networking systems who are developing the hardware and software to program and orchestrate the Adaptive Network.
