How Do We Power Web 3.0 and The Metaverse?

The next evolutionary step of the internet in the form of web 3.0 and the metaverse is here, but computational challenges remain.

1 February 2022
Hadean Team

As emerging technologies begin to converge, we are witnessing the next evolutionary step of the internet in the form of web 3.0 and the metaverse. The markets themselves are materialising this change, with Microsoft’s recent $70bn cash deal for Activision likely to give it another key building block for its metaverse. However, many challenges remain in realising this vision, particularly around how we’ll provide the computational power needed for this new era. It’s certainly no coincidence that Facebook’s parent company, Meta, is also building an AI supercomputer to help realise its dream for the metaverse.

The move from web 1.0 to 2.0 saw webpages shift from displaying static, read-only information to what became known as the participatory ‘social’ web. Content grew exponentially as social media, the spread of mobile phones and the dawn of cloud computing gave anyone the opportunity to contribute. Today, new technology is changing the web once again, as we transition to the ‘spatial’ web.

Pinning this evolutionary step down to a single driver is difficult, with a number of technologies playing a role. VR/AR, wearables and IoT devices are transforming how we interact with the web, as we immerse ourselves in 3D spatial worlds instead of 2D screens. Digital twins are helping us better represent physical objects and systems, blurring the lines between the digital and physical worlds. AI/ML is accelerating data analysis and the creation of worlds, along with the NPCs and systems that inhabit them. The distributed nature of these systems has strengthened the case for blockchain, which provides a suitable decentralised information structure.

Though web 3.0 and the metaverse are not one and the same, they are often mentioned in the same breath for good reason. The persistent, limitless world of the metaverse is brought within grasp by web 3.0 technology, but challenges remain.

Realising it requires connecting data-hungry, compute-heavy applications, which demands an overhaul of computing infrastructure. High performance computing needs to be available at the point of use and readily scalable to meet demand. Computation needs to move seamlessly between the cloud and the edge. Essentially, decentralised spatial computing needs an equally distributed approach, which current providers lack.
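To make the cloud-versus-edge point concrete, here is a minimal sketch of how a workload placement decision might look. The node names, latency figures and capacity limits are illustrative assumptions, not a description of any particular provider’s scheduler.

```python
# A rough sketch of edge-vs-cloud placement: pick the cheapest node that
# satisfies a workload's latency and capacity bounds. All figures are assumed.
from dataclasses import dataclass

@dataclass
class Node:
    name: str
    location: str          # "edge" or "cloud"
    latency_ms: float      # estimated round-trip latency to the user
    free_cores: int

@dataclass
class Workload:
    name: str
    max_latency_ms: float  # e.g. physics near a player needs low latency
    cores_needed: int

def place(workload: Workload, nodes: list[Node]) -> Node | None:
    """Choose a node that meets the latency bound and has spare capacity."""
    candidates = [
        n for n in nodes
        if n.latency_ms <= workload.max_latency_ms
        and n.free_cores >= workload.cores_needed
    ]
    if not candidates:
        return None
    # Prefer cloud capacity (plentiful) over edge (scarce), then lowest latency.
    return min(candidates, key=lambda n: (n.location == "edge", n.latency_ms))

nodes = [
    Node("edge-london", "edge", latency_ms=8, free_cores=4),
    Node("cloud-eu-west", "cloud", latency_ms=35, free_cores=256),
]
print(place(Workload("player-physics", max_latency_ms=20, cores_needed=2), nodes).name)
print(place(Workload("world-analytics", max_latency_ms=200, cores_needed=16), nodes).name)
```

Latency-critical work lands on the nearby edge node, while bulk analysis falls back to cheaper, more plentiful cloud capacity.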

Consider first online virtual worlds and the restrictions legacy computation has placed on them. The number of participants per instance is usually capped at around 50-100, vastly restricting the kinds of experiences possible. While larger numbers have been achieved, keeping the experience smooth when users converge in one place remains a key difficulty. As a result, large-scale virtual events, such as industry conferences or music concerts, remain sparsely populated and ultimately uninteresting. Other metaverse use cases, such as gameworlds or social hubs, linger under creative restrictions, leading to stagnant experiences and little market interest.
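One common way to think about that convergence problem is spatial partitioning: divide the world into cells, give each cell its own worker, and rebalance when players crowd into one area. The sketch below is a toy illustration under assumed cell sizes and caps; real engines also handle cross-cell interactions and handover, which are omitted here.

```python
# Toy spatial partitioning: group players into grid cells and flag cells
# that exceed the per-server cap the article describes. Figures are assumed.
from collections import defaultdict

CELL_SIZE = 100.0          # metres per cell side (assumed)
MAX_PLAYERS_PER_CELL = 50  # the kind of per-instance cap described above

def cell_of(x: float, y: float, size: float = CELL_SIZE) -> tuple[int, int]:
    return (int(x // size), int(y // size))

def assign(players: dict[str, tuple[float, float]]):
    """Group players into cells; report cells that need splitting/rebalancing."""
    cells: dict[tuple[int, int], list[str]] = defaultdict(list)
    for pid, (x, y) in players.items():
        cells[cell_of(x, y)].append(pid)
    overloaded = {c: ps for c, ps in cells.items() if len(ps) > MAX_PLAYERS_PER_CELL}
    return cells, overloaded

# 120 players crowding one corner of the map, 30 spread elsewhere.
players = {f"p{i}": (5.0 + i * 0.1, 5.0) for i in range(120)}
players |= {f"q{i}": (500.0 + i * 10, 900.0) for i in range(30)}
cells, overloaded = assign(players)
print(len(cells), "cells in use;", len(overloaded), "need splitting")
```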

Low fidelity in these worlds acts as a further barrier to user engagement, both in terms of aesthetic detail and the complexity of mechanics. Environments, characters, and economic and ecological systems are overly simplified and offer little improvement over the virtual worlds of the past. For the metaverse to grow to market expectations, this fidelity has to improve to entice a larger audience.

Building a complex and populated world is all very well, but ensuring that it persists through time and continues to change is another crucial requirement. One of the defining features of the metaverse is its constant presence in tandem with the physical world. Persistent worlds have been created before, but reliability and cost remain the key difficulties. Storing and continuously updating the vast amount of spatial data involved in running complex worlds is both computationally challenging and expensive. Surges of data can cause crashes, while provisioning for peak load during quiet periods is financially wasteful.
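One way to frame that provisioning trade-off is to size capacity from the observed update rate rather than a fixed peak estimate. The snippet below is a simplified sketch with assumed per-worker throughput and headroom figures, not a production autoscaler.

```python
# Elastic provisioning sketch: derive the worker count from the current
# spatial-update rate, so surges don't overwhelm the service and quiet
# periods don't pay for idle capacity. All constants are assumptions.
import math

UPDATES_PER_WORKER_PER_SEC = 10_000   # assumed sustainable throughput per worker
HEADROOM = 1.25                       # keep 25% spare capacity for spikes
MIN_WORKERS, MAX_WORKERS = 2, 64

def desired_workers(updates_per_sec: float) -> int:
    needed = math.ceil(updates_per_sec * HEADROOM / UPDATES_PER_WORKER_PER_SEC)
    return max(MIN_WORKERS, min(MAX_WORKERS, needed))

for rate in (3_000, 80_000, 450_000):  # overnight lull, normal play, live event
    print(f"{rate:>7} updates/s -> {desired_workers(rate)} workers")
```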

Virtual worlds need virtual economies, and the transaction layer of web 3.0 will be built on decentralised technology such as blockchain, which enables ownership and transparency of data. This inherently distributed architecture will need a high performance compute layer able to deliver processing power to the point of need across different environments, be it edge, cloud or on-premise.
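As a purely illustrative example of why a blockchain-style ledger suits that ownership layer, the sketch below records asset transfers as hash-linked entries, making the ownership history transparent and tamper-evident. It is not a real chain: consensus, signatures and networking are all omitted, and the names are hypothetical.

```python
# Hash-linked ownership ledger, purely for illustration: each transfer
# references the hash of the previous record, so history is tamper-evident.
import hashlib, json, time

def record_transfer(chain: list[dict], asset_id: str, new_owner: str) -> dict:
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"asset": asset_id, "owner": new_owner, "ts": time.time(), "prev": prev_hash}
    # Hash the record (without its own hash field) and append it to the chain.
    body["hash"] = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append(body)
    return body

ledger: list[dict] = []
record_transfer(ledger, "sword-0042", "alice")
record_transfer(ledger, "sword-0042", "bob")   # alice sells the item to bob
print(ledger[-1]["owner"], ledger[-1]["prev"][:12])
```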

Connecting the layers and technologies of web 3.0 will allow the metaverse to become what was envisaged for it, but it requires a rethink of computing infrastructure. With the correct orchestration, edge computing, 5G and the distributed cloud can help deliver this.
