If one thing is clear from our foray into the spatial computing revolution that’s occurring in the form of web 3.0 and the metaverse, it’s that the amount of data involved is staggering. Visually representing anything from a concert to an industrial manufacturing process will involve the processing and transportation of every ‘0 and 1’ needed to encode these activities.
In the last blog, we looked at the transportation part of this challenge, in the form of the networking hurdles we’re going to have to leap over. In this blog I’d like to focus on one step below this, in perhaps the most foundational problem facing spatial computing: the computation itself.
The Situation Today
Gaming has pushed the boundaries of creating realistic and dynamic worlds, with modern tools more capable than anything we have seen before. Unreal Engine 5, for example, is advancing game and world design through things like its new Lumen lighting system, virtual shadow maps and the MetaHuman character creation system. But while these game engines are no doubt revolutionising the creative part of the process, how we will actually process the metaverse for the user remains something of a mystery.
But, you may wonder, don’t we already seem to be making incredible progress in delivering these worlds? Multiplayer games are getting bigger and more realistic every day, and when we look back at earlier video games they seem incredibly primitive. But the metaverse presents a challenge far greater than anything that has come before. Matthew Ball, one of the leading writers and thinkers on the metaverse, put it this way in one of his key metaverse essays: “In totality, the Metaverse will have the greatest ongoing computational requirements in human history.”
To get a sense of why he thinks this, consider a few features of the metaverse that need to be accounted for. At a basic level, there is the physics to calculate when, say, thousands of people interact in a game or concert. We’ve come a long way with physics effects over the years, and we certainly don’t lack the knowledge or the programs for simulating these physical interactions. At such a scale, though, the problem lies in providing the actual processing power for all of them. It’s like having the right tool, but no arm to pick it up and use it with.
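To make the scale of that problem concrete, here is a minimal sketch in Python. The entity counts and the 30 Hz tick rate are assumptions chosen purely for illustration; the point is that naive pairwise physics grows quadratically, so doubling the number of participants roughly quadruples the interactions that must be evaluated on every simulation tick.

```python
# Illustrative only: naive pairwise collision/force checks scale as O(n^2).
# The entity counts and tick rate below are arbitrary assumptions.

def pairwise_interactions(entity_count: int) -> int:
    """Number of unique entity pairs a naive physics step must consider."""
    return entity_count * (entity_count - 1) // 2

TICK_RATE_HZ = 30  # assumed simulation tick rate

for entities in (100, 1_000, 10_000):
    pairs = pairwise_interactions(entities)
    per_second = pairs * TICK_RATE_HZ
    print(f"{entities:>6} entities -> {pairs:>12,} pairs per tick, "
          f"{per_second:>15,} pair checks per second")
```

In practice engines use broad-phase culling and spatial partitioning to avoid checking every pair, but even a heavily culled workload for tens of thousands of closely packed participants quickly outgrows what a single machine can process in real time.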
On top of physics we have things like data consolidation, rendering and synchronisation. The systems and NPCs (non-player characters) that occupy these virtual worlds will likely be driven by artificial intelligence, and certain applications might even call for advanced features like motion capture.
It goes without saying that all of this will demand an unfathomable amount of compute power. Pair it with the concept of a seamless experience, so crucial to the metaverse, and there is a further demand: computation that continuously scales as the activity changes. A crowded party, a shopping centre, a game arena and a concert each pose their own challenges, but simulating one directly after another will require an entirely new approach.
Let us go back to the definition of the metaverse that we began this series with: “a term that indicates the evolution of how we interact with the internet – driven by the convergence of a multitude of technologies (in particular 3D and spatial computing) and societal trends where people create and engage in shared experiences through virtual and augmented environments.”
We have to consider the ‘convergence’ of the various technologies that we explored in this series. As computers have become smaller and more powerful, the way we interact across the web has changed dramatically. That shift gave us devices such as mobiles and tablets, and web 3.0 activity is likely to take place over VR headsets and wearables as well.
It’s one thing to run a state-of-the-art game or concert, featuring high-fidelity textures and entity interaction, on a modern PC. But, as we have learnt, these metaverse experiences will need to be available to people using a whole range of devices. How is someone on a mobile phone, say, going to run these kinds of programs?
Ensuring device agnosticism while also delivering on the promises of the metaverse has led to server-side, cloud processing being pegged as the solution. The move to the cloud was a key driving force behind web 2.0, but cloud infrastructure will need to adapt further still for web 3.0 and the metaverse.
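One rough way to picture the server-side approach is a thin-client split: the device sends a few bytes of input, while the heavy, authoritative world state lives in the cloud, and only the small slice the device needs to display is sent back. The sketch below uses hypothetical message shapes invented for illustration; it is not a description of any particular product or protocol.

```python
# Rough sketch of a thin-client split (hypothetical message shapes, not a real protocol):
# the device sends lightweight inputs; the heavy simulation and world state stay server-side.
import json

def handle_client_message(message: str, world_state: dict) -> str:
    """Server-side: apply a client's input, then return only its local view."""
    msg = json.loads(message)
    player = world_state["players"][msg["player_id"]]

    # Apply the (tiny) input to the (huge) authoritative state held in the cloud.
    player["x"] += msg["move"]["dx"]
    player["y"] += msg["move"]["dy"]

    # Send back only what this device needs to render: entities near the player.
    nearby = [
        e for e in world_state["entities"]
        if abs(e["x"] - player["x"]) < 50 and abs(e["y"] - player["y"]) < 50
    ]
    return json.dumps({"you": player, "nearby": nearby})

# Example: a phone sends a few bytes of input; the server does the expensive work.
world = {
    "players": {"p1": {"x": 0.0, "y": 0.0}},
    "entities": [{"x": 10.0, "y": 5.0}, {"x": 400.0, "y": 900.0}],
}
print(handle_client_message(json.dumps({"player_id": "p1", "move": {"dx": 1, "dy": 0}}), world))
```

The design choice this illustrates is that the device never has to hold or simulate the whole world, which is what makes the experience device-agnostic: the cost moves from the handset to the cloud.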
The Cloud is an Untapped Source of Power
A huge amount of computational power is indeed available in the cloud, but accessing it has proved difficult. The method typically chosen is containerisation: because most applications are designed to run on a single machine, they have to be deconstructed and repackaged for the cloud. The orchestration of these cloud-based applications is usually handled by something like Kubernetes, which typically involves a large amount of middleware and a heavy load on DevOps. For simple programs, these solutions are viable. But for something as complex as a metaverse-scale simulation, the demand is simply far too great.
Fortunately, Hadean’s processing model is changing what’s possible. It removes the need for container orchestration tools: applications built on it are distributed by default. By scaling computation dynamically, simulations can grow almost limitlessly in size and complexity, with the extra capacity deployed across any cloud or on-premise environment as needed. Write once, deploy anywhere: Hadean brings the power of the cloud within reach.
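To illustrate what ‘scaling computation dynamically’ can mean in principle, here is a small conceptual sketch. It is not Hadean’s actual API; the capacity limit and the one-dimensional regions are invented for the example. The idea is simply that when one part of the simulation becomes too busy for a single worker, its workload is split and the pieces are handed to additional workers.

```python
# Conceptual sketch only - not Hadean's API. It illustrates one way "dynamic scaling"
# can work: when a simulation region holds too many entities for one worker,
# split it in half and hand the halves to additional workers.
from dataclasses import dataclass, field

MAX_ENTITIES_PER_WORKER = 1_000  # assumed capacity of a single worker

@dataclass
class Region:
    x_min: float
    x_max: float
    entities: list = field(default_factory=list)  # entity x-positions, for simplicity

def rebalance(regions: list[Region]) -> list[Region]:
    """Split any overloaded region so each piece fits on one worker."""
    balanced = []
    for region in regions:
        if len(region.entities) <= MAX_ENTITIES_PER_WORKER:
            balanced.append(region)
            continue
        mid = (region.x_min + region.x_max) / 2
        left = Region(region.x_min, mid, [e for e in region.entities if e < mid])
        right = Region(mid, region.x_max, [e for e in region.entities if e >= mid])
        # Recurse in case a half is still too busy (e.g. a dense crowd at a concert).
        balanced.extend(rebalance([left, right]))
    return balanced

# A crowd surge pushes one region past capacity; rebalancing spreads it over more workers.
crowded = Region(0.0, 100.0, [i * 0.02 for i in range(5_000)])
print(f"{len(rebalance([crowded]))} workers needed for {len(crowded.entities)} entities")
```

The point of the sketch is the direction of responsibility: rather than a developer pre-provisioning containers for the worst case, the simulation itself signals when and where more capacity is needed.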
As we bring this series to a close, let us reflect on this quote from Tim Sweeney, founder and CEO of Epic Games. “It makes me wonder where the future evolutions of these types of games will go that we can’t possibly build today. Our peak is 10.7 million players in Fortnite — but that’s 100,000 hundred-player sessions. Can we eventually put them all together in this shared world? And what would that experience look like? There are whole new genres that cannot even be invented yet because of the ever upward trend of technology.”
We’re entering a new era. When infrastructural technology changes, it upends the whole system. It’s not simply an improvement on what we’ve had before, it’s a completely new toolset to use. Sweeney’s curiosity around what breaking limitations might bring is being realised in the metaverse.