The Cloud is the only way to solve the physical storage problem, but it brings problems of its own

Games are getting bigger, and not only in the size of their game worlds. The actual size of a game install is becoming unwieldy for a physical disc, even for UHD discs that can pack in 100GB of data. In the last several years there have been numerous articles in the games press about the huge install size of modern games. PC Gamer ran an article noting that Call of Duty: Warzone maps run to 80GB. Optimization and compression can help with lazy, bloated code, but they only go so far. Modern Warfare season 6 was so big, coming in at 375GB for the base install plus a patch, that it could not even fit on an Xbox Series S console and took up half the space of a PS5. For a Call of Duty player who owns an Xbox Series S, the question was no longer what to uninstall to make room for Modern Warfare, but which external hard drive to buy.

Just-in-time downloads

Inevitably, as game installs get larger, the Cloud could prove to be a neat and viable solution. Gamers still prefer to download games and play them locally from a console or PC to avoid latency in gameplay, but streaming games through services such as Game Pass (xCloud) and Stadia is one solution; another could be just-in-time downloads. Given that most new games now need to be online all the time, we can envision a situation where game code is continually downloaded and processed in the background and then used by the player. Once used, that data could be overwritten with new code. Rather than downloading a huge patch, the core data could reside on the hard drive and excess data could be downloaded temporarily, just before it is required. Game code coming from the publisher's server could then be updated continually and seamlessly.
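
To make the just-in-time idea a little more concrete, here is a minimal sketch in Python; it is not any publisher's real system, and the fetch_chunk callback, the chunk size and the local budget are all hypothetical stand-ins. It keeps a fixed amount of content on local storage and overwrites the least recently used chunks as new data is fetched, just as described above.

from collections import OrderedDict

CHUNK_SIZE_MB = 256        # assumed size of one streamed content chunk
LOCAL_BUDGET_MB = 20_000   # how much of the game we allow to live on the drive

class JustInTimeCache:
    # Keep only the chunks the player is about to need on local storage.

    def __init__(self, fetch_chunk):
        # fetch_chunk(chunk_id) -> bytes stands in for a download from the
        # publisher's content server; it is not a real API.
        self.fetch_chunk = fetch_chunk
        self.chunks = OrderedDict()  # chunk_id -> bytes, oldest first

    def get(self, chunk_id):
        # Return a chunk, downloading it just before it is required.
        if chunk_id in self.chunks:
            self.chunks.move_to_end(chunk_id)  # mark as recently used
        else:
            self.chunks[chunk_id] = self.fetch_chunk(chunk_id)
            self._evict_over_budget()
        return self.chunks[chunk_id]

    def _evict_over_budget(self):
        # Once new data pushes us over the local budget, overwrite the
        # oldest, already-used chunks to make room.
        while len(self.chunks) * CHUNK_SIZE_MB > LOCAL_BUDGET_MB:
            self.chunks.popitem(last=False)

In use, the game would call something like cache.get(chunk_id) for every chunk of the upcoming map region shortly before the player reaches it, so the downloads stay one step ahead of play.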

Ubisoft recently announced “Ubisoft Scalar”, a cloud-based solution which it says will be the basis for even bigger game worlds.

The idea is that game processing is done off the console or PC and in the Cloud instead, with the resulting data streamed to the gaming machine. It is not game streaming per se, but more game-engine streaming.
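
As a simplified sketch of how that split might look (the endpoint, payload and timings below are entirely hypothetical, not Ubisoft's or Microsoft's actual API): the local machine sends the current world state to a cloud service, which runs the heavy simulation and streams back only the results for the console to render.

import json
import urllib.request

# Hypothetical cloud endpoint that runs the expensive simulation step
# (physics, destruction, crowd AI) and returns the resulting object states.
CLOUD_SIM_URL = "https://cloud-sim.example.com/step"

def remote_simulation_step(world_state: dict) -> dict:
    # The console never runs the heavy solver itself; it only renders
    # whatever the server sends back each frame.
    payload = json.dumps(world_state).encode("utf-8")
    request = urllib.request.Request(
        CLOUD_SIM_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request, timeout=0.1) as response:
        return json.loads(response.read().decode("utf-8"))

# Simplified game loop on the local machine:
# while running:
#     world_state = remote_simulation_step(world_state)
#     render(world_state)

The obvious catch is that every frame now depends on a network round trip, which is why connection quality dominates the downsides discussed below.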

This idea of remote processing is not new. Microsoft tested the concept back in 2015, when it put forward the idea that Crackdown 3 would be supported by a cloud-driven physics engine, taking the strain off the Xbox One. I was at E3 when Microsoft demoed the concept, and I thought it was amazing and could change gaming forever. The game did release with the Wrecking Zone multiplayer mode that used the Cloud physics engine, but it was all a bit of a damp squib. The game got mediocre reviews and the technology was not adopted by any other game. No one cared, that is, until now.

During GDC, Microsoft announced two new cloud services for indie developers, ID@Azure and the Azure Game Development Virtual Machine, and although they do not directly offer a cloud-streamed engine solution, they could be construed as steppingstones to a broader cloud-based approach for developing, managing and maybe one day delivering games. The idea that these services could be extended to the bigger AAA publishers does not stretch the imagination. That could go some way towards answering the outcry from many gamers over publishers dropping massive day-one patches that can at times be larger than the core installation itself.

Would a neater solution not be to keep a reasonable amount of data locally on the console and have the remaining excess code downloaded only when it is needed? Given that most people are getting faster broadband speeds, such a solution is credible, in time.

The downsides of remote processing

There are downsides to this idea. The main one is that games will always need to be connected (this was the chief objection to Don Mattrick’s vision of the Xbox One), and a fast, reliable Internet connection becomes a hard requirement. Another is that game discs will not contain the full game, so archiving or preserving games will be an issue in the future if half the game resides on an external server somewhere. That means hoping the publisher will continue to invest in running the server and keeping the game alive. To a large extent this problem already exists: many single-player games today require patches to run properly, and if those patches can no longer be downloaded, the experience for the gamer is diminished. Past events have shown that game companies will mothball games forever when they shut down servers, especially for older multiplayer titles.

So, the problem of size and space is a continual one for publishers, console makers and gamers alike. There is no perfect solution, only neat ones, and inevitably there will be flaws in any digital delivery of games; we will just have to make do. Until a physical disc or portable solid-state drive that can hold terabytes of data can be produced at a reasonable cost, compromises will have to be made.

Author
Sam Naji