Although plenty of games let players move around and freely explore fully realised 3D worlds, fans watching their favourite gamer’s exploits on streaming services like Twitch are restricted to just that gamer’s perspective. But that may not be the case for long.
Being able to explore the 3D environment of a game while still monitoring the progress of a single player is far from a new idea. Many multiplayer first-person shooters give gamers who die the chance to follow another player around before they’re respawned back into combat. Plenty of games also come with spectator modes that let viewers freely float around a map other players are gaming on. That experience, however, requires every player to have a copy of that particular game installed and running.
Streaming services like Twitch instead take a single player’s rendered view of a game and broadcast it to thousands of viewers as a 2D video stream. This approach has helped further popularise video games as a spectator sport, but letting viewers change the perspective of the live stream they’re watching would make the experience even more compelling, much as alternate camera angles improve the experience of watching a sports match. Doing that, though, would normally require every single viewer to have a full copy of the game installed on the device they’re using, which isn’t always possible. You can, for instance, watch a live stream of a game that demands a powerful high-end gaming PC on a low-end tablet that could never possibly run it.
Researchers from the University of Waterloo’s Cheriton School of Computer Science in Ontario, Canada, have come up with a way to give viewers of a video game stream the ability to actually look around the 3D world being presented, without needing to own or install a copy of the game itself. Even low-end mobile devices available today, like tablets and smartphones, have enough processing power to render simple 3D environments, and the researchers leverage that to create a livestream viewing tool with expanded capabilities. Their approach is detailed in a recently published paper.
Instead of seeding everyone tuning into a livestream with a copy of the game (many A-list titles with gigabytes of graphical resources can take hours to download on even a fast connection), the researchers enhance the 2D video data sent to viewers with additional information pulled from the game’s real-time rendering engine, including the “depth buffer, camera pose, and projection matrix.”
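To give a rough sense of what that extra per-frame data could look like, here is a minimal sketch in Python. The class and field names are illustrative assumptions for the sake of example, not the researchers’ actual wire format:

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class FrameMetadata:
    """Illustrative per-frame payload sent alongside the 2D video stream.

    Shapes and conventions are assumptions for this sketch; the paper's
    actual format may differ.
    """
    depth_buffer: np.ndarray       # (height, width) depth value for each pixel
    camera_pose: np.ndarray        # 4x4 camera-to-world transform of the streamer's view
    projection_matrix: np.ndarray  # 4x4 perspective projection used by the game's renderer
```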
On the viewer’s end, this additional data allows the 2D video to be turned back into a rendered 3D environment that matches the game’s geometry. And just like in-game spectator modes, the viewer can manipulate that environment to change where they’re looking or even where the camera is positioned. Want to see where the missile that took out the gamer you’re watching came from? You could potentially look around the same 3D environment they’re in and see for yourself.
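The core step here is a standard graphics technique: “unprojecting” each pixel of the depth buffer back into a 3D point using the inverse projection matrix and the camera pose, after which those points can be re-rendered from whatever viewpoint the viewer chooses. The sketch below shows the general idea in Python; the conventions (OpenGL-style depth in [0, 1], a camera-to-world pose matrix) are assumptions for illustration rather than the paper’s exact formulation.

```python
import numpy as np

def unproject_depth(depth, camera_pose, projection):
    """Recover world-space points from a depth buffer plus camera data.

    A minimal sketch of generic depth unprojection, assuming OpenGL-style
    conventions; the researchers' actual pipeline may differ.

    depth:       (H, W) depth-buffer values in [0, 1]
    camera_pose: 4x4 camera-to-world transform
    projection:  4x4 perspective projection matrix
    """
    H, W = depth.shape

    # Pixel grid -> normalised device coordinates in [-1, 1].
    u, v = np.meshgrid(np.arange(W), np.arange(H))
    x_ndc = (u + 0.5) / W * 2.0 - 1.0
    y_ndc = 1.0 - (v + 0.5) / H * 2.0   # flip: image rows run top to bottom
    z_ndc = depth * 2.0 - 1.0           # [0, 1] buffer depth -> [-1, 1] NDC depth

    # Homogeneous clip-space coordinate for every pixel.
    clip = np.stack([x_ndc, y_ndc, z_ndc, np.ones_like(z_ndc)], axis=-1)

    # Undo the projection, then the perspective divide, to get camera space.
    cam = clip @ np.linalg.inv(projection).T
    cam = cam / cam[..., 3:4]

    # Move from camera space into world space using the camera pose.
    world = cam @ camera_pose.T
    return world[..., :3]               # (H, W, 3) world-space positions
```

Rendering that recovered point geometry from a different camera pose is what would let a viewer roam the scene independently of the streamer, using only the streamed frame and its accompanying metadata.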
The only drawback is that parts of the recreated 3D environment lack texture and graphical detail, leaving them looking mostly monochromatic and bland. For the time being, that may make this remote experience feel less compelling, but as the research progresses we may eventually see that resolved, particularly as interactive game streaming seems to be the direction the industry is moving in anyway.