Researchers from the University of Waterloo’s Cheriton School of Computer Science in Ontario, Canada, have come up with a way to let viewers of a video game stream actually look around the 3D world being presented, without needing to own or install a copy of the game itself. Even low-end mobile devices available today, like tablets and smartphones, have enough processing power to render simple 3D environments, and here that is leveraged to create a livestream viewing tool with expanded capabilities. Their research is detailed in a recently published paper.
Instead of seeding everyone tuning into a livestream with a copy of the game (many A-list games, with gigabytes of graphical resources, can take hours to download even on a fast connection), the researchers simply enhance the 2D video data sent to viewers with additional information pulled from the game’s real-time rendering engine, including the “depth buffer, camera pose, and projection matrix.”
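In code, the per-frame side channel might look something like the following sketch. The field names and shapes here are illustrative assumptions; the source only tells us that the three pieces of data above accompany each video frame.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class FrameMetadata:
    """Hypothetical per-frame payload riding alongside the 2D video.

    The paper names the depth buffer, camera pose, and projection
    matrix; the exact layouts and conventions below are assumptions.
    """
    depth_buffer: np.ndarray  # (H, W) depth values from the engine
    camera_pose: np.ndarray   # (4, 4) camera-to-world transform
    projection: np.ndarray    # (4, 4) projection matrix
```

Note that the depth buffer is the only bulky item here; the two 4×4 matrices add a few dozen bytes per frame, which is negligible next to the video itself.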
On the viewer’s end, this additional data allows the 2D video to be reprojected into a rendered 3D environment that matches the geometry of the game. And just like in-game spectator modes, the viewer can manipulate the environment to change where they’re looking or even where the camera is positioned. Want to see where the missile that took out the gamer you’re watching came from? You could potentially look around the same 3D environment they’re in and see for yourself.
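The core of that reconstruction is standard graphics math: each pixel plus its depth value is "unprojected" back through the inverse projection matrix into camera space, then carried into world space by the camera pose. Here is a minimal NumPy sketch, assuming OpenGL-style conventions (depth in [0, 1], NDC axes in [-1, 1]); real engines differ in depth range and axis orientation, so treat this as illustrative rather than the paper's actual pipeline.

```python
import numpy as np

def unproject_depth(depth, proj, cam_to_world):
    """Turn a depth buffer into a world-space point cloud.

    depth        : (H, W) depth buffer values in [0, 1] (assumed)
    proj         : (4, 4) projection matrix from the stream metadata
    cam_to_world : (4, 4) camera pose (camera-to-world transform)
    Returns an (H*W, 3) array of world-space points.
    """
    h, w = depth.shape
    # Pixel centers -> normalized device coordinates in [-1, 1].
    u = (np.arange(w) + 0.5) / w * 2.0 - 1.0
    v = 1.0 - (np.arange(h) + 0.5) / h * 2.0  # flip: image y grows downward
    uu, vv = np.meshgrid(u, v)
    zz = depth * 2.0 - 1.0                    # depth [0, 1] -> NDC z [-1, 1]

    # Homogeneous clip-space coordinates, one row per pixel.
    clip = np.stack([uu, vv, zz, np.ones_like(zz)], axis=-1).reshape(-1, 4)

    # Undo the projection, then the perspective divide.
    cam = clip @ np.linalg.inv(proj).T
    cam = cam[:, :3] / cam[:, 3:4]

    # Move from camera space into the shared world space.
    cam_h = np.hstack([cam, np.ones((cam.shape[0], 1))])
    world = cam_h @ cam_to_world.T
    return world[:, :3]
```

Once the viewer's device has this point cloud (or a mesh built from it), rendering it from any new camera position is exactly the kind of lightweight 3D workload the article notes even phones and tablets can handle.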
The only drawback is that parts of the recreated 3D environment lack texture and graphical information, leaving them looking mostly monochromatic and bland. For the time being, that may make this remote experience feel less compelling. But as the research progresses, we may eventually see ways to resolve this, particularly as interactive game streaming seems to be the way the industry is moving in general.