
Tech Tricks That Make Computer Games Look Real

By Chris Baraniuk, business technology reporter

Image caption: Nvidia combines ray tracing technology with AI

The police officer approaches an abandoned industrial building scrawled with graffiti. His body cam, shaking as he steps forward, captures the scene.

The sky is cloudy. Weeds push through the cracks in the pavement outside. A dog is barking somewhere in the distance, but no one seems to be around.

Inside the gloomy, rubble-strewn interior, it becomes clear that people are hiding here. People who would kill him.

“This is the only game I’ve ever seen that really messes my brain into thinking it’s real,” reads the top comment below a YouTube gameplay video of Unrecord, an upcoming title from independent French studio Drama.

The video racked up millions of views in just a few weeks and caused a sensation in the gaming industry. Some commenters on social media questioned whether it was really a game, and if so, whether it was actually too real, too raw an experience.

Drama declined an interview with the BBC, saying: “We are currently busy with investors and publishers.”

But graphics across many games have become noticeably more sophisticated, edging towards what is known as “photorealism”: imagery indistinguishable from real-world photos or video.

Unrecord’s demo looks so realistic in part thanks to some clever techniques, says Piers Harding-Rolls, head of gaming research at Ampere Analysis.

Mr. Harding-Rolls points to the shaky camera, which mimics real body-cam footage. Dim lighting, grime, and the audible bustle of a city in the background also help.

But could it make some people uncomfortable?

“That setup is quite reminiscent of some of the more gruesome images you get from real life,” notes Mr. Harding-Rolls.

In a statement posted on Twitter, Drama said that the game is not inspired by specific real-life events.

Look closely at stills from the video, though, and you might spot objects and textures that don’t look realistic at all. That may not matter to players, but it undermines the idea that the game is truly photorealistic.

Mr. Harding-Rolls notes that, in general, advancements in graphics are important to the games industry: “Consumers definitely want that. They love to look at things and think, ‘Wow, this looks amazing.'”

Image source: Rachel McDonnell

Image caption: How a game character moves is more important than how they look, says Rachel McDonnell

Rachel McDonnell, a lecturer in creative technologies at Trinity College Dublin, agrees that the Unrecord video is impressive, though she notes that some of the character animations are a little clunky.

They remind her of characters in other games who fall and die in pre-programmed sequences.

“Animation hasn’t caught up with rendering in games yet,” she says, adding that crowds are particularly difficult to make realistic.

“You’ll still see them behaving very strangely, running around in circles and getting stuck, which instantly takes you away from your presence in the game.”

Image source: monster emporium

Image caption: Unity spawned two million individual strands of fur to create this lion and cub

Marc Whitten, president of Create Solutions at game software company Unity, notes that today’s most realistic content relies on highly detailed 3D modeling of objects.

Last year, Unity showed off a computer-generated clip of a lion and cub with two million individually rendered strands of fur.

“If you don’t do that, it doesn’t look like photorealism,” Mr. Whitten argues. The firm has also developed highly realistic human models, whose subtle facial expressions can be controlled like digital puppets.
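Unity has not published the details of its digital-human pipeline here, but one widely used way to drive facial expressions on a character like this is "blend shapes": each expression is stored as per-vertex offsets from a neutral face, and an animator or motion-capture rig dials in weights to mix them. The sketch below is a generic illustration of that idea, not Unity's implementation.

```python
# Generic blend-shape sketch (not Unity's specific pipeline): each expression
# is a set of per-vertex offsets from a neutral face, mixed by weights.
import numpy as np

neutral_face = np.array([[0.0, 0.0, 0.0],      # a 3-vertex "face" for illustration
                         [1.0, 0.0, 0.0],
                         [0.5, 1.0, 0.0]])

blend_shapes = {
    "smile":      np.array([[0.0, 0.1, 0.0], [0.0, 0.1, 0.0], [0.0, 0.0, 0.0]]),
    "brow_raise": np.array([[0.0, 0.0, 0.0], [0.0, 0.0, 0.0], [0.0, 0.2, 0.0]]),
}

def pose_face(weights):
    """Blend the neutral mesh with weighted expression offsets."""
    mesh = neutral_face.copy()
    for name, w in weights.items():
        mesh += w * blend_shapes[name]
    return mesh

# 70% smile, 30% raised brows
print(pose_face({"smile": 0.7, "brow_raise": 0.3}))
```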

There is still room for improvement, he adds. There are many other difficult-to-simulate materials, such as clothing, that are still a long way from looking photorealistic in games.

A major emerging technology for game graphics is Neural Radiance Fields, or NeRFs. California-based Luma AI specializes in this and says it already has customers using the technology to make games.

A NeRF is an artificial intelligence (AI) system that can represent objects or landscapes captured in real-world photographs or video footage.

“When you show it these images from different sides, the network learns how light bounces off everything,” explains Luma AI co-founder Amit Jain. “It measures light and learns from light.”

The way light reflects off a leather motorcycle seat compared to a headlight, for example, is completely different, and simulating that in a game is very challenging. NeRFs could help automate the process.
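As a rough illustration of the idea (not Luma AI's actual system), a NeRF-style model is a small neural network that maps a 3D point and a viewing direction to a colour and a density; a frame is produced by accumulating those values along each camera ray, and training fits the network to real photographs. The toy Python sketch below uses random, untrained weights purely to show the query-and-render loop.

```python
# Minimal sketch of the NeRF idea: a small network maps (point, view direction)
# to (colour, density), and a ray is rendered by accumulating those samples.
# Weights here are random stand-ins; a real NeRF is trained on photographs.
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.1, size=(6, 64))   # toy weights standing in for a trained model
W2 = rng.normal(scale=0.1, size=(64, 4))   # outputs: RGB colour + density

def query_field(point, view_dir):
    """Return (colour, density) for one 3D point seen from one direction."""
    x = np.concatenate([point, view_dir])
    h = np.tanh(x @ W1)
    out = h @ W2
    colour = 1 / (1 + np.exp(-out[:3]))     # squash to [0, 1]
    density = np.log1p(np.exp(out[3]))      # non-negative density
    return colour, density

def render_ray(origin, direction, near=0.0, far=4.0, steps=64):
    """Volume-render one ray: blend sample colours front to back,
    weighted by how much light each slice absorbs."""
    ts = np.linspace(near, far, steps)
    dt = (far - near) / steps
    colour, transmittance = np.zeros(3), 1.0
    for t in ts:
        c, sigma = query_field(origin + t * direction, direction)
        alpha = 1 - np.exp(-sigma * dt)      # opacity of this slice
        colour += transmittance * alpha * c
        transmittance *= (1 - alpha)
    return colour

print(render_ray(np.zeros(3), np.array([0.0, 0.0, 1.0])))
```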

Some of today’s best game graphics use what’s known as ray tracing: precise simulations of the way light bounces off surfaces or creates glowing effects around neon signs, etc.
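At its core, ray tracing means firing rays from the camera into the scene, finding what they hit, and bouncing them to gather reflections and lighting. The minimal sketch below traces a single ray against one sphere and computes its mirror reflection; real engines do this for millions of rays per frame against full scene geometry.

```python
# Minimal ray-tracing sketch: intersect one ray with a sphere and bounce it
# to find the mirror-reflection direction. This shows only the core step.
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def hit_sphere(origin, direction, centre, radius):
    """Return distance along a unit-length ray to the sphere, or None if it misses."""
    oc = origin - centre
    b = 2 * np.dot(direction, oc)
    c = np.dot(oc, oc) - radius ** 2
    disc = b * b - 4 * c
    if disc < 0:
        return None
    t = (-b - np.sqrt(disc)) / 2
    return t if t > 0 else None

origin = np.array([0.0, 0.0, -3.0])
direction = normalize(np.array([0.1, 0.0, 1.0]))
t = hit_sphere(origin, direction, centre=np.zeros(3), radius=1.0)
if t is not None:
    hit = origin + t * direction
    normal = normalize(hit)                                   # sphere normal at the hit point
    reflected = direction - 2 * np.dot(direction, normal) * normal
    print("hit at", hit, "reflection bounces toward", reflected)
```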

AI is making it possible to produce these effects in games despite modest chip performance improvements, says Bryan Catanzaro, Nvidia’s vice president of applied deep learning research.

“We have to be smarter in the way we build the world and how we represent it,” he explains.

A new mode for the action-adventure game Cyberpunk 2077, called Ray Tracing: Overdrive, demonstrates what a difference this can make.

Nvidia says its Deep Learning Super Sampling (DLSS) technology allows developers to create high-resolution, high-frame-rate ray-traced graphics with the help of AI.

“The model is trained to know how things are in the real world,” explains Mr. Catanzaro.
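DLSS itself is proprietary, but the general recipe it represents can be sketched: the game renders a cheap, low-resolution frame and a trained network reconstructs a higher-resolution one (the real system also uses motion vectors and previous frames, which this sketch omits). In the illustrative Python below, a fixed 3x3 filter stands in for the learned weights and a sub-pixel rearrangement doubles the resolution.

```python
# Conceptual sketch of AI upscaling in the spirit of DLSS (not its actual
# internals): predict extra sub-pixels from a low-resolution frame, then
# rearrange them into a frame twice the width and height.
import numpy as np

def conv3x3(image, kernel):
    """Apply one 3x3 filter to a single-channel image (zero-padded)."""
    padded = np.pad(image, 1)
    out = np.zeros_like(image)
    for dy in range(3):
        for dx in range(3):
            out += kernel[dy, dx] * padded[dy:dy + image.shape[0], dx:dx + image.shape[1]]
    return out

def upscale_2x(low_res, kernels):
    """Predict 4 sub-pixels per input pixel and interleave them
    (sub-pixel / pixel-shuffle upscaling)."""
    h, w = low_res.shape
    high_res = np.zeros((h * 2, w * 2))
    for i, kernel in enumerate(kernels):           # one kernel per sub-pixel position
        sub = conv3x3(low_res, kernel)
        high_res[i // 2::2, i % 2::2] = sub
    return high_res

low_res = np.random.default_rng(1).random((4, 4))      # stand-in for a rendered frame
kernels = [np.full((3, 3), 1 / 9) for _ in range(4)]   # stand-in for learned weights
print(upscale_2x(low_res, kernels).shape)               # (8, 8)
```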

Games are sometimes becoming hard to tell apart from real life, says Nick Penwarden, vice president of engineering at Epic Games.

However, he says that it is still very difficult to render certain materials convincingly, such as an iridescent layer of oil in a puddle of water.

“Those are aspects that we don’t yet have the power to simulate in real time,” he says.

And doing it in real time on game consoles or home PCs is what matters. For CGI films, studios can use huge computers and take many minutes or longer to render each frame.

The most popular games of the future may not need to be photorealistic. Consider Minecraft or Epic’s Fortnite, both hugely successful, both far from photorealistic.

However, improved lighting effects and material simulations help artists working on all kinds of games, Mr. Penwarden argues. They can also give stylized or cartoony environments more depth and complexity.

“One of the great benefits of having the ability to make photorealistic images is that the technology can start to do a lot of the work for you,” he says.