Do you like games? Do you like jargon? Well, then you’ll love ambient occlusion, morphological anti-aliasing, adaptive vertical synchronization, and real-time ray tracing.
All of the above were at one time the latest opaque term for some complicated technology hyped as the next leap in gaming graphics. Except now, the last one might actually be truly revolutionary. Ray tracing achieved buzzword status at this week’s Electronic Entertainment Expo. Game announcements made by Microsoft, Nvidia, and AMD at the big gaming show were peppered with promises that their upcoming releases will bring this miraculous technology into our homes.
“I think it’s paradigm shifting,” says AJ Christensen, a visualization programmer at the National Center for Supercomputing Applications. “There’s a lot of stuff that we’ve been waiting to be able to do. The imagination far precedes the technology, and I think a lot of people are excited and waiting for it.”
So what makes ray tracing so potentially, ahem, game changing? Let’s break it down.
What is ray tracing?
Simply put, ray tracing is a technique that makes light in videogames behave like it does in real life. It works by simulating actual light rays, using an algorithm to trace the path that a beam of light would take in the physical world. Using this technique, game designers can make virtual rays of light appear to bounce off objects, cast realistic shadows, and create lifelike reflections.
First conceptualized in 1969, ray tracing technology has been used for years to simulate realistic lighting and shadows in the film industry. But even today, the technology requires considerable computing power.
“A game needs to run 60 frames per second, or 120 frames per second, so it needs to compute each frame in 16 milliseconds,” says Tony Tamasi, vice president of technical marketing at graphics card developer Nvidia. “Whereas a typical film frame is pre-rendered, and they can take eight or 12 or 24 hours to render a single frame.”
This newfound excitement around ray tracing comes just as home gaming hardware is on the cusp of being able to process lighting effects in real time. The graphics chips that will go into the next generation of gaming PCs and videogame consoles should have the rendering power to produce ray-traced scenes on the fly. When that happens, it could result in a tectonic shift for visuals in gaming.
How is it different from what we’ve seen before?
If you look at the way light works in videogames now, it might seem like all the elements are there: reflections, shadows, bloom, lens flare. But all that is just sophisticated trickery. Programmers can pre-render light effects (even with some ray tracing), but these are baked into the scene, essentially just packaged animations that always play out the same way. These effects can look quite convincing, but they’re not dynamic.
“The problem with that is that it’s completely static,” Tamasi says. “Unless you render in real time, the lighting is just going to be wrong.”
If the player alters the environment by, for example, blasting a hole through a wall, the light in the scene won’t change to stream through that hole unless the developers have specifically planned for that possibility. With real-time ray tracing, the light would adjust automatically.
How does ray tracing work?
In real life, light comes to you. Waves made up of countless little photons shoot out of a light source, bounce across and through a variety of surfaces, then smack you right in the eyeballs. Your brain then interprets all these different rays of light as one complete picture.
Ray tracing functions nearly the same way, except that everything generally moves in the opposite direction. Inside the software, ray-traced light begins at the viewer (from the camera lens, essentially) and moves outward, plotting a path that bounces across multiple objects, sometimes even taking on their color and reflective properties, until the software determines the appropriate light source(s) that would affect that particular ray. This technique of simulating vision backward is far more efficient for a computer to handle than trying to trace the rays from the light source. After all, the only light paths that need to be rendered are the ones that fit into the user’s field of view. It takes far less computing power to display what’s in front of you than it would to render the rays emitted from all sources of light in a scene.
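The backward-tracing idea described above can be sketched in a few lines of Python. This is a bare-bones illustration, not how a production renderer is structured: for each pixel, a single ray is shot from the camera into a toy scene (one sphere and one directional light, with all scene values invented for this example), and brightness is computed only if the ray actually hits something.

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Distance along a normalized ray to the nearest sphere hit, or None on a miss."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c  # quadratic discriminant; a == 1 for a unit-length direction
    if disc < 0.0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

def trace_pixel(px, py, width, height):
    """Follow one ray backward from the camera through pixel (px, py)."""
    # Toy scene (invented values): camera at the origin looking down -z,
    # one sphere, one directional light.
    sphere_center, sphere_radius = (0.0, 0.0, -3.0), 1.0
    light_dir = (0.577, 0.577, 0.577)  # normalized direction toward the light
    # Map the pixel onto an image plane one unit in front of the camera.
    x = 2.0 * (px + 0.5) / width - 1.0
    y = 1.0 - 2.0 * (py + 0.5) / height
    norm = math.sqrt(x * x + y * y + 1.0)
    direction = (x / norm, y / norm, -1.0 / norm)
    t = ray_sphere_hit((0.0, 0.0, 0.0), direction, sphere_center, sphere_radius)
    if t is None:
        return 0.1  # background brightness: nothing along this ray
    hit = tuple(t * d for d in direction)
    normal = tuple((h - c) / sphere_radius for h, c in zip(hit, sphere_center))
    # Lambertian shading: surfaces angled toward the light come out brighter.
    return max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
```

Note what the sketch leaves out: a real renderer would keep bouncing each ray to gather reflections, shadows, and refraction, which is exactly where the computational cost explodes. Stopping at the first hit, as here, is the cheapest possible case.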
Still, that’s not to say it’s easy. “Thousands of billions of photons enter your eye every second,” says the NCSA’s Christensen. “That’s way more than the number of calculations a computer can do per second … so there’s a lot of optimizing and efficiency and hacking that needs to happen in order to even begin to make something look realistic.”
Rather than try to map out every single ray of light, the solution for developers at Nvidia is to trace only a select number of the most important rays, then use machine learning algorithms to fill in the gaps and smooth everything out. It’s a process called “denoising.”
“Rather than shooting hundreds or thousands of rays per pixel, we’ll actually shoot a few or maybe a few dozen,” Tamasi says. “So we use different classes of denoisers to assemble the final image.”
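A toy version of that sparse-sampling-plus-denoising loop can be written in Python. To be clear about the assumptions: the noise model and the 3x3 box blur below are simple stand-ins chosen for illustration; the denoisers Tamasi describes are learned, far more sophisticated filters.

```python
import random

def noisy_sample(x, y):
    """Stand-in for a single-ray brightness estimate: true value plus random noise."""
    true_brightness = 0.5  # pretend the true shading here is flat
    return true_brightness + random.uniform(-0.2, 0.2)

def render_sparse(width, height, rays_per_pixel=4):
    """Shoot only a handful of rays per pixel and average them (a Monte Carlo estimate)."""
    return [[sum(noisy_sample(x, y) for _ in range(rays_per_pixel)) / rays_per_pixel
             for x in range(width)]
            for y in range(height)]

def denoise(image):
    """Toy denoiser: a 3x3 box blur that smooths out the sampling noise."""
    h, w = len(image), len(image[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            neighborhood = [image[ny][nx]
                            for ny in range(max(0, y - 1), min(h, y + 2))
                            for nx in range(max(0, x - 1), min(w, x + 2))]
            out[y][x] = sum(neighborhood) / len(neighborhood)
    return out
```

The design trade-off is the one Tamasi describes: a few rays per pixel produce a grainy image fast, and the denoiser buys back visual smoothness at a fraction of the cost of tracing thousands of rays.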
When is it coming?
Real-time ray tracing is already here, kind of. If you have a PC that can handle it, it’s available in a few current games such as Battlefield V, Metro Exodus, and Shadow of the Tomb Raider, as well as upcoming titles like Cyberpunk 2077 and Wolfenstein: Youngblood.
Nvidia introduced ray-tracing capabilities last year, with the release of its RTX graphics card line. So your PC would need one of those to properly take advantage of the technology. Current consoles, like the Xbox One and PlayStation 4, don’t have the hardware to pull it off.
For those of us unwilling or unable to cough up between $350 and $1,500 for a graphics card, there’s still hope: ray tracing will also be supported by the next generation of game consoles, specifically the PlayStation 5 and Microsoft’s mysteriously named Xbox One successor, Project Scarlett.
The potential might be exciting, but it will still be a few years before the tech becomes standard. Real-time ray tracing is still in its adolescence and has proven to be a little temperamental. And as the hardware improves, developers and designers will have to keep up.
“It’s a new tool in the toolbox,” Tamasi says. “We have to learn to use that new tool properly. There’s going to be a whole new class of techniques that people develop.”