So, ray-tracing. It’s really happening, isn’t it? Though there’s been a whole lot of bemoaning at almost every level, ray-traced graphics are slowly but surely becoming the norm. In fact, I’d go so far as to say that rasterization is about to fall out of favor for real.
The equation is really simple, actually: it’s been over half a decade since the first mainstream ray-tracing-ready GPUs entered the market, and every single modern mainstream graphics processor supports the technology by default. Not every one of them can run ray-traced games at a reasonable frame rate, no, but even the humble Steam Deck can technically trace rays. It’s always sad to see old hardware, like the revered 1080 Ti, left behind in favor of the cutting-edge, but this really has been a long time coming. The question is, what does ray-tracing mean for the average gamer and the average game developer? Let’s start from the beginning.
The difference between ray-tracing and rasterization
First things first, then. Though this has been explained time and again, it’s worth going over the facts once more, just so we’re all on the same page. Ray-tracing is effectively an attempt at simplifying and adapting the way real-world lighting works in the context of computer-generated graphics. In the real world, objects are illuminated by beams of light coming from various sources, including (but obviously not limited to) the sun, the screen you’re reading this on, and the unceasing blinking power LED of the gadget sitting closest to you. All of these sources cast rays of light (i.e. streams of photons) across all the other objects close to you, and these same rays then bounce off of the nearest materials to actually make them visible to you, a presumable human being. At a very basic level, ray-tracing and its bigger brother, path-tracing, attempt to emulate these properties for real-time lighting, reflections, and other related hijinks.
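To make the above a little more concrete, here’s a minimal sketch of the core ray-tracing step: firing a single ray from a camera, testing whether it hits a sphere, and shading the hit point by its angle to a light source. This is an illustrative toy, not how any real engine is written, and every scene value in it is a made-up number:

```python
# Toy ray-tracing step: one ray, one sphere, one light. Illustrative only.
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance along the ray to the nearest sphere hit, or None."""
    # Solve |origin + t*direction - center|^2 = radius^2 for t (a quadratic).
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t if t > 0 else None

def shade(hit_point, center, light_pos):
    """Lambertian shading: brightness falls off with the angle to the light."""
    normal = [p - c for p, c in zip(hit_point, center)]
    to_light = [l - p for l, p in zip(light_pos, hit_point)]
    n_len = math.sqrt(sum(n * n for n in normal))
    l_len = math.sqrt(sum(l * l for l in to_light))
    dot = sum(n * l for n, l in zip(normal, to_light)) / (n_len * l_len)
    return max(0.0, dot)

# One ray, fired straight down the z-axis, toward a sphere 5 units away.
origin, direction = (0, 0, 0), (0, 0, 1)
center, radius = (0, 0, 5), 1.0
t = ray_sphere_hit(origin, direction, center, radius)
hit = [o + t * d for o, d in zip(origin, direction)]
brightness = shade(hit, center, light_pos=(0, 5, 0))
```

A real renderer repeats this for millions of rays per frame and lets them bounce recursively, which is exactly why the technique is so expensive.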
Rasterized rendering, on the other hand, leverages fixed polygonal geometry to generate objects in front of your video game viewpoint. Each vertex has a bunch of information associated with it, which tells the game engine how to transform the displayed three-dimensional model into something a two-dimensional display can render. Proper lighting is achieved through the use of shaders and the manual placement of lights, which are sometimes baked in, and other times not.
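The heart of that 3D-to-2D transformation is the perspective divide: each vertex gets squeezed toward the center of the screen in proportion to its distance from the camera. A bare-bones sketch, with an arbitrary focal length and screen size chosen purely for illustration:

```python
# Toy rasterization step: project camera-space vertices to pixel coordinates.
def project_vertex(x, y, z, focal_length=1.0, width=640, height=480):
    """Map a camera-space vertex (z > 0, camera looking down +z) to pixels."""
    # Perspective divide: points farther away land closer to screen center.
    ndc_x = (x * focal_length) / z
    ndc_y = (y * focal_length) / z
    # Map from [-1, 1] normalized coordinates to pixel coordinates.
    px = int((ndc_x + 1.0) * 0.5 * width)
    py = int((1.0 - ndc_y) * 0.5 * height)  # flip y: screen origin is top-left
    return px, py

# One triangle, all three vertices sitting 2 units in front of the camera.
triangle = [(-0.5, -0.5, 2.0), (0.5, -0.5, 2.0), (0.0, 0.5, 2.0)]
screen_triangle = [project_vertex(*v) for v in triangle]
```

Once the triangle is flattened to the screen like this, the GPU fills in the pixels it covers and runs shaders on them; any sense of light in the scene has to be supplied by those shaders and hand-placed light sources, which is the key contrast with ray-tracing above.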
This is an extremely simplified and truncated description of both rendering techniques, mind. The crucial differences between the two are these:
- Ray-tracing is programmatic, real-time, and highly precise, which makes it computationally expensive.
- Rasterization is manually curated and mostly set in stone, which makes it computationally cheap and quick to render.
Now, the stage is set. What does this all mean from a layperson’s perspective?
The real problem with ray-tracing is Nvidia
It’s all in the cheat sheet I laid out above, actually: ray-tracing is expensive to run, and many still consider it to be a wacky, unnecessary gimmick. True enough, that very well might’ve been the case back in 2018, when we were first coming to terms with the humble RTX 2000 graphics cards. Nowadays, though, I genuinely believe that rasterization is on its way out.
Whereas we once might’ve had only a single ray-tracing exclusive, the fact of the matter is that virtually every modern AAA release comes with some manner of ray-tracing by default. In fact, many of 2025’s flagship releases, such as DOOM: The Dark Ages, have no rasterized lighting fallback whatsoever. This means that if your GPU can’t run ray-tracing, it won’t play these games at all.
Personally, I am quite torn on the matter of ray-tracing. While I’ve got plenty of problems with Nvidia’s approach to GPU development, the state of things is such that AMD is having a really hard time keeping up with its competitor. Heck, whereas AMD’s new-gen graphics cards are only targeting the mid-range market, Intel is firmly in the low-end sector with some serious caveats. This means Nvidia is the sole proprietor of the cutting-edge, for better or for worse. This, I think, is the problem most people actually have with ray-tracing.
If you want to play games with ray-tracing enabled and at reasonably high settings, AMD Radeon sadly doesn’t have all that much to offer, and Nvidia is prohibitively expensive nowadays. Long gone are the days when we could get a top-of-the-line GPU for under $500, after all. When all the fancy new games leverage fancy new tech that is effectively locked behind expensive GPUs, it’s a given that people might have knee-jerk reactions to the tech itself.
Ray-tracing is the future, whether we like it or not
As I explained before, ray-tracing is computationally expensive but makes the developers’ jobs much, much easier (as they don’t have to manually fake lighting via rasterization), and leads to a more realistic and dynamic scene across the board. With AMD at least a generation and possibly more behind Nvidia, however, there’s a discrepancy between the excitement around this fancy new rendering tech and its availability. Running ray-tracing at the settings where it makes a difference necessitates fast hardware, which most people simply do not have.
When that’s the case, it’s very easy to just dismiss ray-tracing as a gimmick that doesn’t even look that much different without considering its importance in the grand scheme of things. Ray-tracing is, on paper, a lighting system that is much closer to how photons work in the real world, and this makes compelling video game lighting and material-work more feasible.
The battlefield for cutting-edge visuals has simply moved beyond mere texture resolutions and effect density. Lighting is key, and rasterization is an outdated and impractical solution compared to real-time ray-tracing. The turning point for me was seeing proper, realistic indirect lighting show up wherever I looked in Cyberpunk 2077. Running it at sub-30 FPS on an RTX 3080 sure wasn’t fun, but things have changed a great deal since.
Things would’ve been different had AMD embraced ray-tracing from the get-go, I’m sure. As it currently stands, with Nvidia helming the tech, there’s a non-insignificant number of people eager to go against the grain and complain about ray-tracing as a gimmick. Except, it really isn’t a gimmick. Instead, it’s the next step forward for real-time graphics, whether we like it or not.
The next generation of consoles is bound to have reasonable ray-tracing capabilities, and I feel we’re mere years away from a situation where new AAAs outright do not support rasterized lighting at all. Rasterization will stick around virtually forever, of course, as the purview of smaller-scale productions and indies, but it seems entirely obvious to me that ray-tracing has firmly entrenched itself as the lighting tech of the future. And, from the look of things, the future is now.
Published: Jan 31, 2025 10:31 am