Trust us, we’re Nvidia: GeForce RTX 20-series GPU preview

Nvidia’s ridiculously expensive RTX graphics cards have arrived full of promise, but with very little to actually show for themselves.

I’m currently reviewing a GeForce RTX 2080 Ti graphics processing unit. I’m not really ready to present my findings because, well, to be honest, I’ve not really got anything for you.

But, before I go on, I really do believe that the direction Nvidia is taking with its new RTX 20-series graphics cards is going to be a game-changer. I just can’t prove it to you right now.

The GeForce RTX 20-series GPUs have the same CUDA Cores that we had in the GTX 10-series, but they also feature Nvidia’s new Tensor and RT Cores. Unfortunately, there is no way to test, benchmark or even engage the Tensor and RT Cores. Apparently, support is coming for the likes of Shadow of the Tomb Raider and Battlefield V, but not yet.

Right now, if you put down the best part of NZ$2,500 for a GeForce RTX 2080 Ti, all you are looking at is roughly a 30% performance increase over a GTX 1080 Ti. You are paying over a grand for that 30% extra.

But.

If you factor in what’s coming, it’s a different picture.

You see, the RTX 20-series, with its Tensor and RT Cores, is going to turn game development on its head. You may have read about the RTX 20s’ real-time ray-tracing capabilities (using the RT Cores) and the AI capabilities that the Tensor Cores bring to the table, and you may have seen the videos. It all comes across as marketing fluff.

So, when I caught up with Brian Burke, Nvidia’s gaming tech PR guy, at PAX AUS in Melbourne, I didn’t hold back. I asked him why Kiwis should part with such a huge amount of money for something that, right now, doesn’t do a lot.

I’m a massive advocate for Nvidia. Their technology has been driving the advancement of video game graphics since the demise of 3DFX. No disrespect to AMD/ATi, but they’ve been playing catch-up with Nvidia for well over a decade. Because of this, I felt OK asking Nvidia where the RTX demos were. Why can’t new RTX users have something, anything, to show off the power of their new cards?

Brian’s response was muted, but I think Nvidia know that they may have jumped the gun launching the technology before developers had anything ready. If it was the other way around, though, it would have been the same problem, with gamers lamenting that they had the games but not the hardware to run them.

Whilst the average Joe is going to have to wait, Nvidia did show me the Star Wars RTX demo running in real time. I followed the HDMI cable from the TV to a PC running the demo with a GeForce RTX 2080 Ti installed, just to be sure.

The ray-tracing capabilities of the RTX 2080 Ti seem impressive. For developers, ray-tracing is the holy grail. Most of them have become experts at faking it, using tricks like screen-space reflections, but will likely jump on board with little persuasion.

All the games we play at the moment do a great job of simulating the way rays of light bounce off objects and enter our eyes. There are, however, compromises, and as gamers, we subconsciously tolerate them. Reflective surfaces and shadows are limited to what is visible on the screen, unless specifically scripted.
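To show why that compromise exists, here’s a little sketch of my own (in Python, and nothing like real engine code) of how a screen-space reflection works: the reflected ray is marched across the already-rendered depth buffer, and the moment it steps off the screen there is simply nothing left to reflect.

```python
# Toy screen-space reflection march. This is my own illustration, not any engine's code.
# depth_buffer[y][x] holds the depth of whatever was already rendered at that pixel.

def screen_space_reflection(depth_buffer, start, reflect_step, max_steps=64):
    """March a reflected ray across the screen one step at a time.
    Returns the pixel it hits, or None when the ray leaves the screen,
    which is exactly the compromise: anything off-screen cannot be reflected."""
    height, width = len(depth_buffer), len(depth_buffer[0])
    x, y, depth = start             # screen position (pixels) and depth of the surface
    dx, dy, d_depth = reflect_step  # direction of the reflected ray, in screen space

    for _ in range(max_steps):
        x, y, depth = x + dx, y + dy, depth + d_depth
        if not (0 <= x < width and 0 <= y < height):
            return None             # ray left the screen: no reflection data exists
        if depth >= depth_buffer[int(y)][int(x)]:
            return int(x), int(y)   # ray passed behind a rendered surface: reflect that pixel
    return None

# A flat 20x20 "scene" at depth 10, with a nearer object in the top-right corner.
scene = [[10.0] * 20 for _ in range(20)]
for y in range(5):
    for x in range(14, 20):
        scene[y][x] = 4.0
print(screen_space_reflection(scene, (10, 15, 10.0), (0.5, -1.0, -0.5)))
```

Run it and the toy ray finds the nearer object; point it the other way and it falls off the screen and the reflection simply vanishes, which is the cheat we have all learned to live with.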

With ray-tracing, virtual photons are fired from the scene’s light sources, bouncing off all the objects, and those hitting the camera, our virtual eye, create the image. Traditionally, this has been a complex task reserved for high-end animated movies, with each frame taking hours, or a supercomputing powerhouse, to create.
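As a rough illustration of how much work that involves, here’s a deliberately tiny ray tracer of my own: one sphere, one light, one ray per pixel, traced from the camera outwards (which is how renderers do it in practice) rather than from the lights. The point isn’t the picture, it’s the shape of the work: an intersection test and a shading calculation for every single ray, multiplied by every pixel, every bounce and every frame.

```python
# A deliberately tiny ray tracer sketch, written as my own toy example.
# One sphere, one light, one ray per pixel, printed as ASCII shading.
import math

def trace(origin, direction, centre, radius, light_dir):
    """Shade one ray: intersect it with a sphere and light the hit point, or return 0 for a miss."""
    oc = [o - s for o, s in zip(origin, centre)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius ** 2
    disc = b * b - 4.0 * c              # direction is normalised, so the quadratic's a == 1
    if disc < 0:
        return 0.0                      # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0    # distance to the nearest intersection
    hit = [o + t * d for o, d in zip(origin, direction)]
    normal = [(h - s) / radius for h, s in zip(hit, centre)]
    return max(0.0, sum(n * l for n, l in zip(normal, light_dir)))

WIDTH, HEIGHT = 72, 36
for row in range(HEIGHT):
    line = ""
    for col in range(WIDTH):
        # One primary ray per pixel; real scenes need many rays and many bounces per pixel.
        d = [(col - WIDTH / 2) / WIDTH, (row - HEIGHT / 2) / HEIGHT, 1.0]
        length = math.sqrt(sum(v * v for v in d))
        d = [v / length for v in d]
        shade = trace([0.0, 0.0, 0.0], d, [0.0, 0.0, 3.0], 1.0, [0.57, 0.57, -0.57])
        line += " .:-=+*#%@"[int(shade * 9)]
    print(line)
```

Even this toy does an intersection test per character it prints. The RT Cores exist to do that intersection testing in dedicated hardware, at a scale this little script could never touch.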

Nvidia’s RTX cards are going to have to do this at thirty frames per second at the very least, and hopefully 60fps, to give us decent real-time ray-tracing performance.
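To put a number on that, here’s a quick back-of-the-envelope budget. The assumptions are mine: a 1920x1080 output and a single ray per pixel, which is well below what a real ray-traced scene needs, so treat these as floor figures.

```python
# Back-of-the-envelope frame budget. 1080p and one ray per pixel are my own
# (generous) assumptions; real scenes fire many rays and bounces per pixel.
RAYS_PER_FRAME = 1920 * 1080   # one primary ray per pixel

for fps in (30, 60):
    budget_ms = 1000 / fps
    rays_per_second = RAYS_PER_FRAME * fps
    print(f"{fps} fps: {budget_ms:.1f} ms per frame, "
          f"at least {rays_per_second / 1e6:.0f} million rays per second")
```

That works out to roughly 33ms per frame and 62 million rays per second at 30fps, and half the time with double the rays at 60fps, before you add a single bounce or shadow ray.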

The real challenge, though, is that developers have become so good at simulating ray-tracing that it may be a while before we really notice the effects. The reflections in the Star Wars demo, and those demonstrated in the upcoming Battlefield V, really show off the power of ray-tracing and the realism it can achieve.

But not all games are going to be packed with polished, highly reflective surfaces. And if they are, we are so used to the cheats that it might be a while before we actually notice the difference.

At the Nvidia PAX AUS demo, I played a bit of 4A Games’ upcoming Metro Exodus. With the touch of a button I could switch between RTX and non-RTX visuals. Honestly, and this may disappoint Nvidia, there was not much in it. Yes, the RTX-enabled visuals were better, but not exactly NZ$2,500 worth of better.

Developers can choose how they integrate the RTX-exclusive elements into their games. Balls-to-the-wall, all-out ray-traced environments may have to wait. The likes of Shadow of the Tomb Raider will only exploit ray-traced shadows.

It’s the RTX’s DLSS (Deep Learning Super Sampling) technology, powered by those AI Tensor Cores, that is likely to make the biggest impact in the immediate future. DLSS works by Nvidia running the game through their supercomputers. As the game runs, the computer is taught what the anti-aliased (that’s the removal of the jagged edges) images should look like.

Once that is finished, the data is packaged up as an algorithm for the RTX Tensor Cores to use to adjust the game’s frames on the fly. The result is, apparently, anti-aliasing with little impact on performance. A game that switches this technology on is going to get an instant performance boost.
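To make the split between the offline learning and the on-the-fly part concrete, here’s a rough sketch of the idea as I understand it. Every name in it is my own stand-in, and the “reconstruction” is just a naive upscale so the snippet actually runs; it shows the shape of the pipeline, not Nvidia’s code.

```python
# Rough sketch of the DLSS idea only: render cheap, then let a pre-trained model
# rebuild the full-resolution frame. The function names and the naive upscale are
# my own stand-ins; this is not Nvidia's API or algorithm.

def render_low_res(width, height):
    """Stand-in for the game rendering a cheap, lower-resolution, aliased frame."""
    return [[(x * 7 + y * 13) % 256 for x in range(width)] for y in range(height)]

def tensor_core_reconstruct(frame, out_width, out_height):
    """Stand-in for the runtime step: the network trained offline on Nvidia's
    supercomputers turns the cheap frame into a full-resolution, anti-aliased one.
    Here it is only a nearest-neighbour upscale so the example runs."""
    in_h, in_w = len(frame), len(frame[0])
    return [[frame[y * in_h // out_height][x * in_w // out_width]
             for x in range(out_width)]
            for y in range(out_height)]

# The game renders at roughly two-thirds resolution, then the "model" fills in the
# rest; that gap is where the claimed performance headroom comes from.
cheap = render_low_res(1280, 720)
final = tensor_core_reconstruct(cheap, 1920, 1080)
print(f"{len(final[0])}x{len(final)} output frame from a {len(cheap[0])}x{len(cheap)} render")
```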

But, right now we have nothing using these RTX features. The mooted ray-tracing in Shadow of the Tomb Raider and Battlefield V has not yet arrived. And I fear the worst.

For starters, ray-tracing is a DirectX 12 feature. DICE’s Frostbite engine, which powers Battlefield V, is a dog when it comes to DX12 implementation with Nvidia cards. The likes of Battlefield 1 and even Madden NFL 19 suffer from stuttering in-game and during cut-scenes in DX12. Switching back to DX11 removes all the stuttering.

Nvidia’s RTX 20-series GPUs are going to be a game-changer, but not right now. At this moment in time you are paying twice the price of a GTX 1080 Ti for about 30% extra performance.
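Working the article’s own numbers through (the NZ$2,500 figure above, and “twice the price” implying roughly NZ$1,250 for a GTX 1080 Ti, which is my inference rather than a quoted retail price), the value picture today looks like this:

```python
# Value-for-money check using the figures above. The 1080 Ti price is inferred
# from "twice the price", not a quoted retail figure.
rtx_2080ti_price, gtx_1080ti_price = 2500, 1250   # NZ$, per the article's framing
relative_performance = 1.30                        # roughly 30% faster than a 1080 Ti

cost_per_perf_rtx = rtx_2080ti_price / relative_performance
cost_per_perf_gtx = gtx_1080ti_price / 1.0
print(f"Cost per unit of performance: NZ${cost_per_perf_rtx:.0f} (RTX 2080 Ti) "
      f"vs NZ${cost_per_perf_gtx:.0f} (GTX 1080 Ti), "
      f"roughly {cost_per_perf_rtx / cost_per_perf_gtx - 1:.0%} more today")
```

On today’s numbers you are paying somewhere in the region of 50% more per unit of performance, and banking on the RTX features arriving to close that gap.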

VR gamers are, nevertheless, going to enjoy that 30% extra performance, as it gives them a boost that they won’t even get from two 1080 Ti GPUs running together in SLI. VR games generally do not utilise two GPUs plugged together in a PC.

Nvidia’s RTX technology is very exciting, but, some two months since launch, there’s still nothing really justifying the huge cost of these RTX cards.

Look out for my RTX 2080 Ti and dual RTX 2080 Ti SLI review very soon.