[ H ]: BF5 Raytracing VRAM easily exceeds 6GB

BFG10K

Lifer
Aug 14, 2000
22,709
2,971
126
https://www.hardocp.com/article/201...nvidia_ray_tracing_rtx_2080_ti_performance/10

Even at 1080p, switching to DX12 by itself with no RTX doubles usage and easily blows past 6GB. What exactly was Jensen smoking when he presented the 2060 as "RTX"?

It also doesn't take much to exceed 8GB, the capacity of nVidia's #2 card:
[image: HardOCP BFV VRAM usage chart]
This is the perfect illustration of how the entire Turding line has just raised costs and is under-spec'd in every category as a result.

At every performance bracket we got a higher price with a VRAM reduction: 8 -> 6 (2060), 11 -> 8 (2080), and stagnant 11GB with a price hike (2080 Ti).

Ironically if you stick to DX11 and ignore the DX12/RTX garbage, even 4K fits on a 6GB card. LMAO.

The fraudulent Turding line is a giant basket of fail.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,971
126
But actual performance seems to be OK, according to independent testing, though RTX-intensive scenes cause drops below 60 FPS.
Based on what? Where are the BF5 frametime plots to confirm that statement?

From the article:
What is the result of bottlenecking VRAM in BFV? Stuttering, choppiness, pausing in gameplay, jerkiness, framerate skipping, noticeable periods of time going by as new art assets are loaded and swapped out of VRAM while playing. Moving from area to area in a map means the loading and re-loading of textures and assets, and there is lag when this happens, and it happens often when hitting that VRAM capacity cap. It can also cause inconsistent framerate and frametime, and wild swings in framerate as we have experienced. It does not make for a good gaming experience.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
Just to get past BFG10K's vitriol: the problem is not ray tracing, it's DX12, in particular DICE's renderer. It's very memory-inefficient compared to DX11 (and a lot slower too).

The problem here is that the change to low-level APIs forces devs to essentially write what the GPU driver writers used to write. They just aren't as good at it, so we end up with a DX12 renderer that's significantly slower and uses a lot more memory. The worrying thing is that if anyone was going to make this work it would be DICE, as they originally pushed for the change (along with AMD) and have been working on low-level GPU APIs since Mantle first appeared.

Anyway, the blame for this is squarely on DICE. They asked for low-level APIs, they got them, and now they can't make them work well.
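For the curious, here's a minimal C++ sketch (my own illustration, not DICE's code) of the kind of bookkeeping DX12 pushes onto the engine: under DX11 the driver decided where resources lived and when they got paged out of VRAM, while under DX12 the renderer creates and evicts them explicitly, and a bad strategy overcommits memory.

```cpp
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Hypothetical helper: explicitly place a texture in VRAM. Under DX11 this
// decision (and the eviction policy) belonged to the driver.
ComPtr<ID3D12Resource> CreateTextureInVram(ID3D12Device* device,
                                           const D3D12_RESOURCE_DESC& desc)
{
    D3D12_HEAP_PROPERTIES heapProps = {};
    heapProps.Type = D3D12_HEAP_TYPE_DEFAULT; // GPU-local memory (VRAM)

    ComPtr<ID3D12Resource> texture;
    device->CreateCommittedResource(
        &heapProps, D3D12_HEAP_FLAG_NONE, &desc,
        D3D12_RESOURCE_STATE_COMMON, nullptr, IID_PPV_ARGS(&texture));
    return texture;
}

// The engine owns residency now too: if it never evicts assets it no
// longer needs, committed memory just keeps growing.
void DropFromVram(ID3D12Device* device, ID3D12Resource* texture)
{
    ID3D12Pageable* pageable = texture;
    device->Evict(1, &pageable);
}
```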
 

coercitiv

Diamond Member
Jan 24, 2014
6,198
11,891
136
Just to get past BFG10K's vitriol
the blame for this is squarely on DICE.
Definitely not a fan of @BFG10K's vitriol either, but we have to admit some of the points recently raised in his threads are burning huge holes through Nvidia's marketing slides. Keep in mind it was Nvidia who chose DICE / BF5 as the first showcase for ray tracing. If DICE can't make it work properly with full support from Nvidia, who will?!

Personally I think we're watching Nvidia pay a long overdue bill for not pushing DX12 alongside AMD. They had RT on the roadmap for years, they knew they would need a healthy DX12 ecosystem to make it shine, but they chose to watch others push for low-level APIs from the safety of their DX11-optimized hardware and software stack.

Let's blame the developers though, that will work wonders for sales.
 

tamz_msc

Diamond Member
Jan 5, 2017
3,795
3,626
136
Based on what? Where are the BF5 frametime plots to confirm that statement?
Gamers' Nexus did some testing:

https://www.gamersnexus.net/hwrevie...-founders-edition-review-benchmark-vs-vega-56

[image: GamersNexus RTX 2060 BFV benchmark chart]


Granted it's sub-60 fps, but still playable, especially at DXR = low.
We’ll give the 2060 credit where it’s due: It is able to play this game at 1080p and roughly 60FPS with ray-tracing enabled, although it does lose about 40% of its performance in doing so.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
Definitely not a fan of @BFG10K's vitriol either, but we have to admit some of the points recently raised in his threads are burning huge holes through Nvidia's marketing slides. Keep in mind it was Nvidia who chose DICE / BF5 as the first showcase for ray tracing. If DICE can't make it work properly with full support from Nvidia, who will?!

Personally I think we're watching Nvidia pay a long overdue bill for not pushing DX12 alongside AMD. They had RT on the roadmap for years, they knew they would need a healthy DX12 ecosystem to make it shine, but they chose to watch others push for low-level APIs from the safety of their DX11-optimized hardware and software stack.

Let's blame the developers though, that will work wonders for sales.
That's the point - Nvidia can't *fix* bad developer work like it could with DX11. With DX11 much of the work was done by Nvidia in their drivers; DX12, being lower level, requires more of the work to be done by the devs.

What should Nvidia do? They could produce some DX12 libraries specifically for their cards that do a much better job of it, but those would probably be terrible for AMD or Intel. I can't see that going down well around here.

Part of the problem is that different GPU architectures need different optimisations - what might work well for one probably won't be quite right for another. We saw this with Mantle and BF4, where the initial AMD cards it was released with worked better, but the 285, which was *very* similar, worked worse. Now imagine the differences between Intel/Nvidia/AMD all having to use the same DX12 renderer.
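To make that concrete, here's a hypothetical sketch (mine, not from any shipping engine) of what "one renderer, three architectures" tends to turn into: the engine sniffs the adapter's PCI vendor ID and branches to per-vendor strategies the DX11 driver used to pick for it. The strategy names and mapping are illustrative placeholders; only the vendor IDs are real.

```cpp
#include <dxgi.h>

// Illustrative placeholder strategies an engine might toggle per vendor
// (e.g. how it schedules uploads, barrier granularity, heap sizes).
enum class UploadStrategy { CopyQueueBatched, DirectQueueInline };

UploadStrategy PickUploadStrategy(IDXGIAdapter* adapter)
{
    DXGI_ADAPTER_DESC desc = {};
    adapter->GetDesc(&desc);

    // PCI vendor IDs are real; which strategy suits which vendor is a
    // made-up example of the tuning burden, not measured guidance.
    switch (desc.VendorId) {
        case 0x10DE: return UploadStrategy::CopyQueueBatched;  // NVIDIA
        case 0x1002: return UploadStrategy::DirectQueueInline; // AMD
        case 0x8086: return UploadStrategy::DirectQueueInline; // Intel
        default:     return UploadStrategy::DirectQueueInline;
    }
}
```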
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
We have learned MANY times that just because X game uses Y amount of VRAM doesn't mean it NEEDS Y amount of VRAM. Battlefield 1 could easily "use" 6GB of VRAM at 1080P, yet my 4GB card, at ultra settings, never had any hitching issues (once the game itself was patched up; there were issues at release, but not related to VRAM).

I have no doubt that DXR requires more VRAM, that's a given. But you can't use a 12GB card to see how much memory is *required* at 1080P.
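Worth noting how these "usage" numbers are produced: what overlays typically report is the process's *committed* video memory, which DXGI exposes alongside the OS-granted budget, and on a bigger card the same game will happily commit more. A minimal sketch (my own, assuming a D3D12-capable adapter at index 0):

```cpp
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    ComPtr<IDXGIAdapter1> adapter;
    factory->EnumAdapters1(0, &adapter);    // adapter 0: usually the dGPU

    ComPtr<IDXGIAdapter3> adapter3;
    adapter.As(&adapter3);

    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);

    // CurrentUsage: what the process has committed -- the number tools
    // show, which on a 12GB card can far exceed what a frame truly needs.
    // Budget: how much the OS will let this process commit right now.
    std::printf("usage  %llu MB\nbudget %llu MB\n",
                info.CurrentUsage / (1024ull * 1024),
                info.Budget / (1024ull * 1024));
}
```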
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,377
126
Gamers' Nexus did some testing:

https://www.gamersnexus.net/hwrevie...-founders-edition-review-benchmark-vs-vega-56

[image: GamersNexus RTX 2060 BFV benchmark chart]


Granted it's sub-60 fps, but still playable, especially at DXR = low.

Unless you're using a VRR display with a really wide effective refresh range, all three scenarios will be less than ideal. With a non-VRR 60Hz display, the top result (RTX off) would be the only way to avoid a pretty unpleasant jittery experience imho.

Of course, with Freesync being supported now, it's more likely that someone could afford or already has a monitor that would make the experience at least passable. So that's a real bonus.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
As anyone who has played DX12 BFV knows, they absolutely dropped the ball on the DX12 path, at least for Pascal, probably more. It runs horribly on my 1080 Ti with constant hitching, frame drops, and lower average FPS. Not sure what the issue is, because Star Wars Battlefront and Battlefield 4's low-level render paths ran great on my 290.
 

ozzy702

Golden Member
Nov 1, 2011
1,151
530
136
BFV is a horrible game with which to introduce ray tracing. DICE has NEVER had DX12 working properly, and DX11 has always provided superior results in frame times and fps.

I actually really dig BFV, but I also play it with DX11, 1440p 144Hz with low settings and all the nonsense turned off, which still looks pretty damned good.

I do look forward to ray tracing, but not on competitive multiplayer first-person shooters, and I think we're still several years out from ray tracing becoming usable and implemented properly.
 

Mopetar

Diamond Member
Jan 31, 2011
7,837
5,992
136
Has anyone actually written an engine for DX12 from the ground up yet? The only thing I can think of is maybe AotS, but I'm not even sure if that's true.

It seems like most DX12 implementations are just built on top of existing engines that weren't built with DX12 in mind, so it's little wonder that the performance is worse.

Also, I believe that DX12 works best in cases where games are more CPU-limited and the driver overhead eats up disproportionately more resources. If you're using the best high-end processor, it's not really playing to the strengths of DX12 either.
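That driver-overhead point is where DX12's design actually pays off: DX11 funneled draw submission through one immediate context (heavy, serialized driver work), while DX12 lets the engine record command lists on many threads and submit them cheaply. A rough sketch of the pattern (hypothetical helper names, mine; per-thread command allocators and the actual draw calls are omitted):

```cpp
#include <d3d12.h>
#include <thread>
#include <vector>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Each worker records its share of the frame's draws into its own command
// list -- work DX11 would have serialized inside the driver. Assumes each
// list was created open with its own allocator (one per thread).
void RecordChunk(ID3D12GraphicsCommandList* cl /*, scene chunk omitted */)
{
    // ... SetPipelineState / IASetVertexBuffers / DrawIndexedInstanced ...
    cl->Close();    // finish recording; the list is now submittable
}

void SubmitFrame(ID3D12CommandQueue* queue,
                 std::vector<ComPtr<ID3D12GraphicsCommandList>>& lists)
{
    std::vector<std::thread> workers;
    for (auto& cl : lists)
        workers.emplace_back(RecordChunk, cl.Get());
    for (auto& w : workers)
        w.join();

    // One cheap submission for everything recorded in parallel. If the CPU
    // was the bottleneck, this wins; on a fast CPU there's little to gain.
    std::vector<ID3D12CommandList*> raw;
    for (auto& cl : lists)
        raw.push_back(cl.Get());
    queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
}
```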
 

Adawy

Member
Sep 9, 2017
79
24
41
I read from a graphics designer here, I think it was @Krteq, that to this day we've yet to see a fully native DX12 game, and that its potential is similar to what we can see from Vulkan?

Correct me if I'm wrong.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Digital Foundry confirms the RTX 2060 is VRAM-bound in BF5 at 1080p w/ ray tracing enabled, sees 5.5GB+ memory usage on Ultra, and finds that reducing texture quality to High has a considerable effect on framerate.

<snip>

Good video. One of the biggest takeaways from it for me is that, if NV does it right, their Gaming Experience thing could "save" them. Leadbetter found a good combo of settings that gave him 60 FPS and "beautiful" visuals.

I'd rather tweak my settings personally, find my good spot, but I know my wife uses that damn Gaming Experience, and chances are (I've heard her complain less about settings) it's doing something that makes her happy. So /shrug.
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,377
126
Good video. One of the biggest takeaways from it for me is that, if NV does it right, their Gaming Experience thing could "save" them. Leadbetter found a good combo of settings that gave him 60 FPS and "beautiful" visuals.

I'd rather tweak my settings personally, find my good spot, but I know my wife uses that damn Gaming Experience, and chances are (I've heard her complain less about settings) it's doing something that makes her happy. So /shrug.

Man this is an underrated comment and observation. It is so common to see people dismiss cards like 1060 or 580, or even older or mid-range CPUs, because they're only looking at the most extreme tests usually shown by reviews.

Trying to play every game literally maxed to the gills is an exercise in futility at times. Things go so much better if you're flexible enough to look at a way to balance performance with visuals, and willing to back off a thing or two.

Take the 1060 and 580. At 2560x1440/60, I found them pretty good actually, so long as you are sane with AA and notch shadows down a bump or two. Then at G-Sync 35" 3440x1440, my ROG Strix 1080 Ti is the same. It could also do 4K/60 just fine with a couple of tweaks as well, though I found the G-Sync display hugely more satisfying for smoothness.

I always appreciate deeper looks than just purely MAX AA all ULTRA. Especially if there are great settings that allow smooth non-choppy gameplay.
 

Jaskalas

Lifer
Jun 23, 2004
33,438
7,503
136
Digital Foundry confirms the RTX 2060 is VRAM-bound in BF5 at 1080p w/ ray tracing enabled, sees 5.5GB+ memory usage on Ultra, and finds that reducing texture quality to High has a considerable effect on framerate.


That's a great video.

Seeing the RTX 2060 handle BF5 with RTX on rewrites the question. Does RTX work? Yes, yes it does.

As for the VRAM limit: changing the textures from Ultra to High does little to nothing at 1080p in terms of image quality. If that's what it takes, then the public shouldn't have much of a problem with it. The 6GB RTX 2060 works just fine in BF5.
 

mopardude87

Diamond Member
Oct 22, 2018
3,348
1,575
96
Going to be a hard pill to swallow, dropping textures from Ultra to High on a brand new RTX 2060 GPU just so a resource-hogging feature can run without a hitch!
 

Adawy

Member
Sep 9, 2017
79
24
41
Personally I think it doesn't matter: if you set the details at the consoles' level, which is a mix of Medium and High, you're unlikely to notice much, especially once you immerse yourself in the gameplay and story.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,971
126
Digital Foundry confirms the RTX 2060 is VRAM-bound in BF5 at 1080p w/ ray tracing enabled, sees 5.5GB+ memory usage on Ultra, and finds that reducing texture quality to High has a considerable effect on framerate.
Yep, good link that confirms the issue. From the article:

First of all, there's the stutter. Frame-time lurches to 80-100ms disrupted the flow of the game and it's something I've only ever seen before trying to run Battlefield 1 on a VRAM-constrained GTX 1050 2GB. Secondly, there was the lack of repeatability in benchmark runs - it seemed to be the case that the longer I ran the game, the more performance degraded, and the more I tweaked settings, the more likely I was to see a prolonged drop in frame-rate - which could be mitigated to a certain extent simply by restarting the game.
Some might try to ignore the graphs, but you can't ignore the in-game stutter they experienced, just like [ H ] did.

As for the VRAM limit: changing the textures from Ultra to High does little to nothing at 1080p in terms of image quality. If that's what it takes, then the public shouldn't have much of a problem with it. The 6GB RTX 2060 works just fine in BF5.
Personally I think it doesn't matter: if you set the details at the consoles' level, which is a mix of Medium and High, you're unlikely to notice much, especially once you immerse yourself in the gameplay and story.
So what we have so far is:
  • 4K doesn't matter.
  • Native resolution doesn't matter.
  • High refresh/FPS doesn't matter, all you need is 60FPS.
Now apparently texture quality doesn't matter either. Also detail levels don't matter, as long as they match consoles'.

Wow, nVidia's RTX has taught us sooooooo much about gaming! :rolleyes:

A $350 card shouldn't be 6GB, especially not when a cheaper 1070/1070 Ti has 8GB of VRAM. Like I said in the OP, Turding is an overpriced downgrade.
 

Adawy

Member
Sep 9, 2017
79
24
41
Yep, good link that confirms the issue. From the article:


Some might try to ignore the graphs, but you can't ignore the in-game stutter they experienced, just like [ H ] did.



So what we have so far is:
  • 4K doesn't matter.
  • Native resolution doesn't matter.
  • High refresh/FPS doesn't matter, all you need is 60FPS.
Now apparently texture quality doesn't matter either. Also detail levels don't matter, as long as they match consoles'.

Wow, nVidia's RTX has taught us sooooooo much about gaming! :rolleyes:

A $350 card shouldn't be 6GB, especially not when a cheaper 1070/1070 Ti has 8GB of VRAM. Like I said in the OP, Turding is an overpriced downgrade.


Nah man, that's not what I meant; I'm not a fan of RTX either. I'm simply stating what I think about graphics in general. I mean, you yourself don't mind playing old games; it's the gameplay that matters most, the fun aspect.
 

maddogmcgee

Senior member
Apr 20, 2015
384
303
136
Nah man, that's not what I meant; I'm not a fan of RTX either. I'm simply stating what I think about graphics in general. I mean, you yourself don't mind playing old games; it's the gameplay that matters most, the fun aspect.


Completely agree with you on this, but surely the best bet is to turn off ray tracing and leave everything else basically maxed?
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
So what we have so far is:
  • 4K doesn't matter.
  • Native resolution doesn't matter.
  • High refresh/FPS doesn't matter, all you need is 60FPS.
Now apparently texture quality doesn't matter either. Also detail levels don't matter, as long as they match consoles'.

It really is frustrating to read your posts now. You're complaining that a non-high-end card can't get performance even high-end cards can't get. Que?

Even if RTX isn't people's cup of tea, we're now reading posts about a card that delivers 60% more performance than its predecessor being a turd because, *GASP*, it can't run the newest games with the newest features (which the OP doesn't even like) without some compromise!

Holy crap, all the SJW authors nailed it. Entitlement sure has spread from consolers to PCers now. I need a time machine, I need to disown ATI because that Radeon 7000 64MB PCI 128-bit card I bought in like 1998 couldn't even run Medal of Honor at 60 FPS locked with max settings at 1600x1200! I had to, *crying*, lower settings. Hold me, please, I'm literally shaking now.

My eyes have been opened, thanks BFG10K! My next card is definitely going to be the Radeon 7; based on your other thread, I should have NO ISSUE running DXR on it.