Discussion [ H ]: Battlefield Raytracing 2070


BFG10K

Lifer
Aug 14, 2000
22,709
2,971
126
https://www.hardocp.com/article/2018/12/17/battlefield_v_nvidia_ray_tracing_rtx_2070_performance/1

A 55% performance hit makes for an unplayable slideshow on a heavily overclocked 2070, even at 1080p. Who buys a $600 card to watch a slideshow @ 1080p?

Also, previous comparisons of DXR are flawed. The correct way is to compare it to DX11:
[attached benchmark chart]
DX12 drops performance by 18% just by itself, so using DX12-only falsely shows a smaller hit.
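
To make the arithmetic concrete, here's a toy calculation with made-up frame rates (not the [H] numbers), just to show why measuring against DX12 alone understates the hit:

Code:
// Hypothetical frame rates for illustration only -- not actual benchmark data.
#include <cstdio>

int main() {
    const double dx11 = 100.0;     // DX11 baseline
    const double dx12 = 82.0;      // DX12 alone is already ~18% slower
    const double dxr  = 45.0;      // DX12 with DXR enabled

    // Measured against DX12 only, the hit looks smaller (~45%)...
    std::printf("hit vs DX12: %.0f%%\n", (1.0 - dxr / dx12) * 100.0);
    // ...but against DX11, which is what you actually give up, it's 55%.
    std::printf("hit vs DX11: %.0f%%\n", (1.0 - dxr / dx11) * 100.0);
    return 0;
}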

Yet another example (of many) showing the failure of low-level APIs. Johan Andersson was cheerleading Mantle, yet his AAA engine still flops with DX12 even after repeated iterations injected with millions of dollars from EA. If even he can't get it right, then it's time to move on.

I still remember certain individuals on this forum telling us "DX11 would be dead within 18 months and all indie developers would get automatic performance gains from DX12" (LOL).

Epic fail.
 

TheELF

Diamond Member
Dec 22, 2012
3,973
730
126
DX12 works gorgeously and does exactly what it was supposed to do: it reduces CPU usage caused by the driver and releases that performance to be used by the game. Utilization % goes down while IPC per core actually increases with DX12. Low-level APIs were never meant for overpowered PCs; they are meant to free up resources on underpowered PCs.
What DICE fails at is actually using the correct number of threads to get the best performance, because leaving 20 or even 10% of resources untouched is pretty sad.
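
The general idea is just a worker pool sized to the machine's hardware thread count, so no cores are left idle by design. A minimal sketch (my own illustration, nothing to do with Frostbite's actual job system):

Code:
// Minimal worker-pool sketch, illustrative only -- not Frostbite/DICE code.
#include <algorithm>
#include <condition_variable>
#include <functional>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

class JobPool {
public:
    JobPool() {
        // Size the pool to the hardware so no cores sit idle by design.
        unsigned n = std::max(1u, std::thread::hardware_concurrency());
        for (unsigned i = 0; i < n; ++i)
            workers_.emplace_back([this] { run(); });
    }
    ~JobPool() {
        { std::lock_guard<std::mutex> lk(m_); done_ = true; }
        cv_.notify_all();
        for (auto& t : workers_) t.join();
    }
    void submit(std::function<void()> job) {
        { std::lock_guard<std::mutex> lk(m_); jobs_.push(std::move(job)); }
        cv_.notify_one();
    }
private:
    void run() {
        for (;;) {
            std::function<void()> job;
            {
                std::unique_lock<std::mutex> lk(m_);
                cv_.wait(lk, [this] { return done_ || !jobs_.empty(); });
                if (done_ && jobs_.empty()) return;   // drain, then exit
                job = std::move(jobs_.front());
                jobs_.pop();
            }
            job();   // run the job outside the lock
        }
    }
    std::vector<std::thread> workers_;
    std::queue<std::function<void()>> jobs_;
    std::mutex m_;
    std::condition_variable cv_;
    bool done_ = false;
};

The point being: feed frame work into something like this instead of hard-coding a thread count for some fixed console core count.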

 
  • Like
Reactions: Despoiler

ZGR

Platinum Member
Oct 26, 2012
2,052
656
136
Anyone know how big the TAA performance hit is? I have yet to see a TAA implementation that looks good.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,971
126
Looks like EA's stock is taking a dump as hard as nVidia's: https://seekingalpha.com/article/42...ate-reductions-battlefield-v-sales-disappoint

I guess a game combining the failure of DX12/RTX with SJW politics isn't quite the successful formula they hoped for. :rolleyes:

What DICE fails at is actually using the correct number of threads to get the best performance, because leaving 20 or even 10% of resources untouched is pretty sad.
Of course! The answer's soooooo simple, just spawn more threads and DX12 performance will automatically scale!

This exact same flawed argument was used by quad-core CPU proponents back in the day.
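
Amdahl's law is the reason more threads don't automatically scale; here's a toy calculation with an assumed parallel fraction (my number, not anything measured in BF V):

Code:
// Amdahl's law: speedup = 1 / ((1 - p) + p / n), p = parallel fraction of the frame.
#include <cstdio>

int main() {
    const double p = 0.60;  // assume only 60% of frame time can be parallelized
    for (int n : {2, 4, 8, 16}) {
        double speedup = 1.0 / ((1.0 - p) + p / n);
        std::printf("%2d threads -> %.2fx speedup\n", n, speedup);
    }
    return 0;
}

Even at 16 threads the speedup stalls around 2.3x, because the serial 40% dominates.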
 
May 11, 2008
19,542
1,191
126
Looks like EA's stock is taking a dump as hard as nVidia's: https://seekingalpha.com/article/42...ate-reductions-battlefield-v-sales-disappoint

I guess a game combining the failure of DX12/RTX with SJW politics isn't quite the successful formula they hoped for. :rolleyes:


Of course! The answer's soooooo simple, just spawn more threads and DX12 performance will automatically scale!

This exact same flawed argument was used by quad-core CPU proponents back in the day.

TheELF has a point though...

But to make good use of multithreading, the engine has to be designed for it from the earliest ideas on the back of an envelope all the way up to the end product used in a game.
Just creating more threads is indeed not a solution when the threads simply keep waiting for each other, or when the thread management for multithreading consumes more resources than a single thread doing all the game logic, input, audio and video would.
Those are extreme examples, but it is what happens.
Good multithreading management, though, will help speed things up a lot.
Even today, when the cache controller in a CPU should take care of everything, it is obvious that threads written to keep as much of the performance-critical instruction code and data as possible inside the local caches will help performance a lot, compared to code that causes non-stop accesses to slow main memory because of cache evictions.
That hampers performance a lot.
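
As a toy illustration of the cache point (my own example, nothing to do with Frostbite's internals): two threads hammering counters that share a cache line fight over that line constantly, while padding the counters onto separate lines avoids the ping-pong.

Code:
// False-sharing demo: same work, different data layout.
#include <atomic>
#include <chrono>
#include <cstdio>
#include <thread>

struct Packed {                     // both counters likely share one cache line
    std::atomic<long> a{0}, b{0};
};
struct Padded {                     // each counter gets its own 64-byte line
    alignas(64) std::atomic<long> a{0};
    alignas(64) std::atomic<long> b{0};
};

template <class Counters>
double hammer(Counters& c) {
    auto t0 = std::chrono::steady_clock::now();
    std::thread t1([&] { for (long i = 0; i < 50000000; ++i) c.a.fetch_add(1, std::memory_order_relaxed); });
    std::thread t2([&] { for (long i = 0; i < 50000000; ++i) c.b.fetch_add(1, std::memory_order_relaxed); });
    t1.join();
    t2.join();
    return std::chrono::duration<double>(std::chrono::steady_clock::now() - t0).count();
}

int main() {
    Packed p;
    Padded q;
    std::printf("shared cache line: %.2f s\n", hammer(p));
    std::printf("padded:            %.2f s\n", hammer(q));
    return 0;
}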

DX12 is not bad. It is just that writing a proper game engine for it means taking classes again if, as a programmer, you are not used to programming that way.
Add to that there is always a deadline and never enough time.

Not to say the programmers are bad; it is just that the level of knowledge required is almost obscene for people working against the clock.
And the above are all examples I know about, not what is going on in the Frostbite engine, which I have no idea of.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
Yeah, it's pretty key to note Frostbite predates DX12 and low-level APIs by many years. Frostbite 1 came out in '08, which means it was almost certainly in development around the time the Xbox 360 had just come out, so scaling to 3 cores was pretty forward-looking at the time. They've certainly improved the engine substantially since then, since it looks incredible and runs very well, but base-level architectural decisions are set in stone short of a rework.