Historically, haven't most id games been better performers on nVidia hardware? Not saying it's GameWorks or a conspiracy or anything, just seems to have been that way since I can remember.
I think it had to do with id historically using OpenGL and Nvidia being the best at it for a long time. I also remember John Carmack having a clear preference for / better relationship with NV, but he's no longer there.
Anyway, I think this game is also 60FPS on consoles?
This is the reason my Radeon Wolfenstein Old Blood benchmarks were wrong.
Radeon owners, don't use MSI Afterburner with Wolfenstein Old Blood
And the new 7950 video
[new] Wolfenstein Old Blood 1920x1080 Ultra(-), 7950 @1.1GHz, Core i7-860 @4GHz
Ultra(-) means Ultra preset but only 4xAA.
55fps up from 32fps, just from the OSD, lol.
In all fairness, Unwinder warns about using Vector 2D in RTSS, which is why I never use it, but that was the only setting that worked in this specific game.
In any case, lesson learned. Very sorry for speaking ill of the AMD drivers. That's the second time in two weeks. It seems the driver is more solid than I want to believe.
Yeah why is the 980 Ti SLI performing worse than just a single 980 Ti? I don't trust those graphs especially when it looks like they've got it maxed out at 60fps.
Look at 780, Titan and 780Ti.
A 7970 GHz Edition/280X is matching the Kepler Titan. Yet another new game where Kepler is taking a dirt nap.
You've made your point about Kepler many, many times on this forum. Let it go.
Look at 780, Titan and 780Ti.
A 7970 GHz Edition/280X is matching the Kepler Titan. Yet another new game where Kepler is taking a dirt nap.
The game is 60fps at 1080p on console; it's very well optimized to pull that off.
https://www.youtube.com/watch?v=f-z7WmdjpC0
Yet there are still so many deluded people who deny data this consistent! Almost every new next-gen game, it's the same story.
Remember last year, when HardwareCanucks did a story on whether Kepler had dropped off? Lots of folks on this very forum used their flawed data to push the line that Kepler was still "doing great, nothing to see here".
We can even go back further, to 2012, when the PS4/Xbone were known to be using GCN GPUs. Some of us correctly called it: that next-gen games would run very well on GCN on the PC, to the detriment of Kepler, and that the 7970 was much more future proof than the 680.
One has to laugh when those threads are re-read today: the 680, with its ~20W advantage over the 7970, was praised by many here like the second coming of Jesus. All 2GB of it. Heck, the hype even got to me; I went out and bought a GTX 670 only to see it age so poorly compared to my 7950.
Is this using Nvidia-killing Async Compute yet, or is that being saved for release?
The 770 2GB matching the 380X 4GB, good showing by GK104! :thumbsup:
WTF? GTX 970s in SLI are WAY below a single GTX 970. SLI isn't just broken, it's a major hindrance!
When you have sites like HWC shilling for Nvidia, you can clearly see the power of the brand and Nvidia's mindshare, not just among users but also among the press, who in many cases have now become outright PR extensions of Nvidia. Anyway, the upcoming generation should be interesting. I'm looking forward to seeing how Polaris and Pascal compare architecturally.
Pretty strong accusations there, chief. Just because you don't agree with somebody's opinion doesn't make them a shill.
By your own admission you were once way too positive on AMD's prospects/execution, and during that time you were very aggressive toward a lot of people on a lot of web forums who held views opposite to yours. I don't think you're an AMD "shill", but even if I did, I would need actual evidence before making such a serious claim on a public forum.
Tone it down, dude. The people who do these analyses are humans with their own opinions on what makes a good vs. bad product (especially at the time, without the benefit of 4+ years of hindsight).
But they are a shill site. You seem to like to pick fights. You've gone and done it twice in this thread with barely any posts in it. :\
The 380X, i.e. GCN 1.2, is not performing well. The 280X smashes the 380X.
This game is behaving like DX12, with uarch-specific optimizations... is it Vulkan?
They talked about using Async Compute, but it's hard to see that happening on OpenGL.
Again, calling a website a "shill site" is an extremely strong accusation. Do you have any proof?
And, no, I don't like to pick fights, but when I see people trying to trash the reputation of a website that's probably run by people just trying to do their best, I demand more evidence than simply "I don't like what they have to say, so they're shills!"