
Nvidia Maxwell-based 900 series cards now going into legacy support

Page 3 - AnandTech community forums
Status
Not open for further replies.
One game and one engine are not enough to know anything for sure.

As for GCN getting faster than Kepler, how can we be sure it's not the game developers who are not coding and optimizing their newer games for Kepler anymore?

There are far more variables than just drivers.


That is the point of driver optimization; just look at The Witcher 3 before and after the Kepler patch.
 
That is the point of driver optimization; just look at The Witcher 3 before and after the Kepler patch.

... and in some other cases it's the game itself that is running badly, like the case of Shogun 2 having performance issues on Fermi cards, back in the day, which was fixed by a game patch.

My point was, it's not always the driver.

Besides, it wasn't as if The Witcher 3 was already running well and only needed 1-4% better performance. No, it was running badly and clearly had issues that needed fixing.

Such major performance issues get fixed as long as the card in question is supported in the driver.

Specific driver optimizations to extract every last bit of performance are another matter, though.
 
... and in some other cases it's the game itself that is running badly, like the case of Shogun 2 having performance issues on Fermi cards, back in the day, which was fixed by a game patch.

My point was, it's not always the driver.

Besides, it wasn't as if The Witcher 3 was already running well and only needed 1-4% better performance. No, it was running badly and clearly had issues that needed fixing.

Such major performance issues get fixed as long as the card in question is supported in the driver.

Specific driver optimizations to extract every last bit of performance are another matter, though.


But still, the point is that not every developer has the budget (or, in the case of GameWorks gimping, the freedom) to fix or maximize performance, and that's the point of driver optimization.
 
Deus Ex is out in a couple of weeks. I think this is the first major title to release since the Pascal launch. It supports DX12 as well. We might get the first indication of whether Maxwell is getting Keplered when we see benchmarks of the game. If the 1080 is hugely faster than the 980 Ti, the nerf has begun 😛
 
But still, the point is that not every developer has the budget (or, in the case of GameWorks gimping, the freedom) to fix or maximize performance, and that's the point of driver optimization.

Huh?! You cannot fix a badly coded game with a driver, or vice versa. These are separate things.
 
Huh?! You cannot fix a badly coded game with a driver, or vice versa. These are separate things.

You can, actually. Have you ever read a driver changelog? Usually you can see that the new driver fixed some glitch or added some performance in certain games.

And it sometimes amazes me how good the driver team is, especially when they must face black-box GameWorks, and sometimes the dev isn't allowed to even optimize for team red.
 
You can, actually. Have you ever read a driver changelog? Usually you can see that the new driver fixed some glitch or added some performance in certain games.

That's because those issues are in the driver, not in the game. If there is a performance bug in the game itself, it must be fixed by the game's developer. I even gave you an example.

sometimes the dev isn't allowed to even optimize for team red.

Was this ever proven? NVidia and AMD work with developers to optimize the games to run their best on their respective cards, but to actually prohibit the developer from optimizing for the other company's cards? That's a stretch. It might even be illegal.
 
Was this ever proven? NVidia and AMD work with developers to optimize the games to run their best on their respective cards, but to actually prohibit the developer from optimizing for the other company's cards? That's a stretch. It might even be illegal.
Welcome to nV's GameBarelyWorks program where AMD is cut off from source code for optimizing driver code.
 
Welcome to nV's GameBarelyWorks program where AMD is cut off from source code for optimizing driver code.

Again, was this ever proven? Locking Nvidia specific features, yes, but to lock out the source code of the ENTIRE game? That's doubtful.
 
Lots of failed reading comprehension, especially from the author of the original poorly-researched "blog" post quoted in the first post. Amateurism at its best.

Hmm. Wonder what cards AMD isn't making anymore, you know, GPUs classified as legacy.
 
The implication from the quote from the nVidia guy is that 9xx cards will no longer receive architecture specific support.

Wasn't the 970 the best selling card in history?

It's like "we've already sold you a card. We no longer have any desire to sell you a card. We no longer care how the 970 does in benchmarks. Indeed, the worse it looks, the better the 1070 looks. We already got your money."
 
That's because those issues are in the driver, not in the game. If there is a performance bug in the game itself, it must be fixed by the game's developer. I even gave you an example.

How can you even be sure it's always the driver, and not the game calling some specific feature the card lacks, causing a bug, like async on Nvidia?

That is why driver-specific support is critical, because not every dev knows every card's capabilities.

Was this ever proven? NVidia and AMD work with developers to optimize the games to run their best on their respective cards, but to actually prohibit the developer from optimizing for the other company's cards? That's a stretch. It might even be illegal.

I read it somewhere, but I forget where. Still, GameWorks is a black box and only Nvidia knows the source code.
 
The common Joe will make the same assumption, that the 9xx series won't have any support. Nvidia really has some weird ideas, like having a reference "Founders Edition" and making aftermarket cards perform the same or even worse.
 
[Attached benchmark chart: 80123.png]

Oh no, it used a Proof of Concept Vulkan implementation to try to prove a point about performance.
 
I'm not exactly sure what legacy means in this sense. I do know that I'd be seriously worried if I had purchased a 9xx series card at this point. Nvidia's lack of driver support for older cards has been troubling, to say the least. I didn't even consider Nvidia this go-round because of this. I purchased a 480 for $200 and plan to get many years of gaming out of it.
 
How can you even be sure it's always the driver, and not the game calling some specific feature the card lacks, causing a bug, like async on Nvidia?

That is why driver-specific support is critical, because not every dev knows every card's capabilities.

Where did I say that? I'm saying sometimes it could be a bug in the game and sometimes in the driver. One must be fixed by the developer, the other by NVidia/AMD.

Lacking features are not bugs.

I read it somewhere, but I forget where. Still, GameWorks is a black box and only Nvidia knows the source code.

GameWorks, yes, those are Nvidia-specific and there is no point in optimizing them for AMD cards since they won't work, or won't work properly, but that doesn't apply to the entire game. Nvidia doesn't own the entire game.
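The distinction being argued here, that a missing hardware feature is the game's problem to handle, not a driver bug, is the usual capability-query pattern: an engine asks what the driver reports and picks a fallback path instead of calling into a feature the card lacks. A minimal sketch of that pattern (the capability names and the `choose_path` helper are hypothetical, not any real graphics API):

```python
# Hypothetical capability-gated fallback: if the driver doesn't report
# async compute, the engine picks a synchronous path instead of calling
# into a feature the card lacks (which would be a game bug, not a
# driver bug).

def choose_path(reported_caps: set[str]) -> str:
    """Pick a render path based on what the driver reports."""
    if "async_compute" in reported_caps:
        return "async"   # overlap compute and graphics work
    return "sync"        # safe fallback for cards lacking the feature

# A Maxwell-like card that doesn't expose async compute:
print(choose_path({"dx12", "conservative_raster"}))  # prints "sync"
# A GCN-like card that does:
print(choose_path({"dx12", "async_compute"}))        # prints "async"
```

Real APIs expose the same idea through feature queries (e.g. Vulkan's physical-device feature structs); the point is only that the responsibility for checking sits in the game, not the driver.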
 

Is that current? I know it's from AT but a Link to source would help. Interesting results there if that's the latest for that game.
I can surely understand Vulkan patched games being faster on AMD hardware, but this may indicate that's not always the case. I guess there can be exceptions under any circumstances.

It's not current. Here's a much more recent test under Vulkan, although it was done in Linux. Vulkan in Dota2 is still the slower render path so I don't think it's of much benefit as it's still an experimental release, but if interested:

[Attached benchmark chart: Capture.png]

http://www.phoronix.com/scan.php?page=article&item=amdgpu-rx480-linux&num=11
 
NVidia and AMD work with developers to optimize the games to run their best on their respective cards, but to actually prohibit the developer from optimizing for the other company's cards? That's a stretch. It might even be illegal.
Tell that to the Kepler users 🙂
zlatan said:
Kepler is generally a strong architecture when the shaders are optimised for ILP. In this case all the cores can be used in any Kepler product. The only problem is when you don't optimize your shaders for independent workloads, so Kepler will lose 33 percent of the theoretical performance. This is the case with GameWorks. These shaders just use 67 percent of the Kepler GPUs, so 33 percent of your shader cores will always be idle.
Optimising for Kepler is really easy. There might be some workloads where it is not practical to find independent operations for the idle cores, because of register pressure, but many times this is not a problem.
The problems are the licences. With some specific code NV won't allow the devs to optimize the shaders. They can see the source, but they aren't able to change it, and the original code will hurt Kepler. This is why some games don't run well on these products. Basically the licences won't allow the performance optimization.
Maxwell likely won't be as "handicapped" as Kepler in some of the newer games versus Pascal, but that fragmented memory on the 970, oh boy. Time will tell what's going to happen to the best-selling gaming card... or was it the 960... 😉

Ordered a 1070 myself the other day, planning to keep it for about 1.5-2.0 years. You know why, gamers' tax at its finest lmao 😛
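The 67/33 percent figures in the quote above line up with the usual explanation of Kepler's layout: an SMX has 192 CUDA cores but four warp schedulers, each feeding one 32-thread warp per cycle, so without a second independent instruction to dual-issue, only 128 cores can be kept busy. A back-of-the-envelope check (numbers from Nvidia's Kepler whitepaper; this is illustrative arithmetic, not a profiler):

```python
# Back-of-the-envelope SMX utilization on Kepler without ILP.
CORES_PER_SMX = 192      # CUDA cores per Kepler SMX
WARP_SCHEDULERS = 4      # each selects one warp per cycle
THREADS_PER_WARP = 32

# Without instruction-level parallelism each scheduler issues a single
# instruction per cycle, feeding one warp's worth of cores.
fed_without_ilp = WARP_SCHEDULERS * THREADS_PER_WARP   # 128 cores busy

utilization = fed_without_ilp / CORES_PER_SMX
print(f"utilization without ILP: {utilization:.0%}")     # prints "utilization without ILP: 67%"
print(f"idle cores:              {1 - utilization:.0%}") # prints "idle cores:              33%"
```

That 128/192 ratio is where the "67 percent of the Kepler GPUs" figure comes from; shaders with independent operations let the schedulers dual-issue and reclaim the idle third.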
 
Regret at not waiting for a Nitro is rising... I was willing to tolerate a lack of mining. I was even willing to endure a poor API. But this too? If Nvidia is pulling this crap now, then my 1060 will be junk in a year.
 
Tell that to the Kepler users

This is about gameworks only, right?

The discussion was about the claims that nvidia can prohibit a developer from optimizing the ENTIRE source code of a game for AMD cards, not just locking the gameworks code, which is highly doubtful.

Throughout the years there have been nvidia sponsored games that run faster on AMD cards and vice versa.
 
This is about gameworks only, right?

The discussion was about the claims that nvidia can prohibit a developer from optimizing the ENTIRE source code of a game for AMD cards, not just locking the gameworks code, which is highly doubtful.

Throughout the years there have been nvidia sponsored games that run faster on AMD cards and vice versa.
My point wasn't exactly NV versus AMD, but NV versus NV. Nvidia finds a smart way to cut older GPU support to boost sales of its newer hardware. GameWorks, VRAM gimping, etc. are merely tools. But yeah, carry on with your specific discussion 😉
 