
Any legitimate reason to pay premium for nvidia card

Status
Not open for further replies.
@desprado: I know about the "hardware frame metering"; it's just old marketing stuff to me. The 290/290X exhibits the same problem as Nvidia SLI: you need profiles for it to work.

Crossfire has a much more troubled history than SLI.

1. NV has had hardware frame pacing since at least Fermi, and if you don't know what frame times or frame pacing mean, go look up the TechReport article that started the whole frame-pacing controversy in the first place. Or see here for why it's a big deal: http://www.pcper.com/reviews/Graphi...ils-Capture-based-Graphics-Performance-Tes-11

AMD only followed suit four years later with the R9 290/290X; anything below that, like the R9 280X or 7970, doesn't get the hardware fix. Instead those cards get a sloppy software fix that partially addresses the issue for single-GPU, or dual-GPU on a single monitor. It still does not fix the issue for Crossfire + Eyefinity users like me. 🙁

2. If you want to know how bad the situation was from 2009 (and probably before 2009 as well) through most of 2013, see this: http://www.hardocp.com/article/2010/08/09/geforce_gtx_460_1gb_sli_vs_radeon_hd_5870_cfx (GTX 460 SLI beat HD 5870 Crossfire; yes, you read that right: two midrange NV GPUs beat the two best AMD GPUs at the time)
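To make the frame-pacing point concrete, here's a toy sketch of the kind of analysis TechReport popularized: average FPS alone hides stutter, so you look at per-frame times and the worst-case percentiles. All numbers below are invented for illustration, not taken from any review.

```python
# Illustrative frame-time analysis: a run can average ~60 FPS while a few
# long frames (microstutter) ruin the perceived smoothness.

def frame_times(timestamps_ms):
    """Per-frame render times (ms) from consecutive presentation timestamps."""
    return [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]

def percentile(values, p):
    """Nearest-rank percentile, no external libraries needed."""
    ordered = sorted(values)
    k = max(0, min(len(ordered) - 1, round(p / 100 * len(ordered)) - 1))
    return ordered[k]

# Synthetic run: mostly 15.5 ms frames, with a 50 ms spike every 30th frame.
stamps = [0.0]
for i in range(120):
    stamps.append(stamps[-1] + (50.0 if i % 30 == 29 else 15.5))

times = frame_times(stamps)
avg_fps = 1000.0 / (sum(times) / len(times))
print(f"average FPS: {avg_fps:.1f}")                       # looks fine
print(f"99th percentile frame time: {percentile(times, 99):.1f} ms")  # does not
```

The average comes out around 60 FPS, but the 99th-percentile frame time is 50 ms, which is exactly the judder a capture-based tool would flag and an FPS counter would hide.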

Thanks for repeating the same thing I said: it took AMD one generation to get a card on par with SLI: Hawaii.
 
@desprado: I know about the "hardware frame metering"; it's just old marketing stuff to me. The 290/290X exhibits the same problem as Nvidia SLI: you need profiles for it to work.



Thanks for repeating the same thing I said: it took AMD one generation to get a card on par with SLI: Hawaii.
No, no. If you search Google, Nvidia GTX 7xx SLI is still better than R9 Crossfire.
 
I was under the impression that the 680 was considered better than the 7970. Given that the 280X is a rebrand of the 7970 and the 770 is a rebrand of the 680, why exactly would you prefer the 280X?

Looking at benchmarks, the GHz 7970 generally beats the 770, and it has a pretty solid lead in BF4, which matters most to me since that's what I'm currently playing. That, and at one time it was cheaper with more RAM.
 
@desprado: I know about the "hardware frame metering"; it's just old marketing stuff to me. The 290/290X exhibits the same problem as Nvidia SLI: you need profiles for it to work.



Thanks for repeating the same thing I said: it took AMD one generation to get a card on par with SLI: Hawaii.

And it has taken them longer than two years to get a fix out for their old 79xx-and-prior customers for DX11 Crossfire + Eyefinity. DX9 will not be getting a fix. Sounds like great customer relations to me. Two years.

Poor software support like that doesn't happen on the Nvidia side from what I've seen. For those guys it has "just worked" since Fermi. Now here we are, approaching 2014, with AMD still working on it; I don't know how anyone can objectively view that situation as being pro-consumer. Neither company is perfect, but Nvidia is faster at getting fixes in place.

If there was an "Nvidia tax", yeah, people were apparently willing to pay it to avoid this type of thing. Again, they're not perfect, just far better and faster at correcting software problems and adding features.
 
I'm still skeptical of Hawaii's lack of Crossfire connectors. The latency of the PCI Express bus should be a lot higher than that of an SLI bridge. The better solution would have been to make a new, higher-bandwidth Crossfire connector.

Though ultimately, I don't care, as I'm firmly in the camp that prefers a single card.
 
I'm still skeptical of Hawaii's lack of Crossfire connectors. The latency of the PCI Express bus should be a lot higher than that of an SLI bridge. The better solution would have been to make a new, higher-bandwidth Crossfire connector.

Though ultimately, I don't care, as I'm firmly in the camp that prefers a single card.

It seems to work more than properly, indeed. Do you know how a bus works? Standard Crossfire bridges are old stuff, dating from the PCIe gen-1 days. Every hardware solution that needs a CF/SLI bridge looks old-fashioned now that Hawaii is out.

PS: I avoid SLI/CF too. AFR sucks, and the need for game profiles sucks.
 
I'm still skeptical of Hawaii's lack of Crossfire connectors. The latency of the PCI Express bus should be a lot higher than that of an SLI bridge. The better solution would have been to make a new, higher-bandwidth Crossfire connector.

Though ultimately, I don't care, as I'm firmly in the camp that prefers a single card.

So far, so smooth. The R9 290 would have to be one of the best bang-for-buck cards, a great successor to the 7950.

In my region, R9 290s are still ~$150 less than the GTX 780. No massive markup due to the mining craze; they've held their launch prices thus far, but they're often sold out.
 
It all depends on the game. It's like 50/50. However, the 280X has 3GB VRAM + more bandwidth + more compute power = future-proofing the 770 lacks. :whiste:
Which game are you talking about? Did you read the link before posting? First see the benchmark, then post. Don't post without any knowledge.
 
Looking at benchmarks, the GHz 7970 generally beats the 770, and it has a pretty solid lead in BF4, which matters most to me since that's what I'm currently playing. That, and at one time it was cheaper with more RAM.

That solid lead is almost gone due to patch and driver updates.

Check any review that uses the 331.82 or 331.93 drivers and you'll see that NVidia has caught up in BF4.

As a matter of fact, here's one right here..

As you can see, the Asus GTX 770 is right up there with the Sapphire Toxic 280x, even though it's one of the slower models..

The 280x still has the advantage though due to more VRAM, as you can see by the minimum framerate. If you got the 4GB model though, then it wouldn't matter.

That's the thing I love about NVidia. Their driver department is top notch when it comes to extracting performance. It wouldn't surprise me if they eventually overtake AMD completely in this game (on the DirectX 11.1 path at least), as the engine favors their multithreaded drivers.
 
It all depends on the game. Its like 50/50. However the 280x has 3Gb VRAM + more bandwidth + more compute power = future proof that 770 lacks. :whiste:

I see the 770 and 280X as more or less even in terms of frames per second. True, the default 280X has more VRAM, but you can get a 4GB 770 for less than the mining-inflated 280X for the time being. You can get a 4GB 770 for around $330-$360; the mining-inflated 280X is $400-$420. So with the 280X you get 1GB more VRAM by default, or you can get a 4GB 770. The 280X will have Mantle, compute, and all of that jazz. Conversely, the 770 is cheaper right now, has much better and more fully featured software, doesn't have issues with SLI + Surround microstutter, and has AO / PhysX / TXAA and adaptive vsync.

Personally, Nvidia's software ecosystem has me sold; I have come to really enjoy features like adaptive vsync and driver AO. But for someone looking for performance and only performance, who won't ever use Surround, the 280X can be an option; it's a great, fast GPU if you never plan on Surround + CF. I also feel that AMD's software support is substantially worse than Nvidia's, although the 280X has been around long enough (as the 7970) that it may not be an issue for most people.

The situation is rather weird currently. Thanks to miners, equivalent Nvidia GPUs are cheaper than their AMD counterparts - now AMD has the tax and apparently for no good reason.
 
That solid lead is almost gone due to patch and driver updates.

Check any review that uses the 331.82 or 331.93 drivers and you'll see that NVidia has caught up in BF4.

As a matter of fact, here's one right here..

As you can see, the Asus GTX 770 is right up there with the Sapphire Toxic 280x, even though it's one of the slower models..

The 280x still has the advantage though due to more VRAM, as you can see by the minimum framerate. If you got the 4GB model though, then it wouldn't matter.

That's the thing I love about NVidia. Their driver department is top notch when it comes to extracting performance. It wouldn't surprise me if they eventually overtake AMD completely in this game (on the DirectX 11.1 path at least), as the engine favors their multithreaded drivers.

nice drivers :thumbsup:

In Battlefield 4 we were reluctant to find that the SAPPHIRE TOXIC R9 280X and the ASUS GeForce GTX 770 DC II were not quite playable at 2560x1600 while using "Ultra" quality settings. The SAPPHIRE TOXIC R9 280X was playable at 1920x1080 with 4X MSAA enabled. The "Ultra" quality settings were selected, including HBAO. It provided a playable experience averaging 63.3 FPS. The ASUS GeForce GTX 770 DC II matched the SAPPHIRE TOXIC R9 280X’s settings, and performed 1.9% faster with an average framerate of 64.5 FPS.

On paper the ASUS GeForce GTX 770 DC II’s performed faster, however the better gameplay experience goes to the SAPPHIRE TOXIC R9 280X. The graph clearly shows the SAPPHIRE TOXIC R9 280X performing smoother, which felt more fluent compared to the ASUS GeForce GTX 770 DC II. The ASUS GeForce GTX 770 DC II’s framerate jumped up and down over a wide range, which creates a stutter that reoccurs frequently throughout the run-through.
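The [H] result quoted above, near-identical averages but a different smoothness verdict, is easy to reproduce with toy numbers. Both traces below are invented for illustration: they have exactly the same average FPS, but one alternates short and long frames, AFR-style judder, and a simple (hypothetical) threshold count exposes it.

```python
# Two synthetic frame-time traces with identical average FPS but very
# different smoothness, showing why the averages in the review can't
# settle the "which felt better" question on their own.

def avg_fps(times_ms):
    """Average FPS over a list of per-frame times in milliseconds."""
    return 1000.0 * len(times_ms) / sum(times_ms)

def stutter_frames(times_ms, threshold_ms=25.0):
    """Count frames slower than an arbitrary badness threshold (assumed)."""
    return sum(1 for t in times_ms if t > threshold_ms)

smooth = [16.0] * 100        # steady 16 ms frames: 62.5 FPS, no judder
jumpy = [6.0, 26.0] * 50     # alternating frames: same 62.5 FPS average

print(avg_fps(smooth), avg_fps(jumpy))              # identical averages
print(stutter_frames(smooth), stutter_frames(jumpy))  # 0 vs 50 slow frames
```

An FPS graph would rank these two runs as equal; a frame-time graph, like the one [H] published, shows one of them stuttering on every other frame.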
 
nice drivers :thumbsup:

Like I said, the 2GB of VRAM is definitely hurting the GTX 770, but since the OP said he was getting the 4GB version, then it wouldn't matter.

BF4 definitely benefits from having more than 2GB. On my machine, it tops out at 2.3GB in the campaign, and it's buttery smooth for me with no dips.
 
Like I said, the 2GB of VRAM is definitely hurting the GTX 770, but since the OP said he was getting the 4GB version, then it wouldn't matter.

BF4 definitely benefits from having more than 2GB. On my machine, it tops out at 2.3GB in the campaign, and it's buttery smooth for me with no dips.

Wait, you're suggesting the stuttering is garbage collection kicking in when it hits the VRAM limit? That doesn't make sense; how would that explain Titan and 780 frame drops? 😉
 
You forgot a few: G-Sync, ShadowPlay, Windows drivers, the control panel, NV Inspector, and Linux drivers.

R280x is $400, 7970 is $470...

You sure it's worth paying the 20% premium for AMD?

What's their list to warrant a premium, Mantle? OP isn't mining, so what else is there that AMD offers?

Let me get a couple of things straight here...

Did the OP say he is going to buy a G-Sync enabled monitor?

It isn't AMD asking for a premium on their cards due to mining.
It is the retailers, e.g. Newegg, Amazon, etc., asking for a premium.

The retailers are asking for a premium because the miners are willing to pay it.

It isn't AMD's fault that their cards are good for mining.
 
Wait, you're suggesting the stuttering is garbage collection kicking in when it hits the VRAM limit? That doesn't make sense; how would that explain Titan and 780 frame drops? 😉

*blinks* One could ask the same question about how AMD cards are a mess in Assassin's Creed IV. Not only is performance slow and stuttery, but AMD doesn't even have Crossfire support. In fact, AC2, AC2: Brotherhood, AC2: Revelations, and AC3 also did not, and still do not, have Crossfire support.

Back to BF4. Nvidia was locked out during the development of BF4 due to it being an AMD GE title, so I'd expect Nvidia to catch up with drivers if they haven't already. Despite this, Nvidia has SLI support in BF4, while I doubt AMD will ever have CF support in most Ubisoft games.

Basically, you're making an example out of one game that Nvidia was locked out of during development. I wouldn't read too much into it. NV is going to improve there, and they already have SLI support; the same can't be said of AMD bothering to add CF to various Ubi titles. Yes, BF4 may favor AMD for the time being because they had exclusive access to the game as an AMD GE title. It is only one game, and there are counter-examples where AMD has a substantial performance deficit. I'd say that Nvidia is competitive to even in BF4 with current drivers.
 
Wait, you're suggesting the stuttering is garbage collection kicking in when it hits the VRAM limit? That doesn't make sense; how would that explain Titan and 780 frame drops? 😉

What frame drops are you talking about? If there are frame drops (particularly in Baku when the tower falls), they're eventually going to be either patched out or fixed with driver updates.

Here's another review showing the GTX 770 faster than the 7970 GHz in BF4. AMD's weakness is their drivers. They don't have the level of multithreading capability that NVidia's drivers have, as exposed by the PCLab.pl review. The Frostbite 3 engine runs best on highly multithreaded architectures.

[chart: gamegpu.ru Battlefield 4: China Rising benchmarks, 1920x1080 with MSAA]

[chart: gamegpu.ru Battlefield 4: China Rising benchmarks, 2560x1600]
 
What frame drops are you talking about? If there are frame drops (particularly in Baku when the tower falls), they're eventually going to be either patched out or fixed with driver updates.

Here's another review showing the GTX 770 faster than the 7970 GHz in BF4. AMD's weakness is their drivers. They don't have the level of multithreading capability that NVidia's drivers have, as exposed by the PCLab.pl review. The Frostbite 3 engine runs best on highly multithreaded architectures.

[chart: gamegpu.ru Battlefield 4: China Rising benchmarks, 1920x1080 with MSAA]

[chart: gamegpu.ru Battlefield 4: China Rising benchmarks, 2560x1600]
Damn, AMD has evolved so much.
Leave the GTX 780 Ti alone; even the GTX 780 is better than the R9 290X.
 
*blinks* One could ask the same question about how AMD cards are abysmally bad in Assassin's Creed IV. Remember that entire talk about how next-gen console ports would improve PC ports? Yeah, AC4 was on next-gen consoles, but it doesn't even have Crossfire support. In fact, AC2, AC2: Brotherhood, AC2: Revelations, and AC3 also did not have Crossfire support. Any bets on AC IV never getting Crossfire support?

Nvidia was also locked out during the development of BF4 due to it being an AMD GE title, so I'd expect Nvidia to catch up with drivers if they haven't already. That's what they excel at.

But for every example you want to nitpick, like BF4, rest assured there is a counter-example like AC IV that runs like garbage on AMD cards. So I wouldn't try to make such a great point out of one single game.

I'm not sure what you mean. I didn't experience any stuttering in AC4 on my R9 290. It was CPU-limited on my 4770K, in fact. Please don't tell me we're using this as a benchmark title now :|
 
I'm not sure what you mean. I didn't experience any stuttering in AC4 on my R9 290. It was CPU-limited on my 4770K, in fact. Please don't tell me we're using this as a benchmark title now :|

Let me rephrase. The main point I'm getting at is that some games favor NV and some favor AMD. You're trying way too hard to make an example for NV out of one game that NV was basically locked out of during development; AMD has had similar titles where the same happened to them. Ubisoft games, for instance, have traditionally favored Nvidia by a large margin. That wouldn't be so bad, except it seems like AMD doesn't even try here: they haven't added CF support for most Ubi titles such as AC2, AC: Brotherhood, AC: Revelations, AC3, etc. Splinter Cell: Blacklist also does not have CF support, last I checked.

So here we are with BF4, and unsurprisingly, it favored AMD at release. This should not be surprising since it is an AMD GE title; AMD had exclusive access to BF4 during development. If they weren't ahead, I'd say something was severely wrong. But with the latest drivers, NV is more or less even in BF4 from what I've seen. So I wouldn't read too much into the performance of one single game. The fact of the matter is that, ignoring non-performance-related metrics, the 280X and 770 are mostly equal in performance when averaged out over many games...
 
I'm not sure what you mean. I didn't experience any stuttering in AC4 on my R9 290. It was CPU-limited on my 4770K, in fact. Please don't tell me we're using this as a benchmark title now :|
Bro, he is right. The OP asked for reasons, whereas right now he'd have to pay the mining tax on AMD whether he likes it or not. Right now Nvidia is cheaper, and the only factor that made people buy AMD cards, price and performance, is gone for a while, whereas at the same price Nvidia provides better driver support, better software, PhysX, Nvidia Inspector, ShadowPlay, etc.
 