
FuryX now = 980ti 1080p/1440p > 4k

Status
Not open for further replies.
They weren't poor to be exact, but hamstrung by the VLIW drivers. It took them time to re-write the memory management for GCN, so the transition to full GCN took some time. It's not exactly poor as in they did a poor job, but more along the lines of logistics.

I agree, it did take some time to transition to GCN for AMD. They launched with poor drivers. Thankfully they fixed it. Hell, I remember all the instability issues during the actual launch.

I ended up using the leaked 12.2 drivers since I wasn't using CFX at the time, but if I recall CFX users were told to use the 11.12 RC driver that came on the CD for a long time due to instability issues.

Woof, I forgot just how messy the drivers were back then haha.
 
Yep, 7970 CF at launch = big mistake. I actually sold my second card because of the problems. Still, the 7970 @ +1.3GHz was pretty beastly even at launch, and it got a lot better after they sorted out the driver fiasco.

GCN has gained a lot for being such an old architecture, but I guess consoles are one of the main reasons why those mummies (the 7xxx series) are still going so strong.
 
As far as I am concerned, this generation was a reputation killer for certain sites that lost all the credibility they had built up over the last 10 years.

TPU FTW for listening to the consumers and understanding what outliers are.

Agreed, for AMD and nVidia too. To anyone paying attention, nVidia has officially cashed in the good will associated with the x60 and x80 brands now by selling products that never would have fit the bill for those marketing names in years past. Even single time posters here asking questions about graphics are noticing the 960 isn't what x60 cards used to be and the 980 very much isn't what x80 cards used to be.

Similarly AMD cashed in the x9x(x) name now that the 390 and 390x are actually midrange in their stack and "Fury" is now the top end. The x9x moniker has only had one generation of use, though the x9xx moniker was close enough that I think it's fair to say they're cashing them both in at once.

Everybody's trying to sell midrange cards as if they were high end cards with slick marketing talk.
 
Man them Lightning results are amazing! While I wanted one, glad I didn't wait for it. Woof, almost 5 months after all the other cards came out. No thanks.

Good to see AMD addressing performance issues quicker. Fury owners won't have to face the performance drought Tahiti owners did! Enjoy it 😀

Yes, because Tahiti's O/C'ing performance didn't matter like the 980 ti's does.
 
Yes, because Tahiti's O/C'ing performance didn't matter like the 980 ti's does.

I remember the competition to Tahiti being pretty decent for OC, while now, Fury is really bad for OC and the 980 TI is the opposite, so it probably is more relevant now when comparing the two.
 
I remember the competition to Tahiti being pretty decent for OC, while now, Fury is really bad for OC and the 980 TI is the opposite, so it probably is more relevant now when comparing the two.

When Tahiti was first released (before Kepler) and people were getting monster O/Cs, all you heard from the pro-nVidia crowd was, "but it's not guaranteed performance, so it doesn't matter." Not a big deal, just throwing it out there for the nostalgia of it. Lest people forget.
 
Yep, 7970 CF at launch = big mistake. I actually sold my second card because of the problems. Still, the 7970 @ +1.3GHz was pretty beastly even at launch, and it got a lot better after they sorted out the driver fiasco.

AMD seem to produce some fantastic chips but then botch the launch resulting in poor first impressions.
 
AMD seem to produce some fantastic chips but then botch the launch resulting in poor first impressions.

Agreed. Moreover, in the case of the Fury X vs. the GTX 980 Ti it was an easy choice for me since I have a water-cooled rig. The GTX 980 Ti was able to use a waterblock AND OCs like a champ. The Fury X ultimately had an EK block available, but its OC headroom seemed to be very little to none.

I was impressed by the HBM memory of the Fury X, but 4GB vs. the 6GB of the GTX 980 Ti didn't help either. Owning R9 290s, I was familiar with AMD's history of releasing better drivers over time. Again, the Fury X seems to be catching up, but only to the reference GTX 980 Ti. OC'd 980 Tis are still ahead.

For me the choice, then and now, would be the GTX 980 Ti over the Fury X (in my case the EVGA GTX 980 Ti SC was the best deal, with a default core of 1102 MHz vs. 1000 stock). 😎
 
Agreed, for AMD and nVidia too. To anyone paying attention, nVidia has officially cashed in the good will associated with the x60 and x80 brands now by selling products that never would have fit the bill for those marketing names in years past. Even single time posters here asking questions about graphics are noticing the 960 isn't what x60 cards used to be and the 980 very much isn't what x80 cards used to be.

Similarly AMD cashed in the x9x(x) name now that the 390 and 390x are actually midrange in their stack and "Fury" is now the top end. The x9x moniker has only had one generation of use, though the x9xx moniker was close enough that I think it's fair to say they're cashing them both in at once.

Everybody's trying to sell midrange cards as if they were high end cards with slick marketing talk.


You're correct on AMD, but incorrect on NV. The only reason why the 960/980 look bad is because of the other card: the 970.

You compare both to that card, implicitly, and as such both look bad. But that's a stupid comparison to make. If you look at the 980, the direct comparison should be the high-end GK104, which is the 780. What's the performance upgrade? We're talking at least 40% here at 1080p.

You'd know this if you'd paid attention.

The 970 is not just on par with the 780 - but significantly ahead. This is ahistorical and as such, the 980 looks a lot worse than it is. It's not worth the premium compared to the 970, but when you look at generational comparisons(980/780 or GM104/GK104), it's absolutely within the normal boundaries for a generational change.

I'm kind of surprised more people haven't understood just how radical the 970 is compared to previous x70 cards. The 770, 670 and 570 were all neatly following the same pattern. The 970 doesn't, in large part because NV wanted to finish off AMD, and that card is as good a reason as any as to why AMD has been dropping like a tank. They made an exceptionally good midrange GPU for a shockingly good price. It messes up not only AMD, but also NV's own internal balance, which had been held in check for GPU generation after GPU generation.
 
I think right now NV continues to sell on brand value and perception in the $100-400 range. 380 2GB > 950, 380 4GB/280X > 960, 290 has no competition, 390 > 970. Yet, NV completely outsells AMD with 950/960/970 cards.

What's most surprising is just how much better the 280X is against the 950/960 cards, and how poorly the 780 aged. Those are far more eye-opening for me than the Fury X getting slightly better against a reference 980 Ti. The crazy part is how overhyped the 780 was, how people purchased it over the mostly cheaper 290, and how sites like TechReport and HardOCP completely failed the consumer by failing to warn them about the 2GB limit on the 960, while downplaying the performance advantage of the 280X all this time, despite the latter often being within a similar price range.

Once this generation is done, in 5 years, no one will care about any of these cards per se, but I'll never forget review sites that failed to point out glaring product flaws and prioritized NV's perf/watt marketing over raw GPU horsepower and VRAM. As far as I am concerned, this generation was a reputation killer for certain sites that lost all the credibility they had built up over the last 10 years.

TPU FTW for listening to the consumers and understanding what outliers are.

One reason AMD needs to change the presentation of its new cards: the review sites have too much bias. I've said it a long time, users buy on the wrong facts.
 
One reason AMD needs to change the presentation of its new cards: the review sites have too much bias. I've said it a long time, users buy on the wrong facts.

I agree that AMD needs to change the presentation of its new cards; this has been way more damaging than any market bias.

Recall that when the 680 launched, it performed better than the 7970, was more efficient and cost less. Only once the 7970GHz came out, its overclocking potential was realised, the Never Settle drivers were released (a year later) and the CF issues were fixed (a year and a half later) did the card really start to shine (and it still does).

Then there was the 290x's stock cooler, associated noise and thermal throttling. This was compounded by a lack of custom cooling solutions from AMD's partners.

Even if all review sites had a strong bias in favour of AMD, the market still would have been left with soured first impressions. 🙁

Sorry OP, wandered a little off topic.
 
Yes, because Tahiti's O/C'ing performance didn't matter like the 980 ti's does.

You are so salty. When talking about drivers, and performance gains from drivers, what does OC have to do with anything?

You can't erase all of AMD's blemishes. I don't get why you try so hard to make it look like AMD's products are invincible to criticism.

EDIT: Let it be known. I bought an HD 7970, loved it, OC'ed it, stuck an Accelero on it and kept it at max clocks, whisper quiet. I had no regrets 😀 And it took 2 years of botched/delayed drivers to finally make me say "no more."
 
You're correct on AMD, but incorrect on NV. The only reason why the 960/980 look bad is because of the other card: the 970.

You compare both to that card, implicitly, and as such both look bad. But that's a stupid comparison to make. If you look at the 980, the direct comparison should be the high-end GK104, which is the 780. What's the performance upgrade? We're talking at least 40% here at 1080p.

You'd know this if you'd paid attention.

The 970 is not just on par with the 780 - but significantly ahead. This is ahistorical and as such, the 980 looks a lot worse than it is. It's not worth the premium compared to the 970, but when you look at generational comparisons(980/780 or GM104/GK104), it's absolutely within the normal boundaries for a generational change.

I'm kind of surprised more people haven't understood just how radical the 970 is compared to previous x70 cards. The 770, 670 and 570 were all neatly following the same pattern. The 970 doesn't, in large part because NV wanted to finish off AMD, and that card is as good a reason as any as to why AMD has been dropping like a tank. They made an exceptionally good midrange GPU for a shockingly good price. It messes up not only AMD, but also NV's own internal balance, which had been held in check for GPU generation after GPU generation.

Lol. Nope.

780 is NOT on parity with 980. Are you kidding? A cut-down version (780) of a cut-down version (OG Titan) of the full chip (780 Ti/Titan Black) is not comparable to the full midrange chip, GM204. Which is undeniably the midrange Maxwell chip, because there is a full GM200 above it and a GM206 below it. Middle. Midrange. The 780 is the equivalent of the 560 Ti 448 in its stack (a twice cut-down version of the top-end xx0-codenamed chip). The 980 is the equivalent of the 560 Ti in its stack (the full version of the xx4-codenamed chip).

When was the last time x80 chips were based off the midrange GPU? Oh, right. Never before Kepler/Maxwell. You're wrong.

When was the last time an x60 chip was the lowest of the 3 GPUs in a family? Never before Kepler/Maxwell. Wrong again.

The 970 is the same as a 560. A cut-down version of the xx4-codenamed chip. It's a little less cut down than the 560 was vis-à-vis the 560 Ti, so that's nice and all, but it's no game changer. Recall the 560 Ti (equivalent in stack to the 980) was $250. So a 560 for $330 MSRP is not a good deal. It's a crap deal. You're being told that the x60 chip is actually an x80 chip and paying for the privilege. I understand costs go up with die size, but there was absolutely no need to change the marketing names and upgrade x60 chips to x80 for any other reason except to deceive consumers and cash in goodwill.

The 670 is nothing like the 570. Is that a joke? A once cut-down version of the top-end chip in the Fermi family (GF110) = 570. A once cut-down version of the midrange chip in the Kepler family (GK104) = 670. The 680 is not the high-end chip. It was based off of GK104. There is a GPU above and below it. By definition it is midrange. You can act like somehow the middle chip in the architecture family isn't the midrange chip, but it's pure rationalization and nonsense. The 670 and the 970 alike are both equivalents to the 560 (no Ti), not the 570. Except a lot more expensive than the 560 ever was.

It's absolutely hilarious you say "you're right, AMD did that" when it's completely and 100% transparently obvious that nVidia started the trend of renaming lower-end chips x60 to x80. Not that following the trend makes AMD any less culpable, but you can't just pick one as being blameless because it's your favorite team. Cheerleading isn't an argument.

Cool it with the condescending tone. Strange you'd take something directed against nVidia and AMD so personally.
 
You're correct on AMD, but incorrect on NV. The only reason why the 960/980 look bad is because of the other card: the 970.

You compare both to that card, implicitly, and as such both look bad. But that's a stupid comparison to make. If you look at the 980, the direct comparison should be the high-end GK104, which is the 780. What's the performance upgrade? We're talking at least 40% here at 1080p.

You'd know this if you'd paid attention.

The 970 is not just on par with the 780 - but significantly ahead. This is ahistorical and as such, the 980 looks a lot worse than it is. It's not worth the premium compared to the 970, but when you look at generational comparisons(980/780 or GM104/GK104), it's absolutely within the normal boundaries for a generational change.

I'm kind of surprised more people haven't understood just how radical the 970 is compared to previous x70 cards. The 770, 670 and 570 were all neatly following the same pattern. The 970 doesn't, in large part because NV wanted to finish off AMD, and that card is as good a reason as any as to why AMD has been dropping like a tank. They made an exceptionally good midrange GPU for a shockingly good price. It messes up not only AMD, but also NV's own internal balance, which had been held in check for GPU generation after GPU generation.

The 970 is garbage to me. The 960 and 980 are not bad just in comparison to it; they are bad because they lose to AMD cards and will lose worse later on. When the 980 launched it looked untouchable. Already it's trading blows with the 290X/390X. The Fury is faster too. The 960 and 980 only make sense when there is no competitor.

I don't know what's up with nVidia's launches, but the cards always look better than they really are. A 970 is typically slower than a 290/390 now. When it launched it looked competitive with the 290X. I think nVidia just has too much room to mess around with their drivers and the resulting performance. Another example is the 285 vs. the 960. People would have sworn the 960 was faster, now...

The 980 Ti overclocked might remain ahead, but I really would not be surprised if it does not. If we were only going to be stuck with DX11, maybe it would have, though PBR engines might have changed that.
 
GCN is very strong in the full PBR engines like CryEngine 3.4+ or the newest Frostbite. So this is not game-specific. The primary reason for this is the robust cache design and the memory bandwidth.

Well the other question you should ask yourself is.. what do you think more people will be playing? ProjectCARS? Or the FPS blockbuster of the year? 🙄
 
Looks to me like the 980 Ti Lightning is still way ahead. Sure, you could say it's not a fair comparison because it is overclocked, but the power consumption is still lower. So for all practical purposes it's not really overclocked.
 
Looks to me like the 980 Ti Lightning is still way ahead. Sure, you could say it's not a fair comparison because it is overclocked, but the power consumption is still lower. So for all practical purposes it's not really overclocked.
The 980 Ti is the best single GPU. It shouldn't be debated.
Yes, there are reasons to get a Fury X, and I can make a great list of them.

But simply put, the 980 Ti is the best all-around single GPU.
 
GTX 780 debut:
[relative performance chart, 1920×1080]


GTX 780Ti debut:
[relative performance chart, 1920×1080]


I draw your attention to the 680 vs 7970Ghz results and the 7970Ghz vs 780 (big lead for the 780).

Then compare to now:

[relative performance chart, 1920×1080]


The 770, which was 5% faster than the 680, is now well behind the 280X, which itself is slower than the 7970GHz.

Compare the 780 vs. the 280X (the 7970GHz is a bit faster, which would make the gap even smaller).

I must laugh at how badly NV cards age. This is really super crazy.
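For context on how charts like the ones above are built: a TechPowerUp-style "performance relative" number is (roughly) an average of per-game FPS normalized to one baseline card. A minimal sketch, where every FPS figure is entirely made up for illustration and the card names are just labels, not measurements:

```python
# TechPowerUp-style "performance relative" index: geometric mean of
# per-game FPS, normalized to a baseline card. All FPS numbers below
# are hypothetical placeholders, not real benchmark results.
from math import prod

def perf_relative(fps_by_game, baseline):
    """Return each card's performance as a percentage of the baseline card."""
    def geomean(xs):
        return prod(xs) ** (1.0 / len(xs))
    base = geomean(fps_by_game[baseline])
    return {card: round(100.0 * geomean(fps) / base, 1)
            for card, fps in fps_by_game.items()}

# Hypothetical 1080p FPS results across three games:
fps = {
    "GTX 770": [60.0, 45.0, 80.0],
    "R9 280X": [66.0, 50.0, 86.0],
    "GTX 780": [72.0, 54.0, 96.0],
}
print(perf_relative(fps, baseline="GTX 770"))
```

With an index like this, a card that "ages well" is simply one whose number keeps climbing against the same baseline as new games and drivers land, which is exactly what the debut-vs.-now comparisons above are showing.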
 
Looks to me like the 980 Ti Lightning is still way ahead. Sure, you could say it's not a fair comparison because it is overclocked, but the power consumption is still lower. So for all practical purposes it's not really overclocked.
Apples and Oranges.

The lightning version just got released. AMD is releasing the Fury X2 "soon".

I must laugh at how badly NV cards age. This is really super crazy.

The way I look at it is, do you want 10 fps now when it doesn't matter, or a couple years down the road when it will matter?
 
Apples and Oranges.

The lightning version just got released. AMD is releasing the Fury X2 "soon".



The way I look at it is, do you want 10 fps now when it doesn't matter, or a couple years down the road when it will matter?

I'd rather have the 10 FPS now, when it does matter, and upgrade in 2 years like most people do. You get a lot more bang for your buck this way, rather than overspending for what might happen in 2 years.
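The trade-off in that upgrade-cadence argument can be put into rough numbers. A back-of-the-envelope sketch, where every price and performance index is a made-up placeholder rather than a real quote or benchmark:

```python
# Dollars per point of relative performance for two upgrade strategies.
# All prices and performance indices are hypothetical placeholders.

def cost_per_perf(price, perf_index):
    """Dollars paid per point of relative performance."""
    return price / perf_index

# Strategy A: buy a flagship now and keep it 4 years.
flagship = cost_per_perf(price=650, perf_index=100)

# Strategy B: buy midrange now (say 80% of flagship performance),
# then another midrange in 2 years (assume it matches today's flagship).
midrange_now = cost_per_perf(price=330, perf_index=80)
midrange_later = cost_per_perf(price=330, perf_index=100)

print(f"flagship now:   {flagship:.2f} $/perf")
print(f"midrange now:   {midrange_now:.2f} $/perf")
print(f"midrange later: {midrange_later:.2f} $/perf")
```

Under these placeholder numbers the two midrange purchases cost about the same in total as one flagship, but the second purchase delivers its performance when games actually demand it, which is the bang-for-your-buck point being made above.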
 
Games removed:
Alien Isolation
Batman Origins
Bioshock Infinite
Dead Rising 3
Dragon Age
Metro LL
Project Cars
Tomb Raider
Wolfenstein: The New Order

Games added:
Mad Max
MGS V

All the games which were removed were ones where the Fury X was being beaten by a regular GTX 980 Ti. As I said, some people are really good at making claims.
 
Games removed:
Alien Isolation
Batman Origins
Bioshock Infinite
Dead Rising 3
Dragon Age
Metro LL
Project Cars
Tomb Raider
Wolfenstein: The New Order

Games added:
Mad Max
MGS V

All the games which were removed were ones where the Fury X was being beaten by a regular GTX 980 Ti. As I said, some people are really good at making claims.

Nah. Most of those are within 5% of each other except for Project Cars and Wolfenstein. Those 2 games should be removed because they're outliers. The others are whatever. Leaving them in wouldn't matter much. The GTX 980 TI is still the better card for most gamers at this moment.
 
You are so salty. When talking about drivers, and performance gains from drivers, what does OC have to do with anything?

You can't erase all of AMD's blemishes. I don't get why you try so hard to make it look like AMD's products are invincible to criticism.

EDIT: Let it be known. I bought an HD 7970, loved it, OC'ed it, stuck an Accelero on it and kept it at max clocks, whisper quiet. I had no regrets 😀 And it took 2 years of botched/delayed drivers to finally make me say "no more."

I'm salty? lol. Reread what you posted. You are basically telling me I'm so bad, and then you go on to talk about how awesome you are. Nothing else of substance was said. Oh, and we couldn't let the post go without "AMD drivers are suxor" thrown in for good measure.

I was just pointing out how people flip flop depending on their favorite brand. If you don't think that's the case then good for you.
 
GCN is very strong in the full PBR engines like CryEngine 3.4+ or the newest Frostbite. So this is not game-specific. The primary reason for this is the robust cache design and the memory bandwidth.

Nonsense! We were explicitly told that AMD being in the consoles would offer them no advantage and was no reason to purchase AMD. /sarc 🙂
 
I'm salty? lol. Reread what you posted. You are basically telling me I'm so bad, and then you go on to talk about how awesome you are. Nothing else of substance was said. Oh, and we couldn't let the post go without "AMD drivers are suxor" thrown in for good measure.

I was just pointing out how people flip flop depending on their favorite brand. If you don't think that's the case then good for you.

Hmmm. I don't see him saying how awesome he is.
His last sentence had no substance? He was showing you that, yes indeed, AMD is not invincible to criticism.

Your post here demonstrates that you do not see things with an unbiased eye, and it influences your posting heavily. Railven is/was a looooong time AMD proponent, and you had no words or difference of opinion with him until recently, when he started expressing exasperation over his AMD product.

I have my bias, certainly, but at least I try to see things from a neutral perspective. Railven and I always used to converse via PMs, and although our opinions did differ most of the time, we respected each other's opinions and didn't try to dismiss each other's thoughts before the sentences were even finished.

Start seeing things for what they are. It is what it is.
 