Fury Nano: biggest bang in the smallest package... FPS/inch, baby!

Page 5 - AnandTech Forums
Feb 19, 2009
10,457
10
76
Is it luck they got their GCN uarch into major game platforms? There's a very likely chance it's going to be in the Nintendo NX next year as well. So that's GCN in every major platform for game developers to optimize & design for. With DX12 being similar to Mantle, things can only look up for them.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
Is it luck they got their GCN uarch into major game platforms? There's a very likely chance it's going to be in the Nintendo NX next year as well. So that's GCN in every major platform for game developers to optimize & design for. With DX12 being similar to Mantle, things can only look up for them.

Ya... I'm going to say no. Things can't only look "up" for them; they can go a variety of ways. Even if AMD has a GREAT product next gen, it won't matter if it comes out at the wrong time. That's what usually happens to AMD: it releases products too late, or not at the right time, or with an easily fixable flaw that ruins them.

So I could easily see AMD getting destroyed next round of cards if they don't execute a proper business plan effectively.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
RS, I thought you once said chasing marketshare by lowering price and margin didn't work too well for them. Because regardless of how good a deal is, the majority still go with NV anyway.

Ya, that is true. I should note I was coming from a consumer's point of view. From AMD's point of view, I totally understand why they decided to raise prices. Lisa Su must have crunched the numbers and estimated that lower market share with higher profits per unit (in this case just trying to make $ vs. losses) is a better solution than the strategy used during the HD4000-R9 290 eras. Still, just as past management gunned after price/performance too hard, I think she raised prices so high that AMD's price/performance is sometimes worse than, or barely as good as, NV's.

AMD cannot do that this generation since they aren't winning any of the key metrics and especially dropped the ball with HDMI 2.0 and overclocking.

What makes you think NV users will switch and buy Fury if it was priced $100 less?

You are right, most of them wouldn't, but at $399 as an R9 290 replacement? They didn't buy AMD when it was free (aka bitcoin mining made $), or when the HD6950 unlocked, or when the HD7950/7970 overclocked and had 50% more VRAM than the 670/680, or when the HD5800 was 6 months early to launch, etc.

But NV currently has a very weak position in the $330-$600 segment. If AMD priced Fury at $399, it would be not much more expensive than the 970 but less than the 980.

Without taking overclocking into account, an after-market 980Ti is 25% faster than the after-market Fury at 1440P but costs 18% more. NV brings HDMI 2.0, 6GB of VRAM, PhysX, and higher resale value as bonuses. In that context, Fury at $550 sits in no-man's land. It's too expensive, and it overclocks poorly too, which makes it even worse.
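Working that comparison out as perf-per-dollar makes the point concrete. A minimal sketch, using the post's 25% performance lead and 18% price gap (the $550/$650 prices match that gap; the baseline performance index of 100 is an arbitrary reference):

```python
# Sanity-checking the price/performance claim above. Prices and the
# 25% performance delta come from the post; the baseline index of 100
# is an arbitrary reference point, not a benchmark result.

def perf_per_dollar(perf_index, price_usd):
    """Performance points bought per dollar spent."""
    return perf_index / price_usd

fury = perf_per_dollar(100, 550)   # after-market Fury at $550
ti = perf_per_dollar(125, 650)     # after-market 980 Ti: 25% faster, ~18% dearer

print(f"Fury:   {fury:.4f} perf/$")              # 0.1818
print(f"980 Ti: {ti:.4f} perf/$")                # 0.1923
print(f"980 Ti advantage: {ti / fury - 1:.1%}")  # 5.8%
```

So under those numbers the more expensive card is also the better raw perf/$ buy, which is what puts the $550 Fury in "no-man's land."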

Think about it: I might have bought dual Furys for $400 a pop, but now I am definitely skipping them or this entire generation. I don't want to pay $1300 for two 28nm cards, but buying Furys makes no sense when the 980Ti is not much more expensive but way faster. In other words, neither NV nor AMD is getting my $ on those high-end cards. I am sure a lot of consumers feel the same and are just going to buy an R9 290/290X/970/390 stop-gap and upgrade next round, OR go right away to a 980Ti.

The R290X custom models were ~10% behind the 980 at ~half the price. NV users still bought 980s. It was faster than the 970, ran cool & quiet, and was often found for cheaper than the 970, and NV users still bought the 970.

Not to mention the custom R290s, which were significantly cheaper and offered 95% of the R290X's performance.

No need to bring that up. Even if R9 290 was $199 when GTX980 was $550, most of the same people who bought 980s or defended them would still buy a 980 over a $200 after-market R9 290X. :cool:

I did not think the 390/X would sell well at its jacked-up price, but I was wrong there. It's selling very well per the etailers I've talked to.

Perception has changed. The 390/X are quiet, cool running, and offer very competitive performance for less. There's no stigma even though they are power hungry; since they don't run hot or loud, it suggests people hated on the R290/X not because of power consumption alone, but because of the combination of hot, hungry, and loud.

Except after-market R9 290/290X cards did not run hot, and they weren't loud if you did your research as a consumer. It just goes to show the level of knowledge of most PC gamers building PCs. But that same reason is why, if NV's and AMD's products cost similarly - or worse, if AMD has inferior price/performance - the average gamer is even more likely to buy NV.

I can't predict market share for Q3-4 2015 but I am going to estimate that NV will keep taking it because AMD hasn't done enough to reverse it. I'd be shocked if AMD gains a lot of market share in Q3-4 2015.

Fury/X are sold out as well. There's no need to lower the price. AMD thinks there's a premium ($) to be had for water cooling. It seems the market agrees, else nobody would touch the Fury X when custom 980Tis are better at 1440p & below.

I see your point but a lot of Fury/Fury X cards are now in stock. It sounds like it was mostly a supply issue.

From day 1, it was relatively easy to find Fury/Fury X in Canada. But because of our Canadian currency, look at the current prices on Newegg.ca:

$550 US Fury price became $790 CDN + tax = $893
$650 US Fury X is $830 CDN + tax = $938
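Those totals work out to a roughly 13% sales tax on top of the CDN sticker price. A minimal sketch reproducing the figures above (the 13% rate is an assumption that matches the quoted totals, e.g. Ontario HST; the actual rate varies by province):

```python
# Reproducing the Canadian pricing math quoted above. The 13% tax
# rate is assumed (it matches the post's totals); only the CDN
# sticker prices come from the post.

TAX = 0.13

def with_tax(cad_price):
    """Newegg.ca sticker price plus sales tax, rounded to the dollar."""
    return round(cad_price * (1 + TAX))

print(with_tax(790))  # 893 -> Fury
print(with_tax(830))  # 938 -> Fury X
```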

Getting a pair of those as a reasonable upgrade from GTX680 SLI / HD7970Ghz CF means spending $1600+ CDN. Ya, no way, considering most PC games are console ports where turning down a few settings can still allow one to survive to next-gen GPUs (imo). I think you feel much the same as I do with your Australian prices.

They don't need to lower the price until they can't sell them fast enough.

That's just temporary. Once supply can keep up with demand and one can readily purchase a Fury for $550 and Fury X for $650, those cards will have trouble moving against a $480 after-market 980 and $650 980Ti.

Nano would be a killer at $399, but that's completely unrealistic. AMD wants a premium for the form factor, with that level of performance on offer at an excellent perf/W level. I think they will be right; the market will reward them because the Nano would make for a killer mITX setup.

It can't be too expensive. A lot of small cases can fit a reference 980Ti, and if the Nano is $550 or something similar, you might as well get the Fury X and have the flexibility of top performance or lower performance via PowerTune. I think the Nano has to be below $500 to make sense. I sound like a broken record, but even then it's a niche scenario: buying a $500 mini-ITX card for a case that can't fit a 980Ti, or for someone who doesn't want the full-blown Fury X in the same case.

Completely agree. That's why I'm confused when Russian jumps on that soapbox.

See your point below:

This round, with there being literally no NV premium on the top cards (Fury X vs 980 Ti), I think AMD shot itself in the foot.

When AMD had prices that were too low, NV loyalists weren't purchasing those cards, but at least objective gamers who don't care about brand but care about price/performance did. Now, NV loyalists still do not buy AMD, while objective gamers have realized that AMD actually offers inferior value to NV (plus worse overclocking). The end result is that AMD on paper is in a worse situation now, because there is practically no reason at all to buy Fury or Fury X for someone with a mid-size case, unless going CrossFire, where it can give 980Ti SLI a run for its money. But since stats show the SLI/CF market is only 300,000 GPUs, this is a tiny fraction of the overall high-end PC gaming market Fury/Fury X cater to.

On the lower side, some AMD users holding on to HD 7Ks probably upgraded seeing AMD finally deliver a viable product.

Why would they not just pay $100 more for the 980Ti over the Fury? And the Fury X over an after-market 980Ti doesn't make a lot of sense considering it's worse in nearly every way imaginable. As you noted, AMD now has no chance of winning NV customers, but as I noted, objective gamers will also have almost no choice but to step up to the 980Ti, as $100 less for Fury isn't incentive enough to lose 25% guaranteed after-market performance (+10% overclock on top with the 980Ti).

I'm sure most NV users already upgraded to GTX 970/980 and aren't swapping sides.

We know that. :D

The R300 series is just way too late in my opinion. If a few tweaks were all it took to instantly reverse the negative stigma of the R290, they should have done this months ago.

They should have split the launch into 2 parts --> notice how so many reviewers gave positive reviews to the R9 390/390X but aren't so hot on Fury/Fury X. AMD should have launched the R9 390/390X January 1, 2015. Someone who manages supply/inventory of R9 290/290X messed up badly. Still, it would have been better to introduce the R9 390 at $330 and R9 390X at $430 and just keep the R9 290/290X at $250/300 as they have it now. The negative stigma of the 290 series would have meant that even if the R9 290/290X sold side-by-side with the R9 390/390X, it wouldn't have mattered.

So it could be argued that one or two years from now, the Fury may be faster than a 980Ti, especially if nVidia continues to neglect older architectures as they have.

Realistically speaking, the performance difference between Fury and 980Ti is too great for this to happen.

33% at 1080P

30% at 1440P

6GB VRAM as a bonus. To make matters worse, 30-33% more performance and 50% more VRAM is achieved with just 38W more power.
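Even granting the 38W delta, the perf/watt math still favors the 980Ti for any plausible baseline. A hedged sketch: only the +38W delta and the 30-33% performance lead come from the post; the ~246W Fury baseline draw is an assumed figure for illustration:

```python
# What "30-33% faster for only 38W more" implies for perf/W.
# FURY_WATTS is an assumed gaming draw (not from the post); the +38W
# delta and the performance leads are the post's figures.

FURY_WATTS = 246.0          # assumed baseline, for illustration only
TI_WATTS = FURY_WATTS + 38  # per the post's 38W delta

for perf_lead in (1.30, 1.33):  # 1440P and 1080P leads from the post
    perf_per_watt_ratio = perf_lead / (TI_WATTS / FURY_WATTS)
    print(f"{perf_lead - 1:.0%} lead -> 980 Ti perf/W is "
          f"{perf_per_watt_ratio - 1:.0%} better")
```

With that assumed baseline the 980Ti comes out roughly 13-15% ahead on perf/W, and the lower the assumed Fury draw, the larger the 38W delta looms, so the direction of the conclusion doesn't depend on the exact baseline.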

Plus now there is a free MGS: Phantom Pain thrown in. AMD is hopeless this generation without major price drops + a game bundle. Even at $500 the Fury is too expensive in this context unless it comes with Star Wars Battlefront.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Is it luck they got their GCN uarch into major game platforms? There's a very likely chance it's going to be in the Nintendo NX next year as well. So that's GCN in every major platform for game developers to optimize & design for. With DX12 being similar to Mantle, things can only look up for them.

GCN could end up in the NX but it's likely going to be some lower end 512-1280 shader part, not Nano or anything like that.

Nintendo's executives continue to emphasize uniqueness for their console's design goals:

"There is no doubt the NX will also bring a radical change to gaming. Shinya Takahashi, their Director of Software Planning & Development, came right out with it:

For us, the next step is to think about what is going to be that element that is really going to catch the attention of a large number of players again and get them excited. We're constantly thinking about this idea from the perspective of the players and the needs of the players in terms of what we can do with our ability and our technology to capture that excitement and passion.

When he says they’re trying to grab “the attention of a large number of players again,” think: Wii. When the Wii exploded on to the scene, everybody wanted one. "

http://jinjabobot.com/nintendo-nx-2016/

I think if Nintendo is going to invest $ into hardware, it's not going to be into a high-end APU/GPU/CPU but into controls, a screen, or some other features. Alternatively, they might try to hit more affordable price points such as $249-299. I don't see HBM or Fury or Nano in the NX, as it's probably too expensive to realize.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Realistically speaking, the performance difference between Fury and 980Ti is too great for this to happen.

33% at 1080P

30% at 1440P

6GB VRAM as a bonus. To make matters worse, 30-33% more performance and 50% more VRAM is achieved with just 38W more power.

Plus now there is a free MGS: Phantom Pain thrown in. AMD is hopeless this generation without major price drops + a game bundle. Even at $500 the Fury is too expensive in this context unless it comes with Star Wars Battlefront.

Yeah, but that's showing a super-clocked, highly binned card. That graph shows a stock 980Ti, and it's only a few percent ahead.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Is it luck they got their GCN uarch into major game platforms? There's a very likely chance it's going to be in the Nintendo NX next year as well. So that's GCN in every major platform for game developers to optimize & design for. With DX12 being similar to Mantle, things can only look up for them.

Must be luck. No way they could have planned it this way. All the game consoles, low-level APIs, etc... were just luck. Looks like, with the number of monitors coming along supporting it, they've gotten lucky with FreeSync too. Oh, and HBM ~12 mos ahead of the competition? More luck. /sarc ;)
 

MeldarthX

Golden Member
May 8, 2010
1,026
0
76
RS, we might not have seen the image problem for aftermarket 290/290X cards, but Joe Public did, through the smear campaign that was done on them. That's what it was; the launch 290X image is what they saw.

The 390/390X doesn't have that - the most recommended card right now in the 390/970 price range is the 390, especially the MSI card, over at OCUK and several other forums. Some have tried to downplay it and say the 300 series is a failed launch, but it's not.

The 390 is faster than the 970 on average and just a hair faster than the 290X - then add in the OC headroom; it will OC higher than the 290X on core and memory. The 390X is being recommended over the 980 and Fury, as it's just as fast as the 980 but cheaper.

Furies are selling out. Do I think Fury needs a price drop? Oh yeah - put Fury at $459-499 and Fury X at $550-599 and you have seriously solid cards at the right price. But because of the supply constraints, I don't think AMD is worried about dropping the prices yet.

Once supply settles I do see the prices coming down. We know there is performance being left on the table with Fury - it's still not being fed properly by the drivers. Once that's sorted, I don't think there will be a gap between the aftermarket Tis and the Xs.
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,476
136
AMD is in a hopeless situation. They have been shooting themselves in the foot by completely ignoring perf/watt for more than 3 years. AMD knew the minute the GTX 680 launched that they were behind in perf/watt, and what have they done in the past 3.5 years? Nothing, zilch, nada. The R9 290X did not bring any major power efficiency improvements compared to the ref HD 7970.

http://www.techpowerup.com/reviews/Sapphire/R9_290X_Tri-X_OC/25.html

AMD did the bare minimum in power efficiency improvements from R9 290X to Fury X: just enough to fit a 40% bigger chip (596 vs 432 sq mm) with 40% more transistors (8.9 billion vs 6.3 billion) within a similar 275W TDP, given that they also gained power efficiency from transitioning to HBM. Fundamentally, AMD needs a ground-up redesign of GCN for vastly improved perf/watt, perf/sq mm and perf/transistor.
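The two "roughly 40%" figures check out against the numbers quoted, and they also imply transistor density barely moved, which is what you'd expect with both chips on the same 28nm node. A quick check using only the die sizes and transistor counts from the post:

```python
# Checking the scaling figures in the post: Fiji vs Hawaii die area
# and transistor counts as quoted (596 vs 432 sq mm, 8.9B vs 6.3B),
# plus the implied change in transistor density.

fiji_mm2, hawaii_mm2 = 596, 432
fiji_xtors, hawaii_xtors = 8.9e9, 6.3e9

print(f"die area:    +{fiji_mm2 / hawaii_mm2 - 1:.0%}")      # +38%
print(f"transistors: +{fiji_xtors / hawaii_xtors - 1:.0%}")  # +41%

density_gain = (fiji_xtors / fiji_mm2) / (hawaii_xtors / hawaii_mm2)
print(f"density:     +{density_gain - 1:.1%}")               # +2.4%
```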

Once Kepler launched, AMD's notebook discrete GPU market share started falling rapidly, and with Maxwell's launch Nvidia is now the only GPU vendor in the high-end notebook GPU space. The GTX 680M based on GK104 was easily faster than the HD 7970M based on Pitcairn, and the R9 M295X based on Tonga was just a pathetic product compared to the GTX 980M based on the impressive GM204. There is not a single high-end gaming notebook vendor who even bothers to sport a high-end AMD notebook GPU, because fundamentally AMD is completely outclassed in performance and power efficiency by embarrassing margins. The gap in notebook GPU performance and efficiency now resembles AMD's gap in the CPU market against Intel. This is a company which has been cutting R&D relentlessly to account for its lower revenues, with that wretched Bulldozer killing the company's products - CPUs and APUs. Meanwhile, Intel has got really aggressive even at the low end, with Bay Trail also taking market share from AMD in a segment which they dominated in the Brazos days.

Right now AMD has to just live to fight another day and try to get by till Zen and 14nm FinFET GPUs. IMO, if those products are not competitive then it's game over. Just wind up and sell yourself to whoever is willing to buy you for whatever peanuts your stock is worth. I am not very optimistic about their future products either, given how much R&D has fallen - they now spend less than Nvidia. For a company designing high-performance x86 CPU and GPU architectures, that's pathetic. RIP AMD. :mad:
 
Last edited:

railven

Diamond Member
Mar 25, 2010
6,604
561
126
AMD is in a hopeless situation. They have been shooting themselves in the foot by completely ignoring perf/watt for more than 3 years. AMD knew the minute the GTX 680 launched that they were behind in perf/watt, and what have they done in the past 3.5 years? Nothing, zilch, nada. The R9 290X did not bring any major power efficiency improvements compared to the ref HD 7970.

http://www.techpowerup.com/reviews/Sapphire/R9_290X_Tri-X_OC/25.html

AMD did the bare minimum in power efficiency improvements from R9 290X to Fury X: just enough to fit a 40% bigger chip (596 vs 432 sq mm) with 40% more transistors (8.9 billion vs 6.3 billion) within a similar 275W TDP, given that they also gained power efficiency from transitioning to HBM. Fundamentally, AMD needs a ground-up redesign of GCN for vastly improved perf/watt, perf/sq mm and perf/transistor.

Once Kepler launched, AMD's notebook discrete GPU market share started falling rapidly, and with Maxwell's launch Nvidia is now the only GPU vendor in the high-end notebook GPU space. The GTX 680M based on GK104 was easily faster than the HD 7970M based on Pitcairn, and the R9 M295X based on Tonga was just a pathetic product compared to the GTX 980M based on the impressive GM204. There is not a single high-end gaming notebook vendor who even bothers to sport a high-end AMD notebook GPU, because fundamentally AMD is completely outclassed in performance and power efficiency by embarrassing margins. The gap in notebook GPU performance and efficiency now resembles AMD's gap in the CPU market against Intel. This is a company which has been cutting R&D relentlessly to account for its lower revenues, with that wretched Bulldozer killing the company's products - CPUs and APUs.

Right now AMD has to just live to fight another day and try to get by till Zen and 14nm FinFET GPUs. IMO, if those products are not competitive then it's game over. Just wind up and sell yourself to whoever is willing to buy you for whatever peanuts your stock is worth. I am not very optimistic about their future products either, given how much R&D has fallen - they now spend less than Nvidia. For a company designing high-performance x86 CPU and GPU architectures, that's pathetic. RIP AMD. :mad:

At least they got that sweet sweet APU market to fall back on. :|

I'm interested to see what AMD does with the follow-up to GCN. I wonder if they can spin out of it or if it will just be an evolution of GCN. Getting into all the consoles at least got them a steady flow of cash.
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,476
136
At least they got that sweet sweet APU market to fall back on. :|

I'm interested to see what AMD does with the follow-up to GCN. I wonder if they can spin out of it or if it will just be an evolution of GCN. Getting into all the consoles at least got them a steady flow of cash.

AMD's APUs are right now stuck with a really bad CPU architecture and a lack of bandwidth to unleash the GPUs' full potential. No wonder their APUs are not selling well either. The only thing keeping the company afloat is the consoles. Nintendo's next-gen console too is most probably GCN based. So AMD needs a new architecture which utilizes its GCN legacy but with a maniacal, single-minded focus on efficiency - perf/watt, perf/transistor and perf/sq mm. In the past AMD used to at least do well in one of those metrics, but now AMD has lost on all 3 - GTX 980 Ti vs Fury X. This is as bad as it can get; AMD is thoroughly and hopelessly outclassed in terms of architecture. What they need is a competitive GPU architecture at 16/14nm FinFET and a reasonably competitive 14nm Zen (even if Zen only gets to Haswell IPC and clocks to 4.2 GHz max, it would be a success given how pathetic they are now). That would lay the foundation for Zen-based APUs with HBM to provide unbeatable perf/watt for total compute (CPU+GPU combined) - efficiency which is not possible with a CPU + dGPU. Efficiency and battery life matter in notebooks and should finally make their APUs attractive.
 
Last edited:

sm625

Diamond Member
May 6, 2011
8,172
137
106
I'm going to laugh when a GTX980 is tuned to exactly match the Nano's power consumption, and ends up beating it by 15%.
 
Feb 19, 2009
10,457
10
76
I'm going to laugh when a GTX980 is tuned to exactly match the Nano's power consumption, and ends up beating it by 15%.

Isn't the 980 already using 150-180W in games?

If AMD says the Nano TDP is 175W, it should be a bit less than that under gaming load.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
AMD is in a hopeless situation. They have been shooting themselves in the foot by completely ignoring perf/watt for more than 3 years. AMD knew the minute the GTX 680 launched that they were behind in perf/watt, and what have they done in the past 3.5 years? Nothing, zilch, nada. The R9 290X did not bring any major power efficiency improvements compared to the ref HD 7970.

http://www.techpowerup.com/reviews/Sapphire/R9_290X_Tri-X_OC/25.html

AMD did the bare minimum in power efficiency improvements from R9 290X to Fury X: just enough to fit a 40% bigger chip (596 vs 432 sq mm) with 40% more transistors (8.9 billion vs 6.3 billion) within a similar 275W TDP, given that they also gained power efficiency from transitioning to HBM. Fundamentally, AMD needs a ground-up redesign of GCN for vastly improved perf/watt, perf/sq mm and perf/transistor.

Are you sure they are ignoring perf/watt? Perhaps the problem is they are having to choose clocks and voltages higher than they'd like because they are struggling to keep up on performance.

It's easy to have perf/watt if your designs are just faster.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
For the last 9 months, by far the vast majority of PC gamers did buy the GTX 980 over the R9 290X - for what, 21% more performance at 80-90% higher cost? Talk about a horrible value, but most of this forum hardly endorsed the 290X. In that context, your sentence of "33% higher price for 12% more performance is a horrible deal" suddenly seems like a smoking deal, but I guess we are talking about AMD cards, not AMD vs. NV, right? :)

Please stop being purposefully naive. The GTX 980 was the fastest card in the market (before Titan X) and had the best metrics hands down compared to any other card. Fastest chip. Best efficiency. Legendary overclocking performance. Great thermals. The R9 290x had none of that and was permanently damaged by poor release reviews. All of that said, yes I agree the GTX 980 was overpriced. But the fastest card available almost always commands a price premium. You know this. All of the other perks added to what came out to be really, really favorable launch reviews for the 980 which allowed it to ride the $550 price point until GTX 980 Ti came along. The crown GPU often commands crown pricing, despite your constant objections and dismay.

Nothing about Fury Nano screams perf/$. It's full Fiji. It's an ultra-small form factor. It's specially binned. Fury X already doesn't have great value vs. the competition. Bringing highly binned Fiji to the market at too low of a price will simply kill Fury X (and Fury vanilla) sales. There is no logical reason to price Fury Nano at a fraction of Fury X's price based on all of that, unless Fury X is imminently getting a price cut from its $650 MSRP.
 
Last edited:

PPB

Golden Member
Jul 5, 2013
1,118
168
106
The problem is always the same. AMD designs their products for scenarios that really come to fruition 5 years later than expected. The compute capabilities will only be fully productive for gamers with DX12, so just as the FX-8xxx Piledriver chips will become relevant, the 290/280 series will too - but by then you will already be on a Pascal or GCN 1.3++ product. An awfully sweet deal for their earlier customers, but another nail in the coffin for AMD if those people lengthen their upgrade cycle because of it - unless DX12 game devs eat all the performance gained on the usual IQ gimmickry, or it allows them to become even lazier (I can see games swollen with a ton of draw calls that could have been easily saved by instancing and engine tricks to lower the draw call count; DX12 can easily become a nightmare for the ones running CPUs without much grunt).
 

Eymar

Golden Member
Aug 30, 2001
1,646
14
91
Nothing about Fury Nano screams perf/$. It's full Fiji. It's an ultra-small form factor. It's specially binned. Fury X already doesn't have great value vs. the competition. Bringing highly binned Fiji to the market at too low of a price will simply kill Fury X (and Fury vanilla) sales. There is no logical reason to price Fury Nano at a fraction of Fury X's price based on all of that, unless Fury X is imminently getting a price cut from its $650 MSRP.

Is it confirmed that Nano cards are higher-binned Fiji? I would assume all fully functional Fiji chips can run at Nano speeds and power requirements, and only a limited subset can run at Fury X speeds (the main reason for limited OC potential), but if it's the other way around then Nano being higher binned makes sense.

Fury X is AMD's current halo card, so AMD is assuming they'll sell out no matter the cost of the cards below it. Fury X should be limited numbers like Titan X. I always thought the Titan models were made so the next-tier card has much better value in comparison, and AMD wants to follow that play. I mean, would the 980Ti really be considered a great value by as big a crowd if Titan X didn't exist? The problem AMD has is that Fury X couldn't be priced higher like Titan X, so Nano will be much closer in price to Fury X and won't get that nice value comparison.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
The problem is always the same. AMD designs their products for scenarios that really come to fruition 5 years later than expected. The compute capabilities will only be fully productive for gamers with DX12, so just as the FX-8xxx Piledriver chips will become relevant, the 290/280 series will too - but by then you will already be on a Pascal or GCN 1.3++ product. An awfully sweet deal for their earlier customers, but another nail in the coffin for AMD if those people lengthen their upgrade cycle because of it - unless DX12 game devs eat all the performance gained on the usual IQ gimmickry, or it allows them to become even lazier (I can see games swollen with a ton of draw calls that could have been easily saved by instancing and engine tricks to lower the draw call count; DX12 can easily become a nightmare for the ones running CPUs without much grunt).

AMD is like a company that would make screwdrivers before screws exist. They focus too much on the future, while other companies focus on the NOW. AMD focused on IGP SO early it was ridiculous. Now Intel is focusing on it when the CPU performance, perf/watt, and absolute wattage are already there, and it's FAR more beneficial for them. It's about having the right idea at the right time, which AMD doesn't seem to have, usually.

Like, imagine if instead of the Fury X launching as the flagship card with a WCE, the 290X had launched as the flagship with a WCE, and the stock coolers on the 290 and lower cards were up to the standards of AIB coolers (just imagine AMD did the 300 release as the 200 series).
The WHOLE perception of AMD would be different today.

Instead, they launched a subpar reference product, got destroyed in reviews, never recovered, and launched Fury X with a WCE when in reality it BARELY (if ever) benefits from it. If AMD had started that with the 290X, they'd be in a FAR better situation.

Give Nvidia AMD's situation over the last 3 years, and Nvidia would have killed it. They would have spun the 290X as "Hot, but that's what you need to be the fastest." They would have done what it takes. AMD just doesn't seem to have that same business savvy.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
Is it confirmed that Nano cards are higher binned Fiji? I would assume all fully functional Fiji cards will run at Nano speed and power requirements and a limited subset can run at Fury X speeds (main reason for limited oc potential), but if it's the other way around then Nano being higher binned makes sense.

Fury X is AMD's current halo card so AMD is assuming that they'll sell out no matter the cost of cards below it. Fury X should be limited numbers like Titan X. I always thought Titan models was made so the next tier card has much better value in comparison and AMD wants to follow that play. I mean would 980ti really be considered a great value to as big a crowd if Titan X doesn't exist? The problem AMD has is Fury X couldn't be priced higher like Titan X so Nano will be much closer in price to Fury X and won't get that nice value comparison.

Titan exists in a world of its own to most people; it's not in the realm of being fathomable. So I still think the GTX 980Ti would be a GREAT value even without the Titan X. Just look at ANY aftermarket review of the GTX 980Ti... they speak for themselves. The card is a beast, it OCs like a beast - it shouldn't even have to be explained.
 

Rvenger

Elite Member <br> Super Moderator <br> Video Cards
Apr 6, 2004
6,283
5
81
I'm going to laugh when a GTX980 is tuned to exactly match the Nano's power consumption, and ends up beating it by 15%.

You won't be laughing if you continue to derail the thread with your trolling.

-Rvenger
 

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
Must be luck. No way they could have planned it this way. All the game consoles, low-level APIs, etc... were just luck. Looks like, with the number of monitors coming along supporting it, they've gotten lucky with FreeSync too. Oh, and HBM ~12 mos ahead of the competition? More luck. /sarc ;)

Good thing NV doesn't need any luck or the above, as they are doing just fine without them!
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Is it confirmed that Nano cards are higher binned Fiji? I would assume all fully functional Fiji cards will run at Nano speed and power requirements and a limited subset can run at Fury X speeds (main reason for limited oc potential), but if it's the other way around then Nano being higher binned makes sense.

I honestly do not know, but I am under the impression that higher-leakage chips generally run at higher clock speeds at a sacrifice to efficiency.

Fury X is AMD's current halo card so AMD is assuming that they'll sell out no matter the cost of cards below it. Fury X should be limited numbers like Titan X. I always thought Titan models was made so the next tier card has much better value in comparison and AMD wants to follow that play. I mean would 980ti really be considered a great value to as big a crowd if Titan X doesn't exist? The problem AMD has is Fury X couldn't be priced higher like Titan X so Nano will be much closer in price to Fury X and won't get that nice value comparison.

The Titan over-pricing model only makes sense if you can sell the next model down without regard to competition. Nvidia did that with the GTX 780, but hasn't been able to replicate that scenario since, because they've had competition very close by with the 780 Ti and with the 980 Ti. If AMD is trying to over-price Fury X, they've succeeded at their own expense. It IS overpriced in comparison to not only its own product stack, but its nearest competitor. It's likely more expensive to produce than GM200, yet its average selling price is several hundred dollars lower. You say it is selling out, which is easy to confirm, but nobody knows in what volume.

But when you say "[Fury X] will sell out no matter the cost of cards below it" - assuming you're right and Nanos are fully functional Fiji dies - AMD would be further shooting themselves in the foot by selling possible Fury X chips as cheaper Fury Nano chips.
 
Last edited:

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Good thing NV doesnt need any luck or the above as they are doing just fine without them!

I never claimed nVidia needed any luck. Saying everything positive about AMD is luck and not by design, though, simply isn't correct. That was my point.