[VC] AMD Fiji XT spotted at Zauba


AnandThenMan

Diamond Member
Nov 11, 2004
3,991
627
126
That's a rather strong statement.

And absolutely true.
It should be pretty easy to market something that is easily the best, right?
The actual quality and value of the product is not nearly as important as how it is marketed. People PAY to wear Coca-Cola T-shirts. Think about that for a moment: people pay the company and market the product for them.
Look at regular market leaders like Intel and NV. They both followed up pretty terrible products like the P4E and GeForce FX with great innovations like Core and the 6800/7800/8800 soon after.
Critical difference here. When AMD comes out with a 2nd best product they pay for it dearly. When Intel came out with the P4 they made sure that AMD was not able to capitalize on their incompetence. There is a critical mass point that a product and company reach and after that it is very difficult to undo. Intel knew this full well which is why they were extremely aggressive in paying off OEMs not to sell AMD products.
 
Last edited:

godihatework

Member
Apr 4, 2005
96
17
71
Wouldn't an advantage in HBM be a decrease in board size and complexity due to a reduced need for tracing with the memory stacked on die?

I would think that would be a factor for everyone who wants big performance in smaller spaces.

EDIT

Or am I completely misinterpreting this?
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Exar333, your post proves that NV's brand name trumps all. X800/850 series was better than 6800 series, X1900 series was miles better than 7800/7900 series in DX10 games. 1 year after 7900GTX launched, 8600GTS started to outperform it in modern titles, while X1950XTX kept on trucking. In both of those generations ATI also had superior AA and Tri-linear texture filtering IQ. NV needed about 1 extra step of AA to match ATI's. But going back to FX5000 days, some people on our boards still bought those cards and they were just awful compared to 9700/9800 series.

I don't get you --- instead of desiring more efficiency -- you desire reference water cooling! If a gamer desires water cooling, AIB's and third parties can offer you this choice.

Simple. Efficiency gets me a 50% faster GM200 at 250W that still runs hot and loud, relatively speaking. Water gets me a 50% faster GM200 and 10-20% overclocking headroom, while still maintaining low temperatures and awesome noise levels. Secondly, when running dual cards as I do, I will have the heat exhausted out of the case. Right now I have to choose between an inferior blower from NV/AMD and quiet open-air after-market cards. That forces me to use a full tower to exhaust the heat. With water I could move to MicroATX and still use 2 powerful uber-overclocked flagship cards in CF/SLI.

You seem to have an either-or approach to this. I don't. I view water cooling as a far superior cooling solution on top of an already more efficient next-gen design. All things being equal, GM200 on water will crush GM200 on air. If I am asked to pay $500-800 for such a card, I will pay $50 more for a warrantied WC solution that blows NV's Titan blower out of the water (hehe).
 
Last edited:

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Have video cards ever been bandwidth limited? 640GB/s... that is an awful lot. I wonder what kind of performance HBM brings to the table. It's going to get very interesting if the rumors are true, because AMD will have a year's head start with this technology.

Not only that but 28nm vs 20nm (if the rumours are correct). I like how ATi's tradition (getting a head start on HQ AF, GDDR5 etc) is still alive at times with their inherent advantage in process tech/XDMA and HBM.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Simple. Efficiency gets me a 50% faster GM200 at 250W that still runs hot and loud, relatively speaking. Water gets me a 50% faster GM200 and 10-20% overclocking headroom, while still maintaining low temperatures and awesome noise levels. Secondly, when running dual cards as I do, I will have the heat exhausted out of the case. Right now I have to choose between an inferior blower from NV/AMD and quiet open-air after-market cards. That forces me to use a full tower to exhaust the heat. With water I could move to MicroATX and still use 2 powerful uber-overclocked flagship cards in CF/SLI.

You seem to have an either-or approach to this. I don't. I view water cooling as a far superior cooling solution on top of an already more efficient next-gen design. All things being equal, GM200 on water will crush GM200 on air. If I am asked to pay $500-800 for such a card, I will pay $50 more for a warrantied WC solution that blows NV's Titan blower out of the water (hehe).

In a strict business sense, having Asetek supply AIO coolers as the reference cooling for these cards is a failure. Think about the margins. Every cent counts when dealing with large volumes, and I have to agree with Sirpauly here. You want to design a GPU that measures great not only in absolute performance but also in power efficiency, because this affects what kind of cooling your product needs (which in turn affects power requirements/VRM design/noise performance).

nVIDIA has already shown us that the blower design can be greatly improved upon. Compared to old designs, the Titan cooler is perhaps the best blower design ever, and many would agree with me. They could even opt to put a GTX 690-style cooler on single-GPU SKUs. At the end of the day, they end up controlling BOM costs with the reference cooler, which leaves the AIBs free to go ahead and make better coolers at will.

It's a bit worrying that AMD is having to use an AIO WC from another company, because it tells us that the particular GPU requires hefty cooling (and they are once again neglecting reference designs). AIO WCs also bring a whole set of disadvantages, ranging from a more involved installation procedure instead of simple plug and play, another set of potential failure points, compatibility issues with a whole slew of cases, and varying cooling performance depending on setup (intake/exhaust, number of fans, fan type).
 

DeathReborn

Platinum Member
Oct 11, 2005
2,786
789
136
Exar333, your post proves that NV's brand name trumps all. X800/850 series was better than 6800 series, X1900 series was miles better than 7800/7900 series in DX10 games. 1 year after 7900GTX launched, 8600GTS started to outperform it in modern titles, while X1950XTX kept on trucking. In both of those generations ATI also had superior AA and Tri-linear texture filtering IQ. NV needed about 1 extra step of AA to match ATI's. But going back to FX5000 days, some people on our boards still bought those cards and they were just awful compared to 9700/9800 series.

Couple of things.

Radeon X800/850 was faster, but the PE edition was virtually nonexistent & it lacked Shader Model 3.0 support.
GeForce 7800/7900 & Radeon X1900 didn't support DX10 at all; the Radeons were faster in the DX9 path, but neither provided even 1 fps in DX10.

I was one of those that bought a few FX5200's, though only for office machines. But I did use soft-modded Radeons for gaming in that era.

Other than that, good post. :)
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
^ I bet any money it costs LESS to put water cooling on a flagship card than the $3-4 billion it costs to design a GPU architecture with 2X perf/watt. SirPauly lives in the clouds, where Company A magically competes with Company B as if they have the same resources. He either pretends that's the case or doesn't see the financial reality in which AMD and NV operate.

AMD's engineers are super smart because they are given way less to work with than Intel's and NV's engineers. The fact that NV didn't blow AMD out of the water in the last 8 years is stunning. NV is primarily a graphics card company while AMD has other businesses to split R&D and workforce capital across. If going WC+HBM is a more cost effective way right now, that's the right approach. Again, all things being equal, I would take WC Maxwell all day over any blower.

You say the Titan blower is good, but it's still mediocre if you want to overclock a 250-275W GPU, compared to, say, the Gigabyte WindForce 3X 600W for example:

"In the automatic regulation mode, when the fans accelerated steadily from a silent 1000 RPM to a comfortable 2040 RPM, the peak GPU temperature was 78°C. It is about 20°C better than with the reference cooler and much quieter, too! That’s just an excellent performance for a cooler of the world’s fastest graphics card."

http://www.xbitlabs.com/articles/gr...orce-gtx-titan-black-ghz-edition_4.html#sect0

A 120mm rad cools the 295X2, something that WindForce cooler can only dream of. That shows you just how superior water is to air for GPU cooling at similar power dissipation.

You made an inference that an AIO LC from Asetek costs more than a high quality air cooler, but how do you know that? The BOM for Titan Black's cooler could be $40-50! If you buy 2-4 million 120mm rad AIO LC kits in volume from Asetek, you don't think you can get them for $40-50?

Secondly, you talk about mounting issues. When was the last time you saw a case that has neither a 120mm front nor rear fan mount? Who buys $350+ GPUs and installs them into a $20 case? Again, WC is especially superior with dual cards. Since I am specifically discussing flagship products, OEMs like Alienware, Origin, MainGear or CyberPower will have no issues mounting a 120mm rad-cooled $500 graphics card.

Third, let's talk about efficiency. Let's assume a worst-case scenario: the R9 390X uses 300W of power and is only 20% faster than a 980 for $550. That's a delta of 135W against a reference 980.

Let's take some unhealthy gamer who has no life, no kids, no friends, no real world job, and plays 4 hours a day every day all year.

$0.30 per kWh x 135W delta x 4hr a day x 365 days / 1000 (W per kW) ≈ $59.

Ok, that's the cost of 1 PC game or 1 month of your cell phone bill. But wait, in North America we don't generally pay $0.30 per kWh and no healthy adult plays 4 hours a day every day unless that's his job or he is a millionaire / disabled person who uses gaming like reading/listening to music, etc.

In Toronto, Canada, the peak rate is 14 cents, mid-peak < 12 cents, or roughly $0.10 USD:
http://www.torontohydro.com/sites/e.../yourbilloverview/Pages/ElectricityRates.aspx

That's about a third of the $59 I calculated, or ~$20 a year. Therefore, if you are a very heavy gamer and you play 4 hours a day, based on electricity prices in NA, you would lose $20-25 annually with 135W higher power usage. That's like taking your gf/wife to the movies ONCE! If $20-25 a year is too much for someone, they shouldn't be buying $300 GPUs and $50 PC games to begin with.
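The arithmetic above can be sketched as a small helper (the function name is illustrative; the 135W delta and the two rates are the figures from this post, and the hours/day is the same worst-case assumption):

```python
def annual_power_cost(delta_watts, hours_per_day, rate_per_kwh):
    """Yearly electricity cost of an extra power draw, in dollars."""
    kwh_per_year = delta_watts / 1000 * hours_per_day * 365
    return kwh_per_year * rate_per_kwh

# 135W delta, 4hr a day, $0.30/kWh -> ~$59/year
print(round(annual_power_cost(135, 4, 0.30), 2))
# Same habits at Toronto's ~$0.10 USD/kWh -> ~$20/year
print(round(annual_power_cost(135, 4, 0.10), 2))
```

Plugging in different rates or gaming habits only scales the result linearly, which is the point being argued here.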

That's why marketing people hate finance guys like me. I break their marketing BS into mathematics and disprove their marketing claims that something matters more than it does. In fact, history shows us that the greatest cost of videocard ownership is depreciation/value lost on resale, not electricity costs.

With the above data, WC solves my temperature and noise issues, allows me to run dual cards cooler and quieter while affecting my CPU temperatures less, which in turn allows me to downsize my case.

What about the MiniITX guys? Well, since PC gaming is moving to 4K, MiniITX is going to struggle to keep up with 4K gaming in the next 5 years. It's not going to be possible to create a rig with multiple high-end cards in such a cramped case. If someone with a MiniITX has no desire to upgrade to 4K in the next 5 years, there will be plenty of $400-and-below cards that are air cooled. I am talking specifically about $500+ flagship cards, where NV/AMD should give us the OPTION of WC. The cooler on the Titan Z proves that it's inferior to the WC on the 295X2:

http://m.sweclockers.com/artikel/18944-nvidia-geforce-gtx-titan-z/20

It wins in temperatures and noise levels, despite the 295X2 using way more power.

The opposition to WC reminds me of hardcore air-cooled Porsche fans, but we know the modern 911 is a far better performing sports car after it went to water. It's no wonder the top PC enthusiasts all watercool their GPUs.

Furthermore, since lowering a chip's temperature also lowers its power consumption (leakage current drops as the chip runs cooler), a water-cooled 250W card could be clocked higher than an air-cooled 250W card and would therefore be faster.

Going Hybrid WC is a huge win-win, not a failure as you paint it. It would allow NV and AMD to create 275-300W single GPU monster cards and not worry about high temperatures and noise levels. Since 4K is so demanding, we are going to need as much performance leaps as possible as demand for 4K desktop PC gaming increases. For that to happen, the GPU makers should combine improvements in Perf/watt with WC standard for their flagship single and dual-GPU products.

Again, I am not saying that WC > perf/watt universally. I am just saying WC is a low hanging fruit that has major positive implications. Make 250-300W flagship desktop cards water cooled and see if the consumer/market embraces them. Worst case, NV and AMD can always go back to air. For everyone else who only cares about Perf/watt, there will be plenty of mid-range 150-170W cards for the next 10 years to keep you satisfied.
 
Last edited:
Feb 19, 2009
10,457
10
76
You made an inference that an AIO LC from Asetek costs more than a high quality air cooler, but how do you know that? The BOM for Titan Black's cooler could be $40-50! If you buy 2-4 million 120mm rad AIO LC kits in volume from Asetek, you don't think you can get them for $40-50?

I know for a fact, from some retail experience, that Asetek-made 120mm AIO LCs (Thermaltakes etc.) sold in bulk AT COST to retailers go for ~$28/$34 (30/60mm rads).

Cost is actually extremely competitive for the cooling level, coupled with the extra advantage of sending heat out of your case. The only downside is needing a case that can fit the radiator, which is barely a downside.

I have built a few Silverstone SG05 and 06 rigs, which IIRC is ~12L in volume and it handles a 120mm radiator just fine.
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
Let's take some unhealthy gamer who has no life, no kids, no friends, no real world job, and plays 4 hours a day every day all year.

4 hours a day? That's it?

I thought we were talking about no life, no friends, gaming addict (MMO?!).
And it looked like you wanted to get your message across (kWh being BS) by taking almost the worst case scenario.
Now be generous and give a man his 12 hours/day. He has no life, remember?

In Toronto, Canada, the peak rate is 14 cents, mid-peak < 12 cents, or roughly $0.10 USD:
http://www.torontohydro.com/sites/e.../yourbilloverview/Pages/ElectricityRates.aspx

But wait, what's this ^^ Canada??
Out of all places, you chose Canada, which has the cheapest electric energy on the planet relative to purchasing power.
You are not going to get your message across if you continue like this ;)

Let's not take cheapo Canada, OK? Nor the USA's 8-37 cents (much variance).
Let's take a middle-of-the-road 30 cents (Australia, Belgium, Netherlands, Italy, Ireland, Sweden), which incidentally is the number you started your calculation with.

$0.30/kWh x 0.135 kW x 12h x 365 x 100/80 (PSU efficiency) ≈ $221/year (~per 300W card)

So it looks like over a period of 2 years, our no-lifer is going to pay for yet another nice GPU ($450) in electricity alone.
Or $300 per 2 years, if he's a no-lifer with a job, playing only 8h/day.
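The same calculation with the 80% PSU efficiency gross-up can be sketched like this (function name illustrative; the figures are the ones from this post):

```python
def annual_cost_at_wall(delta_watts, hours_per_day, rate_per_kwh, psu_eff=0.80):
    """Yearly cost of extra GPU draw, grossed up for PSU losses at the wall."""
    wall_kw = delta_watts / psu_eff / 1000
    return wall_kw * hours_per_day * 365 * rate_per_kwh

# 135W delta, 12h/day, $0.30/kWh, 80% efficient PSU -> ~$221-222/year
print(round(annual_cost_at_wall(135, 12, 0.30)))
```

The gross-up matters: dividing by PSU efficiency adds roughly 25% to the bill compared to counting only the DC-side watts.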


That's about a third of the $59 I calculated, or ~$20 a year. Therefore, if you are a very heavy gamer and you play 4 hours a day, based on electricity prices in NA, you would lose $20-25 annually with 135W higher power usage. That's like taking your gf/wife to the movies ONCE! If $20-25 a year is too much for someone, they shouldn't be buying $300 GPUs and $50 PC games to begin with.

That's why marketing people hate finance guys like me. I break their marketing BS into mathematics and disprove their marketing claims that something matters more than it does.

Right, imagine the marketing slogans:

  • Buy 2 Radeons, get only 1 - GAMING EVOLVED 2 year plan.
  • Radeon inside - get a job or GTFO
  • AMD Gaming on a high budget.

And this is just for our no-lifer. It gets multiplied many times over in an HPC environment.
AMD won't get a single high-value supercomputer/data center contract if you need freaking water cooling just to use their consumer GPU.
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
Were they built out of 300W parts, while purposely ignoring a similarly powerful 170W part?

EDIT: I myself am not overly concerned with GPU perf/W.
For me perf/W is mostly about ergonomics. And peace of mind.

Why should I pay more if I can get away with less? The power bill itself comes in 3rd place.

But RS makes it sound like it's totally a non-issue.
Any demographic, any gaming habits: just forget about the power bill difference AND GAME ON!!
Well, it doesn't work quite like that.
 
Last edited:

el etro

Golden Member
Jul 21, 2013
1,584
14
81
I don't like WC by default on cards (though for the big boys and halo-class cards it seems to be a good idea), but WC works. I don't like it, but it works, and it doesn't require maintenance.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
Realize that a company like NV has products released and products in the pipeline. It should be pretty easy to market something that is easily the best, right?

Lol..... oh how misinformed and misguided this thought is....
 

Wild Thing

Member
Apr 9, 2014
155
0
0
4 hours a day? That's it?

I thought we were talking about no life, no friends, gaming addict (MMO?!).
And it looked like you wanted to get your message across (kWh being BS) by taking almost the worst case scenario.
Now be generous and give a man his 12 hours/day. He has no life, remember?



But wait, what's this ^^ Canada??
Out of all places, you chose Canada, which has the cheapest electric energy on the planet relative to purchasing power.
You are not going to get your message across if you continue like this ;)

Let's not take cheapo Canada, OK? Nor the USA's 8-37 cents (much variance).
Let's take a middle-of-the-road 30 cents (Australia, Belgium, Netherlands, Italy, Ireland, Sweden), which incidentally is the number you started your calculation with.

$0.30/kWh x 0.135 kW x 12h x 365 x 100/80 (PSU efficiency) ≈ $221/year (~per 300W card)

So it looks like over a period of 2 years, our no-lifer is going to pay for yet another nice GPU ($450) in electricity alone.
Or $300 per 2 years, if he's a no-lifer with a job, playing only 8h/day.




Right, imagine the marketing slogans:

  • Buy 2 Radeons, get only 1 - GAMING EVOLVED 2 year plan.
  • Radeon inside - get a job or GTFO
  • AMD Gaming on a high budget.

And this is just for our no-lifer. It gets multiplied many times over in an HPC environment.
AMD won't get a single high-value supercomputer/data center contract if you need freaking water cooling just to use their consumer GPU.

Why bitch about his use of Canadian figures, then post rubbish about Australian electricity prices?
I'm in Australia... the current price is US$0.21 per kilowatt hour. :whistle:
 

USER8000

Golden Member
Jun 23, 2012
1,542
780
136
So out of interest, f1sherman, what graphics card did you have from late 2009 to early 2012? Just interested to know.
 
Last edited:

tential

Diamond Member
May 13, 2008
7,348
642
121
Why bitch about his use of Canadian figures, then post rubbish about Australian electricity prices?
I'm in Australia... the current price is US$0.21 per kilowatt hour. :whistle:

There are multiple flaws/inconsistencies in that post beyond that as well.
 

USER8000

Golden Member
Jun 23, 2012
1,542
780
136
Thx.

TBH, I think 12 hours a day is pushing it for 365 days a year.

Maybe if you have a week or two off, or want to finish a game quickly.


BTW, if someone is gaming 12 hours a day, 365 days a year, it usually means that:
1.) They are unemployed and on social benefits
2.) They are unemployed due to illness and on social benefits
3.) They are a carer on social benefits
4.) They are a student, probably doing an unintensive course, and probably have no money
5.) They are some rich playboy who gets money thrown at them, so they can spend 50% of their life gaming
6.) They are a carer getting a decent salary
7.) They are unemployed/do like 3 hours of work a day and live off their parents rent-free


I would say that in scenarios 1.) to 4.), in most countries, even affording a highish-end card, let alone a decent PC, is going to be a problem after bills and rent are paid, unless you get into mega-debt.
5.) to 7.) might mean you can afford better stuff, but I suspect running costs won't really matter as much there.
 
Last edited:

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
Let's not make this into a social study; gotta keep it simple ;)
I did offer the lighter 8h version: only $300 per 2 years

I know... but what about different workloads?
Eh, is the guy not allowed to supersample/GeDoSaTo/MSAA/144Hz the hell out of his game(s)?
 

Elfear

Diamond Member
May 30, 2004
7,169
829
126
The average PC gamer is above 30, married with kids.

+1

I would say Russian's statement is true. 4hrs a day every day for a year is someone with a lot more time on their hands than the average gamer. Even though I'm considered a "gamer" to all my friends, I generally only average 3-4hrs a week.

Power efficiency certainly has its place, but $$ savings isn't really one of the reasons for most people.

Back OT - It sounds like the 390X would be pretty short-lived on 20nm if it debuts this spring and 16nm FF is supposed to be ready at the end of 2015.
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
What's this? We went from

unhealthy gamer who has no life, no kids, no friends, no real world job

to
average PC gamer [...] above 30, married with kids

OK, have it your way. 4h it is.
But let's not pretend no-lifers are welcome in our little calculus.

$150 over 2 years @ 30 cents/kWh (Australia, Belgium, Netherlands, Italy, Ireland, Sweden) due to the 135W difference.

But what about CF... :sneaky:
 

tential

Diamond Member
May 13, 2008
7,348
642
121
Let's not make this into a social study; gotta keep it simple ;)
I did offer the lighter 8h version: only $300 per 2 years

I know... but what about different workloads?
Eh, is the guy not allowed to supersample/GeDoSaTo/MSAA/144Hz the hell out of his game(s)?


8 hours a day is light gaming?

Is this serious, for AnandTech?
Is this serious for anandtech?