[Rumor/Speculation] GTX Titan X 12GB vs R9 390X 4GB vs Unknown GM200 GPU

destrekor

Lifer
Nov 18, 2005
28,799
359
126
The discussion of AIO vs air cooling kind of reminds me of the late '90s, when the Voodoo 5 came out with active cooling vs the passively cooled Voodoo 3. Or the GeForce 256 vs earlier TNTs. While most were fine with it, some called it out as lousy design. That, however, was the difference between 11W and being able to explore 30W+. AIOs are being used to move to 300W+. I don't see that being sustainable. It may work this gen to edge out the competition, but we can't keep moving up the power ladder to extract more performance.

Agreed.

It may be a fine option, but IF it were to become required, that would be a step too far. Changing how the on-card cooling is designed is one thing, and we've seen how the designs have been iterated over time to match what the cards were putting out in terms of heat.

Imagine the non-fiasco the FX 5800 would have been had it shipped with a modern blower-style cooler, even something like the one on the 560 Ti, versus the current phase-change style.

But to move the cooling OFF the card itself? That's no longer nearly as universally compatible.

As an option, especially for dual-GPU cards? Great. I don't think it'll be required this round, but I still worry about it being the reference design, if that's what it turns out to be.

I hope, for AMD's sake, that they are able to improve their reference cooler designs and, most importantly, bring the TDP down as they make improvements to GCN over time.
Hopefully they can iterate enough on GCN to get back to a healthy revenue stream, because otherwise they may need to draw up a completely new architecture.

People here are getting all worked up, as if stating these kinds of things were insults, offensive toward the perfect goddess that is AMD. All companies have issues that we should be able to point out, because we ultimately want every company to succeed well enough that they push each other, so that a company like Nvidia can't compete against a top-end card with a middling, smaller die.

I'm hoping this new revision of GCN is enough to put AMD back on the map in the eyes of the average Joe and, of course, of the enthusiasts.
 

destrekor

Lifer
Nov 18, 2005
28,799
359
126
Where does it end? Every gen we go up 100W? That can't happen.

Obviously we should ultimately progress to the point that building computers is a task much like building an engine/drivetrain block, with complete water-cooling for every component and a radiator the size of a modern PC tower.

While admittedly that would be sort of neat, that's the absolute wrong answer for a general PC.
 

lilltesaito

Member
Aug 3, 2010
110
0
0
Perhaps this wasn't addressed to me, but I suspect it was. I just bought the 290Xs; got them both for about $650 after rebates.
When I put my system together in 2011, it was with 560 Tis in SLI and an OC'd 2600K. I just upgraded my GPUs for the first time in this build.

Are you going to be buying two new video cards right when they come out and spending over $1k on them? Anyone spending that kind of money can surely spend 10% of it on a new case that can fit them.

No reason to be vulgar over a question; people can misread things, as the post above shows. If I was addressing it to you, I would have put it in the same post as your quote.

I am not trying to get under anyone's skin; I just feel people are being very closed-minded about moving tech forward. Is AIO the best thing out there and the only thing that can be used? I would say no, but it is a stepping stone to get things moving forward.

We have no idea what other people are thinking, and we don't know how people will react to AIOs. I don't think people should be putting out a negative attitude or bashing something that may or may not come out and that is making way for better tech.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Where does it end? Every gen we go up 100W? That can't happen.
If enough people demand higher performance and existing technology can only deliver that performance at power consumption levels 100W higher than the previous generation, then it can and will happen.
 

Subyman

Moderator, VC&G Forum
Mar 18, 2005
7,876
32
86
If enough people demand higher performance and existing technology can only deliver that performance at power consumption levels 100W higher than the previous generation, then it can and will happen.

I think you are being too short-sighted. It simply cannot happen. If we stay at the same performance/watt for the next five generations and assume each generation jumps 40% in performance, then we will be at 1600W cards. Obviously efficiency is very important. Like I said, we cannot "progress" by simply adding more power to get more performance.

It is a death sentence, especially when the competition is focusing on efficiency.
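
For the curious, the math checks out. A minimal Python sketch of the compounding (the 300W starting point is an assumed flagship TDP, not a quoted spec):

# At constant perf/W, every 40% performance jump costs 40% more power.
power = 300.0  # watts; assumed current flagship TDP for illustration
for gen in range(1, 6):
    power *= 1.40
    print(f"Generation +{gen}: {power:.0f} W")
# Prints ~420, 588, 823, 1153, 1613 W: roughly the 1600W figure above.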
 

lilltesaito

Member
Aug 3, 2010
110
0
0
Where does it end? Every gen we go up 100W? That can't happen.

I think we should be more open about the false limitations that people are putting on things. There's no reason why we can't have a higher ceiling on power, no reason why we can't use water to cool things down, and no reason to shoot down thinking outside of the box to improve technology as a whole.

I would love to have a single video card that is equal to CF/SLI in performance and power (if need be).
 

Subyman

Moderator, VC&G Forum
Mar 18, 2005
7,876
32
86
I think we should be more open about the false limitations that people are putting on things. There's no reason why we can't have a higher ceiling on power, no reason why we can't use water to cool things down, and no reason to shoot down thinking outside of the box to improve technology as a whole.

I would love to have a single video card that is equal to CF/SLI in performance and power (if need be).

Adding more power is the furthest thing from "thinking outside the box."
 

lilltesaito

Member
Aug 3, 2010
110
0
0
Adding more power is the furthest thing from "thinking outside the box."

I don't see a single card with 400W. I see companies only trying to make cards around 250W.

So I would say that if Nvidia made a card that was 400W with a 100% performance increase, that would be stepping outside of the box.
 

Subyman

Moderator, VC&G Forum
Mar 18, 2005
7,876
32
86
I don't see a single card with 400W. I see companies only trying to make cards around 250W.

So I would say that if Nvidia made a card that was 400W with a 100% performance increase, that would be stepping outside of the box.

I hate to use the analogy, but you don't see 5mpg production cars with 1500hp being made by companies either.

AMD may actually go to 400W, but that is not because they want to.
 

Ramses

Platinum Member
Apr 26, 2000
2,871
4
81
I think you are being too short-sighted. It simply cannot happen. If we stay at the same performance/watt for the next five generations and assume each generation jumps 40% in performance, then we will be at 1600W cards. Obviously efficiency is very important. Like I said, we cannot "progress" by simply adding more power to get more performance.

It is a death sentence, especially when the competition is focusing on efficiency.

CPUs kinda went that way recently, and nobody really seems thrilled with the tiny IPC improvements and the add-more-cores solution manufacturers have employed.
A little apples-to-oranges, but it's the same basic problem.

I don't think it would be as well received in GPU circles if the next generation of cards drew 100W less but offered only a 7% performance improvement; nobody would buy them. Not saying the opposite is true or that you are at all wrong, it's just part of the equation. Much like newer CPUs, it'd be great for mobile gaming, I guess.

I'm inclined to enjoy regular large performance improvements in GPUs while they last, as long as even a small water cooler will keep the things in check temp-wise.
 

lilltesaito

Member
Aug 3, 2010
110
0
0
I hate to use the analogy, but you don't see 5mpg production cars with 1500hp being made by companies either.

AMD may actually go to 400W, but that is not because they want to.

I do see cars that get 12mpg with 600hp: the Viper.

The GTR gets 16mpg with 545hp.

It may not be as extreme as you put it, but if you compare them to most cars, which have 150-ish hp and get 25+mpg, I would say it is the same as 250W vs 400W.
 

Subyman

Moderator, VC&G Forum
Mar 18, 2005
7,876
32
86
I do see cars that get 12mpg with 600hp: the Viper.

The GTR gets 16mpg with 545hp.

It may not be as extreme as you put it, but if you compare them to most cars, which have 150-ish hp and get 25+mpg, I would say it is the same as 250W vs 400W.

That's why I hate to use analogies like that, and I shouldn't have. It was to illustrate a point, not to debate cars.
 

lilltesaito

Member
Aug 3, 2010
110
0
0
Cars are great to use: those of us who want faster stuff like the Viper or GTR, as cars go, also want faster video cards and throw out the better mpg/watts.

I know it is not what you wanted to point out, but it works better for what I am trying to show.
As a hobby we want that higher end, and some people will pay crazy amounts (looking at you, Titan) for faster.

Edit:
Now it is just getting way off topic, so I'm going to stop with the watts.
 

Ramses

Platinum Member
Apr 26, 2000
2,871
4
81
Not a bad analogy, just a little bit of a different scene/market.

I frequently wonder what sort of crossover there is between computer hardware preference and car preference. But that's for another day.
 

Subyman

Moderator, VC&G Forum
Mar 18, 2005
7,876
32
86
Cars are great to use: those of us who want faster stuff like the Viper or GTR, as cars go, also want faster video cards and throw out the better mpg/watts.

I know it is not what you wanted to point out, but it works better for what I am trying to show.
As a hobby we want that higher end, and some people will pay crazy amounts (looking at you, Titan) for faster.

Edit:
Now it is just getting way off topic, so I'm going to stop with the watts.

250-300W is already the very top of the market, though. Most "daily driver" GPUs are in the 10-50W range.

To be clear, my point is that going up the power ladder has very, very clear disadvantages and a terminal limit that is reached quickly. Efficiency is all advantage; the hard part is that it is much tougher to make something more efficient than to simply use more power. When the competition is extracting more efficiency, you must be as well to survive long term.
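
To put rough numbers on that (a Python sketch; the 500W ceiling and 40% per-generation target are assumptions for illustration, not anyone's roadmap):

# Two ways to chase +40% performance per generation, starting from 300 W:
# (a) hold perf/W constant and add watts, (b) hold watts and improve perf/W.
POWER_CEILING = 500.0  # watts; assumed practical limit for a single card
power, gens = 300.0, 0
while power * 1.40 <= POWER_CEILING:  # path (a): climb the power ladder
    power *= 1.40
    gens += 1
print(f"Power ladder: stalls after {gens} generation(s) at {power:.0f} W")

perf = 1.0
for _ in range(5):  # path (b): efficiency gains at a fixed 300 W budget
    perf *= 1.40
print(f"Efficiency path: {perf:.1f}x performance after 5 gens, still 300 W")
# The ladder dies after one generation; the efficiency path keeps compounding.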
 

Ramses

Platinum Member
Apr 26, 2000
2,871
4
81
Your wants and what makes a company sustainable are two completely different things.

That seems at odds to some degree with the fanfare given to cards like the Titans and 295X2s and such "halo" products. They aren't rushing out low-buck, low-power (low-performance) cards and trotting them out on stage at trade shows. I don't have any knowledge as to what makes them more money, "performance" cards vs low-power efficient cards, but the marketing seems to indicate they place a lot of weight on the no-holds-barred performance examples. May well be a variant of the old "race on Sunday, sell on Monday" mantra, but I can't tell as an observer.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
Where does it end? Every gen we go up 100W? That can't happen.

It ends at whatever the top-end limit of the best (but still affordable) cooling system is. This is precisely why AIO is a good idea. It increases the ceiling. It's a technological/economic improvement in heat dissipation, full stop. It's a superior system in terms of removing heat from the GPU and case. Any points against AIO are related to other things about it, but not its effectiveness...
 

Subyman

Moderator, VC&G Forum
Mar 18, 2005
7,876
32
86
That seems at odds to some degree with the fanfare given to cards like the Titans and 295X2s and such "halo" products. They aren't rushing out low-buck, low-power (low-performance) cards and trotting them out on stage at trade shows. I don't have any knowledge as to what makes them more money, "performance" cards vs low-power efficient cards, but the marketing seems to indicate they place a lot of weight on the no-holds-barred performance examples. May well be a variant of the old "race on Sunday, sell on Monday" mantra, but I can't tell as an observer.

I don't believe it is at odds at all. The Titan technology is used in many sectors other than just gaming and is well within normal power levels. The 295X2 is a specialty card with two GPUs. Upping power levels on single GPUs year over year is not sustainable. Maxwell is precisely the right way to go: much more efficient and a large improvement over the last generation while staying in a widely deployable power range.

AMD may be perfectly fine jumping power this generation; I'm not saying anything against that. It can't be their only way of progressing performance, though. We may be fine up to 500W or more, but there is a limit, and it will be fast approaching if companies go up the ladder with reckless abandon. That is what I'm pointing out.
 

Subyman

Moderator, VC&G Forum
Mar 18, 2005
7,876
32
86
It ends at whatever the top-end limit of the best (but still affordable) cooling system is. This is precisely why AIO is a good idea. It increases the ceiling. It's a technological/economic improvement in heat dissipation, full stop. It's a superior system in terms of removing heat from the GPU and case. Any points against AIO are related to other things about it, but not its effectiveness...

Perhaps in a vacuum, but when the competition is finding ways to increase efficiency, pursuing ever-better ways to cool down your power-hungry cards generation after generation becomes self-limiting. I'm talking long term, not just next generation.
 

destrekor

Lifer
Nov 18, 2005
28,799
359
126
Are you going to be buying two new video cards right when they come out and spend over 1k on them? Anyone spending that kind of money can surely spend 10% of that on a new case that can fit them.

No reason to be vulgar over a question, people can miss read things, as example of above post. If was addressing it to you, I would of put it in the same post as your quote.

I am not trying to get under anyone skin, I just feel people are being very closed minded with moving forward on tech. Is AIO the best thing out and the only thing that can be used? I would say no, but it is a stepping stone to get more things moving forward.

We have no idea what other people are thinking and we don't know how people will react to AIO. I dont think people be should be putting out a negative attitude or bashing something that may or may not come out and is making way for improvements for better tech.

Fair enough, and I apologize for misconstruing your posts.
 

metalliax

Member
Jan 20, 2014
119
2
81
Where does it end? Every gen we go up 100W? That can't happen.

It ends when the next manufacturing node jump occurs. When you see GPUs on 14nm, they'll most likely be at the 150-200W barrier, and then for the next few years, if 10nm doesn't produce a good product, you'll see 250W and then 300W GPUs. However, this is only applicable to single-GPU cards. This trend should continue as we progress to new manufacturing nodes, and it is probably dependent on how long each node lasts.
 

Ramses

Platinum Member
Apr 26, 2000
2,871
4
81
I don't believe it is at odds at all. The Titan technology is used in many sectors other than just gaming and is well within normal power levels. The 295X2 is a specialty card with two GPUs. Upping power levels on single GPUs year over year is not sustainable. Maxwell is precisely the right way to go: much more efficient and a large improvement over the last generation while staying in a widely deployable power range.

AMD may be perfectly fine jumping power this generation; I'm not saying anything against that. It can't be their only way of progressing performance, though. We may be fine up to 500W or more, but there is a limit, and it will be fast approaching if companies go up the ladder with reckless abandon. That is what I'm pointing out.

I was thinking as much of price as of power/performance as to them not being mainstream or what have you. It's just that the marketing teams seem to think the top-tier high power/performance cards are what deserve the marketing attention, not the lower-power, more efficient stuff (whether or not that's what makes them more money). No argument really, though. If you follow CPU stuff, you'll see very similar goings-on in the last few years with freq/power/heat/perf. AMD was considerably more liberal there too, and while I was pretty happy with them, it didn't work out market-wise. Even Intel has "hit a wall" in many respects, where we don't see the 20% gains every generation like we used to, because it would have taken huge power/frequency/heat to do it. They can lower power usage, add cores, and add some shortcut functions and instruction sets to improve efficiency, but it ain't what it used to be. GPUs are likely on the same road. Folks are regularly running four years on a given CPU before it's a real bottleneck; that was unheard of not too long ago.
I guess be careful what you wish for... lol
 

xthetenth

Golden Member
Oct 14, 2014
1,800
529
106
Perhaps in a vacuum, but when the competition is finding ways to increase efficiency, pursuing ever-better ways to cool down your power-hungry cards generation after generation becomes self-limiting. I'm talking long term, not just next generation.

What's the big difference between stretching a technology to the limit and going to a more capable technology? Both companies are butting up against the limitations of blowers; they're dealing with it in different ways. An increase in performance at constant or near-constant power is very much an increase in efficiency, and if they increase efficiency but can still put out a bigger, more powerful chip, why not? If they can make a single chip that fills a high-wattage role like two-chip cards do now and keep on track with efficiency improvements, why not? Discussing absolute power use as if it were proportional to efficiency, in the face of non-constant performance, is a non sequitur.
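
Stated as a formula, that closing point is just efficiency = performance / power, so absolute draw by itself says nothing (a Python sketch with invented numbers):

# A bigger, hungrier chip can still be the more efficient part if its
# performance scales faster than its power draw. Figures are made up.
def perf_per_watt(performance, watts):
    return performance / watts

midrange = perf_per_watt(1.0, 180)  # hypothetical midrange card
big_chip = perf_per_watt(2.0, 300)  # hypothetical big chip, higher draw
print(f"midrange: {midrange:.4f} perf/W, big chip: {big_chip:.4f} perf/W")
# 2x the performance at ~1.67x the power: more watts, better efficiency.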