
Techpowerup/Chiphell/3DCenter: AMD 6990 launches March 8th - uses two 6970 Cores


bryanW1995

Lifer
May 22, 2007
11,143
32
91
Besides anything PCI-SIG can or cannot do, what you are doing when you ignore the spec is selling a card that other equipment (MoBo's, for example) isn't designed to run with.

Unless the card exhausts all of its heat outside of the case, you are also dumping an incredible amount of heat into the case. If it does, then we are looking at one hell of a cooler, if it can exhaust all of its heat, keep the card cool, and not sound like a Dustbuster.

Personally, I think it's a hack job to ignore the specs. It just shows what you are not capable of engineering.
yeah, because ASUS wouldn't want you to slap a 301w card into a ROG mobo...
 

bryanW1995

Lifer
May 22, 2007
11,143
32
91
I thought it had already pretty much been established that consumers didn't want to go there when it comes to power-hungry cards that produce tons of heat and noise? I had a GTX 470 and 480, and personally I didn't mind the heat and noise, but Nvidia took a beating sales-wise and PR-wise vs the cooler-running 5xxx series. I believe the consumers spoke and Nvidia listened.
No, that is a common misconception. Consumers cared about all those other great features of the AMD cards b/c performance was pretty close but AMD overall had the edge: the 5970 used about the same power as the GTX 480 but was faster, the 5870 used significantly less power than the GTX 470 and was slightly faster, and the 5850 had no direct competitor until the GTX 460; when the GTX 460 showed up, NV limited the clocks so much that the 5850 was still faster with a similar power draw. If you bump up NV performance across the board by 20%, nobody cares that AMD is cooler/less power hungry/etc. b/c they're still getting smoked by NV. It's only when the cards are close in performance that you see BBQ grills with NV logos on them popping up everywhere.

LOL @ IDC: you should change your name to I DO care.
 

Baasha

Golden Member
Jan 4, 2010
1,991
18
81
Guys,

So for resolutions of 2560x1600 and above (Eyefinity/Nvidia Surround), would you say the 6990 will outdo the GTX-590? Of course, the Nvidia card isn't out yet but for ultra-high resolutions, hasn't AMD held the crown?

If one were to get two of these dual-GPU cards (Quad-SLI/QuadFire), which one would you guys recommend (for ultra-high resolutions)?
 

badb0y

Diamond Member
Feb 22, 2010
4,012
27
91
Guys,

So for resolutions of 2560x1600 and above (Eyefinity/Nvidia Surround), would you say the 6990 will outdo the GTX-590? Of course, the Nvidia card isn't out yet but for ultra-high resolutions, hasn't AMD held the crown?

If one were to get two of these dual-GPU cards (Quad-SLI/QuadFire), which one would you guys recommend (for ultra-high resolutions)?
I think both the GTX 590 and HD 6990 will do well at high resolutions. You should wait it out to see who gets better quad-GPU scaling if you want to get two of these beasts.
 

Skurge

Diamond Member
Aug 17, 2009
5,195
0
71
I think both the GTX 590 and HD 6990 will do well at high resolutions. You should wait it out to see who gets better quad-GPU scaling if you want to get two of these beasts.
We would have to see if the 590 has more than 1.5GB per GPU, or the 6990 would be a better choice even if quad-GPU scaling was worse.
 

bryanW1995

Lifer
May 22, 2007
11,143
32
91
badb0y said:
I think both the GTX 590 and HD 6990 will do well at high resolutions. You should wait it out to see who gets better quad-GPU scaling if you want to get two of these beasts.
Skurge said:
We would have to see if the 590 has more than 1.5GB per GPU, or the 6990 would be a better choice even if quad-GPU scaling was worse.
Yes, with ultra-high res (above 2560x1600) and very high detail settings, the 2GB of memory becomes more important. I'm interested to see if Nvidia will go with a 2x3GB setup for the 590 and, if so, what kind of impact that will have on relative performance against both 2x GTX 580 and the 6990. Interesting times ahead for sure.
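To make the VRAM point above concrete, here's a rough back-of-the-envelope sketch (the function name and the 4-bytes-per-pixel / triple-buffer assumptions are mine, purely for illustration; real VRAM use is dominated by textures, which this ignores):

```python
def framebuffer_mib(width, height, bytes_per_pixel=4, buffers=3, msaa=1):
    """Approximate size of the color buffers alone, in MiB."""
    return width * height * bytes_per_pixel * buffers * msaa / 2**20

# One 30" panel vs. a 3x1 Eyefinity wall of 1920x1200 panels:
single = framebuffer_mib(2560, 1600)              # ~47 MiB
surround = framebuffer_mib(5760, 1200)            # ~79 MiB
# With 4x MSAA the gap widens further:
surround_aa = framebuffer_mib(5760, 1200, msaa=4) # ~316 MiB
print(single, surround, surround_aa)
```

Even as a lower bound, the buffers scale linearly with pixel count, which is why triple-monitor setups lean harder on the 2GB cards.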
 

Grooveriding

Diamond Member
Dec 25, 2008
9,078
1,217
126
Guys,

So for resolutions of 2560x1600 and above (Eyefinity/Nvidia Surround), would you say the 6990 will outdo the GTX-590? Of course, the Nvidia card isn't out yet but for ultra-high resolutions, hasn't AMD held the crown?

If one were to get two of these dual-GPU cards (Quad-SLI/QuadFire), which one would you guys recommend (for ultra-high resolutions)?
In general, above 2560x1600 AMD will perform better due to better texture throughput and more VRAM. If your other thread about the 590 bench is true and the card only has 1.5GB per GPU, AMD will definitely be faster for 3-monitor gaming.

Still, your best option could be 3-way SLI of 3GB 580s, if money is no object of course.

But AMD is claiming they have improved QuadFire scaling to near-100% levels, in which case 2x 6990s would be best.

We'll have to wait till the benches come in, and hope for 3-monitor benchmarks. Not enough of those going around.
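AMD's near-100% scaling claim can be sanity-checked with a simple model (the `scaled_fps` helper and all the numbers here are my own hypothetical illustration, not AMD data):

```python
def scaled_fps(single_gpu_fps, n_gpus, efficiency):
    """Simple multi-GPU model: each GPU added beyond the first
    contributes `efficiency` times the first GPU's frame rate."""
    return single_gpu_fps * (1 + (n_gpus - 1) * efficiency)

# Suppose a single 6970 manages 40 fps at some 3-monitor setting:
print(scaled_fps(40, 2, 0.90))  # one 6990 at 90% scaling: 76.0 fps
print(scaled_fps(40, 4, 0.95))  # 2x 6990 at the claimed ~95%: 154.0 fps
print(scaled_fps(40, 4, 0.60))  # same hardware with poor scaling: 112.0 fps
```

The gap between the last two lines is exactly why quad-GPU scaling, not raw specs, decides whether 2x 6990 beats alternatives.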
 

SolMiester

Diamond Member
Dec 19, 2004
5,331
17
76
Are all the Eyefinity/Surround outside LCD views just stretched? Because I think it looks pants.
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
In some settings, in some games, 1.5GB vs 2GB may make a difference or force the user to adjust a setting.
It's very similar to, though quickly forgotten, how the GTX 480 had 1.5GB while the 5970/5870 combinations only had 1GB, except for rare SE versions.
A single 30-inch monitor at 2560x1600 will probably never be affected.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
200
106
Not sure of your age, and that is only relevant as I'm not sure if you were "on the scene" of computers at the time, but the CPU industry went through a similar hesitation when it came to producing CPUs that generated so much heat that they actually required a fan to air-cool them.

Once the industry, and the consumer, got over their reluctance to the idea, the market never looked back and we quickly rushed toward the practical limits of conventional HSF technology.

The same thing happened with GPUs and multi-slot cooler solutions. There was an initial reluctance to "go there"...but once we did, it was fair game.

I see the 300W "boundary" as nothing more. It is an arbitrary value affixed within a spec that served a purpose in its time, but that time has come and gone.

I've no doubt there were practical reasons for the 300W limitation, but because of engineering progress in developing cost-conscious solutions that address those original concerns, I have no doubt the arbitrary 300W limit will be lifted.

When the DDR2 spec called for a max Vdimm of 1.95V, that was with the expectation/assumption that sticks of RAM would never have heat spreaders or active cooling. Then engineering developed heat spreaders and DIMM heatsinks with fans (Dominator series, etc.), and the arbitrary voltage limit of 1.95V no longer had a basis in engineering.

What you guys would claim to be hacks and engineering defeatism is actually the opposite. The spec limits exist because of the lack of engineering solutions to a real problem. When engineers resolve those real problems, it is not a hack; it is opportunity.

There may be other downsides to the solutions which result in you personally electing not to purchase the product, but that is a personal decision and nothing more.

I personally had no problem running my Mushkin Redlines at 2.2V, as spec'ed by Mushkin but in violation of the JEDEC DDR2 spec. Why? Because my mobo was designed for it, otherwise I wouldn't be able to set the Vdimm that high, and my PSU was designed for it.

And the existence of >300W video cards is not suddenly going to create an unforeseen dynamic inside people's computer cases. People have been tri/quad-SLI'ing & CF'ing vcards for years, with the combined heat output well in excess of 300W.

This sort of bean-counting of watts per PCIe slot is arbitrary and silly. You scale your PSU and cooling solutions accordingly if you want the product; otherwise you don't.
There's a lot of merit to what you say. Possibly we might need more than 300W for a graphics card to do the job. If so, then people (engineers/the powers that be) need to sit down and set proper guidelines.

We also need to get rid of the people who are setting the guidelines if they are truly arbitrary, which makes no sense to me. Why would they just pick some arbitrary number? You'd think they would pick a spec for reasonable, well-thought-out reasons. These people who are, apparently, "just throwing darts at a board" to select specs need to be replaced by people who know what they are doing.

Now, I realize they don't throw darts at a board, but that's the impression given by the label "arbitrary", as if it's just some meaningless number that stifles the creative process by having to be adhered to. Now, seeing as you are an engineer, don't take this personally, please. Sometimes certain engineers just don't know how to do something within guidelines, and they therefore want those guidelines removed to make their job easier, or so they don't get replaced by someone younger and brighter. Or sometimes it's the bean counters wanting regs removed because it's cheaper. Sometimes it's managerial types who want regs removed/changed because they feel it would help them or hurt the competition, because they can't compete within the guidelines.

Regulations on specifications are there for a reason. If they are passé and no longer apply, they need to be rewritten. Just ignoring them shouldn't be acceptable, though; it can be unsafe. It certainly wouldn't be fair for one company to be operating within the specs and have their competitor just ignore them. Just like it's not fair to label someone who wants proper safety controls and efficiently engineered products as shortsighted. I'm not part of the "If man were meant to fly, God would have given him wings" crowd. Anarchy is not a proper solution, though.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
200
106
yeah, because ASUS wouldn't want you to slap a 301w card into a ROG mobo...
But 2 or 3 of them at 299W each would be fine...
Is there some reason you two are out to discredit me with strawman arguments, rather than actually addressing what I'm saying if you disagree?

We aren't talking about a 301W card. We're seeing numbers like 375W and 450W.

If ROG MoBo's, or any others, are rated to handle 450W cards in their x16 slots, then they should say so. Without some design regulation, though, how do we know for sure?
 

PreferLinux

Senior member
Dec 29, 2010
420
0
0
Is there some reason you two are out to discredit me with strawman arguments, rather than actually addressing what I'm saying if you disagree?

We aren't talking about a 301W card. We're seeing numbers like 375W and 450W.

If ROG MoBo's, or any others, are rated to handle 450W cards in their x16 slots, then they should say so. Without some design regulation, though, how do we know for sure?
Simple: no motherboard is rated for cards above 75W (i.e., the card is only supplied 75W by the board; the rest comes from other connectors).
 

3DVagabond

Lifer
Aug 10, 2009
11,951
200
106
Simple, no motherboard is rated for cards above 75 W. (i.e. the card is only supplied 75 W by the board, the rest is from other connectors.)
What stops the card from drawing more than 75W from the board? As far as I know, nothing. Just like the 75W 6P connector: what stops the card from drawing 150W, or more, through the 6-pin? I know nothing is stopping it there. Now, 6P/8P connectors can easily supply more power than they are rated for, safely. I'll wager that high-quality boards will also. What about circuits, though, that are more marginally designed?

You can't have no regulations on power requirements. You can't ignore the specifications equipment is designed for. Everything needs to be designed to work together. This is one of the reasons they have power limits in the first place. When a card draws 450W through 2x 8P and the board, it's drawing above specification somewhere. Each 8P is spec'd at 150W and the board is spec'd for 75W, so where's the other 75W coming from? Something is operating above its design specification.
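The arithmetic in the post above can be laid out explicitly. A minimal sketch, using the per-source spec wattages quoted in this thread (the `rated_budget` helper is my own illustration):

```python
# Rated delivery per power source, in watts, per the spec figures above.
RATINGS = {"slot": 75, "6-pin": 75, "8-pin": 150}

def rated_budget(sources):
    """Total in-spec power available from the slot plus aux connectors."""
    return sum(RATINGS[s] for s in sources)

budget = rated_budget(["slot", "8-pin", "8-pin"])  # 375 W in spec
draw = 450                                         # rumored worst-case draw
print(f"in-spec budget {budget} W, draw {draw} W, over by {draw - budget} W")
```

That final line is exactly the "where's the other 75W coming from?" question: something in that chain is delivering above its rating.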
 

happy medium

Lifer
Jun 8, 2003
14,387
475
126
What stops the card from drawing more than 75W from the board? As far as I know, nothing. Just like the 75W 6P connector: what stops the card from drawing 150W, or more, through the 6-pin? I know nothing is stopping it there. Now, 6P/8P connectors can easily supply more power than they are rated for, safely. I'll wager that high-quality boards will also. What about circuits, though, that are more marginally designed?

You can't have no regulations on power requirements. You can't ignore the specifications equipment is designed for. Everything needs to be designed to work together. This is one of the reasons they have power limits in the first place. When a card draws 450W through 2x 8P and the board, it's drawing above specification somewhere. Each 8P is spec'd at 150W and the board is spec'd for 75W, so where's the other 75W coming from? Something is operating above its design specification.
I think it's the connectors. I've seen somewhere that a 6-pin connector can actually give 150 watts. Maybe that's why some midrange cards can pull 225 watts with only one 6-pin connector. So maybe an 8-pin can actually give 175 watts?
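One way to see why a connector can run well above its spec rating: the rating sits far below what the contacts can physically carry. A rough sketch, assuming Mini-Fit Jr style contacts rated around 8 A each (that per-contact figure is my assumption; actual ratings vary with plating and wire gauge):

```python
def physical_limit_watts(n_12v_contacts, amps_per_contact=8.0, volts=12.0):
    """Rough physical ceiling of a PCIe power connector's 12 V contacts."""
    return n_12v_contacts * amps_per_contact * volts

# Both 6-pin and 8-pin PCIe connectors carry three 12 V contacts; the
# 8-pin's higher 150 W rating comes from its extra ground/sense pins.
print(physical_limit_watts(3))  # 288.0 W ceiling vs. the 75 W 6-pin spec
```

So a 6-pin delivering 150 W is physically plausible; it's the spec margin, not the copper, that's being exceeded.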
 

3DVagabond

Lifer
Aug 10, 2009
11,951
200
106
I think it's the connectors. I've seen somewhere that a 6-pin connector can actually give 150 watts. Maybe that's why some midrange cards can pull 225 watts with only one 6-pin connector. So maybe an 8-pin can actually give 175 watts?
Assuming the PSU is up to it, all the connections will supply whatever the card wants until they reach their thermal limit and fail.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,606
134
106
Regulations on specifications are there for a reason. If they are passé and no longer apply, they need to be rewritten. Just ignoring them shouldn't be acceptable, though; it can be unsafe. It certainly wouldn't be fair for one company to be operating within the specs and have their competitor just ignore them. Just like it's not fair to label someone who wants proper safety controls and efficiently engineered products as shortsighted. I'm not part of the "If man were meant to fly, God would have given him wings" crowd. Anarchy is not a proper solution, though.
Sometimes regulations are plain stupid. Sometimes regulations are written by people that don't understand what they are talking about. Sometimes regulations exist so the people writing them have a job. Sometimes (lots of times) people have agendas. Sometimes regulations are what they are because they've always been like that.

Don't think everyone running a business is some amoral demon trying to cut corners on safety, and that everyone regulating business is some paragon of virtue trying to keep the consumer safe.

People are people.

So what is going to happen in this case?

Blowing GPUs? Blowing computers?
 

3DVagabond

Lifer
Aug 10, 2009
11,951
200
106
Sometimes regulations are plain stupid. Sometimes regulations are written by people that don't understand what they are talking about. Sometimes regulations exist so the people writing them have a job. Sometimes (lots of times) people have agendas. Sometimes regulations are what they are because they've always been like that.

Don't think everyone running a business is some amoral demon trying to cut corners on safety, and that everyone regulating business is some paragon of virtue trying to keep the consumer safe.

People are people.

So what is going to happen in this case?

Blowing GPUs? Blowing computers?
Anything to show that what you're insinuating applies here? Just because something happens "sometimes" doesn't make it a blanket truism.
 

happy medium

Lifer
Jun 8, 2003
14,387
475
126
What stops the card from drawing more than 75W from the board?
If nothing stops it from drawing more power, then why not just let the slot power an HD 5750? Why even have the 6-pin connector? The 5750 only pulls like 90 watts. I think it's fair to say the PCI-E SLOT only supplies 75 watts. The extra power comes from the pin connectors, which will run above spec.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
200
106
If nothing stops it from drawing more power, then why not just let the slot power an HD 5750? Why even have the 6-pin connector? The 5750 only pulls like 90 watts. I think it's fair to say the PCI-E SLOT only supplies 75 watts. The extra power comes from the pin connectors, which will run above spec.
I think you answered your own question. It has a 6P connector because it draws more than 75W.
 

Skurge

Diamond Member
Aug 17, 2009
5,195
0
71
I think you answered your own question. It has a 6P connector because it draws more than 75W.
I understand what you are saying. The extra power has to come from somewhere, it could be the power connectors or it could be the PCI-E slot.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
200
106
I understand what you are saying. The extra power has to come from somewhere, it could be the power connectors or it could be the PCI-E slot.
Yes, thank you. And unlike these other guys, English isn't even your first language. ;)

We have lots of people who upgrade components all the time and can't afford a complete rebuild. There's enough to consider already when choosing and recommending components. Now, not only might your PSU not be up to snuff, but we might have to recommend an overclocker's board, or an EVGA tri-SLI or better board, because the cards aren't going to adhere to the specs that the boards are designed to.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Yes, thank you. And unlike these other guys, English isn't even your first language. ;)

We have lots of people who upgrade components all the time and can't afford a complete rebuild. There's enough to consider already when choosing and recommending components. Now, not only might your PSU not be up to snuff, but we might have to recommend an overclocker's board, or an EVGA tri-SLI or better board, because the cards aren't going to adhere to the specs that the boards are designed to.
Don't you think there would be some type of voltage regulation? Whether it be on the motherboard or the maximum draw within spec on the graphics card itself? I'm no electrical engineer, but I would think that when designing a mobo and graphics card, that some sort of voltage/amperage regulation would be a good idea.
 

Skurge

Diamond Member
Aug 17, 2009
5,195
0
71
Don't you think there would be some type of voltage regulation? Whether it be on the motherboard or the maximum draw within spec on the graphics card itself? I'm no electrical engineer, but I would think that when designing a mobo and graphics card, that some sort of voltage/amperage regulation would be a good idea.
Forgive me (I was sleeping during that part of physics class), but would voltage regulation keep the card from drawing too much power?

I think I remember people who were running overclocked 4870X2s could feel the PCI-E cables softening and heating up while the card was under load. Wouldn't something similar happen with an overclocked 6990?
 
