EU cripples future graphics cards [Nordic Hardware Exclusive]

Status
Not open for further replies.

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
"NordicHardware has seen exclusive information about a new energy law that will apply within the EU. The law requires that both discrete and integrated graphics cards live up to certain energy standards.

There are currently seven specifications for graphics cards - G1, G2, G3, G4, G5, G6 and G7. Graphics cards of the G7 classification have a bandwidth of 128 GB/s (gigabytes per second) and more, with no upper limit today. The category depends on performance - in this case measured in memory bandwidth. These GPU categories are also paired with a certain level of energy efficiency. If a graphics card doesn't live up to the standard set by the EC, it can be removed from all markets within the EU. The rules will now be tightened, which threatens next-generation graphics cards.

The commission wants to stop dedicated graphics cards of group G7 from going above 320 GB/s - that is, in theory, a 384-bit memory bus connected to memory operating at 6667 MHz, or 512-bit with 5001 MHz. For notebooks the limit will be only 225 GB/s. Performance delivered in games or general calculations is irrelevant according to Lot 3.

According to a report published in August this year, the current roadmaps [from AMD and Nvidia] do not meet the new requirements until 30 months into the future. The changes in Lot 3 will therefore be introduced in steps. The first will be in 2013 or 2014 as mentioned above, and thereafter new restrictions will apply in 2015. OEM companies like Dell and HP are well aware of this and worried about how it will affect their operations. The changes should also affect retail graphics cards and home builders."


Full Article

Until some new information turns up, we'll let this subject rest. Everyone has seen the (confusing) source document, so there's nothing further to discuss at the moment.
-ViRGE
 
Last edited by a moderator:

hokies83

Senior member
Oct 3, 2010
837
2
76
Hmm, if that happens I see a lot of European people buying American GPUs and shipping them over...
 

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
Brilliant, Europe. Brilliant. If I lived over there I'd start a campaign to overturn this law. But I live in the US.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
I'm still not sure I follow. In order to regulate energy consumption, the EU is setting rules on memory bandwidth as opposed to energy consumption?

Edit: I can't find the figures referenced in the article in the EU document at all
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Hmm, if that happens I see a lot of European people buying American GPUs and shipping them over...

Based on economies of scale, would AMD and NV be able to manufacture 2 separate lines of high-end GPUs that adhere to different global standards, or would they just limit Europe to low-end and mid-range GPUs? Without the EU market, would AMD and NV have enough customers in the rest of the world to support high-end GPUs with more memory bandwidth than 320GB/sec?

Sounds like a terrible move to enforce regulation on GPU makers in this difficult time for traditional PCs, and especially for AMD, as their HD8000 series reportedly already won't meet Lot 3 standards.

I'm still not sure I follow. In order to regulate energy consumption, the EU is setting rules on memory bandwidth as opposed to energy consumption?

"Besides that, the energy efficiency requirements will be tighter - in this case the energy consumption of the card in relation to its memory bandwidth. According to data NordicHardware has seen from a high-level employee at AMD, current graphics cards are unable to meet these requirements. This includes GPUs like "Cape Verde" and "Tahiti", used in the HD 7700 and HD 7900 series; the same goes for the older "Caicos", used in the HD 6500/6600 and HD 7500/7600 series. Also mentioned is "Oland", a future performance circuit from AMD that according to rumors will be used in the upcoming HD 8800 series."

The details weren't provided in the article but I think they'll be explained later. I think this is the Report and Annex II has some details regarding total energy consumption allowed per year.

"1.1.3. Category D desktop computers and integrated desktop
computers meeting all of the following technical parameters are
exempt from the requirements specified in points 1.1.1 and
1.1.2:

(a) a minimum of six physical cores in the central processing
unit (CPU); and
(b) discrete GPU(s) providing total frame buffer bandwidths
above 320 GB/s; and
(c) a minimum 16GB of system memory; and
(d) a PSU with a rated output power of at least 1000 W.

1.2. 30 months after this Regulation comes into force ---> 1.2.2. Exemption indicated in point 1.1.3 is no longer applicable."
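To make the quoted exemption easier to follow, here is a minimal sketch of the point 1.1.3 logic as a boolean check. This is my own illustration; the function and parameter names are hypothetical, not from the regulation.

```python
def category_d_exempt(cpu_cores: int,
                      gpu_bandwidth_gbs: float,
                      system_memory_gb: int,
                      psu_watts: int) -> bool:
    """Check the Annex II point 1.1.3 exemption for Category D desktops.

    All four conditions must hold simultaneously ("and" in the text).
    Per point 1.2.2, this exemption lapses 30 months after the
    Regulation comes into force.
    """
    return (cpu_cores >= 6                  # (a) min six physical cores
            and gpu_bandwidth_gbs > 320     # (b) frame buffer bandwidth above 320 GB/s
            and system_memory_gb >= 16      # (c) min 16GB system memory
            and psu_watts >= 1000)          # (d) PSU rated at least 1000W

# A 6-core box with a 384GB/s GPU, 16GB RAM and a 1000W PSU qualifies
print(category_d_exempt(6, 384, 16, 1000))  # True
# Exactly 320GB/s does not ("above 320 GB/s" is a strict threshold)
print(category_d_exempt(6, 320, 16, 1000))  # False
```

Note that (b) reads "above 320 GB/s", so 320 GB/s exactly would not qualify.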
 
Last edited:

Qianglong

Senior member
Jan 29, 2006
937
0
0
I'm still not sure I follow. In order to regulate energy consumption, the EU is setting rules on memory bandwidth as opposed to energy consumption?

Maybe in this case the bandwidth is related to the energy consumption of the chip? I mean, GPUs with 300GB/s bandwidth are usually the higher-end and most power-hungry designs...
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
The details weren't provided in the article but I think they'll be explained later. There are 325 pages of info. It's gotta be buried somewhere.
I've used every possible search parameter I can think of. The details aren't in that specific EU document. Not that I doubt Nordic's story, but some hard data would be nice. It's very confusing right now, as if some of the facts were not present.
Maybe in this case the bandwidth is related to the energy consumption of the chip? I mean, GPUs with 300GB/s bandwidth are usually the higher-end and most power-hungry designs...
Sure, but it's not memory transactions that are actually consuming that power. I could hook up GF100 to a 32-bit memory bus, load a high-stress in-cache GPGPU application, and easily have it going to town on power consumption with only minimal memory bandwidth usage. Why not just regulate energy consumption directly?
 
Last edited:

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
Wait, what? That's nonsensical. Cape Verde is nowhere near 320 or 225 GB/s.
 

Andle Riddum

Member
Dec 6, 2011
52
0
0
I might be the only one here that's actually happy about this development.

Graphics cards were getting more power hungry and freaking huge every year. There are a few exceptions to this trend, but it's crazy when a card has a +500W PSU requirement...
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I've used every possible search parameter I can think of. The details aren't in that specific EU document.

Check out this Report. I added some details from there in Post 6.

Wait, what? That's nonsensical. Cape Verde is nowhere near 320 or 225 GB/s.

If you look at the report (see link right above), each G1 to G7 category is allowed its own maximum annual energy consumption. Looks like those cards can't meet the new rules for the total energy consumption allowed in their bus-width categories.

I might be the only one here that's actually happy about this development.

Graphics cards were getting more power hungry and freaking huge every year. There are a few exceptions to this trend, but it's crazy when a card has a +500W PSU requirement...

OK, but no one forces you to buy a 200W GPU or a 500W PSU. You can buy a GTX650 or HD7750 if you want. What if you game 15 hours a week on your HD7750 and I game 1-3 hours a week on my HD7970? This is why you pay for your portion of the electricity used and I pay for mine.

This sounds like a way for the EU to raise taxes on companies that don't comply. If they really cared about saving the environment, they would have introduced an environmental tax on all electricity used, regardless of which devices use it.

Right now the US federal standard is 967 kWh for 416 cycles of a 4.4 cu. ft. clothes dryer, or about 2.3 kWh per dryer load, which takes less than an hour on a standard dryer. A modern GPU like the GTX680/7970 draws less than 200W, i.e. under 200Wh per hour. You realize drying your clothes for less than 1 hour uses more energy than that GPU does in 10 hours? That means if the average person just dries their clothes 4x a month, they use over 9kWh per month, equivalent to roughly 46 hours of gaming on an HD7970.
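The back-of-the-envelope comparison above works out as follows (a quick sketch using the figures in the post; the flat 200W GPU draw is a rough load estimate, not a measured number):

```python
# Figures from the post: US federal dryer standard and a rough GPU load draw
dryer_kwh_per_year = 967   # allowed annual energy over the 416-cycle test
dryer_cycles = 416
gpu_load_watts = 200       # rough draw of a GTX 680 / HD 7970 under load

kwh_per_load = dryer_kwh_per_year / dryer_cycles        # energy of one dryer load
loads_per_month = 4
monthly_dryer_kwh = kwh_per_load * loads_per_month      # monthly drying energy
gaming_hours_equiv = monthly_dryer_kwh * 1000 / gpu_load_watts

print(f"{kwh_per_load:.2f} kWh per load")            # 2.32 kWh per load
print(f"{monthly_dryer_kwh:.1f} kWh per month")      # 9.3 kWh per month
print(f"{gaming_hours_equiv:.0f} hours of gaming")   # 46 hours of gaming
```

So four dryer loads a month roughly equals 46 hours of HD7970 gaming, which is the post's point: the dryer dominates.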

It's pretty amazing to me how the entire planet cares so much about power consumption, yet no one I know personally in the US/Canada air-dries their clothes. It's remarkable, really.
 
Last edited:

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Limiting the memory bandwidth instead of simply capping the power usage is a really dumb way of doing it. At least with power usage capped, better engineering can still improve performance. Limiting the memory bandwidth is just limiting performance.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Check out this Report. I added some details from there in Post 6.
Thanks. That document makes far more sense of the whole situation.

It looks like the EU isn't regulating memory bandwidth, but rather classifying GPUs based on memory bandwidth. The more memory bandwidth you have, the higher your classification. And the higher your classification, the higher the allowed energy consumption. A card with a 192-bit memory bus gets a higher energy allowance than a card with a 128-bit bus, etc.

If I'm reading this right, 320GB/sec isn't some magical barrier/limit. It's only referenced because between months 12 and 30 high-powered workstations are exempted from the energy regulations; having more than 320GB/sec is one of the parameters for getting that exemption. After 30 months, cards with that much memory bandwidth are still allowed; they're just no longer exempted and instead fall under G7.

As for the power requirements themselves, the EU looks to be setting power requirements for computers, with the power requirements varying depending on the configuration of the computer. That's the 7 G levels, as the class of the GPU is used as part of the calculation for a computer's energy allowance.

The one part I don't get is the EU's formula for energy consumption. It looks like it's based on adding up energy consumption when off, when sleeping, and when idle. At no point is active energy consumption being taken into account. This is good news because it's easy to stay under their limits if they're not counting active use.

The EU's formula is 8.76 * (0.55 * <off> + 0.05 * <sleep> + 0.40 * <idle>), with the result in kWh. Presumably they're calculating how often the average computer is turned on?

Anyhow, the strictest allowance for a G7 video card is 136kWh. If we apply the Radeon HD 7970 to that, we get 8.76 * (0.55 * 0W + 0.05 * 3W + 0.40 * 15W) ≈ 54kWh. Unless I'm doing something wrong here (which is entirely possible), modern video cards are pulling well under half of what the EU allows. So AMD and NV would seem to be in the clear.
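That arithmetic can be reproduced directly (a sketch; the off/sleep/idle wattages for the HD 7970 are the estimates quoted above, and the function name is mine):

```python
def tec_kwh(p_off_w: float, p_sleep_w: float, p_idle_w: float) -> float:
    """EU Lot 3 typical energy consumption (TEC), in kWh per year.

    8.76 is 8760 hours per year / 1000, so the formula weights the
    off/sleep/idle power draws by assumed annual duty cycles
    (55% off, 5% sleep, 40% idle) and never counts active load.
    """
    return 8.76 * (0.55 * p_off_w + 0.05 * p_sleep_w + 0.40 * p_idle_w)

# HD 7970 estimates from the post: 0W off, 3W sleep, 15W idle
print(round(tec_kwh(0, 3, 15), 1))  # 53.9 -> comfortably under a 136kWh G7 allowance
```

The key design point is that active (gaming) power never enters the formula, which is why the result lands so far under the allowance.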
 
Last edited:

hyrule4927

Senior member
Feb 9, 2012
359
1
76
If they are worried about power consumption, why on earth did they choose memory bandwidth as the deciding factor? This is pure idiocy.

Edit: Just saw ViRGE's post. I guess that makes more sense. Regardless, I can't understand why they are wasting their time with such a minor contribution to the power consumption of entire countries...
 
Last edited:

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
If they are worried about power consumption, why on earth did they choose memory bandwidth as the deciding factor? This is pure idiocy.

Edit: Just saw ViRGE's post. I guess that makes more sense. Regardless, I can't understand why they are wasting their time with such a minor contribution to the power consumption of entire countries...

Neither nVidia nor AMD made their political contributions to the right party this year?
 

Rifter

Lifer
Oct 9, 1999
11,522
751
126
Hmm if that happens i see alot of European people buying American gpus andshipping them over...

You are kidding yourself if you think AMD or Nvidia is going to waste time and, more importantly, money on this BS. They are just going to make ZERO EU-certified cards, and anyone in the EU will just deal with their CPU's built-in IGP or buy a regular high-end PCIe card from anywhere else on the planet and use it.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,601
2
81
Question 1:

G7 specifies 225 kWh/year. If my GPU consumes 225W, it may only run 1000 hours? Or if 24/7 operation is assumed, it may only consume 25W??? (225000/8760).

Question 2:
What's up with the revised specification, lowering the TEC to 136 kWh? How/when does that come into effect?

Edit:
I don't see load power in the calculation. Maybe load power isn't accounted for at all and the specification only applies to idle, sleep and off power, as the formula for 24/7 operation suggests? That would make waaaay more sense - meaning NordicHardware completely got it wrong.
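The two readings in question 1 are just arithmetic on the figures above, and easy to check (a sketch; variable names are mine):

```python
annual_allowance_kwh = 225   # G7 TEC allowance cited above
gpu_load_watts = 225         # hypothetical GPU load draw
hours_per_year = 8760

# Reading 1: the budget is spent entirely at full load
print(annual_allowance_kwh * 1000 / gpu_load_watts)            # 1000.0 hours
# Reading 2: the budget is spread over 24/7 operation
print(round(annual_allowance_kwh * 1000 / hours_per_year, 1))  # 25.7 W average
```

Both readings are consistent with each other; they only differ in the assumed duty cycle, which is exactly why the 24/7 interpretation looks absurd for a load figure.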
 
Last edited:

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
You are kidding yourself if you think AMD or Nvidia is going to waste time and, more importantly, money on this BS. They are just going to make ZERO EU-certified cards, and anyone in the EU will just deal with their CPU's built-in IGP or buy a regular high-end PCIe card from anywhere else on the planet and use it.

While the EU isn't larger than the North American economy, it has surpassed the US economy. nVidia and AMD will "waste" a lot of money there. It's too big to just drop. That's ludicrous to even suggest.

The biggest drawback with regulations is that they add to the cost of everything. Not just the product but also the government to regulate it.
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
EU cripples future graphics cards [Nordic Hardware Exclusive]

The commission wants to stop dedicated graphics cards of group G7 from going above 320 GB/s


Another fail read from AMD and Euro-teapartiers
 

SPBHM

Diamond Member
Sep 12, 2012
5,056
409
126
This makes no sense...
People should make their own choices regarding how much power their VGA should use...
Also, using memory bandwidth as the only performance reference doesn't seem right...

This might be only for the EU, but it's going to affect other markets with higher costs and fewer options, I think...
 

Rifter

Lifer
Oct 9, 1999
11,522
751
126
While the EU isn't larger than the North American economy, it has surpassed the US economy. nVidia and AMD will "waste" a lot of money there. It's too big to just drop. That's ludicrous to even suggest.

The biggest drawback with regulations is that they add to the cost of everything. Not just the product but also the government to regulate it.

They will still sell just as many GPUs to the EU; they just won't be sold IN the EU. I seriously don't think Nvidia and AMD are going to jump through hoops and design cards specifically around some BS EU law that doesn't even make sense to begin with.

Not to mention Nvidia and AMD have NOTHING to lose by not jumping through the hoops. Want to game? Then guess what, you need to buy a gaming card, which you can still do and have shipped in. It's not like there are other options besides AMD and Nvidia; if both don't play ball with the EU, neither will lose any sales over it.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
They will still sell just as many GPUs to the EU; they just won't be sold IN the EU. I seriously don't think Nvidia and AMD are going to jump through hoops and design cards specifically around some BS EU law that doesn't even make sense to begin with.

Not to mention Nvidia and AMD have NOTHING to lose by not jumping through the hoops. Want to game? Then guess what, you need to buy a gaming card, which you can still do and have shipped in. It's not like there are other options besides AMD and Nvidia; if both don't play ball with the EU, neither will lose any sales over it.

Sure. All video cards in the EU will be sold on the black market. Or, one of them will decide to meet EU specs and sell twice as many cards as they used to. They'll have the market all to themselves. The other one will let that happen. /sarc

Keep in mind this is Western Europe, not Eastern Europe. ;)
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
I might be the only one here that's actually happy about this development.

Graphics cards were getting more power hungry and freaking huge every year. There are a few exceptions to this trend, but it's crazy when a card has a +500W PSU requirement...

Except that GPUs haven't been getting bigger; they actually went down in size on average with this most recent generation. AMD's biggest die hasn't eclipsed the mark they set with the HD2900 @ 420mm^2,

and nVidia's previous behemoth was the GTX260/280 @ 576mm^2; they have yet to eclipse that mark.

Ever since the move up to dual-slot coolers as the norm, we've stayed pretty par for the course as far as GPU size and power requirements go, and many of us would really love to see them go a lot farther.

Also, no one forces you to buy the flagship parts.
 