R9 290 series specifications


toyota

Lifer
Apr 15, 2001
12,957
1
0
No, the boost clock is the only clock speed that ever matters. This is true of both Kepler and the 7970GHz Edition - only the boost clock is used. The base clock might be seen in undemanding desktop applications such as Flash games, but if you're playing Crysis 3 you will only see the boost clock. That's why arguing over the base clock is pretty much pointless. It's pointless with both Kepler and Tahiti.

That said, Kepler's GPU Boost is more versatile in that it allows dynamic adjustments based on TDP and temperature, which isn't the case with the 7970GE boost. What generally happens with Kepler cards is that your actual in-game boost is far higher than advertised, and if you exceed the TDP or temperature limits it will dynamically adjust in 13 MHz increments to get temperature/TDP in line. AMD Boost does not do that. But even with Kepler you will always be at the boost speed or higher in 3d applications that require the horsepower - the same is true of the 7970GE. Essentially, you will never see the base clock in something like Crysis 3 or Metro: LL - you will only see boost clocks.

Therefore the base clock never matters - only the boost clock matters in 3d applications. The base clock is only used on the desktop in Flash games, perhaps, as I've mentioned. Personally, I would have liked for AMD to make their version of boost more versatile, as GPU Boost 2.0 is. I've grown to like GPU Boost 2.0, and the concept makes a lot of sense in terms of increasing board longevity. I don't think AMD changed boost at all with the 280X series, but we'll see tomorrow I guess.
lol, undemanding games can drop WAY below the base clock, and I thought you would know that. The base clock simply means the clock it will drop to in demanding situations where heat or TDP keep it from hitting the boost clock. In other words, yes, it should always hit the boost clock if it needs it.
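The 13 MHz bin stepping described a couple of posts up can be sketched as a toy model. All of the clock, temperature, and TDP numbers below are invented for illustration; this is not NVIDIA's actual algorithm, just the shape of the behavior being debated:

```python
# Toy model of Kepler-style GPU Boost: the clock steps down one 13 MHz
# bin whenever temperature or power exceeds its target, and steps back
# up when there is headroom. Constants are illustrative only.

BIN_MHZ = 13

def adjust_clock(clock_mhz, base_mhz, max_boost_mhz, temp_c, power_w,
                 temp_target_c=80, tdp_w=250):
    """Return the next clock bin given current temperature and power."""
    if temp_c > temp_target_c or power_w > tdp_w:
        # Over a limit: drop one bin, but never below the base clock.
        return max(base_mhz, clock_mhz - BIN_MHZ)
    # Headroom available: climb one bin toward the top boost state.
    return min(max_boost_mhz, clock_mhz + BIN_MHZ)

clock = 1006  # start at the advertised boost clock
for temp, power in [(75, 230), (82, 255), (83, 260), (76, 220)]:
    clock = adjust_clock(clock, base_mhz=915, max_boost_mhz=1110,
                         temp_c=temp, power_w=power)
    print(clock)  # -> 1019, 1006, 993, 1006
```

Note how the clock oscillates around the advertised boost rather than sitting at the base clock, which matches the claim that in a demanding 3d load you see boost-or-higher, with small bin adjustments when limits are hit.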
 
Last edited:

sushiwarrior

Senior member
Mar 17, 2010
738
0
71
If I had to guess, to lower 2d power consumption. The base clock *is* used quite frequently on the desktop in 2d apps.

I'm just speculating, though, like I said I don't know if AMD's boost has changed.

Base clock in that sense is rarely used. Tahiti/"Boost 1.0" cards would only drop to base when the temperature or power limit was exceeded; in 2d you have your idle clocks (almost always 300 MHz) and varying UPM states for stuff like video playback, low-power acceleration, etc.

In real world usage the old cards would almost always be at either 2d idle or boost clocks.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
lol, undemanding games can drop WAY below the base clock, and I thought you would know that. The base clock simply means the clock it will drop to in demanding situations where heat or TDP keep it from hitting the boost clock. In other words, yes, it should always hit the boost clock if it needs it.

Actually, I posted just what you stated earlier if you go through my posts. If the 3d application needs the power, both Kepler and AMD boost will give you the advertised boost speed - the Kepler will often be higher than advertised boost.

I posted this:

With a 1070 MHz boost, that means it is faster than the 7970GE. IIRC the 7970 is always at its boost clock in 3d games. Unless you're playing a game from 2006 that doesn't require anywhere near what the GPU offers.

I was speaking strictly in the context of demanding 3d applications in the post you quoted. And yes, Kepler will always be at the advertised boost or higher (generally HIGHER) in those types of applications. However, if you're playing Darksiders or something like that, which would probably run at 60 fps on an 8800 - yes, it is possible to go lower. Again, the context was demanding 3d applications.
 
Last edited:

toyota

Lifer
Apr 15, 2001
12,957
1
0
Actually, I posted just what you stated earlier if you go through my posts. If the 3d application needs the power, both Kepler and AMD boost will give you the advertised boost speed - the Kepler will often be higher than advertised boost.

I posted this:



I was speaking strictly in the context of demanding 3d applications in the post you quoted. And yes, Kepler will always be at the advertised boost or higher (generally HIGHER) in those types of applications. However, if you're playing Darksiders or something like that, which would probably run at 60 fps on an 8800 - yes, it is possible to go lower. Again, the context was demanding 3d applications.
Yeah, but you specifically said FLASH games and undemanding games will use the base clock. That's not always true, as they will drop way below that in many cases.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
1070 boost. Not sure why you're going on and on about it being slower than the GHz, honestly, because it isn't.

With a 1070 MHz boost, that means it is faster than the 7970GE. IIRC the 7970 is always at its boost clock in 3d games. Unless you're playing a game from 2006 that doesn't require anywhere near what the GPU offers. This isn't dissimilar to Kepler, which is always using the boost clock in 3d games. Kepler does have a more versatile boost with 1-bin steppings, but the point remains: the boost clock is the only clock that matters.

You can play semantics with the base clock all day long but 3d applications that need the horsepower will use the boost clock of 1070, hence the 280X is indeed faster than the 7970GE.

One would assume that AMD would set the clocks on the 280X to beat the 770. Even if it's a straight rebadge of the 7970. There's only a couple of percentage points between them.

At $100 less with an extra gig of RAM and Tahiti's O/C'ing prowess, what more can they do to entice people to buy it over a 770?

AMD's boost is just to benefit power usage benchmarks, as best I can tell. nVidia's? Well, we've already had those discussions.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Yes and no. The boost clock is not a guaranteed clock. If it were, they would have used the base clock instead. When the load is light enough, boost works great. When it doesn't work so great, you get something like this as the extreme case:

[attached image: frequency.jpg]

Move the power slider to +20%. Don't worry about boost. It's marketing.
 

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
Yes, the low base clocks are probably due to power efficiency concerns in benchmarks. That's one area that kept getting brought up: "omg, it uses a few tens of watts more power than the competing 680 (with less RAM)".
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
But your HD7970 is not reporting how it boosts. AMD made sure that wouldn't happen, to avoid the situation I can show with nVidia.

How will that make the 280X slower than the 7970GE? Try to stay on point instead of trying to spin as much negativity as possible towards AMD.

You said the 280X will be slower because of the base clock. Others are saying it doesn't matter because of boost. In a day or so, we'll see. Personally, I can't imagine why they'd re-release a slower version of Tahiti. You'd think after this much time they could squeeze some more performance out.
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,475
136
Why would they spec the card as 870/1070 and not 1000/1070, for example, if it was so certain? It just doesn't add up.

And as I showed with the Titan, play 4 minutes of Metro 2033 and all your boost is gone.

You are ignoring power viruses like FurMark. In those cases it's most likely that AMD reverts to 850 or 870 MHz, whichever is the base clock, to stay within a lower TDP. But we all know that FurMark is not representative of any game or benchmark. AMD looks to have focused on improving power efficiency with the R9 280X. In fact, the BIOS dump at videocardz calls the R9 280X "Tahiti XTL B0", so it's more or less a new stepping with a few efficiency tweaks. Hopefully the TDP is around 210W in a stock design.
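The power-virus behavior described above amounts to clamping the clock so estimated board power stays under a TDP cap, with the base clock as the floor. A rough sketch, with made-up wattages and the simplifying assumption that power scales roughly linearly with clock:

```python
# Rough sketch of a PowerTune-style power cap: if the estimated board
# power at the boost clock exceeds the TDP limit, scale the clock back,
# but never below the base clock. Numbers are illustrative only.

def powertune_clock(boost_mhz, base_mhz, est_power_w, tdp_w):
    """Pick the highest clock whose estimated power fits under the cap."""
    if est_power_w <= tdp_w:
        return boost_mhz  # typical games: full boost fits within TDP
    # Power virus (e.g. FurMark): assume power scales ~linearly with
    # clock and throttle proportionally, flooring at the base clock.
    scaled = int(boost_mhz * tdp_w / est_power_w)
    return max(base_mhz, scaled)

print(powertune_clock(1070, 870, est_power_w=200, tdp_w=210))  # -> 1070
print(powertune_clock(1070, 870, est_power_w=280, tdp_w=210))  # -> 870
```

This is why FurMark-style loads land at the base clock while games sit at boost: only a synthetic worst case pushes the estimate over the cap.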
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Well, they have a couple of typos.

R280X up to 2816spu
R270X 4 way Crossfire

AMD needs proof readers. :D
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
So now the review is out. Can we agree the regular R9 280X is slower than a HD7970GE for the same price?

Nice find with the 290X. But all those "up to" listings....

***Why did you edit your post?***

I felt it was an unneeded comment after the results, but added it back because of you.
 
Last edited:

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
So now the review is out. Can we agree the regular R9 280X is slower than a HD7970GE for the same price?

If you want to call 1 or 2 fps slower, fine. I think it's within margin of error. The price isn't the same though. The 7970GHz isn't $299.

Overall though, I am disappointed that they couldn't squeeze a bit more out. I was looking for somewhere in the 10% range. Looks like it's purely improved efficiency. I'd be curious to know what happens with the power set to +20%, or whatever the amount is for the R9's. I wouldn't be surprised if it's being held back a bit by Powertune.


***Why did you edit your post?***
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,475
136
If you want to call 1 or 2 fps slower, fine. I think it's within margin of error. The price isn't the same though. The 7970GHz isn't $299.

Overall though, I am disappointed that they couldn't squeeze a bit more out. I was looking for somewhere in the 10% range. Looks like it's purely improved efficiency. I'd be curious to know what happens with the power set to +20%, or whatever the amount is for the R9's. I wouldn't be surprised if it's being held back a bit by Powertune.

AMD knows Tahiti's potential and has left the partners to bring out custom factory-overclocked cards. To name a few:

HIS R9 280X IceQ X2 - 1050 MHz boost
Sapphire R9 280X Vapor-X - 1070 MHz boost
ASUS R9 280X DirectCU II TOP - 1070 MHz boost
ASUS R9 280X Matrix - 1100 MHz boost

The Sapphire R9 280X TOXIC and MSI R9 280X Lightning will fight it out for the fastest factory-clocked Tahiti XTL. Sapphire's R9 280X TOXIC has a base clock of 1100 MHz and a boost clock not yet decided. If Sapphire gets to 1200 MHz like they did with the Sapphire HD 7970 TOXIC, that would be the fastest-clocked Tahiti. Would be awesome. :thumbsup: But you can bet that card would be priced at USD 350 or more. The card is built like a beast with triple fans. Hopefully we get reviews of that card soon.

http://cdn.wccftech.com/wp-content/uploads/2013/10/Sapphire-Toxic-R9-280X-3G-GDDR5.jpg

But the real deal is going to be the R9 290, which is going to end up around USD 449. That's the GPU which would not break the bank but run close to those high-end cards. In fact, on a clock-for-clock basis the R9 290X would be 5-7% faster than the R9 290, but for at least 100 bucks more.
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
So now the review is out. Can we agree the regular R9 280X is slower than a HD7970GE for the same price?

Nice find with the 290X. But all those "up to" listings.....

So now be a man and admit you were wrong on nearly everything you stated, despite everyone else here telling you so:

1) After-market R9 280X cards will have a premium of $0-20 not in the "$400 range" as you predicted.

2) R9 280X cards will boost to highest bin speeds in 3D games and thus match or exceed HD7970GE.

3) R9 280X will make both GTX760 and GTX770 irrelevant without NV's price drops.

[attached image: perfrel_1920.gif]


[attached image: perfrel_2560.gif]


Just 15-16% separates the reference 780 that costs $625 and a $310 MSI Gaming R9 280X card. :sneaky:

http://www.techpowerup.com/reviews/MSI/R9_280X_Gaming/26.html

Also, to all those people who for nearly 2 years wouldn't admit that HD7970GHz reference power consumption measurements were a waste of time, despite us telling you so - this is for you:

[attached image: power_peak.gif]


Good luck to NV for trying to sell a GTX770 2GB at $399 and 4GB at $449. Those cards' resale values just took a dump.

Glad to see AMD is finally exposing how much of a rip-off NV's 770 cards have been for the longest time.
 
Last edited:

boxleitnerb

Platinum Member
Nov 1, 2011
2,601
2
81
Also, all those people that for nearly 2 years wouldn't admit that HD7970Ghz reference power consumption measurements were a waste of time despite us telling you so...

And this is for you:
http://www.hardwareluxx.de/index.ph...280x-r9-270x-und-r7-260x-im-test.html?start=7
https://www.computerbase.de/artikel...d-radeon-r7-260x-r9-270x-und-280x-im-test/10/
http://www.anandtech.com/show/7400/the-radeon-r9-280x-review-feat-asus-xfx/20

Sometimes it's less, sometimes it's not, depending on the workload.
 

Cloudfire777

Golden Member
Mar 24, 2013
1,787
95
91
And once again, someone doesn't know how to read reviews.

Here, to correct you again RussianSensation
That 7970GHz and R9 280X is looking amazing right?
Here is a hint: Try looking at the efficiency of R9 280X and GTX 770...

So yeah, that is why people are making fun of the poor efficiency of 7970GHz.

Now, I'm out. I have better things to do than to discuss a freaking rebrand.



[attached image: perfwatt.gif]
 
Last edited:

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
And once again, someone doesn't know how to read reviews.

Here, to correct you again RussianSensation
That 7970GHz and R9 280X is looking amazing right?
Here is a hint: Try looking at the efficiency of R9 280X and GTX 770...

So yeah, that is why people are making fun of the poor efficiency of 7970GHz.

Now, I'm out. I have better things to do than to discuss a freaking rebrand.



[attached image: perfwatt.gif]

Wow! With performance capped at the same level the 780 offers, a whole 7% longer battery life! Ohh... wait...
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
Why would they spec the card as 870/1070 and not 1000/1070, for example, if it was so certain? It just doesn't add up.

And as I showed with the Titan, play 4 minutes of Metro 2033 and all your boost is gone.

I haven't encountered a single game that I can't play at the boost clock indefinitely. If the card loses its boost, it means it overheats; to counteract this you can simply set the temperature target higher or ramp up the fan. But completely stock, that's correct - you may end up playing games with your Titan clocked at a measly 836 MHz.
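The trade-off described above (raise the temperature target or the fan speed to hold boost) can be sketched with a toy steady-state thermal model. Every constant here is invented for illustration; the 876/836 MHz pair just mirrors Titan-like boost/base clocks:

```python
# Toy steady-state model: a card holds its boost clock only while the
# equilibrium GPU temperature stays under the temperature target;
# raising the target or the fan speed restores the headroom.

def steady_temp(ambient_c, heat_w, fan_pct):
    """Crude equilibrium temperature: more fan -> better heat removal."""
    cooling_w_per_c = 1.0 + 4.0 * (fan_pct / 100.0)
    return ambient_c + heat_w / cooling_w_per_c

def sustained_clock(boost_mhz, fallback_mhz, temp_target_c, ambient_c,
                    heat_w, fan_pct):
    if steady_temp(ambient_c, heat_w, fan_pct) <= temp_target_c:
        return boost_mhz
    return fallback_mhz

# Stock settings: the card runs too hot and falls out of boost...
print(sustained_clock(876, 836, temp_target_c=80, ambient_c=25,
                      heat_w=150, fan_pct=40))   # -> 836
# ...a higher temperature target holds the boost clock...
print(sustained_clock(876, 836, temp_target_c=85, ambient_c=25,
                      heat_w=150, fan_pct=40))   # -> 876
# ...and so does ramping the fan at the stock target.
print(sustained_clock(876, 836, temp_target_c=80, ambient_c=25,
                      heat_w=150, fan_pct=100))  # -> 876
```

Either knob works in this model because both move the same inequality: equilibrium temperature versus target.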
 

Tweak155

Lifer
Sep 23, 2003
11,448
262
126
And once again, someone doesn`t know how to read reviews.

Here, to correct you again RussianSensation
That 7970GHz and R9 280X is looking amazing right?
Here is a hint: Try looking at the efficiency of R9 280X and GTX 770...

So yeah, that is why people are making fun of the poor efficiency of 7970GHz.

Now, Im out. I have better things to do than to discuss a freaking rebrand.

Clearly not, or else you wouldn't have been here to begin with :D