Rumored specifications for HD8850/8870 - Launch January 2013 (?)


RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Now we're talking like it's fact, awesome. Can't win this generation, make up numbers for the next.

Great job AMD marketing!

You still have not provided your opinion on:

1) Why AMD needs to stick to 115W power consumption of HD7870 with 8800 series?

2) Why AMD cannot increase die size from 212mm^2 to 270-280mm^2 on 8800 series?

3) Why is a next generation mid-range card that uses 160-170W of power at $299-349 with a similar performance to a GTX670/680 not possible? Tahiti XT and Pitcairn XT do not have the same double precision capabilities. Why can't AMD make a 270mm^2 mid-range HD8870?
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
You still have not provided your opinion on:

1) Why AMD needs to stick to 115W power consumption of HD7870 with 8800 series?

2) Why AMD cannot increase die size from 212mm^2 to 270-280mm^2 on 8800 series?

3) Why is a next generation mid-range card that uses 160-170W of power at $299-349 with a similar performance to a GTX670/680 not possible? Tahiti XT and Pitcairn XT do not have the same double precision capabilities. Why can't AMD make a 270mm^2 mid-range HD8870?

I'm not going to provide an opinion on made up specs. I'm going to wait for the actual cards.

But hey, if anyone here is hoping for a miracle it's me. Let's hope at least one side delivers something compelling this coming round. :thumbsup:
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Maybe they'll accomplish something like they did with Steamroller vs. Bulldozer?

Hot_Chips_AMD_High_Density_Library.png


Now, if they can decrease size and power by 30%! :boom:
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Hopefully AMD has their Boost feature more refined by then or there's an option to turn it off.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,812
7,169
136
I'm not going to provide an opinion on made up specs. I'm going to wait for the actual cards.

But hey, if anyone here is hoping for a miracle it's me. Let's hope at least one side delivers something compelling this coming round. :thumbsup:

-But the whole idea of this thread is to provide opinions on made up specs soooooo....

That being said, there is definitely room for AMD to squeeze this node dry as evidenced by the extremely conservative clocks they put out with the 7xxx series. If between tape out and production the node matured to the point where people were getting 20-40% overclocks, then there is really no saying where it will be by the end of the year.

And yet the glaring absence of the 7990 would indicate that it hasn't gotten THAT much better (not enough improvement to up the performance while lowering the power to put up a good fight against the 690).

I think the jump will be slightly better than what we saw with the 40nm refresh for sure (which was disappointing all around), but people are setting themselves up for disappointment if they think it's going to be a massive jump in performance on AMD's side.

I would cry tears of joy if AMD kicked their "small die" strategy to the curb and went all in with monolithic again. They're finally neck and neck with NV again after all these years, let's make it a good show!
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
But hey, if anyone here is hoping for a miracle it's me. Let's hope at least one side delivers something compelling this coming round. :thumbsup:

I still bet GTX470 900mhz SLI for $80-100 will be nearly as fast as a $500 GTX780/HD8970.

Hopefully AMD has their Boost feature more refined by then or there's an option to turn it off.

:thumbsup: Yup, the current version isn't great.

I think NV also needs to remove Boost with manual overclocking. They should allow the user to go into full override overclocking mode (with voltage control too).
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
I still bet GTX470 900mhz SLI for $80-100 will be nearly as fast as a $500 GTX780.



:thumbsup: Yup, the current version isn't great.

I think NV also needs to remove Boost with manual overclocking. They should allow the user to go into full override overclocking mode (with voltage control too).

I would like to see voltage control return as well, but I think Nvidia is setting a trend by locking it down, one that AMD will likely follow in the future.

Just my personal guess and two cents.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
I still bet GTX470 900mhz SLI for $80-100 will be nearly as fast as a $500 GTX780/HD8970.

If they're only 30-40% faster than a 1360MHz 7970 they'd be about even, but you can't do 900 core in SLI on reference cards; there isn't enough fresh air for both cards. Maybe if you had better cards than mine and were OK with headphone gaming it's possible, but 800-850 is much more likely with decent cards.

Personally I'd be happy with 30-40% more performance than the GHz card, with a wealth of OC potential for improved cooling to reap the benefits of.

For something like that, though, we'll need a much higher shader count with lower clock speeds; these 1GHz cards are for the birds.
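The clock/SLI reasoning above can be put as rough arithmetic. A minimal Python sketch, assuming a ~607MHz reference GTX 470 core clock, roughly linear scaling with clock speed, and ~80% SLI scaling; all of these numbers are illustrative assumptions, not measured benchmarks:

```python
# Back-of-envelope for the GTX 470 SLI claim. All inputs are assumptions:
# reference GTX 470 clock ~607 MHz, linear clock scaling, ~80% SLI scaling.

STOCK_CLOCK = 607.0  # MHz, reference GTX 470 core clock (assumed)

def oc_performance(clock_mhz: float) -> float:
    """Relative performance vs. a stock GTX 470, assuming linear clock scaling."""
    return clock_mhz / STOCK_CLOCK

def sli_performance(single_card: float, scaling: float = 0.8) -> float:
    """Two cards rarely double performance; ~80% SLI scaling is a rule of thumb."""
    return single_card * (1.0 + scaling)

for clock in (850, 900):
    total = sli_performance(oc_performance(clock))
    print(f"{clock} MHz SLI ~ {total:.2f}x a single stock GTX 470")
```

Under these assumptions, 900MHz SLI works out to roughly 2.7x a single stock GTX 470, and 850MHz to roughly 2.5x, which is the ballpark being argued over.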
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I don't know about that. AMD even has dual-BIOS switches to allow for BIOS flashing. I don't see why AMD would want to get rid of voltage control. If your competitor removes an enthusiast feature, it actually makes sense to keep yours in place as a competitive advantage.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I would like to see voltage control return as well, but I think Nvidia is setting a trend by locking it down, one that AMD will likely follow in the future.

Just my personal guess and two cents.

The problem with O/C'ing for the manufacturers is it makes it real hard to differentiate between the different models. Why buy a 680 or 7970 when the next model down will O/C to within a couple of percent performance-wise? If they can undervolt the lower card and lock it, they can control that. Look at the early reviews of the 7870 and 7850, where they couldn't overvolt the 7850. Later on, with 50% O/C's on the 7850 with voltage, who bothered with a 7870 for $100 more?
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
I don't know about that. AMD even has dual-BIOS switches to allow for BIOS flashing. I don't see why AMD would want to get rid of voltage control. If your competitor removes an enthusiast feature, it actually makes sense to keep yours in place as a competitive advantage.

Removal of direct voltage control aids in reducing RMAs.

I can't say for sure why Nvidia did it, whether it was to stifle GK104's potential against a tight-TDP GK110, or because of all those EVGA 570s that popped VRMs due to 900+ core clocks... or some other reason entirely. The overclocking of the 680 isn't that inspiring, but a lot of that has to do with it being clocked high enough to compete with the 7970 at release too.

Edit: We're assuming a lot here too, mostly that the limited voltage control of GK104 reduced its OC potential. However, if you look at KingPin's results with unlocked voltage on air, it's not getting much better; I believe he only reached 1400MHz with a modded BIOS that untethered GK104 from the TDP and voltage restrictions Nvidia placed on it. A lot of 680s can get into the 1300s without that BIOS, however.
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I think HD8870 ~ GTX670-680 level, with flagship cards 30-40% faster than the HD7970GE, is more than enough given that January 2013 is just 13 months after the HD7970 launched. I even think it might be less than that, and that NV and AMD will leave room for a larger performance increase in 2014.
 
Last edited:

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Yep, that combined with decent OC potential would be enough for me. At some point 1.28GB is going to deal me a crushing blow, and the vast improvement in perf/watt will no longer be something that can be easily ignored. I noticed a slight reduction in our monthly power bill when I switched out my 5.5GHz i5-2500k and Tri 470s for this i3 setup.
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
You still have not provided your opinion on:

1) Why AMD needs to stick to 115W power consumption of HD7870 with 8800 series?

Wasn't the entire power consumption debate started by the rumored specs themselves? They list a TDP of 160W for the 8870 versus 175W for the 7870, indicating that the 8800 series won't go over the ~115W power consumption of the 7870.

Maybe they'll accomplish something like they did with Steamroller vs. Bulldozer?

Hot_Chips_AMD_High_Density_Library.png


Now, if they can decrease size and power by 30%! :boom:

Unfortunately those high density libraries actually come from the GPU side of AMD (to my knowledge), so they are almost certainly already using them.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Unfortunately those high density libraries actually come from the GPU side of AMD (to my knowledge), so they are almost certainly already using them.

I probably should have researched it more. You are correct; they are already using it in their GPUs.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,601
2
81
I wonder if (assuming those specs are true) this is due to Rory Read. Is AMD returning to the large dies across the board?
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
I think HD8870 ~ GTX670-680 level, with flagship cards 30-40% faster than the HD7970GE, is more than enough given that January 2013 is just 13 months after the HD7970 launched. I even think it might be less than that, and that NV and AMD will leave room for a larger performance increase in 2014.

Sure, 30%-40% faster than the 7970GE. :rolleyes:
And that's coming from the same person who questioned a 1000MHz GK110 card with 15 SMX...
 

Grooveriding

Diamond Member
Dec 25, 2008
9,108
1,260
126
I don't know about that. AMD even has dual-BIOS switches to allow for BIOS flashing. I don't see why AMD would want to get rid of voltage control. If your competitor removes an enthusiast feature, it actually makes sense to keep yours in place as a competitive advantage.

Yeah, I agree with this and am near certain we will see voltage control back on Nvidia with GK110. The only reason I think it is missing currently is that they couldn't manufacture GK110 and had to redline GK104 to make it competitive. The cards are already near their limits, and voltage control would have wound up killing cards.

You can see evidence of this in some GK104 cards showing early degradation with overclocks on the locked down voltage.

Hopefully we are seeing AMD going back to their ATI days of big dies and taking the GPU performance crown. They didn't let power consumption fears hold them back this round with the 7970GE. AMD putting out 500+mm2 dies is going to be a good thing to keep Nvidia from trying to charge $700 for a GTX 780.
 
Last edited:
Feb 19, 2009
10,457
10
76
You have to factor in that NV gets a lot of their $$ from the HPC market, something AMD lacks. If AMD goes for massive dies and low yields, it will be very expensive, and they cannot compete with NV equally on price.

Interesting times ahead.
 

AtenRa

Lifer
Feb 2, 2009
14,001
3,357
136
I don't see 500mm2+ from AMD soon; I believe they will try to get as much performance/watt as possible from 400mm2+ for starters. This way they will have a cheaper chip with 80-90% the performance of NVIDIA's 500mm2+ Kepler behemoth.
A 400mm2 chip will ensure them a faster card than NVIDIA's GK104 replacement, letting them stay close to $400 MSRP while leaving NVIDIA and GK110 at $500+.

At the same time the Pitcairn replacement (HD8870) at 270-280mm2 could be able to compete against a GK114 by having 90-95% of its performance at $249.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I'm skeptical of them doubling the SP and DP per watt performance.

I'm skeptical of it all. :D If true (just for the sake of discussion), could it mean that they aren't limiting the performance bracket when it comes to compute? Or could it mean that the 8900 will be proportionately faster too?

Still, leaks can often mean that the product is getting closer to release. :crossedfingers: