AMD HD7*** series info

Page 13 - AnandTech Forums
Feb 19, 2009
10,457
10
76
Charlie thinks there's no chance of AMD using XDR2 at all. B3D and other places think that leak is fake.

So.. no news yet.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
I find it hard to believe they will actually be using XDR2 too. That stuff has been around for 5 years, and they never bothered with it or even mentioned it as a possibility.
 

Via

Diamond Member
Jan 14, 2009
4,670
4
0
As long as the 5970 stays relevant for a while I'm happy.

And with the current state of gaming, I'm sure it'll outlast the 8800 GTX as far as GPU lifespan goes.
 

Rhezuss

Diamond Member
Jan 31, 2006
4,118
34
91
Via said:
As long as the 5970 stays relevant for a while I'm happy.

And with the current state of gaming, I'm sure it'll outlast the 8800 GTX as far as GPU lifespan goes.

I wouldn't worry about your 5970. This should last you a long time!
 
Mar 11, 2004
23,444
5,852
146
toyota said:
I find it hard to believe they will actually be using XDR2 too. That stuff has been around for 5 years, and they never bothered with it or even mentioned it as a possibility.

Rambus did the specification for the design 5 years ago, but I don't see anyone that makes it, and the memory manufacturers don't exactly get along with Rambus, so AMD would have to find some way of having it made. I could be wrong, but memory doesn't seem to be that big of an issue right now, so I don't see why they wouldn't just go with higher-speed GDDR; there's still a pretty good increase they can see from that. Power doesn't seem like it's going to be a major issue for now either, so the alleged lower power use of XDR2 wouldn't be a necessity.

I think only the PS3 used XDR, and I haven't seen anything suggesting it was much of a success there (the criticisms levied seemed to be more about the PS3 having split memory versus the unified memory on the 360). I don't know who actually fabbed it in that instance (Sony, IBM, or Toshiba), so maybe it would be possible.
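For reference, the raw bandwidth comparison behind this GDDR-vs-XDR2 debate is simple arithmetic: peak bandwidth = effective transfer rate × bus width ÷ 8. A quick sketch; the GDDR5 figure is roughly a 6970-class card, while the XDR2 rate is just an assumed number for illustration, since nothing is confirmed:

```python
def peak_bandwidth_gbs(rate_gtps, bus_width_bits):
    """Peak memory bandwidth in GB/s: transfers per second times bytes per transfer."""
    return rate_gtps * bus_width_bits / 8

# GDDR5 at 5.5 GT/s on a 256-bit bus (roughly HD 6970 territory)
print(peak_bandwidth_gbs(5.5, 256))  # 176.0 GB/s

# XDR2 at an assumed 8.0 GT/s on the same bus width (illustrative only)
print(peak_bandwidth_gbs(8.0, 256))  # 256.0 GB/s
```

The point being that faster GDDR5 on a wider bus can close much of the gap without a new memory technology.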
 

DeathReborn

Platinum Member
Oct 11, 2005
2,786
789
136
Rambus did the specification for the design 5 years ago, but I don't see anyone that makes it, and the memory manufacturers don't exactly get along with Rambus, so AMD would have to find some way of having it made. I could be wrong, but memory doesn't seem to be that big of an issue right now, so I don't see why they wouldn't just go with higher-speed GDDR; there's still a pretty good increase they can see from that. Power doesn't seem like it's going to be a major issue for now either, so the alleged lower power use of XDR2 wouldn't be a necessity.

I think only the PS3 used XDR, and I haven't seen anything suggesting it was much of a success there (the criticisms levied seemed to be more about the PS3 having split memory versus the unified memory on the 360). I don't know who actually fabbed it in that instance (Sony, IBM, or Toshiba), so maybe it would be possible.

The XDR in the PS3 was fabbed by Samsung & Elpida. XDR2 could be made at both those or Global Foundries (possibly). I'm not sure you'll see many others due to Rambus and their litigious history.
 

OVerLoRDI

Diamond Member
Jan 22, 2006
5,490
4
81
I really hope GCN offers some excellent performance and 28nm keeps it cool. My plan is CrossFire/SLI of next-gen 28nm products. If AMD gets there first I'll swoop up 2 7970s, if not, 2 680s, then add water ;)

28nm has been hyped for a very long time. As a result, I am very excited by the leaked numbers I have seen so far for the 7xxx series. If the 7870 offers the performance dictated by its SPU count and clock speed at an incredible TDP, I know what card I'll be recommending in future builds for friends.
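The back-of-the-envelope math for "performance dictated by SPU count and clock speed" is just theoretical peak single-precision throughput: shaders × clock × 2 FLOPs per cycle (one multiply-add per shader per clock). A minimal sketch, using the HD 6970 as a known reference point since the 7870's real numbers are still rumour:

```python
def peak_gflops(shader_count, clock_mhz, flops_per_cycle=2):
    """Theoretical peak single-precision GFLOPS: each shader retires one
    multiply-add (2 FLOPs) per clock cycle."""
    return shader_count * clock_mhz * flops_per_cycle / 1000

# HD 6970: 1536 shaders at 880 MHz -> ~2.7 TFLOPS
print(peak_gflops(1536, 880))  # 2703.36
```

Real-world performance obviously depends on architecture and drivers, but this is the number the leaked spec sheets let you compute.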
 
Mar 11, 2004
23,444
5,852
146
DeathReborn said:
The XDR in the PS3 was fabbed by Samsung & Elpida. XDR2 could be made at both those or Global Foundries (possibly). I'm not sure you'll see many others due to Rambus and their litigious history.

Thanks for the info. I knew Rambus wasn't well liked, but that sounds like they wouldn't have too much trouble if they really wanted to. It could be beneficial in other aspects as well (mobile GPUs; the simple traces, low power, and high speeds could also be good for APUs).

OVerLoRDI said:
I really hope GCN offers some excellent performance and 28nm keeps it cool. My plan is CrossFire/SLI of next-gen 28nm products. If AMD gets there first I'll swoop up 2 7970s, if not, 2 680s, then add water ;)

28nm has been hyped for a very long time. As a result, I am very excited by the leaked numbers I have seen so far for the 7xxx series. If the 7870 offers the performance dictated by its SPU count and clock speed at an incredible TDP, I know what card I'll be recommending in future builds for friends.

I'm hoping GCN is a monster (and Kepler follows suit so we get a good performance and price war going).

While I was a bit disappointed by the earlier leak regarding top-to-bottom GCN on 28nm, I would love some lower-priced, efficient GPUs too. A single-slot, quiet, low-power card offering ~6850 performance for around $100 would be killer.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Thanks for the info. I knew Rambus wasn't well liked, but that sounds like they wouldn't have too much trouble if they really wanted to. It could be beneficial in other aspects as well (mobile GPUs; the simple traces, low power, and high speeds could also be good for APUs).

It seems like these specs are fake, including, or especially, the XDR2. Just for the sake of discussion, though: I imagine that AMD would just license XDR2 from Rambus and get it made themselves at GloFo, or elsewhere.



I'm hoping GCN is a monster (and Kepler follows suit so we get a good performance and price war going).

While I was a bit disappointed by the earlier leak regarding top-to-bottom GCN on 28nm, I would love some lower-priced, efficient GPUs too. A single-slot, quiet, low-power card offering ~6850 performance for around $100 would be killer.

I'm hoping that with the full node jump power usage will come down considerably. I'm afraid though that "power will corrupt". It's like a "cocaine effect". At first it feels really good, but once you start you can't stop.

We'll see soon though.
 

Mopetar

Diamond Member
Jan 31, 2011
8,491
7,746
136
3DVagabond said:
I'm hoping that with the full node jump power usage will come down considerably. I'm afraid though that "power will corrupt". It's like a "cocaine effect". At first it feels really good, but once you start you can't stop.

Knowing AMD, they'll keep it reasonable. Among other reasons to design smaller cards: they'll have fewer yield problems, and they'll want to sell you a bigger card in a year or two, which will be really hard if they eat up most of that power budget in the first release of cards.
 

Madcatatlas

Golden Member
Feb 22, 2010
1,155
0
0
I think you are emphasizing power usage a bit too much, 3DVagabond, and for AMD's sake I hope they don't emphasize it as much as you or some other posters who want "efficient" cards with the same or lower performance.

AMD and Nvidia have to stretch their performance for every new card/generation. If one fails, the other takes the lead and capitalizes on that.

I do hope AMD releases a bomb (in the positive sense) that gives us 50% more performance than the 6970, and I hope Nvidia does the same, or at least tries. For consumers, nothing's better than "more"...
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
Madcatatlas said:
I think you are emphasizing power usage a bit too much, 3DVagabond, and for AMD's sake I hope they don't emphasize it as much as you or some other posters who want "efficient" cards with the same or lower performance.

AMD and Nvidia have to stretch their performance for every new card/generation. If one fails, the other takes the lead and capitalizes on that.

I do hope AMD releases a bomb (in the positive sense) that gives us 50% more performance than the 6970, and I hope Nvidia does the same, or at least tries. For consumers, nothing's better than "more"...

Power consumption is probably more of a side effect of AMD's strategy. That strategy is "small die": they want to make a chip that is efficient in terms of size.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Madcatatlas said:
I think you are emphasizing power usage a bit too much, 3DVagabond, and for AMD's sake I hope they don't emphasize it as much as you or some other posters who want "efficient" cards with the same or lower performance.

I have no interest in cards that perform the same as or slower than the previous gen. The exception would be a midrange card performing the same as an enthusiast card from the prior gen at ~1/2 the power.

Madcatatlas said:
AMD and Nvidia have to stretch their performance for every new card/generation. If one fails, the other takes the lead and capitalizes on that.

I realize this.

Last gen (or refresh, if you prefer) pushed it over the top, though. ~450W cards are ridiculous. I don't consider it progress when performance is improved marginally by using a lot more power. I'm blaming that on 32nm being cancelled, and hoping I'm right.

Madcatatlas said:
I do hope AMD releases a bomb (in the positive sense) that gives us 50% more performance than the 6970, and I hope Nvidia does the same, or at least tries. For consumers, nothing's better than "more"...

Me too. I hope it does this more efficiently than the last gen. That's real progress, in my book.

I realize I put a higher emphasis on efficiency than most do, and it's OK that not everyone sees it the same way. It is important, though. With more efficiency come lower prices: smaller, better-yielding chips; fewer power components; smaller, less expensive cooling solutions; a smaller PSU; less need for expensive, elaborate cases. There are lots of positives for the consumer in more efficient designs, even before you consider the electric bill or the environment.
 
Feb 19, 2009
10,457
10
76
Mid-range, efficiency is important (for brand-name PCs from Dell/HP/OEMs). High-end, I think enthusiasts are fine with power hogs as long as the performance is there. It's a general statement, but the GTX 480 and 580 sold well enough, not to mention the 590 and 6990.

Personally, I aim at the best perf/$ and perf/watt.

It's going to be interesting to see how the 28nm high-end turns out, whether it's going to be another balls-to-the-wall push for as much performance as they can get, disregarding power. I think it may be, given how fiercely competitive both companies have been and the general acceptance of high-power, high-performance GPUs.
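The perf/$ and perf/W yardsticks mentioned above are easy to formalise. A minimal sketch with made-up benchmark numbers (none of these figures are real measurements):

```python
def value_metrics(avg_fps, price_usd, board_power_w):
    """Return (fps per dollar, fps per watt) for a card in one benchmark run."""
    return avg_fps / price_usd, avg_fps / board_power_w

# Two hypothetical cards measured in the same benchmark
card_a = value_metrics(avg_fps=60.0, price_usd=369.0, board_power_w=250.0)  # ~0.163 fps/$, 0.24 fps/W
card_b = value_metrics(avg_fps=55.0, price_usd=299.0, board_power_w=151.0)  # ~0.184 fps/$, ~0.364 fps/W
print(card_a, card_b)
```

By both metrics the slightly slower hypothetical card wins, which is exactly the trade-off being argued in this thread.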
 
Mar 11, 2004
23,444
5,852
146
3DVagabond said:
It seems like these specs are fake, including, or especially, the XDR2. Just for the sake of discussion, though: I imagine that AMD would just license XDR2 from Rambus and get it made themselves at GloFo, or elsewhere.

3DVagabond said:
I'm hoping that with the full node jump power usage will come down considerably. I'm afraid though that "power will corrupt". It's like a "cocaine effect". At first it feels really good, but once you start you can't stop.

We'll see soon though.

Good point.

We'll see. AMD's track record over the last few years suggests they do put a considerable emphasis on power usage (well, at least for the single-GPU stuff), so let's hope that keeps up.

I'm too lazy to read the whole thread - any update on when these will be released?

Nope. Only thing we know so far is that AMD has said they'll have 28nm GPUs out this year, but that could mean just about anything.
 

dangerman1337

Senior member
Sep 16, 2010
384
45
91
http://www.xbitlabs.com/news/graphics/display/20110913141155_AMD_Demonstrates_Radeon_HD_7000_Southern_Islands_at_Event.html

At its own media event in San Francisco, California, Advanced Micro Devices has showcased a system featuring its next-generation AMD Radeon HD 7000-series graphics processing unit (GPU) code-named Southern Islands. The demonstration is a proof-of-concept that AMD is on-track to deliver its first graphics chips made using 28nm process technology.

The company remained tight-lipped about specifications and architectural peculiarities of the demonstrated Southern Islands graphics processor. Everything that is known at present is that the company's 28nm GPUs are already fully functional and the company is ready to formally introduce them.

Unfortunately, given the rumours that Taiwan Semiconductor Manufacturing Company has issues with 28nm production ramp and recently even hiked the quotes on 28nm wafers, it is unclear whether AMD will be able to release its new GPUs in volume, or those will be limited edition products.

"Today, we provided an early look at our upcoming 28nm next-gen notebook discrete GPU, driving Codemasters’ cutting-edge driving simulator, Dirt 3, demonstrating that we have working 28nm technology in house and already delivering a great gaming experience," a statement by AMD reads.

The 28nm generation of AMD's graphics processors will be rather broad. In fact, it is rumoured that even within Southern Islands family there will be chips with VLIW4 architecture as well as more progressive so-called GCN (graphics core next) architecture.
 

Will Robinson

Golden Member
Dec 19, 2009
1,408
0
0
Tahiti XT, a.k.a. Radeon HD 7970, is shaping up to be a kickass card. :p
Wonder if NVDA will be able to put up a competitor soon after its launch?
If they get 28nm right, even with only an improved Fermi backend with some tweaks, it oughta be very fast, run cooler, and be easier on wattage.
I hope quieter has been considered as well; GPUs still seem to be the noisiest thing in my rig. Looks like next year for GCN high-end cards tho. :\
 

m3t4lh34d

Senior member
Oct 23, 2008
203
0
0
Will Robinson said:
Tahiti XT, a.k.a. Radeon HD 7970, is shaping up to be a kickass card. :p
Wonder if NVDA will be able to put up a competitor soon after its launch?
If they get 28nm right, even with only an improved Fermi backend with some tweaks, it oughta be very fast, run cooler, and be easier on wattage.
I hope quieter has been considered as well; GPUs still seem to be the noisiest thing in my rig. Looks like next year for GCN high-end cards tho. :\


Ya think that might be because they pack more transistors than anything else in your rig, thus requiring more cooling? heh ;)