AMD Announces High-Performance Chip Set

Viditor

Diamond Member
Oct 25, 1999
3,290
0
0
You may want to change the title...the Firestream isn't a chipset, it's a chip package for HPC computing (this isn't consumer stuff).
It uses a 55nm GPU for stream computing on a card and can achieve 500 GFlops (not something you need at home...).
 

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
It's about time they got off of their collective fat asses! I was seriously beginning to wonder/worry about them. The only problem I see with the article is that it says the software would have to be rewritten to take advantage of it. That means it would be years before owning one paid off, and then only if the software authors decided it was worth their effort, which I don't see happening anytime soon.
 

JustaGeek

Platinum Member
Jan 27, 2007
2,827
0
71
Originally posted by: Viditor
You may want to change the title...the Firestream isn't a chipset, it's a chip package for HPC computing (this isn't consumer stuff).
It uses a 55nm GPU for stream computing on a card and can achieve 500 GFlops (not something you need at home...).

You might have to ask the PC World editors to do that...
 

Viditor

Diamond Member
Oct 25, 1999
3,290
0
0
Originally posted by: JustaGeek
You might have to ask the PC World editors to do that...

A fair call...just goes to show that you should never trust these editors. If you read through the article or the Press Release, you'll see that it isn't a chipset.

Myo - it's not as bad as you think...the Firestream is part of the CTM project, and they've been developing the software for over a year now. Keep in mind that this is strictly for High Performance Computing, and not for us at home...

Edit: BTW, this is a nice sample of what Fusion is supposed to have on the die...
 

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
Originally posted by: Viditor
Myo - it's not as bad as you think...the Firestream is part of the CTM project, and they've been developing the software for over a year now. Keep in mind that this is strictly for High Performance Computing, and not for us at home...

Hey, I never said I thought it was for home use, though admittedly my post might have read that way. I was just thinking that even supercomputers need software to run. Of course, if you've got the money to buy a supercomputer, you've probably also got the money to hire people to write the software for it, huh? :laugh:
 

Viditor

Diamond Member
Oct 25, 1999
3,290
0
0
Originally posted by: myocardia
Hey, I never said I thought it was for home use. Admittedly, my post might have read that way. I was just thinking that even supercomputers need software to run. Of course, I suppose that if you've got the money to buy a supercomputer, I guess you've also got the money to hire people to write the software for it, huh?:laugh:

I wasn't critiquing you, mate...just keeping it clear for anyone else reading the thread.
You're right about the money being there for software development. But think about it: we're on the verge of a low-end supercomputer (at least by current standards) that will sell in the $50k range! 500 GFLOPS per card is a massive boost!
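A quick back-of-envelope sketch shows why 500 GFLOPS per card at that price point matters. The card count and system price below are illustrative assumptions, not vendor specs:

```python
# Back-of-envelope: cost per GFLOPS for a hypothetical ~$50k FireStream system.
# All figures here are assumptions for illustration, not AMD's numbers.
card_gflops = 500          # claimed peak throughput per card
system_cost = 50_000       # rough system price in USD (assumed)
cards_per_system = 8       # hypothetical card count

total_gflops = card_gflops * cards_per_system
cost_per_gflops = system_cost / total_gflops
print(f"~{total_gflops} GFLOPS peak, ~${cost_per_gflops:.2f}/GFLOPS")
```

Even with those made-up numbers, the price per GFLOPS lands far below what a traditional CPU-only cluster of the era would cost.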
 

DrMrLordX

Lifer
Apr 27, 2000
22,915
12,988
136
It would be nice if Firestream cards could be utilized in distributed computing (DC) projects and the like. Firestream@home?
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
Originally posted by: Viditor
A fair call...just goes to show that you should never trust these editors. If you read through the article or the Press Release, you'll see that it isn't a chipset.

Myo - it's not as bad as you think...the Firestream is part of the CTM project, and they've been developing the software for over a year now. Keep in mind that this is strictly for High Performance Computing, and not for us at home...

Edit: BTW, this is a nice sample of what Fusion is supposed to have on the die...

This is a relabeled 2900XT. If you think they are going to put a 250 watt GPU on die with a 125 watt CPU...

 

jones377

Senior member
May 2, 2004
462
64
91
Originally posted by: Phynaz
This is a relabeled 2900XT. If you think they are going to put a 250 watt GPU on die with a 125 watt CPU...

It's based on RV670, look at the specs.
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
Originally posted by: jones377
It's based on RV670, look at the specs.

You're right.

So make that a 150 watt GPU and a 125 watt CPU...All on the same die.

Sure.



 

jones377

Senior member
May 2, 2004
462
64
91
Originally posted by: Phynaz
You're right.

So make that a 150 watt GPU and a 125 watt CPU...All on the same die.

Sure.

I never said anything of the sort.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
Originally posted by: Phynaz
You're right.

So make that a 150 watt GPU and a 125 watt CPU...All on the same die.

Sure.

And why not? A system with a "CGPU" running at 275W by today's tech standards. But what about tomorrow, when there's no need to power a discrete GPU? That power would just be diverted to another place in the PC, that's all.
 

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
Originally posted by: keysplayr2003
And why not? A system with a "CGPU" running at 275W by today's tech standards. But what about tomorrow, when there's no need to power a discrete GPU? That power would just be diverted to another place in the PC, that's all.

Well, since it's pretty much impossible to keep a 175 watt CPU from throttling under load, even with high-end heatpipes, how would you propose we keep a 275 watt CPU/GPU cool?
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
Originally posted by: myocardia
Originally posted by: keysplayr2003
And why not? A system with a "CGPU" running at 275W by today's tech standards. But what about tomorrow, when there's no need to power a discrete GPU? That power would just be diverted to another place in the PC, that's all.

Well, since it's pretty much impossible to keep a 175 watt CPU from throttling under load, even with high-end heatpipes, how would you propose we keep a 275 watt CPU/GPU cool?

See bolded above, for the second time. :)

 

Viditor

Diamond Member
Oct 25, 1999
3,290
0
0
Originally posted by: Phynaz

You're right.

So make that a 150 watt GPU and a 125 watt CPU...All on the same die.

Sure.

What you're missing is that Fusion isn't just soldering a graphics card to a cpu...
There is a HUGE power savings from integrating it into the core, both for the CPU as well as the GPU.
IIRC, the reduction in power should make the combined 8 core CGPU come to near current CPU power levels.

Edit: A good analogy is the IMC. It requires (I believe) only about 20% of the power required for a memory controller on the chipset.
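The IMC analogy above is just proportional arithmetic. A sketch with assumed numbers (the 10 W chipset-controller figure is a made-up illustration; only the ~20% ratio comes from the post):

```python
# Rough arithmetic behind the IMC analogy: if an on-die memory controller
# needs ~20% of the power of a chipset-based one, and the chipset-based
# controller draws ~10 W (an assumed figure), the on-die version draws ~2 W.
chipset_mc_watts = 10.0    # assumed chipset memory-controller power
on_die_fraction = 0.20     # the ~20% ratio claimed in the post

on_die_mc_watts = chipset_mc_watts * on_die_fraction
watts_saved = chipset_mc_watts - on_die_mc_watts
print(f"on-die MC ~{on_die_mc_watts:.1f} W (saves ~{watts_saved:.1f} W)")
```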
 

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
Originally posted by: keysplayr2003
See bolded above, for the second time. :)

I honestly fail to see that being of any consequence, unless of course you've heard that in the near future either of the CPU manufacturers will be shipping phase-change coolers as their stock heatsinks. ;)
 

DrMrLordX

Lifer
Apr 27, 2000
22,915
12,988
136
I think the idea is that a 275 Watt CGPU today could be a 150-200 Watt device sometime down the line. Still toasty but . . .
 

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
Yeah, but my point all along was that it's almost impossible to keep a 175 watt CPU from throttling, even with the best high-end heatpipe cooler. So let's say that at 32nm they're able to get this CGPU down to ~150 watts. Are AMD and/or Intel planning on shipping TR Ultra 120 Extremes with these little house heaters? Because with anything less than an Ultra 120 Extreme or a Tuniq Tower, you're gonna end up with a pile of smoldering silicon that used to be a CGPU.
 

Viditor

Diamond Member
Oct 25, 1999
3,290
0
0
Originally posted by: DrMrLordX
I think the idea is that a 275 Watt CGPU today could be a 150-200 Watt device sometime down the line. Still toasty but . . .

It will never be close to that high...
For example, on-board chipset graphics like the Mobility 1150 run at 400 MHz and have a TDP well under 10w...can you think of any discrete card that has come close to that in the last 7 years?
When you integrate the GPU, it drastically changes the power requirements.

Think about it...by integrating, you eliminate the need for another memory controller (like the one on the graphics card), the PCIe signalling device, and the distances you need to send any signal are measured in microns and not inches.
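The argument above is easiest to see as a power budget. Every wattage in this sketch is a made-up illustration (not a real 2900XT or FireStream breakdown); the point is which line items disappear when the GPU moves on-die:

```python
# Hypothetical power budget for a discrete card, illustrating which pieces
# an on-die GPU would no longer need. Every wattage here is an assumption.
discrete_card_watts = {
    "stream processors": 90,          # the part that actually computes
    "memory controller": 20,          # replaced by the CPU's IMC
    "GDDR memory": 25,                # shares system memory instead
    "PCIe signalling": 10,            # on-die links span microns, not inches
    "board losses (VRM, etc.)": 15,
}
kept_on_die = {"stream processors"}   # only the compute units move on-die

discrete_total = sum(discrete_card_watts.values())
integrated_total = sum(w for part, w in discrete_card_watts.items()
                       if part in kept_on_die)
print(f"discrete ~{discrete_total} W vs. integrated compute ~{integrated_total} W")
```

Real savings would be smaller than this toy model suggests (the shared IMC and cache still burn power serving the GPU), but it captures why integration isn't just "soldering a graphics card to a CPU."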
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Good thread. Yep, '08 is going to be interesting, to say the least: AMD's Stream vs. NV's CUDA. This is cool stuff. But to be perfectly honest, both of these are hard to write code for, so progress will be slow at best, even though AMD and NV have a head start on Intel. Intel's Larrabee, using 16 x86 cores, assures that apps will be easy to write, and Intel's solution should offer seamless support. Add in the fact that Larrabee will most likely be using XDR DIMMs, and Intel might be able to pull one more rabbit from the hat.

I am sure we all remember Intel/Rambus before DDR. That didn't work out the best.

But maybe, unlike Intel's 90nm Prescott disaster, XDR with Larrabee and Nehalem could be a smash hit. Not long to wait now at all, compared to the wait for proper Phenom results, which we still don't have.

I remember last Feb. I said Intel's desktop Penryn would be out before Phenom, and the fanboys had a field day with that statement. I told them; they laughed. Now it's OK. In life, laughing is very important. But the best laugh of all is the last laugh!
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
Originally posted by: Viditor
What you're missing is that Fusion isn't just soldering a graphics card to a cpu...
There is a HUGE power savings from integrating it into the core, both for the CPU as well as the GPU.
IIRC, the reduction in power should make the combined 8 core CGPU come to near current CPU power levels.

Edit: A good analogy is the IMC. It requires (I believe) only about 20% of the power required for a memory controller on the chipset.


Do you have any proof of this at all?
 

Viditor

Diamond Member
Oct 25, 1999
3,290
0
0
Originally posted by: Phynaz
Do you have any proof of this at all?

You have to be more specific about which part it is that you don't understand...
If it's the part about Fusion not being a graphics card soldered on a CPU, then maybe this graphic will help you visualize it...note that both the CPU and GPU cores share both the cache and the memory controller.

To visualize the power savings, check out this diagram of a 2900XT graphics card...
Note that the vast majority of this will not be needed in the core GPU (it will mainly be the stream processing units), and that the memory and cache is already available on the CPU.

Oh yeah...the power. Have a look at this graphic (note that the total combined TDP is expected to be 10-100w)
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
The only problem is that you contradict yourself.

You first link to a slide that says mobile and mainstream, and then you talk about a 2900XT.

The 2900XT is neither mobile nor mainstream.

But anyway, I meant: how about proof that putting something on a CPU die automatically lowers its power consumption?

Transistors are transistors, no matter where they reside.

BTW, the first version of Fusion, if it ever happens, will be an MCM.

As far as "expected" total TDP...Expected by whom?

Note my sig, AMD has been "expecting" many things.