Uh oh. Remember, the key word is "MAY."


josh6079

Diamond Member
Mar 17, 2006
3,261
it does not mean you won't need a video BIOS/driver to have output.
When did I say that it wouldn't need a video BIOS or driver?
how are you going to output? what are you going to use as a driver if the GPU is integrated as part of your CPU?
Ask AMD, it's not my idea.
Originally posted by: beggerking
Originally posted by: josh6079
What is a "co-processor board?" Can you enlighten me? Specifically, what is a "Radeon CrossFire Edition co-processor board?"
That's from your own quote; ask yourself. That's your big hint.
It's not my quote, it's ATi's. They said:
Originally posted by: ATI in this thread
To build your own latest generation multi-GPU system, start with any existing Radeon® X800 or Radeon® X850 graphics card and a CrossFire Ready motherboard, such as those based on the ATI Radeon® Xpress 200 CrossFire chipset. Then add a Radeon CrossFire Edition co-processor board and plug in the external cable to unite multi-GPU power.
Out of those three components, which one did they refer to as the co-processor?
1. Explain to me: how can you translate data into pixels without a driver?
:confused: Who said that there wasn't going to be a driver?
2. How do you even start your system without some type of video BIOS? You don't have input/output!!
True, but what's your point? I'm sure AMD knows how to make a BIOS for their hardware. I don't see what you're trying to get at with the whole BIOS/driver/input/output charade.
A physics processor is an add-on card; a graphics card is not.
So you don't add a graphics card to your system?
A graphics card is an output device that incidentally includes a GPU to help with processing...
So the GPU isn't the device that outputs, according to your logic? The graphics card can already output without the GPU; it just incidentally happens to be there to help with processing? Is that what you're saying?
It is possible to use a GPU as a co-processor...
Yeah it's possible, you have one, I have one, 99% of the people on this forum have one...
...but we will still need a video card (that eliminates Josh's idea of not needing PCI-E slots).
Again, it's not my theory, AMD is the one turning the gears. What part of me linking previews from AMD did you miss?

Just because it is a dedicated card with its own board does not mean that it isn't a co-processor. If you could find some links that support your theory as to what a co-processor really is--since Wikipedia and ATI don't agree with you--please, post them. I want to see evidence supporting your claims that a GPU is not a co-processor. Until then you'll just have to live with being wrong, since no reputable source--or logic, for that matter--supports your claim.
performance will suffer though, as it does not have its own dedicated memory. (contradicts Josh's idea that there will be a performance gain)
AMD said that there would be a performance gain. Is it that hard to get?

None of this GPU / CPU integration is my idea, I'm just reflecting methods AMD said they may use and what effects may happen.
No performance gain while still requiring a video card makes the possibility of implementing the GPU as a co-processor pretty much useless.
How do you know that it will still require a dedicated graphics card? The technology isn't even out yet and you're already giving the basics of what you'll need while saying that such technology is useless?
It's not a bad idea to have co-processor sockets, but Josh is just engaging in wishful thinking...
??? Go write an e-mail to AMD telling them that such an idea is just wishful thinking and won't happen. See what kind of a response you'll get.
...and being argumentative without any qualification.
:confused:.....:laugh:

Since when does one have to have "qualification" to disagree with you?
 

redbox

Golden Member
Nov 12, 2005
1,021
Originally posted by: beggerking
Originally posted by: redbox

I understand the need for a BIOS with a video card; it's just that the BIOS wasn't included in the list of what a video adapter consists of. As far as the RAMDAC is concerned, since AMD bought ATI I would imagine it wouldn't be that hard to put a chipset-specific RAMDAC on the motherboard.

Would you consider a sound processor a co-processor? It has basic ins and outs. What are your requirements for being a co-processor?

You say it is possible to use a GPU as a co-processor; why would we need a video card then? You would be right that performance would suffer if it had to use the onboard RAM we have right now. What about using an on-die cache system like CPUs use now? Would that speed up performance a bit? I will agree with you that the area where this design is likely to incur bottlenecks is when it has to use main memory, but IMO caching could be used to circumvent this problem. Perhaps AMD is thinking about adding another hierarchy of cache onto this Fusion chip? Would that be a worthwhile road to go down? Granted, the size of the memory wouldn't be very high, but it would have the potential for high bandwidth.

1. A video BIOS is required to initialize the display at startup, and it is specific to the graphics adapter (therefore it's not possible for AMD to put it on the motherboard). The same goes for the RAMDAC.

2. By asking this question you showed you have no idea how graphics cards work; please google it. Thank you.

For the same reason as above, you will need a video card if you use the GPU as a co-processor. You need a BIOS (input and output) to initialize a pathway to output to your monitor, and that job is done by the video card. The GPU would be used much like the Voodoo add-on card, except it would be using system memory (which is likely to be slow).

I asked the question for clarification, and I guess you can't envision a motherboard having an output to the monitor on it. You don't need a video card with an IGP, do you? This would be about the same, at least with respect to having the output on the motherboard. As for your video BIOS point, you wouldn't need that much. Sound needs a BIOS too, doesn't it? How do they have integrated sound on motherboards? They include the needed BIOS software, don't they?

As far as your Google request: I thought Anandtech would be a place to learn about some of these things. Telling me to go search for the information elsewhere is a cop-out. If you know enough to tell us that Wikipedia is wrong, then you should know the information well enough to reasonably teach it, or at least more clearly tell us where AMD's direction is off base. Next time I ask a question I do not expect to see you just say "google it."
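
To put a rough number on the caching idea above: here's a quick back-of-the-napkin sketch in Python of the average-memory-access-time math. The latency and hit-rate figures are placeholders I made up for illustration, not anything AMD has published.

[code]
# Back-of-envelope AMAT (average memory access time) estimate.
# All figures below are hypothetical placeholders, not real AMD specs.

def amat(hit_time_ns, miss_rate, miss_penalty_ns):
    # Average access time = hit time + (miss rate * miss penalty).
    return hit_time_ns + miss_rate * miss_penalty_ns

# A co-processor GPU going straight to system memory on every access:
no_cache = amat(hit_time_ns=0.0, miss_rate=1.0, miss_penalty_ns=70.0)

# The same accesses with a hypothetical on-die cache catching 80% of them:
with_cache = amat(hit_time_ns=2.0, miss_rate=0.2, miss_penalty_ns=70.0)

print(f"system memory only: {no_cache:.1f} ns per access")
print(f"with on-die cache:  {with_cache:.1f} ns per access")
[/code]

Even a modest hit rate cuts the effective access time by a large factor, which is why I think caching could hide a lot of the system-memory penalty.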
 

beggerking

Golden Member
Jan 15, 2006
1,703
Originally posted by: josh6079
None of this GPU / CPU integration is my idea, I'm just reflecting methods AMD said they may use and what effects may happen.

In other words, you exaggerated and misunderstood. AMD meant increasing processing power (or overall system speed, according to appopin), not GPU power.

How do you know that it will still require a dedicated graphics card? The technology isn't even out yet and you're already giving the basics of what you'll need while saying that such technology is useless?
Read my previous comments on GPGPU.

Where did you read that the technology is to increase GPU speed?

It's not a bad idea to have co-processor sockets, but Josh is just engaging in wishful thinking...
??? Go write an e-mail to AMD telling them that such an idea is just wishful thinking and won't happen. See what kind of a response you'll get.

AMD never said to use co-processor sockets for a GPU. In fact, a physics processor example was used. If AMD thinks it's possible to put a GPU in the socket, don't you think they would use that as an example?

------------------------------------------------
In reply to your inner self, aka redbox twin:
You say it is possible to use a GPU as a co-processor; why would we need a video card then?

Why did the Voodoo 1 require a video card? Search Google.
 

redbox

Golden Member
Nov 12, 2005
1,021
Originally posted by: beggerking

------------------------------------------------
In reply to your inner self, aka redbox twin:
You say it is possible to use a GPU as a co-processor; why would we need a video card then?

Why did the Voodoo 1 require a video card? Search Google.

The only reason the Voodoo 1 required a separate video card is because it lacked an onboard VGA controller. Furthermore, this is very old tech that you are bringing up; I don't see how you can apply the limits of a graphics chip made around 1996 to one that will be made around 2008. 3dfx solved the problem in August of 1997 by combining the Voodoo chip with a 2D chip on the same board. So it wasn't like it was some big hurdle to get over.

As a side note, I would appreciate it if you would lay off the personal attacks. If you are here to discuss, then do that; on the other hand, if all you want to do is lob personal attacks, then please go somewhere else.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
Similarly to Josh, you misinterpreted the article. AMD's plan for GPGPU on ATI devices is to use the GPU as part of the CPU processing unit, not to integrate the GPU on the CPU for graphics.
From what I've read, there could be multiple CPU / GPU designs for different tasks. The GPU certainly could be used for more than just graphics processing and that very well may be one of the effects that will occur from creating an integrated, hybrid processing unit.
GPGPU is a plan to use the GPU to increase CPU processing speed.

It's interesting that first you claimed such technology would not give any performance increases, yet now you claim that this technology will increase CPU processing speed.

It isn't going to be some "all or nothing" technology where complete opposite poles of usefulness exist. As shown here, AMD may offer different platforms altogether in order to divide the possibilities of CPU / GPU integration into appropriate sections. To say that they'll integrate the GPU into the CPU without any intention for graphics is a little naive.
In other words, you exaggerated and misunderstood. AMD meant increasing processing power (or overall system speed, according to appopin), not GPU power.
Like I said above, there will probably be different platform designs catering to different uses. The fact that AMD considers a GPU/CPU integration to be an improvement in CPU processing further disproves your previous theory about there being no performance gains from such technology.

As far as your assumption that they'll not consider GPU power--
Originally posted by: AMD HERE
AMD intends to design Fusion processors to provide step-function increases in performance-per-watt relative to today's CPU-only architectures, and to provide the best customer experience in a world increasingly reliant upon 3D graphics, digital media and high-performance computing.
In addition to Fusion, AMD expects to ship integrated platforms with ATI chipsets in 2007. The platforms are expected to empower commercial clients, notebooks, gaming and media computing.
I'm not saying that they won't use some of this technology for increasing CPU operations, but I think it's safe to say that they'll use this technology anywhere they can, whether it be gaming, encoding, etc.
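
For what it's worth, the kind of work GPGPU is aimed at looks like this (a trivial Python sketch; a real implementation would run on the GPU through a stream-processing API, this just shows the shape of the workload):

[code]
# The pattern GPGPU targets: the same independent operation applied
# across a large data set. Plain Python here stands in for what a
# GPU-style co-processor would spread across hundreds of threads.

data = [float(i) for i in range(1_000_000)]

# Element-wise, independent math: no element depends on another, so
# this loop could be split across as many execution units as you have.
squared = [x * x for x in data]

# The final reduction is the mostly-serial part that stays on the CPU.
total = sum(squared)
print(f"sum of squares: {total:.3e}")
[/code]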
Where did you read that the technology is to increase GPU speed?
I never said that it will increase GPU speed alone, and I've already given several links to that very question. Re-read them if you still don't know.
AMD never said to use co-processor sockets for a GPU.
PCI-E is a socket for the graphics co-processor. They're already using "co-processor sockets for GPU."
In fact, a physics processor example was used.
And? A physics processor is just one type of co-processor. A graphics processor is another. If you could understand that simple concept, it would cut down on your excessive posting.
If AMD thinks it's possible to put a GPU in the socket, don't you think they would use that as an example?
1) GPUs are already on a type of socket: the PCI-E slot. That's current technology; it's just that this HTX technology could serve as a replacement for PCI-E. I'm not saying it will, but the possibility is out there to incorporate GPUs into an HTX-oriented socket.

2) It's possible, but why would they show the HTX socket holding the GPU if the accelerators incorporated into the Opteron sockets do the graphics processing?
In reply to your inner self, aka redbox twin:...
Why does everyone forget the fact that we're brothers and think that we're the same person? We've stated openly many times that we're related.

This further demonstrates your poor reading comprehension and shows that you cannot discuss things civilly without calling someone's posts "crap" and telling them to search Google whenever you're hit with a question that proves you wrong. It is apparent that you want to be argumentative, since I am only reiterating what AMD has stated in numerous PRs. If you have such complaints and/or theories as to why this technology won't work, write to AMD/ATI; it's not my idea.

It's also plainly obvious that you can't get over a dispute we had and therefore resort to calling people twins for holding like viewpoints and calling other people's posts "crap". Grow up and discuss things maturely.

Provide these things:

[*] Give me evidence showing how the GPU is not a co-processor.

[*] Tell me which "co-processor board" ATI was talking about in your earlier link.

[*] Tell me where I said that this technology will not need drivers, bios, etc.

[*] Tell me where I said that this technology will only give a graphics performance increase.

[*] Give me a link showing me how any of this upcoming technology is my idea.
 

beggerking

Golden Member
Jan 15, 2006
1,703
0
0
Originally posted by: redbox
Originally posted by: beggerking

------------------------------------------------
In reply to your inner self, aka redbox twin:
You say it is possible to use a GPU as a co-processor; why would we need a video card then?

Why did the Voodoo 1 require a video card? Search Google.

The only reason the Voodoo 1 required a separate video card is because it lacked an onboard VGA controller. ...

Exactly. By making the GPU a co-processor you are basically reverting it back to an add-on card; therefore you will need a separate VGA controller.
 

beggerking

Golden Member
Jan 15, 2006
1,703
0
0
Originally posted by: josh6079

GPGPU is a plan to use the GPU to increase CPU processing speed.

It's interesting that first you claimed such technology would not give any performance increases, yet now you claim that this technology will increase CPU processing speed.

I corrected your notion of coupling the GPU and CPU on the same die to increase graphics speed; stop putting words into my mouth and spouting false info.


Like I said above, there will probably be different platform designs catering to different uses. The fact that AMD considers a GPU/CPU integration to be an improvement in CPU processing further disproves your previous theory about there being no performance gains from such technology.

A platform change won't help; it will require an architectural change to achieve. That is why I do not agree with you.

I never said that it will increase GPU speed alone...

Well, it will increase bus speed, CPU speed, cache speed, memory speed... EXCEPT your GPU speed.

PCI-E is a socket for the graphics co-processor. They're already using "co-processor sockets for GPU."
What ** are you talking about? Do you go out and buy a graphics "card" or a graphics "coprocessor"?

 

redbox

Golden Member
Nov 12, 2005
1,021
0
0
Originally posted by: beggerking
Originally posted by: redbox
Originally posted by: beggerking

------------------------------------------------
In reply to your inner self, aka redbox twin:
You say it is possible to use a GPU as a co-processor; why would we need a video card then?

Why did the Voodoo 1 require a video card? Search Google.

The only reason the Voodoo 1 required a separate video card is because it lacked an onboard VGA controller. ...

Exactly. By making the GPU a co-processor you are basically reverting it back to an add-on card; therefore you will need a separate VGA controller.

Couldn't they include the VGA controller on the motherboard, like 3dfx did with the Voodoo Rush, to solve that problem?
 

Matt2

Diamond Member
Jul 28, 2001
4,762
A GPU is a coprocessor, but not in the sense Josh is talking about.

Can you imagine the complexity of these things if they truly do develop? Just think of slapping a C2D and G80 together.

That's why I don't think a CPU/GPU will take over the enthusiast market for a while. The performance just isn't going to be on par with what Nvidia will have with dedicated graphics cards.

After R600, Nvidia is going to be dictating what goes on in the graphics market for a while. We won't see CPU/GPU hybrids that outperform graphics cards until Nvidia is ready with their own hybrid.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
I corrected your notion of coupling the GPU and CPU on the same die to increase graphics speed
Where?
A platform change won't help; it will require an architectural change to achieve. That is why I do not agree with you.
What do you think a unified GPU / CPU will be?
Well, it will increase bus speed, CPU speed, cache speed, memory speed... EXCEPT your GPU speed.
Define GPU "speed". By "speed" do you mean performance or clock frequency?
What ** are you talking about? Do you go out and buy a graphics "card" or a graphics "coprocessor"?
That's my point. A graphics card is a co-processor.
a GPU is a coprocessor, but not in the sense Josh is talking about.
When I say that the GPU is a co-processor, I mean that it does what Wikipedia says it does:
supplement the functions of the primary processor (the CPU).
In this case, a processor such as a graphics processor also falls into the category of a processor that supplements the functions of the primary processor.

I'm not saying that the first ones will give performance increases, but I have a feeling that they will, either due to more efficient coding or better scaling for multi-cores. This has always been my opinion. Treating it as a "theory" and placing me as the founder of such an idea was a little exaggerated, IMO.
 

beggerking

Golden Member
Jan 15, 2006
1,703
Originally posted by: redbox

Couldn't they include the VGA controller on the motherboard, like 3dfx did with the Voodoo Rush, to solve that problem?

Then that would be a separate IGP.
 

beggerking

Golden Member
Jan 15, 2006
1,703
Originally posted by: josh6079
A platform change won't help; it will require an architectural change to achieve. That is why I do not agree with you.
What do you think a unified GPU / CPU will be?

Your definition of "unified," aka a direct connection between GPU and CPU, would be an example of an architectural change.
Well, it will increase bus speed, CPU speed, cache speed, memory speed... EXCEPT your GPU speed.
Define GPU "speed". By "speed" do you mean performance or clock frequency?
Performance, because without an architectural change you can't do it.
Moreover, CPU/GPU communication is not the bottleneck here.

That's my point. A graphics card is a co-processor.
ummm... okie.... whatever :confused:
You already redefined the definition of a "software flaw"; you might as well redefine a graphics card as well. Whatever. I'm out.
 

redbox

Golden Member
Nov 12, 2005
1,021
0
0
Originally posted by: beggerking
Originally posted by: redbox

Couldn't they include the VGA controller on the motherboard, like 3dfx did with the Voodoo Rush, to solve that problem?

Then that would be a separate IGP.

Would that be a bad way of doing it? It does sound, though, like this is working out to be a way to just increase the speed of the main processor. The current CPUs we have aren't very efficient at handling GPU-type code because they can't process parallel code as well as, say, the X1900 core. I have read that the G80 really puts a strain on our current CPUs, even the Kentsfield cores. I imagine AMD has several different uses in mind for this Fusion concept.

1.) The use of a GPGPU would be very nice for the science community, and servers based on that tech would be fairly powerful.

2.) An increase in IGP performance. With Vista on the way and a turn toward a more multimedia experience, current IGP solutions are going to be stressed quite a bit. Fusion looks like it has a lot it could lend to IGP solutions, as well as CPU functions, in relation to a graphically demanding GUI.

3.) Laptop energy savings. With the CPU, memory controller, and GPU all on one chip, it saves on energy and heat.

Whatever they do, it looks like they would probably have a platform depending on how the computer is going to be used.
AMD/ATI unified development

I also wonder, if we have a GPGPU, how will this affect game development? With the parallel processing power that type of processor would have, could we use it to take some of the load off of the video card?
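
A rough way to think about that offload question is Amdahl's law. Here's a small Python sketch; the fractions and the 10x figure are numbers I invented for illustration, not measurements.

[code]
# Amdahl's-law estimate: if some fraction of a frame's CPU work is
# parallel enough to hand to a GPU-style co-processor, how much does
# the whole frame speed up? All numbers here are made up.

def overall_speedup(parallel_fraction, coprocessor_speedup):
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / coprocessor_speedup)

for frac in (0.3, 0.5, 0.8):
    s = overall_speedup(frac, coprocessor_speedup=10.0)
    print(f"{frac:.0%} of the work offloaded at 10x -> {s:.2f}x overall")
[/code]

The point being: unless a big chunk of the work really is parallel, the co-processor can only help so much.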
 

josh6079

Diamond Member
Mar 17, 2006
3,261
Originally posted by: beggerking
Originally posted by: josh6079
Originally posted by: beggerking
Originally posted by: josh6079
Like I said above, there will probably be different platform designs catering to different uses. The fact that AMD considers a GPU/CPU integration to be an improvement in CPU processing further disproves your previous theory about there being no performance gains from such technology.
A platform change won't help; it will require an architectural change to achieve. That is why I do not agree with you.
What do you think a unified GPU / CPU will be?
Your definition of "unified," aka a direct connection between GPU and CPU, would be an example of an architectural change.
So you agree with me? First you say that only an architectural change will achieve better performance. Then you say that a unified GPU / CPU is an architectural change. Finally you say that I am just practicing "wishful" thinking when I am thinking the same thing as you: an architectural change such as a unified GPU / CPU will increase performance.
Performance, because without an architectural change you can't do it.
Again, what exactly is your debate? That the unified GPU / CPU will not give a performance increase? If so, you're arguing with yourself because you said that such an architectural change would increase performance.
Moreover, CPU/GPU communication is not the bottleneck here.
No, but moving traffic from being routed through a chipset to being accessed directly through cache would certainly be an improvement. Even if they don't use direct access through a unified cache, their proposed HTX socket bypasses the chipset as well. Either technique would be an improvement over the ~8 GB/sec otherwise running to and from the chipset.
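
Just to illustrate the bandwidth point with some quick math (the ~8 GB/sec figure is from above; the direct-path number is a hypothetical stand-in for an HTX or shared-cache link, not an AMD spec):

[code]
# Rough transfer-time comparison for shuttling a shared working set
# between CPU and GPU. Bandwidths are illustrative, not measured.

def transfer_ms(megabytes, gb_per_s):
    # Milliseconds to move `megabytes` of data at `gb_per_s` GB/s.
    return megabytes / 1024.0 / gb_per_s * 1000.0

payload_mb = 256  # arbitrary example working set

for label, bw in [("via chipset (~8 GB/s)", 8.0),
                  ("hypothetical direct path (~20 GB/s)", 20.0)]:
    print(f"{label}: {transfer_ms(payload_mb, bw):.2f} ms for {payload_mb} MB")
[/code]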
Originally posted by: beggerking
Originally posted by: josh6079
That's my point. A graphics card is a co-processor.
ummm... okie.... whatever :confused:
Again, what do you think ATI was talking about when they said:
To build your own latest generation multi-GPU system, start with any existing Radeon®...graphics card and a CrossFire Ready motherboard...add a Radeon CrossFire Edition co-processor board and plug in the external cable to unite multi-GPU power.
Answer the question instead of combating my supported claims with, "ummm... okie...."
You already redefined the definition of a "software flaw"...
:confused:...and that deals with this thread because?

I didn't "re-define" anything. I simply gave you two examples of what I considered to be software flaws. You then claimed them to be, "re-definitions".

IMO the latest ATI drivers have been full of software flaws, even on their own hardware. Software that is incapable of carrying out the functions of a device seems to be a valid example of a software flaw. Just like the two mods in Oblivion that resulted in pink ground: a conflict of software that was incapable of correctly performing its function (a.k.a. flawed).

That opinionated example of mine that you seem to hypocritically carry around in your sig is irrelevant to my classification of GPUs as co-processors, however. The examples I've given pertaining to whether or not the GPU is a co-processor were derived from links you gave me. You still have not provided a single piece of evidence showing that the GPU is not a co-processor. In fact, you haven't given a link that really supports anything you've been claiming. You still have yet to do what I asked earlier so that I can possibly understand where you're coming from:

Provide these things:

[*] Give me evidence showing how the GPU is not a co-processor.

[*] Tell me which "co-processor board" ATI was talking about in your earlier link.

[*] Tell me where I said that this technology will not need drivers, bios, etc.

[*] Tell me where I said that this technology will only give a graphics performance increase.

[*] Give me a link showing me how any of this upcoming technology is my idea.
you might as well redefine a graphics card as well. Whatever.
I'm going off of the Wikipedia definition you supplied, as well as the thread you linked me to earlier. Where is your evidence as to how a GPU is not a co-processor?
Promise?