AMD shakeup combines CPU and GPU business units?

taltamir

Lifer
Mar 21, 2004
13,576
6
76
If you read further into it, the VP of AMD who led CPU design left, so the leader of the GPU design team now gets to lead BOTH teams at once... aka, he does double the work. This doesn't sound too good.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: taltamir
If you read further into it, the VP of AMD who led CPU design left, so the leader of the GPU design team now gets to lead BOTH teams at once... aka, he does double the work. This doesn't sound too good.

Is it that easy to jump from designing one type of technology to the next, and back again?
Something tells me it isn't.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
At that level in the org chart these guys really just need to be wizards at managing people and resources; they aren't involved in the actual technical decisions regarding shaders and cores, etc. They have much smarter (hopefully) people working for them to make those decisions.

They do set the tone of aggressiveness (risk taking) that will trickle down through the rest of their org, though, and that personal attribute is perhaps the single most critical one differentiating the resultant output of the organization. Do they inspire greatness in their organization, or do they create a work environment where people piss and moan about management's decisions?

Of course nothing critical at the 50k-ft level will be done without his approval and buy-in, but I doubt the Bulldozer team is waiting on him to spec out his FMOVUP32 circuit pre-tapeout or else the project is going to slip :laugh:
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: Keysplayr
Originally posted by: taltamir
If you read further into it, the VP of AMD who led CPU design left, so the leader of the GPU design team now gets to lead BOTH teams at once... aka, he does double the work. This doesn't sound too good.

Is it that easy to jump from designing one type of technology to the next, and back again?
Something tells me it isn't.

Rick Bergman is a *genius*.

It seems to me as though he's going to act as a mini-CEO of sorts.

It's true that CPUs and GPUs are very different, but there is an accelerating convergence going on right now. Look at Fusion and that upcoming Intel CPU with an integrated GPU.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Just wondering: is it possible that a general-purpose GPU could replace the CPU?

I know this is a little bit off topic, but this thread got me thinking about whether these two could be combined.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: Keysplayr


Is it that easy to jump from designing one type of technology to the next, and back again?
Something tells me it isn't.

For most of us, no. But we are not designing f-all to start with. Looking forward to the epu!!
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: Just learning
Just wondering: is it possible that a general-purpose GPU could replace the CPU?

I know this is a little bit off topic, but this thread got me thinking about whether these two could be combined.
It will replace the CPU for certain tasks like video encoding and advanced scientific calculations (in fact, it probably has already in certain sectors).

If the GPU proves itself to be better in other areas and NV/AMD add the technology to their chips, the sky is the limit, with the x86 license (or lack thereof) being the glass ceiling.
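To make the GPGPU part concrete, here's a minimal CUDA sketch of the programming model (my own toy example, not anything from NV's or AMD's actual tooling): a SAXPY, y = a*x + y, which is exactly the kind of bulk floating-point arithmetic that scientific codes and encoders are already pushing onto GPUs.

Code:
#include <cstdio>
#include <cuda_runtime.h>

// One GPU thread per element: y[i] = a * x[i] + y[i].
// Thousands of these run concurrently, which is why bulk FP math maps so well to a GPU.
__global__ void saxpy(int n, float a, const float* x, float* y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main()
{
    const int n = 1 << 20;                  // ~1M elements
    const size_t bytes = n * sizeof(float);

    // Fill two arrays on the CPU side.
    float* hx = new float[n];
    float* hy = new float[n];
    for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

    // Copy them to the card, launch the kernel, copy the result back.
    float *dx, *dy;
    cudaMalloc(&dx, bytes);
    cudaMalloc(&dy, bytes);
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, dx, dy);
    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);

    printf("y[0] = %f\n", hy[0]);           // expect 4.0

    cudaFree(dx); cudaFree(dy);
    delete[] hx; delete[] hy;
    return 0;
}

The point is that the same few kernel lines get executed by thousands of threads at once; the CPU only does setup and copies.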
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Originally posted by: SickBeast
Originally posted by: Just learning
Just wondering: is it possible that a general-purpose GPU could replace the CPU?

I know this is a little bit off topic, but this thread got me thinking about whether these two could be combined.
It will replace the CPU for certain tasks like video encoding and advanced scientific calculations (in fact, it probably has already in certain sectors).

If the GPU proves itself to be better in other areas and NV/AMD add the technology to their chips, the sky is the limit, with the x86 license (or lack thereof) being the glass ceiling.

Yep, I have seen some information about GPUs being used for all sorts of calculations.

I guess how well the apps can be written matters too?

But is there any way a GP GPU could run an entire desktop (without the need for an x86-type CPU)?
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: Just learning
Originally posted by: SickBeast
Originally posted by: Just learning
Just wondering: is it possible that a general-purpose GPU could replace the CPU?

I know this is a little bit off topic, but this thread got me thinking about whether these two could be combined.
It will replace the CPU for certain tasks like video encoding and advanced scientific calculations (in fact, it probably has already in certain sectors).

If the GPU proves itself to be better in other areas and NV/AMD add the technology to their chips, the sky is the limit, with the x86 license (or lack thereof) being the glass ceiling.

Yep, I have seen some information about GPUs being used for all sorts of calculations.

I guess how well the apps can be written matters too?

But is there any way a GP GPU could run an entire desktop (without the need for an x86-type CPU)?

It probably could. I'm not sure if we're quite there yet, though, plus they say that the GPU would suck at general code (although I have yet to see a real example of this).
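For what it's worth, the usual counterexample isn't exotic. Here is a rough sketch (my own made-up loop, not from any real codebase) of the kind of "general code" people mean: every iteration depends on the previous one and branches on the data, so there is nothing to hand out to a GPU's thousands of threads, and a single fast out-of-order CPU core wins.

Code:
#include <cstdio>

// A loop-carried dependency plus data-dependent branching.
// Iteration i cannot start until iteration i-1 has finished, so the
// work cannot be spread across a GPU's thousands of threads.
unsigned serial_walk(unsigned seed, int steps)
{
    unsigned state = seed;
    for (int i = 0; i < steps; ++i) {
        state = state * 1664525u + 1013904223u;  // next value depends on the last one
        if (state & 1)                           // branch depends on the data itself
            state ^= state >> 7;
        else
            state += 0x9e3779b9u;
    }
    return state;                                // the whole chain is inherently serial
}

int main()
{
    // 10 million dependent steps; there is no parallelism to exploit here.
    printf("%u\n", serial_walk(42u, 10000000));
    return 0;
}

Pointer chasing through a linked list or parsing a file has the same shape: long dependent chains and unpredictable branches, which is where a big out-of-order core still earns its keep.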
 

akugami

Diamond Member
Feb 14, 2005
6,210
2,552
136
A general purpose GPU won't replace the CPU, at least not the x86 CPU. You're trying to break the x86 monopoly, and to be quite honest, if Intel couldn't break free of a monopoly it owns itself, then nVidia isn't going to be able to do it. I think this monopoly is too difficult to break because it is so entrenched. It's not about technological superiority so much as the sheer momentum and entrenchment of the x86 market.

It doesn't matter that nVidia's GPU would be slower at CPU-type work. It could even be a decent replacement for the x86 CPU. It doesn't need to be great, IMHO, because Joe Consumer doesn't need much computing power: just enough to browse the web, play some movie files and play some music. The x86 CPU is simply too entrenched, and you'd have to be not just a good but an awesome replacement for it in order to pose a credible threat.

One of the reasons AMD purchased ATI was always the convergence of the CPU and GPU onto one die, sold as a one-stop solution for vendors. Intel is doing the same with an increased emphasis on GPUs (Larrabee).

What you'll likely see from AMD and Intel in the future is a multi-core CPU/GPU implementation, and you're dense if you don't see nVidia being scared out of its pants. GPUs are getting better and better every year. Process shrinks are also progressing. While GPUs getting better every year definitely benefits the ATI part of AMD as well as nVidia, it also aids the work going on towards a unified multi-core CPU/GPU. One can definitely see the day when a multi-core CPU/GPU implementation will be good enough to play games at a decent level and only the most demanding games will require a discrete GPU.

With where CPUs are today and how they are advancing, I don't find it a crackpot theory that AMD and Intel could make an 8-core CPU/GPU in two to three years that provides what would be considered mid-level GPU performance (perhaps equaling today's high-end GPUs) coupled with decent CPU performance. A low-end CPU/GPU would be one with two CPU cores and two GPU cores, and higher-end ones would contain upwards of three of each core type in various configurations. The GPU cores would run in an SLI/XFIRE-type config.

OEMs will consider an integrated CPU/GPU a blessing: fewer parts to keep track of and integrate into their assembly lines. Joe Computer couldn't care less; just get them something that performs OK and works. AMD and Intel would happily take more money for these integrated CPU/GPU parts. While they would probably make more on a separate CPU sale plus a separate GPU sale, they would make more overall, since an integrated part guarantees that if you buy an AMD (or Intel) CPU, you're also buying an AMD (or Intel) GPU.

As any impartial observer can see, nVidia should be scared by the possibility of the lower mid-range GPUs being sliced off by AMD and Intel. They could be losing not just integrated graphics sales but a large majority of video card sales from the lower mid-range on down. Their only recourse is to create technologies and software that necessitate the use of nVidia-branded GPUs. That's where CUDA and PhysX come in. However, there's no guarantee that CUDA and PhysX will be here long term either. While nVidia is king of the hill on the GPU front today, that can easily change tomorrow. They must continue to push CUDA and PhysX, and they must also invest in improving their GPU performance.
 
Apr 20, 2008
10,067
990
126
Originally posted by: akugami
A general purpose GPU won't replace the CPU, at least not the x86 CPU. You're trying to break the x86 monopoly, and to be quite honest, if Intel couldn't break free of a monopoly it owns itself, then nVidia isn't going to be able to do it. I think this monopoly is too difficult to break because it is so entrenched. It's not about technological superiority so much as the sheer momentum and entrenchment of the x86 market.

It doesn't matter that nVidia's GPU would be slower at CPU-type work. It could even be a decent replacement for the x86 CPU. It doesn't need to be great, IMHO, because Joe Consumer doesn't need much computing power: just enough to browse the web, play some movie files and play some music. The x86 CPU is simply too entrenched, and you'd have to be not just a good but an awesome replacement for it in order to pose a credible threat.

One of the reasons AMD purchased ATI was always the convergence of the CPU and GPU onto one die, sold as a one-stop solution for vendors. Intel is doing the same with an increased emphasis on GPUs (Larrabee).

What you'll likely see from AMD and Intel in the future is a multi-core CPU/GPU implementation, and you're dense if you don't see nVidia being scared out of its pants. GPUs are getting better and better every year. Process shrinks are also progressing. While GPUs getting better every year definitely benefits the ATI part of AMD as well as nVidia, it also aids the work going on towards a unified multi-core CPU/GPU. One can definitely see the day when a multi-core CPU/GPU implementation will be good enough to play games at a decent level and only the most demanding games will require a discrete GPU.

With where CPUs are today and how they are advancing, I don't find it a crackpot theory that AMD and Intel could make an 8-core CPU/GPU in two to three years that provides what would be considered mid-level GPU performance (perhaps equaling today's high-end GPUs) coupled with decent CPU performance. A low-end CPU/GPU would be one with two CPU cores and two GPU cores, and higher-end ones would contain upwards of three of each core type in various configurations. The GPU cores would run in an SLI/XFIRE-type config.

OEMs will consider an integrated CPU/GPU a blessing: fewer parts to keep track of and integrate into their assembly lines. Joe Computer couldn't care less; just get them something that performs OK and works. AMD and Intel would happily take more money for these integrated CPU/GPU parts. While they would probably make more on a separate CPU sale plus a separate GPU sale, they would make more overall, since an integrated part guarantees that if you buy an AMD (or Intel) CPU, you're also buying an AMD (or Intel) GPU.

As any impartial observer can see, nVidia should be scared by the possibility of the lower mid-range GPUs being sliced off by AMD and Intel. They could be losing not just integrated graphics sales but a large majority of video card sales from the lower mid-range on down. Their only recourse is to create technologies and software that necessitate the use of nVidia-branded GPUs. That's where CUDA and PhysX come in. However, there's no guarantee that CUDA and PhysX will be here long term either. While nVidia is king of the hill on the GPU front today, that can easily change tomorrow. They must continue to push CUDA and PhysX, and they must also invest in improving their GPU performance.

It's Joe the Plumber, I swear!

Joe Computer is a scam.
 

hans007

Lifer
Feb 1, 2000
20,212
18
81
Nvidia can see the writing on the wall.

That is why they are focusing on Tegra. In their latest quarterly report they made a lot of comments about Tegra being a field that is wide open to gain share in. I'm sure they will try to keep it up with Ion chipsets and add-on cards, but eventually those avenues will run out.

I would imagine that in 10 years there just won't be graphics cards at all. Everything will be on one die, and if you want your 3D stuff to run faster you buy a whole new CPU.

I mean, most integer apps run plenty fast now, so the floating point stuff can all run on the GPU part of the chip and you can just buy a new CPU every time...
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: hans007
Nvidia can see the writing on the wall.

That is why they are focusing on Tegra. In their latest quarterly report they made a lot of comments about Tegra being a field that is wide open to gain share in. I'm sure they will try to keep it up with Ion chipsets and add-on cards, but eventually those avenues will run out.

I would imagine that in 10 years there just won't be graphics cards at all. Everything will be on one die, and if you want your 3D stuff to run faster you buy a whole new CPU.

I mean, most integer apps run plenty fast now, so the floating point stuff can all run on the GPU part of the chip and you can just buy a new CPU every time...

Nah, we'll just OC the crap out of it instead :laugh: :p

(I know what you mean, couldn't help myself though, convergence is where it is headed, simply unavoidable)
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: hans007
Nvidia can see the writing on the wall.

That is why they are focusing on Tegra. In their latest quarterly report they made a lot of comments about Tegra being a field that is wide open to gain share in. I'm sure they will try to keep it up with Ion chipsets and add-on cards, but eventually those avenues will run out.

I would imagine that in 10 years there just won't be graphics cards at all. Everything will be on one die, and if you want your 3D stuff to run faster you buy a whole new CPU.

I mean, most integer apps run plenty fast now, so the floating point stuff can all run on the GPU part of the chip and you can just buy a new CPU every time...

That's why the modern GPU designs from Nvidia are becoming increasingly focused on general computing, not just graphics. AMD has the luxury of making both CPUs and GPUs, so it can focus its GPUs on what's important for graphics. Nvidia, OTOH, does not have such a luxury, and is therefore trying to push its GPUs to do double duty, and as a result has a gigantic die that is not nearly as efficient in performance/mm^2 when it comes to graphics, compared to the competition. If the early rumors about the GT300 are correct, it will only extend Nvidia's reach into CPU territory, and predictably, there must be tradeoffs in graphics performance efficiency.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
The ideal is to replace both the GPU and the CPU with threadless chips... they would all show up as one "core" to the OS and scale perfectly no matter how many cores you add. There are technical briefs on how to do something like that, but it would require throwing out x86 and GPU tech and starting from scratch; we are talking about at least 10 years of work for it to even catch up to current processors... It is possible and it might even happen... or it might not. Time will tell.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: munky
Nvidia, OTOH, does not have such a luxury, and is therefore trying to push its GPUs to do double duty, and as a result has a gigantic die that is not nearly as efficient in performance/mm^2 when it comes to graphics, compared to the competition.

That is why things that don't have to be general purpose are, in general, not general purpose.

In terms of xtor count and die size, dedicated hardware for any given task will always, always be superior to a generalized processor employed for the same task, when evaluated by the same metrics of success you are defining.

But we've seen where generalized processors versus dedicated processors ended up in the x86 world...

Where generalized processors are supposed to be superior to dedicated hardware for any given task is cost... volume production of a part that can function in many product segments and applications is supposed to lower the cost per part to the point that it is cost-competitive with (if not outright cheaper than) the smaller/faster parts designed solely for the given product.

Nvidia isn't there yet; they are trying to get there, but these things don't just happen overnight and they don't necessarily happen with profits sprouting up along the way. Nvidia is in investment mode: they are seeding the creation of market segments that will someday (they hope) turn their GPGPUs into a ridiculously high-volume part.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Originally posted by: taltamir
If you read further into it, the VP of AMD who led CPU design left, so the leader of the GPU design team now gets to lead BOTH teams at once... aka, he does double the work. This doesn't sound too good.

Yeah, it does: ATI took over AMD. If we are building chips for the future, I think Dirk may very well be making some very good decisions. We'll just have to see, but I think it's looking better.

AMD just paid way too much for ATI's IP.

 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Originally posted by: hans007
Nvidia can see the writing on the wall.

That is why they are focusing on Tegra. In their latest quarterly report they made a lot of comments about Tegra being a field that is wide open to gain share in. I'm sure they will try to keep it up with Ion chipsets and add-on cards, but eventually those avenues will run out.

I would imagine that in 10 years there just won't be graphics cards at all. Everything will be on one die, and if you want your 3D stuff to run faster you buy a whole new CPU.

I mean, most integer apps run plenty fast now, so the floating point stuff can all run on the GPU part of the chip and you can just buy a new CPU every time...

I am not so sure NV can read. As I said before, the Apple/NV deal was strange, and in the end you will understand completely what happened there.

PowerVR is releasing a new chip for Apple's handhelds. This new chip does OpenCL, and Apple/Imagination/Intel are working hard on it. It's said to be physics- and raytracing-capable, with 2-16 cores. Apple invested heavily in Imagination, the PowerVR company, as did Intel. Of course, all of this came after Apple got OpenCL through. That's why the Apple/NV show.

For those who are unaware, PowerVR is tile-based, same as Larrabee. It's also said that these chips may be coming to the desktop with many more cores.

It's getting interesting. Can't wait to see NV's Ion go up against a PowerVR Ion. That's right, there won't be an NV Ion, because Ion is going SoC and PowerVR is in the game. How's that for getting the same shit you've been pulling on everyone else, NV? You'll have no markets left shortly.

Demos of the chip are available. It's good. Keep in mind the resolution on handhelds.

 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
NV's pain right now has to do with the economy. They cater to the high end and to people who make a lot of money. They make excellent products. I'm not afraid to say that they make the best single GPU right now. The 4890 may beat their best card when overclocked, but that doesn't count.

Their cards are not as efficient per transistor, but apparently going with GDDR5 saves on transistors in the sense that it allows AMD to go with a 128- or 256-bit bus width instead of 512-bit.
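Rough back-of-the-envelope numbers (the transfer rates below are approximate, from memory) show why the narrower bus is good enough: peak bandwidth is just bus width over 8 times the effective data rate, and GDDR5 runs roughly twice as fast per pin as GDDR3.

Code:
#include <cstdio>

// Peak memory bandwidth in GB/s = (bus width in bits / 8) * effective rate in GT/s.
double peak_bw_gbs(int bus_bits, double rate_gtps)
{
    return (bus_bits / 8.0) * rate_gtps;
}

int main()
{
    // Approximate, commonly cited figures for this generation of cards.
    printf("512-bit GDDR3 @ ~2.2 GT/s: ~%.0f GB/s\n", peak_bw_gbs(512, 2.2)); // ~141 GB/s
    printf("256-bit GDDR5 @ ~3.6 GT/s: ~%.0f GB/s\n", peak_bw_gbs(256, 3.6)); // ~115 GB/s
    return 0;
}

Halving the bus width shrinks the memory controller and pad area on the die, which is where the transistor (and board cost) savings come from.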

NV will be fine. Their problem is that there is a lag between the introduction of a high-end GPU and the creation of the low-end and midrange product line. In the case of GT200, IMO their midrange and low-end lineups completely flopped, and they need to learn from that. They may also need to see the benefit in what AMD is doing with its strategy. They may need to make a shift and compete directly with AMD. They should stop going after Intel; it will get them nowhere. They're not at the point where they can take Intel on yet. As others have said, if Intel couldn't break its own x86 dependence with Itanium, NV certainly can't, at least not yet.