If AMD graphics in all upcoming consoles, whence nVidia?


RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
No, he said consoles over the past 6 years.

Ya, I meant the original Wii. My point was that the last 2 of the 3 major consoles had AMD GPUs, and NV still had as fast or faster cards in each generation since the PS3/Xbox 360/Wii launched. Therefore, there is no causation between having a particular brand of GPU inside a console and NV's graphical performance on the PC. Not to mention that future GPUs 5+ years from now will have little in common with what's going to be found in the Wii 2, PS4 and Xbox 720.

Also, you keep coming back to game engines being somewhat optimized for NV GPUs since NV still has a GPU in the PS3. The optimization of a game by the developer takes place once the game is being ported to a particular platform. Since PC games are made on the PC in the first place, there is no optimization that finds its way from the PS3's NV GPU into NV's GPUs on the PC. And like I said, the PS3's GPU has nothing to do with any modern GPU. Not to mention that in one case you code directly to the hardware and in the other you code to the API, so the approaches are totally different too.

The hardware is worlds apart in terms of architecture too. The RSX is based on NV47, with separate (non-unified) pixel and vertex shader pipelines, and can only do DX9. Modern game engines use DX11 + tessellation and the GPUs have unified shader cores. So what possible optimization can carry over? Put it this way: even now, any optimization for NV's GTX 4xx/5xx series has had zero impact on the performance of their older GeForce 8, 9, and 100 series. So there is no way GeForce 7 optimizations have any impact on the performance of modern GPUs.

NV's driver team optimizes a game's performance after release. The DX11 code can also be an all-new layer, as implemented in Crysis 2. In that case, the DX11 path has little to do with the DX9 path on the consoles, since the graphical features of that DX11 path aren't even present on the console. Similarly, as next-generation consoles move to DX11, in 3-5 years we'll have DX12/13 game engines anyway. Again, it won't matter what brand of GPU is found in the next generation of consoles, since next-generation game engines will eventually exceed the DX11 spec as well.

Also, studios don't design game engines to take advantage of one particular GPU; they look at a general set of features available to them (e.g., DX9/10/11, tessellation capability, amount of VRAM available, etc.). Next-generation game engines will be designed to scale with better hardware as it is released, similar to how current engines behave. So as long as NV keeps producing cutting-edge GPUs, they will have no problem whatsoever competing with AMD on the PC, even if there were 10 consoles and all of them had only AMD GPUs inside.
 

psoomah

Senior member
May 13, 2010
416
0
0
You will have to ask them. They seem to believe a Fermi-like arch is the way of the future for their cards.

As I recall, AMD buying ATI and stating their intention to 'fuse' the CPU and GPU put the handwriting on the wall, and is what drove nVidia to develop a competitive, albeit non-x86, technology ASAP and throw the dice on Fermi (a massive new GPGPU-oriented architecture on a new process node).
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
All margins for large contracts like that are razor thin. That's the only way to get them. If the rumors are true, the reason AMD would be able to undercut nVidia is their smaller chip designs. All else being equal, they cost less to manufacture and can therefore be sold for less while still being profitable. Add to that, they use less power, which reduces the cost of everything else (PSU, heatsinks, etc.) and further reduces the overall cost of the platform.
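As a rough back-of-the-envelope sketch of why a smaller die is cheaper per chip (all numbers below are made up for illustration; they are not actual wafer prices, die sizes, defect densities, or yields for either company):

```python
import math

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    """Crude candidate-die estimate: wafer area over die area, minus edge loss."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

def cost_per_good_die(wafer_cost, die_area_mm2, defects_per_mm2):
    """Hypothetical cost per working die using a simple Poisson yield model."""
    candidates = dies_per_wafer(300, die_area_mm2)            # 300 mm wafers
    yield_rate = math.exp(-defects_per_mm2 * die_area_mm2)    # bigger die, worse yield
    return wafer_cost / (candidates * yield_rate)

# Made-up inputs: $5000 per wafer, 0.002 defects per mm^2
print(cost_per_good_die(5000, 330, 0.002))   # hypothetical "smaller" die: roughly $55
print(cost_per_good_die(5000, 520, 0.002))   # hypothetical "larger" die: roughly $133
```

With the same wafer price and defect density, the smaller die wins twice: more candidates fit on each wafer, and each one is less likely to catch a defect.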

M$, Sony, and anyone else at that level in the industry know exactly how much the GPUs cost to manufacture. As a matter of fact, I wouldn't be surprised if they negotiate the price to make the chips with TSMC/GF themselves. IDC might be able to shine some light on this? I know I've been involved with contracts with mega companies. They will go to the manufacturer and negotiate for themselves. They will then come to you, tell you how much the supplier is going to charge you for the item(s), and ask you for a bid to supply it.

That is one of the differences between AMD/NV and the Xbox/Xbox 360. For the original Xbox, MS bought chips from NVIDIA and put them into their consoles, whereas for the 360, MS bought the design from ATI and had the chips manufactured by TSMC (I assume).

Here's a motherboard shot from the original Xbox:

xbox2.jpg


Here's a picture of the Xenos gpu from the 360:

x360Gpu-Xenos.jpg


...notice there is no ATI branding present on the Xenos, but the NVIDIA chips are easily identified in the original Xbox.

IIRC, NV was only interested in selling MS complete chips, which let NV keep the margin gains as manufacturing costs dropped and didn't really benefit MS. As a result, MS looked for another option with the 360.

I'm not sure what the relationship is between NV and Sony for the PS3, but there doesn't appear to be any NV branding on the RSX heat spreader:

rsx-reality-synthesizer-cxd2982.jpg


...or underneath it:

ps3rsx01te2.jpg


I think it's pretty much a given the Xbox "720" and the new Nintendo console will be powered by AMD designs, but the new Sony console is still up in the air.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
As I recall, AMD buying ATI and stating their intention to 'fuse' the CPU and GPU put the handwriting on the wall, and is what drove nVidia to develop a competitive, albeit non-x86, technology ASAP and throw the dice on Fermi (a massive new GPGPU-oriented architecture on a new process node).

Fermi's roots are in the G80 (the 8800-series GPUs) that came out in 2006. Those were in development for years before that, well before AMD bought ATI.

ATI/AMD's GPGPU effort has been pretty lackluster so far. They seem to be taking it more seriously with their next arch, which will look more like Fermi, including, I have heard, scalable tessellation.

Nvidia's answer to things like Llano will eventually be something like Project Denver. I don't know if they will ever achieve the CPU performance of low-end x86, but they may make a passable platform to replace low-end desktops/laptops/netbooks.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
ATI\AMD's GPGPU has been pretty lackluster so far.

Well, actually their GPGPU strategy/marketing is to blame for that. Architecture-wise, with highly parallel GPGPU code, their hardware walks all over NV's in modern GPGPU compute tasks. The problem is you need to work with software developers to make sure programs are created to take advantage of this highly parallel architecture. Look at modern GPGPU compute tasks like MilkyWay@Home, Collatz Conjecture and Bitcoin: AMD is massively superior. But when it comes to more popular GPU-accelerated programs like CS5, AMD is nowhere to be found.
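Just to illustrate what "highly parallel" means here (purely a conceptual sketch in plain Python, not how MilkyWay@Home, Collatz Conjecture or Bitcoin are actually implemented): workloads like Collatz step counting give every element its own completely independent bit of simple integer math, which is exactly the shape of problem that can keep thousands of GPU ALUs busy at once.

```python
def collatz_steps(n):
    """Count the steps it takes n to reach 1 - simple, independent integer math."""
    steps = 0
    while n != 1:
        n = n // 2 if n % 2 == 0 else 3 * n + 1
        steps += 1
    return steps

# Every element is independent of every other, so on a GPU each one would map to
# its own thread; the per-element work is trivial, so throughput depends on how
# many ALUs the chip can keep fed - which is where a very wide architecture shines.
results = [collatz_steps(n) for n in range(1, 100_001)]
print(max(results))
```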
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Well, actually their GPGPU strategy/marketing is to blame for that. Architecture-wise, with highly parallel GPGPU code, their hardware walks all over NV's in modern GPGPU compute tasks. The problem is you need to work with software developers to make sure programs are created to take advantage of this highly parallel architecture. Look at modern GPGPU compute tasks like MilkyWay@Home, Collatz Conjecture and Bitcoin: AMD is massively superior. But when it comes to more popular GPU-accelerated programs like CS5, AMD is nowhere to be found.

AMD's superiority in many of those applications has less to do with their hardware than with the application. Those applications like a lot of cores, regardless of how powerful each core is. For many of the other high-end HPC applications that Tesla addresses, AMD can't compete. Hence the reason they are moving towards a more Fermi-like arch in the future.
 
May 13, 2009
12,333
612
126
Nvidia will never go out of business as long as they continue to make the best desktop GPUs coupled with the best drivers and 3D gaming experience.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Nvidia will never go out of business as long as they continue to make the best desktop GPUs coupled with the best drivers and 3D gaming experience.

Who makes the best GPU is debatable. Really, all that matters is that people think you make the best GPU they can afford. I've seen lots of products do well simply because people thought they were buying something good and it was cheap. We are lucky to have two very competent companies that are competitive.

Drivers? I'm not sure why you think they are better. We could go on forever (or just look up any of a multitude of threads) about problems with both company's drivers.

This is the umpteenth time 3D has been marketed. Maybe it will work this time? Maybe it won't? Either way, by the time it's a factor in the marketplace (if it ever is, in its current form), I doubt anyone will have a lead in the tech. I actually have faith in the open market: if it's profitable, companies will develop it. There's no need for separate proprietary platforms. In a competitive market, that's the worst way to go.
 

Dark Shroud

Golden Member
Mar 26, 2010
1,576
1
0
Nvidia will never go out of business as long as they continue to make the best desktop GPUs coupled with the best drivers and 3D gaming experience.

Best cards with the best drivers? On two separate occasions now Nvidia cards have burned themselves out in part thanks to the drivers. The current top of the line card is the HD 6990.

As for 3D, well, Nvidia has the easier solution. Performance-wise, it's all over the place between Nvidia and the different solutions that work with AMD.
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
Best cards with the best drivers? On two separate occasions now Nvidia cards have burned themselves out in part thanks to the drivers. The current top of the line card is the HD 6990.
As for 3D, well, Nvidia has the easier solution. Performance-wise, it's all over the place between Nvidia and the different solutions that work with AMD.

If you can find one...
 

Schadenfroh

Elite Member
Mar 8, 2003
38,416
4
0
Nvidia will never go out of business as long as they continue to make the best desktop GPUs coupled with the best drivers and 3D gaming experience.

Ahh, a throwback to the days of blatant fanboyism.

FWIW, I agree with you about the quality of their products, but my "favorite" (3Dfx) died a long, long time ago. :(
 

OVerLoRDI

Diamond Member
Jan 22, 2006
5,490
4
81
I imagine that if AMD is able to win all of the next-gen console contracts, their margins from underbidding will be so thin it will only have real value in marketing.

In the short term this is probably true. However, in the long run manufacturing costs of silicon generally drop as processes mature or the GPU/CPU is switched to a new, smaller manufacturing process. (I am not very knowledgeable about silicon manufacturing, but this is generally true, isn't it? Perhaps IdontCare or anyone else can chime in.)

If what I'm saying is true, it might be worthwhile for AMD to take a hit on the first year or two of the console war, then reap the rewards in the long term.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Ahh, a throwback to the days of blatant fanboyism.

FWIW, I agree with you about the quality of their products, but my "favorite" (3Dfx) died a long, long time ago. :(

Imho,

Anyone can fail by not executing and by failing to read markets effectively, but one can see where nVidia sees growth potential: Tegra and Tesla, while still innovating for Quadro and GeForce.

Sadly, 3dfx's business model mainly focused on retail discrete gaming and didn't read markets effectively, and when they couldn't execute in a timely manner, well, it was like a house of cards collapsing in on itself. Some of 3dfx's strengths were incorporated into nVidia's vision, so not all is lost and 3dfx lives on through nVidia.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Yup, AMD's blathering about directly coding for hardware is weird to me. AMD is a company with terribad dev relations, and they are pushing for development that would most benefit a company with great dev relations like Nvidia? Methinks they were shortsighted in this idea due to tessellation performance issues with their current hardware. I expect their tune to change again when they have hardware that looks like Fermi. :D


How was AMD shortsighted? Everything and anything plays perfectly fine on the Radeon's tessellator. Sure, there is a synthetic benchmark and the supposedly Nvidia-sponsored DX11 patch for Crysis 2 that favor Nvidia's tessellator, but that's pretty much it.

Maybe another way of looking at it is to suggest AMD correctly guessed which parts of the GPU to spend more time and resources developing, relative to the useful life of the part. And because AMD did so, I was enjoying next-gen gaming on my 5870 while Nvidia was showing off wood-screw dummy parts.

How much tessellation power do we need when it seems like the vast majority of AAA titles we get in the PC gaming world are console ports these days? When the next gen consoles are out, and assuming they have tessellators, do you think you'll still want to be gaming on a GTX480?

And all of this ignores the fact that ATI/AMD has had a tessellator available in their GPUs for a decade now.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
AMD's superiority in many of those applications has less to do with their hardware than with the application. Those applications like a lot of cores, regardless of how powerful each core is. For many of the other high-end HPC applications that Tesla addresses, AMD can't compete. Hence the reason they are moving towards a more Fermi-like arch in the future.


So you are saying those applications like very highly parallel processors? Isn't that mostly what AMD and Nvidia are pushing GPGPU to be? It sounds to me like it does have something to do with the hardware; after all, that is the type of software these chips (GPUs) are meant to be good at.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
I don't think AMD was short-sighted; they just had similar goals but different paths, resources and road-maps. Eventually, AMD's arch was going to put more of a focus on GPU compute, when it made sense with their strategies and planning.

It's all good because the work that nVidia had done has also helped pave the way for innovation and awareness.

Catalyst Maker said:
I also have to give kudos to Nvidia for coming out with CUDA and pushing the whole GPGPU market in a certain direction
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Sure, there is a synthetic benchmark and the supposedly Nvidia-sponsored DX11 patch for Crysis 2 that favor Nvidia's tessellator, but that's pretty much it.

DetailTessellation11 Sample

This sample, contributed by AMD, demonstrates the use of detail tessellation for improving the quality of material surfaces in real-time rendering applications. Detail Tessellation is typically a faster-performance and higher-quality alternative to per-pixel height map-tracing techniques such as Parallax Occlusion Mapping.

http://www.anandtech.com/show/4260/amds-radeon-hd-6790-coming-up-short-at-150/13

36572.png


Well, even in AMD's own application, GF100 and its derivative chips are faster, and before you say that the High setting is too much, the highest tessellation factor in that application is only x15 ;)
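For a rough sense of scale (an approximation only: the actual D3D11 tessellation patterns are more involved, but triangle amplification grows roughly with the square of the factor):

```python
def approx_triangles(tess_factor):
    """Rough triangle amplification for a tri patch at a uniform integer factor."""
    return tess_factor ** 2

for factor in (1, 15, 32, 64):
    print(f"factor x{factor}: ~{approx_triangles(factor)} triangles per input patch")
```

So x15 is still a fairly light load compared to the x32-x64 range that synthetic benchmarks can push, yet the gap is already visible.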
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Like I said, is there anything not playable? Or is 293 FPS not getting it done on the 5870? :)

I did not say that Nvidia doesn't have a superior tessellator, just that what AMD came out with appears to be plenty for the usable lifetime of the parts. I imagine that by the time the 5870 really cannot run games that require tessellation, its 1GB frame buffer will already have been a limiting factor.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Granted, we haven't seen a game that really uses tessellation the way it should be implemented so far; my hope is for BF3.

If BF3 really uses low-resolution textures (Edit: low polygon count) and tessellation, then I believe the HD 5870 will perform poorly.

If even BF3 doesn't use tessellation that way, then the HD 5870 will have served out its generation after that long in the field ;)
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
How was AMD shortsighted? Everything and anything plays perfectly fine on the Radeon's tessellator. Sure, there is a synthetic benchmark and the supposedly Nvidia-sponsored DX11 patch for Crysis 2 that favor Nvidia's tessellator, but that's pretty much it.

Maybe another way of looking at it is to suggest AMD correctly guessed which parts of the GPU to spend more time and resources developing, relative to the useful life of the part. And because AMD did so, I was enjoying next-gen gaming on my 5870 while Nvidia was showing off wood-screw dummy parts.

How much tessellation power do we need when it seems like the vast majority of AAA titles we get in the PC gaming world are console ports these days? When the next gen consoles are out, and assuming they have tessellators, do you think you'll still want to be gaming on a GTX480?

And all of this ignores the fact that ATI/AMD has had a tessellator available in their GPUs for a decade now.

They were being shortsighted by suggesting developers ditch a common API and instead code to the hardware. Nvidia is clearly the better of the two when it comes to developer relations. Suggesting a course of action that benefits your competitor, because you are currently lacking in an area, is shortsighted.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
So you are saying those applications like very highly parallel processors? Isn't that mostly what AMD and Nvidia are pushing GPGPU to be? It sounds to me like it does have something to do with the hardware; after all, that is the type of software these chips (GPUs) are meant to be good at.

The GP in GPGPU stands for "General Purpose". And if those applications are what AMD is targeting, why the move to a more Fermi-like arch in their upcoming GPUs?

Bitcoin works better on the 5000 series than the 6000 series.