Sony Has New Console In The Works, AMD Building Graphics


ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
Consoles definitely use an API: OpenGL on the PS3 and DX9 on the Xbox 360. The games themselves do have optimizations to perform better on the weak GPUs, but certain titles still look quite bad.
No one uses OpenGL on the PS3, and on the 360 you wouldn't really recognize the API as DX9. The PS3 is usually programmed using Sony's API, along with NVIDIA's Cg (which is effectively bare metal). The 360 is similarly bare.
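To make the "bare metal" point concrete, here's a rough C sketch; the packet words and names below are entirely hypothetical, purely for illustration. On PC, a call like DrawIndexedPrimitive() has the driver build and validate the GPU command stream for you; on a console, the title often writes the command packets itself.

Code:
/* Hypothetical packet encodings, for illustration only -- not any real
   console SDK. */
#include <stddef.h>
#include <stdint.h>

static uint32_t cmd_buf[256];
static size_t   cmd_len;

static void emit(uint32_t word) { cmd_buf[cmd_len++] = word; }

void draw_indexed(uint32_t state, uint32_t index_count)
{
    emit(0x40000000u | state);        /* made-up "set render state" packet */
    emit(0x80000000u | index_count);  /* made-up "draw indexed" packet */
    /* The buffer then goes more or less straight to the GPU's FIFO; there
       is no driver layer in between to validate, reorder, or translate it. */
}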
 

taltamir

Lifer
Mar 21, 2004
I pray that AMD slips GCN into the PS4. I highly doubt they will, based on the history of using older architectures in consoles for various reasons. My bet is on something VLIW4/67xx-based.
Price and volume are the reasons. The manufacturers need to keep the cost of their devices low.
Using a newer architecture is cheaper than or equal in price to an older one, not more expensive.
The reason for using older architectures is time-to-market: it takes a while to design the console and test it, and then hand it over to developers to start building launch titles.
 

Blitzvogel

Platinum Member
Oct 17, 2010
Using a newer architecture is cheaper than or equal in price to an older one, not more expensive.
The reason for using older architectures is time-to-market: it takes a while to design the console and test it, and then hand it over to developers to start building launch titles.

Hence development kits with interim hardware suitably compatible with the final hardware. VLIW4/5 and GCN are here and out in the open. Sony could go with VLIW4 or 5, where per-transistor performance should be superior (though that requires relatively high instruction-scheduling overhead, no?), but GCN is more flexible and could be made to do GPGPU-like work with relative ease.
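A minimal sketch of that instruction-overhead point, assuming the usual description of the two designs: a VLIW4/5 core relies on the compiler to pack four or five independent operations into each issue bundle, while GCN issues simple vector instructions and lets the hardware hide latency by switching wavefronts.

Code:
/* VLIW-friendly: four independent multiplies can fill one VLIW bundle. */
float packs_well(float a, float b, float c, float d)
{
    float w = a * a, x = b * b, y = c * c, z = d * d;
    return w + x + y + z;
}

/* VLIW-hostile: a serial dependency chain leaves most bundle slots empty,
   so utilization drops. On GCN the same chain just becomes a sequence of
   vector ops, and other wavefronts keep the ALUs busy in the meantime. */
float packs_poorly(float a)
{
    float t = a * a;
    t = t * a;
    t = t * a;
    return t;
}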
 

taltamir

Lifer
Mar 21, 2004
Hence development kits with interim hardware suitably compatible with the final hardware. VLIW4/5 and GCN are here and out in the open. Sony could go with VLIW4 or 5, where per-transistor performance should be superior (though that requires relatively high instruction-scheduling overhead, no?), but GCN is more flexible and could be made to do GPGPU-like work with relative ease.

GCN is here today, yes.
But if they started work on it a year ago, then GCN was not available at the time, and switching to GCN could very well set them back another year (by which point there would again be an even newer architecture).
Hence the time-to-market issues.

However, if they are just starting development now, then it will have GCN, but by the time the console comes out GCN will be obsolete.
 

nenforcer

Golden Member
Aug 26, 2008
Being in the new Nintendo Wii U, PS4 and Xbox 720 may be a coup for AMD, but I'm not sure I like it from a product-differentiation standpoint.

This generation you have studios exclusive to each console manufacturer (Insomniac, SCEA Santa Monica, Naughty Dog [PS3]) / (Epic Games more or less, Bungie, Remedy, Lionhead [Xbox 360]) who are developing on competing, different hardware to try to make the best AAA titles.

If you take this away and have a similar Radeon GPU in both the PS4 and Xbox 720, you could get developers who just develop for the lowest common denominator between the two systems.

Worst case, you get games programmed for the Nintendo Wii U that look no better at all on the PS4 or Xbox 720, and we have a repeat of the Wii shovelware syndrome.

Of course, there will always be studios like id and Kojima Productions who will massively optimize for each platform, but they are few and far between.

It will be interesting to see how this new generation pans out.
 

Dark Shroud

Golden Member
Mar 26, 2010
Well, it helps to also take into account that both systems use PowerPC. In fact, the PS3 and 360 used the same basic core design from IBM. The difference is that MS went with a triple-core design with SMT while Sony added the Cell's SPE cluster. I'm not sure if Sony's single core has SMT.

The Wii U will use a Power7-based CPU, and I only hope Sony and MS do the same.

Hardware has gotten much cheaper; with a little more investment up front they can get an extra few years out of their consoles.
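For reference, the commonly cited thread math for the two CPUs works out as below; the PS3's PPE is in fact 2-way SMT, the same as each of the 360's three cores. A minimal worked example:

Code:
#include <stdio.h>

int main(void)
{
    int xenon = 3 * 2;   /* Xbox 360: 3 PPE-derived cores x 2-way SMT */
    int ppe   = 1 * 2;   /* PS3: 1 PPE x 2-way SMT */
    int spes  = 6;       /* of 8 SPEs, 1 is disabled for yield, 1 runs the OS */
    printf("360: %d hardware threads\n", xenon);
    printf("PS3: %d PPE threads + %d SPEs for games\n", ppe, spes);
    return 0;
}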
 
Dec 30, 2004
and maybe finally push OpenGL/OpenCL to the point where we can game on Linux and DX is but an afterthought...

A boy sure can dream.

DirectX isn't going anywhere. Microsoft's superior development tools, thanks to their experience with Visual Studio and the like, have made theirs the easiest console to develop for, and it's going to stay that way going forward. Sony has an uphill battle to fight.
 

Blitzvogel

Platinum Member
Oct 17, 2010
GCN is here today, yes.
But if they started work on it a year ago, then GCN was not available at the time, and switching to GCN could very well set them back another year (by which point there would again be an even newer architecture).
Hence the time-to-market issues.

However, if they are just starting development now, then it will have GCN, but by the time the console comes out GCN will be obsolete.

Well, my big question is whether or not GCN is necessary if all Sony wants is purely a graphics processor. Beyond the compute capability of GCN, why not use VLIW5 or 4?
 

cplusplus

Member
Apr 28, 2005
Sony's console isn't coming out until late 2013 at the earliest, and possibly early 2014. I haven't heard anything concrete about dev kits being out in the wild, so they have a while before they have to lock down the hardware. I could easily see them going for something along the lines of a 7750/7770 power-wise. That would be cheap enough that they don't have to price the console at $600 again (a lot of that cost last time was due to how expensive Blu-ray drives were at the time), but powerful enough to drive the majority of modern games at at least 720p with a consistent 30fps, and many games at either 720p/60fps or 1080p/30fps, especially with the optimization that would be taking place.
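Back-of-the-envelope arithmetic on those targets (raw pixel rate only, ignoring shading cost) shows why 720p/60 and 1080p/30 are often treated as roughly interchangeable:

Code:
#include <stdio.h>

int main(void)
{
    long p720_30  = 1280L * 720 * 30;   /* ~27.6M pixels/s */
    long p720_60  = 1280L * 720 * 60;   /* ~55.3M pixels/s */
    long p1080_30 = 1920L * 1080 * 30;  /* ~62.2M pixels/s */
    printf("720p30:  %ld pixels/s\n", p720_30);
    printf("720p60:  %ld pixels/s\n", p720_60);
    printf("1080p30: %ld pixels/s\n", p1080_30);
    /* 1080p30 and 720p60 cost about the same in raw fill rate. */
    return 0;
}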
 

Blitzvogel

Platinum Member
Oct 17, 2010
Being in the new Nintendo Wii U, PS4 and Xbox 720 may be a coup for AMD, but I'm not sure I like it from a product-differentiation standpoint.

This generation you have studios exclusive to each console manufacturer (Insomniac, SCEA Santa Monica, Naughty Dog [PS3]) / (Epic Games more or less, Bungie, Remedy, Lionhead [Xbox 360]) who are developing on competing, different hardware to try to make the best AAA titles.

If you take this away and have a similar Radeon GPU in both the PS4 and Xbox 720, you could get developers who just develop for the lowest common denominator between the two systems.

Worst case, you get games programmed for the Nintendo Wii U that look no better at all on the PS4 or Xbox 720, and we have a repeat of the Wii shovelware syndrome.

Of course, there will always be studios like id and Kojima Productions who will massively optimize for each platform, but they are few and far between.

It will be interesting to see how this new generation pans out.

Sorry for the double post.

It's entirely possible we see such a landscape: Wii U with an RV740, NextBox with Juniper or Barts, and the PS4 with Cape Verde or perhaps Pitcairn.

I think it could be quite an interesting lineup to see, to be honest, but it would most likely be uninteresting in terms of taking advantage of each system's real capabilities.
 

cplusplus

Member
Apr 28, 2005
This generation you have studios exclusive to each console manufacturer (Insomniac, SCEA Santa Monica, Naughty Dog [PS3]) / (Epic Games more or less, Bungie, Remedy, Lionhead [Xbox 360]) who are developing on competing, different hardware to try to make the best AAA titles.

With regard to Epic: while Gears of War is a 360 exclusive, AFAIK it still runs on the Unreal Engine, which powers a ton of games on all three systems, so I don't know how exclusive I'd say Epic really is.
 

taltamir

Lifer
Mar 21, 2004
Well, my big question is whether or not GCN is necessary if all Sony wants is purely a graphics processor. Beyond the compute capability of GCN, why not use VLIW5 or 4?

VLIW5 and 4 are actually more efficient than GCN for video games. GCN sacrificed some gaming performance for GPGPU so AMD could compete with Fermi for those corporate dollars. So it actually isn't a bad thing if we get VLIW... However, it's unlikely we'd get a 28nm GPU in there; even 40nm is going to be a stretch. Considering how consoles historically go, I would expect it to have a 65nm GPU.
 

Lonyo

Lifer
Aug 10, 2002
VLIW5 and 4 are actually more efficient than GCN for video games. GCN sacrificed some gaming performance for GPGPU so AMD could compete with Fermi for those corporate dollars. So it actually isn't a bad thing if we get VLIW... However, it's unlikely we'd get a 28nm GPU in there; even 40nm is going to be a stretch. Considering how consoles historically go, I would expect it to have a 65nm GPU.

Considering they are already at 45/40nm (the Xbox 360 is at 45nm, the PS3 at 40nm), 65nm would be ridiculously unlikely for the next-gen consoles.
 

Blitzvogel

Platinum Member
Oct 17, 2010
VLIW5 and 4 are actually more efficient than GCN for video games. GCN sacrificed some gaming performance for GPGPU so AMD could compete with Fermi for those corporate dollars. So it actually isn't a bad thing if we get VLIW... However, it's unlikely we'd get a 28nm GPU in there; even 40nm is going to be a stretch. Considering how consoles historically go, I would expect it to have a 65nm GPU.

I don't see any issue getting something like Juniper or Barts into 28nm production by fall of next year.

I do wonder how small Juniper or Barts would be on 28nm, and how they would compare to Cape Verde in TDP, power consumption, and performance at the same clock speed. Juniper also has only about 2/3 the transistors.

I can see AMD wanting to convince Sony to go with GCN, but chances are that Sony will want control of production of whatever GPU is made for them, which means Sony will want the best performance per die area and clock. Cape Verde is proving to be quite excellent there, but Juniper could be better if put on 28nm.
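Some back-of-the-envelope scaling, using the published die sizes (Juniper ~166 mm² and Barts ~255 mm² at 40nm, Cape Verde ~123 mm² at 28nm) and ideal area scaling; real shrinks recover less than this, since I/O pads and analog blocks don't scale:

Code:
#include <stdio.h>

int main(void)
{
    double scale = (28.0 / 40.0) * (28.0 / 40.0); /* ~0.49 ideal area ratio */
    double juniper_40nm = 166.0;                  /* mm^2, Radeon HD 5770 */
    double barts_40nm   = 255.0;                  /* mm^2, Radeon HD 6870 */
    printf("Juniper at 28nm (ideal): ~%.0f mm^2\n", juniper_40nm * scale);
    printf("Barts at 28nm (ideal):   ~%.0f mm^2\n", barts_40nm * scale);
    /* Cape Verde is ~123 mm^2 at 28nm with ~1.5B transistors, vs Juniper's
       ~1.04B -- which matches the "2/3 the transistors" point above. */
    return 0;
}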
 

CZroe

Lifer
Jun 24, 2001
The Wii U is coming out later this year with some form of RV700-based (Radeon 4xxx series) chip. I would guess they are making it on 40nm?

http://en.wikipedia.org/wiki/Wii_U#Technical_specifications

When was the last time Nintendo had the most advanced console on the market? :D They'll be making even more money now when the new console is out, and the games that usually only make it to the 360 and PS3 are now coming to the Wii U as well.

Uhh... when they made the GameCube? I got my hands on an import well before the Xbox launched. Be careful or you might sound like Carlos Mencia did when he insulted the N64 for being "Soooooo OLD!" when it was only the previous generation at that point, and it had been the most powerful console of its generation. He was talking as if it were an NES, and nothing he said conflicted with that, so I am convinced he thought it was an NES or Sega Genesis or something.
 

AtenRa

Lifer
Feb 2, 2009
VLIW5 and 4 are actually more efficient than GCN for video games.

Up to DX10, yes. GCN is much more efficient in DX11 games, especially in tessellation and compute.


GCN sacrificed some gaming performance for GPGPU so AMD could compete with Fermi for those corporate dollars.

They sacrificed some DX9 and DX10 gaming performance but raised DX11 and GPGPU compute performance.

I believe next-gen consoles will take advantage of tessellation and most of the features that we find in DX11.
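As a rough illustration of why tessellation throughput matters: for a uniformly tessellated triangle patch, the emitted triangle count grows roughly with the square of the edge tessellation factor (exact counts depend on the partitioning mode):

Code:
#include <stdio.h>

int main(void)
{
    for (int f = 1; f <= 16; f *= 2) {
        /* With uniform edge factor f, each edge is split into f segments,
           giving on the order of f*f small triangles per patch. */
        printf("tess factor %2d -> ~%3d triangles per patch\n", f, f * f);
    }
    return 0;
}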
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
Well, my big question is whether or not GCN is necessary if all Sony wants is purely a graphics processor. Beyond the compute capability of GCN, why not use VLIW5 or 4?
PRT: Partially Resident Textures. PRTs make all the sense in the world on a platform with limited VRAM, and a hardware implementation is going to be that much faster.
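A conceptual software analogy of what hardware PRT does (the page size, structures, and names here are mine, not AMD's implementation): the texture is split into pages, only some of which are resident in VRAM; a per-fetch residency check consults a page table, and a miss falls back to an always-resident coarse mip while the page is streamed in.

Code:
#include <stdbool.h>
#include <stdint.h>

#define PAGE_SIZE 64            /* texels per page edge (hypothetical) */

typedef struct {
    bool     resident;
    uint8_t *texels;            /* NULL unless streamed into memory */
} Page;

typedef struct {
    int     pages_x, pages_y;
    Page   *pages;              /* page table for the full-size mip */
    uint8_t fallback_texel;     /* stand-in for a low-res resident mip */
} SparseTexture;

uint8_t sample(const SparseTexture *t, int x, int y)
{
    int px = x / PAGE_SIZE, py = y / PAGE_SIZE;
    const Page *p = &t->pages[py * t->pages_x + px];
    if (p->resident)            /* hardware PRT does this check per fetch */
        return p->texels[(y % PAGE_SIZE) * PAGE_SIZE + (x % PAGE_SIZE)];
    /* Page not in VRAM: return the coarse fallback and (in a real system)
       record a miss so the page can be streamed in. */
    return t->fallback_texel;
}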
 

taltamir

Lifer
Mar 21, 2004
Up to DX10, yes. GCN is much more efficient in DX11 games, especially in tessellation and compute.

What DX11 games?
Also, the definition of a console is "a DRM-laden computer that is obsolete the day it is released."

However, if we do get GCN and tessellation-rich DX11 games in next-gen consoles, I would be very happy.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
If AMD is really smart about this, they will put a stipulation in their contracts with all three console manufacturers requiring a "Graphics by AMD/Radeon Graphics" sticker/logo SOMEWHERE on the console. Having the AMD/Radeon logo in practically every damn living room in America and much of the world would be worth more than whatever any of these three companies are paying them.
 

blckgrffn

Diamond Member
May 1, 2003
If AMD is really smart about this, they will put a stipulation in their contracts with all three console manufacturers requiring a "Graphics by AMD/Radeon Graphics" sticker/logo SOMEWHERE on the console. Having the AMD/Radeon logo in practically every damn living room in America and much of the world would be worth more than whatever any of these three companies are paying them.

Ideally largish and on the front. I believe the GameCube and Wii have this already.
 

MrK6

Diamond Member
Aug 9, 2004
If AMD is really smart about this, they will put a stipulation in their contracts with all three console manufacturers requiring a "Graphics by AMD/Radeon Graphics" sticker/logo SOMEWHERE on the console. Having the AMD/Radeon logo in practically every damn living room in America and much of the world would be worth more than whatever any of these three companies are paying them.
If their marketing team has a single brain cell among them, they will. But this is AMD, and sadly that's a high expectation.
 

taltamir

Lifer
Mar 21, 2004
If AMD is really smart about this, they will put a stipulation in their contracts with all three console manufacturers requiring a "Graphics by AMD/Radeon Graphics" sticker/logo SOMEWHERE on the console. Having the AMD/Radeon logo in practically every damn living room in America and much of the world would be worth more than whatever any of these three companies are paying them.

They should fire the entire AMD marketing department and replace them with you.
I am not being sarcastic; I mean it.

However, in reality AMD marketing is a pile of fail and this will NOT happen. I would bet you money that not a single console will do it... well, I would bet money if I didn't, as a policy, never bet money.
 

Blitzvogel

Platinum Member
Oct 17, 2010
They should fire the entire AMD marketing department and replace them with you.
I am not being sarcastic; I mean it.

However, in reality AMD marketing is a pile of fail and this will NOT happen. I would bet you money that not a single console will do it... well, I would bet money if I didn't, as a policy, never bet money.

The GameCube had an ATI sticker on it, and while I don't remember whether the Wii does too, I know the box has an ATI logo on it (at least in the early days; I don't know if there's an AMD logo now). I want to say the 360 used to have an ATI sticker on it. The kiosks for 360s do advertise that the 360 has a "custom 500 MHz graphics chip by ATI".

But yes, AMD needs to leverage their ubiquity by advertising themselves if they do manage to secure all three contracts for the next generation of systems.