[ArsTechnica] Next-gen consoles and impact on VGA market


KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
So would moving consoles away from custom chips and towards commodity chips create an environment where game developers make future games (for future consoles) that port over to PCs much better than today's typical console ports? If developers build a game to run on a commodity PC-style CPU and GPU combo, I would guess it would be much easier to port to a PC and would run better there?
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
Personally, I would be far more excited about this new type of console if it meant they would release new hardware every 3-5 years instead of every 10. If you think about it, their R&D costs are drastically cut by using pre-made hardware, and their overall costs will be far lower as well.

I can see a future for gaming where consoles don't exist and CPUs with GPUs on die are powerful enough to run games at acceptable levels. Once we reach that level of performance, consoles will cease to exist as a platform, because everyone's $500 PC will have the exact same CPU/hardware and will be able to run games. Developers will only have to optimize for a single chip, and life will be good.

It really kinda puts a damper on the whole hardware enthusiast thing, but overall I think it would be good for the future of gaming.
 

njdevilsfan87

Platinum Member
Apr 19, 2007
2,341
264
126
I'll get a Wii U just to play Zelda, Metroid, and Mario Kart in 1080p. Other than that, I won't bother. I'll just keep my 360 as the dedicated Netflix player, and everyone else can go play Call of Dogcrap 10 with slightly better textures on the next-gen consoles.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
So would moving consoles away from custom chips and towards commodity chips create an environment where game developers make future games (for future consoles) that port over to PCs much better than today's typical console ports? If developers build a game to run on a commodity PC-style CPU and GPU combo, I would guess it would be much easier to port to a PC and would run better there?

Yes, yes it would.
Expect future console ports to get better FPS on PC and have nicer graphics...
The bad story, bad UI, bad controls, bad camera, and the unplayable number of bugs at release (90% of which will never be fixed) will be unaffected, however.
But hey, better graphics.
 

Olikan

Platinum Member
Sep 23, 2011
2,023
275
126
As I've already said on previous occasions, something like AMD getting into next-gen consoles will further hinder the attempted locked-down push of hardware PhysX.
Unless there are NV GPUs in next-gen consoles, or NV opens PhysX up to run on other hardware, it's not going to make much headway in terms of widespread adoption while it remains PC-only.

Hopefully AMD will win both the Xbox and PS next-gen GPUs, NV will be forced to open things up a bit, and maybe developers will start to use more open standards, such as for 3D implementations.

Or say "hi" to better, open physics:

http://www.youtube.com/watch?v=143k1fqPukk
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Interestingly, NGOHQ did a rough test port of PhysX to AMD cards... at first, reps from both companies sounded enthusiastic about it, but soon after, both became hostile towards it. Supposedly they are still working on it, btw.

My theory is that, on further thought, nVidia decided they don't want to lose their "monopoly" (wholly imagined as it is), and AMD does not want to let PhysX spread to more games and further compromise them.
 

Arzachel

Senior member
Apr 7, 2011
903
76
91
A lot of people seem to be missing that both DirectX and OpenGL are close to meaningless when talking about consoles. Console hardware can mix and match features, and MS/Sony will just provide their own graphics API for developers to use; they're not bound by any standards. Off the top of my head, the Xbox 360 GPU provided most, if not all, DX9 features, some DX10 features, and tessellation.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
A lot of people seem to be missing that both DirectX and OpenGL are close to meaningless when talking about consoles. Console hardware can mix and match features, and MS/Sony will just provide their own graphics API for developers to use; they're not bound by any standards. Off the top of my head, the Xbox 360 GPU provided most, if not all, DX9 features, some DX10 features, and tessellation.

1. Nobody is missing it.
2. They CAN make their own API, but the argument is that this time around they will go with more generic off-the-shelf stuff instead of all this very expensive custom hardware and APIs (which cost them a lot of money to develop and end up inferior to what's readily available on the market).
 

Arzachel

Senior member
Apr 7, 2011
903
76
91
1. Nobody is missing it.
2. They CAN make their own API, but the argument is that this time around they will go with more generic off-the-shelf stuff instead of all this very expensive custom hardware and APIs (which cost them a lot of money to develop and end up inferior to what's readily available on the market).

There are several people claiming that MS/Sony should use hardware that supports DX11+. Maybe they mean that as the feature set, not the API.

And the second point is kinda hurting my head. They won't go with off-the-shelf parts; that would be insane. Cost is the most important metric for consoles, and you can bet that MS/Sony will spend as much money on R&D as necessary to get more out of cheaper hardware, because that pays off immensely in the long run.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
And the second point is kinda hurting my head. They won't go with off-the-shelf parts; that would be insane. Cost is the most important metric for consoles, and you can bet that MS/Sony will spend as much money on R&D as necessary to get more out of cheaper hardware, because that pays off immensely in the long run.

Off the shelf was the wrong phrase... I meant commodity and couldn't think of the word. And you are wrong about it being cheaper to spend "however much R&D it takes". That has been proven wrong several times, and rumor has it they will do things differently this time.

http://www.xbitlabs.com/news/multimedia/display/20061117130000.html

Notice how cheap the commodity stuff like the HDD and RAM is on that list. They can get an existing CPU & GPU design from AMD.
The PS3 cost them $129 per nVidia GPU + $89 per Cell CPU = $218.
I can guarantee AMD can cut them a better deal than that for an APU.
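Just to put rough numbers on how that plays out over a generation (the APU price and unit count below are made-up assumptions, purely for illustration, not anything AMD has quoted):

```python
# Back-of-the-envelope: PS3-style discrete CPU + GPU (per the xbitlabs
# teardown linked above) vs. a single hypothetical APU.
ps3_gpu_cost = 129.0   # nVidia RSX, per-unit (xbitlabs estimate)
ps3_cpu_cost = 89.0    # Cell, per-unit (xbitlabs estimate)
discrete_total = ps3_gpu_cost + ps3_cpu_cost  # $218

apu_cost = 100.0       # hypothetical per-unit APU deal (pure assumption)
units = 80_000_000     # ballpark lifetime sales of a successful console

savings = (discrete_total - apu_cost) * units
print(f"${discrete_total:.0f}/unit discrete vs. ${apu_cost:.0f}/unit APU")
print(f"Savings over {units:,} units: ${savings / 1e9:.1f} billion")
```

Even if the real APU quote is nowhere near that low, a few tens of dollars per unit adds up to billions over a console's lifetime.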
 

Arzachel

Senior member
Apr 7, 2011
903
76
91
Off the shelf was the wrong phrase... I meant commodity and couldn't think of the word. And you are wrong about it being cheaper to spend "however much R&D it takes". That has been proven wrong several times, and rumor has it they will do things differently this time.

http://www.xbitlabs.com/news/multimedia/display/20061117130000.html

Notice how cheap the commodity stuff like the HDD and RAM is on that list. They can get an existing CPU & GPU design from AMD.
The PS3 cost them $129 per nVidia GPU + $89 per Cell CPU = $218.
I can guarantee AMD can cut them a better deal than that for an APU.

HDDs and peripherals I agree with; you can't really get creative with them. But there are a ton of little details that differ just enough to recoup R&D costs. Dedicated eDRAM on the Xbox 360 GPU meant that MS could cheap out on total system RAM and memory controllers. There was even a quote from an engineer in the thread that said MS had skimped on putting pads in the disc drives to save a few cents on every console.
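To put numbers on the memory-controller point (these are the commonly cited Xbox 360 figures, so treat the exact values as approximate):

```python
# Why eDRAM let MS skimp on main-memory bandwidth: the ROP/AA traffic,
# the most bandwidth-hungry part of rendering, stays on the daughter die.
bus_width_bits = 128      # Xbox 360 GDDR3 bus
effective_mts = 1400      # 700 MHz GDDR3, double data rate -> 1400 MT/s
system_bw_gbs = bus_width_bits / 8 * effective_mts * 1e6 / 1e9
print(f"System GDDR3 bandwidth: {system_bw_gbs:.1f} GB/s")  # ~22.4 GB/s
print("eDRAM internal bandwidth: 256 GB/s (commonly cited figure)")
```

A narrow 128-bit bus is a lot cheaper to route and to buy memory for than whatever it would have taken to feed the ROPs from main RAM.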
 

lamedude

Golden Member
Jan 14, 2011
1,222
45
91
A lot of people seem to be missing that both DirectX and OpenGL are close to meaningless when talking about consoles. Console hardware can mix and match features, and MS/Sony will just provide their own graphics API for developers to use; they're not bound by any standards. Off the top of my head, the Xbox 360 GPU provided most, if not all, DX9 features, some DX10 features, and tessellation.
MS will likely keep the Xbox's DX and the PC's DX similar to ensure easy porting. Middleware will probably keep using the PS4's version of libgcm, but hopefully GL will be fast enough this time for those who don't want Unreal Engines.
When the Xbox 360 launched, this type of GPU power was top of the line for AMD's GPUs, actually ahead by half a generation.
2005 was the transition period from "fillrate is king" to moar shaders. Xenos was "future proof", but at the time it was about equal to R5x0/G7x, depending on the game's bottleneck. On the flip side, the transition from forward to deferred renderers negated the eDRAM's free AA, and most games that stuck with forward rendering didn't bother with tiling, so you had to use a sub-HD resolution to get that free AA.
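To show why (assuming a 32-bit color target plus a 32-bit depth/stencil buffer, i.e. 8 bytes per sample; real games varied their formats and buffer counts):

```python
import math

EDRAM_MB = 10  # Xenos daughter-die eDRAM

def framebuffer_mb(width, height, msaa, bytes_per_sample=8):
    # 32-bit color + 32-bit depth/stencil per sample (simplifying assumption)
    return width * height * msaa * bytes_per_sample / 2**20

for w, h, aa in [(1280, 720, 1), (1280, 720, 2), (1280, 720, 4), (1024, 600, 2)]:
    size = framebuffer_mb(w, h, aa)
    verdict = "fits" if size <= EDRAM_MB else f"needs {math.ceil(size / EDRAM_MB)} tiles"
    print(f"{w}x{h} @ {aa}xAA: {size:5.1f} MB -> {verdict}")
```

720p with no AA just squeezes into the 10 MB (about 7 MB), 720p with any MSAA needs tiling, and a 1024x600-class buffer gets 2xAA "free" without tiling, which is exactly the sub-HD trade-off above.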
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
There was even a quote from an engineer in the thread that said MS had skimped on putting pads in the disc drives to save a few cents on every console.

That meant that every time the console was moved, the disc crashed into the laser diode, destroying discs and drives, which cost MS a ton of money.
They tried to recoup it by charging people $20 per replacement disc.

Their internal memos show their engineers WARNED, before the Xbox was sold to the public, that "saving" those 50 cents would cause exactly that, and yet management overruled them because they knew better than the engineers. :rolleyes:

Dedicated eDRAM on the Xbox 360 GPU meant that MS could cheap out on total system RAM and memory controllers.
How do you explain that not every nVidia, AMD, and Intel GPU comes with such an amazing innovation?
 

nismotigerwvu

Golden Member
May 13, 2004
1,568
33
91
Not surprising. Sony and MS lost a sackload selling at a loss, while Nintendo used old tech and made a profit on each Wii sold from day one. Just look at the history of console sales: the most graphically powerful system hasn't sold the most units in over 20 years, since the SNES. The N64 was the most powerful, but the PS1 sold the most. The original Xbox was the most powerful, but the PS2 sold the most. The PS3 was the most powerful, but the Wii sold the most.

I don't mean to be "that guy", and I really just want to strengthen the argument you've put here, but I think a better example may be the NeoGeo (which outclassed literally everyone in that generation). It is just really difficult to launch a cutting-edge piece of hardware and keep prices in line to maintain sales. This was managed on the SNES by using a thoroughly underpowered CPU, compensated for by the on-cart DSPs, which shifted the cost to the titles and away from the console itself. Another great example is the Dreamcast, which was leaps and bounds ahead of the PS1 and N64 at launch and actually seemed capable of trading blows with, or at the very least remaining in the same ballpark as, the PS2, and it still fell way short of expectations. Success is all about a balance between capable hardware, an attractive price point, and a quality software ecosystem. The success of the Wii alongside the (also successful) PS3 and XB360 shows there are multiple combinations of these values that can work.
 

nismotigerwvu

Golden Member
May 13, 2004
1,568
33
91
There are several people claiming that MS/Sony should use hardware that supports DX11+. Maybe they mean that as the feature set, not the API.

And the second point is kinda hurting my head. They won't go with off-the-shelf parts; that would be insane. Cost is the most important metric for consoles, and you can bet that MS/Sony will spend as much money on R&D as necessary to get more out of cheaper hardware, because that pays off immensely in the long run.

Also, by the midpoint of most consoles' lifetimes, the marquee developers are trying to get closer to the metal rather than taking the performance hit of more abstraction.
 

Darklife

Member
Mar 11, 2008
196
0
0
Well, it looks like the drought of graphically impressive AAA games will be briefly interrupted by a sprinkling of console DX11 games before coming back in full force for yet another 5 years.
 

Tuna-Fish

Golden Member
Mar 4, 2011
1,653
2,494
136
A more realistic (but still unrealistic) number would be 3.2 * 3 (cores) * 2 (32-bit FP multiplies per vector unit) = 19.2 gigaflops.
I believe Xenon has two 128-bit vector units per core, so ~80G 32-bit FP muls per second. Also, it has FMA, so you could make a credible case for ~160 GFLOPS (that's how everyone else reports them).

There's of course no real workload that could ever achieve half of that, but the theoretical performance is certainly big. :)
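Spelling the arithmetic out (the two-units-per-core figure is my recollection above, so take it with a grain of salt):

```python
clock_ghz = 3.2      # Xenon clock speed
cores = 3
vmx_per_core = 2     # 128-bit vector units per core (my recollection)
lanes = 128 // 32    # 32-bit lanes per 128-bit vector unit

fp_muls_g = clock_ghz * cores * vmx_per_core * lanes  # G muls/second
print(f"~{fp_muls_g:.1f}G 32-bit FP muls/s")                 # ~76.8, i.e. ~80G
print(f"~{fp_muls_g * 2:.1f} GFLOPS counting FMA as 2 ops")  # ~153.6, i.e. ~160
```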
 

Olikan

Platinum Member
Sep 23, 2011
2,023
275
126
How do you explain that not every nVidia, AMD, and Intel GPU comes with such an amazing innovation?

Money ;)

eDRAM is made of silicon, so it costs less over time due to improving yields.

AMD and nVidia use GDDR5 because GPUs have shorter life cycles.
 

Tuna-Fish

Golden Member
Mar 4, 2011
1,653
2,494
136
How do you explain that not every nVidia, AMD, and Intel GPU comes with such an amazing innovation?

What Olikan said, but also the tiny framebuffer requires software to be hand-coded for it. The 360 cannot run games that are not aware of the memory configuration at all -- and since desktop GPUs always need to be able to run last-gen games, you cannot build such a thing on the desktop until you have enough space to fit the most common framebuffers there.

Intel is actually rumored to be introducing something similar (at a more reasonable size) in their Haswell CPUs.
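For scale, reusing the same 8-bytes-per-sample simplification from the tiling math earlier (purely illustrative), here's what it takes to hold common desktop render targets outright:

```python
def framebuffer_mb(width, height, msaa, bytes_per_sample=8):
    # 32-bit color + 32-bit depth/stencil per sample (same assumption as before)
    return width * height * msaa * bytes_per_sample / 2**20

for w, h, aa in [(1920, 1080, 1), (1920, 1080, 4), (2560, 1600, 4)]:
    print(f"{w}x{h} @ {aa}xAA: {framebuffer_mb(w, h, aa):.0f} MB")  # 16 / 63 / 125
```

So a desktop part would need an order of magnitude more on-die memory than Xenos's 10 MB before it could hold typical framebuffers transparently.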
 

poohbear

Platinum Member
Mar 11, 2003
2,284
5
81
Both are false.
Both were released with horribly weak CPUs compared to PCs, outdated GPUs (at a time when a massive GPU improvement occurred with the DX10 parts), and ridiculously little RAM.

Huh? No man, when they were released their specs rivaled high-end gaming rigs. They were beasts at the time of release!
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Money ;)

eDRAM is made of silicon, so it costs less over time due to improving yields.

AMD and nVidia use GDDR5 because GPUs have shorter life cycles.

I thought you said it was better, not worse but cheaper.

Huh? No man, when they were released their specs rivaled high-end gaming rigs. They were beasts at the time of release!

If by beast you mean "used a special cut-down version of PC hardware that was less than 50% the speed of the then-current hardware"...
I even showed benchmark documentation for the PS3 hardware on that one.
 

CNelsonPSU

Member
Jul 10, 2005
28
0
0
I hope an important factor in these console design discussions is including some sort of anti-aliasing, hopefully as a standard for all games. Outstanding-looking games like Uncharted 3 are sometimes painful to look at with the amount of aliasing going on in many scenes.
 