[ArsTechnica] Next-gen consoles and impact on VGA market


railven

Diamond Member
Mar 25, 2010
6,604
561
126
This part is not true at all. I remember it exactly, as I followed it very closely.

Xbox 360 had specialized eDRAM that allowed a more efficient use of AA with a smaller performance hit. The GPU was somewhere between an R520 and an R600 derivative, since it had unified shaders. Sure, it was gimped due to lower memory bandwidth, but its theoretical performance was very good. I would say easily on par with the X1800XT/X1950Pro at the time. When the Xbox 360 launched, this type of GPU power was top of the line for AMD's GPUs, actually ahead by half a generation.

PS3 came 1 year later and wasn't as impressive. The GPU in the PS3 was pretty much a derivative of the GeForce Go 7950GTX with half of the memory bandwidth. Alternatively, that would be like a 7950GT with half of the memory bandwidth of the desktop part. At the time, the only faster NV GPU generation on the desktop was the GeForce 8800 series, which launched days before the PS3. Sure, you could say the 7950GT was outdated by then, but it was impossible to include GeForce 8 in the PS3, especially since the 8800GTX cost $599 by itself, with the 8800GTS 640 model going for $449. That's not even considering how much power GeForce 8 used. Therefore, the GPU in the PS3 was the fastest possible NV GPU at the time given the form factor (a top-of-the-line mobile NV part).

Spec-wise, the GPU in the PS3 was the previous generation's upper-high-end desktop part (i.e., the equivalent of a GTX 570 or HD 6970 today, with half their memory bandwidth).

By holiday 2013, if the PS4 and Xbox Durango were to use "equivalent" power GPUs, then Durango would use at least some HD 8000-style GPU, while the PS4 would have the previous-generation high-end NV equivalent (so ~GTX 670 Ti level of performance with half the memory bandwidth of the desktop version).

Obviously, we are not going to see anything like that this round.

This doesn't bode well for PC gaming unless developers decouple game development from the consoles and start building games from the ground up to take advantage of the PC's hardware. If we thought the last 5 years of console ports were stagnating PC gaming, then this 8th generation of consoles is about to give "stagnation" a whole new meaning.


Spot on. I wanted to chime in with this, but of course I'm late as usual.

They will most definitely be using DX11-capable hardware, that much is a given - even the Wii U, I believe, is using an HD 5-series based chipset (5670?).
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
Don't cross your fingers hoping the PS4 will use current-gen, top-tier desktop PC hardware.

From that article, the PS4 is going to use a derivative of an AMD 7xxx-series GPU and some sort of x86 CPU. I wouldn't be surprised to see them use some sort of AMD Fusion chip with the GPU/CPU in a single package.

All they're shooting for is 1080p without upscaling, plus 3D. A 7850 would be enough to do that; a console gamer would flip their lid at having 7850-level performance without Windows overhead and with coding directly to the metal.
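For a rough sense of scale, here's a back-of-the-envelope sketch of that 1080p target (the HD 7850 numbers are its commonly quoted launch specs, and the 60 fps target is my own assumption, not anything from the rumors):

```python
# Back-of-the-envelope pixel budget for a 1080p, no-upscaling target.
# Assumptions: 60 fps target; HD 7850 at its commonly quoted launch specs
# (860 MHz core clock, 32 ROPs). Real games are shader-bound rather than
# fill-rate-bound, so the true headroom is far smaller than this.

res_720p = 1280 * 720        # the typical 7th-gen render target
res_1080p = 1920 * 1080      # the stated next-gen target

print(f"1080p has {res_1080p / res_720p:.2f}x the pixels of 720p")  # 2.25x

target_fps = 60
pixels_per_second = res_1080p * target_fps     # ~124 Mpixels/s to fill
fill_rate = 860e6 * 32                         # ~27.5 Gpixels/s theoretical
print(f"Raw fill-rate headroom: ~{fill_rate / pixels_per_second:.0f}x")
```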
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
Spot on. I wanted to chime in with this, but of course I'm late as usual.

They will most definitely be using DX11-capable hardware, that much is a given - even the Wii U, I believe, is using an HD 5-series based chipset (5670?).

Yeah. It's going to be worth a chuckle seeing Nintendo having the most powerful console for a while again. As far as gaming goes, it will be the only console I would bother with until Microsoft and Sony have their new ones out.

I'm assuming the Wii U should be getting all the non-exclusive titles the Wii couldn't, now that it has more capable hardware.
 

Golgatha

Lifer
Jul 18, 2003
12,395
1,067
126
Both are false.
Both were released with horribly weak CPUs compared to PCs, outdated GPUs (at a time when a massive GPU improvement occurred with the DX10 parts), and ridiculously little RAM.

I do think they put a lot of money into the CPU relative to the GPU, though, and yes, the amount of RAM they provided was ridiculously small. I think the next-gen consoles will scale back on the CPU, as hardly any games are CPU-limited anymore, and focus their dollars on improving the video portion of the console. Here's hoping they will put enough RAM in there to be competitive with PCs.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
I don't recall the Xbox situation as clearly right now; I am on 2 hours of sleep and don't feel like looking it up. I just remember it was sub-par when released. Let's focus on the PS3 for now (I promise I will do the Xbox later).

1. The GeForce Go 7950GTX with half the memory used in the PS3 could not hold a candle to the 8800GTX. The excuse that it was too late for the PS3 to change is meaningless; the PS3 was itself in development, and it was developed using existing outdated PC hardware rather than its own next-gen hardware.
2. The 8800GTX was $600 on release; the PS3 was $650. The 8800 was cheaper...
Since you already have a computer (no matter who you are, you need one... although nowadays some nutcases use a laptop only), you could just buy the video card for your existing computer.

The PS3 was unique in that it wasn't ONLY for gaming like other consoles; it had the Blu-ray drive, which is added value... but your typical console IS fair to compare to the price of just the video card.
3. The GeForce Go is the laptop variant, not the full-powered variant.
4. It was a cut-down GeForce Go... how much did a full desktop 7900GTX cost at the time? (Cheaper and more powerful than its laptop counterpart.)

GeForce Go 7950 GTX benchmarked: http://www.anandtech.com/show/2226/7
GeForce 8800GTX benchmarked: http://www.anandtech.com/show/2116/25
The first test I saw in both is the Battlefield 2 4x AA test @ 1920x1200:
The GeForce Go 7950 GTX gets ~62 FPS.
The GeForce 8800 GTX gets 127.4 FPS, literally more than twice as much.
The GeForce 7900 GTX gets 68.9 FPS.
The GeForce 7950 name was used only for the GX2, which was a dual-GPU video card.

Bottom line is, you could spend $650 on a Blu-ray + gaming console that had no games or Blu-rays at the time...
Or you could spend $600 on a video card for gaming (works with all existing games) and GPGPU (which had no programs at the time; I call it as it is, I don't play favorites) that literally had more than twice (~2.05x) the performance of the GPU that was cut down into a cheaper, lower-performance part for the PS3.

And the ~62 FPS? That is too high, since the version used in the PS3 is a special cut-down version with half the RAM too. (What is it with MS and Sony hating RAM? RAM is so cheap and so useful! Developers had to riot for MS to give them 512MB instead of 256MB, which was already pathetic for the time since PCs were at 2 or 4GB; Sony's PS3 has 512MB, but it is split into 256MB of system RAM and 256MB of video RAM, with a chunk of it reserved for the OS.)
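To put a number on the "more than twice" claim, here's the arithmetic using the FPS figures quoted above:

```python
# Ratio check on the AnandTech Battlefield 2 4xAA @ 1920x1200 figures
# quoted above (per-card FPS values are from the linked reviews).

fps = {
    "GeForce Go 7950 GTX": 62.0,   # the "~62 FPS" above
    "GeForce 7900 GTX": 68.9,
    "GeForce 8800 GTX": 127.4,
}

ratio = fps["GeForce 8800 GTX"] / fps["GeForce Go 7950 GTX"]
print(f"8800 GTX vs Go 7950 GTX: {ratio:.2f}x")  # ~2.05x, i.e. more than twice

# And the PS3's RSX is cut down further still (halved memory bandwidth and
# VRAM vs the Go 7950 GTX), so the real gap would be even wider.
```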
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
That is a good point.
Sony lost money on the PS3 due to Blu-ray, not due to the Cell. The Cell was costly for a CPU, but it accounted for very little of the cost of the PS3.

Not true at all. The Cell was the 2nd most expensive single item in the PS3, behind the $350 Blu-ray drive. At launch it was estimated that the Cell cost Sony >3x what the G70 GPU cost. Talk about making an unbalanced gaming system.

It was not until the PS3 Slim launched, after several node shrinks, that the Cell became cheap. Even by December 2009, Sony was still losing money on every PS3 sold.

The Xbox 360 always sold at a profit; MS's gaming department lost millions due to how many they had to replace under warranty.

Also, not true at all.

At launch, the Xbox 360's bill of materials was estimated at around $500-525. The ATI GPU in the 360 cost 2x what the G70 in the PS3 did. It took about a year until MS was rumored to start making a tiny profit off Xbox 360 hardware. Because of the high failure rate of the consoles, the division was still operating at a loss, however.
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
I believe both consoles' hardware costs forced them to sell at a loss for a couple of years. That's without the Xbox's additional warranty headaches.
http://arstechnica.com/gaming/news/2006/11/8239.ars

Also, as to the power of the CPUs in them: there were a lot of stories about how, in specialized software, certain tasks were done faster on the PS3's Cell processor, and the government was buying them for projects.
Military purchases 2,200 PS3s


http://www.gamespot.com/news/ps3s-used-to-capture-child-pornographers-6240562
 

Ieat

Senior member
Jan 18, 2012
260
0
76
That is a good point.
Sony lost money on the PS3 due to Blu-ray, not due to the Cell. The Cell was costly for a CPU, but it accounted for very little of the cost of the PS3.

The PS3 would have lost money with or without the Blu-ray drive. The BR drive added maybe $110 over the cost of a DVD drive, although some of their prices are probably overestimated. The iSuppli teardown puts Sony at losing $240 to $305 per unit sold.

http://www.emsnow.com/cnt/files/news/11_17_2006a.gif

The Xbox 360 always sold at a profit; MS's gaming department lost millions due to how many they had to replace under warranty.

Again, the teardown says differently. I highly doubt MS was making money in 2005 at $299 for the Core edition.

http://www.pcpro.co.uk/news/80708/isuppli-reckoning-the-xbox-bill-of-materials.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
The original PS3 design didn't have a traditional GPU; it was buddied up with 2x Cell processors and was expected to cost into the thousands, until Sony brass realized Ken Kutaragi was probably losing his mind (haha).

The decision to scrap a lot of the things in the PS3 was made to reduce cost, and one of those decisions was to replace the second Cell with a cheaper GeForce derivative.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76

The PS3 would have lost money with or without the Blu-ray drive. The BR drive added maybe $110 over the cost of a DVD drive, although some of their prices are probably overestimated. The iSuppli teardown puts Sony at losing $240 to $305 per unit sold.

http://www.emsnow.com/cnt/files/news/11_17_2006a.gif

It's interesting just how staggeringly different the numbers each of those charts presents for each component are.
Is the Blu-ray drive $350, like the link from RussianSensation says, or $125, like the link from Ieat says?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
1. The GeForce Go 7950GTX with half the memory used in the PS3 could not hold a candle to the 8800GTX. 2. The 8800GTX was $600 on release; the PS3 was $650. The 8800 was cheaper...

It's pretty obvious what the flaw in this logic is, without even considering that it was physically impossible to fit an 8800GTX into a PS3 chassis. The GeForce 8800GTX would have meant a >$1,000 price for the PS3 at launch, since the Cell + Blu-ray drive were already very expensive... OR it would have meant Sony losing $500-600 on each console sold for years and years. Sony would have needed every gamer who bought a PS3 to have purchased at least 30-40 games just to break even had they put a $500-600 desktop GPU into the PS3.
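Here's that break-even arithmetic as a quick sketch (the per-game profit figure is an assumption back-derived from the 30-40 games estimate above, not an official Sony number):

```python
# Break-even sketch for a hypothetical PS3 carrying a $500-600 desktop GPU.
# Assumption: ~$15 of platform profit/royalty per game sold, back-derived
# from the "30-40 games" figure above (not an official Sony number).

loss_per_console = 550   # midpoint of the quoted $500-600 per-console loss
profit_per_game = 15     # assumed platform profit per game sold

games_needed = loss_per_console / profit_per_game
print(f"~{games_needed:.0f} games sold per console just to break even")  # ~37
```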

Since you already have a computer (no matter who you are, you need one... although nowadays some nutcases use a laptop only), you could just buy the video card for your existing computer.

There isn't even a point in arguing this, and it doesn't have anything to do with discussing the relative hardware of consoles vs. PCs and how it applies to how the PS4 and Durango would stack up against the PC (or rather, how underpowered they are rumored to be). I'm not here to have a 10-page argument about PC vs. console gaming, or the cost of PC vs. console games, or how, imo, the 8800GTX is worthless for modern PC gaming, or how console gamers like playing in the living room on a large TV on the couch, etc. etc. etc.

Bottom line is, you could spend $650 on a Blu-ray + gaming console that had no games or Blu-rays at the time...
Or you could spend $600 on a video card that literally had more than twice (~2.05x) the performance of the GPU that was cut down into a cheaper, lower-performance part for the PS3.

This has nothing to do with the discussion at hand whatsoever. No one in this thread is arguing whether it was worth buying a PS3 or a PC at the time. We are comparing how console hardware matched the power of PC hardware at the time. There was little chance of building a PC that could last 7-8 years with performance equivalent to the PS3's for $600 at the time. I know; I was there. My Core 2 Duo CPU alone cost $200, and an 8800GTS 320MB card was almost $300. I remember buying an 8800 Ultra for $800 at Future Shop, before taxes too. The $300 8800GTS 320MB card was worthless not even 1.5 years after release, as games pushed 512-768MB of VRAM usage in no time. The alternative was a $400 8800GTS 640MB that was also pretty much worthless just 2 years out.

Still, you asserted that the GPUs in the Xbox 360 and PS3 weren't that powerful, but they really were. Current rumors all point to the next generation of consoles not being as advanced for our time as the PS3 and Xbox 360 were for their time. By the end of 2013, the HD7970 will most likely be a mid-range GPU. If they put a low- to mid-range HD7000-series part in the next PS4, that could have significant consequences for PC gaming 4-5 years out.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
You are unfairly giving a lot of leeway to the console just because it's a console.
And your arguments are nothing but a list of excuses as to WHY the PS3's hardware was obsolete the moment it was released. But that does not mean it wasn't obsolete the moment it was released.

The exception is the claim that the Cell was a powerhouse (that isn't an excuse but a claim that the hardware was superior in quality), which happens to be false.
The Cell is a giant chip unnecessarily combining 1 real CPU core with an 8-core array of Atom-like cores (like the canceled Larrabee).

And even for that, it was obsolete the day it came out, since nVidia came out with CUDA hardware 3 days earlier, making a real CPU plus an nVidia GPU running a CUDA app a far superior choice.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Well them goalposts got moved rather quickly.

Just to summarize what was already said: the 360 was definitely advanced for its time (the first product ever to use unified shaders - a new de facto standard before PCs had it); the PS3 not as much, due to its year of tardiness (had it launched with the 360, yes, it would have been advanced).

Unfortunately, the PS4 and "Durango" will not have the same claim to make, which means we might see even WORSE stagnation as PC gamers.

Humbug.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Just to summarize what was already said: the 360 was definitely advanced for its time

I am pretty sure it wasn't. But I don't have all the data memorized like I do for the PS3, from the many times I have refuted it in the past... which means I would have to do the research again to re-refute the claim that it was advanced for its time. (I know for sure it had pathetic amounts of RAM and its CPU was bad.)

And right now I am a bit too tired of this argument, and too busy, to do so.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
I am pretty sure it wasn't. But I don't have all the data memorized like I do for the PS3, from the many times I have refuted it in the past... which means I would have to do the research again to re-refute the claim that it was advanced for its time. (I know for sure it had pathetic amounts of RAM and its CPU was bad.)

And right now I am a bit too tired of this argument, and too busy, to do so.

When you have your data checked, we can discuss it; since you won't, I guess we can't.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
When you have your data checked, we can discuss it; since you won't, I guess we can't.

Right, so until I reacquire and post ALL the proof that it was ALL pathetic for the time (since we agree that the CPU and RAM were pathetic, but there is argument over whether the GPU was advanced or also pathetic), we must arbitrarily assume that it was advanced for its time rather than pathetic for its time... just because.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
As I've already said on previous occasions, something like AMD getting into next-gen consoles will further hinder the attempted locked-down push of hardware PhysX.
Unless there are NV GPUs in next-gen consoles, or NV opens up PhysX to run on other hardware, it's not going to make much headway in terms of widespread adoption while it remains PC-only.

Hopefully AMD will win both the Xbox and PS next-gen GPUs, NV will be forced to open things up a bit, and maybe developers will start to use more open standards, such as for 3D implementations.
 

Termie

Diamond Member
Aug 17, 2005
7,949
48
91
www.techbuyersguru.com
A few people have mentioned the 360 warranty issue, but I thought we should add that to the discussion from a different perspective:

The very advanced X1900XT-like GPU in the Xbox 360 was far too immature to put into a video game console, leading to excessive heat, power draw, and failures.

MS and Sony (via MS's example) know better than to put cutting-edge graphics hardware in a consumer product this time around. It simply doesn't make business sense. When you're talking shipments in the millions, reliability is much, much more important than pure technological prowess.

If it's true that the next gen will use the equivalent of the 6670, then console graphics power in 2013/2014 will only be about twice what it was in 2005, whereas PC hardware will be approximately 16x faster. That's as clear a statement as any on the priorities in the console business.
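As a rough sketch of where those multiples come from (assuming PC GPU performance doubles roughly every two years, a loose rule of thumb rather than a measured figure):

```python
# Generational-scaling sketch behind the "consoles ~2x vs PCs ~16x" claim.
# Assumption: PC GPU performance doubles roughly every 2 years.

years = 2013 - 2005          # Xbox 360 launch to the rumored next-gen launch
doubling_period = 2          # years per doubling (rule of thumb)

pc_gain = 2 ** (years / doubling_period)   # 2^4 = 16x
console_gain = 2                           # a 6670-class part vs Xenos, per the post

print(f"PCs: ~{pc_gain:.0f}x faster; consoles: ~{console_gain}x, over {years} years")
```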
 

SHAQ

Senior member
Aug 5, 2002
738
0
76
People may as well game on a phone or iPad if those specs are true. I predict Steam will continue its record growth. A $500 PC will beat a next-gen console handily.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Honestly, I stopped caring about graphics improvements with the Unreal 3 engine.
That is also about the time 3D games' image quality finally surpassed 32-bit sprite games. (Although you could do much more in a 3D FPS game... too many 3D RTS games with a fixed camera looked worse than their 2D predecessors, like Kohan 2 vs 1, and Warcraft 3 vs 2.)

What I want in games is not graphics... it is solid stories, polished gameplay, QCed games that work, a sane interface designed for the platform on which you play, and good input choices (a choice between mouse-and-keyboard and controller, with a well-designed input engine).

The deluge of crap games whose only positive attribute is the size of their video engine department is very disappointing.
We like to cry "console port," but I have also seen "PC ports" on occasion, and when I think about it honestly and without bias, games that have a crap interface and controls on the PC tend to also have a crap interface and controls on the consoles. Games which are buggy as all hell on the PC tend to be buggy as all hell on a console.

So really the issue is not console ports vs. PC ports... it's shovelware.
 

Towlie

Junior Member
Mar 28, 2012
12
0
0
UE3 looked fantastic 6 years ago, but it is showing its age today. There is nothing stopping a game from looking great and being fun to play, and graphical fidelity arguably improves the quality of flight simulators (pity the game quality is decreasing; see the new MS FS).
 

BlockheadBrown

Senior member
Dec 17, 2004
307
0
0
I'll just leave these here:

Wii-U
http://www.fudzilla.com/home/item/26617-wii-u-less-powerful-than-ps3-xbox-360
http://www.fudzilla.com/home/item/26616-wii-u-to-launch-on-november-18

Durango (720)
http://www.fudzilla.com/home/item/26613-durango-said-to-have-two-gpus

Orbis (PS4)
http://www.fudzilla.com/home/item/26612-ps4-to-arrive-ahead-of-xbox-next?

Additionally, there was an assertion that games will be developed more towards AMD optimizations than Nvidia's. Well, maybe. To that end, would you (collectively) say that Nvidia addresses driver-related issues more quickly than AMD? This is just scratching the surface of the questions that come out of what's in this article.
 

zephyrprime

Diamond Member
Feb 18, 2001
7,512
2
81
"The Xbox 360's Xenon processor, a three-core six-thread PowerPC unit running at 3.2 GHz, had a theoretical peak number crunching throughput of 115 gigaflops. A contemporary Pentium 4 at 3 GHz had a theoretical peak of around 12 gigaflops when the system launched."
Shame on Ars for its pure BS technical information. 115 GFLOPS? Lol. 3.2*3(cores)*6(threads)*2(issue width) = 115.2, so that's where they get the math. However, it's pure BS math. Supporting SMT does NOTHING to increase your max theoretical computation power; SMT only helps fill in pipeline bubbles and increase utilization. So multiplying by SMT thread count is pure BS. Also, the P4 had HT too, so why not multiply its numbers by 2x? Also, the Pentium D was released before the Xbox 360, so why not use that for the comparison?

A more realistic (but still unrealistic) number would be 3.2*3(cores)*2(32-bit FP multiplies per vector unit) = 19.2 GFLOPS. For the Pentium D, it would be 12.8 GFLOPS. However, in the real world, the Pentium D is actually much faster because it's a more fully featured OoO processor.

So no, the 360 and PS3 did not have particularly powerful processors when they launched. The fact that these expensive custom processors were outclassed by commodity CPUs at launch is no doubt responsible for the less ambitious designs we will probably get with the next generation. All things being equal, a customized design will beat a generic design. However, things aren't equal. The benefit the commodity design has is a very lean development timeline compared to the custom design's lengthy one. Also, the commodity designs are very well optimized compared with custom designs, which have not had the benefit of multiple generations to mature into a more refined product. And commodity designs are cheaper, too.
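For clarity, here's the peak-FLOPS arithmetic as a sketch (the FLOPs-per-cycle counts are the conservative assumptions from the post above, not official spec-sheet figures):

```python
# Peak throughput is clock * cores * FLOPs-per-cycle-per-core. SMT threads
# share the same execution units, so they add nothing to the theoretical peak.

def peak_gflops(clock_ghz, cores, flops_per_cycle):
    return clock_ghz * cores * flops_per_cycle

# Ars' apparent math: also multiplying by SMT threads and issue width.
ars_xenon = 3.2 * 3 * 6 * 2                 # ~115.2 "GFLOPS" (bogus)

# The post's more conservative figures (2x 32-bit FP ops/cycle per core).
xenon = peak_gflops(3.2, 3, 2)              # 19.2 GFLOPS
pentium_d = peak_gflops(3.2, 2, 2)          # 12.8 GFLOPS

print(f"Ars' Xenon: {ars_xenon:.1f}, realistic Xenon: {xenon:.1f}, "
      f"Pentium D: {pentium_d:.1f} GFLOPS")
```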
 