[ArsTechnica] Next-gen consoles and impact on VGA market


Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
Both are false.
Both were released with horribly weak CPUs compared to PCs, outdated GPUs (at a time when a massive GPU improvement occurred with the DX10 parts), and ridiculously little RAM.

A triple core, hyper threaded 3.2 GHz CPU was "horribly weak" at the time? Yes, I know those advantages are counterbalanced for the 360's CPU by it being an in order processor, but it still hardly seems weak. The 360 had last gen top of the line graphics (imagine buying a $400 PC with a Geforce GTX 580 in it. Yeah, not happening, but that's what the 360 did.) Memory wasn't so much of an issue because consoles didn't have huge memory overhead with Windows and could code directly to the hardware. Also, the 360 has a 10 MB eDRAM die to help make up for it.

The PS3 is harder to compare because it uses a much different architecture, but the point is that the 360 and PS3 were comparable to all but the fastest PCs when they were released, and the faster ones probably cost 2 to 3 times more than the consoles.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Wow this thread reminds me of the ole skool 360 vs ps3 wars back in the day. It looks like the boys never get too old to argue about the same ole things over and over.

Well, did you not notice the title of the thread? We are just discussing the possible impact on PC gaming and how that may affect us. No one is really arguing against consoles or 720 vs. PS4. We are just discussing possible specifications for either console and how those will stack up to today's GPUs vs. how the 360/PS3 compared to PC graphics at the time. This sort of scenario allows us to see how much PC gaming will be held back by gimped console graphics. There is little dispute that it becomes very costly to develop a next generation game engine/game that will only run on the PC. Thus, it's pretty important that the PS4 and Xbox 720 have somewhat decent graphics if they are to survive another 7-8 years. This may not matter to console users, but many developers have stated that they want next generation consoles to be a lot more powerful.

Epic even did the whole Knight Unreal Engine 4.0 demo to showcase their next gen engine and let MS and Sony know what kind of graphics they would want to see in the next consoles. It's not certain MS and Sony will follow up on that advice, but to assume that console graphics don't matter is odd. If you look at PC gaming, outside of Metro 2033, BF3, Crysis 1 and Witcher 2, hardly anything stands out. The far too extended current console run has resulted in vastly underpowered console hardware that has held back PC gaming. If you are not a PC gamer then it may not matter to you.
 

Tuna-Fish

Golden Member
Mar 4, 2011
1,691
2,595
136
A triple core, hyper threaded 3.2 GHz CPU was "horribly weak" at the time? Yes, I know those advantages are counterbalanced for the 360's CPU by it being an in order processor, but it still hardly seems weak.
In-order with a crappy memory pipeline really is that bad. By Microsoft's numbers, you can expect 0.2 IPC on each thread when reasonably optimized. For integer, you can get considerably more real throughput out of a single-core P4, with a much better programming model. For floats, it had no real desktop equivalent until Core 2 came out.

IMHO, the CPU was the major failing of the console. IBM wanted to sell MS a proper OoO core; MS decided they just couldn't get the throughput they felt they needed to rival what they thought Sony would have, and so went with a trio of much simpler, crappier cores. Of course, Cell was a huge bust, and the "throughput monster" that MS got is nearly as pathetic.

There was a bout of in-order insanity going on in the industry at the time -- it seems we contract that every 10 years or so.

The 360 had last gen top of the line graphics (imagine buying a $400 PC with a Geforce GTX 580 in it. Yeah, not happening, but that's what the 360 did.)
Actually, thanks to unified shaders and the ROP bandwidth provided by the eDRAM, it had a GPU that was just plain better than the high end. You could not buy a PC GPU that was as good until 6 months later.

Memory wasn't so much of an issue because consoles didn't have huge memory overhead with Windows and could code directly to the hardware.
The memory overhead of Windows is vastly overstated. Roughly at launch day, top of the line PC titles could expect 1GB of main RAM and 256MB of GPU RAM from the medium-high end. You could use at least 800MB of that main RAM without much problem. I know that the game devs I talked with felt heavily constrained by the memory capacity of the consoles in the first year, and it just got a lot worse from there.

Also, the 360 has a 10 MB eDRAM die to help make up for it.
The eDRAM does not really help memory capacity, but oh does it help memory bandwidth.
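To put rough numbers on that, here's a quick sketch using the commonly quoted 360 figures. These are from memory, so treat them as approximate rather than authoritative:

```c
/* Rough Xbox 360 bandwidth picture, using the commonly quoted figures.
 * These are approximations from memory, not official numbers. */
#include <stdio.h>

int main(void)
{
    /* Main memory: 128-bit GDDR3 at 700 MHz (1.4 Gbps effective per pin). */
    double main_mem = 1.4 * 128.0 / 8.0;   /* ~22.4 GB/s */

    double edram_link     = 32.0;    /* GPU <-> eDRAM daughter die link, as quoted */
    double edram_internal = 256.0;   /* ROPs <-> eDRAM inside the daughter die     */

    printf("Main memory:         ~%.1f GB/s\n", main_mem);
    printf("GPU<->eDRAM link:    ~%.1f GB/s\n", edram_link);
    printf("eDRAM internal ROPs: ~%.1f GB/s\n", edram_internal);
    return 0;
}
```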

The PS3 is harder to compare because it uses a much different architecture, but the point is that the 360 and PS3 were comparable to all but the fastest PCs when they were released, and the faster ones probably cost 2 to 3 times more than the consoles.
I'd say the 360 was very competitive when released, no question. However, the PS3 doesn't get off that easily. The GPU wasn't just underpowered, it was outdated on release; the CPU was a monstrosity, in a bad way; and the memory shortage was made a lot worse by the split pools and the huge OS reservation Sony made.
 

cplusplus

Member
Apr 28, 2005
91
0
0
At the same time, the HD7850 will probably be a $150 card by the end of 2013. So really, it's feasible to put that in, but probably for cost reasons we won't see it. I honestly think the 7850 would have been perfect. It's seriously much faster than the 6770 and it has 2GB of VRAM. It consumes under 100W in desktop form. That means a mobile version of this chip is probably doable in under 50-60W.

Cost is the other big factor people don't seem to be mentioning in this argument. Most breakdowns had the PS3 being about $700-800 worth of tech sold for $499-$599 at launch, and the 360 was about $500-600 worth. TV display tech has basically plateaued for the next 3-5 years as far as they need to be concerned (maybe 120Hz and/or 3D become more popular, but 4K isn't really going to be in people's houses for quite some time), and there's a lot of people who think we're already at the "good enough" point as far as graphics. So both Sony and Microsoft are probably looking to pull a Nintendo and start actually making a profit with each console sold at launch, and going cheaper with the graphics cards looks like one of the easiest ways to shave a few dollars off the cost. And I think both know that they can't come out over $400 either, especially with the economy the way that it is.

2GB is skimping it... the 4GB 32-bit barrier delayed adoption for years, but we are finally breaking it and there are some games that require more. And RAM is dirt cheap and gives massive advantages, especially in loading times.

They should be using 4GB at absolute minimum. 8GB is preferred. 16GB is unrealistically expensive for a console (although it would of course find use).
And that's for system RAM only; the GPU should have its own dedicated 2GB of GDDR5.

Of course, in the past they used shared RAM between system and GPU, so it would not surprise me if they do so again.

There's no way that they need 8GB of RAM considering that there's almost no memory overhead for a console. A PC needs that much because of Windows and everything else running on it. All a console would be running is the game, the underlying OS that's designed to be as small as possible, a music player for custom soundtracks, and if they really wanted to humor people, a web browser so that you can look things up on the internet without quitting the game. So even the most strenuous games would end up needing about 3GB, 4 would probably be future-proofed enough for the life of the console, and 6 would basically be overkill, even with RAM being as cheap as it is now. And going from mechanical hard drives to SSDs (and to faster optical drives, whether Blu-ray or something else) would lower the loading times enough that they don't need to throw in the added RAM for that.
 

Pottuvoi

Senior member
Apr 16, 2012
416
2
81
The eDRAM does not really help memory capacity, but oh does it help memory bandwidth.
It does help a little with capacity as well, since you do not store unresolved MSAA buffers in main memory.
With a 720p 4xMSAA buffer the difference is ~11MB compared to 'traditional' GPUs.
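For anyone who wants to check that figure, here's the back-of-the-envelope version. It assumes an RGBA8 colour buffer and ignores depth/stencil, so it's a rough sketch rather than an exact number:

```c
/* Back-of-the-envelope check on the ~11MB figure: main-memory footprint of an
 * unresolved 720p 4xMSAA colour buffer vs the resolved single-sample buffer a
 * 'traditional' GPU would keep. Assumes RGBA8 (4 bytes/sample); depth/stencil
 * is ignored here. */
#include <stdio.h>

int main(void)
{
    const double MB = 1024.0 * 1024.0;
    const long   pixels = 1280L * 720L;   /* 720p         */
    const int    samples = 4;             /* 4x MSAA      */
    const int    bytes_per_sample = 4;    /* RGBA8 colour */

    double unresolved = (double)pixels * samples * bytes_per_sample / MB;  /* ~14.1 MB */
    double resolved   = (double)pixels * bytes_per_sample / MB;            /* ~3.5 MB  */

    printf("Unresolved 4xMSAA colour: %.1f MB\n", unresolved);
    printf("Resolved colour:          %.1f MB\n", resolved);
    printf("Saved in main memory:     %.1f MB\n", unresolved - resolved);  /* ~10.5 MB */
    return 0;
}
```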
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I think they might have even more eDRAM for 720's GPU to better handle MSAA. That could be a cost savings over putting 2GB of VRAM on the GPU. Although long-term, the developers always complain about RAM/VRAM as the biggest bottlenecks.

cplusplus, agree on your points. Sony is bleeding financially, posting the worst loss in its 50-year history. Can't see them pulling off a PS3-style powerhouse with the PS4.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
A triple core, hyper threaded 3.2 GHz CPU was "horribly weak" at the time? Yes, I know those advantages are counterbalanced for the 360's CPU by it being an in order processor, but it still hardly seems weak.
It was not only in-order. It also had too little cache, and the 3.2GHz was achieved by common instructions having P4-like latencies. The main memory latency only compounded the starved cache. It would be almost like a triple-core Atom, except such an Atom would have a ton more cache, and Intel's historically strong memory performance.

They should use DX11.1-compliant Nvidia hardware, or they should just do 2 upgraded multi-core Cells, each with their own memory. Those Cells would need to have double FP precision and some texture units that could do full trilinear mipmapping plus HQ AF as good as Nvidia's hardware, as well as boundless texture support... because that would be better than DX11.0 AMD GPUs.
Not really. They would also need to be arbitrarily addressed (actually, they needed that anyway, even though it would have made things "too complicated"), support hardware multitasking (including on-chip shared memory support without a high-latency ring), and have way more memory bandwidth. It's a nice idea for A/V DSP, and SPEs are still being implemented in new chips, but for 3D? Nah.

Basically every bad idea, except general-purpose VLIW, in the last 20-30 years of microprocessors managed to show up in the PS3, and somewhat in the XB360 :).
 

Pottuvoi

Senior member
Apr 16, 2012
416
2
81
I think they might have even more eDRAM for 720's GPU to better handle MSAA. That could be a cost savings over putting 2GB of VRAM on the GPU. Although long-term, the developers always complain about RAM/VRAM as the biggest bottlenecks.
Considering how different all rendering methods are today, they need to change how the eDRAM is used as well.

In the X360, ROPs could only write into the eDRAM; shaders and texture units couldn't touch it at all.
Hopefully this is addressed in the X720, along with some other things (programmable ROPs?).
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Interesting read, any idea why it was pulled?

Something about Anand being afraid it could be traced back to his sources... Sony and MS don't like their new toys being shown in a "bad" (read: real) light... but it does show that people just quoting empty specs are a waste of time...

I remember the same lame claims around the time the consoles launched... they were debunked when the few multiplatform games out there showed that not only did the PC deliver more FPS at a higher resolution, the PC versions also ran with better IQ
(and you didn't suffer from the controller lag... attempted to be fixed via autoaim).

Even if they only had a 7800 GT (not GTX)... so the pipe dream about consoles as powerhouses is no more than that... a pipe dream.
 

Bobisuruncle54

Senior member
Oct 19, 2011
333
0
0
I remember my humble Core 2 Duo E6300 and 7900 GTO running Crysis at medium settings at 720p - more than either console can do.
 

Fire&Blood

Platinum Member
Jan 13, 2009
2,333
18
81
I remember my humble Core 2 Duo E6300 and 7900 GTO running Crysis at medium settings at 720p - more than either console can do.

Maybe from today's perspective. Don't get me wrong, I'm a PC guy; the last console I owned was the original PS, and on the Xbox and PS3 combined I total maybe 10 hours of gaming.

IIRC, back then, most midrange and high end PCs were running AMD X2 chips; I ran an X2 4400+ myself. It cost ~$300 at launch, the high end model 4800+ was $500, while the FX editions and select Opterons were pushing towards $1000. I paid $100 for 2x1GB sticks, a hefty premium on CAS2 memory back then.
You can build a much better rig for $600 today than you could back then.

I don't care enough to go hunt down links, but there have been a few in-depth analysis articles about consoles in the past. All said pretty much the same thing: the hardware was generally inferior to a (gaming) PC, with some advantages in a few areas, but the main theme throughout these articles was that the power/performance/efficiency ratios were off the charts.
 

jpiniero

Lifer
Oct 1, 2010
17,197
7,570
136
I think you guys are underestimating the importance of the launch price point. Ideally, it would be $199, but that seems unrealistic. But much more than $299 is just asking for trouble. If either thinks they can sell at $399+ at launch, they are just asking for something like Ouya to take over. And as others have mentioned, Sony isn't in a position to sell consoles at a big loss anymore, so it has to be close to cost at worst.

AMD is desperate for business, but not that desperate, given that nobody wants to work with Nvidia. So mobile gpu parts are not an option.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
So mobile gpu parts are not an option.

I am not sure about this part though. AMD has GCN in mobile offerings such as Pitcairn in the form of the HD7950M and 7970M. They obviously don't have to put something that powerful into a console, but an HD7870M is doable for sure. Also, keep in mind these consoles aren't even launching until Q4 2013. That means they have 6-8 more months from today to finalize the design and start manufacturing in May-June for an end of 2013 launch. In that span of time, we'll have the HD8000 series and GTX700 series. The current series will be at fire-sale prices, which would allow MS/Sony to lock in good prices.

Still, I tend to side with the idea that because of the economy, they'll try to keep the consoles below $450, probably $400. Rumours have it that the Wii U will be $299. I can't see MS launching for $299 with Kinect 2.0; Kinect is a pretty expensive device. I just think MS will put $ into these features as opposed to going all out on graphics. Still, if you look at Uncharted, God of War and Gears of War, the graphics are very impressive for how weak the consoles are. Back when the PS3 came out, I had a $300 8800GTS 320MB and it ran Crysis like a dog, something like 23-24 fps at 1280x1024 High. It was bad.....
 

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
It was not only in-order. It also had too little cache, and the 3.2GHz was achieved by common instructions having P4-like latencies. The main memory latency only compounded the starved cache. It would be almost like a triple-core Atom, except such an Atom would have a ton more cache, and Intel's historically strong memory performance.

Don't forget the 128-bit VMX units on each core in the Xenon. Though I'm sure the cache really holds it back.

That said, I really hope to see Cape Verde in either the PS4 or Nextbox. It's too good a GPU in terms of performance, power needs, TDP, and feature set. It's not bleeding edge, but it's a great increase in performance compared to the previous generation without costing very much.
 

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
Yes, because it was an inferior PowerPC architecture which is slower than a dual core with lower GHz using an x86 architecture of the time

Sure, but a decent dual core x86 processor also generally sold for the same price as the 360 on launch.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
Sure, but a decent dual core x86 processor also generally sold for the same price as the 360 on launch.
Ding! Doesn't matter how many million you order, it still costs something to make them.
 

Tuna-Fish

Golden Member
Mar 4, 2011
1,691
2,595
136
Yes, because it was an inferior PowerPC architecture which is slower than a dual core with lower GHz using an x86 architecture of the time

I'd like to point out that the difference isn't ppc vs x86. There were a couple of really fast ppc cores out there back then, which would have been more than a match for x86. It's just that the one MS ended up getting was a cheap piece of shit that's considerably worse than Atom clock-to-clock.


Sure, but a decent dual core x86 processor also generally sold for the same price as the 360 on launch.

You don't even need a dual core. A single 3.2GHz P4 core, running two threads, was faster than all 6 threads on a Xenon at launch, in most of the common tasks that games do. (The Xenon would still win in raw FP throughput, so long as you didn't actually want to use anything "complex" like data structures.) The P4 has less die area, and is thus cheaper to make. (Getting Intel to license it would likely not have been cheap, but as I said, there were better ppc cores out there back then too.)

The Xenon is absurdly weak. Wanna save a parameter on the stack, call, and then load it? Boom, that's 60 cycles.
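For anyone wondering what that pattern looks like in code, here's a minimal sketch. It's illustrative only; the exact penalty depends on the compiler and the surrounding code:

```c
/* Minimal sketch of the "store a parameter, call, then load it" pattern
 * (a.k.a. load-hit-store). On an in-order core like Xenon the reload can
 * stall for dozens of cycles while the store drains through the store
 * queue; an out-of-order x86 core typically forwards the value instead.
 * Illustrative only -- a compiler may inline and optimize this away. */
#include <stdio.h>

static int helper(const int *p)
{
    return *p + 1;          /* immediately reloads the value the caller just stored */
}

int main(void)
{
    int x = 41;
    int r = helper(&x);     /* taking the address forces x out to the stack,
                               then helper() loads it straight back */
    printf("%d\n", r);
    return 0;
}
```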
 

Bobisuruncle54

Senior member
Oct 19, 2011
333
0
0
Maybe from today's perspective. Don't get me wrong, I'm a PC guy; the last console I owned was the original PS, and on the Xbox and PS3 combined I total maybe 10 hours of gaming.

IIRC, back then, most midrange and high end PCs were running AMD X2 chips; I ran an X2 4400+ myself. It cost ~$300 at launch, the high end model 4800+ was $500, while the FX editions and select Opterons were pushing towards $1000. I paid $100 for 2x1GB sticks, a hefty premium on CAS2 memory back then.
You can build a much better rig for $600 today than you could back then.

I don't care enough to go hunt down links, but there have been a few in-depth analysis articles about consoles in the past. All said pretty much the same thing: the hardware was generally inferior to a (gaming) PC, with some advantages in a few areas, but the main theme throughout these articles was that the power/performance/efficiency ratios were off the charts.

I built my PC for £800 all in, including KB+M and monitor. I was looking at getting an Xbox and the cheapest decent 28" LCD HDTV (720p) I could at the time, and it was more expensive, coming to about £850. In the end I still have my PC and I sold my Xbox a long time ago; there's just nothing decent on it that I can't get on PC.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I built my PC for £800 all in, including KB+M and monitor. I was looking at getting an Xbox and the cheapest decent 28" LCD HDTV (720p) I could at the time, and it was more expensive, coming to about £850. In the end I still have my PC and I sold my Xbox a long time ago; there's just nothing decent on it that I can't get on PC.

It's going to be even harder now to consider a console:

1) Upgrade costs will be way less than £850, esp. if you resell older parts and because modern OCed Intel CPUs last so much longer for games.

2) PC games generally cost less than console games

3) Next generation consoles won't be £200 but probably more, making the initial cost of a console as expensive as a new GPU upgrade.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
I'd like to point out that the difference isn't ppc vs x86.

I specified it was x86 processors OF THE TIME.
I am well aware that there are large differences between processors of compatible architecture.

Sure, but a decent dual core x86 processor also generally sold for the same price as the 360 on launch.
1. PC has so many more uses.
2. You likely had a PC already and just needed GPU upgrade
3. Games are mostly GPU bound.
4. The cheapest Xbox 360 at launch didn't even have an HDD, a digital output cable (overpriced proprietary adapter), or the necessary number of controllers, and was $300. It was subsidized, but then they charged too much for the peripherals. Your assertion that a minimum of $300 was needed for a good PC CPU at the time is simply false.
 

Bobisuruncle54

Senior member
Oct 19, 2011
333
0
0
It's going to be even harder now to consider a console:

1) Upgrade costs will be way less than £850, esp. if you resell older parts and because modern OCed Intel CPUs last so much longer for games.

2) PC games generally cost less than console games

3) Next generation consoles won't be £200 but probably more, making the initial cost of a console as expensive as a new GPU upgrade.

Agreed. I also reckon that those of us with powerful DX11 cards already will be in for a nice "free" boost in visual fidelity when next generation consoles come out because they will use DX11 features properly from the ground up (Star Wars 1313 and Watch Dogs spring to mind as obvious examples).

On another note, I've noticed that a lot of "gamers" on this site and numerous others have been stating that a new generation isn't needed "because it's about the gameplay not graphics" which is true, yet this generation has been going for 7 years and I don't see the supposed influx of games with great gameplay. If anything the market has become stagnant. It really annoys me that we're having one of our hobbies ruined by an attitude of complacency and the misunderstanding that it's the silicon causing mediocre games, when in actual fact it's big business that's responsible.

I just hope that the next gen consoles and easier to use engines (UE4) with licensing costs more suited to startups (CE3) enable some fresh blood and creativity to enter the AAA space.
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,300
68
91
www.frostyhacks.blogspot.com
The cheapest Xbox 360 at launch didn't even have an HDD, a digital output cable (overpriced proprietary adapter), or the necessary number of controllers, and was $300. It was subsidized, but then they charged too much for the peripherals. Your assertion that a minimum of $300 was needed for a good PC CPU at the time is simply false.

Exactly, the price argument isn't that straightforward.

I did a simple analysis on this years ago in my blog - http://pcgamingstandards.com/Blog.aspx?blogid=4

The business model they operate by is a clever one, because so many people are quick to rush into investments that at first glance look good. The hardware was sold at a loss initially to get people to buy into a closed platform, and then they charge developers royalties for every game made; the developers pass that cost back on to gamers by raising the price of games, which is why PC games are cheaper at launch.

My finding at the time was that the very same games on PC/console were usually £25/£40; across 4 AAA titles the difference was £55, an average saving of £13.75 per game.

Obviously any smart person considers not only the initial investment but also any ongoing costs. For example, if you're looking to buy a car you want to look at both the initial cost of the car and the fuel consumption, because a newer/better engine might consume less and so cost less to run; over time a more expensive car could save you money, depending on factors such as how long you will own the car and the average mileage you do in it.

So consoles have a lower initial investment but a higher cost over time, and at some point you can demonstrate that the total cost passes that of a PC; it just depends on the number of games you buy and how long you own the system.

The argument that buying a console costs X and buying a PC costs Y is, sorry, just dumb; it's not the whole story. You can demonstrate that for gamers who buy a large number of games, PC gaming could actually be cheaper in the long run.
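To put a toy number on that crossover: the per-game saving below is the £13.75 figure from my comparison above, while the upfront price gap is just an assumed example value, not a real quote.

```c
/* Toy break-even calculation for the total-cost-of-ownership argument.
 * saving_per_game comes from the £13.75 average in the post above;
 * pc_premium is an assumed example value for how much more the PC
 * costs upfront, not a real figure. */
#include <math.h>
#include <stdio.h>

int main(void)
{
    double pc_premium      = 150.0;   /* assumed extra upfront cost of the PC (GBP) */
    double saving_per_game = 13.75;   /* average PC saving per AAA title (from post) */

    int breakeven_games = (int)ceil(pc_premium / saving_per_game);
    printf("The PC pays off its premium after about %d games\n", breakeven_games);
    return 0;
}
```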
 