[ArsTechnica] Next-gen consoles and impact on VGA market


RussianSensation

Elite Member
Sep 5, 2003
Personally I would be far more excited about this new type of console if it meant they would release new hardware every 3-5 years instead of every 10 years. If you think about it their R&D costs are drastically cut by using pre-made hardware, and their costs will be far lower as well.

+100. :thumbsup::thumbsup:

In the past, console generations generally lasted 5 years, 6 tops, before a new one was launched. As a result:

1) PC gaming wasn't held back as much;

2) Consoles were much cheaper since they didn't need top-of-the-line hardware to last 7-10 years. The cheaper price made them more popular (PS1, PS2);

3) Developers made much better games since they had a shot at making at most one sequel in a span of 5 years. They focused on making a good game from the get-go because there wasn't much room for a mediocre one. Now we have games with 5-10 hour single-player campaigns and never-ending sequels using graphics from 2008 (Assassin's Creed and Call of Duty 1 millionth edition) made to cater to 7-year-old hardware...

4) Graphical leaps occurred much more rapidly. If someone had told me in 2007 that BF3 would be the best looking game by 2012, I would have laughed. Now, it's a sad reality that we need 4xMSAA in deferred game engines to bring a GPU to its knees, while the textures and detail in games are still stuck at 2008 levels, with only animations and the lighting model being the bright spots.

This upcoming generation may just be the worst one of them all. They are looking to run the PS4/Xbox Durango for another 7-10 years but will probably ship with low-end to mid-range HD7000 hardware (I am guessing HD7770). So it's going to be far worse imo than what we just experienced with the PS3/360. Considering the HD7850 ~ HD6950 + 5% or so, by 2013 this level of performance will be downright laughable for surviving the 7-10 years of innovation to come. This wouldn't be a problem if they replaced the consoles in 2018.

Also, the fact that they are focusing so much more on multi-media aspects such as Twitter, Facebook, online apps, digital content streaming, Kinect integration, etc., means less and less budget can be allocated towards hardware, because all these other expenditures are eating into their cost structure. Putting Kinect into the console is like taking away a $50-60 budget that could have been allocated towards an SSD, a faster CPU or GPU, etc. (or just making the console cheaper).

I think the next generation of consoles is going to be less about the hardcore gamer than ever. Even now the rumors are not really hinting at mind-blowing performance, mind-blowing graphics, next generation physics effects/AI, etc. (you know, the usual marketing they start spewing). Almost all the rumors are negative: no used games (or requiring online passes to unlock full used-game content), constant Internet connection, tying your purchases to an Xbox Live/PSN account to combat piracy, motion controls, etc. If true, they are on their way to making consoles closer and closer to the PC, minus the cheap Steam games or the simplicity that consoles used to offer (plug and play at a cottage and off you go).

Basically they are focusing less and less on the core aspect of consoles - gaming with next generation AI, physics, graphics - and shifting towards making them more like all-in-one multimedia devices. All that's going to mean is more rehashed sequels with console-ported graphics. Also, developers are focusing more and more on the multi-player aspects of gaming, with single-player games falling by the wayside. You can just imagine the future of gaming may be the Free to Play model, with them trying to sell us constant DLC and weapons/upgrades for $1-5 to keep the game running (like Blacklight Retribution).

I don't remember any generation where a GPU could crush any game at 1080P easily. There was always some game that couldn't be maxed out with top of the line hardware at that time: Quake 3, Doom 3, Far Cry 1, Crysis 1, Metro 2033, etc.

The generation after the HD7970 / GTX680 is just going to push performance to bordering-on-absurd levels for anyone not gaming on a 2560x1600 monitor.

When I look at Crysis 1, which launched in Nov 2007, PC graphics have hardly moved beyond that. If Crysis 1 supported DX11 with tessellation and advanced DOF, and had a high-res texture pack, it would probably look better than any 2012 game on the PC today. That's just a sad state of innovation when console hardware serves as the lowest common denominator :|

My other gripe is how in the world they chose the HD7000 series over Kepler without waiting to see how Kepler performs, given the intention of using these consoles for 7-10 years. It's almost like they just went for the cheapest option, which AMD likely provided as they are far more desperate than NV is. With Kepler's 2x tessellation performance, better performance per die area and per watt, better FP16 texture performance, and class-leading 1080P performance (the resolution next-generation console games will target), it would have been the perfect GPU for the next generation of consoles.

[Charts: FP16 texture filtering rate, TessMark x64 tessellation score, and system power draw under load]


Overall it seems like the next generation of consoles is all about maximizing profits, cutting costs, and gaining access to a wider audience, rather than focusing on the core gaming experience.
 

Grooveriding

Diamond Member
Dec 25, 2008
Obviously this will never happen because it's not a focus for the console makers, but a modular console would be awesome.

Put the GPU on an add-in card and allow the console to be upgraded. Let games have scalable settings that improve IQ depending on which piece of hardware you have inside the machine. Incentivize console owners to upgrade by having games' option screens show a choice of graphics levels, i.e., medium, high, ultra, with the higher levels greyed out on consoles that have not been upgraded. Add some marketing into this as well.
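
Purely as an illustration of that option-screen gating, here is a minimal sketch; the tier names, detection, and preset mapping are hypothetical, not anything the console makers have described:

Code:
# Hypothetical sketch: gate graphics presets by detected GPU tier.
from enum import IntEnum

class GpuTier(IntEnum):
    STOCK = 0      # the GPU the console ships with (a "7770"-class part)
    UPGRADE_1 = 1  # an add-in "7850"-class card
    UPGRADE_2 = 2  # an add-in "7950"-class card

# Minimum tier required to unlock each preset on the options screen
PRESET_REQUIREMENTS = {
    "medium": GpuTier.STOCK,
    "high": GpuTier.UPGRADE_1,
    "ultra": GpuTier.UPGRADE_2,
}

def available_presets(detected_tier):
    """Return each preset with a flag for whether it is selectable
    (False = shown greyed out, nudging the player toward an upgrade)."""
    return {name: detected_tier >= required
            for name, required in PRESET_REQUIREMENTS.items()}

# A console that has not been upgraded only gets "medium" selectable:
print(available_presets(GpuTier.STOCK))
# -> {'medium': True, 'high': False, 'ultra': False}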

Ship the console with a 7770 and charge an arm and a leg for an add-in 7850 and a huge premium for an add-in 7950. Something along those lines. The console would have to be designed to account for increased power draw and thermals though.

Will never happen, but a system like that would make consolers bug their parents for the PS4 gpu upgrade pack for Christmas because Timmy down the street got one for his birthday.

It would have a trickle-down effect on PCs, with console games being designed with better fidelity and options for the upgraded consoles.
 

MrK6

Diamond Member
Aug 9, 2004
None of this is surprising. Console makers learned from their mistakes last generation and the market will change accordingly. They want to see a console in every house, even if there are no gamers in it. Also, AMD won the contracts because they delivered the best product first, it's very simple. Nvidia never did much with development so it remains to be seen what AMD will do to drive development now that they have the reins.
 

RussianSensation

Elite Member
Sep 5, 2003
Also, AMD won the contracts because they delivered the best product first, it's very simple.

Doesn't look like that's going to be the case at all.

Today's rumors point to the GPU in the PS4 being an HD7670, which is just a rebadged HD6670. That's not the best product at all by any stretch of the imagination. It's actually awful since the HD6670 is even worse than the HD4770/4870.

This actually supports the view that cost and power consumption were the primary factors in choosing such a horrible GPU. It's hard to imagine they could have done worse than this if this proves to be true. They literally would have been better off just shoving an off-the-assembly-line quad-core Kaveri chip in there in 2013 and calling it a day.

If true, this is a failure of epic proportions in how this would translate to PC graphics stagnation. An HD6670 in the PS4 would be a disaster. A monkey could have chosen a better GPU.
 

MrK6

Diamond Member
Aug 9, 2004
Doesn't look like that's going to be the case at all.

It's exactly what the case is. They delivered what the companies wanted first and therefore got the contract. That's how business works.

Today's rumors point to the GPU in the PS4 being an HD7670, which is just a rebadged HD6670. That's not the best product at all by any stretch of the imagination. It's actually awful since the HD6670 is even worse than the HD4770/4870.

This is incorrect. The 6670 sits right between the 4770 and 4870 in performance: http://www.guru3d.com/article/radeon-hd-6670-review/5 . However, if you do a little more research you will find something interesting about the 6670: http://www.techpowerup.com/reviews/AMD/HD_6670/23.html . It seems to be one of the most, if not the most, power-efficient 40 nm chips on the market. That sheds some light on why Sony might have chosen it for the PS4.

This actually supports the view that cost and power consumption were the primary factors in choosing such a horrible GPU. It's hard to imagine they could have done worse than this if this proves to be true. They literally would have been better off just shoving an off-the-assembly-line quad-core Kaveri chip in there in 2013 and calling it a day.

That very well might be the end goal here. Sony will release a "slim" PS4 that has all of this on one die and make them at a fraction of the cost.

If true, this is a failure of epic proportions in how this would translate to PC graphics stagnation. An HD6670 in the PS4 would be a disaster. A monkey could have chosen a better GPU.

It depends on how you look at it. Since the PS4 will be using PC parts, it might be easier to port code and we therefore will get better PC games since there won't be so much performance lost in the overhead. If companies know they just have to write one engine that scales well with LOD/IQ, it might make everything easier for everyone. The major problem we have now is old consoles with proprietary hardware and software that make porting more difficult than it needs to be. Proprietary anything is the biggest hindrance to gaming evolution.
 

taltamir

Lifer
Mar 21, 2004
It's exactly what the case is. They delivered what the companies wanted first and therefore got the contract. That's how business works.

Except, price is also a factor.
And he explicitly said that what isn't the case is the claim that AMD's GCN cards are what won the contract.
His argument is that they aren't going with GCN, they are going with an older architecture. Hence "delivering the fastest hardware first" was a non-issue.

What was an issue was giving the best deal overall on old hardware.
 

Olikan

Platinum Member
Sep 23, 2011
My other gripe is how in the world they chose the HD7000 series over Kepler

don't jump to conclusions so fast... the 7870 is pretty much 40% slower and smaller than Kepler... let's just wait for the smaller Keplers, ok?
 

RussianSensation

Elite Member
Sep 5, 2003
It's exactly what the case is. They delivered what the companies wanted first and therefore got the contract. That's how business works.

Ya, it's the best for cost cutting, not the best for gamers. This isn't how the PS3 and Xbox 360 were built. Worst of all, the HD6550D in the A8-3850 and the HD7670 are based on the VLIW-5 design, which AMD just moved away from in favor of GCN. So whatever optimization developers do for those GPUs will likely not carry over to AMD's future GPUs.

However, if you do a little more research you will find something interesting about the 6670: http://www.techpowerup.com/reviews/AMD/HD_6670/23.html . It seems to be one of the most, if not the most, power-efficient 40 nm chips on the market.

Right, but since the next generation of consoles will most likely last another 7-8 years, power consumption is a moot point: they'll shrink the chips and gain the efficiency later. Regardless, they would have been better off waiting another 6 months and launching a Kaveri AMD APU in early 2014. Since Kaveri ships in 2013, that would have given them almost a year to sort out any issues and ramp up manufacturing. Kaveri by itself should have performance ~ HD7750, which is faster than the aggregate speed of the HD6550D + HD6670 in CrossFire, while consuming ~100W of power.

So Kaveri would have been less expensive, more powerful and more power efficient, and would have required just a 6-month delay from their holiday 2013 date. For a console that's supposed to last 7+ years, a 6-month delay is not a lot if they wanted to focus on maximum power efficiency.

It depends on how you look at it. Since the PS4 will be using PC parts, it might be easier to port code and we therefore will get better PC games since there won't be so much performance lost in the overhead.

Maybe, but did we get much better looking games on the PC because the Xbox 1 was essentially a PC? Not really. I think it matters much more how powerful the consoles are rather than whether they use off-the-shelf PC CPUs+GPUs. The PC components in the Xbox didn't allow it to have much better graphics than its GameCube and PS2 competitors. Therefore, there is no real evidence that using an off-the-shelf PC CPU will benefit PC games. It will make console ports quicker and cheaper, but I don't think it'll make them better looking at all because they are still going to be gimped by the HD6770 level of aggregate GPU performance.

For example, tessellation is one of the highlights of DX11, but the HD6000 series has a very weak tessellation unit. So we can pretty much forget about consoles helping to push this feature.

We don't even have to look at how the HD6670 stacks up against the HD4000 series. I found a review with modern games using the best possible scenario for the PS4: an A8-3870 with HD6550D + HD6670 in CrossFire can't even beat a single HD6770 ~ HD4870:

[Benchmark charts: Aliens vs. Predator, Batman, Battlefield 3, Crysis 2, DiRT 3, StarCraft II]

Source

If these specs are true, by the end of 2013 the combined graphical capability of the PS4 will be less than a single HD6770, which itself is going to be slower than the slowest desktop discrete GPU AMD will sell for $100 at that time. Any benefits of "optimized" console ports using off the shelf PC components will be negated by the sheer lack of power that next generation consoles will have. We don't need any more optimized PC games since in 2013, an HD7970/GTX680 will drop to $350 and will max out almost every game on the PC as it is. We need a huge leap in realism. How do you achieve that when the lowest common denominator has performance < HD6770?

Unless PC developers start taking the PC more seriously, the more likely outcome of next generation consoles being so underpowered will be an evident stagnation in PC graphics. Because the games will be coded for PC hardware to begin with, it might even be more likely that we'll get straight-up console ports. Perhaps, as you said, they might develop the games in all of their glory, dumb them down for consoles, and sell the most high-end version for the PC. If true, that would be amazing :) However, considering the previous consoles used ~X1900XT/7950GT style GPUs (which were pretty advanced at the time) and we still ended up with console ports that barely looked better on the PC, I am hesitant to believe this will happen. PC developers should start using the Steam Hardware Survey as the lowest common denominator for graphics, because using the HD6770 as the development base is an underwhelming way to start the next generation of graphics.
 

RussianSensation

Elite Member
Sep 5, 2003
Except, price is also a factor.
And he explicitly said that what isn't the case is the claim that AMD's GCN cards are what won the contract.
His argument is that they aren't going with GCN, they are going with an older architecture. Hence "delivering the fastest hardware first" was a non-issue.

What was an issue was giving the best deal overall on old hardware.

Thanks for clarifying my point more succinctly. :thumbsup:

This time I just might have to agree with your viewpoint that it's better to buy a new GPU than a new console. If the PS4 only has an aggregate performance ~ HD6770, then a $400 GPU at the end of 2013 may actually provide a better overall gaming experience at 1080P for 5 years than buying a $400 next generation console would. Add what will most surely be a lower cost of games via Steam, and it's going to be harder than ever to make a case for consoles from a financial standpoint (not even bringing up the better controls and graphics on the PC, etc.).

When the PS3 launched, at least it had Blu-ray and a somewhat powerful GPU for its time, so a case was easier to justify for its $600 price. I am thinking these consoles will need to cost $300-350 at most now to make any sense if they are cost cutting this much on hardware. Considering an AMD CPU and a Blu-ray drive are actually fairly cheap today, this would have been the most reasonable time to expect reasonably powerful GPUs in consoles. Looks like their design decisions were primarily driven by profits, power consumption and cost cutting rather than a desire to push gaming to the next level. It sounds like technologically advanced parts were put on the back burner this time, likely due to how long it took for Sony to start recouping the losses on the PS3's hardware.

don't jump to conclusions so fast... the 7870 is pretty much 40% slower and smaller than Kepler... let's just wait for the smaller Keplers, ok?

Not trying to get into the HD7870 vs. GTX670 debate. My main point is that if future games will use consoles as the lowest common denominator, then it would have been preferable to have an underlying architecture that allowed developers to push the graphical envelope just a little more, which means higher-resolution textures and tessellation. In both of these areas, the Kepler architecture is anywhere from 50-100% faster than today's GCN iteration. So in the long term, it would have been a lot more preferable for us PC gamers if Kepler were in the consoles.

Worse though, the HD7670 isn't even based on GCN, but a re-badged HD6670 VLIW-5 design. They are basically selling a console with combined HD5770 level of graphics capability in 2013. Not looking great to be honest.
 

pelov

Diamond Member
Dec 6, 2011
They didn't choose Kepler for a very simple reason:

nVidia makes them. They aren't the easiest people to get along with. nVidia also doesn't make x86 CPUs, and unifying console and desktop development also means more money for everyone.

http://www.sfgate.com/cgi-bin/article.cgi?f=/c/a/2003/08/15/BU190365.DTL

If the PS4 does use a Llano design + 6670, then the CrossFire performance is enough to drive 1080p. The Xbox 720 will reportedly be using a Trinity-class APU along with 2 AMD GPUs, most likely also 6670s.

AMD won all 3 contracts because they beat nVidia on pricing. The 680 Kepler is awesome, but it's also the first time in recent history where nVidia clearly beat AMD in price-to-performance. It also costs more than the entire consoles would cost... At the moment there are no Kepler chips for the low end and there are none planned for that segment of the market (essentially the same as AMD: mobile plus mid/high-end desktop, which makes sense given how much ground AMD and Intel integrated graphics have been eating up on the low end. It's just not worth selling low-end discrete GPUs anymore).

This isn't exactly news, though. The choice of hardware is, but [H] has reported AMD's success in the console market since last summer.

And forget the benchmarks for the moment. When developers are closer to the hardware they can optimize for that specific hardware, meaning any 6670 benchmarks would see better performance in console form, and likely double that when CrossFired. Think of it like Gaming Evolved or TWIMTBP titles. When developers have the hardware in hand and only have to worry about 1 specific GPU, they're able to get far more out of it than is otherwise possible. Therefore posting PC benchmarks for games just doesn't translate to what we'll see when it's dropped inside of a console.

So long as it's capable of driving 1080p at good frame rates (they will be even higher than what you've posted in the benchmarks above), is efficient (it's got great perf-per-watt) and is cheap (they're hella cheap), then that's all the console makers care about. nVidia wasn't able to provide those 3.
 

toyota

Lifer
Apr 15, 2001
how will the 7670 aka 6670 run games at 1920x1080? even a modest 5770 is 50% faster than that level of card.
 

pelov

Diamond Member
Dec 6, 2011
how will the 7670 aka 6670 run games at 1920x1080? even a modest 5770 is 50% faster than that level of card.

The developers are closer to the hardware and only have to code for that specific hardware and architecture, meaning they can get far more out of it.

The Xbox 360's Xenos GPU is 240 GFLOPS while the 6670 is 768 GFLOPS. That's more than 3x the GFLOPS, and the Xenos has no problems running current titles considering its comparatively weak hardware. What seems weak on the PC isn't weak on the console.

It's also not one 6670 but two (if the PS4 is a Llano they can crossfire) or three in the case of the XBOX720 (2 6670s and maybe a Trinity APU).
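
Just as back-of-the-envelope arithmetic using the quoted peak figures (and ignoring how well CrossFire actually scales in practice):

$$\frac{768}{240} = 3.2\times \ \text{(one 6670 vs. Xenos)}, \qquad \frac{2 \times 768}{240} = 6.4\times \ \text{(two 6670s)}$$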
 

jpiniero

Lifer
Oct 1, 2010
It's also not one 6670 but two (if the PS4 is a Llano they can crossfire) or three in the case of the XBOX720 (2 6670s and maybe a Trinity APU).

I have serious doubts that the 720 has two 6670s. A 4-6 core PowerPC plus a single 6670 is more realistic.
 

pelov

Diamond Member
Dec 6, 2011
Multiple sources have confirmed this morning that the machine will have two GPUs. One said: "It's like two PCs taped together."

We're waiting for final confirmation of specs, but the graphics cards are thought to be equivalent to AMD's 7000 series GPUs, but "not CrossFire or SLI". The GPUs aren't structured as they are in a normal dual PC set-up, in which the two chips take it in turns to draw lines of the same object: Xbox 720's graphics units will be able to work independently, drawing separate items simultaneously.

http://www.vg247.com/2012/04/02/xbox-720-detailed-blu-ray-inside-always-on-netcon-required/

I'm not sure XBOX will go PowerPC again. x86 would benefit them more than it would anybody else. I haven't read anything about what/who they chose for the CPU yet so I could be wrong.

Ninja edit - I knew I read it somewhere but couldn't dig it up. It was in fact on [H] around the same time as the GPU rumors. http://www.hardocp.com/news/2011/07/20/next_generation_console_hardware_update

The 4-6 cores could potentially pose an issue for size/cost but not if the design was customized. A 6-core Trinity design doesn't exist but if AMD is good at one thing it's adding moar coarz.
 

Dark Shroud

Golden Member
Mar 26, 2010
The current model 360 already has an APU style chip in it. I'm going to bet MS is going to slap a quad core PowerPC CPU together with one or two AMD GPUs on a single die and call it a day.

For some reason I just don't see the PS4 using an x86 CPU.

I just want 4 or more cores and a big chunk of system memory. So when games are ported to the PC they don't suck so badly.

If I could wish for anything, it would be for the PS4 & Xbox 3 to both use XDR memory as system memory, for better performance that won't hold back PCs 8 years from now.
 

RussianSensation

Elite Member
Sep 5, 2003
It also draws 50% more power. 50% more power they don't have in their power budget.

Total system power consumption with Core i5-750 (45nm CPU), 4GB of RAM, 1TB mechanical drive, full size ATX motherboard:

[Chart: total system power consumption under load]

Source

Power consumption is not the problem. A system could easily have accommodated an HD7770 from a power perspective and still come in way under 200W. Alternatively, they could have asked for a custom 28nm HD6850 shrink right off the bat. The total power consumption of that system would also come in well under 200W.
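
As a rough sanity check (the HD7770's typical board power is about 80 W; the CPU and "everything else" figures below are only assumptions for illustration, not numbers from the chart above):

$$\underbrace{80\ \text{W}}_{\text{HD7770}} + \underbrace{\sim 65\ \text{W}}_{\text{CPU (assumed)}} + \underbrace{\sim 25\ \text{W}}_{\text{drive/RAM/misc. (assumed)}} \approx 170\ \text{W} < 200\ \text{W}$$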

Furthermore, the original PS3 had a 380W power supply. Even the Slim now uses a 200-230W PSU. They had no problems with power consumption before, whether from a cooling, power supply capacity, or PSU cost perspective. In fact, the PS3 was very quiet compared to the 360. Sounds like cost-cutting excuses to me.

The two most expensive parts in the launch PS3 were the Cell CPU and the Blu-ray drive. The expensive Cell has been scrapped in favor of an off-the-shelf x86 AMD CPU (~$100), and Blu-ray drives are very cheap today ($60 retail). Sounds like Sony is trying to sell crappy hardware and make $ on it too. Nintendo's strategy is being embraced. :biggrin:

The developers are closer to the hardware and only have to code for that specific hardware and architecture, meaning they can get far more out of it.

The exact same argument can be made regarding the PS3/360. Despite those consoles shipping with GPUs that at the time were at the very least comparable to a mid-range PC (and I'd argue more powerful than that), they still struggled just 5 years after release. Today their graphics are flat-out holding back PC gaming, and even console games have to be scaled back (BF3 running at 1280x720 and then upscaling to 1080P, etc.).

Now imagine that by the end of 2013, HD6550D + HD6670 (~HD6770) performance is flat-out low-end on the PC. How do they expect this round of consoles to last 7 years? Simple: graphics limitations will be exposed even quicker this round.

======

This entire generation screams cost cutting and maximizing profits from day one by selling overpriced low-end hardware and adding fancy marketing about the PS4/720 being "fully capable multi-media boxes for your living room." I wouldn't even be surprised if they price the PS4 at $399 with such crappy hardware, just so that they can make $ off the software and the hardware sales like Nintendo does. Looks like the printer-ink or razor-and-blades loss-leader strategy is out the window...
 

omek

Member
Nov 18, 2007
Worse though, the HD7670 isn't even based on GCN, but a re-badged HD6670 VLIW-5 design. They are basically selling a console with combined HD5770 level of graphics capability in 2013. Not looking great to be honest.

Not even.
http://www.techradar.com/reviews/pc...ds/amd-radeon-hd-6670-949140/review?artc_pg=2

It's quite sad.
We are going to be saying goodbye to the potential of tessellation and physics, and I can't see the use of PC hardware helping on any large level when the GPU(s) are inherently so weak. Games will most likely be baked so badly at the core that increasing IQ when porting to the PC or implementing extensions won't be feasible or within a developer's budget.

But I also think that the PC will start attracting developers much more than it does currently once the gap widens 2 or 3 years from the next gen's release.
It's going to come down to gamers deciding between sucking it up and using a Windows platform or dealing with a severely hindered console experience, especially gamers who want to see something better.
Maybe this is what is needed to get some specialized PCs with stripped-down Windows versions out. If it weren't for the need of Windows for DX, I think it would have happened years ago.
 

Bobisuruncle54

Senior member
Oct 19, 2011
What do you think the likelihood is of the next consoles' GPUs being manufactured on a 40nm process? I would hope for 28nm to be used, but judging from all the rumours it seems as if this generation will be a cost-cutting measure first and foremost.

Even taking into account the older Xbox 720 rumour of the dev kit using a 6670 GPU, and that the Xbox 360 dev kit used an X800 XT GPU before being switched to a modified X1900-based core for the console itself, the proportionate potential increase of the GPU would be as follows:

Xbox 360 - Peak theoretical performance:
X800 XT - 182 GigaFLOPS
Xenos GPU - 240 GigaFLOPS

Increase in power - 31.9%. So this means the equivalent for the next Xbox would be:
6670 - 768 GigaFLOPS
6750 - 1013 GigaFLOPS
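
Spelling the scaling math out:

$$\frac{240 - 182}{182} \approx 31.9\%, \qquad 768 \times 1.319 \approx 1013\ \text{GFLOPS}$$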

Even if this best case scenario does happen, it's still massively disappointing IMO.
 

MrK6

Diamond Member
Aug 9, 2004
Another point I think people are missing is that the reason our current console generation has lasted so long is that it's taken this long for manufacturing capabilities to catch up and for Sony and Microsoft to recoup some of the losses from the initial years. By making cheaper and more power-efficient consoles, the next generation might be much shorter, since the companies won't have to stretch things out to reach profitability.
 

railven

Diamond Member
Mar 25, 2010
Wow, this thread turned into a chore, and magically again - AMD vs NV. I didn't even know NV had Kepler products out for sampling last year. I know the AMD rumors have been circulating for almost a whole year now.

Factor in that Microsoft and nVidia have a bad history, and that Sony "claims" nVidia sabotaged the GPU (however, if you followed closely, you'd realize Sony wasn't prepared to have a standard GPU in their setup, so I'd blame Sony more - but businesses point fingers, of course).

In the end, I doubt MAXIMUM performance was anyone's top priority. These guys don't care about the PC or PC gamers. We're a rival to them. Why would they go with something that threatens their bread and butter? Gamers (as a whole) are easily satisfied. Have you guys played any of these indie games? Or the mobile games? They aren't breaking any walls in terms of graphics, but they sure are fun and are being bought without question.

Unless Valve/Steam just sign an OEM contract, expect consolitis to last another decade :(
 