
Console companies just don't have the resources to compete with GPU makers

colonelciller

Senior member
http://www.tomshardware.com/news/nvidia-consoles-gpu-graphics-power,24390.html

Console companies just don't have the resources to compete with GPU makers.



Nvidia's Senior Vice President of Content and Technology, Tony Tamasi, recently talked with PC PowerPlay about the typical comparison between the PC platform and consoles. He revealed that, unlike past generations, which were on par with if not better than many performance PCs at launch, console makers no longer have the resources to jump ahead of the PC gaming hardware market. This became obvious during E3 back in June, as many demos showcased on the Xbox One and PlayStation 4 just didn't look as good as the earlier PC versions.

Tamasi explained that at the time of the first PlayStation console, there really weren't good graphics on the PC. It wasn't until the PlayStation 2 that 3D really started to shine on the mainstream PC platform. By then, the likes of Sony, Nintendo and Sega could dump tons of money into hardware to support (then) high-quality 3D graphics. Tamasi even admitted that the PlayStation 2 was faster than a PC at the time of its launch.

Once the Xbox 360 and PlayStation 3 arrived, their hardware was on par with the PC at launch. Look inside those boxes and you'll find hardware by AMD and Nvidia because, at the time, they were leading the innovation in PC graphics. He said that Nvidia alone spends $1.5 billion USD per year on research and development, and over $10 billion in graphics research during a single console's typical lifespan. Microsoft and Sony simply don't have that kind of pocket change to dump into research, whereas AMD and Nvidia sell millions of chips year after year.
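Taking the article's own figures at face value (a quick arithmetic check, not audited financials), the yearly R&D number does land in the neighborhood of the lifetime claim:

```python
# Illustrative arithmetic using the figures quoted in the article above.
annual_rnd_usd = 1.5e9          # claimed yearly R&D spend
console_lifespan_years = 7      # a typical console generation length

lifetime_rnd_usd = annual_rnd_usd * console_lifespan_years
print(f"~${lifetime_rnd_usd / 1e9:.1f}B over one console generation")
```

At a seven-year generation that works out to roughly $10.5 billion, consistent with the "over $10 billion" quote.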

"The second factor is that everything is limited by power these days. If you want to go faster, you need a more efficient design or a bigger power supply," he explains. "The laws of physics dictate that the amount of performance you're going to get from graphics is a function of the efficiency of the architecture, and how much power budget you're willing to give it. The most efficient architectures are from Nvidia and AMD, and you're not going to get anything that is significantly more power efficient in a console, as it's using the same core technology."

Consoles will always be less capable than a PC because they have power budgets of only 200 watts or 300 watts – they're designed to run quietly and cool in the living room. On a PC, 250 watts can be used solely on the GPU, thus consoles will never beat a 1000 watt PC. In a chart provided by Nvidia, the trajectory shows that consoles will never equal or surpass the PC platform again, and that the tiny window between 2005 and 2006 will likely be the last time these two industries will ever be on the same page.
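Tamasi's claim reduces to simple arithmetic: if console and PC GPUs share the same architecture (so roughly the same performance per watt), performance scales with the power budget. The efficiency figure below is invented purely for illustration:

```python
# Hypothetical perf-per-watt number; only the ratio between budgets matters.
gflops_per_watt = 10.0

console_gpu_budget_w = 100   # rough share of a 200-300 W console left for graphics
pc_gpu_budget_w = 250        # a single high-end PC card's power budget

console_gflops = gflops_per_watt * console_gpu_budget_w
pc_gflops = gflops_per_watt * pc_gpu_budget_w
print(f"console ~{console_gflops:.0f} vs PC GPU ~{pc_gflops:.0f} GFLOPS "
      f"({pc_gflops / console_gflops:.1f}x)")
```

Same silicon technology, 2.5x the watts, roughly 2.5x the throughput – which is the whole argument.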

"At that time, the PC graphics industry wasn't operating at the limits of device physics and power," he said, referring to why the X360 and PS3 were on par with the PC despite their power limitations. "If you wind back the clock, a high-end graphics card at that time was maybe 75W or 100W max. We weren't building chips that were on the most advanced semiconductor process and were billions of transistors. Now we're building GPUs at the limits of what's possible with fabrication techniques. Nobody can build anything bigger or more powerful than what is in the PC at the moment."
 
Someone should remind Mr. Tamasi that MS has more resources than Nvidia and AMD combined. In 2012, Forbes valued Nvidia at just under $4 billion. MS is worth 20 times that.

It is not a matter of resources, but a matter of what the consumer will accept and the market will tolerate.
 
I stopped reading at "Nvidia's Senior Vice President."

They're really upset over losing out on this round of consoles. All they can do at this point is whine, since bad publicity is better than no publicity.
 
When will they understand that it doesn't matter if a PC is faster, has better graphics, etc., when Sony and MS own the studios making the best games and there are more people actually buying full-price games on console?
 
There are pros and cons to it. I also think this article comes off as a butthurt response, but at the same time it's been VERY obvious that the console rules trend of the last 7 years has held back true advancement and imagination in gaming.

Look at the first Crysis and the beast of a system it required to run in all its glory, then look around and realize it is still more or less used as a comparison all these years later and nothing else has come close, simply due to laziness (oh, and PC gaming taking a backseat).

That is not to say "prettier" is better, but rather risks, going beyond, and pushing limits, most of which devs do not attempt these days. All of that can still be tied to overall power available. It becomes more apparent when the scope of a game is lessened due to trying to match the lowest common denominator (consoles). It also becomes apparent when the standard has been 1080p for years now and the latest consoles still struggle to do it. It's not about it being 1080p, but rather that they made a conscious decision not to, which is even funnier when everyone blasted Nintendo for making the Wii standard def.

Game development by most studios related to consoles is about being safe. So there are a handful of great games for 360/PS3 but that's over their entire lifespan and still no comparison to previous generations.

I think things would be smoother if console devs stuck to consoles and PC devs stuck to PCs, but it's just not that way anymore, and that is where so much dissatisfaction comes from: cutting unnecessary corners due to other platforms' limitations.

Look at what devs are already saying about the Xbone/PS4 and how utilizing the unique features and power differences of the two consoles will likely not be part of the dev plan, due to cross-platform releases and not wanting to piss anyone off.

You can only blame costs for so much of it. Many companies simply don't know how to not bleed money.
 
Nvidia is just pissy cause nobody went with nvidia chips for the new consoles.

And OP fails at headline making... no one is stupid enough to think consoles are competing with gpu makers... consoles don't make gpus.
 

This is true. Nvidia is still very profitable with a sizeable portion of the dGPU market. They needn't worry. Both Nvidia and AMD ended up losing money on the PS3/360 in the end anyhow. Consoles don't bring you out of the red just cause you provide the APU.
 
Nvidia are currently trying to explain why the claims of the console makers and AMD about being faster than a discrete card are completely false. They have shown the historical context and the difference today; now they are explaining why that difference exists: power budget.

Power budget is the reason our mobile phones run low-power ARM chips and not i7s; it's also the reason why consoles have weak, Atom-like cores at low clock speeds and mid-range GPU hardware. It's not really about Nvidia complaining; it's about explaining why the claims of the console makers about the performance of their parts are absurdly false. AMD isn't doing that because it's a partner, and it can't really be playing down the performance of its own parts anyway. So it's left to Nvidia to explain that this set of consoles isn't as impressive as the Xbox 360 and PS3 were on release.

The reason they do this is clear: they are trying to ensure more customers choose PC for this coming cycle, and presumably Nvidia as well. They get nothing if customers go off to console land. That doesn't mean the argument is wrong just because it comes from a company defending its market; it's still a true statement about the actual comparative performance of the parts and the reason why they have the performance that they do.
 

Agree with this except that the markets aren't exactly the same so they are speaking to people who won't actually listen. Like many have said in other threads, most consumer console (only) people have no clue and probably won't care one way or the other.

I honestly do not care which is more powerful. I care about the quality of games.

Having said that, I also care about advances in technology. Gaming has stagnated as a whole due to the longevity of console generations and their grip on gaming. This was the opportunity to push things forward...and they didn't. IMO some of the best games to come out lately are retro-type games. While good, they are not by any means pushing technology forward. Even the non-retro games are not really pushing anything into the future.

In other words... I have mixed emotions about the next 10 years of gaming.
 
One of the problems is we've already had technology pushed so far that the increments we get are very small now. The jump from 2D to 3D engines was huge. The entire 3D environment was huge. Now, what do we get? Larger environments and better textures? Nothing close to that level of an increase.

The true pushing of technology isn't coming from slightly better graphics than the previous generation. It is in completely different areas like the Oculus Rift and even in technology like the Kinect (which isn't implemented more than a gimmick, but the technology has vast potential).

Nvidia doesn't get this, it seems. Crytek is another company that doesn't seem to get this. Crysis wasn't a good game. It just looked great.
 

But, and this is speculation and I've said this before, AMD getting the contract for both systems so they could set up their new APU roadmap puts AMD in a place where it can build its lead. Nvidia's GPUs are #1 when you consider the current structure of separate CPU and GPU, where the GPU and CPU run their own types of code, as games do now. AMD currently suffers doing this because their old fabrication process causes their performance to lag behind Nvidia's. AMD, on the other hand, sees that if they merge their CPU and GPU, and get everything programmed in the same language, performance on their APUs will skyrocket.

To enforce such a roadmap in the PC realm would have been all but impossible: their APUs have been lackluster for several years now, and getting game makers to follow their roadmap of merging the programming models of the GPU and CPU to run on their APUs (I say roadmap; currently their APUs don't do this yet) would be self-defeating. But now AMD has ALL the new console games being created this way, which means all those new console games will be ready to take advantage of their PC APU lineup, offering high performance compared to CPU/GPU setups, which games would have to be modified to support, bringing down performance.

https://en.wikipedia.org/wiki/AMD_Accelerated_Processing_Unit is a good read on AMD's future.
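A toy model of the shared-memory pitch described above (every number here is invented for illustration): on a discrete CPU + GPU, data must cross the expansion bus before the GPU can work on it, while a unified-memory APU skips that copy entirely:

```python
# Invented figures; real bus bandwidths and kernel runtimes vary widely.
data_gb = 1.0
bus_bandwidth_gb_s = 8.0    # assumed effective CPU-to-GPU transfer rate
gpu_compute_s = 0.05        # assumed kernel runtime once data is resident

discrete_s = data_gb / bus_bandwidth_gb_s + gpu_compute_s  # copy, then compute
apu_s = gpu_compute_s                                      # shared memory: no copy
print(f"discrete: {discrete_s * 1000:.0f} ms, APU: {apu_s * 1000:.0f} ms")
```

The smaller the kernel relative to the data it touches, the more the copy dominates – which is the case where a unified design like the one the poster describes would pay off.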
 

When you move off 1080p you need more than an APU, and Nvidia sees this. Also, benchmarks are dominated by Intel CPUs, and that will probably remain the case, especially when Intel goes down to 20nm and below and Nvidia releases Maxwell, which can brute-force past everything with a CPU on the dGPU strategy. When it comes to the future it isn't 1080p; it's beyond that toward 1440p and 1600p, then 4K, and APUs are not the answer there. Unified memory is a step closer, but when the dGPU and CPU can share a pool of GDDR6 with low latency and ultra-high bandwidth, then we will see some big gains.
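The resolution argument above is easy to quantify: the per-frame pixel count (a rough proxy for fill-rate and bandwidth demand) grows quickly past 1080p:

```python
# Common display resolutions and their pixel counts relative to 1080p.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}
base_pixels = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.1f} MPix ({pixels / base_pixels:.2f}x of 1080p)")
```

4K pushes exactly four times the pixels of 1080p, which is why the poster argues an APU-class part runs out of headroom beyond 1080p.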
 
I think the real difference is that PC GPUs have become massive, expensive pieces of technology, making them impractical for consoles. GPU die size has grown a lot over the years and there's no way to shoehorn one of those into a console.

So yes, the console graphics aren't going to look as good, but that's because the entire console can be purchased for the price of just one high-end GPU that it has to compete against.
 

Or half the cost lol. I have more money in my GPUs than the PS4 and XB1 cost (not current value, but retail at the time of purchase). That's cause I went with SLI.
 
What a ridiculous statement. They're not competing, they're partners.

Should console makers be competing with Blu-ray to develop a higher-performance physical medium?

Should console makers be competing with network companies to get faster network speeds on their hardware?

A console is a point in time packaging of technology, designed to provide a common gaming platform for tens of millions of units.

Edit: Consoles may have, in the past, been more customized hardware. But we also used to have dozens of graphics and audio chip makers. And even then the custom silicon was often based on existing designs such as the Z80 or 68000. The market has consolidated and specialized, so why would console makers want to reinvent the wheel?
 
nVidia is just cheesed they didn't find themselves in consoles this time around. AMD just happened to have what they wanted for the Xbone and PS4: the capacity to manufacture an x86-based SoC with high-performance integrated graphics.
 

high performance and integrated graphics don't fit together in a sentence lol
 
The other thing to remember is that almost all the innovations used in consoles come from elsewhere. The graphics are extracted from the PC discrete cards and placed into the consoles. The CPUs come from PC or other companies (like IBM). The unfortunate reality is that the console market couldn't exist at the performance level it does without the continuous innovation of the PC discrete GPU and CPU makers. The 7-10 year cycle of the consoles is much too long to support their vast research efforts.

It's not a good trend for everyone to be moving to packaged commodity machines. It's not a model that will continue to bring us improvements rapidly with enormous research budgets.
 

Well, better than Tegra 4 is what I meant. You can build high performance systems as an SoC. The 360S for example has the CPU and GPU on a single die. It's cheaper and more efficient to do it this way with embedded systems.

nVidia's SoC production is limited to the Tegra 4. It's only a 32-bit chip, limiting RAM to just 4GB. It's only quad core, four thread. Project Denver, which is the 64-bit ARM chip they're working on isn't due out until 2015. There's no reason they couldn't strap a GTX 760 onto the Tegra 4 but the CPU is still the limitation. Besides, I think Sony and Microsoft wanted x86 chips to make game development easier. Simplifies the process of porting games between Xbox One, PS4, Windows, Mac, and Linux. Something devs have wanted for a long time.
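The 4GB ceiling mentioned here follows directly from the address width (assuming a flat 32-bit address space, ignoring extensions like PAE/LPAE):

```python
# A 32-bit pointer can name 2**32 distinct byte addresses,
# which caps directly addressable memory at 4 GiB.
addressable_bytes = 2 ** 32
print(f"{addressable_bytes:,} bytes = {addressable_bytes // 2**30} GiB")
```

A 64-bit chip like the Project Denver part mentioned would lift that cap by orders of magnitude.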
 

Nvidia is banking on a 20nm Maxwell GPU with CPU cores on it for very big gains in performance sometime next year, though. I don't know what kind of CPU they can put on it; I wouldn't imagine it being x86 for this purpose.
 
Things can turn so quickly in this industry, so I wouldn't worry about any of this and just enjoy the games 🙂

Hey the next console could very well have an Intel Atom CPU and Nvidia GPU again.
 

That won't happen for at least another 8 years; they want this coming generation to last a long time as well.
 