crappy gpu in next xbox?


RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I think anyone dreaming of even a 7850 is crazy. The entire console will likely draw <150W. I'm hoping for 7770-calibre graphics and 128-bit 1GB GDDR5, which would be a huge step up from what we have now. They're likely going to launch it as an SoC as well.

It's not crazy, because the PS3 and 360 were able to do it - put a mid-range to upper-mid-range GPU into a console. BTW, they don't take desktop GPUs and drop them into a console. They use a different stepping, similar to mobile chips, often with reduced memory bandwidth or slightly detuned GPU clocks; the lower clocks allow a lower GPU voltage, which cuts power consumption. So comparing against the HD7850's desktop power consumption is irrelevant. How do you think AMD and NV release the HD7950M / 7970M / GTX680M and put them into a 2-inch-thick laptop? The GK104-based GTX680M has 1344 SPs, 32 ROPs and ~700MHz GPU clocks, with reduced memory bandwidth. It's still miles faster than a desktop HD6670 and consumes only ~100W. They don't have to take the highest GPU from the mobile stack, though.

The HD7970M is essentially a desktop HD7870 clocked at 850MHz, and its power consumption is 100W. That means the HD7950M will be well below that (<75W TDP) and still retain the Pitcairn core and 2GB of VRAM.

You can't compare power consumption of desktop GPUs and mobile GPUs because AMD/NV bin mobile chips differently.

HD7870 desktop: 1000MHz, 153 GB/sec memory, 2GB = 175W TDP
HD7970M: 850MHz (same Pitcairn as the desktop 7870), 153 GB/sec memory, 2GB = 100W TDP (the 7870 is ~18% faster, yet the 7970M has a 75W lower TDP!)
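For a rough sense of why the same Pitcairn die can drop from 175W to about 100W, here is a back-of-envelope sketch using the common dynamic-power rule of thumb (power scales roughly with frequency times voltage squared). The voltages below are my own assumptions for illustration, not AMD's published figures:

```python
# Back-of-envelope only: dynamic power ~ frequency * voltage^2 (ignores leakage and memory power).
# The voltages are assumed values for illustration, not official AMD specs.
desktop_clock_mhz, desktop_voltage, desktop_tdp_w = 1000, 1.20, 175  # desktop HD7870
mobile_clock_mhz, mobile_voltage = 850, 1.00                         # HD7970M, same Pitcairn die

scale = (mobile_clock_mhz / desktop_clock_mhz) * (mobile_voltage / desktop_voltage) ** 2
print(f"estimated mobile power: {desktop_tdp_w * scale:.0f}W")  # ~103W, close to the quoted 100W
```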

Here is a good article with tables comparing NV's and AMD's top mobile and desktop equivalent GPUs:
http://www.notebookcheck.net/Review-GeForce-GTX-680M-vs-Radeon-HD-7970M.77110.0.html

The original Xbox 360 used 180W of power at load.
The original PS3 used 240W of power at load.

There is no power-consumption reason why MS can't put an HD7950M (roughly an HD7850) inside the next Xbox. If they cheap out, sure, but power consumption is not the reason it can't be done.
 

Rezist

Senior member
Jun 20, 2009
726
0
71
It's not crazy, because the PS3 and 360 were able to do it - put a mid-range to upper-mid-range GPU into a console. [...] There is no power-consumption reason why MS can't put an HD7950M (roughly an HD7850) inside the next Xbox. If they cheap out, sure, but power consumption is not the reason it can't be done.

Don't forget the 360's massive failure rates, as well as the launch PS3's, even if those didn't get the same press. I honestly don't think they can go through that again; keeping the TDP low keeps the manufacturing cheaper and the console more reliable, IMO.

Also, will the 7970M and 680M be less than $150 in a year? I doubt it.
 

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
They must want to sell it for ~$399 and still make a slight profit. With a built-in Kinect, I guess they have to cheapen the GPU to make that budget. So depending on what you think of motion controls, they are either going:

Gameplay over graphics
or
Gimmick over graphics

Not counting the Wii, this may be the smallest generational leap in graphics yet - despite the longest gap between generations ever. Even leaps within a generation used to be amazing: compare the original Xbox to the Dreamcast.

And it's quite right that some consoles used to ship with near-high-end GPUs when they came out (Xbox, Xbox 360, PS3). This'll be something else, alright.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
The Wii U has some sort of AMD GPU in it. Does anyone know what it is approximately matched by on the desktop?
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Here's what I am betting on: Kinect will be force-fed to you. So if you hate motion controls like me, then Microsoft is the worst thing to happen to gaming ever - from buying exclusive contracts to forcing controls that don't work.

Second, I am betting on no Valve console. From everything I have read from them, nothing tells me they want to touch that market. It is way too risky, and they don't have Microsoft's wallet to play with.
 

Gordon Freemen

Golden Member
May 24, 2012
1,068
0
0
The Wii-U has some sort of AMD GPU in it. Does anyone know what it is approximately matched by on the desktop ?
From what I've read, the Wii U will have a GPU similar to the AMD Radeon HD 4850 512MB, but don't quote me on that; it's just what I have read. We will know for sure when Nintendo reveals the exact specs.
 

Kippa

Senior member
Dec 12, 2011
392
1
81
What I am betting on. Kinect will be force fed to you. So if you hate motion controls like me then Microsoft is the worst thing to happen to gaming ever. From buying exclusive contracts to forcing controls that don't work.

You think Kinect is bad? Try playing Decathlon on a ZX Spectrum. Ah, the good old days, before the scientists discovered repetitive strain injury.
 

sandorski

No Lifer
Oct 10, 1999
70,677
6,250
126
Seems like many are looking at this all wrong. How much better would it be compared to the current Xbox/PS3? More features, better performance, and more capability make it a compelling product.

As for PC users, at least console ports would offer DX11 features and likely other improvements to textures and whatnot from the start. It wouldn't be revolutionary, but it would be a definite improvement.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Seems like many are looking at this all wrong. How much better would it be compared to the current XBox/PS? More features, better performance, and more capability makes it a compelling product.

As for PC users, at least Console Ports would offer DX11 features and likely other improvements to Textures and what not from the start. It wouldn't be revolutionary, but would be a definite improvement.

Pretty sure "improvement" was a given; what we're all unimpressed by is the rumored "amount" of improvement.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
If these rumours are true, the Xbox 720 is obsolete from day 1 for Unreal Engine 4 and CryEngine 3 games at 1080p with 4xAA at 60 fps. Essentially they'll be lucky to get AA + 30-40fps in most games at max settings.
You're assuming they'll use MSAA or some other form of real AA. Between current console games, PC games without real AA, and NVIDIA's own efforts in this arena, everyone is already on the faux-AA bandwagon. So I wouldn't expect Xbox Next developers to use MSAA; they'll all implement FXAA (or some other form of faux-AA) and call it a day.

Even on the 6670, FXAAing a 1080p image is a couple of milliseconds at worst, which is borderline trivial. It's faster to implement, easier to implement, faster to run, and clearly no one actually cares about image quality, so why even bother implementing MSAA?:|

The goal for developers here will be 1080p at 30fps, or >720p at 60fps. Without real AA and with CTM optimizations, the 6670 actually stands a chance.
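To put rough numbers on that (my own arithmetic, assuming the ~2ms FXAA cost mentioned above): a 30fps target leaves a ~33ms frame budget and 60fps leaves ~16.7ms, so a 2ms FXAA pass eats only a small slice of either.

```python
# Quick frame-budget arithmetic, assuming roughly 2ms for a 1080p FXAA pass on a 6670-class GPU.
fxaa_ms = 2.0
for fps in (30, 60):
    budget_ms = 1000.0 / fps                      # total time available per frame
    share = fxaa_ms / budget_ms                   # fraction of the frame spent on FXAA
    print(f"{fps}fps: {budget_ms:.1f}ms budget, FXAA ~{share:.0%} of the frame")
# 30fps -> 33.3ms budget (~6% on FXAA); 60fps -> 16.7ms budget (~12% on FXAA)
```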

Frankly, I'm more terrified at the prospect that we're going to lock ourselves into another 5+ years of gaming based on what will be a 4-year-old feature set. Ten years is a long time for D3D11 to be the baseline feature set. Even current-gen consoles didn't start that far behind (D3D9.0c wasn't even 2 years old when the 360 launched).
 

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
I'd imagine that the 720 would use MLAA, since that's the post-processing AA method AMD likes to push.

Honestly, I'm not too apprehensive about MLAA or FXAA. New iterations keep being released that increase their precision, and modern lighting engines have decreased the effectiveness of true MSAA while increasing its memory cost (e.g. Battlefield 3). If using MLAA or FXAA frees up memory bandwidth for higher-resolution textures or other effects, I think it's worth the tradeoff (this from a guy who routinely resorts to FXAA or MLAA on his obsolete graphics card :p ).
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Don't forget the 360's massive failure rates as well as the launch PS3's even if it didn't get the press. I honestly don't think they can go through with it again, keeping the TDP low keeps the manufacturing cheaper and the console more reliable IMO.

Also will the 7970m and 680m be less than 150$ in a year I doubt that.

Technically speaking, AMD/NV don't sell GPUs for $500. They sell the chip (or a kit, perhaps), and then add-in board partners buy the cooling and the memory, put together a PCB, pass the premium on to us, etc. For example, the actual 6970/570/580 chips only cost $90-120 directly from AMD/NV. There is no reason at all MS can't buy an HD7950M or 7870M for $100 and spend $50 on memory and cooling. It's doable by the end of 2013, as those chips will have been replaced by the HD8000M series, which means they'll be outdated tech for AMD.

I think the main issue with the 360's cooling was not the heatsink or its inability to handle the heat, but that MS cheaped out on the adhesive and it would degrade over time. Eventually the heatsink would separate from the GPU completely. That was probably $0.50 MS saved that cost them billions. Also, I am not saying they have to use the 7970M/680M. I only used those to show you that the top mobile chips only use 100W of power. The chips below them, such as the 7950M, use much less and are barely slower (they are slower than the 7970M/680M, but still probably 4x faster than the 6670 equivalent). If cost is an issue, then there are all the other AMD chips such as the 7850M/7870M, etc. If you actually think about it, something like an HD6670 that retails for $50 would probably cost MS $35-40, since there are no intermediaries such as retail channels or AIBs.

Selling ~HD6670-level performance in a $300-400 console is appalling, not only from a performance perspective but because the BOM of this part will be $40-45 (tops!). That's almost insulting considering the main point of a gaming console is to play games, with everything else secondary IMO, unless MS plans to rebrand this as a Media Center with gaming capability.

Either this dev kit isn't representative of the final product, or MS wants to make a lot of money selling a console with budget components as "high-end" based on marketing. I can already see them calling a 4-core / 4-thread CPU a 16-core in their launch presentation. :cool:
 

Sonikku

Lifer
Jun 23, 2005
15,882
4,882
136
You can all blame the Wii. Business is about profit, not about having the best product. Now, generally, having the best product is conducive to making the most profit. But not anymore. Sony and Microsoft broke their backs trying to one-up each other in horsepower, only to take on staggering losses. It got so bad it took them years before they broke even on each unit, and years more of making a profit on each unit to pay themselves back for the years they were taking losses. Then along came the Nintendo Wii.

Consumers voted with their wallets for the Wii, which easily outpaced either of its competitors. And it did it all with significantly lower specs and costs. In other words, breaking your back to appease the small niche of graphics whores who want good graphics in their console, but not badly enough that they'd build a PC, just isn't worth it compared to the "mainstream gamer" who makes or breaks a console. So the question is, why bother? Why bother making a costly box if it is no longer the way to gain market share and profit?

That is the question Sony and Microsoft ask themselves when building their new systems. And there you have it: they're both going cheapo. Profit right out of the gate! Good for them, bad for everyone else. They learned that having a weak system isn't a liability when everyone else has a weak system too, just as having a powerful system isn't an advantage when the other guy has one as well. It's at best the "new baseline" and at worst "not relevant" to most gamers.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
To us it's shocking, but a very large contingent of console gamers only games on console. They have no idea what they are missing, so these 'new' 'more powerful' consoles will actually look like a real upgrade to them...

They're in the stone age. They're basically in the position of a PC gamer going from an ATI X1900GT up to an AMD 6670. Dreadful!
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
@Sophitia

You have a point; however, what "really" sold the Wii wasn't the graphics, it was the "experience" you could get with the system.


1) The Wii remote has a speaker that makes certain "sound" effects for games.

2) The Wii remote has a rumble effect (it shakes) (I know others have this too).

3) The Wii "comes" with a Kinect-like system by default, plus a software ecosystem built around it (because all Wiis have it).


It might sound gimmicky, but I loved playing Resident Evil 4 on my brother's Wii and using the remote to point at the TV like a gun :) It just adds something extra to the experience when you're gaming.

Also... with Nintendo you know you'll have:

Super Mario games (of all types, e.g. Mario Kart (racing) / Paper Mario (RPG), etc.)
Donkey Kong Country
Zelda
Kirby
Punch-Out
Sonic
Metroid (http://wii.ign.com/objects/143/14352258.html)

And the random oddball that pops up that's decent, like:
Monster Hunter


When you buy a PlayStation, do you know of any titles that are "Sony"-only that make you want to buy the console?

Why the Xbox/PlayStation didn't sell as well might just be a question of price, but it could also be that "powerful graphics" isn't enough to sell a system.
 

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
Technically speaking AMD/NV don't sell GPUs for $500. They sell the chip (or kit perhaps) and then add-in board partners buy the cooling, the memory, put together a PCB, and pass on the premium to us, etc. For example, the actual 6970/570/580 chips only cost $90-120 directly from AMD/NV. There is no reason at all MS can't buy HD7950M or 7870M for $100 and spend $50 on memory, cooling. It's doable by end of 2013 as those chips will have been replaced by HD8000M which means they are outdated tech for AMD.

Sony/Microsoft wouldn't be buying chips from AMD at all. For the current console generation, Sony and Microsoft paid ATi/Nvidia/IBM to design the CPU and graphics chips, but owned the rights to those designs in the end. They produce the chips and pay the manufacturing costs directly. This was to avoid a situation like the one Microsoft got into with Intel on the original Xbox: it used a Pentium III, but Microsoft had to pay Intel for each chip, and Intel was reluctant to drop the price even when manufacturing costs went down.

But yeah, you have a point. Microsoft wouldn't be paying anywhere near full retail price for each 7970- or 7870-based chip it made for a 720.

Selling ~HD6670-level performance in a $300-400 console is appalling, not only from a performance perspective but because the BOM of this part will be $40-45 (tops!). [...]

At the least -- LEAST -- what I would want in the 720 is an adapted Juniper (Radeon HD 5700 series) graphics chip. Juniper remains the best non-GCN graphics chip with a 128-bit-wide memory bus, so Microsoft/AMD wouldn't have to spend extra research trimming down a bigger chip to work with a memory bus that size. Toss in an improved HD 6000 series tessellator, and you have a chip that's a damn sight better than a 6670. How much do you think that would cost?

Better than that, as I have mentioned before, would be a VLIW4 chip. Again, AMD apparently hasn't finished with VLIW4 yet, since it made its way into Trinity. In fact, VLIW4 might be better than GCN for a game console, since GCN focused on compute and GPGPU improvements that a console wouldn't really need. If they go VLIW4, at the very least we would get the same core that's in Trinity, or maybe something a little beefier (Cayman [Radeon HD 6900 series] please? :'().

You can all blame the Wii. Business is about profit, not about having the best product. [...] Consumers voted with their wallets for the Wii, which easily outpaced either of its competitors. And it did it all with significantly lower specs and costs. [...]

You have a point, though recently the Xbox and the PS3 have been gaining steam over the Wii.

When you buy a Playstation, do you know any titles that are "sony" only that make you want to buy the consol?

God of War, LittleBigPlanet, Uncharted, Resistance, Killzone, etc. If I had deeper pockets, I would buy a PS3 so I could play those games.
 

HurleyBird

Platinum Member
Apr 22, 2003
2,800
1,526
136
You have a point however what "really" sold the Wii wasnt the graphics, it was the "experiance" you could get with the system.

Sure, but lack of good graphics is perhaps the largest reason it stopped selling.
 

HurleyBird

Platinum Member
Apr 22, 2003
2,800
1,526
136
Better than that, as I have mentioned before, would be a VLIW4 chip. Again, AMD apparently hasn't finished with VLIW4 yet, since it made its way into Trinity. In fact, VLIW4 might be better than GCN for a game console, since GCN focused on compute and GPGPU improvements that a console wouldn't really need. If they go VLIW4, at the very least we would get the same core that's in Trinity, or maybe something a little beefier (Cayman [Radeon HD 6900 series] please? :'().

Or heck, why not VLIW5 in that case? More theoretical perf/mm² and perf/watt; it's just harder to extract. That's really not a problem in the console arena, since developers will learn to extract every last ounce of performance out of the system given enough time.

That being said, the advantage that simpler architectures like VLIW4/5 have over GCN in traditional rendering really pales in comparison to the advantage GCN has in compute. If GCN were placed in a console, you can be damned sure developers would become very proficient at targeting its strengths - in this case, that would mean leveraging its compute capabilities.

I think the best thing we can hope for in the new consoles is that they are as architecturally forward-looking as possible. GPUs are going to keep becoming more compute-focused over the eight years or so these new consoles will be around, and using a slightly more efficient architecture for present workloads at the expense of likely future workloads is liable to create a schism that hurts the industry.
 

Gordon Freemen

Golden Member
May 24, 2012
1,068
0
0
Which again mostly stems from the wii not having sufficient graphics horsepower to play those good games.
Sure it could, and it does, with reduced graphics. The main reason is that Nintendo wants to keep its PG rating. Also, good graphics don't mean it's a good game.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Grooveriding, remember that thread you made comparing Crysis 1 on PC vs. the 360? The lighting may have been improved, but they removed a lot of the detail in Crysis 1 and blurred the background/reduced draw distance to get it to run. While it is true that developers optimize much more efficiently for consoles, eventually you simply run out of power and are forced to cut corners. HD6670-level performance may be good for 3 years, esp. compared to the PS3 and 360, but then we'll be back to even faster consolization :(.

Because many console gamers don't follow hardware, they might hear that Crysis 1 was very demanding on the PC. Then they'll play it on the 360 and see that it runs smoothly. Then they think consoles are just as good as a PC for $200. The only thing is, in some places Crysis 1 on the 360 barely even looks like the same game.
 

Bobisuruncle54

Senior member
Oct 19, 2011
333
0
0
sure it could and does with reduced graphics. the main reason is nintendo wants to keep it PG rating. Also good graphics does not mean it's a good game.

Not really; you're forgetting that the comparatively weak CPU of the Wii is part of the equation. You can't really get around that issue, as it would mean changing too much of a game if it's already leveraging even half the performance of the CPUs in the PS3 and 360.