AMD to power Wii U


Phynaz

Lifer
Mar 13, 2006
10,140
819
126
http://www.engadget.com/2011/06/07/ibm-puts-watsons-brains-in-nintendo-wii-u/
Says that they used the same tech as they did in Watson. That would indicate a Power7.

Same technology, not the same CPU.

IBM promises to supply Nintendo an all-new, Power-based multi-core microprocessor that packs some of IBM's "most advanced technology into an energy-saving silicon package". For example, the chip is projected to have embedded DRAM (potentially on the same package) that will speed up memory accesses for the multi-core chip. IBM plans to manufacture the new microprocessor using 45nm silicon-on-insulator process technology at the 300mm fab in East Fishkill, New York.
 

DivideBYZero

Lifer
May 18, 2001
24,117
2
0
Not a big surprise. ATI supplied the tech for the GameCube and Wii, so it makes sense they would stick with the same supplier, for backwards-compatibility reasons if nothing else.
 

Arglebargle

Senior member
Dec 2, 2006
892
1
81
Now this is a controller I'd like to see ported over for PC!

Also, if the new Wii is substantially more powerful than the PS3/Xbox, maybe it will get them off their asses and moving towards their next, more up-to-date versions.
 

Mopetar

Diamond Member
Jan 31, 2011
8,436
7,631
136
It'll be at least another year before we hear anything about the next Microsoft and Sony consoles. Both companies sunk a lot of money into developing these consoles and then subsidized them fairly heavily during the initial portions of the lifecycle. If they want to realize any profit on this generation of consoles, they'll need to wait a while longer.

Even if the new Nintendo console is more powerful than either the PS3 or the Xbox 360, it won't be so much more powerful that the difference is as large as the gap that exists between the Wii and the Xbox/PlayStation. I doubt that either company feels too much pressure from Nintendo.
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
Also, if the new Wii is substantially more powerful than the PS3/Xbox, maybe it will get them off their asses and moving towards their next, more up-to-date versions.

It won't be. Although it's possible the GPU will be a great deal more powerful, I don't think the CPU will be, so overall I don't think it stands much chance of leaving the other consoles too far behind.

Granted, it's likely the Wii U will be able to do true 1080p when most PS3/360 games are upscaled 720p or less, but I just don't think the games will be any more physically complex, which is where CPU power comes into play.
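
For reference, the pixel-count gap being described works out to roughly 2.25x; a trivial back-of-the-envelope check:

Code:
# Native 1080p vs. the upscaled 720p (or lower) targets most PS3/360 games use.
pixels_1080p = 1920 * 1080   # 2,073,600
pixels_720p = 1280 * 720     #   921,600
print(f"1080p pushes {pixels_1080p / pixels_720p:.2f}x the pixels of 720p")  # ~2.25x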
 

Deanodarlo

Senior member
Dec 14, 2000
680
0
76
If the rumours are true about using RV700-series tech, I have a strong feeling the GPU in the Wii U will be a 4770. It fits the bill: very low power, around 50-60W under load in 40nm form, and low temps due to having only 640 (rearranged) shaders rather than the 800 on the 4850.

It was designed and released to the public in very small numbers around the time the Wii U was being designed, in 2009. If it were shrunk to 32nm, its power usage and temps would be even lower.

A 4770 would finally give consoles true 1080p-resolution games. It will be interesting to see if Nintendo gets anything out of the tessellation unit, which was never used on the PC versions of the 4 series.

However, I think the PS4 and the next Xbox will be able to do tessellation very well indeed, leaving the Wii U behind, but it looks like the Wii U could lead in graphics for the next couple of years.
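
Taking those numbers at face value, here's a rough sketch of the scaling argument; the 640-vs-800 shader counts and ~55W load figure are the commonly cited 4770/4850 numbers, while the 32nm power-scaling factor is purely a hypothetical assumption, not anything Nintendo or AMD has confirmed:

Code:
# Illustrative only: how a die shrink *might* affect a 4770-class part.
# The 0.7x power factor for a 40nm -> 32nm move is a hypothetical
# rule-of-thumb assumption, not a published spec for any Wii U chip.
shaders_4770 = 640
shaders_4850 = 800
typical_load_w_40nm = 55                  # middle of the 50-60 W range quoted above
assumed_power_scaling_32nm = 0.7          # hypothetical per-node improvement

print(f"4770 carries {shaders_4770 / shaders_4850:.0%} of the 4850's shaders")
print(f"Hypothetical 32nm load power: ~{typical_load_w_40nm * assumed_power_scaling_32nm:.0f} W")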
 

Mopetar

Diamond Member
Jan 31, 2011
8,436
7,631
136
I'm not sure why Nintendo would want to use what was essentially a test part for the 40nm process. Given that Nintendo is going to be sending output to multiple displays, I think that they'd want some of the newer ATI technology, like Eyefinity.

By the time the Wii U comes out, the 4770 is going to be three generations behind in terms of technology and features. There's no real reason not to use something based on the 6000 or even 7000 series.
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
There's no real reason not to use something based on the 6000 or even 7000 series.

Actually there are three:

Cost
Power
Lead time

My nephew stayed with me last week and brought his Xbox 360 with him. You could hear the blower on the thing throughout the house. Heck, it was louder than the games most of the time.
 

DeathReborn

Platinum Member
Oct 11, 2005
2,786
789
136
I'm not sure why Nintendo would want to use what was essentially a test part for the 40nm process. Given that Nintendo is going to be sending output to multiple displays, I think that they'd want some of the newer ATI technology, like Eyefinity.

By the time the Wii U comes out, the 4770 is going to be three generations behind in terms of technology and features. There's no real reason not to use something based on the 6000 or even 7000 series.

They may be better off with a 5750/5770 class of GPU as it would enable them to use the more advanced technologies while keeping power draw quite low, especially if it was on 32nm.
 

nonameo

Diamond Member
Mar 13, 2006
5,902
2
76
I remember reading somewhere that the big N likes to use proven/mature tech for their consoles. The 4770 will be waaaay more than powerful enough for Nintendo; it will pretty heavily trash anything in consoles today. In fact, I wouldn't be surprised if we see something a lot slower than a 4770; even just doubling what's in the PS3 and 360 would be a no-hassle endeavor. Not only is it proven/mature, it's not like they gain all that much extra functionality with the newer shaders on the 5000/6000 series.
 

Topweasel

Diamond Member
Oct 19, 2000
5,437
1,659
136
Actually the GPU in the Xbox360 is a Microsoft chip, designed by ATI. I do not believe AMD is selling Microsoft any chips.

Correct. It's an early R600-ish video chip with what's best called video cache located on the chip's packaging. Microsoft pays royalties, but they actually handle the manufacturing (which of course they hand off to TSMC and others).
 

Mopetar

Diamond Member
Jan 31, 2011
8,436
7,631
136
I remember reading somewhere that the big N likes to use proven/mature tech for their consoles. The 4770 will be waaaay more than powerful enough for Nintendo; it will pretty heavily trash anything in consoles today. In fact, I wouldn't be surprised if we see something a lot slower than a 4770; even just doubling what's in the PS3 and 360 would be a no-hassle endeavor. Not only is it proven/mature, it's not like they gain all that much extra functionality with the newer shaders on the 5000/6000 series.

Considering the GPU in the Xbox only has 232M transistors, whereas the 4770 has 826M transistors, the 4770 would be overkill, by a long shot. I think they'll get something custom designed using newer ATI technology. How much power they'll need really depends on how many different controllers they want to be able to pump video to at the same time.
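
For a concrete sense of that gap, the commonly cited transistor counts work out as follows (the Xenos figure is the parent die only; the eDRAM daughter die adds roughly another 100M on top):

Code:
# Commonly cited transistor counts.
xenos_transistors = 232_000_000   # Xbox 360 GPU (Xenos) parent die
rv740_transistors = 826_000_000   # Radeon HD 4770 (RV740)
print(f"The 4770 has ~{rv740_transistors / xenos_transistors:.1f}x the 360 GPU's transistors")  # ~3.6x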
 

Dark Shroud

Golden Member
Mar 26, 2010
1,576
1
0
Correct. It's an early R600-ish video chip with what's best called video cache located on the chip's packaging. Microsoft pays royalties, but they actually handle the manufacturing (which of course they hand off to TSMC and others).

The current APU chip in the 360 was done by IBM's and GlobalFoundries' engineering departments. They actually had to slow down the interconnect in the chip because it was too efficient; the interconnect acts like the old front-side bus that was originally used in the 360.

I look forward to actually finding out what's in the Wii U and the next-gen consoles.

I'm going to guess that the Wii U's GPU is some type of Frankenstein chip combining AMD's current- and last-gen tech.
 

Anarchist420

Diamond Member
Feb 13, 2010
8,645
0
76
www.facebook.com
It's stupid that they're using 45nm for the CPU. They should be using 32nm.

If they're using current ATi tech, then I'm definitely boycotting it. I've had enough of ATi's piece-of-shit GPUs, particularly their optimizations, the awful texture aliasing, and the harsh texture transitions.

I've always found it fucked up how 3dfx is out of business when they had perfectly filtered textures with the Voodoo2 (may not have been single pass, but it looked good), yet 13 years later, ATi is in business and they still can't provide decent texture filtering.
 

Bryf50

Golden Member
Nov 11, 2006
1,429
51
91
It's stupid that they're using 45nm for the CPU. They should be using 32nm.

If they're using current ATi tech, then I'm definitely boycotting it. I've had enough of ATi's piece-of-shit GPUs, particularly their optimizations, the awful texture aliasing, and the harsh texture transitions.

I've always found it fucked up how 3dfx is out of business when they had perfectly filtered textures with the Voodoo2 (may not have been single pass, but it looked good), yet 13 years later, ATi is in business and they still can't provide decent texture filtering.
Who let the troll out of P&N?
 

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
It's stupid that they're using 45nm for the CPU. They should be using 32nm.

If they're using current ATi tech, then I'm definitely boycotting it. I've had enough of ATi's piece-of-shit GPUs, particularly their optimizations, the awful texture aliasing, and the harsh texture transitions.

I've always found it fucked up how 3dfx is out of business when they had perfectly filtered textures with the Voodoo2 (may not have been single pass, but it looked good), yet 13 years later, ATi is in business and they still can't provide decent texture filtering.

The 360 has been running on ATi tech for 5 years now and hasn't had any major graphical problems. :rolleyes:
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
I've always found it fucked up how 3dfx is out of business when they had perfectly filtered textures with the Voodoo2 (may not have been single pass, but it looked good), yet 13 years later, ATi is in business and they still can't provide decent texture filtering.
Huh? The Voodoo 2 didn’t even support proper trilinear, much less AF. Furthermore it’s impossible to know how it would filter shaded surfaces used in modern games.
 

ShadowOfMyself

Diamond Member
Jun 22, 2006
4,227
2
0
It really bugs me how using a 3 year old chip is considered "latest and greatest" when it comes to consoles... Shows you how much they are holding gaming back
 

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
It's par for the course for Nintendo after the release of the Wii. Sony and Microsoft at least tried to release the latest chips with the 360 and the PS3 (Radeon X1950- and GeForce 7800-class chips, respectively, right as the Radeon HD 2000 series and GeForce 8000 series released). In fact, ATi used the Xenos chip in the 360 as a technology testbed for later desktop products.
 

IGemini

Platinum Member
Nov 5, 2010
2,472
2
81
It really bugs me how using a 3 year old chip is considered "latest and greatest" when it comes to consoles... Shows you how much they are holding gaming back

New console designs have favored mature technology over cutting-edge graphics cores. A 4870 isn't really a slouch, and they could easily fit it on a smaller process if they wanted. Don't consoles have less graphical overhead than their PC counterparts as well?
 

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
I very much doubt they'll use a 4870. Aside from price, heat buildup is a very real concern for such a compact console, and things get ugly when heat isn't managed right -- remember the early years of the 360? From the looks of it the Wii U is just as small and perhaps has fewer vents than the 360. A chip based on the Radeon HD 4770 would run much cooler and could still reach for 1080p gameplay at the same performance level that current consoles deliver at 720p.

However, an analysis done by the guys at DigitalFoundry seems to indicate that the gameplay videos for the Wii U were created at 720p: http://www.eurogamer.net/articles/digitalfoundry-vs-e3-nintendo. Nintendo may not be shooting for native 1080p gameplay at all, just comparable or marginally better performance to the 360 and PS3 at 720p. DigitalFoundry surmises that all Nintendo needs to reach that level of performance is an inexpensive Radeon HD 4670 chip.

Another factor that would prohibit Nintendo from using a true 4870 is the expensive 256-bit memory bus and GDDR5 memory. If they do go with a 4870, they'll probably castrate it with a 128-bit memory bus and maybe GDDR3 memory, possibly making up for it with an eDRAM chip like the 360 has.
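
To put rough numbers on the bus argument: peak bandwidth is just bus width times effective data rate. The sketch below uses the 4870's reference 900MHz GDDR5 (3.6Gbps effective) and, for the cut-down case, a hypothetical 360-like 700MHz GDDR3 setup; neither is a confirmed Wii U spec.

Code:
# Peak memory bandwidth = bus width (bytes) * effective data rate.
# 3.6 Gbps is the reference 4870's 900 MHz GDDR5; the 1.4 Gbps GDDR3
# figure is a hypothetical 360-like setup, not a confirmed Wii U spec.
def bandwidth_gb_s(bus_width_bits, effective_rate_gbps):
    return (bus_width_bits / 8) * effective_rate_gbps

print(f"Reference 4870 (256-bit GDDR5): ~{bandwidth_gb_s(256, 3.6):.0f} GB/s")   # ~115
print(f"Cut-down (128-bit GDDR3):       ~{bandwidth_gb_s(128, 1.4):.0f} GB/s")   # ~22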
 

Anarchist420

Diamond Member
Feb 13, 2010
8,645
0
76
www.facebook.com
Huh? The Voodoo 2 didn’t even support proper trilinear, much less AF. Furthermore it’s impossible to know how it would filter shaded surfaces used in modern games.
In the Diamond Monster 3D II (which I had from mid '98 till early 2001) control panel there was a checkbox for forcing trilinear filtering, IIRC (didn't it halve texel fillrate?). I certainly don't remember rough texture-stage transitions or texture aliasing in any game I played on it. However, the X1k series' filtering was awful, as was the filtering of the 5770 I had as a second card very briefly. Nvidia's filtering is good on HQ, even with AF, as long as the application takes advantage of the HW.

I very much doubt they'll use a 4870. Aside from price, heat buildup is a very real concern for such a compact console, and things get ugly when heat isn't managed right -- remember the early years of the 360? From the looks of it the Wii U is just as small and perhaps has fewer vents than the 360. A chip based on the Radeon HD 4770 would run much cooler and could still reach for 1080p gameplay at the same performance level that current consoles deliver at 720p.

However, an analysis done by the guys at DigitalFoundry seems to indicate that the gameplay videos for the Wii U were created at 720p: http://www.eurogamer.net/articles/digitalfoundry-vs-e3-nintendo. Nintendo may not be shooting for native 1080p gameplay at all, just comparable or marginally better performance to the 360 and PS3 at 720p. DigitalFoundry surmises that all Nintendo needs to reach that level of performance is an inexpensive Radeon HD 4670 chip.
What good is HD resolution without AA?

1080p should most definitely not be the primary target; minimal aliasing should be. In fact, a 64-bit FP RGBA buffer (with alpha-blended textures instead of alpha-tested textures) should be a priority before 1080p.
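
For scale, here's what a 64-bit-per-pixel (FP16 RGBA) colour buffer costs at 1080p versus the usual 32-bit one; a single buffer with no MSAA is assumed to keep the sketch simple.

Code:
# Size of one 1080p colour buffer, no MSAA: standard RGBA8 (32 bpp)
# vs. the FP16-per-channel RGBA target (64 bpp) suggested above.
pixels = 1920 * 1080
rgba8_bytes = pixels * 4     # ~7.9 MiB
rgba16f_bytes = pixels * 8   # ~15.8 MiB
print(f"RGBA8:   {rgba8_bytes / 2**20:.1f} MiB")
print(f"RGBA16F: {rgba16f_bytes / 2**20:.1f} MiB")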
 

Joseph F

Diamond Member
Jul 12, 2010
3,522
2
0
Yeah, was up too late last night haha... I was actually referring to the PS3's RSX "Reality Synthesizer". The Xenos chip in the Xbox is comparable to an X1900, I think. Still, my point remains that a 4800-class GPU isn't really disappointing like a lot of people are saying. Good thread over at HardOCP about this:

http://hardforum.com/showthread.php?t=1603533

The PS3's RSX is more or less a GeForce 7900 GS. It doesn't have any relation to the 6800 series AFAIK. Also, the 360's GPU is closer to the Radeon HD 2900 series than it is to the X1900 series (unified shaders).