http://www.engadget.com/2011/06/07/ibm-puts-watsons-brains-in-nintendo-wii-u/
Says that they used the same tech as they did in Watson. That would indicate a Power7.
IBM promises to supply Nintendo an all-new, Power-based multi-core microprocessor that packs some of IBM's "most advanced technology into an energy-saving silicon package". For example, the chip is projected to have embedded DRAM (potentially on the same package) that will speed up memory accesses for the multi-core chip. IBM plans to manufacture the new microprocessor using 45nm silicon-on-insulator process technology at the 300mm fab in East Fishkill, New York.
Also, if the new Wii is substantially more powerful than the PS3/Xbox 360, maybe it will get Sony and Microsoft off their asses and moving toward their next, more up-to-date versions.
I'm not sure why Nintendo would want to use what was essentially a test part for the 40nm process. Given that Nintendo is going to be sending output to multiple displays, I think that they'd want some of the newer ATI technology, like Eyefinity.
By the time the Wii U comes out, the 4770 is going to be three generations behind in terms of technology and features. There's no real reason not to use something based on the 6000 or even 7000 series.
Actually, the GPU in the Xbox 360 is a Microsoft chip, designed by ATI. I do not believe AMD is selling Microsoft any chips.
I remember reading somewhere that the big N likes to use proven/mature tech for their consoles. The 4770 will be waaaaayyyyyyy more than powerful enough for Nintendo; it will pretty heavily trash anything in consoles today. In fact, I wouldn't be surprised if we see something a lot slower than a 4770, since even just doubling what's in the PS3 and 360 would be a no-hassle endeavor. Not only that, it's proven/mature, and it's not like they gain all that much extra functionality with the newer shaders on the 5000/6000 series.
Correct. It's an early R600-ish video chip with what's best called a video cache located on the chip's packaging. Microsoft pays royalties, but they actually handle the manufacturing (which of course they hand off to TSMC and others).
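Quick back-of-the-envelope math on why that on-package video cache (the 360's ~10 MB of eDRAM) is sized the way it is. This is just a sketch assuming 4 bytes of color plus 4 bytes of depth/stencil per pixel, not the 360's exact render-target layouts:

```python
# Rough framebuffer-size math showing why ~10 MB of on-package eDRAM
# (as on the Xbox 360's Xenos) covers a 720p render target.
# Assumption: 4 bytes of color + 4 bytes of depth/stencil per pixel.

def framebuffer_mb(width, height, bytes_per_pixel=8, msaa=1):
    """Approximate render-target size in megabytes."""
    return width * height * bytes_per_pixel * msaa / (1024 ** 2)

print(framebuffer_mb(1280, 720))          # ~7.0 MB -- fits in 10 MB of eDRAM
print(framebuffer_mb(1280, 720, msaa=4))  # ~28.1 MB -- needs tiling
print(framebuffer_mb(1920, 1080))         # ~15.8 MB -- 1080p wouldn't fit either
```

The point is that the fast local memory is there to soak up framebuffer traffic; everything else still goes out over the main memory bus.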
Who let the troll out of P&N?
That's stupid that they're using 45 nm for the CPU. They should be using 32nm.
If they're using a current ATi tech, then I'm definitely boycotting it. I've had enough of ATi's piece of shit GPUs particularly their optimizations, the awful texture aliasing, and harsh texture transitions.
I've always found it fucked up how 3dfx is out of business when they had perfectly filtered textures with the Voodoo2 (may not have been single pass, but it looked good), yet 13 years later, ATi is in business and they still can't provide decent texture filtering.
Huh? The Voodoo 2 didn't even support proper trilinear, much less AF. Furthermore it's impossible to know how it would filter shaded surfaces used in modern games.
It really bugs me how using a 3-year-old chip is considered "latest and greatest" when it comes to consoles... Shows you how much they're holding gaming back.
In the Diamond Monster 3D II (which I had from mid '98 till early 2001) control panel there was a checkbox for forcing trilinear filtering IIRC (didn't it halve texel fillrate?). I certainly don't remember rough texture stage transitions or texture aliasing on any game I played on it. However, the X1k series filtering was awful, as was the filtering of the 5770 I had as a 2nd card very briefly. Nvidia's filtering is good on HQ, even with AF, as long as the application takes advantage of the HW.
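For anyone wondering where the fillrate hit comes from: trilinear is just a lerp between bilinear samples from two adjacent mip levels, so it needs twice the texel fetches, and hardware that can do one bilinear lookup per clock ends up at half its effective texel rate. A rough illustrative sketch of the idea (not how any particular chip actually implements it):

```python
# Illustrative-only sketch of bilinear vs. trilinear filtering.
# Trilinear blends two bilinear samples taken from adjacent mip levels,
# so hardware that does one bilinear lookup per clock needs two clocks
# per trilinear sample unless it has extra sampler width.
import math

def lerp(a, b, t):
    return a + (b - a) * t

def bilinear_sample(mip, u, v):
    """Sample one mip level (a 2D list of floats) with bilinear filtering."""
    h, w = len(mip), len(mip[0])
    x, y = u * (w - 1), v * (h - 1)
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0
    top = lerp(mip[y0][x0], mip[y0][x1], fx)
    bottom = lerp(mip[y1][x0], mip[y1][x1], fx)
    return lerp(top, bottom, fy)

def trilinear_sample(mip_chain, u, v, lod):
    """Blend bilinear samples from the two mip levels bracketing 'lod'."""
    lo = min(int(math.floor(lod)), len(mip_chain) - 1)
    hi = min(lo + 1, len(mip_chain) - 1)
    return lerp(bilinear_sample(mip_chain[lo], u, v),
                bilinear_sample(mip_chain[hi], u, v),
                lod - math.floor(lod))
```

The abrupt jump you see without that blend, when the renderer snaps from one mip level to the next, is exactly the kind of harsh texture transition complained about above.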
What good is HD resolution without AA?
I very much doubt they'll use a 4870. Aside from price, heat buildup is a very real concern for such a compact console, and things get ugly when heat isn't managed right -- remember the early years of the 360? From the looks of it the Wii U is just as small and perhaps has fewer vents than the 360. A chip based on the Radeon HD 4770 would be much cooler and could still reach for 1080p gameplay at the same performance level that current consoles have at 720p.
However, an analysis done by the guys at DigitalFoundry seems to indicate that the gameplay videos for the Wii U were created at 720p: http://www.eurogamer.net/articles/digitalfoundry-vs-e3-nintendo. Nintendo may not be shooting for native 1080p gameplay at all, just performance comparable to or marginally better than the 360 and PS3 at 720p. DigitalFoundry surmises that all Nintendo needs to reach that level of performance is an inexpensive Radeon HD 4670 chip.
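The raw pixel math behind the 720p-vs-1080p comparison, assuming (very roughly) that fill and shading cost scale with pixel count and ignoring bandwidth, vertex work, and everything else:

```python
# Back-of-the-envelope pixel counts: 1080p pushes 2.25x the pixels of 720p,
# so "720p-class console performance at 1080p" implies very roughly 2.25x
# the fill/shading throughput, all else (optimistically) being equal.
resolutions = {"480p": (854, 480), "720p": (1280, 720), "1080p": (1920, 1080)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

for name, count in pixels.items():
    print(f"{name}: {count:,} pixels ({count / pixels['720p']:.2f}x 720p)")
```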
Yeah, was up too late last night haha.. I was actually referring to the PS3's RSX "Reality Synthesizer". The Xenos chip in the Xbox 360 is comparable to an X1900 I think. Still, my point remains that a 4800-class GPU isn't really disappointing like a lot of people are saying. Good thread over at HardOCP about this:
http://hardforum.com/showthread.php?t=1603533