So the Wii U's CPU is a 1.25 GHz tri-core PowerPC 750...


Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
It is interesting that they went with a core with such terrible SIMD performance. The 360 core has pretty beefy vector units, and the PS3 obviously has its SPUs. Are Nintendo expecting developers to use GPGPU more for their highly parallel floating-point algorithms?

It's a possibility, especially with the heavy emphasis on the eDRAM and the lack of decent SIMD. It's unlikely the GPU is GCN; it's either R700 or Evergreen, I'm sure.

It does beg the question of how SIMD/vector-heavy multiplatform games are in general. The Wii U has the benefit of an audio DSP, more cache, eDRAM, perhaps better memory latencies, more RAM, and some limited OoO execution (though this is more of a benefit to developers). At 1.25 GHz, we'd see sustained GFLOPS of barely a sixth of the 360's though, assuming the 360 can even get near its theoretical limit (96 GFLOPS IIRC). Not bolting a VMX unit onto Broadway seems like a mistake if you ask me, especially if you're going through the trouble of getting three of them to connect and work as a tri-core processor.

Maybe Nintendo and IBM could've developed a die with a variable-speed Broadway core for BC, the OS, and background tasks, plus two or three more modern cores with VMX or AVX to get them the GFLOPS necessary for modern games, while still having a relatively low clock.
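The "barely a sixth" figure checks out on the back of an envelope, assuming the Wii U CPU keeps Broadway's paired-single FPU (a 2-wide single-precision fused multiply-add, so 4 FLOPs per cycle per core — an assumption) and taking the 96 GFLOPS Xenon number at face value:

```python
# Rough peak-FLOPS comparison: Wii U CPU vs. Xbox 360 "Xenon".
# Assumption: the Wii U core keeps Broadway's paired-single FPU,
# i.e. a 2-wide single-precision FMA = 4 FLOPs/cycle/core.
CORES = 3
CLOCK_GHZ = 1.25
FLOPS_PER_CYCLE = 4  # paired-single FMA (assumed, matching Broadway)

XENON_PEAK_GFLOPS = 96.0  # theoretical figure quoted above, IIRC

wiiu_gflops = CORES * CLOCK_GHZ * FLOPS_PER_CYCLE
print(f"Wii U CPU peak: {wiiu_gflops:.1f} GFLOPS")  # 15.0
print(f"Fraction of Xenon: {wiiu_gflops / XENON_PEAK_GFLOPS:.2f}")  # 0.16, about a sixth
```

That's peak throughput only, of course; sustained numbers on both chips would be lower.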
 

Zstream

Diamond Member
Oct 24, 2005
3,395
277
136
Is anyone considering that the CPU may have been clocked lower than expected when it was tested, and that you are all coming up with numbers that make no sense?
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Is anyone considering that the CPU may have been clocked lower than expected when it was tested, and that you are all coming up with numbers that make no sense?

Good point, we can't assume these are apples-to-apples configurations (even platform-wise) so we can't really get a read on true IPC or absolute performance at this stage.
 

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
Is anyone considering that the CPU may have been clocked lower than expected when it was tested, and that you are all coming up with numbers that make no sense?

Considering that the power consumption is almost the same regardless of whether it's playing a game or sitting in the menu, I doubt it has any DVFS at all.
 

hans030390

Diamond Member
Feb 3, 2005
7,326
2
76
Is anyone considering that the CPU may have been clocked lower than expected when it was tested, and that you are all coming up with numbers that make no sense?

That question was already asked of the guy who found out all this info in the first place. Apparently ~1.2 GHz is right in line with the expected clock for this CPU.
 

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
That question was already asked of the guy who found out all this info in the first place. Apparently ~1.2 GHz is right in line with the expected clock for this CPU.

If the core + L1 is unchanged or mostly unchanged from Broadway, then it should be about right. It's up to 1 GHz on 90 nm, and the shrinks after that don't improve timing headroom as much as they used to. It's not at all a low figure for such a short pipeline (only 4-6 stages!).
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Depends on what type of gamer you are. I've played it through exactly once and did not quite understand the hype. I have no interest in solving "puzzles" and such in games. If I play, it's for distraction, and FPS multiplayer rules there. If I want an intellectual challenge, I read a good (scientific) book, work on my open source project(s), and so forth.


Certainly, to each their own. My suggestion would be to play it again and see how you feel about it. :) No biggie if you didn't enjoy it, really. Everyone has different tastes. But if you look at reviews, the game was very well received on the whole.

From just a graphics standpoint, I really think it showcased what the Gamecube could do... Nintendo must be pretty happy with ATI/AMD. They've sourced their graphics chip from them for four consoles in a row now. Too bad Nintendo is usually so tight-lipped on the specs, though.
 

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
Nintendo must be pretty happy with ATI/AMD. They've sourced their graphics chip from them for four consoles in a row now.

Three consoles in a row, and the Flipper/Hollywood IP wasn't even originally developed by ATI but by ArtX, whom they bought out. So it doesn't bear much of a resemblance to any of ATI/AMD's GPU heritage. The Wii U is the first Nintendo console really using a derivative of a past ATI design.
 

alyarb

Platinum Member
Jan 25, 2009
2,425
0
76
I just want to say the ArtX IP was incorporated from R300 all the way to R520... so yeah, a lot of heritage there, just post-GameCube. After numerous consolidations by AMD, who can say where these people are now, or at which point in the R520 -> VLIW5 -> VLIW4 -> GCN evolution they were laid off.

http://www.beyond3d.com/content/interviews/8/3


Dave Orton said:
We had this concept of the “ping-pong” development between the west and east coast design centres. On paper this looked great, but in practice it didn’t work very well. It doesn’t work well for a variety of reasons, but one of them is that the PC architecture, at the graphics level, has targeted innovation and clean-sheet innovation, and whenever you have separate development teams you are going to, by nature, have a clean-sheet development on every generation of product. For one, we can’t afford that, and it’s not clear that it’s the right thing to do for our customers from a stability standpoint. It’s also the case that there’s no leverage from what the other development team has done, so in some cases you are actually taking a step backwards instead of forwards.

What we are now moving towards is actually a unified design team of both east and west coast, that will develop our next generations of platforms, from R300 to R400 to R500 to R600 to R700, instead of a ping-pong ball between them both. Within that one organisation we need to think about where we architecturally innovate and where we don't, in order to hit the right development cycles to keep the leadership, but it will be one organisation.

If you dissect in, for example, to the R600 product, which is our next, next generation, that development team is all three sites - Orlando, Silicon Valley, Marlborough – but the architectural centre team is in the Valley, as you point out, but all three are part of that organisation.


The ArtX group is the western group. In fact, Dave Orton was the founder and CEO of ArtX, and before that he was a director at SGI. His team worked on the Reality hardware used in the N64. So there's your 4th generation of GPU heritage. Just not architecturally contiguous.

Dave was also the CEO of Aptina while their sensors were used in the 3DS. I'm just pointing this out to show these associations are not coincidental.
 

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
I just want to say the ArtX IP was incorporated from R300 all the way to R520... so yeah, a lot of heritage there, just post-GameCube. After numerous consolidations by AMD, who can say where these people are now, or at which point in the R520 -> VLIW5 -> VLIW4 -> GCN evolution they were laid off.

What ArtX IP - can you give specific examples? And do you know that it has anything to do with features found in Flipper? Because I can't think of a single distinguishing connection between the two.

The Artx group is the western group. In fact, Dave Orton was the founder and CEO of ArtX, and before that he was a director at SGI. His team worked on the Reality hardware used in the N64. So there's your 4th generation of GPU heritage. Just not architecturally contiguous.

Yeah, and lots of SGI people went on to work at nVidia and all over the place. It's way too much of a stretch to call Flipper an RDP successor because of this, or to say that it's a continuation of a business relationship for Nintendo.

Dave was also the CEO of Aptina while their sensors were used in the 3DS. I'm just pointing this out to show these associations are not coincidental.

I disagree, that could very easily still be coincidental. You can find Aptina's sensors in many devices.
 

alyarb

Platinum Member
Jan 25, 2009
2,425
0
76
R300 was years ahead of flipper, designed for 130 nm vs 180 nm, programmable vs fixed function... They are not architecturally contiguous at all, but conceived and developed by a lot of the same architects and managers. It's like saying Abbey Road has no Beatles heritage because it is not organized to sound like Sgt Pepper. One simply took place after the other under the same care of the same individuals.

Tim Van Hook was chief designer of RDP and Flipper and followed Dave Orton, Wei Yen, Joe Macri, Greg Buchner and probably dozens of others from SGI to ArtX to ATI. You can definitely say one was the successor to the other and it is not a coincidence at all that Nintendo contracts happen to migrate with them.

There was in fact a lawsuit directly related to ArtX displacing SGI from their partnership with Nintendo. For Nintendo, it very much is a continuation of a business relationship.

http://news.cnet.com/Artx-ousting-MIPS-from-Nintendo/2100-1001_3-211664.html
http://eetimes.com/electronics-news/4044411/The-startup-that-saved-ATI
 

nforce4max

Member
Oct 5, 2012
88
0
0
Grabs a bucket of popcorn and a cold soda then watches the forum show.

Back on topic, I am not sure what GPU they used, but in any case it is held back by the CPU. The only reason I can think of atm that Nintendo chose to use it, besides cost, is likely to save on power consumption. The eDRAM cache is no big surprise, but overall the console is just too weak and developers are not happy with it. If there is any saving grace, it is OpenCL, which maybe they can use for in-game effects and maybe some sort of physics to help with the workload. Also, the number of players/characters and enemies on screen will likely be limited. In the end, though, it will likely make it very easy for emulators lol within a few months.

As for the next Xbox and PS4, I do hope they are GCN or some derivative so that some compute features will be put to use, and maybe that will trickle down to the PC platform. General-purpose compute just kills Nvidia's Kepler (shared data between GPU threads).
 

Agent11

Diamond Member
Jan 22, 2006
3,535
1
0
Nintendo has some very popular franchises, as long as they invest the time and capital for quality releases they have nothing to worry about.
 

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
R300 was years ahead of flipper, designed for 130 nm vs 180 nm, programmable vs fixed function... They are not architecturally contiguous at all, but conceived and developed by a lot of the same architects and managers. It's like saying Abbey Road has no Beatles heritage because it is not organized to sound like Sgt Pepper. One simply took place after the other under the same care of the same individuals.

Tim Van Hook was chief designer of RDP and Flipper and followed Dave Orton, Wei Yen, Joe Macri, Greg Buchner and probably dozens of others from SGI to ArtX to ATI. You can definitely say one was the successor to the other and it is not a coincidence at all that Nintendo contracts happen to migrate with them.

There was in fact a lawsuit directly related to ArtX displacing SGI from their partnership with Nintendo. For Nintendo, it very much is a continuation of a business relationship.

http://news.cnet.com/Artx-ousting-MIPS-from-Nintendo/2100-1001_3-211664.html
http://eetimes.com/electronics-news/4044411/The-startup-that-saved-ATI

Okay, I didn't know all of that, and I agree it sounds like the ArtX relationship was definitely a continuation of their relationship vis-à-vis SGI/RDP. That's a pretty interesting history, thanks for the information.

I still think what you said about Aptina sounds like a huge stretch though :p
 

IlllI

Diamond Member
Feb 12, 2002
4,927
11
81
The eDRAM cache is no big surprise, but overall the console is just too weak and developers are not happy with it.


nintendo has been doing this crap since the gamecube. it should not surprise anyone but it is frustrating. they cheap out on hardware and basically just rehash clones of mario, zelda, mario kart, metroid, kirby/whatever else they know people will blindly gobble up by the millions w/o question.
 

Mopetar

Diamond Member
Jan 31, 2011
8,436
7,631
136
nintendo has been doing this crap since the gamecube. it should not surprise anyone but it is frustrating. they cheap out on hardware and basically just rehash clones of mario, zelda, mario kart, metroid, kirby/whatever else they know people will blindly gobble up by the millions w/o question.

And yet it's still probably the only console that I'll end up buying as most of the biggest console games outside of Nintendo's first party exclusives are available on the PC, which frankly is going to make any console look like it has cheap hardware.
 

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
What kind of hardware is present in the Wii U's controllers? Just thinking out loud here, but perhaps they chose the CPU for the console in part because they could also put one (single core?) in each controller without ridiculous cost.
 

podspi

Golden Member
Jan 11, 2011
1,982
102
106
What kind of hardware is present in the Wii U's controllers? Just thinking out loud here, but perhaps they chose the CPU for the console in part because they could also put one (single core?) in each controller without ridiculous cost.

Except for the pad (one per system) they are just standard Wiimotes I think.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
nintendo has been doing this crap since the gamecube. it should not surprise anyone but it is frustrating. they cheap out on hardware and basically just rehash clones of mario, zelda, mario kart, metroid, kirby/whatever else they know people will blindly gobble up by the millions w/o question.


Nintendo reuses those franchises often, that is certain. But when they rehash them, at least to me, it feels like some effort has been put into making a good game with something new to offer.

How many FPS games are rehashes of what you've already played, more or less? Madden 13 sold more than 1.6 million copies in the first week. Wanna talk about a rehash... :)

But I don't think Nintendo holds back gaming at all. Games often seem to be developed for the middle road, Xbox/PS3, then ported up to PC or down to Nintendo's console if it is doable. How many games made specifically for a current Nintendo console have made their way to PC? So with that in mind, I am happy that Nintendo found a niche that works for them. Their games may be, in general, less mature, but that doesn't take away from how good they can be. And since I do not believe Nintendo has an effect on PC gaming, I say good for them, let them do their thing.
 

Revolution 11

Senior member
Jun 2, 2011
952
79
91
For its time, the Gamecube was not that bad in terms of computing power. Not cutting edge but not underpowered like the WiiU is.
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
For its time, the Gamecube was not that bad in terms of computing power. Not cutting edge but not underpowered like the WiiU is.

Well, considering the Xbox was basically cutting edge at the time, and the Gamecube was like a half cycle behind in hardware, it wasn't too bad.

The Wii was probably the worst offender in terms of being underpowered, though. I think the Wii U fares better for the time, at least in terms of graphics. CPU-wise, it's hard to say which is worse relatively.
 

nobitakun

Junior Member
Dec 19, 2012
4
0
61
Ha. The Tegra 3 at 1080p will look terrible - despite Nvidia's marketing, it can't put out 360/PS3 quality at HD resolutions.

You're right, but I bet Tegra 4, which is being released sometime in early 2013, will be better than the 360/PS3; it will even reach Wii U quality.

So, the Wii U hardware for the price you pay pisses me off. As always, Nintendo is a well-known veteran hustler LOL. The tablet can't be the excuse; Chinese ones with better resolution and good hardware cost less than US$80.
 

tipoo

Senior member
Oct 4, 2012
245
7
81
So I was talking to Hector Martin "marcan" (the guy who revealed this info on the clock speed and core type) on Twitter to ask if they at least replaced the infamously slow FPU of the 750; he said no, the core is nearly identical. No added SIMD over the modified core in the GameCube either; it's pretty much a 1:1 upclocked version of the Wii core. That's pretty... bad sounding. Yes, the Wii U CPU will have higher IPC than the PS360, but at such a low clock rate... It certainly won't compete with the octa-core Jaguars of the other two next-gen consoles at any rate.

The 7xx family had its shortcomings, namely a lack of SMP support and SIMD capabilities, and a relatively weak FPU.

http://en.wikipedia.org/wiki/PowerPC_7xx

https://twitter.com/marcan42


The PS4 and Nextbox are rumored to have eight Jaguar cores at 1.6 GHz, which would be a triple win of higher IPC, clock speed, and core count.
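For a rough sense of scale, here's the same kind of back-of-envelope peak math for the rumored Jaguar setup versus the Wii U's tri-core. The FLOPs-per-cycle figures are assumptions: 8 single-precision FLOPs/cycle for Jaguar (its 128-bit SIMD units can issue a 4-wide SP add and a 4-wide SP multiply per cycle), and 4 for the Wii U core (Broadway-style paired-single FMA):

```python
# Hypothetical peak-throughput comparison of the rumored 8-core, 1.6 GHz
# Jaguar CPUs against the Wii U's tri-core 1.25 GHz CPU.
# FLOPs/cycle values are assumptions, not confirmed specs.
def peak_gflops(cores: int, clock_ghz: float, flops_per_cycle: int) -> float:
    """Theoretical peak single-precision GFLOPS."""
    return cores * clock_ghz * flops_per_cycle

jaguar = peak_gflops(8, 1.6, 8)    # 102.4 GFLOPS (assumed 8 SP FLOPs/cycle)
wiiu = peak_gflops(3, 1.25, 4)     # 15.0 GFLOPS (assumed paired-single FMA)
print(f"Jaguar: {jaguar:.1f} GFLOPS vs Wii U: {wiiu:.1f} GFLOPS")
print(f"Ratio: {jaguar / wiiu:.1f}x")  # ~6.8x
```

Peak numbers only; real-world sustained throughput depends heavily on memory and code, so treat the ratio as a ceiling, not a prediction.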
 