Go Back   AnandTech Forums > Hardware and Technology > Highly Technical

Old 10-01-2013, 12:08 PM   #1
eyeofcore
Member
 
Join Date: Oct 2013
Posts: 50
Wii U Hardware Investigation

Wii U CPU:
- Tri-core IBM 45 nm PowerPC 750CL/G3/Power7 hybrid
- L2 cache: Core 0: 512 KB; Core 1: 2 MB; Core 2: 512 KB
- Clocked at 1.24 GHz
- 4-stage pipeline; not a Xenon/Cell-style CPU-GPU hybrid
- Produced at IBM's advanced CMOS fabrication facility
- The L2 cache is eDRAM embedded in the CPU (a Power7-style memory implementation)

*The Wii CPU core was 20% slower than an Xbox 360 core. Since the Wii U CPU is modified/enhanced and clocked 65-70 percent higher, two Wii U cores should be on par with or exceed all three Xbox 360 cores; if all three cores are used, the Wii U CPU is 50+ percent faster than the Xbox 360 processor and also faster than the PlayStation 3 processor. http://gbatemp.net/threads/retroarch...7#post-4365165

*X360 Xenon: 1879.630 DMIPS*3 = 5638.89 DMIPS @ 3.2 GHz vs Wii U Espresso: 2877.32 DMIPS*3 = 8631.96 DMIPS @ 1.24 GHz
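As a sanity check, the arithmetic above can be expressed as a tiny Python sketch. The per-core DMIPS scores are the post's figures, not independent measurements; the sketch only totals them and shows the per-MHz ratio the comparison implies.

```python
# Totals and per-clock ratios implied by the quoted DMIPS figures.
def total_dmips(per_core: float, cores: int = 3) -> float:
    return per_core * cores

def dmips_per_mhz(per_core: float, clock_mhz: float) -> float:
    return per_core / clock_mhz

print(round(total_dmips(1879.630), 2))           # 5638.89 (Xenon @ 3200 MHz)
print(round(total_dmips(2877.32), 2))            # 8631.96 (Espresso @ 1240 MHz)
print(round(dmips_per_mhz(1879.630, 3200.0), 3)) # 0.587 DMIPS/MHz per core
print(round(dmips_per_mhz(2877.32, 1240.0), 3))  # 2.32 DMIPS/MHz per core
```

Note the quoted scores imply Espresso retires roughly four times the Dhrystone work per clock, which is exactly the claim later posts in this thread dispute.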

*X360 Xenon 32-40 stage pipeline vs Xbox One/PlayStation 4 Jaguar 16 stage pipeline vs Wii U Espresso 4 stage pipeline

*X360 Xenon in-order 3 Execution Units vs Xbox One/PlayStation 4 Jaguar out-of-order ? execution units vs Wii U Espresso out-of-order 15 execution units

*X360 Xenon 1MB shared L2 Cache vs Xbox One/PlayStation 4 4MB shared L2 cache vs Wii U Espresso 2MB/512KB/512KB L2 cache

The Wii U CPU is next-generation compared to the Xbox 360, and people who say otherwise should just shut up, be ashamed, have a seat, accept the facts and deal with it. Here is the proof: http://systemwars.com/forums/index.php/topic/112794-no-freaking-way-40-stage-pipeline/

*Since the Xbox 360 has a 32-to-40-stage pipeline and the PlayStation 3 also has a 32-to-40-stage pipeline, both pay severe penalties in usable performance, while the Wii U with its 4-stage pipeline is far more efficient in comparison, so more of its performance can actually be used. Imagine how bad the Xbox 360/PlayStation 3 are in comparison to the Wii U:

http://www.youtube.com/watch?v=w9VWRB07yqc

*The Wii U CPU, codenamed Espresso, could actually be three to four times faster than the Xbox 360's Xenon, since Xenon has less cache, far fewer execution units, and a pipeline 8 to 10 times longer, which is awful for a lot of tasks. It is also in-order, versus the out-of-order Espresso, which is better for branchy code, AI, AI pathfinding, and so on...

- Dual-core ARM Cortex-A8 for background OS tasks, clocked at 1 GHz, with 64 KB L1 cache per core and 1 MB of SRAM as L2 cache; an evolution of the "Starlet" chip
- Single ARM9 "Starlet" core for backward compatibility, rumored to run at higher clocks

Wii U Memory:
- 2 GB DDR3: 1 GB for the OS, 1 GB for games
- eDRAM: 32 MB VRAM + 4 MB for the GamePad + 3 MB CPU L2 cache
- Clocked at 550 MHz
- The eDRAM acts as a "unified pool of memory" for the CPU and GPU, practically eliminating latency between them

*Wii U's DDR3 RAM bandwidth has a theoretical maximum of 51.2 GB/s, since it has four 512 MB chips and not one large 2 GB chip, so anyone thinking its maximum bandwidth is a mere 12.8 GB/s is tech-illiterate. The Xbox 360 had a theoretical maximum of 22.8 GB/s, though a bottleneck in its poor FSB turns that down to a mere 10 GB/s.
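The disputed figure comes down to effective bus width: peak DDR3 bandwidth is just transfer rate times bus width. Assuming DDR3-1600 (the commonly reported speed grade, an assumption here rather than a confirmed spec), both numbers in the paragraph above fall out of the same formula:

```python
# Peak DDR3 bandwidth in GB/s = transfer rate (MT/s) * bus width (bits) / 8.
def peak_bandwidth_gbs(transfer_rate_mt: float, bus_width_bits: int) -> float:
    return transfer_rate_mt * 1e6 * bus_width_bits / 8 / 1e9

# Four 16-bit chips forming one 64-bit bus vs four independent 64-bit channels:
print(peak_bandwidth_gbs(1600, 64))   # 12.8
print(peak_bandwidth_gbs(1600, 256))  # 51.2
```

So the 12.8 vs 51.2 GB/s disagreement is entirely about whether the four chips share one 64-bit bus or provide four independent channels; the formula itself is not in dispute.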

*The Xbox 360 has GDDR3 RAM that is bottlenecked by Xenon's FSB, so it cannot saturate the theoretical maximum of 22.8 GB/s since the FSB can only handle 10 GB/s, and GDDR3 latency is atrocious compared to the DDR3 in the Wii U. Latency is very important for the CPU: the lower the latency, the faster the transfers between CPU/GPU and RAM. The PlayStation 4 will have similar latency issues to the Xbox 360, since GDDR5 is the successor to GDDR4, which succeeded GDDR3, and all of them have higher latency than DDR3.

Wii U GPU:
- VLIW5/VLIW4 Radeon HD 5000/6000, 40 nm
- DirectX 11 / OpenGL 4.3 / OpenCL 1.2 / Shader Model 5.0
- Supports GPGPU compute/offload
- Customized, using a custom Nintendo API codenamed GX2 (GX1 on the GameCube)
- Clocked at 550 MHz
- Produced on TSMC's advanced 40 nm CMOS process
- Uses 36 MB of eDRAM as VRAM
- 4 MB of that eDRAM is allocated to the streaming feed for the GamePad
- The GPU is customized, and judging by the GPU modifications in their previous consoles, we can presume Nintendo won't waste any mm^2 on unneeded features and tailored it to their own needs.

*Eyefinity has been present on AMD cards since the Radeon HD 5000 series, so it is at least a Radeon HD 5000 with TeraScale 2.

*If it were a Radeon HD 4000 series part with 320 SPUs, it would be the 55 nm Radeon HD 4670; but since the Wii U uses Eyefinity and its GPU is 40 nm, the Radeon HD 4000 series is ruled out.

*The Radeon HD 6000 was released in Q3 2010; final Wii U silicon was finished in Q2 2012 and the Wii U released in Q4 2012. Looking at the gap between E3 2011 and final silicon in Q2 2012, and given that the Radeon HD 6000 evolved from the HD 5000, which evolved from the HD 4000, I presume switching to a newer but similar GPU and architecture was not a problem; all of these GPUs were produced at TSMC's fabs.

*The Wii U shown at E3 2011 and its development kits had a Radeon HD 4850; it is rumored that newer Wii U development kits replaced the 4850 with a modified/customized/cut-down Radeon HD 6850.

*The Radeon HD 6850 delivers 1.5 TFLOPS at a 775 MHz clock, with 1 GB of GDDR5 VRAM and 130 GB/s of bandwidth.

*The GPU in the Wii U is clocked roughly 30% lower, at 550 MHz, and if it has 1/3 of the SPUs it delivers 0.352 TFLOPS. The 36 MB of eDRAM has 70-130/275/550 GB/s of bandwidth. Two-player co-op, as in Black Ops 2's Zombie mode, uses Eyefinity(?) to stream two different in-game views.
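The 0.352 TFLOPS figure follows from the standard throughput formula for AMD's VLIW parts: shader ALUs × 2 FLOPs per cycle (multiply-add) × clock. A quick sketch, using the shader counts assumed in this thread rather than confirmed specs:

```python
# Peak single-precision throughput for an AMD VLIW GPU:
# shader ALUs * 2 FLOPs per ALU per cycle (fused multiply-add) * clock in GHz.
def peak_gflops(shader_alus: int, clock_ghz: float) -> float:
    return shader_alus * 2 * clock_ghz

print(round(peak_gflops(960, 0.775), 1))  # 1488.0 -> the "1.5 TFLOPS" HD 6850 figure
print(round(peak_gflops(320, 0.55), 1))   # 352.0  -> the 0.352 TFLOPS estimate above
```

The same formula reproduces the 420 GFLOPS (384 SPUs) and ~550 GFLOPS (500 SPUs) numbers that appear later in this post.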

*Since the eDRAM in the Wii U's GPU, codenamed Latte, is embedded, its theoretical maximum bandwidth is 275 GB/s or even 550 GB/s; http://www.ign.com/boards/threads/official-wii-u-lobby-specs-graphics-power-thread.452775697/page-203#post-481621933

*90 nm 16 MB eDRAM can do 130 GB/s of bandwidth, so the 45 nm 32 MB eDRAM in the Wii U should do the same. Since both the CPU's cache and the GPU use eDRAM, latency is much, much lower and AI can be offloaded to the GPU; when embedded into the GPU it should do 275/550 GB/s.
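eDRAM bandwidth follows directly from interface width and clock, so the competing figures in this thread can be tied to the bus widths they implicitly assume (none of these widths are confirmed specs; 1024 bits at the 550 MHz core clock is the estimate later posts derive from die shots):

```python
# eDRAM bandwidth in GB/s = (bus width in bits / 8 bytes) * clock.
def edram_bandwidth_gbs(bus_width_bits: int, clock_mhz: float) -> float:
    return bus_width_bits / 8 * clock_mhz * 1e6 / 1e9

print(round(edram_bandwidth_gbs(1024, 550), 1))  # 70.4  (die-shot estimate)
print(round(edram_bandwidth_gbs(4096, 550), 1))  # 281.6 (width a ~275 GB/s claim would require)
```

In other words, the 275-550 GB/s claims require a 4-8x wider interface (or a far higher eDRAM clock) than the die-shot analysis suggests.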

Wii U Notes:
- Can't use DirectX 11 because of the OS; only OpenGL 4.3 and Nintendo's GX2 API, which is ahead of DX11
- Easier to program for, with none of the multiple bottlenecks causing issues as on Xbox 360 and PlayStation 3
- Efficient hardware with no bandwidth bottlenecks
- Launch games and ports use 2 cores; only a couple of ports/games use the 3rd core to a decent degree
- The Wii U CPU does many more operations per cycle/MHz than the Xbox 360/PlayStation 3; it is unknown if it is faster overall
- The Wii's CPU core was slower
- A minor Power7 architecture implementation involving memory allows shaving 10 or more percent off texture size without loss of quality (special compression or something)
- Wii U power consumption at full system load is 30-35 watts (the highest load possible in its current state)
- The Wii U's PSU/power brick is rated at 75 watts with 90% efficiency, so it can handle 67.5 watts
- The Wii U's flash storage, RAM, USB ports, motherboard, fan and small secondary chips consume around 5 to 10 watts in total when fully stressed
- The Wii U SoC's (CPU and GPU) estimated maximum power consumption is 20 to 25 watts
- Supports 4K displays, native 4K via HDMI 1.4b (possible 2D 4K games?)

*It may in fact have the best performance per watt in the world among 45/40 nm chips/SoCs.

*The Wii U's power brick/PSU is rated at 75 watts with 90% efficiency, so it can handle at most about 68 watts without serious degradation. Since the Wii U consumes 30 watts, there are 28 watts available, though I would only dare bump power consumption to 60 watts, i.e. 30 additional watts, if it bought more performance.
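Taking the post's own inputs at face value (75 W brick rating, a 90% factor, and either the 30 W or 40 W system-draw figure used in different paragraphs), the implied headroom works out as follows. All inputs are the post's assumptions, not measurements:

```python
# Headroom in watts implied by the post's figures:
# (PSU rating * derating factor) minus measured system draw.
def headroom_w(psu_rating_w: float, derate: float, draw_w: float) -> float:
    return psu_rating_w * derate - draw_w

print(round(headroom_w(75.0, 0.90, 30.0), 1))  # 37.5 (using the 30 W draw figure)
print(round(headroom_w(75.0, 0.90, 40.0), 1))  # 27.5 (using the 40 W draw figure)
```

The "28 watts available" quoted above only follows from the 40 W draw figure; with the 30 W figure the same arithmetic gives 37.5 W.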

*If the maximum power consumption of the Wii U is 40 watts and, by my calculation, the GPU consumes a mere 10 watts or so, then the Wii U's CPU could consume 15 to 20 watts and the rest of the system around 10 to 15 watts, depending on how much the CPU actually consumes, since that is an unknown factor. I am not counting any possible customizations of the Wii U's GPU; these are all rough estimates and we don't know the whole picture. The Wii U is a beast considering the process node it was built on, and probably the most efficient machine on that node in the world.

In case you are wondering why some games run worse on the Wii U than on 7th-generation consoles, I have a simple explanation: Wii U hardware is noticeably different from the Xbox 360's or PlayStation 3's, because their processors are not true CPUs; they handle GPU-related tasks well, unlike the Wii U's, which is primarily a CPU. Another reason is that developers don't spend resources and time on porting their games to the Wii U, so the ports lack proper optimization and adaptation of their engines to the Wii U's hardware. And even though some ports perform worse than on 7th-generation consoles, in most cases they run at higher resolution and/or native 720p/HD.

Most if not all 3rd-party launch games were made on older Wii U development kits that had 20 to 40% lower clocks than the final development kit Nintendo released near launch, so developers did not have much time to adapt their games to the final devkit, and the games ran poorly. Games like Darksiders 2, Warriors Orochi 3 Hyper and Batman: Arkham City used only one or two cores, while the third was barely used, if at all, to aid the game's CPU-side performance. Since most ports come from the Xbox 360 and/or PlayStation 3 versions, there are bound to be incompatibilities: the Xbox 360 and PlayStation 3 processors do CPU as well as GPU tasks, and the GPUs, RAM/memory, latency and other factors all differ from the Wii U, so optimization and adaptation are needed, though cheap ports, as always, tend to be a train wreck. Don't you agree?

The Xbox 360 and PlayStation 3 will be supported for the next three years, and this is in a way a negative, since it can really hold back the performance of games on the Wii U if those games are mostly ports from the Xbox 360 and PlayStation 3, whose architectures are vastly different from the Wii U's. We may only see the Wii U shine in three years, when the Xbox 360 and PlayStation 3 stop being supported; until then, those two consoles will hold back development and the time spent on the Wii U.

The Wii U may not be the "leap" that the Xbox One or PlayStation 4 are, though it is a leap over the Xbox 360 and PlayStation 3 even on a rough comparison, and when you take into account all the implementations, features and "tricks", the gap is even bigger. The Wii U has more embedded RAM than the Xbox One: the Xbox One has 32 MB of eSRAM, while the Wii U has 36 MB of superior eDRAM for the GPU. The eDRAM also blows away the GDDR5 in the PlayStation 4 in terms of bandwidth, if I am correct? 130/275/550 GB/s on the Wii U versus 80 GB/s on the PlayStation 4?

We need to take into consideration that the Wii U's CPU, Espresso, has a certain Power7 implementation that allows the use of eDRAM. We know the CPU has a total of 3 MB of eDRAM as L2 cache, and it could also use the main eDRAM pool as an L3 cache while maintaining a connection with the GPU, creating HSA/hUMA-like capabilities and greatly reducing the latency of CPU-GPU communication and data transfer.

The Wii U's GPU in the very first alpha development kit was a Radeon HD 4000 series part, specifically the 55 nm Radeon HD 4850, not the 40 nm RV740. By the time of the Wii U's first unveiling, the Radeon HD 6000 had been on the scene for nearly a year (now almost three years), so Nintendo could easily have switched to the Radeon HD 6000 series, which is basically an evolution of the Radeon HD 5000, itself a refinement of the Radeon HD 4000. This is further supported by the use of Eyefinity features for the Wii U GamePad, demonstrated by the Unity demo on the Wii U and by Call of Duty: Black Ops 2 in co-op Zombie mode. The Wii U can stream up to two images to GamePads, though this hasn't been used yet and maybe will be in the near future.

People also seem to forget the power consumption of dedicated GPUs versus ones embedded on the motherboard, which naturally consume less. The Wii U's GPU uses eDRAM, which consumes 1 to 2 watts at most, compared to GDDR5, which consumes around 9 or more watts per 2 GB. The GPU in the Wii U is embedded, so it needs no PCI-E interface, additional PCB or support chips, and with the eDRAM embedded in it, the eDRAM's power consumption could be reduced to practically zero.

Let's take the Wii U GPU's die size and the Radeon HD 6970's die size, and assume the Wii U GPU is a VLIW4-based Radeon HD 6000 series part rather than VLIW5. The Radeon HD 6970 consumes 250 watts maximum, has a die size of 389 mm^2 and 2 GB of GDDR5, and is clocked at 880 MHz. Cut it down to 320 SPUs occupying 80 mm^2 and consumption drops to roughly 83 watts; remove the 2 GB of GDDR5 and it goes down to 70-73 watts. Now lower the clock from 880 MHz to 550 MHz, roughly 35% lower; if a 25% clock reduction cuts power consumption in half, then at 35% the GPU's consumption goes down from 70-73 watts to roughly 14-15 watts even without being embedded, and we could easily shave off a couple more watts, most likely reaching 10 watts.
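The clock-scaling step above leans on a power model. A first-order CMOS sketch (idealized, not a measurement of any actual chip) separates the two cases: frequency-only scaling is linear in the clock ratio, while scaling voltage down along with frequency gives roughly cubic savings, which is the only way a clock cut can "halve" power as claimed:

```python
# First-order CMOS dynamic power: P ~ C * V^2 * f.
# Frequency-only scaling is linear; if voltage scales with frequency,
# power falls roughly with the cube of the clock ratio.
def scaled_power_w(base_w: float, clock_ratio: float, voltage_scales: bool) -> float:
    if voltage_scales:
        return base_w * clock_ratio ** 3  # f and V both reduced
    return base_w * clock_ratio           # f-only scaling

ratio = 550 / 880  # the 880 MHz -> 550 MHz example above (0.625)
print(round(scaled_power_w(70.0, ratio, False), 1))  # 43.8 (frequency-only)
print(round(scaled_power_w(70.0, ratio, True), 1))   # 17.1 (with voltage scaling)
```

So the 14-15 W estimate implicitly assumes aggressive voltage scaling; without it, the same clock cut only gets from ~70 W down to ~44 W.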

We can't really compare the Wii U's GPU to any off-the-shelf dedicated GPU, since some features found in regular dedicated GPUs are not needed when embedded. With some minor modifications, I can see the Wii U having 384 SPUs, which at 550 MHz should give about 420 GFLOPS, rather than the 320 SPUs it would have as a standard dedicated GPU of that die size. If Nintendo made drastic modifications they could even reach 500 SPUs and get very close to 600 GFLOPS. One homebrew hacker counted 600 shaders, so I wonder if Nintendo is using a technology that AMD has never used, inherited from ATI, called "high density", which is going to be used in AMD's Steamroller cores; from the information AMD has released, "high density" can increase chip density by 30%, reduce size by 30% and reduce power consumption.

*I won't link most of this information, though I can assure you I did a lot of research and digging on the internet, collecting bits and pieces and then putting them together into one picture, until I had that feeling called "a-ha!" or "EUREKA!!"
Old 10-01-2013, 11:11 PM   #2
Jadow
Diamond Member
 
Join Date: Feb 2003
Location: Upper Midwest - USA
Posts: 5,805
Default

uhh ok
Old 10-01-2013, 11:31 PM   #3
Ben90
Platinum Member
 
Join Date: Jun 2009
Posts: 2,826
Default

Because of this I went out and purchased 5 Wii Us. Thank you so much, they are so fast I put wheels on one of them and I can do burnouts on the highway. All my Xbox friends are so jelly they can't do burnouts on the highway with their Xbox 360s. They were like, why is your Wii U so fast and I was like: because its a next gen CPU!

They were like oh damn, ben you are so cool lets hangout. But i couldn't because all the babes were so impressed with my Wii U that they just wanted to smash all day. Thank you so much eyeofcore.
Old 10-02-2013, 03:36 AM   #4
eyeofcore
Member
 
Join Date: Oct 2013
Posts: 50
Default

Quote:
Originally Posted by Ben90 View Post
Because of this I went out and purchased 5 Wii Us. Thank you so much, they are so fast I put wheels on one of them and I can do burnouts on the highway. All my Xbox friends are so jelly they can't do burnouts on the highway with their Xbox 360s. They were like, why is your Wii U so fast and I was like: because its a next gen CPU!

They were like oh damn, ben you are so cool lets hangout. But i couldn't because all the babes were so impressed with my Wii U that they just wanted to smash all day. Thank you so much eyeofcore.
Ok.. Should I laugh or judge you? IDK... Sarcasm?

No need to thank me, I love researching stuff, and a lot of so-called gamers hate the Wii U just because it is Nintendo's home console; they all spout obvious BS, which shows how full of it they are. I don't know much about the Wii U's GPU, since I can only speculate, though at least the evidence and research support that the Wii U CPU, codenamed Espresso, is much faster. I assume ports used one to two cores; I know Batman: Arkham City used two cores, though all of those games were ports of the Xbox 360 versions, and some of the PlayStation 3 versions.

It is crazy how everyone loves to take a dump on Nintendo and the Wii U. The Wii U reminds me of the Dreamcast because of EA's flip-flopping. The main reason the Wii U lacks huge third-party support is not its architecture, which is actually far simpler than the Xbox 360's and PlayStation 3's, nor the performance of the hardware, but that Nintendo denied the extreme DRM measures that 3rd-party developers wanted, so the Wii U was trashed. Then Microsoft announced DRM in the Xbox One, and the customer backlash reached epic proportions, so they saw it wouldn't work; Sony had agreed to the same thing as the Xbox One, though after seeing the backlash they did a 180 to save face. That is the truth and the fact, because I won't be blindsided by 3rd parties and Sony's ninja smoke bombs. Business is also politics, and we all know that in politics there are always conspiracies and deals under the table that only those with deep roots can know about, though those who do research will find out and uncover it.
Old 10-02-2013, 08:05 AM   #5
sm625
Diamond Member
 
sm625's Avatar
 
Join Date: May 2011
Posts: 4,816
Default

I've never really had a problem with Nintendo hardware. Not since they learned how to program their way around the SNES slowdown back in 1992. The problem is the controls. That wavy wand thing is just not precise at all. It just doesn't work. I'm not convinced it will ever work. I'm not saying Xbox and PSX are any better. The analog sticks on those consoles are just pure garbage, and always have been. The N64 analog stick remains light years beyond all these new consoles, and it's an almost 20-year-old design. For frack's sake, why? No one really seems to give a damn that the analog sticks are so crappy. But whatever, I simply will not play any of these consoles if they are not going to give me a better controller than what I had 17 years ago.
__________________
I am looking for a cheap upgrade to my 3 year old computer.
AT forum member #1: Buy a 4790k

I am looking for a way to get 10 more fps in TF2.
AT forum member #2: Buy a 4790k
Old 10-02-2013, 09:55 AM   #6
eyeofcore
Member
 
Join Date: Oct 2013
Posts: 50
Default

Quote:
Originally Posted by sm625 View Post
I've never really had a problem with Nintendo hardware. Not since they learned how to program their way around the SNES slowdown back in 1992. The problem is the controls. That wavy wand thing is just not precise at all. It just doesn't work. I'm not convinced it will ever work. I'm not saying Xbox and PSX are any better. The analog sticks on those consoles are just pure garbage, and always have been. The N64 analog stick remains light years beyond all these new consoles, and it's an almost 20-year-old design. For frack's sake, why? No one really seems to give a damn that the analog sticks are so crappy. But whatever, I simply will not play any of these consoles if they are not going to give me a better controller than what I had 17 years ago.
So you are crying about analog sticks and thus you don't buy those consoles? What is wrong with your head, did you fall on your head when you were born? I mean, c'mon... Because of one silly thing; if it were about the games, then I would have understood, but this is just... *facepalm*
Old 10-02-2013, 10:00 AM   #7
Exophase
Platinum Member
 
Join Date: Apr 2012
Posts: 2,263
Default

Quote:
Originally Posted by eyeofcore View Post
*X360 Xenon in-order 3 Execution Units vs Xbox One/PlayStation 4 Jaguar out-of-order ? execution units vs Wii U Espresso out-of-order 15 execution units
Hilarious. You can't make this stuff up.

.. okay, apparently you can.
Old 10-02-2013, 10:54 AM   #8
GalaxyWide
Member
 
Join Date: Sep 2012
Posts: 30
Default

Quote:
Originally Posted by eyeofcore View Post
So you are crying about analog sticks and thus you don't buy those consoles? What is wrong with your head, did you fall on your head when you were born? I mean, c'mon... Because of one silly thing; if it were about the games, then I would have understood, but this is just... *facepalm*
Because of the games? How many games that actually matter (e.g., AAA titles commonly played by the general public) are PC-only? Or maybe you meant it the other way around? I cannot think of a better reason NOT to use consoles than their horrid input devices; it's why I only game on PC, where I can get real control with a mouse. Why should I buy something that is harder to use, and single-purpose to boot, when I can have a multi-purpose machine that does everything better? The only reason I can think of to buy a console is Halo (still mad they stopped making them for PC) or local multiplayer.

Also, it's a bit dickish to go calling names and personally insulting people.
Old 10-02-2013, 11:11 AM   #9
eyeofcore
Member
 
Join Date: Oct 2013
Posts: 50
Default

Quote:
Originally Posted by Exophase View Post
Hilarious. You can't make this stuff up.

.. okay, apparently you can.
You can contribute, how many execution units does Xenon have in total?

We know Xenon has 3 cores and 6 threads, right? PlayStation 4 has 8 cores though I don't know how many execution units it has and Wii U has 3 cores with 15 execution units in total.

Quote:
Originally Posted by GalaxyWide View Post
Because of the games? How many games that actually matter (e.g., AAA titles commonly played by the general public) are PC-only? Or maybe you meant it the other way around? I cannot think of a better reason NOT to use consoles than their horrid input devices; it's why I only game on PC, where I can get real control with a mouse. Why should I buy something that is harder to use, and single-purpose to boot, when I can have a multi-purpose machine that does everything better? The only reason I can think of to buy a console is Halo (still mad they stopped making them for PC) or local multiplayer.

Also, it's a bit dickish to go calling names and personally insulting people.
Where did I call names? Where? What a failure...

Your existence showcases your failure as a gamer. So you cry about the controls being inferior even though they do the job rather well? And that's why you miss out on some exclusives, over one simple thing, really? You came here being dickish and then you accuse me of being exactly that, while you are in fact the one acting dickish, like a prick...

Nice, already a couple of people intentionally derailing the thread; why don't you get banned?!

Last edited by eyeofcore; 10-02-2013 at 11:21 AM.
Old 10-02-2013, 11:52 AM   #10
Exophase
Platinum Member
 
Join Date: Apr 2012
Posts: 2,263
Default

Quote:
Originally Posted by eyeofcore View Post
You can contribute, how many execution units does Xenon have in total?

We know Xenon has 3 cores and 6 threads, right? PlayStation 4 has 8 cores though I don't know how many execution units it has and Wii U has 3 cores with 15 execution units in total.
There is public documentation for Cell and the PowerPC 750. The PPE in Cell is close to the same as the XBox 360's Xenon chip, with a few differences in the SIMD part. You can find all the information you need about Jaguar in AMD's optimization manual. I could go through and list them all for you (I've looked at them enough to know you're way, way off), but there's no real point, because the number of execution units is a terrible proxy for performance. It's meaningless without knowing what type of operations those units perform, how wide the front end and schedulers are, and a bunch of other aspects that determine how well those units can be kept fed.

I really have to marvel at your logic though: you go from saying XBOne/PS4 has 4 units but Wii U has 15 because it's 5 * 3 cores... so you're basically saying that XBOne/PS4 has half a unit per core, right?

The rest of your post is just full of misinformation or poor arguments, a couple of the most obvious examples:
- Your FLOPs comparison is totally crazy. Wii U's CPU cores can execute 2 fmadds per cycle, so 4 FLOPs/clock. XBox360 can execute 8. So that's 3 * 4 * 1.24 = 14.88 GFLOP/s for Wiiu vs 3 * 8 * 3.2 = 76.8 GFLOP/s for XBox 360. It's well known that Wii U is lacking in raw floating point SIMD throughput.
- The stuff about Wii being only 20% slower than PS3's PPE is based on measurements of a homebrew emulator ported to both systems. This is not even remotely representative of game performance when you consider that they used a compiler much worse at generating PS3 code (GCC vs IBM's compiler), were porting something written for PCs, were running an application poorly suited to the PS3 (emulators can be very branchy, and SNES9x blows through cache with its software rendering), didn't use any handwritten SIMD code as PS3 games often do, and, as far as I'm aware, were running on PS3 Linux. He even admits a couple of posts later that the situation is different for game developers who optimized for the platform.
- The conclusion that Wii U's CPU can be 3-4 times faster because of random other numbers you listed is incredibly arbitrary
- Dual core Cortex-A8 doesn't exist for anything because the design doesn't support multicore coherency, and it's stunning to think Nintendo would use a Cortex-A8 for some little embedded peripheral processor when they're still sticking their handhelds with ARM11s. At any rate, whatever this processor is has zero bearing on game performance, there's zero evidence about anything used for "background OS tasks." There's also strong evidence that there's no 1MB of SRAM dedicated anywhere around where this CPU is believed to be. The 1MB of SRAM for mem0 is totally different.
- How could eDRAM embedded in the GPU practically eliminate latency between it and the CPU? Another comment which makes no sense. I think you meant it reduces latency of memory accesses from the GPU for buffers stored on eDRAM instead of main memory.
- Eyefinity lolz, I love this circular reasoning "since it's Radeon HD 5000 it must be Eyefinity, since it must have Eyefinity it can't be based on Radeon HD 4000." Here's a factoid - Nintendo could have this customized however they want. And streaming content to the controller is not in any way related to Eyefinity.
- "Some other eDRAM chip had this bandwidth so Wii U's should be at least the same since it's on a better process" is another terrible argument. You can more or less tell by die shot that the eDRAM isn't outputting more than 1024 bits per cycle, so unless it's effectively closed at > the 550MHz core speed (which would be a less than ideal design choice) it'll be ~70GB/s.. and for the level of graphics we've seen from anything there's no reason to believe it's using for or can benefit from a super high eDRAM bandwidth
- It doesn't matter how much AMD evolved their GPUs in the last N years, that doesn't have a bearing on what Nintendo decides to use, especially if they were only comfortable freezing their hardware decisions years ago. Based on your logic, if someone told you in 2011 3DS would be using ARM you'd say it'd be coming out with Cortex-A9, but no, it came out with ARM11 which was first in devices like 5 years prior.
- "Has DX11 but can't use it because of the OS" is really dumb; when people say "DX" they mean the feature set, not actual DirectX, since of course Nintendo hardware isn't running DirectX. And if you think the OS would limit the feature set like that, you're crazy; game developers would have access to the metal to get around it.
- "Minor Power7 architecture implementation involving memory allows shave off 10 or more percent of texture size without loss of quality(special compression or something)" Seriously where do you get this crap? How would Power7 have anything to do with texture compression in the first place. The mind boggles.
- Using the PSU rating to try to say that the thing has all this untapped potential because it's running well below that is beyond ridiculous...

Ugh, I can't go on any more, you've just loaded so many silly assumptions and bad arguments into one post, it's just too much for me >_<
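The peak-FLOPS comparison a few bullets up can be re-derived in a couple of lines (the per-cycle FLOP counts are as stated in that bullet):

```python
# Peak GFLOP/s = cores * FLOPs per cycle per core * clock in GHz.
def peak_gflops(cores: int, flops_per_cycle: int, clock_ghz: float) -> float:
    return cores * flops_per_cycle * clock_ghz

print(round(peak_gflops(3, 4, 1.24), 2))  # 14.88 (Espresso: 2 paired-single fmadds/cycle)
print(round(peak_gflops(3, 8, 3.2), 1))   # 76.8  (Xenon: 4-wide SIMD fmadd/cycle)
```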
Old 10-02-2013, 12:15 PM   #11
VirtualLarry
Lifer
 
VirtualLarry's Avatar
 
Join Date: Aug 2001
Posts: 26,106
Default

Quote:
Originally Posted by Exophase View Post
Ugh, I can't go on any more, you've just loaded so many silly assumptions and bad arguments into one post, it's just too much for me >_<
This. Implausible fanboy rant is implausible fanboy rant.
__________________
Rig(s) not listed, because I change computers, like some people change their socks.
ATX is for poor people. And 'gamers.' - phucheneh
haswell is bulldozer... - aigomorla
"DON'T BUY INTEL, they will send secret signals down the internet, which
will considerably slow down your computer". - SOFTengCOMPelec
Old 10-02-2013, 12:24 PM   #12
Soundmanred
Lifer
 
Join Date: Oct 2006
Location: Czelktik, Uzbenistaz
Posts: 10,668
Default

Damn, what a way to make an introduction. Another know-it-all defensive asshole.
__________________
Nothing like doing a little meth and defending the honor of lions on the internet. NOTHING.
Old 10-02-2013, 12:27 PM   #13
eyeofcore
Member
 
Join Date: Oct 2013
Posts: 50
Default

I am sorry for my failure; I am not a tech-savvy person, though I just want to know the truth. I did some research, though my lack of in-depth knowledge is a big minus...

Can you please make as accurate a comparison as possible between the Wii U's hardware and the Xbox 360's? Will you? I want to know the truth. I am not a fanboy; I know that I stated some things wrong, though I only wanted to point things out. For example, the eDRAM is embedded in the GPU, so latency there should be very low compared to the Xbox 360's eDRAM, which is not embedded into the GPU and/or not embedded at all?! Right?

What about the pipeline? If I am correct that the Xbox 360 and PlayStation 3 have something like 32- or 40-stage pipelines, then they should suffer severe penalties...

Exophase... Please do me a favor, or I'll just need to save up a freaking 3000$ for a Wii U development kit just to find out its performance, and that scenario will need me saving for 8 months in a row or more, depending on whether I find a well-paid job. 3000$ plus transportation plus 25% tax, and that will be close to 4000$. -_-"
Old 10-02-2013, 12:39 PM   #14
Exophase
Platinum Member
 
Join Date: Apr 2012
Posts: 2,263
Default

Why are you so invested in this? You say you're not a fanboy but you definitely look like you're on a mission to prove to everyone that Wii U is so much better than they think. What difference does it even make? The games will be whatever the games will be.

If you want to know what other people have figured out about Wii U read a thread on a more informed forum, like this one: http://forum.beyond3d.com/showthread...+investigation

Just don't make a post like this there, they'll eat you alive. And understand that there's a ton people have no idea about because Nintendo has revealed so little about the hardware.

Don't worry about saving up for a Wii U dev kit, Nintendo will never sell one to a non-licensed developer (and they wouldn't do it for the paltry price $3k) and even if you had one you probably wouldn't learn much of anything from it.
Exophase is offline   Reply With Quote
Old 10-02-2013, 12:54 PM   #15
eyeofcore
Member
 
Join Date: Oct 2013
Posts: 50
Default

Quote:
Originally Posted by Exophase View Post
Why are you so invested in this? You say you're not a fanboy but you definitely look like you're on a mission to prove to everyone that Wii U is so much better than they think. What difference does it even make? The games will be whatever the games will be.

If you want to know what other people have figured out about Wii U read a thread on a more informed forum, like this one: http://forum.beyond3d.com/showthread...+investigation

Just don't make a post like this there, they'll eat you alive. And understand that there's a ton people have no idea about because Nintendo has revealed so little about the hardware.

Don't worry about saving up for a Wii U dev kit, Nintendo will never sell one to a non-licensed developer (and they wouldn't do it for the paltry price $3k) and even if you had one you probably wouldn't learn much of anything from it.
I am not a fanboy; I was fascinated by the GameCube and a bit by the Wii, and I want to know about the Wii U since Nintendo seems to make well-balanced hardware compared to the competition...

You suggest Beyond3D to me? Hell no... Those guys said that the Wii U GPU is 176 GFLOPS, and that is not possible, since the Wii U's GPU is 40nm and 320 shaders fit easily when you look at the die size of the chip and the area the eDRAM has taken; with slight modifications/customization it could even have 384 shaders.

I need unbiased people... Not wannabe geeks.

The Radeon HD 5550 is a candidate, though the Radeon HD E6760 is also a candidate, because Nintendo made a deal that involved OpenGL ES 2.0 and the E6760 supported it. Also, ignoring the possible evolution of the Wii U's hardware is ignorant. It was known that the Wii U dev kit in 2011 used a Radeon HD 4850, but why would Nintendo stay with a GPU that consumes more power, has an older feature set, and delivers fewer GFLOPS per watt than GPUs that were available by that time? They had a bit more than a year to improve the Wii U, and they could easily have chosen the Radeon HD 5000 or 6000 series. Also, an Eyefinity-like feature is present in Call of Duty: Black Ops 2's local co-op zombie mode, and one of the Unity demos showed a game utilizing it.

It is not the 4000 series; it is either the 5000 or, more likely, the 6000 series, since Nintendo could want OpenGL ES 2.0 in their system, and if they have a custom API then it could utilize a customized OpenGL ES 2.0.
eyeofcore is offline   Reply With Quote
Old 10-02-2013, 03:32 PM   #16
Revolution 11
Senior Member
 
Revolution 11's Avatar
 
Join Date: Jun 2011
Posts: 779
Default

Ok, Gamecube might have been "well-balanced" compared to its console peers, but the Wii U is so clearly outmatched by both Xbox One and PS4, let alone any serious gaming PC.

Nintendo may be competing on other aspects of a console besides performance for better or worse but let's call a spade a spade. The Wii U is quite slow.
Revolution 11 is offline   Reply With Quote
Old 10-02-2013, 04:11 PM   #17
eyeofcore
Member
 
Join Date: Oct 2013
Posts: 50
Default

Quote:
Originally Posted by Revolution 11 View Post
Ok, Gamecube might have been "well-balanced" compared to its console peers, but the Wii U is so clearly outmatched by both Xbox One and PS4, let alone any serious gaming PC.

Nintendo may be competing on other aspects of a console besides performance for better or worse but let's call a spade a spade. The Wii U is quite slow.
I know that; I only want to know how powerful it is...

http://www.ign.com/boards/threads/of...post-481660123

The Wii U's very first development kit, at E3 2011, had a Radeon HD 4850; a later (alpha) development kit had a Radeon HD 5000 series GPU; and near the end of the Wii U's development, close to its release, the final development kit supposedly had a GPU based around the Radeon HD E6760, which supports OpenGL ES 2.0, a low-level API? Am I correct that Nintendo could have a custom/semi-custom API based around OpenGL 4.3 and OpenGL ES 2.0?

It seems the Wii U GPU is around 352 GFLOPS, and since it supports a DX11 feature set and OpenGL 4.3/OpenGL ES 2.0, it extends its lead even further because of the efficiency of the hardware.
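For what it's worth, both GFLOPS figures thrown around in this thread come from the same standard peak-throughput formula applied to rumored, unconfirmed specs (320 or 160 stream processors at a rumored 550 MHz, two FLOPs per ALU per clock for a multiply-add):

```python
# Peak single-precision throughput for a rumored GPU configuration.
# All inputs here are rumors, not confirmed Nintendo specs.

def peak_gflops(shaders, clock_mhz, flops_per_clock=2):
    """Peak GFLOPS = shaders x clock x FLOPs per ALU per clock (2 for MAD)."""
    return shaders * clock_mhz * 1e6 * flops_per_clock / 1e9

print(peak_gflops(320, 550))  # 352.0 -> the "352 GFLOPS" estimate
print(peak_gflops(160, 550))  # 176.0 -> the Beyond3D "176 GFLOPS" estimate
```

So the disagreement in the thread is really about the shader count (160 vs. 320), not the arithmetic.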
eyeofcore is offline   Reply With Quote
Old 10-02-2013, 04:23 PM   #18
SecurityTheatre
Senior Member
 
Join Date: Aug 2011
Posts: 672
Default

Quote:
Originally Posted by eyeofcore View Post
I am sorry for my failure; I am not a tech-savvy person, though I just want to know the truth. I did some research, though my lack of in-depth knowledge is a big minus on my part...
This....

Why would you go on a highly-technical rant using bogus information.... in a highly technical forum.....

And then apologize by saying you don't know what you're talking about...

Seriously, what's the point?

Actually...

This part was worth it.

It made me LOL. I applaud you for that.

Quote:
Wii U CPU is next generation compared to Xbox 360 and people that say otherwise should just shut up, be ashamed, have a seat, accept the facts and deal with it and here is the proof; http://systemwars.com/forums/index.p...tage-pipeline/
Reading about technical features like OOP and the set associativity of caches delivered in breathless, sentence-less shouting made my afternoon.

Seriously, the guys working in the systems lab down the hall came to see what was so funny.

*wipes eyes*

Thanks!
SecurityTheatre is offline   Reply With Quote
Old 10-02-2013, 09:37 PM   #19
ultimatebob
Lifer
 
ultimatebob's Avatar
 
Join Date: Jun 2001
Location: Connecticutistan
Posts: 16,790
Default

Quote:
Originally Posted by eyeofcore View Post
I am sorry for my failure; I am not a tech-savvy person...
Then why are you posting something (that at least attempted to be) technical in the Highly Technical forum? Is this some sort of Internet troll training?

Stick with Console Gaming next time.
__________________
<---
<--- Blame this guy if you do not like this post.
<---

Last edited by ultimatebob; 10-02-2013 at 09:39 PM.
ultimatebob is offline   Reply With Quote
Old 02-28-2014, 03:02 PM   #20
N-A-N-0
Member
 
Join Date: Sep 2013
Posts: 26
Default

Despite the nastiness he received, he was right on several points. The Wii U is basically a 360+, or what the N64 was to the PS1 in terms of raw horsepower plus more modern features. It certainly isn't comparable hardware-wise to the PS4/XB1, though, other than using a quarter of the amount of the same kind of RAM as the XB1, bog-standard DDR3-1600 (though I believe Microsoft clocked theirs higher). That's another point: the Wii U not only has 4 times the RAM of the PS3 and Xbox 360, but it's better stuff than the old GDDR3. I don't know much about the PS3's 256 MB, or half the RAM, though.

The big problem though is obviously that the big white beast that was the phat Xbox 360 went on sale 7 years before the Wii U. Way back in 2005, the Xbox 360 was extremely impressive. It's a shame that the Wii U is pretty comparable to that.

Last edited by N-A-N-0; 02-28-2014 at 03:05 PM.
N-A-N-0 is offline   Reply With Quote
Old 02-28-2014, 04:45 PM   #21
Exophase
Platinum Member
 
Join Date: Apr 2012
Posts: 2,263
Default

Quote:
Originally Posted by N-A-N-0 View Post
Despite the nastiness he received, he was right on several points.
Anything he presented that went against broadly common knowledge was wrong. Sadly he's been vandalizing Wikipedia to include these same made up facts based on incredible leaps of logic.

Quote:
Originally Posted by N-A-N-0 View Post
It certainly isn't comparable hardware-wise to PS4/XB1 though, other than using 1/4th of the same kind of RAM as XB1, bog-standard DDR3-1600,(though I believe Microsoft clocked theirs higher.) That's another point, the Wii U not only has 4 times the RAM of the PS3 and Xbox 360, but it's better stuff than the old GDDR3. Don't know much about the PS3's 256 MB or half the RAM though.
You can't just look at the RAM technology and the clock speed, you have to look at the bus width. The DDR3 in XB1 isn't just higher clocked, it's in a 256-bit configuration. In Wii U it's only in a 64-bit configuration. Every clock cycle of a transaction XB1's RAM transfers four times as much data.
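Exophase's point can be made concrete with the standard peak-bandwidth formula (data rate in MT/s times bus width in bytes); the bus widths are from the post above, and DDR3-2133 for the XB1 is an assumption on my part:

```python
# Peak theoretical DRAM bandwidth: data rate (MT/s) x bus width (bytes).
# Bus widths per the post above; XB1's DDR3-2133 speed grade is assumed.

def peak_bandwidth_gbs(megatransfers, bus_width_bits):
    """Peak bandwidth in GB/s for a given data rate and bus width."""
    return megatransfers * 1e6 * (bus_width_bits / 8) / 1e9

wii_u = peak_bandwidth_gbs(1600, 64)   # DDR3-1600 on a 64-bit bus
xb1 = peak_bandwidth_gbs(2133, 256)    # DDR3-2133 on a 256-bit bus

print(f"Wii U main RAM: {wii_u:.1f} GB/s")  # 12.8 GB/s
print(f"XB1 main RAM:   {xb1:.1f} GB/s")    # ~68.3 GB/s
```

The roughly 5x gap comes mostly from the 4x wider bus, with the higher clock contributing the rest; this is also why the eDRAM pools on both consoles matter so much.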
Exophase is offline   Reply With Quote
Old 02-28-2014, 06:40 PM   #22
N-A-N-0
Member
 
Join Date: Sep 2013
Posts: 26
Default

Quote:
Originally Posted by Exophase View Post
Anything he presented that went against broadly common knowledge was wrong. Sadly he's been vandalizing Wikipedia to include these same made up facts based on incredible leaps of logic.
So that's been him... I wonder where these mysterious "Latte = AMD Radeon 5000/6000 series" specs have been coming from. The GPU die doesn't even look that similar to anything, but if Digital Foundry says it's about a match for a 4650/4670, roughly what they figured back at E3 2011, that's good enough for me.


Quote:
Originally Posted by Exophase View Post
You can't just look at the RAM technology and the clock speed, you have to look at the bus width. The DDR3 in XB1 isn't just higher clocked, it's in a 256-bit configuration. In Wii U it's only in a 64-bit configuration. Every clock cycle of a transaction XB1's RAM transfers four times as much data.
True, though if we're talking 720p and sub-HD 7th gen games, the better GPU is more important.

Last edited by N-A-N-0; 02-28-2014 at 06:42 PM.
N-A-N-0 is offline   Reply With Quote
Old 03-03-2014, 08:59 AM   #23
sm625
Diamond Member
 
sm625's Avatar
 
Join Date: May 2011
Posts: 4,816
Default

Quote:
Originally Posted by eyeofcore View Post
So you are crying about analog sticks and thus you don't buy those consoles? What is wrong with your head, did you fall on your head when you were born? I mean, c'mon... Because of one silly thing. If it were about the games then I would have understood, but this is just... *facepalm*
What is wrong with your head, besides being filled with troll? I refuse to spend my money on a console whose controller design is demonstrably, empirically, provably worse than a design from 20 years ago. Mario 64 is almost 20 years old, and you can walk a tightrope in that game at 5 different speeds while doing it. You can't do anything like that on these crappy Xbox/PlayStation controllers. You can walk, or you can run, and that's about it. And you'd be extremely lucky to be able to switch between those two speeds without altering course. Bad design is simply bad design. Rather than feed or reward these companies for their stupidity and dumbing down, I just moved to mouse/keyboard.
__________________
I am looking for a cheap upgrade to my 3 year old computer.
AT forum member #1: Buy a 4790k

I am looking for a way to get 10 more fps in TF2.
AT forum member #2: Buy a 4790k
sm625 is offline   Reply With Quote
Old 03-07-2014, 03:40 PM   #24
N-A-N-0
Member
 
Join Date: Sep 2013
Posts: 26
Default

Quote:
Originally Posted by sm625 View Post
What is wrong with your head, besides being filled with troll? I refuse to spend my money on a console that contains a controller design that is demonstrably, empirically, provably worse than a design from 20 years ago. Mario 64 is almost 20 years old and you can walk a tightrope in that game, and you could walk at 5 different speeds while doing it. You cant do anything like that on these crappy xbox/playstation controllers. You can walk, or you can run, and that's about it. And you'd be extremely lucky to be able to switch between those two speeds without altering course. Bad design is simply bad design. Rather than feed or reward these companies for their stupidity and dumbing down, I just moved to mouse/keyboard.
Oh my god.... You realize that's because the games have less variation in the animations and speeds mapped to the analog stick, which Mario 64 was a showcase for...? It has nothing to do with the analog sticks themselves.
N-A-N-0 is offline   Reply With Quote
Old 03-08-2014, 03:59 PM   #25
Batmeat
Senior Member
 
Batmeat's Avatar
 
Join Date: Feb 2011
Posts: 400
Default

I don't understand the point of the original poster's argument here. Why are you comparing the Wii U to the Xbox 360 and the PlayStation 3 when you should be comparing the Wii to them? The Wii U, generation-wise, should be compared to the Xbox One and PlayStation 4.
__________________
My Heatware
Batmeat is offline   Reply With Quote

Tags
cpu, gpu, hardware, next generation, wii u
