Author Topic: On the Wii U's CPU  (Read 3196 times)


Offline MukiDA

  • Score: 4
On the Wii U's CPU
« on: December 03, 2012, 06:50:19 AM »
So I'm not sure that I'm entirely knowledgeable enough to discuss this, or if I'm even in the right place, but I figured I'd drop my $0.02 here rather than yell at the podcast.


The bad? The CPU is really as bad as it looks. It's basically three overclocked Wii CPUs, and without that graphics chip, it can't even play high-def video. (Key technical point: the Wii's CPU cannot handle H.264, a popular Blu-ray codec, even in standard definition. 1080p is a little under six times the pixels of SD, so I will stand by my point.)
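
(Rough pixel math behind that "six times" figure, for anyone who wants to check it; the exact ratio depends on which SD frame you assume:

    640 x 480  (4:3 SD)       =   307,200 px  ->  1080p is ~6.75x
    720 x 480  (DVD-style SD) =   345,600 px  ->  1080p is 6.0x
    854 x 480  (16:9 SD)      =   409,920 px  ->  1080p is ~5.1x
    1920 x 1080               = 2,073,600 px

So "a little under six times" is in the right ballpark by pixel count alone.)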


The good? It probably doesn't matter as much as it appears to. The CPU's main weakness (aside from the low clock rate) is the lack of the wide SIMD units the other consoles have. Basically, SIMD (single instruction, multiple data) units let a CPU run one operation across a whole batch of numbers at once, which is what makes heavy media and physics work feasible. They're what allow the PS3 to stream video to the PSP, or even just decode Blu-ray for that matter.
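
(If "SIMD" means nothing to you, here's the idea in a nutshell: one instruction operating on several values at once. A toy sketch using x86 SSE intrinsics -- PC hardware, not what any of these consoles actually run, it's just to show the concept:

#include <stdio.h>
#include <xmmintrin.h>  /* SSE intrinsics */

int main(void)
{
    float a[4] = {1.0f, 2.0f, 3.0f, 4.0f};
    float b[4] = {10.0f, 20.0f, 30.0f, 40.0f};
    float scalar[4], simd[4];

    /* Scalar path: one add per element, four instructions' worth of work. */
    for (int i = 0; i < 4; i++)
        scalar[i] = a[i] + b[i];

    /* SIMD path: load four floats into each 128-bit register, then a
       single add instruction handles all four lanes at once. */
    __m128 va = _mm_loadu_ps(a);
    __m128 vb = _mm_loadu_ps(b);
    _mm_storeu_ps(simd, _mm_add_ps(va, vb));

    for (int i = 0; i < 4; i++)
        printf("scalar %g, simd %g\n", scalar[i], simd[i]);
    return 0;
}

Same answers either way; the point is the second version does the math in a quarter of the instructions, and real SIMD code does this across thousands of elements per frame.)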


It's something that, more than likely, the Wii U's graphics chip can pick up the slack for and then some. Since about two ATI generations after the 360 (and one NVIDIA generation after the PS3), graphics chips have been able to handle SIMD-like workloads by themselves. The problem on the PC is that for them to handle anything not graphics-oriented (like the hair, cloth, and liquid physics that we're starting to see in games like Borderlands 2), you basically have to bounce data back and forth between your main RAM and graphics RAM, which slows things down enough to not be worth it.
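
(A back-of-the-napkin sketch of what that bouncing looks like, in plain C -- the "GPU" function here is a made-up stand-in, not any real graphics API, just the shape of the round trip:

#include <stdio.h>
#include <string.h>

#define N 4096
static float main_ram[N];      /* stand-in for system memory             */
static float graphics_ram[N];  /* stand-in for the card's dedicated VRAM */

/* Hypothetical GPU job -- in reality this would be a compute shader. */
static void gpu_simulate_cloth(float *vram_buf, int n)
{
    for (int i = 0; i < n; i++)
        vram_buf[i] += 0.016f;  /* pretend physics step */
}

int main(void)
{
    /* The discrete-GPU pattern, every single frame: upload the data,
       run the job, copy the results back. The two copies across the
       bus are the overhead in question. */
    memcpy(graphics_ram, main_ram, sizeof main_ram);  /* upload   */
    gpu_simulate_cloth(graphics_ram, N);              /* compute  */
    memcpy(main_ram, graphics_ram, sizeof main_ram);  /* readback */

    printf("%f\n", main_ram[0]);
    return 0;
}

Kill those two copies and the whole objection goes away, which is where the next bit comes in.)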


On the Wii U there's no such separation. It's all one memory bank, and I'm sure it won't take long for Epic (among other devs) to start implementing the sharing of some workloads between the two chips, if for no other reason than to take advantage of current and upcoming developments in hardware (AMD's Fusion chips, the next Xbox). That said, there's no way they were gonna get something like that ready for launch (and even if they did, games in development would be using much older versions of their engines by then. See Madden).


Without any GPGPU (using a graphics chip for CPU purposes) code in launch titles, developers basically have an Xbox 360 CPU with zero optimizations and close to a third the clock rate, and it shows. Assuming Epic and Ubisoft optimize their engines to snatch a little extra computing power from the GPU, it should look substantially different next year.


That said, the Wii U WILL be a pretty big step down from the 2013-on systems from Sony & Microsoft, but it won't be anywhere NEAR as much of a disparity as the one the Wii had to deal with. It's only slower, whereas the Wii just outright didn't have functionality that PC devs had been used to for five years by the time it launched (basically, programmable graphics shaders). If you want a comparison point, compare Doom 3 on the original Xbox to the BFG Edition that came out this year on 360. It'll be closer to that.

Edit note: I am never writing a wall of text rant on my iPad ever again.

« Last Edit: December 03, 2012, 06:56:13 AM by MukiDA »

Offline MegaByte

  • NWR Staff... Can't win trivia
  • NWR Staff Pro
  • Score: 31337
Re: On the Wii U's CPU
« Reply #1 on: December 03, 2012, 05:47:02 PM »
The bad? The CPU is really as bad as it looks. It's basically three overclocked Wii CPUs, and without that graphics chip, it can't even play high-def video. (Key technical point: the Wii's CPU cannot handle H.264, a popular Blu-ray codec, even in standard definition. 1080p is a little under six times the pixels of SD, so I will stand by my point.)
This point, aside from the shoddy math assumptions, is simply not true.
Aaron Kaluszka
Contributing Editor, Nintendo World Report

Offline ShyGuy

  • Fight Me!
  • Score: -9660
Re: On the Wii U's CPU
« Reply #2 on: December 03, 2012, 10:49:50 PM »
Don't the short pipeline and the 32 MB of fast RAM compensate?

Offline azeke

  • He's ruining Splatfest for the rest of us
  • Score: 11
Re: On the Wii U's CPU
« Reply #3 on: December 03, 2012, 11:18:45 PM »
The bad? The CPU is really as bad as it looks. It's basically three overclocked Wii CPUs, and without that graphics chip, it can't even play high-def video. (Key technical point: the Wii's CPU cannot handle H.264, a popular Blu-ray codec, even in standard definition. 1080p is a little under six times the pixels of SD, so I will stand by my point.)
This point, aside from the shoddy math assumptions, is simply not true.
Yep. I've watched lots of videos encoded with H.264 in WiiMC, both as MP4 files and on YouTube. The resolution has to be low, though, but that has more to do with the Wii's own output resolution.

You don't even need to resort to homebrew to prove this. The Wii's official YouTube channel uses Google's WebM codec for video, which is in the same ballpark.

While I agree that the weak CPU is worrisome, we already have games as good-looking as Assassin's Creed III. At launch.
Winners don't hate and W101 rocks

Offline MukiDA

  • Score: 4
Re: On the Wii U's CPU
« Reply #4 on: December 04, 2012, 06:27:17 AM »
Don't the short pipeline and the 32 MB of fast RAM compensate?


EVERY other console on the market is also running on a Power-based RISC processor, so not really. And fast RAM isn't quite as fast, relatively speaking, in 2012.


Graphically, the embedded RAM on the graphics chip is gloriously fantastic. It was a great idea on the GameCube (and poorly implemented on the PS2, so we won't talk about that), it's what lets the Xbox 360 keep up with the PS3 (despite technically, and only technically, having a slower GPU), and come next gen it'll bite anyone who doesn't invest in the idea, HARD.


Clock-for-clock (so comparing the Wii U to the 360/PS3 per MHz), the Wii U has a better base processor. It has features, like out-of-order execution, missing from the other two that have been in PC processors since the bloody Pentium Pro. So MHz for MHz, it's faster. However, we're not going MHz vs MHz, we're going 1.2 GHz x 3 cores VS 3.2 GHz x 3 cores (yes, the 360 has three cores).
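
(Rough cycle math, taking those numbers at face value: 3 x 1.2 GHz = 3.6 GHz of total cycles per second versus 3 x 3.2 GHz = 9.6 GHz, so the 360 has roughly 2.7x the raw cycles to throw around, even before you count its vector units. Getting more done per cycle only closes so much of that gap.)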


We're also talking about some extra vector hardware on the 360's CPU (its VMX units) and the PS3's (the Cell's SPUs) that does not exist on the Wii U. Once again, you could write the same functions into graphics code, and because the RAM is shared between the CPU and GPU, you could probably do this without any performance hit.


BUT, this kind of code doesn't really exist yet (or it sure as hell isn't common -- the Wii U is the first console hardware to have this capability, save for the AMD Fusion processors on PC and a FAAAAR weaker/more limited/probably-not-useful variant in the 360), and the way they've been doing it on the other two has had 5-6 years to mature.


I can't say I'm all that worried. The only games that are really gonna take a hit are ones that are CPU-bound, and regardless of what you saw at launch, that's the exception more than the rule. Games have always been mostly GPU-bound. What we saw at launch was a lineup without much in the way of the more hardcore optimizations described above.


The fact of the matter is that we're on the precipice of using the shaders in a GPU to assist greatly in game logic. The way the next Xbox is designed, I can guarantee you we'll see that, and it's going to trickle down to the Wii U. The drop from those consoles will not be anywhere near as bad as it was this gen, and not only because the Wii U can output in high definition.

What we'll have is a system with decent processors and a useful graphics chip going up against more expensive boxes with faster chips, and given how much of this generation was PC-like in games development, we'll probably land in a spot where the Wii U is what they target for lower-end gaming rigs, and the next systems are where they drop off the higher-res stuff. So the difference is gonna amount to sliders in the PC version, rather than drastically different ports by a different studio with little-if-any resemblance to the "mainline" version of a title.


Plus Nintendo stuff is gonna look SIIIIICK =3


--- Also, I still stand by my H.264 comment. Check out the WiiMC stuff: most of the videos they have running at 480p under H.264 are YouTube clips, which have VERY low bitrates at a VERY simple profile and are all but violently removed from the version of that codec that goes into high-definition video outside of YouTube.
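
(Ballpark numbers, hedging a bit since exact figures vary: streaming-grade 480p H.264 tends to sit somewhere around 1 Mbps, while Blu-ray allows video bitrates up to 40 Mbps, usually with the expensive High Profile features like CABAC turned on. That's an order of magnitude or more in bits to decode every second, on top of the difference in pixels.)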
« Last Edit: December 04, 2012, 06:28:50 AM by MukiDA »