Quote
To name a few points where the XBox is more powerful:
Higher fillrate, higher clockspeeds (CPU and GPU), better GFLOPS performance, higher integer performance, more memory, faster overall memory bandwidth and pixel shaders, to name but a few. But wait, I forgot that the GCN is much more efficient. Egads, it must be more powerful.
Aren't we Mr. Mathematician?

You really think that bigger numbers make a better product, don't you?
"Higher fillrate"
Theoretically, yes, the Xbox does have a better fill rate. But I think that you should remember that the PS2 has an even higher theoretical fill rate still.

Does that mean the PS2 is more powerful than the Xbox? The problem is that these are theoretical peaks, and they don't relate in the slightest bit to real world performance.
But this is actually an example of where the GCN's efficiency comes into play. Nintendo remembered that they were creating a console hooked up to a TV rather than a PC attached to a monitor, and made some smart decisions. Flipper's 162MHz clock rate aligns it nicely with the fill rate requirements at 640x480: the resolution demands roughly 9Mpixels/sec at 30fps, which fits about 18 times into Flipper's 162MHz clock.
Flipper could theoretically redraw every pixel on-screen 18 times per frame. But Flipper likely has at the very least two pixel pipes in order to be competitive with PS2 and Xbox, and with two pipes it could redraw every pixel on-screen 36 times per frame. A more likely scenario is that Flipper has 4 pixel pipes, and could theoretically draw every pixel on screen 72 times per frame.
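The arithmetic behind those redraw figures can be sketched out. Assuming one pixel written per pipe per clock (the standard simplification, which ignores real-world pipeline stalls and overdraw overhead):

```python
# Sketch of the fill-rate arithmetic above. One pixel per pipe per clock
# is an idealized assumption; real hardware rarely hits its theoretical peak.
CLOCK_HZ = 162_000_000          # Flipper clock rate
WIDTH, HEIGHT, FPS = 640, 480, 30

pixels_per_frame = WIDTH * HEIGHT        # 307,200 pixels at 640x480
clocks_per_frame = CLOCK_HZ // FPS       # 5,400,000 clocks per frame at 30fps

for pipes in (1, 2, 4):
    redraws = pipes * clocks_per_frame / pixels_per_frame
    print(f"{pipes} pipe(s): ~{redraws:.1f} full-screen redraws per frame")
```

This gives roughly 17.6, 35.2, and 70.3 redraws for 1, 2, and 4 pipes; the 18/36/72 figures quoted above come from rounding the single-pipe number up to 18 before multiplying.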
"higher clockspeeds (CPU and GPU)"
It's often claimed that a 400MHz PowerPC will match a 700MHz Pentium 3. But the GameCube is clocked at 485MHz, and has special instruction sets that aid its ability to handle games. Meanwhile, the Xbox is missing half of its L2 cache, making it resemble a Celeron more than a Pentium 3. The clock speed argument in this case is completely misguided. It should be immediately obvious that the Xbox has the LEAST powerful CPU of this era's consoles, while I would have to give the trophy to the PS2 in terms of CPU performance. It's actually because people still think MHz means something that AMD has taken to naming their CPUs in odd ways, to mark how well they will perform against an equivalent Pentium 4. My CPU is clocked at 1.53GHz, but it will outperform a Pentium 4 at 1.8GHz in just about every benchmark.
As far as the GPU is concerned, yes you are correct here. The increased clock speed on the GPU will increase performance. However, there are other factors that will affect overall system performance. There's nothing preventing you from putting a Radeon 9800 Pro in the same system as a Pentium 2 233 MHz CPU and 32MB of RAM, but your computer will still run games slow as mud.
"GFLOPS performance"
You would not believe how long I have been looking for data to back you up. I am going to need some proof of this beyond official press releases. By my figuring, a 64-bit PowerPC should WASTE a Pentium III in floating-point performance. The closest I can find is this:
Pentium III 733: 2.9 GFLOPS
PowerPC 500MHz: 13.0 GFLOPS
Which would you say is more powerful at floating point operations? If you have data backing up your statement, then I would love to see it. The PowerPC was built around high floating point performance it seems.
"higher integer performance"
Integers aren't really used that much in 3D graphics. Besides, the systems are about equal in integer performance, from what I gather from the traditional P3 vs. PowerPC debate.
"more memory"
It's unified memory. That means that EVERYTHING in the Xbox has to share the same memory and bandwidth. The GameCube has less memory overall, but it has lower latency, and the CPU and the GPU don't have to share the same bandwidth.
"faster overall memory bandwidth"
I was going to post a whole lot of complicated math and figures to show you, but in the end I erased it all, because I figured it would go right over your head anyways, and it wouldn't do me any good. However, it all still exists if you want to see it.
Anyways, I would like to take this time to mention that the Xbox has one big chunk of memory. It's 64MB large, and is shared by all the parts of the system that need it, including the CPU, GPU, and SPU (which can't even talk to each other at a fast rate). Do you know what else that means? It means that the bandwidth must be SHARED amongst those components, and there is no way in hell that any one of them can reach the 6.4GB/s theoretical peak data rate. Seamus Blackley even admitted that the Xbox is seriously bottlenecked by its memory bandwidth, so I don't think that this is going to be a good place for you to try to show the Xbox's superior power. In fact, you will commonly see me referring to this very fact as a reason why the Xbox CANNOT live up to the potential of its higher numbers in most respects.
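Here is a rough sketch of why a shared bus can't give every component the headline number. The 6.4GB/s peak is the Xbox figure discussed above; the per-component demand numbers are made-up placeholders for illustration, not real measurements:

```python
# Toy model of a shared (unified) memory bus. The peak figure is the Xbox
# spec quoted above; the per-client demands below are invented for illustration.
PEAK_GBPS = 6.4

def effective_share(demands_gbps, peak=PEAK_GBPS):
    """Scale each client's demand down proportionally when the bus is oversubscribed."""
    total = sum(demands_gbps.values())
    scale = min(1.0, peak / total)
    return {name: d * scale for name, d in demands_gbps.items()}

# Hypothetical demands: GPU wants most of the bus, CPU and sound unit the rest.
demands = {"GPU": 5.0, "CPU": 2.0, "SPU": 0.5}
for name, got in effective_share(demands).items():
    print(f"{name}: wanted {demands[name]} GB/s, gets ~{got:.2f} GB/s")
```

Even this idealized model ignores arbitration overhead, so real contention would be worse: no single component ever sees the full 6.4GB/s while the others are active.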
"But wait, I forgot that the GCN is much more efficient. Egads, it must be more powerful."
Well, where should I start? I just listed a bunch of reasons why the GCN's efficiency makes it more powerful than the Xbox. But let's think of some more, shall we?
The GCN has a much shorter data pipeline than the Xbox, making it able to process information twice as fast per clock cycle...
The GCN has memory with MUCH lower latency meaning that less time is spent looking for the piece of data that you want, and more time transferring it. It has been described as being less like working with conventional forms of memory, and more like having a HUGE L3 cache.
Gecko has 4:1 inline data compression (covering L1 and L2 cache). On the Xbox side, data compression is supported only in the GPU; data must be decompressed before the Pentium 3 can work with it.
To continue the above point: basically what I am saying is that just about any form of data on the GCN can be compressed, making the bandwidth and space needed by operations much smaller.
The GCN has 64 bit RISC instructions, making code faster overall, and has a quite nice instruction set if I do say so myself.
GCN has a separate FIFO write-gather pipe for bursting graphics data to main memory while the bus is not busy.
As for raw power....
GCN has higher mathematical accuracy and performance.
GCN has twice the amount of L2 cache.
GCN has much higher texture read bandwidth.
GCN has a MUCH higher MIPS rating, due largely to the fact that it has 8X as many general-purpose registers as the Xbox.
GCN has a higher bus speed (162MHz), making memory bandwidth much less of a concern in most cases. (Again, I can post some math if you want.)
GCN has its memory divided into 3 separate logical sections:
Main Memory: very low latency and a fast bus make getting data quick and easy. Speed-critical things are placed here.
A-RAM: low speed memory used for items not requiring a lot of speed.
Video Memory: 3MB of SRAM, which is used for the Z-Buffer and Frame Buffers. This is on die, so needless to say, it's faster than anything on Xbox.
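Summing up the layout just described (the 24MB main-memory size is the commonly published GameCube spec, an assumption on my part; the 16MB A-RAM figure comes up again later in this post):

```python
# GameCube memory regions as described above. The 24MB main-memory figure is
# the commonly published spec (an assumption here, not stated in this post).
regions = {
    "Main Memory (1T-SRAM)": 24,   # MB; low latency, for speed-critical data
    "A-RAM":                 16,   # MB; slow auxiliary RAM (audio, CD cache)
    "Embedded Video Memory":  3,   # MB; on-die SRAM for Z-buffer / frame buffers
}
total = sum(regions.values())
print(f"GCN: {total}MB split across {len(regions)} dedicated pools, "
      f"vs. the Xbox's single unified 64MB pool")
```

The design trade-off is the whole point of the comparison: less total memory, but each pool has its own bus, so the CPU and GPU aren't fighting over one set of wires.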
Quote
Proof? Either way, what you're saying is that the XBox can do it without shaders, but it can use shaders to do it as well. What point are you trying to make here?
Proof is common sense among coders. Regardless, this was a stupid point to begin with. I was just correcting you.
Quote
No, but the ability to display higher resolution textures is a sign of greater power.
I could just as easily say that the ability to read that texture and move on to the next one in much less time, or to render twice as many texture layers per pass, is a sign of greater power.
But you know what? This is merely an example in which the systems are DIFFERENT, and not necessarily faster or slower or better than the other in practice.
Quote
No, if you read what I said again, you will find that what I said was that the XBox can do cubic environment mapping, which is better than the spherical environment mapping that GCN can do. Why can't GCN do it? Because it needs more power than spherical environment mapping does. It also gives better quality results.
GCN doesn't do it by default in hardware. Regardless, this whole point bothers me. If I posted two pictures side by side emphasizing the difference between the two, you likely wouldn't be able to pick which one was which anyways. Besides, it's very likely that Flipper could be programmed to do such effects.
Regardless, GCN seems to see a lot more bump mapping. Perhaps because the effect demands too much of the Xbox's power?

(See how easy it is to say, but how hard it is to back up?)
The fact of the matter is that unless you have a dev kit for both systems beside you, you don't know about this any better than I do.
Quote
What point are you trying to make here? I'm confused by your argument again. Are you saying that this guy is saying that GCN has shaders? If that's the case, I'm just going to point out that shadows are a very basic shader function that even the PS2 can do.
He was talking about water simulation, not shaders. My point was that your thinking implies that shaders are the end all means of producing pretty special effects, and only the Xbox has them. The same things done on Xbox by its shaders can be done just as easily on GameCube with different methods.
Quote
While I agree that there aren't a huge number of XBox games that use 5.1, and it isn't related to graphics, it's still a sign of greater power since 5.1 requires more RAM among other things, to pull off. Pro-logic II is a system that Dolby developed for PS2, I believe, but which it subsequently marketed to other companies and systems.
This kind of statement makes me want to bash my head into the wall for being dumb enough to fall for the bait and actually debate this with you.
First of all, I want to say that while most GameCube games support Prologic II, most Xbox games do not support surround. So much for the "greater power" idea.
Prologic technology dates WAAAAY back to 1987. Simply put, it's a way of producing surround sound from a stereo signal. Prologic II was its successor, adding separate left and right rear channels. It was a new thing around the time of the GameCube, and Factor 5 hassled Dolby into letting them use it in their game Rogue Leader. It was NOT developed for the PS2, but for home theatre. PS2 doesn't even freaking use Prologic.
GameCube has 16MB of A-RAM sitting around for whatever purpose the developer wants to use it for. A stands for Audio, as it was originally intended for Audio Buffering, but it's often used as a CD cache for better load times. At any rate, the GameCube can match the Xbox in most games as far as audio is concerned, so I don't really see the point of this whole argument.
Quote
Actually, that's been cancelled since PS2 has nowhere near enough power to pull off Deus Ex 2 in any way that would resemble a playable game.
Whatever. I could sling bullshit around all day just as easily as you, but I choose not to.
Quote
Whee, Factor 5 said it so it must be true! Or not. Since any effect that GCN and Xbox perform in hardware even a Pentium Pro can perform in software (given time), that argument holds little weight. Also, why do they have more experience with XBox than John Carmack? Since the XBox so closely resembles PC architecture, I think it's pretty safe to assume that he has a better understanding of it than Factor 5.
You seem to think that the Xbox can do anything because of its glorious pixel shaders, and that nothing on the GameCube can match those effects in hardware.
I think that since Factor 5 has programmed in ASM on both systems to create their DivX codecs for each system, and has actually created GAMES for GameCube, they would know a great deal more about how the GameCube works than John Carmack does. Carmack is a PC developer. He has never made a game for a console, and he probably doesn't even have a GameCube SDK (just a guess, though). He probably just doesn't want to port his entire game to a system with a completely different architecture. The Xbox is perfect for him, as any ASM code would likely still run, and it contains hardware very close to what he designed the game for.
Btw, I have never seen the article where John Carmack said that the Xbox was the only console capable of running Doom III, but have often heard of it. Can you link me please? I looked around, and I could not find one single article that stated that he said that Doom III was impossible on other consoles.
Quote
Wow, you completely missed the point here. What I was saying is that XBox uses LUT's to simulate true floating-point ops., but these can only be used to calculate vertices, nothing else. However, LUT's require less power to use, and so the XBox can calculate more vertices using LUT's than the GCN can using true floating-point ops.
Heh. Yeah, I checked into this, and you are right. But I gather that nVidia was trying to make up for their bandwidth limited GPU by scrapping some things such as accurate math. It's common knowledge that nVidia cards are mainly memory bottlenecked, and on the Xbox, that bottleneck can choke the life right out of an nVidia GPU.
It's a daring attempt at eking some more power out of the GPU, and I dare say it works to a degree, but GCN's compressed vertices and the much higher floating-point performance of its CPU can take quite a load off of Flipper. That's the real key to understanding the GCN hardware: the CPU and the GPU are meant to talk to each other and divide the work. There's some seriously high bandwidth between Flipper and Gecko.
Quote
You mean you don't actually know? Integer calculations are calculations performed by a CPU (and only a CPU) that end up with a number, with no decimal points. Floating-point calculations end up with a number that has a decimal point somewhere along it (i.e. is not an integer), and in graphics-related computers these are often calculated by a GPU (to free up the main processor for other things). What this roughly translates to: floating-point calculations create things. Integer calculations tell things what to do. Example:
No, I knew what they meant. I was just curious if you knew, as it seemed to me like you didn't, and were just using big words that you thought would sound convincing. Since you never actually defined them, I can't prove this...
But I gather that you think that MIPS means "Millions of Integers Per Second" or something like that, given all that talk about integers. MIPS actually means "Millions of Instructions Per Second", or in some circles "Meaningless Indication of Processor Speed". An instruction is the basic operation of a CPU, and involves changing data around. I don't really feel like a long-winded explanation, as I am sure I would be wasting my time. Learn some ASM, and you will know all about operations in short order. I will just say that these numbers are very difficult to compare, and are essentially meaningless in the real world anyways. Yes, integers are manipulated via operations, but there are more operations than just those involving integers.
All that talk about places that use integers was very nice, but I don't think that the ability to do 20 gazillion operations per second really matters when deciding whether the box is blown up or not, or whether the character wants to move or not. Things like that are usually only done once per vblank anyways (1/60th of a second).
The thing is that floating-point numbers are used for a lot of the things you mentioned. 3D movement needs to be SMOOTH, and if your physics modelling is based on integer arithmetic, you are going to be really screwed when the game is running. For example, if you fall off a bridge and the world is expressed in integers, you won't fall in a smooth way: in the first moment you will be falling at 1m/s, then your speed will jump straight to 2m/s, with no transition between the two, and you will fall at a constant rate between each jump. This is why you need to track object positions with floating-point numbers. Positions of objects need to be tracked with sub-pixel accuracy.
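That falling-off-a-bridge example can be sketched in a few lines. The gravity constant and the 60Hz timestep are just illustrative assumptions; the point is what integer truncation does to the motion, not the physics itself:

```python
# Compare integer vs. floating-point position tracking for a falling object.
# Gravity and timestep are illustrative assumptions, not console specifics.
GRAVITY = 9.8      # m/s^2, assumed
DT = 1 / 60        # one vblank, as mentioned above

def simulate(frames, use_integers):
    pos, vel = 0.0, 0.0
    positions = []
    for _ in range(frames):
        vel += GRAVITY * DT                       # accelerate each frame
        pos += vel * DT                           # advance the position
        positions.append(int(pos) if use_integers else pos)
    return positions

smooth = simulate(60, use_integers=False)   # a new sub-metre position every frame
chunky = simulate(60, use_integers=True)    # sticks at 0 for a while, then jumps

print("distinct float positions in one second:", len(set(smooth)))
print("distinct int positions in one second:  ", sorted(set(chunky)))
```

The float version moves a little every single frame; the integer version sits still, then lurches a whole metre at a time. That lurching is exactly the jerky motion described above.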
As for FLOPS, it means "Floating Point Operations Per Second", and is a hell of a lot more related to real world performance. I think you actually got the definition right on that one. All those nice 3D Models you see on the screen are likely composed of 32 bit floating point numbers. (possibly 64 bit on GameCube).
I have no idea whether the XGPU or Flipper has a better floating point unit, but I think I will give the nod to the XGPU. It's just too bad that the GameCube's CPU has a FLOP rating of about 4X that of the poor Intel CPU.
Quote
And guess who the reviewers are quoting? YES! Factor 5! Since Factor 5 is a second party developer, paid by Nintendo to show off the better parts of GCN and to speak its praise, do we think that they can be trusted? Hmmm, how about no.
Thanks for replying at last. I hope my explanation of types of calculation met your obviously rigorous standards (sorry if the pictures of the boxes look a little odd).
Factor 5 is a 3rd party. They have made things on all 3 systems.
As for thanking me for a reply, don't do that just yet. Odds are that I simply won't feel like replying to anything further in this thread, as I simply DO NOT feel like it. This post took me a really long time to write, and I don't feel that I have gained anything. The fact still remains that in the real world, GameCube games seem to boast a higher polygon count, faster load times, and more effects than Xbox games, in spite of the higher numbers on the Xbox side.
The thing I was trying to point out is that the Xbox has a lot of high numbers, but they are only there for marketing purposes. People seem to think that more "MHz" means more power, and ignorant though it might be, I can't really blame them if they don't know better. The problem I have is that when I actually do enlighten people that maybe the Xbox isn't all it's cracked up to be, they ignore me and tell me I'm wrong, when they have considerably less practical computer knowledge than I do. Just because I am a Nintendo fan doesn't mean I am blind. Just because the Xbox's CPU has more "MHz" doesn't mean that it's faster in any way.
Anyways, I'm done. I am going to go do something. Anything.