On some games, like The Godfather, what you will see is EXTREME color banding where the gradients aren't the slightest bit smooth, especially on certain lit surfaces and particle effects. In the worst-case scenarios, things that should be gray will have a greenish hue to them, since that's the nearest color the reduced palette can represent. Capcom vs. SNK 2 EO had this problem with the sprites/polygons (which the PGC review surely failed to mention), and in Luigi's Mansion it was all over the place.
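A quick sketch of why that happens, assuming the 16-bit mode is the usual RGB565 packing (5 bits red, 6 bits green, 5 bits blue): a smooth 24-bit gradient collapses into coarse steps, and because green gets an extra bit, the nearest representable color to a pure gray is often slightly green. The function names here are mine, just for illustration:

```python
def to_rgb565_and_back(r, g, b):
    # Pack a 24-bit color into 16-bit RGB565: 5 bits red, 6 bits green, 5 bits blue.
    r5, g6, b5 = r >> 3, g >> 2, b >> 3
    # Expand back to 24-bit by replicating the top bits into the low bits
    # (a common convention; the exact hardware expansion may differ).
    return ((r5 << 3) | (r5 >> 2),
            (g6 << 2) | (g6 >> 4),
            (b5 << 3) | (b5 >> 2))

# A smooth gray ramp collapses into visible steps ("bands"):
ramp = [to_rgb565_and_back(v, v, v) for v in range(96, 104)]

# And since green has finer steps than red/blue, a pure gray can land
# on a slightly green neighbor:
print(to_rgb565_and_back(100, 100, 100))  # -> (99, 101, 99)
```

Eight adjacent input grays map onto just a couple of output values, which is exactly the staircase you see on lit surfaces and fog.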
On other games like WW, the color is generally good, but on an HDTV or a TV with really good picture output, effects like fog tend to have a scanline-like, interlaced look, as if you're only seeing half a frame of them. On a regular TV, since the display is interlaced anyway, it won't show up at all. My old TV tuner card with a deinterlacing program didn't even show this, but the HDTV exposed it like a fiend.
On the Wii there is an incredibly easy way to tell: just hit the Home button mid-game. If you see any dithering, whether lines or dots, running through the button graphic, the game is operating in 16-bit color. If you don't see them, chances are it's operating in 24-bit and looks a heck of a lot better, especially on HDTVs. Mario Party 8 was funny with this, because its menus and board-game parts were 24-bit color, but the minigames were 16.
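Those lines and dots are what ordered dithering looks like: in 16-bit mode the hardware adds a small position-dependent offset before truncating each channel, trading smooth banding for a fine pixel pattern. A minimal 2x2 Bayer sketch (my own toy version, not the actual hardware matrix) shows how a flat gray turns into a checker of two nearby values:

```python
# 2x2 Bayer threshold matrix for ordered dithering.
BAYER_2X2 = [[0, 2],
             [3, 1]]

def dither_channel_5bit(value, x, y):
    # Scale the threshold to the 8-unit step size of a 5-bit channel,
    # add it, then truncate to 5 bits and expand back to 8.
    t = BAYER_2X2[y % 2][x % 2] * 2   # offsets 0, 2, 4, 6
    v = min(255, value + t) >> 3
    return (v << 3) | (v >> 2)

# A flat gray region becomes a fine checkerboard of two nearby values,
# i.e. the dot pattern you spot in the Home button graphic:
row0 = [dither_channel_5bit(100, x, 0) for x in range(4)]
row1 = [dither_channel_5bit(100, x, 1) for x in range(4)]
print(row0)  # -> [99, 107, 99, 107]
print(row1)  # -> [107, 99, 107, 99]
```

Averaged by your eye (or a blurry TV), the checker reads as a value between the two steps, which is the whole point of dithering.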
24-bit color is on the whole better looking than 16-bit, but the thing about the GC (and apparently the Wii hardware as well) is that due to framebuffer limitations, if you're looking to do a lot of effects, 16-bit color is far more practical than 24-bit in terms of performance, though some games have done fine with 24. (SSBM, which is why I'm curious as to which way SSBB swings.)
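The framebuffer pressure is easy to see with back-of-envelope arithmetic. This assumes the commonly cited figure of roughly 2 MB for the GC's embedded framebuffer and a 640x480 render target; the exact numbers are hardware details I'm not certain of, but the ratio is the point:

```python
def color_buffer_bytes(width, height, bits_per_pixel):
    # Bytes needed for the color portion of the framebuffer.
    return width * height * bits_per_pixel // 8

# Assumed embedded framebuffer budget (~2 MB, commonly cited for the GC).
EFB_BYTES = 2 * 1024 * 1024
W, H = 640, 480

for bpp in (16, 24):
    used = color_buffer_bytes(W, H, bpp)
    print(f"{bpp}-bit color: {used:,} bytes "
          f"({EFB_BYTES - used:,} left for Z-buffer and effects)")
```

At 16-bit the color buffer takes about 600 KB versus about 900 KB at 24-bit, so dropping to 16-bit frees a sizable chunk of a very tight budget for depth, anti-aliasing, and effect passes, which would explain why effect-heavy games pick it.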