In the GameCube's case, it's a 720x480i @ 60Hz video signal over S-Video.
This delay bullsquash is totally inexcusable. My Matrox Marvel G400, a six-year-old video card running 8(?)-year-old vidcap/tuner technology, displays 60Hz interlaced feeds *just fine* in a proper video overlay stream, with no perceptible delay. The ATI All-in-Wonder cards of the same era performed just as well. What the heck has happened to TV tuner technology since then?
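For a rough sense of where the lag could come from (illustrative numbers only, not measurements of any particular card): an overlay path scans decoded fields more or less straight out to the screen, while a software capture/deinterlace/composite path tends to queue a few fields, and every queued field at 60Hz costs about 16.7 ms. A back-of-the-envelope sketch, with assumed buffer depths:

```python
# Illustrative only: rough latency estimate for a buffered capture pipeline
# at 60 fields per second. The per-stage queue depths below are assumptions,
# not measurements of any real tuner card or driver.
FIELD_RATE_HZ = 60.0
FIELD_TIME_MS = 1000.0 / FIELD_RATE_HZ  # ~16.7 ms per field

# Hypothetical queue depths (in fields) for a software display path:
stages = {
    "capture DMA buffers": 2,
    "deinterlacer": 1,
    "compositor / vsync queue": 2,
}

total_fields = sum(stages.values())
total_ms = total_fields * FIELD_TIME_MS
print(f"~{total_fields} fields buffered -> ~{total_ms:.0f} ms of added delay")
# A hardware overlay path buffers roughly 0-1 fields, i.e. ~0-17 ms.
```

Even a modest five fields of buffering lands you in the 80 ms range, which is very noticeable when you're trying to time button presses in a game.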