120Hz does weird things. Essentially, its function is to compute ghost frames in between the progressive frames of a video source, providing an eye-pleasing motion blur via an artificially high framerate.
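For illustration, here's a minimal Python sketch of the ghost-frame idea, assuming frames are numpy pixel arrays. Real sets use motion-compensated interpolation; the naive cross-fade below is just a stand-in for the concept:

```python
import numpy as np

def ghost_frames(frame_a: np.ndarray, frame_b: np.ndarray, count: int) -> list:
    """Generate `count` blended "ghost" frames between two real frames."""
    ghosts = []
    for i in range(1, count + 1):
        t = i / (count + 1)          # fractional position between the real frames
        ghosts.append((1.0 - t) * frame_a + t * frame_b)
    return ghosts

# 24fps -> 120Hz means 4 ghosts inserted between each pair of real frames
a = np.zeros((720, 1280, 3))         # dummy "frame N"
b = np.ones((720, 1280, 3))          # dummy "frame N+1"
print(len(ghost_frames(a, b, 4)))    # -> 4
```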
There were two sets in "120Hz Mode" sitting next to each other at a Best Buy: one was running HD Fast/Furious Tokyo and the other was running HD Pirates/Caribbean demo clips. Normally, film is 24Hz, which does not divide evenly into 60Hz. With "120Hz mode" active, the 24 frames get multiplied/interpolated/FUDGED into 120 frames per second, so mathematically only 1 in every 5 frames is a real, original frame. In both demos, when I watched objects travel/pan across the screen, some motion was eerily super-smooth, but not all the time, causing a visual disconnect. It was as if, for fractions of each second, the set couldn't maintain the smoothness and just jumped back to 24Hz/60Hz/regular motion. In low-speed scenes I didn't notice any smoothness increase; it looked regular.
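A quick Python sketch of that 24 -> 120 arithmetic, just counting the frame schedule over one second:

```python
SOURCE_HZ, DISPLAY_HZ = 24, 120
step = DISPLAY_HZ // SOURCE_HZ       # 120/24 = 5 output frames per source frame

# Mark each of the 120 frames shown in one second as real or fudged
schedule = ["real" if i % step == 0 else "ghost" for i in range(DISPLAY_HZ)]
print(schedule.count("real"), "real /", schedule.count("ghost"), "ghost")
# -> 24 real / 96 ghost: exactly 1 in every 5 frames is original
```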
When you increase the source framerate, the results improve slightly, since more real frames (frames with real detail, existing at the correct points in time) occupy the display.
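To put rough numbers on that, here's a quick sketch of the real-frame fraction for common source rates on a 120Hz panel:

```python
# All three common source rates divide evenly into 120Hz, so the only
# question is what fraction of displayed frames is original material.
for source_hz in (24, 30, 60):
    real_fraction = source_hz / 120
    print(f"{source_hz}fps source: {real_fraction:.0%} of displayed frames are real")
# 24fps: 20%, 30fps: 25%, 60fps: 50%
```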
HD sports do benefit, since real-life images get a more lifelike motion blur. Video games? I don't know; I'm not sure you want blur introduced into your clear, computer-generated images. Example: all HDTVs in the USA are somewhat flawed in that they force video sources to conform to a 60Hz display. RE4 in p.scan outputs 30fps, but the TV has to display 60fps (60Hz) regardless. The compromise is that every other frame is some kind of ghost/interpolated frame, and it's painfully obvious when you aim at a vertical object (a tree, column, wall corner, etc.), then pan the camera around and see the ghostly edges. 120Hz is a more advanced implementation of this concept.
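A minimal sketch of that 30fps -> 60Hz fit. The `to_60hz` helper and its 50/50 blend are my assumptions, not how any particular set actually works; the `interpolate` flag switches between the ghost-frame behavior described above and the plain frame repeat some sets use instead:

```python
import numpy as np

def to_60hz(frames_30: list, interpolate: bool) -> list:
    """Fit a 30fps frame list onto a 60Hz display schedule."""
    out = []
    # pair each frame with its successor (the last frame pairs with itself)
    for a, b in zip(frames_30, frames_30[1:] + frames_30[-1:]):
        out.append(a)                                    # the real 30fps frame
        out.append(0.5 * (a + b) if interpolate else a)  # ghost vs. plain repeat
    return out

clip = [np.full((720, 1280), float(n)) for n in range(30)]  # one second of dummy frames
print(len(to_60hz(clip, interpolate=True)))                 # -> 60
```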