I always thought bittage meant nothing, but after reading the link above, I think it proves that bittage does matter:
"New processors coming soon from Advanced Micro Devices and Apple suggest 64-bit computing will make its way to a desktop near you this year. But what does that really mean for you?
Let's put it this way: If you think today's computers are fast, wait until they make the leap from 32 bits to 64 bits. This isn't about more megahertz--it's about actually doubling the amount of data a CPU can process per clock cycle. Servers and high-end workstations have been reaping the technology's benefits for years."
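To see what "doubling the amount of data a CPU can process per clock cycle" means in practice, here's a minimal C sketch, assuming typical type sizes for 32-bit vs. 64-bit targets (exact widths depend on the compiler and ABI). The point is register width: a 64-bit value fits in one register on a 64-bit CPU, while a 32-bit CPU has to split the same arithmetic across multiple instructions.

    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        /* On a typical 32-bit target, pointers and long are 4 bytes;
           on a typical 64-bit target they are 8 bytes. */
        printf("pointer size:  %zu bytes\n", sizeof(void *));
        printf("long size:     %zu bytes\n", sizeof(long));
        printf("uint64_t size: %zu bytes\n", sizeof(uint64_t));

        /* A 64-bit addition: one register-wide operation on a 64-bit CPU,
           but two 32-bit adds (with carry) on a 32-bit CPU. */
        uint64_t a = 0xFFFFFFFFULL;  /* too big for a single 32-bit register */
        uint64_t b = 1;
        printf("a + b = %llu\n", (unsigned long long)(a + b));
        return 0;
    }

Compiled for a 64-bit machine, this prints 8-byte pointers and does the addition in a single register-wide operation; compiled for a 32-bit target, it prints 4-byte pointers and the same addition takes a pair of 32-bit instructions.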
"Game makers--traditionally among the first to make use of new technology--see clear advantages to 64-bit computing.
That extra speed will let programmers add remarkable detail to their software, says Tim Sweeney, founder and lead programmer at Epic Games, maker of the popular Unreal game franchise.
"You'll see better textures, more realistic sounds, and larger and more realistic environments," Sweeney adds.
Plus, the characters themselves will be rendered with dramatically more detail. You'll see more realistic representation of features such as hair, skin, and eyes. And the computer-run characters will have more realistic artificial intelligence, he says."