tldr warning
Quote
Originally posted by: Smash_Brother
No, because "emulated" implies that the application being run is being sustained through virtual fabrication of the resources which the application utilizes so that it can operate. There's a colossal difference between an IBM G5 processor emulating an Intel processor and having an ACTUAL Intel processor. Emulation is slow, sh*tty and unreliable.
I'm referring to the latter where the Windows environment is running natively off of all the same hardware it would typically have in a PC anyway but is still contained within OSX. No "emulation" actually occurs.
You're waving a lot of hands around in here. Let me give you a heads-up on operating systems - I might be a bit rusty, as it's been a couple of years since I studied this.
When you boot up, a segment of code called the kernel - the foundation of any operating system - is loaded into the computer's main memory. This becomes the interface between the hardware and the programs run by the user. The kernel allows programs to access parts of the hardware, such as the processor or a network interface, but only through a series of system calls. These system calls differ from kernel to kernel depending on the designer - Windows, Mac, Linux, etc. do not have the same kernel. There are some similarities, but you cannot simply drop in a new kernel and expect everything to work correctly.
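As a rough sketch of that boundary, here's what asking the kernel to do something looks like from userspace. This is a Linux example, because its system call interface is publicly documented - the call numbers and conventions differ on every other kernel, which is exactly the point:

```c
#define _GNU_SOURCE      /* glibc requires this for syscall() */
#include <unistd.h>      /* syscall() wrapper */
#include <sys/syscall.h> /* SYS_write - the kernel's number for this call */

int main(void)
{
    const char msg[] = "hello from userspace\n";
    /* The program never drives the display hardware itself; it hands
     * the request to the kernel through system call number SYS_write. */
    syscall(SYS_write, 1 /* stdout */, msg, sizeof msg - 1);
    return 0;
}
```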
Now, if someone wants to run a different kind of operating system, they have two choices. One is to reboot and load the other kernel into memory. The other is to run the second operating system within the current one. This isn't as easy as you'd think, even on the same computer architecture.
Software (let's call it the environment manager) has to be used to convert the kernel calls of the new operating system into system calls that the original operating system can execute. This software never has direct access to the hardware, as it runs as a userspace program that talks to the original operating system. Unless Apple is keen to open up the OS X kernel to system calls from userspace programs, this is the way all emulation/virtualisation programs will access the OS.
In this case, if you were running a Windows program you would need a Windows installation (with the libraries and kernel that Windows applications interface with). The program makes calls to the Windows OS; the operating system then "thinks" it is talking to the real hardware, but it is actually tricked into talking to the environment manager. The environment manager takes the Windows system calls and maps them to the relevant OS X system calls, which the OS X operating system then executes.
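Conceptually, one of those mappings looks something like this. The function name WinWriteFile is made up for illustration (the real Win32 call has a different signature), but projects like Wine do this kind of translation for the actual Windows API, entirely in userspace:

```c
#include <stdbool.h>
#include <unistd.h> /* write() - the host OS's own system call */

/* The guest application believes it is calling into Windows... */
bool WinWriteFile(int handle, const void *buf, unsigned len,
                  unsigned *written)
{
    /* ...but the environment manager services the request with the
     * equivalent host call. No Windows kernel is ever involved. */
    ssize_t n = write(handle, buf, len);
    if (n < 0)
        return false;
    *written = (unsigned)n;
    return true;
}
```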
Nowhere along the line is this process "natively executed", because any other operating system has to be run within a software environment in order for its system calls to be handled correctly. That's why it's emulation - and it was "slow, sh*tty and unreliable" in the past because of the processing power required to do it. These days, the software and hardware are more suited to the task.
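To see where that cost came from when whole processors had to be emulated, here's a toy fetch-decode-execute loop in C. The instruction set is invented for illustration, but the shape is the same in a real interpreter: every guest instruction costs a fetch, a decode and a dispatch on the host before any useful work happens.

```c
#include <stdio.h>
#include <stdint.h>

enum { OP_HALT, OP_ADD, OP_PRINT }; /* invented guest instruction set */

int main(void)
{
    /* Guest "program": add 2, add 3, print the accumulator (5), halt. */
    uint8_t code[] = { OP_ADD, 2, OP_ADD, 3, OP_PRINT, OP_HALT };
    int acc = 0;

    for (size_t pc = 0; ; ) {
        uint8_t op = code[pc++];            /* fetch */
        switch (op) {                       /* decode + dispatch */
        case OP_ADD:   acc += code[pc++]; break;
        case OP_PRINT: printf("%d\n", acc); break;
        case OP_HALT:  return 0;
        }
    }
}
```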
Virtualisation, on the other hand, is a broad term which involves producing "customised hardware configurations" on top of the original hardware. Different architectures can be run on the same system, with an operating system assigned to each configuration. The configurations are managed by software, which then talks to the original operating system in order to execute code and access the hardware.
Feel free to correct me on this, but I don't see Apple pulling any rabbits out of their hat regarding running Windows code natively. And that's because it's not a hardware issue, it's an operating system issue. They're already including Boot Camp with Leopard, so I don't see them making huge steps without negotiating a lot of technical and legal issues.
Quote
Well, I was talking about the general perception of Vista being...hard to run? I'd get it if my laptop had more than half the bare minimum of RAM it needed.
Having used Vista on and off for the last two months, I will say that it is a disappointment. They've taken what was at its core the most popular Windows OS (XP) and turned it into an annoying experience. From a power user's perspective, I feel computer-challenged when I use it.
It does things that I don't want it to do. Like setting the home page in IE7 to some MSN site. I swear it changes itself at random intervals.
It does things that I have no control over. The default shutdown is not a true shutdown but a terrible excuse for hibernation - and even that takes forever. The screen just goes black, then it waits a random period of time, checks if the hard drive is still working, and waits a bit more. And this was hyped as a real feature of Vista - fast startup and shutdown - when in essence it does neither.
And when I want to do something, it feels like I'm fighting the OS to get there. User Account Control is a nuisance, and even when I switch it off there is still a popup every time I start Vista telling me that fact.
And when I want to find out information that is important to administrators (like how long a disk defragment will take), it tells me "This could take minutes or hours." No numbers, no little graphic of colours moving around the hard drive space. Nothing. Not good enough.
I believe there was so much focus on minor things like visual effects, and on keeping information away from users, that they've made a system aimed at people who don't wish to be in control of their computer. Just sit back and enjoy the ride, sponsored by Microsoft. Oooh look, pretty colour schemes.
I will be replacing it with XP when I get a couple of hours free. Unless they change the OS significantly, with a real focus on usability rather than "shiny things", I won't be trying it again.