The one thing that somewhat confuses me is that I can ping-of-death it even when the firewall is on, but I hope that this will be sorted by RTM. The auto-firewalling on install is good, as is the way that it asks whether a network is public or private when you connect (public turns on the firewall). IE has also been loosened in the default config to something that uses ActiveX etc. by default, unlike 2003. I do, however, worry that there are so many of these confirmation prompts (I got one for removing an icon from the desktop) that users will merely be trained to click Yes all the time. The default user is still admin, but with the requirement to click a confirmation before anything is done that would require admin privileges.

Speed-wise it is noticeably more demanding than XP: it takes up about 30% of the CPU time of an XP 2500+ at the desktop, but peaks semi-randomly to 100% for 20 seconds or so. I'm not a Windows user by choice, but I thought that 2003 was approaching something I could recommend for some server tasks; it's nicely locked down. I don't know if it will make November, but I hope not.

You can do the ping of death against it quite reliably – I can make it crash by ping-flooding a VMware image. The crashes mainly seem to come from intensive-for-the-desktop (well, 1–2 megabit) use of the network stack. To be honest, I was somewhat surprised at this, as the code was refactored to work from the Server 2003 codebase, something that is fast, stable, and a close second to Windows 2000 as the best thing to come out of Redmond. With the CPU occasionally pegging at 100% with *no apps open*, and a crash about twice daily, I feel that I can't recommend this for a daily-use system.

What is the point of this "reservation" if the OS is prepared to give it up at first request? How is this different from simply allocating more memory as it is needed?

As someone who is trying a build from not two weeks ago, let me tell you that this is not like the Whistler betas at all.
If other applications need the memory, the memory will be allocated to the process that needs it; that is why Windows and all other OSes (Linux, Solaris, and so on) have memory management where the kernel controls the resources itself. Whether or not this memory is actually used by the OS is, well, irrelevant. That is the point of reserving something: so it would not be available to others.

We've had previews out the wazoo on the UI; how about something a bit deeper?

Could be good to know that 700 MB for Firefox is a lot of cache memory: memory that is not used, but cached and ready for the application to use. The same goes for Windows; it takes up a larger portion of memory so as to be ready if it is needed, which does not mean that it uses all 700 MB at once.

At the risk of sounding stupid, I'll say the following: if an OS (or an application) "reserves" a certain amount of memory, that memory is not available to other applications.

There are a lot of changes in here, but we see the same stuff. Unless something changes with WMP or Photo Gallery, we've all seen them. The only questions they answered concerned UAC:

Does MS realize this and is it working toward it? Yes.

The MS WinHEC site provides good information on Boot Configuration Data (Vista + EFI/BIOS, bootloaders – a very interesting read), XPS and Vista's printing, kernel-mode enhancements (the new memory management and DLL handling/updating were interesting), audio effects, Longhorn Server, drivers, WinFX, etc.

Did anyone know they had a new graphics format to compete with TIFF and JPEG? I didn't until I read some of the XPS info.

Basically, meat is available, but the news sites are only touching on fluff. I realize reviewers are throwing something out and haven't had time to really dig in yet, but this is just page fodder. None of the previews (PC Mag, PC World, etc.) add new info to what's available. Home Basic doesn't have Aero; Home Premium does.