Agreed. Windows 2000 Server through Windows 7 were peak Microsoft operating systems.
By Windows 2000 Server, they finally had the architecture right and had flushed out most of the 16-bit legacy.
The big win with Windows 7 was that they finally figured out how to make it stop crashing. There were two big fixes. First, the Static Driver Verifier, which verified that kernel drivers couldn't crash the rest of the kernel. It was the first large-scale application of proof-of-correctness technology. Drivers could still fail, but they couldn't overwrite other parts of the kernel. This put a huge dent in driver-induced crashes.
Second was a dump classifier. Early machine learning. When the system crashed, a dump was sent to Microsoft. The classifier tried to bring similar dumps together, so one developer got a big collection of similar crashes. When you have hundreds of dumps of the same bug, locating the bug gets much easier.
Between both of those, the Blue Screen of Death mostly disappeared.
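(Illustrative aside: the core idea behind that kind of dump classifier is just signature-based bucketing, so hundreds of dumps of the same bug land on one developer's desk. Below is a minimal Python sketch of the concept, not Microsoft's actual classifier; the dump IDs, frame names, and signature depth are all made up for illustration.)

    from collections import defaultdict

    def bucket_key(frames, depth=3):
        # Use the top few stack frames as a crude crash signature.
        return tuple(frames[:depth])

    def bucket_dumps(dumps):
        # dumps: iterable of (dump_id, [frame, ...]) pairs.
        # Returns {signature: [dump_id, ...]} so similar crashes land together.
        buckets = defaultdict(list)
        for dump_id, frames in dumps:
            buckets[bucket_key(frames)].append(dump_id)
        return buckets

    # Hypothetical dumps: two from the same driver bug, one unrelated.
    dumps = [
        ("dump-001", ["baddrv.sys!DispatchWrite", "nt!IofCallDriver", "nt!NtWriteFile"]),
        ("dump-002", ["baddrv.sys!DispatchWrite", "nt!IofCallDriver", "nt!NtWriteFile"]),
        ("dump-003", ["other.sys!Isr", "nt!KiInterruptDispatch"]),
    ]
    for sig, ids in bucket_dumps(dumps).items():
        print(len(ids), "crash(es) starting at", sig[0])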
I agree, with one big exception: the refocus on COM as the main Windows API delivery mechanism.
It is great as an idea; the pity is that Microsoft keeps failing to deliver developer tooling that actually makes COM fun to use, instead of something we have to endure.
From the pages-long OLE 1.0 infrastructure in 16-bit Windows, via ActiveX, OCX, MFC, ATL, WTL, .NET (RCW/CCW), WinRT with .NET Native and C++/CX, to C++/WinRT, WIL, nano-COM, .NET 5+ COM, ...
Not only do they keep rebooting how to approach COM development; in terms of Visual Studio tooling, each reboot is worse than the last, never at feature parity with what it replaced, only to be dropped once the team's KPIs change focus.
Given how much the Hilo demo for Windows Vista and later Windows 7 developers emphasized being back on COM, after how Longhorn went down, better tooling would have been expected.
Drivers can crash the rest of the kernel in Windows 7. People playing games during the Windows 7 days should remember plenty of blue screens citing either graphics drivers (mainly for ATI/AMD graphics) or their kernel anticheat software. Second, a “proof of correctness” has never been made for any kernel. Even the seL4 guys do not call their proof a proof of correctness.
Kernel drivers have to be verified by the driver verifier to pass Windows Hardware Qualification Labs certification and get signed with the Windows signing key that lets them load without warnings. There are fewer outside kernel drivers today, though, because plugging random peripheral cards into PC buses is no longer a big thing.
This is true for certification, which is mandatory for Server OS, distributing through Windows Update, or certain classes of drivers such as anti-malware or biometric authentication, but you can still submit drivers to Microsoft for "attestation signing" that will load without warnings on desktop OS without having to run them through the testing suite.
In any case, running the certification tests does not provide runtime protection for drivers running in kernel mode, as demonstrated by CrowdStrike. Only Windows 10 started introducing hardware virtualization-based isolation of kernel components (to provide isolation of security subsystems, not runtime checks to prevent crashes): https://learn.microsoft.com/en-us/windows-hardware/design/de...
Yet drivers that have passed Windows Hardware Qualification Labs certification have had blue screens. Also, Microsoft hands out Windows kernel driver signing keys to anyone who pays them. You don't need to have a driver go through the Windows Hardware Qualification Labs to be able to sign it with a key signed by Microsoft.
It does not prove that the driver will not crash the kernel. It should be fairly easy to find a driver that passed QA testing under that tool, yet still crashed the kernel. You just need one of the many driver developers for Microsoft Windows to admit to having used that tool and fixed a crash bug that it missed, and you have an example. Likely all developers who have used that tool can confirm that they have had bugs that the tool missed.
The original ribbon sucked but with the improvements it's hard to say it's generally a bad choice.
The ribbon is a great fit for Office style apps with their large number of buttons and options.
Especially after they added the ability to minimize, expand on hover, or keep expanded (originally this was the only option), the ribbon has been a great addition.
But then they also had to go ahead and dump it in places where it had no reason to be, such as Windows Explorer.
> The ribbon is a great fit for Office style apps with their large number of buttons and options.
To me this is the exact use case where it fails. I find it way harder to parse as it's visually intense (tons of icons, buttons of various sizes, those little arrows that are sometimes in group corners...).
Office 2003 had menus that were at most 20-25 entries long with icons that were just the right size to hint what the entries are about, yet not get in the way. The ribbon in Office 2007 (Word, for example) has several tabs full of icons stretching the entire window width or even more. Mnemonics were also made impractical as they dynamically bind to the buttons of the currently visible tab instead of the actions themselves.
Close to 20 years later, people still complain about the ribbon. (1)
I think that says something about it.
--
1. And not just "grumble, grumble... get off my lawn..." Many of its controls are at best obscure. It hides many of them away. It makes them awkward to reach.
Many new users seem as clueless as, or even more clueless than, pre-existing customers who experienced the rug pull. At least pre-ribbon users knew there was certain functionality that they just wanted to find.
(And I still remember how MS concurrently f-cked with Excel shortcut keys. Or seemed to have, when I next picked Excel up after a couple year hiatus from being a power user.)
I know nothing of your objections, so this is more about how I think of mine and how they relate to these kinds of changes.
Being a power user is difficult. I think the best way to do software is to make it APL-complicated and only educate one guy in it. The way power users in Excel/Emacs/accounting software outperform user-friendly stuff is amazing. But some things are meant for the masses, e.g. opening a file.
Dumbing down, or "magicification", of interfaces was needed for many other reasons. Gnome and the Ribbon were necessary changes IMO; what we had was never going to improve. Of course I wish there were elements that could be reused elsewhere, but that is a pipe dream of Smalltalk proportions.
I am now stuck with windows at work, and it is a horrible experience. Everything is so needlessly complicated. In the same way Linux is. I do believe Gnome did manage to improve things, at least when I look at children using Mac, Linux and Windows as power users. My view is that the complexity of Linux is still a little bit easier to understand, but that is just because of a long history and easy abstractions.
I think core objections are often not compatible with products that need to fit and be produced for many people. I build software that is used once by many; this has changed my view of GUIs forever, especially in regards to desktops.
For me, peak UX was before the Ribbon: just menus and customizable toolbars. I didn't need anything more to be productive. Nowadays I can hardly use the Office suite; its feature discoverability is essentially zero for me.
> I never understood the issue with the ribbon UI. Especially for Office it was great, so much easier to find stuff.
1. I don't need to find stuff.
I knew where stuff is.
2. I read text. I only need menus. I don't need toolbars etc. and so I turn them all off.
I cannot read icons. I have to guess. It's like searching for 3 things I need in an unfamiliar supermarket.
3. Menus are very space efficient.
Ribbons hog precious vertical space. This is doubly disastrous on widescreens.
4. I am a keyboard user.
I use keys to navigate menus. It's much faster than aiming at targets with the mouse and I don't need to look. The navigation keys don't work any more.
Ribbons help those who don't know what they are doing and do not care about speed and efficiency.
They punish experts who do know, don't search, don't hunt, and customise themselves and their apps for speed and efficient use of time and screen space.
> They punish experts who do know, don't search, don't hunt, and customise themselves and their apps for speed and efficient use of time and screen space.
The problem is, most users are utterly braindead; they barely manage to type at speed instead of pecking at single keys. The astonishment I've gotten in some places for literally nothing more than Ctrl+C/Ctrl+V is more than enough proof.
That's also IMHO a large portion of why Linux never really took off on the desktop. UX/UI people are rare enough to begin with, most of them don't work on FOSS in their free time, and so development is primarily done by nerds for nerds. That's great if you already know something about the application, but usually the learning curve is so steep that most users give up in frustration. And documentation is either nonexistent, incomplete, or horribly outdated, and StackOverflow etc. are even worse.
The exception is Blender. They got some serious money IIRC, cleaned up their act, and now there's a headline of some movie or game using Blender every few weeks.
The sad thing is that Windows has a great keyboard UI and it's superbly accessible for people with visual and motor disabilities.
Who have reduced earning opportunities because they are disabled, so FOSS should be great for them, but it isn't, because the nerds don't know CUA and don't know the keyboard UI. They spend their time mastering a couple of ancient apps like Vi and Emacs and ignore the fiery furnace of UI R&D that followed in the 20 years after those early efforts.
Learn Windows' keyboard UI and you can drive the whole OS and all its apps with the speed of a genius Vim user with 20 years' practice. It makes Emacs look like a wet paper pad and a burned stick compared to a Moleskine notebook and a top quality fountain pen.
Xfce comes close and implements maybe 75% of the UI but once you are in an app all bets are off.
> Learn Windows' keyboard UI and you can drive the whole OS and all its apps with the speed of a genius Vim user
Do you have a reference for this? I've often needed to control Windows using only a keyboard and failed to do so. I'm aware of most shortcuts in this list[1] but these are for a few very specific things. (As an aside, I also remember controlling the mouse with the numpad using the Mouse Keys accessibility setting but this is worse than both keyboard shortcuts and the mouse.)
Look for underlined single letters in menus. With apps that use the "classic" style menus instead of ribbons or plain Electron crap, the single letters are the key.
I'm curious to know if this is what lproven meant in their comment above. Alt + a-z to access menu items is available in every OS and all "native" apps, but you can't "drive the OS and all apps" this way.
For example, I would like to set options that are a few menus/button clicks deep in the Windows control panel (either the "classic" or new variant) using keyboard shortcuts/navigation. Or navigate the Windows registry editor. I'm not aware of a way to do this.
No, it's not in all native apps. KDE reinvents its own set of keystrokes, for instance, and half the KDE apps have no menu bars any more... And there's no global way to force them either.
Yes, the control panel and RegEdit are totally keyboard controllable.
You can literally just unplug the mouse from a Windows desktop and it remains totally 100% operable.
Some apps may not, because the developers didn't do their jobs right, but the OS is.
Windows actually had a decent built-in manual system with CHM, tooltips and whatnot. Even games could and did use it, like EarthSiege 2.
Back in the days when application developers stuck to the Windows-provided widgets instead of doing their own UI, it was wonderful. Symbols were consistent across applications, as were color schemes (IIRC, if you wrote your CSS correctly, Internet Explorer would pass these on to websites!) and behavior.
> And documentation is either nonexistent, incomplete, or horribly outdated, and StackOverflow etc. are even worse.
Or the documentation is very complete, but only useful if you read and comprehend it in its entirety. Open source devs need to understand that not everyone using their software wants to become an expert in it. They just want to get a task done and the software is facilitating completing that task. That is something totally normal and those users should not be thought of as less important than the power users.
On a Mac, that's fine. On Windows, it's not, because then I can't control the app any more.
I have been using Word since version 4 on DOS and version 5 on Classic MacOS. On Windows, I used WinWord 1, 2, 6, 95, 97, 2000, XP and 2003... then 4 years later MS ripped out the UI I knew backwards and had known for about 16 years, since 1991, and replaced it with one inferior in every way for me.
I'm not denying it might be better for others but for me it's now a waste of disk space.
The old versions do all I need, so I keep them. For everything except Word, there is LibreOffice.
But LibreOffice Writer has no outline mode, and I am a writer: that is THE killer function of Word for me.
So, Word 97 under WINE on Linux and Word 2003 when I have to use Win10 or -- shudder -- Win11.
My big problem with it is that it’s stateful. A menu or toolbar admits muscle memory - since you get used to where a certain button or option is and you can find it easily. With ribbons you need to know if you’re in the right submenu first.
Though personally, I’m increasingly delighted by the Quicksilver-style palette/action tools that vscode and IntelliJ use for infrequently used options. Just hit the hotkey and type, and the option you want appears under the enter key.
Your monitors, those of a well-off power user, may have become larger. Most regular users I've seen are on 15" laptops with screens at 1366×768, or (if they're lucky) 1920×1080 with scaling at 1.25× or so. 17" desktop monitors used to be commonplace about 20 years ago.
The slightly larger screen real estate (if any) is more than wasted by very inefficient "modern UIs" where you won't find paddings smaller than 16px, with three buttons where there used to be enough space for 9.
I don't know quite when it started to happen, but changing and/or eliminating the default Office keyboard shortcuts in the last few iterations has really irked me.
People hated it because it was all over the place. Change this or that setting? UAC. Install anything? UAC. Then you'd get a virus in a software installer, confirm the UAC as usual, and it wouldn't stop a thing.
No, in XP you were essentially logged in as root 24/7 (assuming it was your machine), and any program -- including your browser -- was running as root too. I remember watching a talk about how stupidly easy it was to write rootkits for XP. "Drive-by viruses" were a thing, where a website could literally install a rootkit on your machine just by visiting it (usually taking advantage of some exploit in flash, java, or adobe reader). Vista flipped it, by disabling the admin account, so that in order to do something as admin you needed to "sudo" first. That alone put a stop to tons of viruses.
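(To make the "sudo first" comparison concrete: since Vista, a process that needs admin rights is expected to check its token and explicitly request elevation, which is what triggers the UAC prompt. A minimal Python sketch of that common pattern follows; IsUserAnAdmin and ShellExecuteW are real Win32 calls, but the rest is simplified for illustration and skips details like proper argument quoting.)

    # Minimal UAC elevation sketch (Windows only): check for an admin token,
    # and if we don't have one, re-launch this script via the "runas" verb,
    # which pops the UAC consent prompt introduced in Vista.
    import ctypes
    import sys

    def is_admin():
        try:
            return bool(ctypes.windll.shell32.IsUserAnAdmin())
        except AttributeError:   # ctypes.windll missing: not on Windows
            return False

    if __name__ == "__main__":
        if is_admin():
            print("Running elevated; admin-only work would go here.")
        else:
            # ShellExecuteW(hwnd, verb, file, params, workdir, showcmd)
            ctypes.windll.shell32.ShellExecuteW(
                None, "runas", sys.executable, " ".join(sys.argv), None, 1
            )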
I used to work in the security team at a financial institution that was still running XP until around 2017.
We got to a point around 2015 where drive-by exploit kit developers just weren't targeting XP and IE8 anymore. Phishing landing pages would roll through all the payloads they had and silently exit.
> It is more of a warning than an actual security mechanism though. Similar to Mark of the Web.
It's both a warning and an actual security mechanism.
Obviously its most visible form is triggered when an application tries to write to system-level settings or important parts of the filesystem, and also when various heuristics decide that the application is likely to want to do so (IIRC "setup.exe" and "install.exe" at the root of a removable disk are assumed to need elevation).
Because Microsoft knew that a lot of older software wrote to system areas just because it predated Windows being a multi-user system UAC also provided a partial sandboxing mechanism where writes to these areas could be redirected to user-specific folders.
The warning was also a tool in itself: the fact that it annoyed users finally provided the right kick in the ass to lazy software developers who had no need to be writing to privileged areas of the system and could easily have run under a limited user, but hadn't bothered, because most non-corporate NT users were owners and thus admins, and most corporate environments would just accept "make users local admin". A portion of the reason we saw UAC prompts a lot less in later versions of Windows is that Microsoft tweaked some things to make certain settings per-user and reorganized certain dialogs so unprivileged settings could be accessed without escalation, but a lot of it is because applications that had been doing it wrong for as long as NT had existed finally got around to changing their default paths.
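(To make the write-redirection part concrete: when an old, unmanifested 32-bit app writes under, say, Program Files without elevation, the write is silently sent to a per-user VirtualStore folder instead. The small Python sketch below only computes and checks the two candidate locations; the "OldVendor\OldApp" path is hypothetical, and it assumes the usual LOCALAPPDATA layout.)

    # Given a path a legacy (unmanifested) 32-bit app *thinks* it wrote to,
    # show where UAC file virtualization would actually have put the file.
    import os
    from pathlib import Path

    def virtualstore_equivalent(real_path):
        p = Path(real_path)
        local = Path(os.environ["LOCALAPPDATA"])   # assumed to be set, as on stock Windows
        # VirtualStore mirrors the path relative to the drive root, e.g.
        # C:\Program Files\OldApp\x.ini ->
        # %LOCALAPPDATA%\VirtualStore\Program Files\OldApp\x.ini
        return local / "VirtualStore" / Path(*p.parts[1:])

    target = r"C:\Program Files\OldVendor\OldApp\settings.ini"   # hypothetical path
    for candidate in (Path(target), virtualstore_equivalent(target)):
        print(candidate, "-", "exists" if candidate.exists() else "missing")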
> The big win with Windows 7 was that they finally figured out how to make it stop crashing.
Changing the default system setting so the system automatically rebooted itself (instead of displaying the BSOD until manually rebooted) was the reason users no longer saw the BSOD.
Well, the Crowdstrike driver isn't (wasn't?) static. It loaded a file that Crowdstrike changed with an update.
Most drivers pass through rigorous verification on every change. But Crowdstrike is (was?) allowed to change their driver whenever they want by designing it to load a file.
I'm all for anti-trust and anti-monopoly but christ alive an operating system vendor gatekeeping their kernel is literally the whole point of being an operating system vendor. Braindead regulation.
Only because OP didn't give the full story. Microsoft wanted to close direct access to the kernel. AV companies complained to regulators in the EU. The EU asked Microsoft if they were willing to maintain access to replacement functionality and to stick to using that functionality for its own separately sold AV products. Microsoft said no, and instead of fighting, just let Windows wither on the vine with full kernel access for all the bozos. Crowdstrike was inevitable.
The issue isn’t with the gatekeeping per se. The issue is that Windows Defender, a competing AV, gets full access while third parties would not. This would leave them at a competitive disadvantage.
No, braindead take. The purpose of being an operating system vendor is to sell an operating system. If someone else modifies your operating system after they buy it, they get to keep both pieces. You don't get to stop them from modifying the thing they bought.
Do you like nanny states? How about nanny corporations?
This particular case is weird because crowdstrike is complianceware.
So, it’s more like “you don’t get to improve your product if doing so would also stop random companies from forcing your customers to break the stuff you sold to them”
A modern reimagining of Windows 2000's UI - professional, simple, uncluttered, focused, no cheapening of the whole experience with adverts in a thinly-veiled attempt to funnel you into Bing - with modern underpinnings and features such as WSL2 would have me running back towards Microsoft with open arms and cheque book in hand.
I’ve been watching ReactOS development for years and progress is slow but steady. I’m excited for the point where it will be fully usable as a drop-in replacement for old Windows software.
There are Linux distros that meet your description (no need for WSL2 either!). I am guessing you're not running towards them with open arms and cheque book in hand ... or maybe you already ran to Linux and are just nostalgic about going back to Microsoft ... ?
Linux UIs can’t even align fonts correctly within the elements.
It is miles away from the original, and you can immediately see it’s Linux because things don’t quite line up. Huge difference in quality and attention to detail, and the entire interface becomes unpleasant to look at.
Also, Linux power management and the lack of hibernation mean it’s useless on laptops
I do not know what kind of Linux UI you have seen, but the problem you mention is certainly not universal.
I have never seen it, but it may exist, because there are many kinds of Linux UI that I do not use, e.g. Gnome.
That said, I have seen many Linux GUI applications that are ugly, at least by default, but many of them can be reconfigured to be beautiful enough.
I have never been content with the default appearance of any Linux distribution, but the good ones can be customized to look completely different and better than Windows, especially if you replace all default fonts with some high-quality fonts.
Out of curiosity, when was the last time you used Linux on a laptop platform? Anecdotally, it's come a long way since 5 years ago - daily driving Ubuntu 24.04 on my Thinkpad, and I can get 8 hours of use (engineering workload) in a day. It's not ARM level of performance, but far from "useless".
That's "It works on my machine". Especially with ThinkPads, Dell XPS, and other laptops usually used by the Linux folks. Try it on a random cheap HP and you might not even have sound working (I've gone through this a few months ago). You can sometimes easily fix it through the terminal, but then we get into the debate of whether a normal user would be able to do that.
Well, how many "it works on my machine" does it take to make it a general statement? It works on all of my machines, from dirt-cheap lenovo EDU series ThinkPad to portable workstation Dell M6800 and Pixel Chromebook in between.
It generally doesn't work for people buying computers from vendors who use hardware where the manufacturer doesn't disclose the documentation. Just don't give money to those who seek to prevent free software.
Solid advice, but choosing good hw for the rig is already a challenge. Tick the virtual "linux" checkbox and you often get an empty list. That is, if you have that checkbox, in any sense. How the hell should I know if a mobo/laptop is supported? Googling "<model> linux issue" always yields hundreds of threads regardless. Even when <model> is Thinkpad: https://www.google.com/search?q=linux+thinkpad+issue
I mean, yeah? But it’s far from the seamless experience of macOS or windows. On my desktop pc:
- My wireless card isn’t detected
- I’m using Linux Mint, which means I’m still on X11. Some software doesn’t support X as well as Wayland. Some only supports X, I guess?
- I use DaVinci Resolve - which has a native Linux install. But I need to use some weird tool to convert it to a .deb to install and run it. It doesn’t have a window bar - so the only way I can change the size of the window is by right-clicking it in the task bar
- My two monitors have different DPI - so I need to use window scaling. This confuses IntelliJ - which made all the text super tiny for some reason. I have a DPI override for that in a weird Java config file.
- I want consistent copy / paste shortcuts. I can’t use ctrl+C in terminal because that’s SIGINT. So I have it set to meta+C. But I can’t bind meta+C in IntelliJ because of Java limitations. So my copy/paste shortcut is just different in different apps now.
- Smooth scrolling is still an inconsistent mess between different programs. Particularly Firefox.
I’ve also been running into problems where my second monitor won’t turn on after I resume the computer from sleep. But apparently that’s a bug that affects windows as well when using recent nvidia drivers, so that isn’t Linux’s fault.
I’m not saying it’s bad. It mostly works great! I love my workstation, and I’m enjoying distancing myself from Apple’s increasingly buggy software stack. But it’s far from perfect.
I’m happy enough to use Linux despite all its warts. But when my parents ask for a new computer, I recommend macOS or windows.
Plenty of distros/skins get it 99% of the way there for a similar looking screenshot but only 25% of the way there for the actual user interface experience. ReactOS is probably the closest (in terms of going down a holistic user interface approach) but saying it's 25% the way there to being a finished solution would be generous.
While DEs often emulate the look of macOS or Windows, they always get the feel wrong. You can put a global menu bar, Dock, etc into KDE, but ultimately it still acts like KDE and nothing like macOS.
It's not like macOS or Windows is the pinnacle of UI. I'm on Fedora Silverblue, and it's so relaxing to not deal with the usual ['Yes', 'Not Now'] prompt on notifications. Or have your computer suddenly become unresponsive because of random scans you can't disable.
It's not like KDE is either. In windows I have never thought about carefully moving the cursor through start menu. In KDE it's one wrong move and you're in a different section, because in 25 or however many years they haven't figured out hover activation delay.
> In windows I have never thought about carefully moving the cursor through start menu
Well, Windows 11 got rid of the start menu. To get to it you have to click an obscure button.
In Windows, you have to carefully think about how you move the cursor at the edges of the window because Some Idiot thought it was a good idea to make the window border 1 pixel wide. Even on (Q)UHD monitors.
The internet says the borders are 1px since w10, but the actual resize handle area on my pc is much thicker and is adjustable, cause I remember changing it.
Windows surely has its quirks in dumb places. But what the Linux desktop has achieved in the last 10-15 years has been driven by a bunch that simply shits on its users and doesn't care for years afterwards. I went back to Windows at Xfce 4.6, which broke all my effing menus and told me to gfm. I kept trying biannually and have seen it getting worse and worse, in the stupidest places. Like, they have to be from really special demographics to do some of that.
Windows Whistler (XP Beta) had an interesting theme that was like a bit modernized Windows 2000. Small non-rounded title bars, non-obnoxious taskbar, etc. Too bad they never finished it and offered a stable version for Windows XP users.
Adding to this
If you're willing to go third party
Everything can give you instant search, and with a PowerToys plugin you can integrate it into PowerToys Run, which gives you an Alfred Style search bar
WizTree works for visual inspection of Storage
Screenshot apps and Markdown viewers are common enough; I won't comment further on those
On the printer disconnections:
I've had some weird experiences. Recently, a technician showed me that, using the default Windows Update driver, my work printer regularly disconnected, but using the manufacturer's driver, the setup has so far worked without a problem
My main gripe for work laptop is that Windows 11 is dog slow. I think they have rewritten Explorer but not for the better. Word is also driving me nuts. The formatting does a ton of weird stuff that's totally unpredictable. Outlook has this weird flat UI where it's hard to tell what is a button and what isn't. Search has been broken for a long time.
Both Windows 11 and modern macOS are slow as shit now. The other day I clicked the Notifications settings in the outhouse they call a Settings app on my Mac and it took a solid 3 seconds to render the UI.
And behind me was a G4 Cube that could open the System Preferences app off of a spinning hard drive in less time.
Seriously though I think Microsoft has mostly given up on the B2C market. They have good capture of B2B with hardware and software. Why make great products when you can make mediocre products that people have no choice but to use?
> Runs Windows update and reboots without my permission
This might be an unpopular opinion but I'm actually glad they do this by default now (you can turn it off). My understanding is that MS was continually getting blamed for users getting viruses because they would never update their system, so in the best interest of the users they decided to force it.
I know a lot of people will still disagree with me, but I think if you were in their situation and you were getting tired of not only end-users but also world governments blaming you for things your users did (or did not do)... you'd probably want to control that a little more too, for both your sakes.
In the end it will hurt MS's reputation for being a broken mess even if it's 100% the users' fault for not updating, so I absolutely get it. And yes I know there's plenty of other things you can blame them for, I'm not saying this is their only issue.
It does... "Your PC will restart in 2 days to finish installing important updates". I believe it has been doing it since Windows 8. Of course you can always restart manually any time before then to apply the updates immediately. I think it's a good compromise.
I agree with these. Here are some third party tools that can help with some of the gripes though:
> - No instant search (macOS has had it for how many years now?)
Everything search somehow does instant search across the entire file system. It is the first thing I install when I get a new computer, cannot stress enough how much time this has saved me:
https://www.voidtools.com/
> - No tool to graphically show where my diskspace went; allowing me to find and delete large files
I use WizTree to see what's taking space. On NTFS volumes it uses the same method as Everything does to quickly read all the file info straight from the filesystem.
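(If you want a rough "where did my space go" answer with no third-party tool at all, a naive directory walk gets you most of the way; it is far slower than the MFT trick WizTree and Everything use, but it needs nothing beyond the standard library. A small sketch, with the default root path just an example:)

    # Naive "where did my disk space go": sum file sizes per top-level folder.
    # Much slower than MFT-based tools like WizTree, but dependency-free.
    import os
    import sys

    def dir_size(path):
        total = 0
        for dirpath, _dirnames, filenames in os.walk(path, onerror=lambda e: None):
            for name in filenames:
                try:
                    total += os.path.getsize(os.path.join(dirpath, name))
                except OSError:
                    pass  # permission errors, broken links, vanished files
        return total

    root = sys.argv[1] if len(sys.argv) > 1 else r"C:\Users"  # example root
    dirs = [e for e in os.scandir(root) if e.is_dir(follow_symlinks=False)]
    for size, name in sorted(((dir_size(e.path), e.name) for e in dirs), reverse=True)[:15]:
        print(f"{size / 1e9:8.2f} GB  {name}")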
> I laughed at your response too, because tar is part of Windows 11. See C:\Windows\System32\tar.exe
We don't usually browse System32 to see what programs Microsoft decided should be part of the OS. Especially when Microsoft isn't sure which Paint is the best.
It doesn't let you type text on the image. That's so important! The main thing I want to do when I take a screenshot is to circle something, or draw an arrow, then type some text about the item I am pointing to.
As someone who built an IT career on Microsoft’s entire suite, only to recently (past six years or so) migrate wholesale to macOS (endpoint) and Linux (server), I can definitely say MS’ best days are behind it. 2000 was rock solid, Server 2003 had some growing pains (mainly the transition to x64 and multi-core processors), and 2008 fully embraced the long march into irrelevance even as it tried to shake up the hypervisor space. Now the company is so obsessed with arbitrary and unnecessary feature creep and telemetry-as-surveillance that I’m loath to recommend it when I don’t have to.
Honest to god, if an IdP like Okta made an Active Directory replacement that ran via container instead of a full-fat VM or appliance template, I’d gladly toss ADDS out the window with all its stupid CALs. Basic directory functionality in 2025 shouldn’t require a bloated ADDS/LDAPS virtual machine to run, especially with the move to cloud providers for identity. If you make it easier to do identity without ADDS, you remove Microsoft’s major trojan horse into the enterprise - and M365’s as well.
You’re not wrong, but depending on the org size those charges are still cheaper than Windows Server + CALs.
Ideally though, it’d be like Okta in that its core directory is in the cloud, but also like ADDS/LDAP in that local servers/objects can join to a domain via local containers posing as domain controllers.
Yes, I know modern device management and cloud-based IdP means the need for a directory is decreasing by the day, but Enterprises still want it for ease of user and computer management via a centralized database of sorts. Having someone, anyone offer me a leaner way of achieving this without a crusty LDAP deployment or expensive Windows Server + CALs, would be hugely appreciated.
Okta was going to charge us $6/user/month just for MFA. So I migrated my company to Azure AD with free MFA. We still had AD DS in the mix, but endpoint management was moving to cloud w/ Autopilot + Intune.
An on-prem AD DS is going to be difficult to move away from. From a management cost perspective, it is still cheaper than every other LDAP + Kerb + endpoint policy solution out there. And since a CAL is provided with every copy of Windows Enterprise, thinking about CALs for clients is a non-issue.
That’s assuming your org is all-in on the Microsoft product suite though (but you do make excellent points on orgs who stick to AAD vs Okta in terms of cost savings). For companies who aren’t, or don’t want to be, there’s a huge gap in the market for a modernized, lightweight, cloud-friendly directory.
If I can have my PDC in the cloud IdP, and rely on containers for replicating at local sites or network segments as needed, then I can ditch ADDS, CALs, and M365 wholesale in favor of other products. It removes Microsoft’s trojan horse product from the enterprise and shakes up a lot of attached markets in the process.
In my opinion, the year of the Linux desktop happened more than 2 decades ago, when the last kinds of applications that were previously available only on Windows became also available on Linux, e.g. movie players and device drivers for some less common hardware, e.g. TV tuners.
That is when I converted all my computers, desktops and laptops, from dual-booting Windows and Linux to Linux-only. For some servers I have continued to use FreeBSD, and I have continued to use Microsoft Office Professional, but on Linux with CrossOver, where it worked much better than on Windows XP (!).
I agree that installing and configuring Linux in the right way remains a job for someone with decent computer management skills.
However, I have also installed Windows professionally, and on less common hardware, like embedded computers, I have encountered far more problems and far more difficult to solve than when installing Linux on the same hardware. Moreover, the solution for most Windows installation problems was not using some menu in a graphic tool, but using some obscure Windows command in a CLI window, with some very cryptic and undocumented command-line options, which I typically found by searching Internet forums where Windows users complained about the same problem.
Therefore the only real reason why Windows is more user-friendly is because it comes pre-installed on most computers, after professionals have solved any compatibility problems.
For whoever has a friend or relative that is knowledgeable about Linux, Linux can be more "user-friendly" than Windows.
My parents, older than 80, have been using Linux (Gentoo!) on their desktops for many years, without any problems, for reading/writing documents, Internet browsing, movie watching, music listening, TV watching, e-mail, and so on, despite the fact that they do not even know what "Linux" is.
Imagine coming into the Mac world with a 10.5 CD, upgrading to the then-current 10.6, and then watching it deteriorate through the years into 10.10. Yeah, that's me. Peeking at a glimpse of perfection only for it to flash and fade.
They really did offer a lot of features that really helped productivity. Snapping windows, jump lists, having libraries act as a virtual folder for many folders, etc.
It was the best for its time. But one of the reason why XP was "better" is that it had built-in support for WiFi. That ended up being a dealbreaker for 2k.
That's the issue.. every new OS has brought some features or stability improvements that are huge upgrades over the older OS.
WSL 2 is a must-have for me now, so Windows 10/11 is much better than anything that came before in that way. I may be alone in this, but I actually think Windows 11 has the best design of any Windows so far. The problem as usual, is that they haven't made the entire OS consistent. I wouldn't mind the new control panel if you could actually change every setting in windows in that one control panel.. and not have to dig through to find control panels that still date back to Win2k. And the new/old context menu in explorer is an absolute disaster. The new design is fine.. but how the hell did they manage to not make it support all the options of the old context menu?
Also let’s not forget that windows 11 puts random news stories in the start menu. Here in Australia, a lot of them are clickbaity scams. I really can’t believe Microsoft is endorsing whatever horrible choice of news provider they’ve teamed up with. It really spoils their brand image.
There’s a way to remove it, of course, by running some obtuse console command. But normal people have no idea how to do stuff like that.
There were pre-release 64 bit alpha versions of win2k, but otherwise you needed XP/2k3 for 64 bit. XP for amd64 was a bit of a shitshow with drivers (especially on consumer-grade computers), though. It wasn’t until vista that it ironically got better on that front, though people held out upgrading because of how terrible it was…
Yeah, while 2K was their best-ever single breakthrough improvement, it was a v1, and XP/2003 in classic mode was a more refined 2K: more drivers, better plug and play, more graphics compatibility. And 2003's Active Directory had a number of quality-of-life improvements.
Perfectly stated. It was more stable and had better UX than NT4, but didn't have all the unwanted anti-features that came in later versions of Windows. It was the last version of Windows that didn't get in my way.
Agree. My company ran a bunch of web servers on Windows 2K and Apache web server, because management was afraid of Linux (general FUD and Microsoft's lawsuit threats) and the engineering staff was afraid of Microsoft's IIS web server (security dumpster fire at the time). It was actually a pretty good system, super easy to maintain.
You can't say WDDM wasn't a step forward... Being able to crash your video drivers and reboot them without crashing and rebooting your whole machine made Windows a lot more stable.
Nostalgically, yes, Windows 2000 was amazing. At the time of launch, on period hardware, it was the fastest and most lightweight OS released by Microsoft. And looking back, I always appreciate that I can look in Task Manager and immediately recognize all of the processes by name.
Windows 7 (except for the last few updates that introduced telemetry and ads) comes in as a close second. But everything after is just bloated crapware.
The only bad things I remember about Windows 2000 are that some software written for Windows 3.x and 9x had compatibility issues, and that it took an eternity to boot up. It was "go take a coffee break as soon as you turn your computer on for the day" bad.
IIRC, Win2K would wait for most / all service startups to complete before showing the login prompt. XP and later would allow login to occur while many services were still starting up.
It's a tradeoff. A Win2K system was pretty responsive when you log in after a reboot/startup, but you've got to wait for that experience. In the days of spinning disks and single core CPUs, you had to fight those still-starting services for resources, making the first several minutes of XP usage painful.
Win2k also had the smoothest mouse movement that I had ever seen. If you had a PS/2 mouse, you could turn the sample rate up to the max. Dragging windows looked incredible. Even my Mac to this day, with a fancy brand new 4K display, can't match it. My mouse still looks blurry as it moves across the screen.
I remember using the BootVis tool (IIRC an early part of what would become the Performance Toolkit) to profile the startup process, and then you could use it to optimize the location of data loaded from HDDs to reduce the seeking required. Also, back when PATA was still in use, depending on your motherboard, I seem to remember making sure Windows wouldn't try to autodetect link speed on unused attachments, as that could take ages trying to find something that wasn't there.
I used the NT 5 betas for a while, and loved alerts not stealing focus. But that came back in the released W2K, and I remember being slightly annoyed by it.
It was anything but lightweight on a Pentium 90 or Pro, or whatever was common at the time. You really needed to upgrade to 16MB of RAM (lol), which was expensive at the time. That's why only businesses and not normal folks used it.
It's a shame ReactOS never got mature enough to be a serious competitor. If it had modern app and HiDPI support but was stuck in a 2000-era UI and didn't have feature bloat, it could be a great daily driver.
Can it run Firefox yet? The last time I tried reactOS, it BSODed just trying to launch Firefox... and don't even try using Windows drivers on it, it'll either hardlock, BSOD immediately, or during startup, regardless of whether you're in safe mode or not...
As someone who still runs Win2003 R2 (32-bit) on my desktop, I can confirm. It was peak MS. The system is very stable, and the UI (classic) is great. It is quick, snappy and good looking. For basic POSIX I have Cygwin. For other stuff I use VMs. I have all the tools needed to handle maintenance (compilers, DDK, docs, ...).
But there is a problem: HW. The pool of old HW is shrinking, and one day I will not be able to run it anymore. I guess I will move to Linux. There are a few nice and lightweight distros...
For folks that pick Windows 2000 Server, why not Server 2003? Is it just because by then NT had XP out as the "Windows for Home Users" and people didn't use Server 2003 as much or were there changes about it folks hated for some reason? To me it always seemed to bring so many more features/capabilities without trashing the classic UI.
Remember that it also introduced Active Directory. I helped build out a global enterprise network that was consistent and supported the same way, with like a quarter million users and tbh, it pretty much worked flawlessly.
Of course that innocence was lost with Welchia and other issues, but Windows 2000 made the year 1999 feel like ancient history in 2001.
Sure, Windows 2000 was definitely great in a lot of ways, but Server 2003/R2 either extended all of those (e.g. greatly improving AD and its management) or added its own big firsts such as x64 support - all without really introducing any of the types of downfall people hate the more modern versions for.
I'm just surprised that it feels like very little deep innovation in the OS world has happened since windows 2k. 3.11 brought networking in. 95 brought true multitasking to the masses and 2k brought multi-processing/multi-user (yes, NT3.1 had it, but 2k is where most normal users jumped in). And, yes, I know these things existed in other OSes out there but I think of these as the mass market kick offs for them. In general I just don't see anything but evolutionary improvements (and back-sliding) in the OS world beyond this time. I had really hoped that true cloud OSes would have become the norm by now (I really want my whole house connected as a seamless collection of stuff) or other major advances like killing filesystems (I think of these as backdoor undocumented APIs). Have we really figured out what an OS is supposed to be or are we just stuck in a rut?
[edit] 3.1 should have been Windows for Workgroups 3.11
"Normal users" did not jump into Windows 2000 Workstation. That was still an 'enterprise only' OS. Normal users either suffered with WinMe shipping on their desktop computer or jumped from 98SE to XP, given their computer could handle it (aka they bought a new computer).
Sorta. It was real pain-in-the-ass to run 2000 as a regular (non-administrator) user. Assuming your software worked at all that way, as even Office 2000 had some issues. UAC was necessary.
It required attention to detail, from a sysadmin / desktop admin perspective, but it was definitely possible and paid dividends in users being unable to completely destroy machines like they could on the DOS-based Windows versions. I put out a ton of Windows NT Workstation 4.0 and Windows 2000 Pro w/ least-privilege users. It was so convenient to be able to blow away a user's profile and start w/ a clean slate, for the user, w/o having to reload the machine.
Looks like there is some negative feelings towards this comment. So if we aren't in a rut, what are the big revolutionary OS advancements that have happened since this time?
Desktops have been in a rut for a decade. Windows has sucked post-Win7 in ways that are either conspiracy or the most stinging indictment of managerial incompetence possible. OS X is good, except its key bindings are alien, the hardware is closed, Apple hasn't really improved it at all in ten years, and it has loads of inconsistencies with the Linux CLI. Linux has been in a huge desktop and graphics rewrite for no real end-user benefit, flubbing the opportunity to make up ground on Windows while Windows tried to commit market share suicide.
3d compositing, ssds, mega displays, massive multi core, all completely wasted.
You know what I should be able to do? Hot execute windows and Linux and Osx on the same desktop without containerization that leaves 3d as an afterthought or worse a never thought.
Isn't Sharepoint like an enterprise management tool? I've never interacted with it once.
As for appdata... There's many faults to find in modern Windows changes, but I'm not willing to pin this on MS. Microsoft stuff tends to use %appdata% fairly sensibly, in most cases. On the other hand, the behavior of third-party developers has been really frustrating. What was initially intended as a universal storage location for some program data has become some kind of program container. Now, whenever you download some giant 300-500mb Electron app or whatever, you can be sure that it will force its entirety into appdata with no way to change the location. Every one of these developers has decided that their program is so valuable and Important that it's inconceivable that the user might want to install it on anything but the system drive. No, our program is unique and deserves nothing but the best!
Definitely stuck. We found a pretty strong optimum that no one has been willing to venture outside, strong enough to keep selling and that seems to be all that matters these days.
It was during an era when there was actual competition over operating systems. OS/2 definitely pushed Microsoft hard. BeOS woke everyone up even if it wasn't on popular hardware. Bell Labs was still experimenting with Plan 9. There were several commercial Unix vendors.
Monopolies. They ruin markets. Destroy products. Reduce wages. Induce nostalgia.
I am 50/50 on this particular argument for why OSes are in a rut. I think there is actual competition in the form of the various distros out there, and they have passionate advocates with real skill trying new things, but they don't really take off in a more mainstream way and they rarely feel revolutionary. I think this is more of a track gauge problem. It is hard to provide a truly novel OS without building all the infra around it so that people can actually use it. That takes resources at a scale that few can muster. What if I wanted to build something that kills off the idea of a file system? All the apps out there are written at their very core with this concept in them, so even if it is a better idea, it is incredibly hard to bring it out to market, and only huge companies can do it. At the same time, though, the huge companies are pushing their versions of things, which makes it hard to compete and have even small innovation in the space. I don't have a solution here. I had hoped things like WebAssembly would have led to an OS breakaway by now, but it hasn't really happened. Maybe it still will.
The desktop UX was good, but nearly every network service on Windows 2000 had a critical vulnerability at some point. The Code Red worm (MS01-033) comes to mind as particularly impactful. This was the golden age of stack smashing.
In the 90s, Windows was simple enough that I was able to read tech articles and understand a lot of what was going on inside, up to the point of Windows 2000 and, to a certain extent, Windows XP. That completely changed with Vista/7, where I can no longer recognize the names of many processes that are running or understand what actions/situations make my computer lag.
Nowadays, even though I don't worry anymore, as Windows 11 is happy as long as you give it a quad-core CPU, RAM, and an SSD, I still sometimes wonder why it writes 40GB to the SSD every day.
I ran Windows 2000 Pro as my main machine from, I think, the second beta until it was completely dropped from support.
It ran all of my games, stable as hell, quite light with none of the bullshit added later, none of the graphical bullshit added by XP but still classic Chicago.
The only things that could make it better were the UI rendering engine introduced with Vista and its enhanced driver and security model.
I see a vast majority of comments here agreeing that UIs were significantly better and faster twenty years ago or more. Assuming HN is representative for the software community, how is it that slow, inferior and dumbed-down interfaces have prevailed in the end? And this hasn't been happening just to popular consumer products.
I don't know whether Windows is for corporate desktops, enterprise servers, PC manufacturers, Azure, home users or advertisers. It certainly doesn't feel like it is the right product for me anymore.
I started my Windows Server career around 2003R2, so can't comment on the peakness of 2K.
2008R2 snowballed its own "revolution" when it introduced PowerShell 2.0, which was pivotal for many things to come.
Out of the more modern ones, 2012R2 was "peak".
I would guess we will still see 2012R2 installations well into the future, still running bits and pieces of critical infra even though it was EOLed long ago, but that's the way of the Server I guess. Can't wait for the 5000+ day uptime screenshots of ancient 2012R2s.
My guess is the next "long LTS" will be Win2025. Just because it's introducing the NTLMv2 deprecation path and working solutions to replace it.
If MS stripped *ALL* ads and bloatware (telemetry for calc??) out of Win 11 and restored the traditional UI of start menu + desktop, it would be fairly good overall. Certainly within their top 5. They really are close to peak yet again but cannot realize they are striving to make it worse.
"A computer on every desktop and Windows on every computer" was the company's goal. Ever since they claimed success on that one (glossing over the ubiquity of Linux everywhere else) they've been sorta directionless.
TBF, XP and 7 are both decent. Everything went downhill after those, including the ads, the updates, etc.
I didn't upgrade to 10 until I purchased a used Dell laptop (which included 10 Professional) a few years ago, and I have never used 11 and hopefully will never need to use it.
I love 2000 and XP but 7 has a special spot for me because it’s a “modern” Windows (supporting proper alpha blending in its theme drawing and such) without the various problems that 8 and newer bring. I have an old laptop with it installed and booting it up is honestly refreshing. Its visual style is a little dated feeling but not that much.
I like it for the same reasons. I just wish it supported high DPI. It, and Snow Leopard to Mountain Lion era OSX, at high res would be peak desktop usability.
I believe XP was when Windows Activation started, so that's a pretty big negative for me. Other than that, XP, 7 and 10 were pretty good, although 10 introduced advertisements if I'm not mistaken.
XP also inexplicably required at least twice the RAM of 2000. When XP came out that was a significant cost, and I personally was able to salvage many laptops at the time by downgrading them from XP. Eventually XP became the default for me because RAM got a lot cheaper and the service packs and driver support made it more viable.
But then, tangentially, I started using ubuntu at work, in a sort of misguided belief it would make me a better sysadmin, and it was only a matter of time before I couldn’t stand windows at home as well.
I thought win7 was pretty solid, though I didn’t upgrade until well after win8 was shipping. But lucky for me, Proton finally got really good, and that allowed me to basically skip win10+. Now it’s only for the rare tool that I even boot into my windows partitions anymore. When I do, being bombarded by random attention grabbers is completely jarring and I want to flee as fast as I can.
I'm already moving to Linux on one of my laptops. If the drivers and desktop experience are good enough (or bad enough on Windows) I might move 100% to Linux in a few years.
I made the jump a few years ago and the experience has been largely great. Lots of learning, which has been half the fun, and no goddamn ads in my start menu.
Totally usable as a daily driver, provided you don't need Windows only software. The year of linux on the desktop was probably about 2020.
Steam's Proton has made gaming on Linux astoundingly good. The only thing that still needs improvement is mod support: mod managers, game downgraders, bin patchers, and some more involved mods rely on little utilities written for Windows that are not easily runnable on Linux.
It is slowly improving though. The steam deck has moved things forward in leaps and bounds.
You do competitive games?
Those have been so far Proton's biggest weakness, but that's usually less of a technical limitation and more of a company decision, which given enough pressure can be changed
I play Battlefield 2042, Call of Duty Warzone, Apex Legends, PUBG, Rainbow 6 Siege, and Fortnite all somewhat regularly and none of these as far as I know work.
The only games that I do play regularly that work are Counter Strike 2 and DotA. Though I can't use Faceit for CS2 which would be ideal.
Ah, see as a guy in my 40's, my reflexes just aren't what they used to be. I used to be lethal in my teens, but these days I tend to mainly play single player or coop games. If you play games that require kernel level anti-cheat, yeah those will probably never be supported on linux.
Force override it in the game settings. Works on 99% of games. Probably not multiplayer games with anti-cheat, because if you're not using a software chain fully validated by Microsoft then you're cheating.
If you intend to stick with Windows for the long haul, you will have to upgrade eventually. I hung on to 7 for a while, but several apps stopped getting updates: iTunes, the Spotify desktop client, Google Chrome, and even Firefox dropped support. I was using iTunes to download podcasts, which after a while became impossible with some feeds because I would get an SSL error each time on that old version. For 10, the ESU period ends one year after 10/14/25 for consumers and three years for organizations. It's possible that apps will continue to receive updates during that time.
Come try out Fedora, or whatever flavor of Linux you want.
It's surprisingly fantastic for almost all modern computing tasks. Yes, it's true, some software won't work, such as Adobe Photoshop, but most people aren't using software like that anyway. For gaming, I'd say we're close to 99% of games supporting Linux out of the box on Steam. The few left that still don't choose not to via kernel-level anti-cheat or forgetting to toggle a checkbox for Linux support (EasyAntiCheat and friends).
The point is, it "Just Works" for darn near everything these days and is a very pleasant experience. Try it out!
The best Linux I have ever seen is Linux Mint. I tried it out because I needed to do something with firewire, but all of the other Linux kernels had dropped firewire, and it was the only one left that still supported it. I found it to be intuitive and friendly and everything just worked.
Mint leans towards the "ultra-stable" side of Linux Distros. Fedora leans towards the "bleeding-edge". Both are great in their own ways. If you want the latest and greatest of everything, Fedora is a great pick. If you just want long-term stability, Mint is a great pick. With both, you can choose the Desktop Environment you prefer (I like KDE personally, but many like Gnome, MATE, Cinnamon, etc).
That's not to say Fedora is unstable - it's just that it iterates fast to keep pace with packages as they release new versions. There's a new major Fedora release every year, for example.
Eh, this is going to sound like a I'm a stick in the mud, but I've tried Linux about a dozen times now, and every time has eventually led to 'a Linux evening' that disenchants me from the fantasy and back to reality. It's fantastic as a server OS, however.
Try it again if you haven't recently. I'm unsure what specific issues you encountered, but anecdotally I can say I've been driving Fedora full-time on my home workstation for nearly 2 years now. I love it. I drove Fedora full-time on my laptop off-and-on for nearly a decade as well before that.
For me, gaming was what kept me away. But, besides a few titles, it's been a non-issue. It was very pleasantly surprising.
My desktop runs Fedora Kinoite[1] - an immutable version of Fedora. It poses a set of unique challenges for a development workstation (my primary use), but has resulted in rock-solid stability through several major OS upgrades, and a lot of development-related hackery.
I don't see myself going back to Windows anytime in the future. Every time I'm at the office and on my Win11 machine, I remember why I switched in the first place. Just my experience though.
Often Linux is great, until
You update some esoteric dependency that breaks a bunch of stuff, and fixing it is just a little past your experience level …
That's the best part of the immutable versions: containers by default, so weird dependency interactions are minimized; the system is stable and has good rollbacks in case something does go wrong; and updates are more or less invisible.
Takes some getting used to, but has really been a smooth experience
Windows 10 IoT LTSC has all the bloat and spyware stripped out and will get security updates for years. It's super lean.
Will third-party apps keep shipping updates? Hard to say. The Adobe suite already refuses to install the latest version on any LTSC (for no reason other than that they don't want to support it - it works great), so who knows.
I suspect my next OS will be Windows 12 LTSC if I can hold out long enough - every other Windows version always seems to be experimental crap, going all the way back to ME (Millennium Edition).
I tell customers that they should use LTSC for things like virtual desktops. You need stability, such as it not randomly deciding to install a 4 GB game like Minecraft for every user as a “critical update”.
Microsoft joined a meeting and told the customer that they don’t agree with my recommendations because they want to make sure all users get the “latest experiences”.
There’s your problem right there: pushing your own KPIs instead of what’s best for the customers.
Windows 7 was my all-time favorite. I remember you could not use it straight out of the box, there was a whole bunch of UI tweaks that I would make right away. After that, it was perfect.
For me it's Windows 7, if nothing else for being the first and last major Windows where Universal Search worked well.
The Windows 8.x line gets some credit for having the strongest pen interface integration, which regressed significantly in the 10 line, but the overall shell in Windows 8 was rough, and a lot of features were broken in the rushed out and mostly failed attempt to Appify windows and redesign much of the UI at the same time.
The way I see it (and similarly with browsers now) is that the OS is a venue providing a stage for others to perform on, they provide the facilities so every act doesn't need to build their own venue. Most of the time people don't visit/use a venue for the sake of it.
Didn't expect to see this, but after Windows 95/98 I went to Windows 2000 for a long time and didn't switch back to Windows until 10/11. After Win2k, I went to Linux because I wasn't a fan of XP/7. (I know this is an unpopular take.)
The glasses are rose-tinted. There were a number of little bits missing from Windows 2000 that were helpful to have in XP, and you could change the theme to make it look just like Windows 2000.
And I really don't know how Windows 7/Server 2008 R2 doesn't win this battle.
Context. We upgraded from Windows 98 to Windows 2000. That was a major upgrade. First stable NT platform that we could use for everything, including games.
For all I know, Windows Server 2025 is amazing, but have you priced it out? There's no way to justify it.
I actually like Windows despite their aversion to committing to a UI redesign but do I really need to pay $1100 (per core!?) for the hope (but not the promise) of no ads?
There are tools and scripts to decrapify Windows 11. After uninstalling and stopping everything that's not needed and making the Start menu and taskbar behave like they did in Windows 7, it's quite decent.
This adds maybe 20 more minutes to install time, but it's worth it.
Unfortunately all that crap eventually comes back. Microsoft likes to reset settings… I’m pretty sure I must’ve spent the majority of my youth setting the same explorer settings over and over and over again … And it never ends with any custom setup you do; given enough time it reverts.
I don't think anyone doubts that you can do this. It's more that I refuse to pay for an OS which needs to be de-crapped in the first place. If Microsoft can't make something which prioritizes my needs above their corporate metrics, then they don't get my money.
Can you expand on the significance of LTSC for a personal user? MS says it's for "Medical Devices, Kiosks ..." but I presume the reason you mention it is that it's a version of Windows 10 that is expected to receive security updates for X years into the future?
Is it also de-crappified? No games, requests for Microsoft accounts, etc.?
Okay, I mostly use Windows 7 Professional (with 100+ "updates") for general-purpose work and software development, but for that kind of usage and/or a web server, what should I get now? Windows 2xxx?
This brings back a lot of nostalgia and I wholeheartedly agree. Back then I ran Windows 2000 server beta 2 on a dual proc system with P2-300s. It was rock solid.
One cool thing Microsoft did with Windows NT was the whole local security model and a filesystem that supported it (NTFS), which was definitely richer than UNIX. I don't really know if other UNIXes at the time had anything more than the 16-bit uid and gid and mode bits on everything in the filesystem. I wonder how it would have looked if Microsoft had kept Xenix as the base and added ACLs on top of it, for example.
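For anyone curious, you can actually poke at that richer NTFS security model from plain Java, since java.nio exposes ACLs where the filesystem supports them. A minimal sketch (the path and class name are just placeholders I made up):

  import java.nio.file.*;
  import java.nio.file.attribute.AclEntry;
  import java.nio.file.attribute.AclFileAttributeView;

  public class ShowAcl {
      public static void main(String[] args) throws Exception {
          // Placeholder path: any file on an NTFS volume.
          Path p = Paths.get(args.length > 0 ? args[0] : "C:\\Windows\\notepad.exe");
          // Present on NTFS; typically null on a plain POSIX filesystem.
          AclFileAttributeView view = Files.getFileAttributeView(p, AclFileAttributeView.class);
          if (view == null) {
              System.out.println("No ACL view on this filesystem");
              return;
          }
          for (AclEntry entry : view.getAcl()) {
              // Each entry carries a principal, an ALLOW/DENY type, and a set of
              // fine-grained permissions -- much richer than uid/gid/mode bits.
              System.out.println(entry.principal() + " " + entry.type() + " " + entry.permissions());
          }
      }
  }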
Well, Xenix was extremely popular since it was the cheapest option on x86. That said, I doubt Bill Gates would have hired Dave Cutler had he stuck to Xenix.
You wouldn't want to connect a fresh installation of Windows 2000 to the internet today. "Net Send" and default-on Administrative Shares are some brain-dead design decisions that made sense on a trusted LAN, but not the Internet.
XP was the last that I really REALLY used. I've had Windows 7 (on my work machine that I didn't use) and I have a Windows 10 machine that I boot from time to time when I want to mess with recording gear. But I kinda fell into "they're all bad, I was just used to them".
I'll give my prime example. I used to know Device Manager/Control Panel SO well. I could just get things done. Now I have to hunt around forever to do any sort of hardware related task. In their attempt to make it "so easy, even your grandma could use it" they've alienated power users. My grandma still has to call me to help her attach a printer... but now I have to say, "I dunno... let me watch a YouTube video and pray that it matches the sub-version that you're using".
I don't know how good Windows 95 was in practice, but in our country, where 99.9% of internet cafes didn't have licenses or service pack updates (if any even existed for the 95 variant), it was a pretty easy Windows to DoS via the NetBIOS vulnerability. https://en.wikipedia.org/wiki/WinNuke
On well-supported hardware, 95 was a major upgrade: the Start menu, long file names, preemptive multitasking, plug-and-play hardware, and DirectX gaming support. In many ways it even surpassed MacOS at the time.
3.0 was ok but a bit rough around the edges and it crashed a lot.
3.1 was a substantial improvement in that regard. It also brought major features like TTF fonts, the registry, a usable file manager, audio and video support, and networking in the Workgroups version.
When Win10 started, it was clearly Bad. No good reason for updates, invasive privacy-breaking telemetry, updates at random moments of the day, and everything was a little different but nothing was better. People flat out refused to upgrade when it was given away for free. Microsoft had to force it through Windows Update, and did multiple rounds of breaking software people explicitly installed to block the upgrade.
When did it become good? WSL and DirectX 12 were real changes, but all in all, my impression is that the user has been frog boiled over the years, with 2K,XP and 7 becoming distant memories.
The only 'bad' thing about Vista was its change of driver model (and thus the deprecation of many drivers). Once tweaked and with good native drivers, it was the first good 64-bit Windows - far more reliable than XP64. At least until 7 came out.
Vista was indeed fine. I used it for many years and had nary a problem with it. The problem with 11 isn't the core (everyone seems to agree that is fine), it's that Microsoft insists on putting ads and other user-hostile BS in.
I basically skipped windows XP entirely, only seeing it on other people’s computers.
I stayed on a ThinkPad R31 with Win2k until I got an R61 (4 GB RAM) with Vista on it several months after Vista's release. At that point it seemed like drivers and other early teething problems had been worked out, so my experience was pretty positive.
When I eventually moved to win7 I didn’t notice any real difference.
Windows Vista was essentially unusable on release unless you had very high-end hardware.
A couple of weeks after release the first step after getting a new computer was changed from "downloading firefox" to "downgrade to windows xp". Unironically, many people did that.
And that unusability was mostly due to the driver model change; once native Vista drivers appeared it performed better than XP/XP64, unless you were running old video hardware that couldn't handle Aero - in which case you were still better off running Vista with the classic UI, although that did entail forgoing the Vista Basic styling.
Even with native WDDM drivers it performed poorly in desktop graphics, because Vista also removed all GDI hardware acceleration support. This caused many 2D graphics operations to execute in software, or worse, an even slower mix of hardware and software rendering. Windows 7 improved on this by re-adding hardware acceleration for some GDI primitives and adding aperture windows to reduce DWM memory footprint.
It's a story all right, but that's all it is. Windows has a GetVersion function that returns a struct of major/minor/build, and they're all ints. That's how you've always checked for versions, with older versions checking against a single int that contained both major/minor.
Microsoft had no reason to support blatantly stupid development practices that no one ever actually did. They were trying to avoid brand confusion with the consumer, because even people who know about versions will still do a mental double take at seeing "Windows 9", expecting another digit. The confusion might not last long, but it still detracts from the brand.
  /** Performs computation and returns the result, or throws some exception. */
  public HashSet<String> call() throws Exception {
      final String arch = System.getProperty("os.arch");
      String name = System.getProperty("os.name").toLowerCase();
      String version = System.getProperty("os.version");
      if (name.equals("solaris") || name.equals("sunos")) {   // name is already lower-cased above, so match "sunos"
          name = "solaris";
      } else if (name.startsWith("windows")) {
          final String fullName = name;   // keep the detailed name before collapsing it to plain "windows"
          name = "windows";
          if (fullName.startsWith("windows 9")) {   // the collapsed name could never match this check
              if (version.startsWith("4.0")) {
                  version = "95";
              } else if (version.startsWith("4.9")) {
                  version = "me";
              } else {
                  assert version.startsWith("4.1");
                  version = "98";
              }
          } else {
  ...
I suppose Java didn't offer too many alternatives to checking the OS version the standard way, but I really have a hard time imagining MS bending over backwards to support that approach on that platform. There wasn't even a need to check for the "windows 9", the code was already checking for a Windows platform and would have worked the same without it. Avoiding confusion in the minds of the end users is still the most plausible explanation to me.
Microsoft has a history of trying to avoid breaking older software with new Windows releases. To do that, they definitely do need to account for how people are actually coding things in their software rather than just what they've documented as the way to do things.
The string check makes a lot of sense when you consider software written in languages like Java or Python rather than something that's coded directly against the OS APIs. In those cases you get strings back with the OS name, which of course is going to lead many people to just take the simplest route of string matching.
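For what it's worth, the sturdier way to do this from Java (or any language that only hands you strings) is to key off os.version rather than prefix-matching os.name: 95/98/Me and NT4 report a 4.x version, 2000/XP/Server 2003 report 5.x, Vista through 8.1 report 6.x, and 10 reports 10.0. A rough sketch of that approach (windowsFamily is a hypothetical helper, not something from the snippet above):

  // Classify Windows by numeric version instead of matching on the name string.
  static String windowsFamily() {
      String name = System.getProperty("os.name").toLowerCase();
      if (!name.startsWith("windows")) {
          return "not-windows";
      }
      String version = System.getProperty("os.version");   // e.g. "4.10", "5.1", "6.1", "10.0"
      int major = Integer.parseInt(version.split("\\.")[0]);
      if (major <= 4) return "9x-or-nt4";      // 95/98/Me and NT4 all report major version 4
      if (major == 5) return "2000-to-2003";   // 2000, XP, Server 2003
      if (major == 6) return "vista-to-8.1";   // Vista, 7, 8, 8.1
      return "10-or-later";                    // Windows 10 reports 10.0
  }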
It's the Register and therefore too worthless to get worked up about, but their naming a server version of Windows as peak anything is an indication that they probably just polled a few drunks at a bar.
Agreed. Windows Server 2000 through Windows 7 were peak Microsoft operating system.
By Windows 2000 Server, they finally had the architecture right, and had flushed out most of the 16 bit legacy.
The big win with Windows 7 was that they finally figured out how to make it stop crashing. There were two big fixes. First, the Static Driver Verifier. This verified that kernel drivers couldn't crash the rest of the kernel. First large scale application of proof of correctness technology. Drivers could still fail, but not overwrite other parts of the kernel. This put a huge dent into driver-induced crashes.
Second was a dump classifier. Early machine learning. When the system crashed, a dump was sent to Microsoft. The classifier tried to bring similar dumps together, so one developer got a big collection of similar crashes. When you have hundreds of dumps of the same bug, locating the bug gets much easier.
Between both of those, the Blue Screen of Death mostly disappeared.
I agree with one big exception, the refocus on COM as the main Windows API delivery mechanism.
It is great as idea, pity that Microsoft keeps failing to deliver in developer tooling that actually makes COM fun to use, instead of something we have to endure.
From OLE 1.0 pages long infrastructure in Windows 16 bit, via ActiveX, OCX, MFC, ATL, WTL, .NET (RCW/CCW), WinRT with. NET Native and C++/CX, C++/WinRT, WIL, nano-COM, .NET 5+ COM,....
Not only do they keep rebooting how to approach COM development, in terms of Visual Studio tooling, one is worse than the other, not at the same feature parity, only to be dropped after the team's KPI change focus.
When they made the Hilo demo for Windows Vista and later Windows 7 developers with such great focus on being back on COM, after how Longhorn went down, a better tooling would be expected.
https://devblogs.microsoft.com/cppblog/announcing-hilo/
Drivers can crash the rest of the kernel in Windows 7. People playing games during the Windows 7 days should remember plenty of blue screens citing either graphics drivers (mainly for ATI/AMD graphics) or their kernel anticheat software. Second, a “proof of correctness” has never been made for any kernel. Even the seL4 guys do not call their proof a proof of correctness.
Not the operating system:
https://en.m.wikipedia.org/wiki/Driver_Verifier
Driver Verifier is a tool that developers can choose to use for testing and debugging purposes.
It's not used on production machines and it does nothing to prevent a badly written driver from crashing the kernel.
Kernel drivers have to be verified by the driver verifier to pass Windows Hardware Qualification Labs certification and get signed with the Windows signing key that lets them load without warnings. There are fewer outside kernel drivers today, though, because plugging random peripheral cards into PC buses is no longer a big thing.
This is true for certification, which is mandatory for Server OS, distributing through Windows Update, or certain classes of drivers such as anti-malware or biometric authentication, but you can still submit drivers to Microsoft for "attestation signing" that will load without warnings on desktop OS without having to run them through the testing suite.
In any case, running the certification tests does not provide runtime protection for drivers running in kernel mode, as demonstrated by CrowdStrike. Only Windows 10 started introducing hardware virtualization-based isolation of kernel components (to provide isolation of security subsystems, not runtime checks to prevent crashes): https://learn.microsoft.com/en-us/windows-hardware/design/de...
Yet drivers that have passed Windows Hardware Qualification Labs certification have had blue screens. Also, Microsoft hands out Windows kernel driver signing keys to anyone who pays them. You don't need to have a driver go through the Windows Hardware Qualification Labs to be able to sign it with a key signed by Microsoft.
My PC used to regularly crash Windows 10 because of a buggy Nvidia driver. Eventually they fixed the bug, but until then, I had a crash every few days.
From your own link:
"Driver Verifier is not normally used on machines used in productive work. It can cause ... blue screen fatal system errors."
I've lost less time to bluescreens than I have to forced updates and sidestepping value-add nonsense like OneDrive and Edge.
They didn't "prove the kernel is correct", they built a tool to prove that a single driver maintains an invariant throughout execution.
It does not prove that the driver will not crash the kernel. It should be fairly easy to find a driver that passed QA testing under that tool, yet still crashed the kernel. You just need one of the many driver developers for Microsoft Windows to admit to having used that tool and fixed a crash bug that it missed, and you have an example. Likely all developers who have used that tool can confirm that they have had bugs that the tool missed.
I think it ended at the first "ribbon" UI, which was in the 2003 era, but not all products ate the dirt at once.
Yeah the ribbon drove me to LibreOffice and Google Docs and I haven’t been back.
Windows 2000 Pro was the peak of the Windows UX. They could not leave well enough alone.
The original ribbon sucked but with the improvements it's hard to say it's generally a bad choice.
The ribbon is a great fit for Office style apps with their large number of buttons and options.
Especially after they added the ability to minimize, expand on hover, or keep expanded (originally this was the only option), the ribbon has been a great addition.
But then they also had to go ahead and dump it in places where it had no reason to be, such as Windows Explorer.
> The ribbon is a great fit for Office style apps with their large number of buttons and options.
To me this is the exact use case where it fails. I find it way harder to parse as it's visually intense (tons of icons, buttons of various sizes, those little arrows that are sometimes in group corners...).
Office 2003 had menus that were at most 20-25 entries long with icons that were just the right size to hint what the entries are about, yet not get in the way. The ribbon in Office 2007 (Word, for example) has several tabs full of icons stretching the entire window width or even more. Mnemonics were also made impractical as they dynamically bind to the buttons of the currently visible tab instead of the actions themselves.
Close to 20 years later, people still complain about the ribbon. (1)
I think that says something about it.
--
1. And not just "grumble, grumble... get off my lawn..." Many of its controls are at best obscure. It hides many of them away. It makes them awkward to reach.
Many new users seem as clueless, or even more so, than pre-existing customers who experienced the rug pull. At least pre-ribbon users knew there was certain functionality that they just wanted to find.
(And I still remember how MS concurrently f-cked with Excel shortcut keys. Or seemed to have, when I next picked Excel up after a couple year hiatus from being a power user.)
> The original ribbon sucked but with the improvements it's hard to say it's generally a bad choice.
This is also what I hear about GNOME. "OK, yes, GNOME 3.x was bad, but by GNOME 40 it's fine."
No, it's not. None of my core objections have been fixed.
Both ribbons and GNOME are every bit as bad as they were in the first release, nearly 20 years ago.
I know nothing of your objections, so this is more about how I think of mine and how they relate to these kinds of changes.
Being a power user is difficult. I think the best way to do software is to make it APL-complicated and only educate one guy in it. The way power users in Excel/Emacs/accounting software outperform the user-friendly stuff is amazing. But some things are meant for the masses, e.g. opening a file.
Dumbing down or magification of interfaces was needed for many other reasons. Gnome and Ribbon were necessary changes IMO; what we had was never going to improve. Of course I wish there were elements that could be reused elsewhere, but that is a pipe dream of Smalltalk proportions.
I am now stuck with Windows at work, and it is a horrible experience. Everything is so needlessly complicated, in the same way Linux is. I do believe Gnome did manage to improve things, at least when I look at children using Mac, Linux and Windows as power users. My view is that the complexity of Linux is still a little bit easier to understand, but that is just because of a long history and easy abstractions.
I think core objections are often not compatible with products that need to fit and be produced for many people. I do software that is used once by many; this has changed my view of GUIs forever, especially in regard to desktops.
> The original ribbon sucked but with the improvements it's hard to say it's generally a bad choice.
It is a terrible choice. Always have to search for items.
For me, peak UX was before the ribbon: just menus and customizable toolbars. I didn't need anything more to be productive enough. Nowadays I can hardly use the Office suite; its feature discoverability is essentially zero for me.
I never understood the issue with the ribbon UI. Especially for Office it was great, so much easier to find stuff.
> I never understood the issue with the ribbon UI. Especially for Office it was great, so much easier to find stuff.
1. I don't need to find stuff.
I knew where stuff is.
2. I read text. I only need menus. I don't need toolbars etc. and so I turn them all off.
I cannot read icons. I have to guess. It's like searching for 3 things I need in an unfamiliar supermarket.
3. Menus are very space efficient.
Ribbons hog precious vertical space. This is doubly disastrous on widescreens.
4. I am a keyboard user.
I use keys to navigate menus. It's much faster than aiming at targets with the mouse and I don't need to look. The navigation keys don't work any more.
Ribbons help those who don't know what they are doing and do not care about speed and efficiency.
They punish experts who do know, don't search, don't hunt, and customise themselves and their apps for speed and efficient use of time and screen space.
> They punish experts who do know, don't search, don't hunt, and customise themselves and their apps for speed and efficient use of time and screen space.
The problem is, most users are utterly braindead, they barely manage to type at speed instead of pecking at single keys. The astonishment I've gotten in some places for literally nothing more than Ctrl+C/Ctrl+V is more than enough proof.
That's also IMHO a large portion of why Linux never really took off on desktop. UX/UI people are rare enough to begin with, most of them don't work on FOSS in their free time, and so development is primarily done by nerds for nerds. That's great if you already know something about the application - but usually the learning curve is so steep that most users frustratedly give up. And documentation is either not existing, incomplete or horribly outdated, and StackOverflow etc. are even worse.
The exception is Blender. They got some serious money IIRC, cleaned up their act, and now there's a headline of some movie or game using Blender every few weeks.
100% true.
The sad thing is that Windows has a great keyboard UI and it's superbly accessible for people with visual and motor disabilities.
Who have reduced earning opportunities because they are disabled, so FOSS should be great for them, but it isn't, because the nerds don't know CUA and don't know the keyboard UI. They spend their time mastering a couple of ancient apps like Vi and Emacs and ignore the fiery furnace of UI R&D that followed for the next 20Y after those early efforts.
Learn Windows' keyboard UI and you can drive the whole OS and all its apps with the speed of a genius Vim user with 20 years' practice. It makes Emacs look like a wet paper pad and a burned stick compared to a Moleskine notebook and a top quality fountain pen.
Xfce comes close and implements maybe 75% of the UI but once you are in an app all bets are off.
> Learn Windows' keyboard UI and you can drive the whole OS and all its apps with the speed of a genius Vim user
Do you have a reference for this? I've often needed to control Windows using only a keyboard and failed to do so. I'm aware of most shortcuts in this list[1] but these are for a few very specific things. (As an aside, I also remember controlling the mouse with the numpad using the Mouse Keys accessibility setting but this is worse than both keyboard shortcuts and the mouse.)
[1]: https://en.wikipedia.org/wiki/Table_of_keyboard_shortcuts
It's called CUA.
https://en.wikipedia.org/wiki/IBM_Common_User_Access
There are dozens of them out there.
Random example:
https://www.system-overload.org/windows-shortcuts.html
General guide...
Activate menu bar with Alt. Alt + the underlined letter opens that menu or submenu.
Alt+Space opens the control menu for that window. In MDI apps, alt+hyphen opens the document's window control menu.
Then...
Alt+Space, x = maXimise
Alt+Space, n = miNimise
Alt+Space, s = reSize, followed by a cursor key to select which edge, then cursors to change.
Hotkeys are Ctrl+letter and do that action now.
Ctrl+...
p = print
s = save
o = open
f = find
c = copy
x = cut (looks like scissors)
v = paste (looks like an arrow: paste _V_ HERE)
Shift modifies or reverses many commands, and selects while moving.
In dialogs and forms, Tab moves forwards; Shift+Tab backwards
Ctrl+PgDown = next tab
Ctrl+PgUp = previous tab
Ctrl+Enter = save and close form
Ctrl+Left/Right = move by word instead of character
Shift+Home/End = select to start/end of line
Esc = cancel
Ctrl+Esc = open start menu
Then tab, and you're tabbing through the taskbar, which is a sort of dialog box.
Ctrl+Shift+Esc = open task manager
Maybe this should be on a wiki somewhere so it can be documented collaboratively...
> Do you have a reference for this?
Look for underlined single letters in menus. With apps that use the "classic" style menus instead of ribbons or plain Electron crap, the single letters are the key.
I'm curious to know if this is what lproven meant in their comment above. Alt + a-z to access menu items is available in every OS and all "native" apps, but you can't "drive the OS and all apps" this way.
For example, I would like to set options that are a few menus/button clicks deep in the Windows control panel (either the "classic" or new variant) using keyboard shortcuts/navigation. Or navigate the Windows registry editor. I'm not aware of a way to do this.
None of that is correct.
No, it's not in all OSes. I wish it were.
No, it's not in all native apps. KDE reinvents its own set of keystrokes, for instance, and half the KDE apps have no menu bars any more... And there's no global way to force them either.
Yes, the control panel and RegEdit are totally keyboard controllable.
You can literally just unplug the mouse from a Windows desktop and it remains totally 100% operable.
Some apps may not, because the developers didn't do their jobs right, but the OS is.
How else could blind people use PCs?
I totally forgot about this until just now. That really was a brilliant feature.
> Learn Windows' keyboard UI and you can drive the whole OS and all its apps with the speed of a genius Vim user with 20 years' practice
I'm sure you can give me some hints, because Microsoft, can't.
See here:
https://news.ycombinator.com/item?id=43681191
> The sad thing is that Windows has a great keyboard UI
Windows also has a great help system, online. /s
Windows actually had a decent built-in manual system with CHM, tooltips and whatnot. Even games could and did use it, like EarthSiege 2.
Back in the days when application developers stuck to the Windows-provided widgets instead of doing their own UI, it was wonderful. Symbols were consistent across applications, as were color schemes (IIRC, if you wrote your CSS correctly, Internet Explorer would pass these on to websites!) and behavior.
I miss these days.
> And documentation is either not existing, incomplete or horribly outdated, and StackOverflow etc. are even worse.
Or the documentation is very complete, but only useful if you read and comprehend it in its entirety. Open source devs need to understand that not everyone using their software wants to become an expert in it. They just want to get a task done and the software is facilitating completing that task. That is something totally normal and those users should not be thought of as less important than the power users.
> The problem is, most users are utterly braindead
Yeah, that's Microsoft's idea. All users are idiots. That's why they are not able to fix bugs but only change the UI.
Just hide the ribbon.
On a Mac, that's fine. On Windows, it's not, because then I can't control the app any more.
I have been using Word since version 4 on DOS and version 5 on Classic MacOS. On Windows, I used WinWord 1, 2, 6, 95, 97, 2000, XP and 2003... then 4 years later MS ripped out the UI I knew backwards and had known for about 16 years, since 1991, and replaced it with one inferior in every way for me.
I'm not denying it might be better for others but for me it's now a waste of disk space.
The old versions do all I need, so I keep them. For everything except Word, there is LibreOffice.
But LibreOffice Writer has no outline mode, and I am a writer: that is THE killer function of Word for me.
So, Word 97 under WINE on Linux and Word 2003 when I have to use Win10 or -- shudder -- Win11.
And it'll be back in the next update.
My big problem with it is that it’s stateful. A menu or toolbar admits muscle memory - since you get used to where a certain button or option is and you can find it easily. With ribbons you need to know if you’re in the right submenu first.
Though personally, I'm increasingly delighted by the Quicksilver-style palette / action tools that VS Code and IntelliJ use for infrequently used options. Just hit the hotkey and type, and the option you want appears under the enter key.
It's not easily customizable and it takes more space, not much to understand
I'm not sure it takes more space than a menu and toolbar, but regardless, monitors are a LOT larger now than in 2003 so...
Frankly, I'm not sure customizing is a win either. I do a lot of remote support and it's nice to have a consistent interface.
Personally I find it faster than menus, and easier to find things I seldom use.
But I appreciate it's a personal taste thing, and some older folks prefer older interfaces.
Your monitors, those of a well-off power user, may have become larger. Most regular users I've seen are on 15" laptops with screens at 1366×768, or (if they're lucky) 1920×1080 with scaling at 1.25× or so. 17" desktop monitors used to be commonplace about 20 years ago.
The slightly larger screen real estate (if any) is more than wasted by very inefficient "modern UIs" where you won't find paddings smaller than 16px, with three buttons where there used to be enough space for 9.
Just compare them and see for yourself! The larger screen isn't a good excuse to waste space either.
And users are way more important than the tiny group of tech support.
Also those that need tech support will be less likely to customize.
Those of us working in jobs use the same couple of functions in our office products. We don't really go and find features.
> I think it ended at the first "ribbon" UI, which was in the 2003 era,
Nah. 2007 era.
Office 2007 introduced the ribbon to the main apps: Word, Excel, I think Powerpoint. The next version it was added to Outlook and Access, IIRC.
I still use Word 2003 because it's the last pre-Ribbon version.
I don't know quite when it started to happen, but changing and/or eliminating the default Office keyboard shortcuts in the last few iterations has really irked me.
Another often-underappreciated advancement was the UAC added in Vista. People hated it, but viruses and rootkits were a major problem for XP.
People hated it because it was all over the place. Change this or that setting? UAC. Install anything? UAC. Then you'd get a virus in a software installer, confirm the UAC as usual, and it wouldn't stop a thing.
It is more of a warning than an actual security mechanism though. Similar to Mark of the Web.
No, in XP you were essentially logged in as root 24/7 (assuming it was your machine), and any program -- including your browser -- was running as root too. I remember watching a talk about how stupidly easy it was to write rootkits for XP. "Drive-by viruses" were a thing, where a website could literally install a rootkit on your machine just by visiting it (usually taking advantage of some exploit in flash, java, or adobe reader). Vista flipped it, by disabling the admin account, so that in order to do something as admin you needed to "sudo" first. That alone put a stop to tons of viruses.
I used to work in the security team at a financial institution that was still running XP until around 2017.
We got to a point around 2015 where drive-by exploit kit developers just weren't targeting XP and IE8 anymore. Phishing landing pages would roll through all the payloads they had and silently exit.
> It is more of a warning than an actual security mechanism though. Similar to Mark of the Web.
It's both a warning and an actual security mechanism.
Obviously its most visible form is triggered when an application tries to write to system-level settings or important parts of the filesystem, and also when various heuristics decide that the application is likely to want to do so (IIRC "setup.exe" and "install.exe" at the root of a removable disk are assumed to need elevation).
Because Microsoft knew that a lot of older software wrote to system areas just because it predated Windows being a multi-user system, UAC also provided a partial sandboxing mechanism where writes to these areas could be redirected to user-specific folders.
The warning was also a tool in itself, because the fact that it annoyed users finally provided the right kick in the ass to lazy software developers who had no need to be writing to privileged areas of the system and could easily run under a limited user but hadn't bothered to because most non-corporate NT users were owners and thus admins and most corporate environments would just accept "make users local admin". A portion of the reason we saw UAC prompts a lot less in later versions of Windows is because Microsoft tweaked some things to make certain settings per-user and to reorganize certain dialogs so unprivileged settings could be accessed without escalation, but a lot of it is because applications that had been doing it wrong for as long as NT had existed finally got around to changing their default paths.
It got old people to call their grandsons when an image or .doc file asked for permissions though, which at the time was a huge help
> The big win with Windows 7 was that they finally figured out how to make it stop crashing.
Changing the default system setting so the system automatically rebooted itself (instead of displaying the BSOD until manually rebooted) was the reason users no longer saw the BSOD.
> This verified that kernel drivers couldn't crash the rest of the kernel.
How did crowdstrike end up crashing windows though?
> Static Driver Verifier
Well, the Crowdstrike driver isn't (wasn't?) static. It loaded a file that Crowdstrike changed with an update.
Most drivers pass through rigorous verification on every change. But CrowdStrike is (was?) allowed to change their driver's behavior whenever they want, by designing it to load a content file.
The EU forced MS to allow stuff like CrowdStrike as part of an anti-trust settlement.
MS tried to use the incident to get the regulators to waive the requirement.
I'm all for anti-trust and anti-monopoly but christ alive an operating system vendor gatekeeping their kernel is literally the whole point of being an operating system vendor. Braindead regulation.
> Braindead regulation.
Only because OP didn't give the full story. Microsoft wanted to close direct access to the kernel. AV companies complained to regulators in the EU. The EU asked Microsoft if they were willing to maintain access to replacement functionality and to stick to using that functionality for its own separately sold AV products. Microsoft said no, and instead of fighting, just let Windows wither on the vine with full kernel access for all the bozos. Crowdstrike was inevitable.
The issue isn't with the gatekeeping per se. The issue is that Windows Defender, a competing AV, gets full access while third parties would not. This would leave them at a competitive disadvantage.
No, braindead take. The purpose of being an operating system vendor is to sell an operating system. If someone else modifies your operating system after they buy it, they get to keep both pieces. You don't get to stop them from modifying the thing they bought.
Do you like nanny states? How about nanny corporations?
This particular case is weird because crowdstrike is complianceware.
So, it’s more like “you don’t get to improve your product if doing so would also stop random companies from forcing your customers to break the stuff you sold to them”
Microsoft has no obligation on protect its customers from themselves if they're dead set on shooting themselves in the foot.
Microsoft has no right to prevent its customers from having full access to the things they bought.
Must human arms be braced at birth so they can only point level, lest someone try to point any object at their own foot?
Yeah, but using your analogy, we do allow people to protect their communities from random strangers that want to disfigure other people’s arms.
In fact, I pay taxes to the police and they generally handle this sort of thing pretty well.
> First large scale application of proof of correctness technology.
Curious about this. How does it work? Does it use any methods invented by Leslie Lamport?
A modern reimagining of Windows 2000's UI - professional, simple, uncluttered, focused, no cheapening of the whole experience with adverts in a thinly-veiled attempt to funnel you into Bing - with modern underpinnings and features such as WSL2 would have me running back towards Microsoft with open arms and cheque book in hand.
Not an obligation, but ReactOS exists and needs help:
https://reactos.org/donate/
Surprisingly close. I recently tried its package manager and installed a recent Python! So better than the original XP-era Windows in some respects.
I've been watching ReactOS development for years, and progress is slow but steady. I'm excited for the point where it will be fully usable as a drop-in replacement for old Windows software.
There are Linux distros that meet your description (no need for WSL2 either!). I am guessing you're not running towards them with open arms and cheque book in hand ... or maybe you already ran to Linux and are just nostalgic about going back to Microsoft ... ?
Linux UIs can’t even align fonts correctly within the elements.
It is miles away from the original, and you can immediately see it's Linux because things don't quite line up. Huge difference in quality and attention to detail, and the entire interface becomes unpleasant to look at.
Also, Linux power management and the lack of hibernation mean it's useless on laptops.
I do not know what kind of Linux UI you have seen, but the problem mentioned by you is certainly not universal.
I have never seen it, but it may exist, because there are many kinds of Linux UI that I do not use, e.g. Gnome.
That said, I have seen many Linux GUI applications that are ugly, at least by default, but many of them can be reconfigured to be beautiful enough.
I have never been content with the default appearance of any Linux distribution, but the good ones can be customized to look completely different and better than Windows, especially if you replace all default fonts with some high-quality fonts.
I don’t want to spend eons heavily customizing, only for my customizations to break down the line, though.
But you[1] are willing to spend money, so you can pay someone else to worry about that.
[1] Based on the original comment. If you personally are not willing to spend money, your reply doesn't fit the conversation.
Out of curiosity, when was the last time you used Linux on a laptop platform? Anecdotally, it's come a long way since 5 years ago - daily driving Ubuntu 24.04 on my Thinkpad, and I can get 8 hours of use (engineering workload) in a day. It's not ARM level of performance, but far from "useless".
That's "It works on my machine". Especially with ThinkPads, Dell XPS, and other laptops usually used by the Linux folks. Try it on a random cheap HP and you might not even have sound working (I've gone through this a few months ago). You can sometimes easily fix it through the terminal, but then we get into the debate of whether a normal user would be able to do that.
Well, how many "it works on my machine" does it take to make it a general statement? It works on all of my machines, from dirt-cheap lenovo EDU series ThinkPad to portable workstation Dell M6800 and Pixel Chromebook in between.
It generally doesn't work for people buying computers from vendors who use hardware where the manufacturer doesn't disclose the documentation. Just don't give money to those who seek to prevent free software.
Solid advice, but choosing good hw for the rig is already a challenge. Tick the virtual "linux" checkbox and you often get an empty list. That is, if you have that checkbox, in any sense. How the hell should I know if a mobo/laptop is supported? Googling "<model> linux issue" always yields hundreds of threads regardless. Even when <model> is Thinkpad: https://www.google.com/search?q=linux+thinkpad+issue
I mean, yeah? But it’s far from the seamless experience of macOS or windows. On my desktop pc:
- My wireless card isn’t detected
- I'm using Linux Mint, which means I'm still on X11. Some software doesn't support X as well as Wayland. Some only supports X, I guess?
- I use DaVinci Resolve - which has a native Linux install. But I need to use some weird tool to convert it to a .deb package to install and run it. It doesn't have a window title bar - so the only way I can change the size of the window is by right-clicking it in the taskbar.
- My two monitors have different DPI - so I need to use window scaling. This confuses IntelliJ - which made all the text super tiny for some reason. I have a DPI override for that in a weird Java config file.
- I want consistent copy / paste shortcuts. I can’t use ctrl+C in terminal because that’s SIGINT. So I have it set to meta+C. But I can’t bind meta+C in IntelliJ because of Java limitations. So my copy/paste shortcut is just different in different apps now.
- Smooth scrolling is still an inconsistent mess between different programs. Particularly Firefox.
I’ve also been running into problems where my second monitor won’t turn on after I resume the computer from sleep. But apparently that’s a bug that affects windows as well when using recent nvidia drivers, so that isn’t Linux’s fault.
I’m not saying it’s bad. It mostly works great! I love my workstation, and I’m enjoying distancing myself from Apple’s increasingly buggy software stack. But it’s far from perfect.
I’m happy enough to use Linux despite all its warts. But when my parents ask for a new computer, I recommend macOS or windows.
> - I want consistent copy / paste shortcuts.
I really miss Sun keyboards, with dedicated copy and paste keys.
The skinny Enter key, not so much... anyone else ever set SUNKEYBOARDHACK in zsh?
> Anecdotally, it's come a long way since 5 years ago
I've been hearing this trope for two decades now.
> my Thinkpad
Are you even surprised that I'm not?
2025 is truly the Year Of The Linux Desktop.
> Also, Linux power management and lack of hibernation means its useless on laptops
I've gotten better battery life under Linux than on Windows on every ThinkPad I've owned in the last 15 years or so.
That's not to say everything is smooth sailing. Audio is a battle I'll still be fighting on my deathbed in ~40 years.
Pulseaudio was not a blessing to Linux.
Overcomplicated and hard to configure; the only way it even got measurably usable was because of distros putting in the sweat here.
However, pipewire is really great. Audio on Linux has been a lot better since pipewire became increasingly the default.
Plenty of distros/skins get it 99% of the way there for a similar looking screenshot but only 25% of the way there for the actual user interface experience. ReactOS is probably the closest (in terms of going down a holistic user interface approach) but saying it's 25% the way there to being a finished solution would be generous.
While DEs often emulate the look of macOS or Windows, they always get the feel wrong. You can put a global menu bar, Dock, etc into KDE, but ultimately it still acts like KDE and nothing like macOS.
It's not like macOS or Windows is the pinnacle of UI. I'm on Fedora Silverblue, and it's so relaxing to not deal with the usual ['Yes', 'Not Now'] prompt on notifications. Or have your computer became suddenly unresponsive because of random scans you can't disable.
It's not like KDE is either. In Windows I have never had to think about carefully moving the cursor through the Start menu. In KDE it's one wrong move and you're in a different section, because in 25 or however many years they haven't figured out a hover activation delay.
> In windows I have never thought about carefully moving the cursor through start menu
Well, Windows 11 got rid of the Start menu. To get to it you have to click an obscure button. In Windows, you have to carefully think about how you move the cursor at the edges of a window because Some Idiot thought it was a good idea to make the window border 1 pixel wide. Even on (Q)UHD monitors.
The internet says the borders have been 1px since Windows 10, but the actual resize handle area on my PC is much thicker and is adjustable, because I remember changing it.
Windows surely has its quirks in dumb places. But what the Linux desktop achieved in the last 10-15 years has been driven by a bunch that simply shits on its users and doesn't care for years afterwards. I went back to Windows at Xfce 4.6, which broke all my effing menus and told me to gfm. Kept trying biannually and have seen it getting worse and worse, in the stupidest places. Like, they have to be from a really special demographic to do some of that.
The GP is talking about the Windows 2000 UI.
Sounds like you are describing XFCE.
I made the switch to a *nix OS with XFCE 20 years ago. Couldn’t be happier.
Windows Whistler (XP Beta) had an interesting theme that was like a bit modernized Windows 2000. Small non-rounded title bars, non-obnoxious taskbar, etc. Too bad they never finished it and offered a stable version for Windows XP users.
Here are my gripes with the modern Windows experience:
- Runs Windows update and reboots without my permission
- Keeps trying to make me switch to Bing
- Keeps trying to make me use Microsoft Account vs. local account
- Does a crappy job of reopening windows on reboot. Miserable copy of macOS.
- Fan spinning on my laptop with no easy way to figure out what process is consuming CPU
- Flat UI
- No built-in way to view markdown files
- No tool to graphically show where my diskspace went; allowing me to find and delete large files
- Printers keep getting disconnected; it is easier to print from iPhone thanks to bonjour
- No dictionary app (macOS has it)
- Can't airdrop to iPhone (3rd party apps can do it)
- No screenshot tool that allows you to type text (in addition to circling and highlighting and arrows)
- No command-line zip / unzip
- No instant search (macOS has had it for how many years now?)
Command line zip/unzip is available in PowerShell:
https://learn.microsoft.com/en-us/powershell/module/microsof...
Markdown rendering is also available:
https://learn.microsoft.com/en-us/powershell/module/microsof...
I agree with a bunch of your criticism, but modern PowerShell is pretty decent and has a lot of tools.
Adding to this, if you're willing to go third party:
Everything can give you instant search, and with a PowerToys plugin you can integrate it into PowerToys Run, which gives you an Alfred-style search bar.
WizTree works for visual inspection of storage.
Screenshots apps, Markdown Viewers, are common enough, won't comment further on those
On the printer disconnections: I've had some weird experiences. Recently a technician showed me that, using the default Windows Update driver, my work printer regularly disconnected, but using the manufacturer's driver the setup has so far worked without a problem.
Markdown should have a UI viewer
IIRC You can use the Explorer Preview pane to render basic markdown
My main gripe for work laptop is that Windows 11 is dog slow. I think they have rewritten Explorer but not for the better. Word is also driving me nuts. The formatting does a ton of weird stuff that's totally unpredictable. Outlook has this weird flat UI where it's hard to tell what is a button and what isn't. Search has been broken for a long time.
Both Windows 11 and modern macOS are slow as shit now. The other day I clicked the Notifications settings in the outhouse they call a Settings app on my Mac and it took a solid 3 seconds to render the UI.
And behind me, was a G4 Cube that could open the System Preferences app off of a spinning hard drive in less time.
What happened to us?
But besides those things, it's great!
Seriously though I think Microsoft has mostly given up on the B2C market. They have good capture of B2B with hardware and software. Why make great products when you can make mediocre products that people have no choice but to use?
> Runs Windows update and reboots without my permission
This might be an unpopular opinion but I'm actually glad they do this by default now (you can turn it off). My understanding is that MS was continually getting blamed for users getting viruses because they would never update their system, so in the best interest of the users they decided to force it.
I know a lot of people will still disagree with me, but I think if you were in their situation and you were getting tired of not only end-users but also world governments blaming you for things your users did (or did not do)... you'd probably want to control that a little more too, for both your sakes.
In the end it will hurt MS's reputation for being a broken mess even if it's 100% the users' fault for not updating, so I absolutely get it. And yes I know there's plenty of other things you can blame them for, I'm not saying this is their only issue.
You can still give users a 24-hour warning at a minimum, and only force a reboot for really critical issues.
It does... "Your PC will restart in 2 days to finish installing important updates". I believe it has been doing it since Windows 8. Of course you can always restart manually any time before then to apply the updates immediately. I think it's a good compromise.
I don't get that warning on my Windows 11 laptop. Instead it updates and reboots while the laptop lid is closed.
I agree with these. Here are some third party tools that can help with some of the gripes though:
> - No instant search (macOS has had it for how many years now?)
Everything search somehow does instant search across the entire file system. It is the first thing I install when I get a new computer, cannot stress enough how much time this has saved me: https://www.voidtools.com/
> - No tool to graphically show where my diskspace went; allowing me to find and delete large files
This one takes a while to scan but produces an excellent visualization: http://www.steffengerlach.de/freeware/ (Scanner)
I use WizTree to see what's taking space. On NTFS volumes it uses the same method as Everything does to quickly read all the file info straight from the filesystem.
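If you'd rather not install anything, the core of what these tools do (minus the MFT trick that makes WizTree and Everything fast) is just a recursive size tally. A toy Java sketch of that idea, nowhere near as quick as the real tools:

  import java.io.IOException;
  import java.nio.file.*;
  import java.nio.file.attribute.BasicFileAttributes;
  import java.util.*;

  public class DirSizes {
      public static void main(String[] args) throws IOException {
          Path root = Paths.get(args.length > 0 ? args[0] : ".");
          Map<Path, Long> totals = new HashMap<>();

          // Walk the tree once, adding each file's size to every ancestor directory.
          Files.walkFileTree(root, new SimpleFileVisitor<Path>() {
              @Override
              public FileVisitResult visitFile(Path file, BasicFileAttributes attrs) {
                  for (Path dir = file.getParent(); dir != null && dir.startsWith(root); dir = dir.getParent()) {
                      totals.merge(dir, attrs.size(), Long::sum);
                  }
                  return FileVisitResult.CONTINUE;
              }
              @Override
              public FileVisitResult visitFileFailed(Path file, IOException exc) {
                  return FileVisitResult.CONTINUE;   // skip files we can't read
              }
          });

          // Print the ten biggest directories.
          totals.entrySet().stream()
                .sorted(Map.Entry.<Path, Long>comparingByValue().reversed())
                .limit(10)
                .forEach(e -> System.out.printf("%,15d bytes  %s%n", e.getValue(), e.getKey()));
      }
  }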
> No command-line zip / unzip
Yes it does. It's just called Compress-Archive/Expand-Archive.
So much easier to use tar -z
I laugh when I see responses like this. tar is so discoverable and easy, you can install WSLv2 if you want to use it.
I laughed at your response too, because tar is part of Windows 11. See C:\Windows\System32\tar.exe
> I laughed at your response too, because tar is part of Windows 11. See C:\Windows\System32\tar.exe
We don't usually browse System32 to see what programs Microsoft decided it should be part of the OS. Especially when Microsoft is not sure which Paint is the best.
Wooosh.
> - No screenshot tool that allows you to type text (in addition to circling and highlighting and arrows)
Snipping tool works for all of this
It doesn't let you type text on the image. That's so important! The main thing I want to do when I take a screenshot is to circle something, or draw an arrow, then type some text about the item I am pointing to.
I don't really agree with half the list as those are just apps you can get but...
> Does a crappy job of reopening windows on reboot. Miserable copy of macOS.
Please! Can Windows figure this out, and can Macs figure out how to restore window-to-monitor configuration as well as Windows does?
And that's just the user experience! For developers:
- multiple heap allocators
- have to install runtimes, even for C
- all useful permissions are off by default
- entire GUI is permeated by "Not Invented Here" mistakes
- msi is opaque and crusty
> - Fan spinning on my laptop with no easy way to figure out what process is consuming CPU
Huh? Ctrl+Shift+Escape will bring up task manager. Is that not enough?
That's not enough. Sometimes it is system tasks that don't show up in the task manager.
As someone who built an IT career on Microsoft's entire suite, only to recently (past six years or so) migrate wholesale to macOS (endpoint) and Linux (server), I can definitely say MS' best days are behind it. 2000 was rock solid, Server 2003 had some growing pains (mainly the transition to x64 and multi-core processors), and 2008 fully embraced the long march into irrelevance even as it tried to shake up the hypervisor space. Now the company is so obsessed with arbitrary and unnecessary feature creep and telemetry-as-surveillance that I'm loath to recommend it when I don't have to.
Honest to god, if an IdP like Okta made an Active Directory replacement that ran via container instead of a full-fat VM or appliance template, I'd gladly toss ADDS out the window with all its stupid CALs. Basic directory functionality in 2025 shouldn't require a bloated ADDS/LDAPS virtual machine to run, especially with the move to cloud providers for identity. If you make it easier to do identity without ADDS, you remove Microsoft's major trojan horse into the enterprise - and M365's as well.
> Honest to god, if an IdP like Okta made an Active Directory replacement that ran via container
https://goauthentik.io/ can run in docker. It can be paired in with openldap containers, too.
If Okta made an AD replacement, they’d charge for each extra attribute beyond fullName, firstName, surName, and drink.
Identity Admins don’t let Identity Admins buy into Okta.
You're not wrong, but depending on the org size those charges are still cheaper than Windows Server + CALs.
Ideally though, it’d be like Okta in that its core directory is in the cloud, but also like ADDS/LDAP in that local servers/objects can join to a domain via local containers posing as domain controllers.
Yes, I know modern device management and cloud-based IdP means the need for a directory is decreasing by the day, but Enterprises still want it for ease of user and computer management via a centralized database of sorts. Having someone, anyone offer me a leaner way of achieving this without a crusty LDAP deployment or expensive Windows Server + CALs, would be hugely appreciated.
Okta was going to charge us $6/user/month just for MFA. So I migrated my company to Azure AD with free MFA. We still had AD DS in the mix, but endpoint management was moving to cloud w/ Autopilot + Intune.
An on-prem AD DS is going to be difficult to move away from. From a management cost perspective, it is still cheaper than every other LDAP + Kerb + endpoint policy solution out there. And since a CAL is provided with every copy of Windows Enterprise, thinking about CALs for clients is a non-issue.
That’s assuming your org is all-in on the Microsoft product suite though (but you do make excellent points on orgs who stick to AAD vs Okta in terms of cost savings). For companies who aren’t, or don’t want to be, there’s a huge gap in the market for a modernized, lightweight, cloud-friendly directory.
If I can have my PDC in the cloud IdP, and rely on containers for replicating at local sites or network segments as needed, then I can ditch ADDS, CALs, and M365 wholesale in favor of other products. It removes Microsoft’s trojan horse product from the enterprise and shakes up a lot of attached markets in the process.
The fact Windows 2000 was peak Microsoft and OS X 10.5 was peak Apple is proof that the golden age of software is way behind us, unfortunately.
That's not true, next year has been and always will be the year of desktop Linux, I'm sure of it!
In my opinion, the year of the Linux desktop happened more than two decades ago, when the last kinds of applications that had previously been available only on Windows became available on Linux too, e.g. movie players and device drivers for some less common hardware such as TV tuners.
That is when I converted all my computers, desktops and laptops, from dual-booting Windows and Linux to Linux-only. For some servers I continued to use FreeBSD, and I kept using Microsoft Office Professional, but on Linux with CrossOver, where it worked much better than on Windows XP (!).
I agree that installing and configuring Linux properly remains a job for someone with decent computer management skills.
However, I have also installed Windows professionally, and on less common hardware, like embedded computers, I have encountered far more problems and far more difficult to solve than when installing Linux on the same hardware. Moreover, the solution for most Windows installation problems was not using some menu in a graphic tool, but using some obscure Windows command in a CLI window, with some very cryptic and undocumented command-line options, which I typically found by searching Internet forums where Windows users complained about the same problem.
Therefore the only real reason Windows is more user-friendly is that it comes pre-installed on most computers, after professionals have solved any compatibility problems.
For whoever has a friend or relative who is knowledgeable about Linux, Linux can be more "user-friendly" than Windows.
My parents, older than 80 years, have been using Linux (Gentoo!) on their desktops for many years, without any problems, for reading/writing documents, Internet browsing, movie watching, music listening, TV watching, e-mail, and so on, despite the fact that they do not even know what "Linux" is.
You jest, but with Android desktop mode support it might actually turn out to be true!
I thought 10.6 Snow Leopard is peak OS X?
10.6.8
Still kicking myself for not getting an Axiotron Modbook running Snow Leopard.
God, I wanted one of those so badly!
Yeah, my current plan is to just get a Mac Mini and a Wacom Movink 13 (or ideally, some higher-resolution successor).
Imagine coming into the mac world with a 10.5 cd and upgrading to the current 10.6 and then watching it deteriorate through years into 10.10. Yeah that's me. Peeking at a glimpse of perfection only for it to flash and fade.
They really did offer a lot of features that helped productivity. Snapping windows, jump lists, having libraries act as a virtual folder for many folders, etc.
Libraries confuse me to this day. Just give me a path!
Fedora Kinoite with KDE Plasma 6 is pretty good. And will not get worse in the future either. Just need to look outside of the commercial offerings...
The world of software is far larger than those two operating systems.
it was built before, we can build it again and even better than the first time
2K 100% was the best Windows. The NT benefits with none of the XP downsides.
Hard agree. The Windows 2000 UI was peak UX, and each step since has been a downgrade (with the possible exception of Windows 7).
It was the best for its time. But one of the reasons XP was "better" is that it had built-in support for WiFi. That ended up being a dealbreaker for 2k.
That's the issue: every new OS has brought some features or stability improvements that are huge upgrades over the older OS.
WSL 2 is a must-have for me now, so Windows 10/11 is much better than anything that came before in that way. I may be alone in this, but I actually think Windows 11 has the best design of any Windows so far. The problem, as usual, is that they haven't made the entire OS consistent. I wouldn't mind the new control panel if you could actually change every setting in Windows in that one control panel, and not have to dig through to find control panels that still date back to Win2k. And the new/old context menu in Explorer is an absolute disaster. The new design is fine, but how the hell did they manage to not make it support all the options of the old context menu?
Also let’s not forget that windows 11 puts random news stories in the start menu. Here in Australia, a lot of them are clickbaity scams. I really can’t believe Microsoft is endorsing whatever horrible choice of news provider they’ve teamed up with. It really spoils their brand image.
There’s a way to remove it, of course, by running some obtuse console command. But normal people have no idea how to do stuff like that.
Windows Server 2003 was the best Windows by far. All of the good parts of NT/2000 with any parts of XP available when you needed them.
Except that AFAIK the 2003 kernel was different enough that a few apps, and especially games, refused to run properly or at all, compared to XP.
Not too many, and lots actually ran without any issues using compatibility mode.
Cannot agree more. Used Windows Server 2003 for over a decade, until I moved away from desktop to laptop and started having driver issues.
I agree here. I ran server 2003 on my early 2000's desktop for a while.
I preferred XP/2003 in classic UI mode. Lots of little improvements.
If you could get winterm on it and recent Firefox it’d be quite usable. Perhaps ReactOS one day.
and 64-bit (x86_64, not IA64), which no version of Windows 2000 was, AFAIK
Windows 7 with classic UI is probably the most-recent decent version.
There were pre-release 64 bit alpha versions of win2k, but otherwise you needed XP/2k3 for 64 bit. XP for amd64 was a bit of a shitshow with drivers (especially on consumer-grade computers), though. It wasn’t until vista that it ironically got better on that front, though people held out upgrading because of how terrible it was…
Yeah, while 2K was their best-ever single breakthrough improvement, it was a v1, and XP/2003 in classic mode was a more refined 2K, e.g. more drivers, better plug and play, more graphics compatibility. And 2003's Active Directory had a number of quality-of-life improvements.
NT3.5 was perhaps the most stable version I ever used. NT4 brought the new UI, making NT5 aka 2k not the first version of the 95+ UI.
Perfectly stated. It was more stable and had better UX than NT4, but didn't have all the unwanted anti-features that came in later versions of Windows. It was the last version of Windows that didn't get in my way.
I loved Windows 2000 so much. I was a beta tester back then and they sent me a copy in the end. Was very cool for me as a broke high school student.
I bought it from my college computer store for like $5 for the CD and an endlessly reusable license key. Truly the good old days.
2K was so much better than XP. The UI rendering thread was decoupled in a way that XP's wasn't.
Definitely. If 2K supported ClearType I would have stuck with it on my personal machines for another half a decade.
Agree. My company ran a bunch of web servers on Windows 2K and Apache web server, because management was afraid of Linux (general FUD and Microsoft's lawsuit threats) and the engineering staff was afraid of Microsoft's IIS web server (security dumpster fire at the time). It was actually a pretty good system, super easy to maintain.
WinXP was also NT family. It wasn't from that married-in Win9x gene pool.
You can't say WDDM wasn't a step forward... Being able to crash your video drivers and reboot them without crashing and rebooting your whole machine made Windows a lot more stable.
Peak doesn't mean that it's a monotonic decline without any steps forward.
One of the innovations in NT 4.0 was adding the ability for video drivers to crash the kernel. They went full circle.
Nostalgically, yes, Windows 2000 was amazing. At the time of launch, on period hardware, it was the fastest and most lightweight OS released by Microsoft. And looking back, I always appreciate that I can look in Task Manager and immediately recognize all of the processes by name.
Windows 7 (except for the last few updates that introduced telemetry and ads) comes in as a close second. But everything after is just bloated crapware.
The only bad things I remember about Windows 2000 are that some software written for Windows 3.x and 9x had compatibility issues and it took an eternity to boot up. It was "go take a coffee break as soon as you turn your computer on for the day" bad.
IIRC, Win2K would wait for most / all service startups to complete before showing the login prompt. XP and later would allow login to occur while many services were still starting up.
It's a tradeoff. A Win2K system was pretty responsive when you logged in after a reboot/startup, but you had to wait for that experience. In the days of spinning disks and single-core CPUs, you had to fight those still-starting services for resources, making the first several minutes of XP usage painful.
Win2k also had the smoothest mouse movements that I had ever seen. If you had a PS/2 mouse, you could turn the sample rate up to the max. Dragging windows looked incredible. Even my Mac to this day with a fancy brand new 4k display can't match it. My mouse still looks blurry as it moves across the screen.
I remember using the BootVis tool (IIRC it was an early part of what would become the performance toolkit) to profile the startup process, and then you could use it to optimize the location of data loaded from HDDs to reduce the seeking required. Also, back when PATA was still in use, depending on your motherboard, I seem to remember making sure Windows wouldn't try to autodetect link speed on unused attachments, as that could take ages trying to find something that wasn't there.
I used the NT 5 betas for a while, and loved alerts not stealing focus. But that came back in the released W2K, and I remember being slightly annoyed by it.
It was anything but lightweight on a Pentium 90 or Pro, or whatever was common at the time. You really needed to upgrade to 16MB of RAM (lol), which was expensive at the time. That's why only businesses and not normal folks used it.
You are confusing 2k with NT 3.1. Win2k was not happy with anything less than 128 MB RAM.
edit: changed to 128 MB. It was XP that needed 256 MB to be any good.
Must have been NT4 I was thinking about, only used 3.51 at work.
It's a shame ReactOS never got mature enough to be a serious competitor. If it had modern app and HiDPI support but was stuck in a 2000-era UI and didn't have feature bloat, it could be a great daily driver.
ReactOS is not dead though. They just made a release.
And it has the 2000-era UI and the modern app support.
It's just dragging on other things, such as SMP and 64-bit. But development seems to actually be focused on precisely these two.
Can it run Firefox yet? The last time I tried reactOS, it BSODed just trying to launch Firefox... and don't even try using Windows drivers on it, it'll either hardlock, BSOD immediately, or during startup, regardless of whether you're in safe mode or not...
I have not tested it myself, but I hear that the latest release can run Firefox 52, whereas the previous one would, at most, run Firefox 48.
As someone who still runs Win2003 R2 (32-bit) on my desktop I can confirm. It was peak MS. The system is very stable, the UI (classic) is great. It is quick, snappy and good looking. For basic POSIX I have Cygwin. For other stuff I use VMs. I have all the tools needed to handle some maintenance (compilers, DDK, docs, ...).
But there is a problem: HW. The pool of old HW is shrinking, and one day I will not be able to run it anymore. I guess I will move to Linux. There are a few nice and lightweight distros...
For folks that pick Windows 2000 Server, why not Server 2003? Is it just because by then NT had XP out as the "Windows for Home Users" and people didn't use Server 2003 as much or were there changes about it folks hated for some reason? To me it always seemed to bring so many more features/capabilities without trashing the classic UI.
Remember that it also introduced Active Directory. I helped build out a global enterprise network that was consistent and supported the same way, with like a quarter million users and tbh, it pretty much worked flawlessly.
Of course that innocence was lost with Welchia and other issues, but Windows 2000 made the year 1999 feel like ancient history in 2001.
Sure, Windows 2000 was definitely great in a lot of ways, but Server 2003/R2 either extended all of those (e.g. greatly improving AD and its management) or added its own big firsts such as x64 support - all without really introducing any of the downfalls people hate the more modern versions for.
I'm just surprised that it feels like very little deep innovation in the OS world has happened since windows 2k. 3.11 brought networking in. 95 brought true multitasking to the masses and 2k brought multi-processing/multi-user (yes, NT3.1 had it, but 2k is where most normal users jumped in). And, yes, I know these things existed in other OSes out there but I think of these as the mass market kick offs for them. In general I just don't see anything but evolutionary improvements (and back-sliding) in the OS world beyond this time. I had really hoped that true cloud OSes would have become the norm by now (I really want my whole house connected as a seamless collection of stuff) or other major advances like killing filesystems (I think of these as backdoor undocumented APIs). Have we really figured out what an OS is supposed to be or are we just stuck in a rut?
[edit] 3.1 should have been Windows for Workgroups 3.11
"Normal users" did not jump into Windows 2000 Workstation. That was still an 'enterprise only' OS. Normal users either suffered with WinMe shipping on their desktop computer or jumped from 98SE to XP, given their computer could handle it (aka they bought a new computer).
I think the major change has been that computers are very stable and secure these days. It's night and day compared to the 2000s.
There's a lot working against fundamental change of PC desktop OSes that corporations use, therefore OSes that Microsoft can make money from.
- Big software vendors (Autodesk, Adobe, etc.) making it difficult for Microsoft to deprecate or evolve APIs and/or design approaches to the OS.
- Cybersecurity/IT security groups strongly discouraging anything new as potentially dangerous (which is not incorrect).
- Non-tech people generally not caring about desktop PCs anymore - phones have that crown now.
- Non-tech people caring much more about interface than the actual underpinnings that make things work.
Outside of the PC there's some innovation happening, at least with the OS itself and not user interfaces. Check out Fuchsia sometime.
> 2k brought multi-processing/multi-user
Sorta. It was a real pain in the ass to run 2000 as a regular (non-administrator) user. Assuming your software worked at all that way, as even Office 2000 had some issues. UAC was necessary.
It required attention to detail, from a sysadmin / desktop admin perspective, but it was definitely possible and paid dividends in users being unable to completely destroy machines like they could on the DOS-based Windows versions. I put out a ton of Windows NT Workstation 4.0 and Windows 2000 Pro w/ least-privilege users. It was so convenient to be able to blow away a user's profile and start w/ a clean slate, for the user, w/o having to reload the machine.
Yes, I ran that way on principle, and you could mostly make it work. But not really OOB. Registry ACL templates etc. qualify as a real PITA.
UAC and the other magic on Vista/7 mollified that by a lot.
Looks like there is some negative feelings towards this comment. So if we aren't in a rut, what are the big revolutionary OS advancements that have happened since this time?
This is a forum populated almost entirely by people whose day-to-day existence depends upon building the new stuff that sucks :) (mine too!)
Android (all apps sandboxed). Desktop OSes are still barely catching up to this one.
Desktops have been in a rut for a decade. Windows has sucked post Win7 in ways that are either conspiracy or the most stinging indictment of managerial incompetence possible. OSX is good except its key bindings are alien, the hardware is closed, Apple hasn't really improved it at all in ten years, and it has loads of inconsistencies with the Linux CLI. Linux has been in a huge rewrite of the desktop and graphics stack for no real end-user benefit, flubbing the opportunity to gain ground on Windows while it tried to commit market-share suicide.
3d compositing, ssds, mega displays, massive multi core, all completely wasted.
You know what I should be able to do? Hot execute windows and Linux and Osx on the same desktop without containerization that leaves 3d as an afterthought or worse a never thought.
Virtualization. FDE. Hot patching. Io Ring (io_uring), etc.
Virtualization is old. I used VMWare on Windows 2000.
Not used in the same way, see VBS.
I know VMWare workstation is not the same, but there were VBS-like systems back in the 1970's.
IBM pioneered hardware virtualization for isolation on their System/370 mainframes with VM/370 in 1972.
VMWare ESX hypervisor brought virtualization to x86 servers (2001).
Xen hypervisor introduced open-source virtualization for x86 (2003).
By the time VBS showed up, the concepts were already several decades old.
VBS isn't even the same idea as a full blown VM in a type 1 or 2 hypervisor. I'm not sure why you're attempting to make the comparison.
I am talking about core concepts, not specific implementation details.
VBS uses a hypervisor and hardware virtualization to isolate processes for security. These concepts trace back to systems like the IBM VM/370.
Conceptually it's essentially the same thing. The difference with VBS is the scope and purpose of what's being virtualized.
Windows 2k already had an io_uring equivalent. That's more of an example of Linux being out of date due to being based off UNIX.
IOCP, which originated in NT 3.1, is not a circular buffer like io_uring, but both are completion-oriented.
Microsoft introduced I/O Rings, more or less a 1:1 copy of io_uring, in Windows 21H1.
https://learn.microsoft.com/en-us/windows/win32/api/ioringap...
https://windows-internals.com/i-o-rings-when-one-i-o-operati...
https://windows-internals.com/ioring-vs-io_uring-a-compariso...
Windows 8/2012 R2 did introduce Registered I/O for WinSock which is very similar to I/O Rings and io_uring.
https://learn.microsoft.com/en-us/previous-versions/windows/...
Where? Completion ports are not an io_uring equivalent.
Win11 does have something similar: https://learn.microsoft.com/en-us/windows/win32/api/ioringap...
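For readers who haven't touched either API, here is a minimal sketch of the ring-based, completion-oriented flow being compared above, on the Linux side with liburing (assumed installed; error handling trimmed). The contrast with IOCP is only the shared submission/completion rings; both models hand you completions rather than readiness, and the ioringapi.h functions linked above follow a similar submit-then-reap pattern.
    /* Minimal liburing sketch (assumption: liburing is installed; link with -luring). */
    #include <fcntl.h>
    #include <stdio.h>
    #include <unistd.h>
    #include <liburing.h>

    int main(void)
    {
        struct io_uring ring;
        if (io_uring_queue_init(8, &ring, 0) < 0)           /* create SQ/CQ rings */
            return 1;

        int fd = open("/etc/hostname", O_RDONLY);
        if (fd < 0)
            return 1;

        char buf[256];
        struct io_uring_sqe *sqe = io_uring_get_sqe(&ring); /* grab a submission entry */
        io_uring_prep_read(sqe, fd, buf, sizeof(buf), 0);   /* describe the read */
        io_uring_submit(&ring);                             /* hand the SQE to the kernel */

        struct io_uring_cqe *cqe;
        io_uring_wait_cqe(&ring, &cqe);                     /* block until a completion arrives */
        printf("read completed, result=%d\n", cqe->res);
        io_uring_cqe_seen(&ring, cqe);                      /* mark the CQE as consumed */

        close(fd);
        io_uring_queue_exit(&ring);
        return 0;
    }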
Windows moved everything to SharePoint now, so documents are stored "somewhere" and can be edited by many users, which often causes strange bugs.
Also a big degradation is the whole "hidden" %appdata% folder that grows and grows in size with no tools to deal with it.
Isn't Sharepoint like an enterprise management tool? I've never interacted with it once.
As for appdata... There's many faults to find in modern Windows changes, but I'm not willing to pin this on MS. Microsoft stuff tends to use %appdata% fairly sensibly, in most cases. On the other hand, the behavior of third-party developers has been really frustrating. What was initially intended as a universal storage location for some program data has become some kind of program container. Now, whenever you download some giant 300-500mb Electron app or whatever, you can be sure that it will force its entirety into appdata with no way to change the location. Every one of these developers has decided that their program is so valuable and Important that it's inconceivable that the user might want to install it on anything but the system drive. No, our program is unique and deserves nothing but the best!
Definitely stuck. We found a pretty strong optimum that no one has been willing to venture outside, strong enough to keep selling and that seems to be all that matters these days.
It was during an era when there was actual competition over operating systems. OS/2 definitely pushed Microsoft hard. BeOS woke everyone up even if it wasn't on popular hardware. Bell Labs was still experimenting with Plan 9. There were several commercial Unix vendors.
Monopolies. They ruin markets. Destroy products. Reduce wages. Induce nostalgia.
I am 50/50 on this particular argument for why OSes are in a rut. I think there is actual competition in the form of the various distros out there, and they have passionate advocates with real skill trying new things, but they don't really take off in a more mainstream way and they rarely feel revolutionary. I think this is more of a track gauge problem. It is hard to provide a truly novel OS without building all the infra around it so that people can actually use it. That takes resources at a scale that few can muster. What if I wanted to build something that kills off the idea of a file system? All the apps out there are written at their very core with this concept in them, so even if it is a better idea, it is incredibly hard to bring it to market, and only huge companies can do it. At the same time, though, the huge companies are pushing their versions of things, which makes it hard to compete and have even small innovation in the space. I don't have a solution here. I had hoped things like WebAssembly would have led to an OS breakaway by now, but it hasn't really happened. Maybe it still will.
The desktop UX was good, but nearly every network service on Windows 2000 had a critical vulnerability at some point. The Code Red worm (MS01-033) comes to mind as particularly impactful. This was the golden age of stack smashing.
In the 90s, Windows was simple enough that I was able to read tech articles and understand a lot of what was going on inside, up to the point of Windows 2000, and to a certain extent, Windows XP. That completely changed with Vista/7, where I can no longer recognize the names of many processes that are running or understand what actions/situations make my computer lag.
Nowadays, even though I don't worry anymore as Windows 11 is happy as long as you give it a quad-core CPU, RAM, and an SSD, sometimes I still wonder why it writes 40GB to the SSD every day.
I think I am not the only one who memorized this: FCKGW-RHQQ2-YXRKT-8TG6W-2B7Q8
Raises hand: I always remembered the first series of numbers as f*ck GW (as in Bush).
I always heard it as FuCK GateWay (the PC maker)
That's XP though, not Windows 2000...
I’ve been running Windows 2000 Pro on my main machine from, I think, the second beta until it was completely dropped from support.
It ran all of my games, stable as hell, quite light with none of the bullshit added later, none of the graphical bullshit added by XP but still classic Chicago.
The only things that could have made it better were the UI rendering engine introduced with Vista and its enhanced driver and security model.
I see a vast majority of comments here agreeing that UIs were significantly better and faster twenty years ago or more. Assuming HN is representative for the software community, how is it that slow, inferior and dumbed-down interfaces have prevailed in the end? And this hasn't been happening just to popular consumer products.
The answer is, HN is not representative.
There will be a much, much higher prevalence of computer enthusiasts on this board, not just people looking for a paycheck
That makes sense.
Webification and phones.
You make more money selling software for phones and it is cheaper to use one stack to build all so you build things for the web first.
I don't know whether Windows is for corporate desktops, enterprise servers, PC manufacturers, Azure, home users or advertisers. It certainly doesn't feel like it is the right product for me anymore.
I started my Windows Server career around 2003R2, so can't comment on the peakness of 2K.
2008R2 snowballed its own "revolution" when it introduced PowerShell 2.0, which was pivotal for many future things to come.
Out of the more modern ones, 2012R2 was "peak".
I would guess we will still see 2012R2 installations well into the future, still running bits and pieces of critical infra even though it was EOLed long ago, but that's the way of the Server I guess. Can't wait for the 5000+ day uptime screenshots of ancient 2012R2's.
My guess is the next "long LTS" will be Win2025. Just because it's introducing the NTLMv2 deprecation path and working solutions to replace it.
People just like the Windows they used when they were younger. It's the same with movies, cars, whatever.
Governments, diseases, weather.
If MS stripped *ALL* ads and bloatware (telemetry for calc??) out of Win 11 and restored the traditional UI of start menu + desktop, it would be fairly good overall. Certainly within their top 5. They really are close to peak yet again but don't seem to realize they keep striving to make it worse.
11 is mostly a solution looking for a problem. I don’t do windows day to day anymore, but the folks I work with who do aren’t excited anymore.
"A computer on every desktop and Windows on every computer" was the company's goal. Ever since they claimed success on that one (glossing over the ubiquity of Linux everywhere else) they've been sorta directionless.
Are we just doing OSes or are we doing the entire conglomerate?
#1 Windows 7
#2 DOS 5.0
#3 Office 2003
#4 Windows 95
Honorable mentions: IntelliMouse Optical and XBOX (2001)
TBF, XP and 7 are both decent. Everything went downhill after those, including the ads, the updates, etc.
I didn't upgrade to 10 until I purchased a used Dell laptop (which included 10 Pro) a few years ago, and I never used 11 and hopefully never need to.
I love 2000 and XP but 7 has a special spot for me because it’s a “modern” Windows (supporting proper alpha blending in its theme drawing and such) without the various problems that 8 and newer bring. I have an old laptop with it installed and booting it up is honestly refreshing. Its visual style is a little dated feeling but not that much.
I like it for the same reasons. I just wish it supported high DPI. It, and Snow Leopard to Mountain Lion era OSX, at high res would be peak desktop usability.
I believe XP was when Windows Activation started, so that's a pretty big negative for me. Other than that, XP, 7 and 10 were pretty good, although 10 introduced advertisements if I'm not mistaken.
XP also inexplicably required at least twice the RAM of 2000. When XP came out that was a significant cost, and I personally was able to salvage many laptops at the time by downgrading them from XP. Eventually XP became the default for me because RAM got a lot cheaper and the service packs and driver support made it more viable.
But then, tangentially, I started using ubuntu at work, in a sort of misguided belief it would make me a better sysadmin, and it was only a matter of time before I couldn’t stand windows at home as well.
I thought Win7 was pretty solid, though I didn’t upgrade until well after Win8 was shipping. But lucky for me, Proton finally got really good, and that allowed me to basically skip Win10+. Now it’s only for the rare tool that I even boot into my Windows partitions anymore. When I do, being bombarded by random attention grabbers is completely jarring and I want to flee as fast as I can.
If you think 11 is bad, I bet 12 will be even worse. When 10 is unsupported and 12 is out, you will probably be reaching for 11 by then...
I'm already moving into Linux for one of my laptops. If the drivers and desktop experiences are good enough (or bad enough in Windows) I might move 100% to Linux in a few years.
I made the jump a few years ago and the experience has been largely great. Lots of learning, which has been half the fun, and no goddamn ads in my start menu.
Totally usable as a daily driver, provided you don't need Windows only software. The year of linux on the desktop was probably about 2020.
Steam's proton has made gaming on linux astoundingly good. The only thing that still needs improvement is mod support, as mod managers, game downgraders, bin patchers, some more involved mods involve little utilities written for windows that are not easily runnable on linux.
It is slowly improving though. The steam deck has moved things forward in leaps and bounds.
It seems like basically all the games I play aren't supported on this unfortunately and it feels like they never will be.
You do competitive games? Those have so far been Proton's biggest weakness, but that's usually less of a technical limitation and more of a company decision, which given enough pressure can be changed.
All I play are online multiplayer games so yeah.
I play Battlefield 2042, Call of Duty Warzone, Apex Legends, PUBG, Rainbow 6 Siege, and Fortnite all somewhat regularly and none of these as far as I know work.
The only games that I do play regularly that work are Counter Strike 2 and DotA. Though I can't use Faceit for CS2 which would be ideal.
Ah, see as a guy in my 40's, my reflexes just aren't what they used to be. I used to be lethal in my teens, but these days I tend to mainly play single player or coop games. If you play games that require kernel level anti-cheat, yeah those will probably never be supported on linux.
Force override it in the game settings. Works on 99% of games. Probably not multiplayer games with anti-cheat, because if you're not using a software chain fully validated by Microsoft then you're cheating.
Even then, a VM can get you really far.
If you need direct hardware access (like for gaming) then you can run a passthrough VM. You can do that even on a single video card system.
> You can do that even on a single video card system.
Like with consumer video cards? Tell me more.
With consumer video cards: https://github.com/martinopiaggi/Single-GPU-Passthrough-for-...
I don't believe you have to VBIOS patch anymore
Fair enough! I am probably just projecting my own probable fate haha.
If you intend to stick with Windows for the long haul, you will have to upgrade eventually. I hung on to 7 for a while, but several apps stopped getting updates: iTunes, the Spotify desktop client, Google Chrome, and even Firefox dropped support. I was using iTunes to download podcasts, which after a while became impossible with some feeds because I would get an SSL error each time on that old version. For 10, the ESU period ends one year after 10/14/25 for consumers and three years for organizations. It's possible that apps will continue to receive updates during that time.
Thanks, yeah, I figured. Maybe I can move to Linux in 5 years. I'm already using Linux for my dev laptop.
Come try out Fedora, or whatever flavor of Linux you want.
It's surprisingly fantastic for almost all modern computing tasks. Yes, it's true, some software won't work, such as Adobe Photoshop, but most people aren't using software like that anyway. For gaming, I'd say we're close to 99% of games supporting Linux out of the box on Steam. The few that still don't either choose not to via kernel-level anti-cheat or simply forget to toggle the checkbox for Linux support (EasyAntiCheat and friends).
The point is, it "Just Works" for darn near everything these days and is a very pleasant experience. Try it out!
The best Linux I have ever seen is Linux Mint. I tried it out because I needed to do something with firewire, but all of the other Linux kernels had dropped firewire, and it was the only one left that still supported it. I found it to be intuitive and friendly and everything just worked.
Mint leans towards the "ultra-stable" side of Linux Distros. Fedora leans towards the "bleeding-edge". Both are great in their own ways. If you want the latest and greatest of everything, Fedora is a great pick. If you just want long-term stability, Mint is a great pick. With both, you can choose the Desktop Environment you prefer (I like KDE personally, but many like Gnome, MATE, Cinnamon, etc).
That's not to say Fedora is unstable - it's just that it iterates fast to keep pace with packages as they release new versions. There's a new major Fedora release every year, for example.
There really isn't a wrong choice here.
Eh, this is going to sound like I'm a stick in the mud, but I've tried Linux about a dozen times now, and every time has eventually led to 'a Linux evening' that disenchants me from the fantasy and back to reality. It's fantastic as a server OS, however.
Try it again if you haven't recently. I'm unsure what specific issues you encountered, but anecdotally I can say I've been driving Fedora full-time on my home workstation for nearly 2 years now. I love it. I drove Fedora full-time on my laptop off-and-on for nearly a decade as well before that.
For me, gaming was what kept me away. But, besides a few titles, it's been a non-issue. It was very pleasantly surprising.
My desktop runs Fedora Kinoite[1] - an immutable version of Fedora. It poses a set of unique challenges for a development workstation (my primary use), but has resulted in rock-solid stability through several major OS upgrades, and a lot of development-related hackery.
I don't see myself going back to Windows anytime in the future. Every time I'm at the office an on my Win11 machine, I remember why I switched in the first place. Just my experience though.
[1] https://fedoraproject.org/atomic-desktops/kinoite/
Often Linux is great, until you update some esoteric dependency that breaks a bunch of stuff, and fixing it is just a little past your experience level…
That's the best part of the immutable versions: containers by default, so weird dependency interactions are minimized, the system is stable and has good rollbacks in case something does go wrong, and updates are more or less invisible.
Takes some getting used to, but has really been a smooth experience
Windows 10 LTSC IOT has all the bloat and spyware stripped out and will get security updates for years. It's super lean.
Will third-party apps keep installing updates? Hard to say. The Adobe suite already refuses to install the latest version on any LTSC (for no reason other than they don't want to support it - it works great), so who knows.
I suspect my next OS will be Windows 12 LTSC if I can hold out long enough - every other Windows version always seems to be experimental crap going all the way back to ME (Millennium Edition).
I tell customers that they should use LTSC for things like virtual desktops. You need stability, such as it not randomly deciding to install a 4 GB game like Minecraft for every user as a “critical update”.
Microsoft joined a meeting and told the customer that they don’t agree with my recommendations because they want to make sure all users get the “latest experiences”.
There’s your problem right there: pushing your own KPIs instead of what’s best for the customers.
Windows 7 was my all-time favorite. I remember you could not use it straight out of the box, there was a whole bunch of UI tweaks that I would make right away. After that, it was perfect.
For me it's windows 7, if nothing else for being the first and last major Windows where Universal Search worked well.
The Windows 8.x line gets some credit for having the strongest pen interface integration, which regressed significantly in the 10 line, but the overall shell in Windows 8 was rough, and a lot of features were broken in the rushed out and mostly failed attempt to Appify windows and redesign much of the UI at the same time.
The point of an OS is to be out of the way; W2K was both the best and the last Windows to do so.
The way I see it (and similarly with browsers now) is that the OS is a venue providing a stage for others to perform on; it provides the facilities so every act doesn't need to build its own venue. Most of the time people don't visit/use a venue for its own sake.
Let's be real, Windows Server hasn't changed much since W2K... they may have slapped the 7/Vista UI onto it, but at its core, nothing has changed.
It still operates just fine for AD, DHCP, DNS, SMB, etc etc... the only thing they could drop without the majority freaking out is IIS.
They've since slapped the Windows 8 UI onto it, after that the Windows 10 UI, and lastly the Windows 11 UI. The reason should be obvious.
Didn't expect to see this - but after Windows 95/98, I went to Windows 2000 for a long time, and didn't switch back to Windows until 10/11. After Win2k, I went to Linux because I wasn't a fan of XP/7. (I know this is an unpopular take.)
The glasses are rose-tinted. There were a number of little bits missing from Windows 2000 that were helpful to have in XP, and you could change the theme to make it look just like Windows 2000.
And I really don't know how Windows 7/Server 2008 R2 doesn't win this battle.
Context. We upgraded from Windows 98 to Windows 2000. That was a major upgrade. First stable NT platform that we could use for everything, including games.
To be fair, driver support sucked ass for NT3/4, and I don't think 4 even had DirectX support...
For all I know, windows server 2025 is amazing, but have you priced it out? There's no way to justify it.
I actually like Windows despite their aversion to committing to a UI redesign but do I really need to pay $1100 (per core!?) for the hope (but not the promise) of no ads?
Microsoft stopped being the innovator. Now they duplicate or appropriate the innovations of others.
I only used Windows at work and I was very happy with NT, when XP came out I was able to go to Linux (RHEL) for my workstation at work.
I never had Windows 2000, but lots of people said it worked great compared to the other Windows systems.
But really for me, the best M/S setup was DOS with Desqview.
I wish Desqview supported higher text modes than 80x25.
There are software and scripts to decrapify Windows 11. After uninstalling and stopping everything that's not needed and making start menu and the bar behave like in Windows 7, it's quite decent.
This adds maybe 20 more minutes to install time, but it's worth it.
Unfortunately all that crap eventually comes back. Microsoft likes to reset settings… I’m pretty sure I must’ve spent the majority of my youth setting the same explorer settings over and over and over again … And it never ends with any custom setup you do; given enough time it reverts.
I don't think anyone doubts that you can do this. It's more that I refuse to pay for an OS which needs to be de-crapped in the first place. If Microsoft can't make something which prioritizes my needs above their corporate metrics, then they don't get my money.
LTSC is likely what you want then (needs to be purchased through a VAR but it's not hard to find a smaller one that will sell single copies)
Can you just expand on the significance of LTSC for a personal user ? MS says it's for "Medical Devices, Kiosks ..." but I presume the reason the you mention it is that it's a version of Windows 10 that is expected to receive security updates for X years into the future ?
Is it also de-crappified ? No games, requests for Microsoft accounts etc ?
My company has access to these licenses to resell through our distributor Pax8. Contact me (profile) if interested.
Okay, I mostly use Windows 7 Professional (with 100+ "updates") for general-purpose use and software development, but for such usage and/or a web server, what should I get now? Windows 2xxx?
That's kinda mean. Surely Windows ME was peak Microsoft.
This brings back a lot of nostalgia and I wholeheartedly agree. Back then I ran Windows 2000 server beta 2 on a dual proc system with P2-300s. It was rock solid.
Hard agree. 2000 was capable and never felt bloated.
It felt solid.
Microsoft never should have dropped Xenix to invent its own OS.
One cool thing Microsoft did with Windows NT was the whole local security model and a filesystem that supported it (NTFS), which was definitely richer than UNIX. I don't really know if other UNIXes at the time had anything more than the 16-bit uid and gid and mode bits on everything in the filesystem. I wonder how it would have looked if Microsoft had kept Xenix as the base and added ACLs on top of it, for example.
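To make that contrast concrete, here is a rough, hedged Win32 sketch (mine, not anything from NT's sources) that just counts the ACEs in a file's DACL; the UNIX-side equivalent is the fixed uid/gid/mode-bits triplet from stat(). The path is only an example.
    /* Hedged sketch: count the ACEs in a file's NTFS DACL.
     * Contrast: POSIX stat() exposes only a fixed owner/group/mode-bits triplet.
     * Windows-only; link against advapi32. The path below is just an example. */
    #include <windows.h>
    #include <aclapi.h>
    #include <stdio.h>

    int main(void)
    {
        PACL dacl = NULL;
        PSECURITY_DESCRIPTOR sd = NULL;

        DWORD err = GetNamedSecurityInfoA("C:\\Windows\\notepad.exe", SE_FILE_OBJECT,
                                          DACL_SECURITY_INFORMATION,
                                          NULL, NULL, &dacl, NULL, &sd);
        if (err != ERROR_SUCCESS) {
            fprintf(stderr, "GetNamedSecurityInfo failed: %lu\n", err);
            return 1;
        }

        ACL_SIZE_INFORMATION info;
        if (dacl != NULL && GetAclInformation(dacl, &info, sizeof(info), AclSizeInformation)) {
            /* Each ACE can grant or deny specific rights to any SID,
             * so the list can be arbitrarily long and fine-grained. */
            printf("DACL contains %lu ACEs\n", info.AceCount);
        } else {
            printf("NULL DACL (everyone gets full access)\n");
        }

        LocalFree(sd);
        return 0;
    }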
Microsoft bought Dave Cutler so that he could reinvent his OS (VMS), which became the NT line.
Why would anyone pay for Yet Another UNIX?
Plus Dave Cutler hated UNIX.
Well, Xenix was extremely popular since it was the cheapest option on x86. That said, I doubt Bill Gates would have hired Dave Cutler had he stuck to Xenix.
You wouldn't want to connect a fresh installation of Windows 2000 to the internet today. "Net Send" and default-on Administrative Shares are some brain-dead design decisions that made sense on a trusted LAN, but not the Internet.
I don’t know why they always alternate a good with a bad release. Technically Windows 12 should be good.
It feels like Windows 12 will be riddled with AI stuff nobody wants and ads, and forced to be online and connected to Microsoft in some way.
People always say that, but it’s not really been completely true.
< 3.1 Bad
3.1 Good
3.11 WfW Good
NT 3.5 Okay
95 Good
NT 4.0 Good
98 Good
Me Bad
2000 Good
XP Good
Vista Bad
7 Good
8 Bad
8.1 Okay
10 Good
11 Bad
There just really isn’t a pattern to it.
XP was the last that I really REALLY used. I've had Windows 7 (on my work machine that I didn't use) and I have a Windows 10 machine that I boot from time to time when I want to mess with recording gear. But I kinda fell into "they're all bad, I was just used to them".
I'll give my prime example. I used to know Device Manager/Control Panel SO well. I could just get things done. Now I have to hunt around forever to do any sort of hardware related task. In their attempt to make it "so easy, even your grandma could use it" they've alienated power users. My grandma still has to call me to help her attach a printer... but now I have to say, "I dunno... let me watch a YouTube video and pray that it matches the sub-version that you're using".
I don't know how good Windows 95 was in practice, but in our country where 99.9% of internet cafes didn't have licenses, or service pack updates (if they even had any for the 95 variant), it was a pretty easy Windows to DoS via the netbios vulnerability. https://en.wikipedia.org/wiki/WinNuke
>95 Good
That's arguable, I thought it was poor at the time.
On well-supported hardware, 95 was a major upgrade. The Start menu, long file names, preemptive multitasking, plug-and-play hardware, and DirectX gaming support. In many ways it even surpassed MacOS at the time.
Windows users have low expectations. I still have PTSD from all the problems 9x caused me.
Windows 3.0 was good. 3.1 was a minor improvement.
3.0 was ok but a bit rough around the edges and it crashed a lot.
3.1 was a substantial improvement in that regard. It also brought major features like TTF fonts, the registry, a usable file manager, audio and video support, and networking in the Workgroups version.
When Win10 started, it was clearly Bad. No good reason for updates, invasive privacy-breaking telemetry, updates at random moments of the day, and everything was a little different but nothing was better. People flat out refused to upgrade when it was given away for free. Microsoft had to force it through Windows Update, and did multiple rounds of breaking software people explicitly installed to block the upgrade.
When did it become good? WSL and DirectX 12 were real changes, but all in all, my impression is that the user has been frog boiled over the years, with 2K,XP and 7 becoming distant memories.
I remember that too. Microsoft was more aggressive and hostile towards the user than ever before.
The only 'bad' thing about Vista was its change of driver model (and thus the deprecation of many drivers). Once tweaked and with good native drivers, it was the first good 64-bit Windows - far more reliable than XP64. At least until 7 came out.
NT 3.51 Best
These are also mixing two separate streams: Win3.x/9x/ME and NT+
There is a pattern when you remove the versions few people used.
Win 11 and Vista have been unfairly maligned; with some minor tweaks (and Start11), both are solid, performant Windows releases.
Vista was indeed fine. I used it for many years and had nary a problem with it. The problem with 11 isn't the core (everyone seems to agree that is fine), it's that Microsoft insists on putting ads and other user-hostile BS in.
I basically skipped windows XP entirely, only seeing it on other people’s computers.
I stayed on a ThinkPad R31 with Win2k until I got an R61 (4GB RAM) with Vista on it several months after Vista's release. At that point it seemed like driver and other early teething issues had been worked out, so my experience was pretty positive.
When I eventually moved to win7 I didn’t notice any real difference.
I think the vista hate is well earned. Remember when Microsoft had to trick users into trying it by calling it 'Mojave' instead?
Also the unending and relentless UAC prompts.
It felt like malicious compliance. Oh, you want security? OK, here you go, hope you choke on it.
Windows 11 is the only version of Windows I’ve used where the taskbar routinely crashes on login and refuses to load.
Windows Vista was essentially unusable on release unless you had very high-end hardware.
A couple of weeks after release the first step after getting a new computer was changed from "downloading firefox" to "downgrade to windows xp". Unironically, many people did that.
And that unusability was mostly due to the driver model change; once native Vista drivers appeared it performed better than XP/XP64 unless you were running old video hardware that couldn't handle Aero - in which case you were still better off running Vista with the classic UI, although that did entail forgoing the Luna styling.
Even with native WDDM drivers it performed poorly in desktop graphics, because Vista also removed all GDI hardware acceleration support. This caused many 2D graphics operations to execute in software, or worse, an even slower mix of hardware and software rendering. Windows 7 improved on this by re-adding hardware acceleration for some GDI primitives and adding aperture windows to reduce DWM memory footprint.
they should’ve just skipped 11 like they skipped 9
The story I heard[1] was that Microsoft skipped 9 because people used to check for "Windows 9" prefix string to identify 95 and 98:
[1] https://www.reddit.com/r/technology/comments/2hwlrk/comment/...
It's a story all right, but that's all it is. Windows has a GetVersionEx function that fills a struct of major/minor/build, and they're all ints. That's how you've always checked for versions, with older versions checking against a single int that contained both major/minor.
Microsoft had no reason to support blatantly stupid development practices that no one ever actually did. They were trying to avoid brand confusion with the consumer, because even people who know about versions will still do a mental double take at seeing "Windows 9", expecting another digit. The confusion might not last long, but it still detracts from the brand.
> no one ever actually did.
There is an example further down that thread:
https://www.reddit.com/r/technology/comments/2hwlrk/comment/...
https://issues.jenkins-ci.org/secure/attachment/18777/Platfo...
I suppose Java didn't offer too many alternatives to checking the OS version the standard way, but I really have a hard time imagining MS bending over backwards to support that approach on that platform. There wasn't even a need to check for the "windows 9", the code was already checking for a Windows platform and would have worked the same without it. Avoiding confusion in the minds of the end users is still the most plausible explanation to me.
Microsoft has a history of trying to avoid breaking older software with new Windows releases. To do that, they definitely do need to account for how people are actually coding things in their software rather than just what they've documented as the way to do things.
The string check makes a lot of sense when you consider software written in languages like Java or Python rather than something that's coded directly against the OS APIs. In those cases you get back strings with the OS name, which of course is going to lead to many people just taking the simplest route of string matching.
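For what it's worth, the two approaches being argued about look roughly like this in C (a hedged sketch, not anyone's shipping code): the documented route compares integers filled in by GetVersionEx, while the problematic pattern matches a display-name string the way the Jenkins snippet linked above does.
    /* Hedged sketch of the two version checks discussed above. Windows-only.
     * Note: GetVersionEx is deprecated and, without a manifest, lies on 8.1+,
     * but it shows the struct-of-ints API that predates any string matching. */
    #include <windows.h>
    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        /* The sanctioned way: compare integers reported by the OS. */
        OSVERSIONINFOA vi;
        ZeroMemory(&vi, sizeof(vi));
        vi.dwOSVersionInfoSize = sizeof(vi);
        if (GetVersionExA(&vi)) {
            int is_9x = (vi.dwPlatformId == VER_PLATFORM_WIN32_WINDOWS); /* 95/98/Me line */
            printf("major=%lu minor=%lu build=%lu win9x=%d\n",
                   vi.dwMajorVersion, vi.dwMinorVersion, vi.dwBuildNumber, is_9x);
        }

        /* The anti-pattern: matching a display-name string. "os_name" is a
         * stand-in for whatever a runtime such as Java reports for the OS. */
        const char *os_name = "Windows 98";
        if (strncmp(os_name, "Windows 9", 9) == 0)
            puts("prefix check lumps this in with the 9x line");
        return 0;
    }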
Windows ME was peak Microsoft: buggy, glitchy, a haven for viruses. It was pure crap, and that best defined what Microsoft was and is: cancer.
it sure does
It's the Register and therefore too worthless to get worked up about, but their naming a server version of Windows as peak anything is an indication that they probably just polled a few drunks at a bar.