Oh, that. I wouldn't put too much stock in that. That's market share in terms of the "market" being a website's audience, and their numbers are derived from a subset of sites. That doesn't tell you how many computers are running a given OS, but how many web browsers reporting that OS are visiting a specific selection of sites.
If I had to guess, I'd say that 26% isn't actually "market share" but install base, or maybe some kind of adoption metric. I don't know what it means either.
So someone posted what you thought might be a bullshit number, and you responded by making up different numbers?
I'm not asking you to cite a source, no. I'm just wondering what market you're talking about, if that makes sense. Like Macs have (last I saw) somewhere around a 15% market share of the PC market, so 3% sounds lower than I'd expect. Now, my number might not be right, but I'm not sure if you're talking about a different thing.
Market share in terms of OS purchases? I'm not sure when you'd count a MacOS purchase. Do Windows upgrades count as purchases? Do Mac upgrades? Do free Linux installs count as purchases?
Do you see what I mean? I'm not looking for a citation, just a vague idea of what those numbers mean.
What I'm going to say may sound crazy, but this is honestly what I think Microsoft should do:
Open source Windows. Not every single thing, every single feature, but most of it. Make it so, if a home user wants to use Windows 10, they can just download and use a free version of it. If they really want, they could still make the licensing such that OEMs would need to pay a small licensing fee to include it bundled on a new system, saying that the open source license is for non-commercial use only. They may lose some purchases of Windows licenses from home users, but excluding OEMs, how many of those are there?
They can reap the rewards of open source software without losing much.
Among the things that they don't open source, put those features into Office 365 licensing. Basically, make it so you can get an open source version of Windows Home for free. Take some of the Pro/Enterprise features that individual users might want, and bundle them into the Office 365 individual licenses. Take the features that businesses might want, and bundle them into the Business and Enterprise plans.
Doing it this way would win them huge points with the FOSS crowd, simplify their licensing, and provide additional incentive for businesses to use their subscriptions (which is what they really want anyway). Plus, it just makes everything simpler.
You can say the idea is stupid, but they're trying to force everyone to use Windows 10, and they're saying they don't plan to come out with a Windows 11. Purchases of new Windows versions will only come through OEMs regardless. I honestly think they'd have a lot to gain, and not much to lose.
Why can't *nix seem to get past that one?
Part of the problem is the question of, "Which *nix?" There are a bazillion distros. If you're a developer, you don't want to develop for and test on all of them. It's one thing when it's open source and you can just write it and rely on someone else to fix it if it doesn't work on a given distro. Otherwise, you're going to want one platform to target.
I was hoping SteamOS would solve that problem. It had the potential to bring together the best thing about the PC gaming world (i.e. buy whatever hardware you want) with the best thing about the console world (a single platform optimized for gaming). I'm not sure why SteamOS hasn't come together. Is it just the chicken-and-egg problem (i.e. developers won't develop for it until there are enough users, and users won't use it until developers are making games for it)? Is it that Windows is easier to develop for? Or is Microsoft bribing and strong-arming developers into sticking with Windows?
I have no real insight here, as I'm not in the industry. However, I would switch to SteamOS in a heartbeat if my games would run on it.
The problem is that the PC desktop is a dead market; normal personal computing has gone to tablets and phones... For our workstations, we don't need a tablet OS or a server OS, but a workstation OS...
These two points don't really make sense together. There's no market for workstations, but we need an OS focused on workstations.
The problem isn't that Microsoft developed an OS UI aimed at tablets. They could easily have (and in fact *do* have) a UI that looks and behaves slightly differently on a tablet vs a workstation. The problem is that Microsoft intentionally tried to make their workstation OS look and work like a tablet OS in an attempt to railroad people into using their tablet OS. They did it because they weren't competing well in the tablet OS market, and they saw it as a way of leveraging their dominance in the workstation market to push for dominance in the tablet market. It didn't work, but it made their workstation UI much worse. They also forced that UI onto the server OS for some inexplicable reason.
If they need a workstation OS, that'd be easy enough for them to do. Take Windows 10 and strip out the junk. I'm not even sure why they don't do that, other than possibly that they're betting the farm on everyone buying Surface Pros, and they don't want their tablet UI to be too different from their workstation UI. Also, they want to force everyone to use the Windows Store so they can collect royalties. However, if you're willing to live with that kind of limited choice and lack of backward compatibility, I don't know why you wouldn't migrate to using a Mac.
These regressions seem to occur after updates.
One of the unfortunate things about Microsoft's new upgrade cycle is that the major releases are no longer obvious. Some updates are just patches, and many of the patches fall into a cumulative roll-up, sort of like a little service pack. And then a couple of times a year, they release a major update that's almost like a whole new version of the OS, with major functional and UI changes.
When those major releases come out, Microsoft resets a lot of system settings. Sometimes they introduce entirely new settings that didn't exist before, which may overlap with previous settings. For example, Windows 10 originally had an option to "defer updates". Later, they changed it so that you could switch your system to the CBB branch, which behaves similarly to deferring updates, and then in addition they introduced an option to defer updates by X days. I had my system set to defer updates, and after I manually installed an update, my system was set to neither use CBB nor defer updates. I don't know if that's how it was supposed to work, but it seemed like a wacky choice.
I guess part of my point here is that, in addition to introducing new options and ditching existing options without making an attempt to map the old options to new ones in order to preserve existing behavior, the change came in an update that you might not know was coming. Your computer might just decide to update one day, and then the behavior of your OS has changed significantly. That's not what I want out of an OS.
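For what it's worth, the CBB/deferral behavior described above maps onto documented "Windows Update for Business" policy values in the registry. Here's a rough sketch of setting them from an elevated Command Prompt; the key path and value names come from Microsoft's policy documentation, but which values are honored varies by Windows 10 release, so treat this as illustrative rather than a recipe:

```shell
:: Illustrative only -- run from an elevated Command Prompt.
:: Windows Update for Business policy values live under this key:
set WU_KEY=HKLM\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate

:: Opt into the deferred servicing branch
:: (32 = CBB / Semi-Annual Channel; 16 = CB / Semi-Annual Channel (Targeted))
reg add %WU_KEY% /v BranchReadinessLevel /t REG_DWORD /d 32 /f

:: Additionally defer feature updates by 30 days
reg add %WU_KEY% /v DeferFeatureUpdates /t REG_DWORD /d 1 /f
reg add %WU_KEY% /v DeferFeatureUpdatesPeriodInDays /t REG_DWORD /d 30 /f

:: Inspect what's currently set
reg query %WU_KEY%
```

Note that these policies only control what Windows Update offers automatically; manually installing an update bypasses them, which may be why my system ended up in the state it did.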
The real shame of it is, they know how to fix it all. They could just turn the crap off. Stop trying to force people to use Cortana, Edge, and Bing. Stop trying to force people to use live tiles, or Metro/Modern apps. Stop trying to force people to use Microsoft accounts (Outlook.com or Live or Passport or whatever). Stop forcing advertising into the system, or covertly taking metrics on usage.
They won't do it. They don't care whether users are happy. They know that the people using Windows are pretty much stuck using Windows, and will accept whatever abuse they decide to dish out.
Your straw man conception...
You made up a boogeyman that doesn't really exist...
Really, the precise role of any position within the IT department, and indeed the role of the IT department as a whole, depends a fair bit on the company you work for. I've seen companies where most employees were fairly technical, and basic helpdesk support was rare. I've seen companies where most employees were supposed to be extremely technical, but people still needed help logging into their computers on an almost daily basis. I've seen companies where the IT department is largely about development and server work, and I've seen companies where it's all desktop and VoIP support. I've seen companies that have one role (even if they have multiple instances of it) of "IT guy", where it's a never-ending barrage of different kinds of work covering anything remotely associated with computers, but I've also been in companies with highly specialized roles, including some where a guy sits around waiting in case some particular thing breaks.
There is only one unifying factor of all the IT positions I've seen: You will get shit on. People within the company will treat you like a servant. Management will treat you as a cost center with no productive value. When things are working well, the executives will be angry that they're spending money on your department because they assume you're not doing anything. When things aren't working well, the executives will be angry that they're spending money on your department because they assume you're ineffective.
I have yet to work in a place where IT personnel are treated as respectable hard-working skilled experts.
It isn't easy being the parent of a six-year-old. However, it's a pretty small price to pay for having somebody around the house who understands computers.