What specific problem are you talking about that has been "pushed"?
The fact that you're running Windows. Windows is still Windows, even in a VM, and has all of the same issues that Windows on hardware has.
Why do you consider running Windows to be a problem?
Isolating it in a VM keeps it away from things like your personal data, browsing information, etc.
Except for all of the other issues I've mentioned about this being annoying as fuck to deal with and not exactly easy for a layperson to set up in the first place.
If it's too hard for you then don't do it. Again, if you're not willing to resolve the problem then you can opt to just live with it. Or maybe you don't consider it a problem to begin with, in which case this is all irrelevant to you anyway. Why is that so difficult for you to understand?
No.. I'm just saying that there are costs to the alternatives as well.
Oh really? Solving a problem requires actually doing something? Who would have guessed?!
It's not just "MS evil everyone else good!"
I never said or implied that at all, you're just making things up.
You've only provided one solution -- "just use something else" without any regard to the potential downsides of switching.
Actually I provided multiple options:
>Run an alternative operating system exclusively.
>Run an alternative operating system with Windows in a VM for Windows-specific things.
>Run an alternative operating system with Windows as a dual-boot option for use cases where a VM is insufficient.
And these are only solutions if you consider the other two options to be a problem: running Windows 10 exclusively, or running older versions of Windows on newer hardware such that you don't get security updates.
Hell even with Linux you have to know and trust your distro a good amount before you can claim with any conviction that they aren't also doing similar things.
No, actually I can see what network traffic it is sending, and it obeys the hosts file. From my router I can see that Windows 10 doesn't obey the hosts file and sends encrypted packets out to Microsoft servers. So yes, I can pretty easily see that my Linux distro isn't sending off data to other people. Is telemetry data a concern for me? Not really, but if it is for you then you probably want to limit your use of Windows (possibly macOS too, but you can opt out of telemetry on macOS).
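To make the hosts-file point concrete, here is a minimal sketch of checking whether the system resolver actually honors a hosts-file entry. The hostname is illustrative, not a real telemetry endpoint, and this assumes the usual Linux setup where the resolver consults /etc/hosts first:

```python
# Minimal sketch: parse a hosts file and check whether the system resolver
# returns the pinned address for a given hostname. The hostname used in the
# example comment below is illustrative, not a real telemetry endpoint.
import socket

def hosts_file_entries(path="/etc/hosts"):
    """Parse hosts-file lines into a {hostname: ip} mapping."""
    entries = {}
    with open(path) as f:
        for line in f:
            line = line.split("#")[0].strip()  # strip comments and whitespace
            if not line:
                continue  # skip blank/comment-only lines
            parts = line.split()
            ip, names = parts[0], parts[1:]
            for name in names:
                entries[name] = ip
    return entries

def obeys_hosts_file(hostname, path="/etc/hosts"):
    """True if the resolver returns the address pinned in the hosts file;
    None if the hostname isn't pinned in the file at all."""
    pinned = hosts_file_entries(path).get(hostname)
    if pinned is None:
        return None
    return socket.gethostbyname(hostname) == pinned

# Example: after adding "0.0.0.0 telemetry.example.com" to /etc/hosts,
# obeys_hosts_file("telemetry.example.com") should be True on a system
# whose resolver consults the hosts file first (the usual Linux default).
```

Pair that with watching the router (or tcpdump) for traffic to addresses the hosts file is supposed to block, and you have the comparison described above.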
No, your advice is predicated on a blind hatred for MS without regard to the fact that the alternatives also have negatives and almost all companies are going to screw you if you let them, not just MS.
What "blind hatred of MS" are you talking about? Could you cite what exactly you think implies that? It seems you are confused.
What problem do you think I'm providing solutions for? And what is your proposed solution to the problem?
Linux and Mac are only viable if:
- You can stand their interfaces. Linux is configurable enough that it's probably OK, but macOS is a bloody nightmare to use when you're used to Windows.
- You can configure it. Applies mostly to Linux in order to deal with #1 since Apple's UI design motto is basically "do it our way or fuck you." This is not really an easy chore and requires some fairly strong computer skills if you want anything beyond the defaults.
If you'd rather suffer lack of security updates because you can't cope with any non-Windows GUI then - aside from being a pretty lame attempt at an excuse - you're only stuck because you choose to be.
- You don't require any software that runs only on Windows. Yeah VMs work but they start getting into the previous point of requiring computer skills. Plus they're typically a pain in the ass and always at least a little bit slower compared to running applications natively.
They're typically not a pain in the ass, in fact they're typically VERY VERY easy, and with modern virtualization they tend to offer near-native performance. Most people that have some program that only runs on Windows can cope with a minor performance hit as the privacy tradeoff for not running Windows 10 natively. So what specifically do you find so difficult about VMs?
Never mind if you're into games that don't have Mac ports (and Linux gaming is still barely worth talking about..)
If you really need those games then dual boot into Windows 10 just for games and nothing else.
- And even if you set up the VM, all you've done is push the problem from the hardware to the virtual hardware -- you're still running Windows on that VM and unless you're running a clean image every time you start the VM, you've got all of the same problems (and of course doing the clean image plan has its own massive problems in terms of convenience.)
What specific problem are you talking about that has been "pushed"? I specifically said a "sanitized VM": give it access only to the things it absolutely needs and do only the things you absolutely need Windows for. That eliminates almost all of the issues.
- And then forgetting all of that, you have to rely on your replacement OS to not be just as bad.
Yes, you do seem to be making every effort to come up with excuses as to why it's all just completely hopeless and everybody should just do what Microsoft says and dutifully upgrade to Windows 10. I'm providing solutions, you're making excuses. If you want to just knuckle under and do whatever Microsoft tells you then go ahead; I'm not trying to stop you. My advice is predicated on a willingness to go to even the slightest effort to mitigate some of the bad things in Windows.
..no, they just want to put a gun to everyone's heads and force them to use Windows 10. Really, they do.
Oh don't be so melodramatic.. We've had viable alternatives in macOS and Linux (which of course you also need to run newer versions of if you want to run these newer processor architectures) for many many years now and if you're only just realizing "oh maybe Microsoft doesn't have my best interests in mind" then that's your own fault. Even if you really really need to run that program and it's Windows only? Run it in a sanitized VM, again a solved problem for many many years.
I agree with you on the way the instability of the kernel ABI has an effect on binary drivers but even then there is a solution in the form of compiling a kernel module to load the binary driver - which is what nVidia (and others) do with their binary drivers. Of course doing that requires a compiler and kernel headers which Android may not provide and it might not be practical to do so.
Google could enforce all of this through their licensing of Android in the way they enforce having their apps installed with Google Play Services...but they don't and it's in the interest of OEMs to get people to buy a new device rather than maintain older ones especially when the margins aren't particularly large.
My point was only that the binary blob requirement has, for the most part, always been there whether it's Linux on the PC or Linux on the smartphone.
Not only has Android always required binary blobs, but Google routinely dragged its feet in releasing the latest updates to the Open Source "community".
Requiring binary blobs is pretty much the status quo in the PC market too, there are very few fully open source PCs and of course there are also very few fully open source phones. That isn't to say many people haven't tried but it's hard to develop and fund something that nobody wants.
There is a huge difference between an application using spyware and a whole operating system using spyware.
AC isn't talking about an application, he/she is talking about a company that collects tracking and analytical information on billions of webpages that you could visit on the web.
If you don't trust an application then you just don't use it and use something else that you do trust.
And you should do the same for your operating system. Linux and macOS have long been viable alternatives, and if you really need a program that only runs on Windows then you can run Windows in a VM, isolated from everything else. This has been a known solution for many years already.
And yet the numbers are clear that Edge is superior.
If you really think that then just run the tests on Chrome on the same system in both Windows and Linux and you can prove or disprove your hypothesis.
It wouldn't surprise me if MS has added code into Windows 10 to drain a battery faster if certain conditions are met.
If that really were the case it would be trivially easy to prove: Just run the same benchmark on Windows and Linux. No need to speculate.
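If someone actually wanted to run that test, the harness is trivial. A minimal sketch of the "same benchmark on both OSes" idea, where the workload below is just a CPU-bound stand-in for whatever benchmark you actually care about:

```python
# Minimal cross-OS benchmark sketch: run the identical task under Windows
# and under Linux on the same hardware, then compare the printed timings.
# The workload is a placeholder, not a real battery or browser benchmark.
import platform
import time

def benchmark(task, runs=5):
    """Return the best wall-clock time in seconds over several runs."""
    best = float("inf")
    for _ in range(runs):
        start = time.perf_counter()
        task()
        best = min(best, time.perf_counter() - start)
    return best

def workload():
    # Illustrative CPU-bound stand-in; swap in the real benchmark here.
    sum(i * i for i in range(1_000_000))

if __name__ == "__main__":
    # Run this script once booted into each OS and compare the numbers.
    print(platform.system(), f"{benchmark(workload):.4f}s")
```

Best-of-N is used rather than an average so that one-off background noise on either OS doesn't skew the comparison.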
I somehow suspect that Windows won't exactly give me the option to say 'no' to this update.
That's right, going forward they're trying to keep everybody up to date. It's much easier to address security and stability issues when the operating system is the same on all the systems, especially given the millions of different possible hardware configurations.
But this is hardly a new thing, my Macs don't forcibly update but they certainly constantly ping me to tell me I should install and prompt to allow auto updates, because keeping the OS up to date is generally a good thing. If you consider it an affront to your freedom or control then you wouldn't be using Windows anyway; instead you would use a Linux or BSD system and just use Windows in an isolated, sanitized VM if you need to do Windows-specific things.
Yes they did. It was the only way to hit their Thermal Budget (which was a very good thing!)
You may have a point there. Given the rate at which the fans ramp up just displaying the simplest WebGL pages, I would hate to see what happens if you increase the thermal budget.
I most certainly do. In this case, I think those users are actually WRONG. Difference between "perception" and "reality". Those users PERCEIVED that there was some great impediment to using their legacy peripherals with the new MBP, when in the vast majority of cases, there was not.
No, they didn't perceive some "great" impediment. The fact is that using any non-USB-C device on the new MBP is clumsier than on previous generations.
The fact is, there are many "peer" laptops released at the same or nearly same time that also had a max. RAM of 16 GB; but nobody seems to target THEM as being "limited RAM".
What's with this idiotic mentality? If you can point me to the "peer" laptop that runs OSX then certainly I'll call that out as having limited RAM.
There is nothing "relatively poor" about the performance of the AMD GPUs.
I can see this is upsetting you and rendering you completely unable to understand that there is more to the world than what Apple puts out. Comparatively, the performance of AMD GPUs relative to nVidia's available mobile GPUs is poor.
Apple made a design decision to support more/higher-res external displays at the expense of some gaming performance. Many more people use Macs for high-end graphics and monitor-heavy applications like video-editing than they do gaming.
I tried to explain this to you before but - as seems to be a theme with you here - you aren't reading what is written. This has nothing whatsoever to do with gaming performance; I have absolutely zero interest in doing any gaming at all on a Mac.
So, Apple gets "dinged" for putting "obsolete" hardware in their designs; but when they put in the latest (laptop) CPU, highest-level External-Display Support (in a laptop), and best I/O Ports for the next 5 years (that are inexpensively and relatively painlessly backward-compatible to most legacy ports), they get hammered for that, too???
Why are you taking it personally? It's a company that makes computers and you're getting all emotional as if my criticisms of the deficiencies of the product are criticisms of you personally.
So, what's a computer-designer to do? Look back, or look forward.
Or do both, like, you know, every other computer does. Like what Apple themselves have done. You really believe Apple can't figure out how to put ports of multiple types in a laptop? Did they suddenly forget how to do it? I'm pretty certain they've done it on almost every laptop they've built in the past.
I'll try to make this as clear as possible for you. If you read one thing in this post before you get all angry about it, then read this:
This is not a criticism of *you*, this is outlining the problems the latest Macbook Pro (and again, that is not you) has with respect to the use cases of *some* (not all, so again probably not you - and not necessarily gamers) of the professional market of OSX users. Please try to understand - as Apple (again not you) have done with the Mac Pro - that it doesn't serve everybody as well as it could.
I have friends that work in engineering at Apple that I have communicated this to and even *they* understand the feedback and even *they* don't take it personally like you do. Now obviously the secretive nature of Apple means they haven't provided any information on whether these issues will be remedied but they at least acknowledge the issues as issues and don't get all upset about it like you do.
But not QUAD CORE Kaby Lakes. Check again.
They didn't have to use Kaby Lake.
I happen to AGREE with the decision to use USB-C/TB3 on the new MBP. Seriously.
Good for you. It would seem that maybe the reason you're so defensive is that you missed the key point, which I said first and then reiterated: I am referring to ***some*** of the market, just like Craig Federighi said "it didn't well suit some of the people we wanted to reach". You understand what that means, right?
The limited RAM and relatively poor GPU performance mean that it, like the Mac Pro, doesn't serve some of their supposed target market well. As for the USB issue, it is a small issue, but Apple built a reputation on being sleek and efficient rather than having clumsy solutions, and they've certainly regressed in that respect.
Now if you like it and can suffer the performance issues and don't mind the clumsiness then great, that's good for you. The fact that it doesn't work well for me is also fine; I'm not sure why you have such a problem accepting that, but I hope they remedy it in the future just like they are doing with the Mac Pro.
RAM limits are Intel's fault. They haven't kept up with their own Timeline.
No, other laptops, including ones like the Razer Blade, come with 32GB of RAM.
Apple chose number and depth of displays over gaming performance with the GPU choice.
There are many things besides gaming that GPUs are used for; in fact, Apple are quite invested in projects like OpenCL. Regardless, I'm not fazed about why they made those decisions.
so it was a smart choice, IMHO.
And if they backflip like they have with the Mac Pro you'll agree with that too.
I simply don't know what you mean by "very little works with it". Very little worked with the USB Ports on the original iMac, too
You say "I don't know what you mean by" and then in the next sentence you use the exact same phrase in the exact same context. I didn't carry my iMac around, so adapters weren't a problem.
I don't see why you're so defensive about this, I'm pointing out the problems (and many many people have voiced the same complaints) that it has for my use case. The same thing happened with the Mac Pro and ultimately Apple have listened to their customers. It has shortcomings, you don't have to take it personally and I hope they resolve those.
"Our vision is to speed up time, eventually eliminating it." -- Alex Schure