
Comment Re: Good decision? (Score 1) 352

Not every argument is about defending or attacking the Windows UI. This one is against your misconception that GPUs are sentient beings.

So you've still not read, or understood, the statement I repeated in my last post. A GPU doesn't do anything on its own. It needs a driver. Without a driver, you cannot find a GPU, of any class, that can draw a single triangle.

No fooling? With over 30 years of embedded dev. experience, I never would have thought of that! (rolls eyes)

But what I have been trying to get through everyone's collectively addled brains is this:

The excuse that Windows' "Modern UI" has to be "simple" because it has to work with a wider range of (Desktop-Class) GPU hardware is patently absurd, because Windows' software engineers (OS and driver devs) should be able to code an interface with as much "UI finesse" as what is available in OS X (which is undeniably more "advanced" than the Windows "Modern UI"), using any reasonable Desktop-Class GPU.

I do believe, however, that the main reason MS decided to make "Metro" so bog-simple (no "shading", no "textures", and no "overlapping windows") was because they wanted (which is way different than "had to") to come up with an interface that wouldn't tax the capabilities of Phone- and Tablet-Class GPU hardware.

IOW, whereas Apple wisely matched the UIs of OS X and iOS more closely to the TYPICAL class of devices they were running on (Desktop vs. Mobile), MS just "raced to the bottom" with "Metro", and forced all their Desktop users to suffer unnecessarily under a "Lowest-Common-Denominator" UI.

In short: Microsoft took the lazy way out, and then tried to pass it off as a "Unified" UI design.

Comment Re: Good decision? (Score 1) 352

OK, so

1. Microsoft made a decision - to use "primitive" graphics.

2. They have a business model where they need to support a wide variety of graphics chips.

You are saying 1 was surely not caused by the driver insanity resulting from 2. Based on what?

Jeebus! Are you just TRYING to be obtuse; or do you REALLY have a mental defect?

Based on the fact that you can't FIND a "Desktop-Class" GPU that couldn't do stuff like Apple is doing with Mission Control and Spaces, and that couldn't at least support two monitors.

And are you REALLY here to DEFEND Windows "Modern UI", or just argue against me?

Comment Re: What about other devices? (Score 1) 421

Exactly.

I am an embedded developer with over three decades of paid experience. Do these people really think someone like me (or me) doesn't realize that, technically, these devices could be considered "computing devices"?

But the less fanatical among us nerds (you know, the ones who don't have to prove they are "smart enough" to get Linux running on their toaster, just so they can say they did it, woohoo) realize that these are still, at the end of the day, Appliances with an embedded microcontroller, or System-on-Chip, inside.

So, with that in mind, is a device with a mask-programmed microcontroller a "computer"? You can't run arbitrary code on it. Isn't the microcontroller just another form of ASIC at that point? You can't install Linux on it, any more than you can install it on your cat. Yet inside that MCU it's the same CPU core, the same RAM, the same peripherals, running the same instruction set. The only difference is that it has been built with a last mask whose printed pattern causes the part to act as a particular state machine. But is it a "computer"? No, it is not.
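To make the "MCU as a fixed state machine" point concrete, here is a minimal sketch (Python, purely illustrative; the appliance, states, and events are hypothetical) of what a mask-programmed part effectively embodies. Once the transition table is frozen into the final mask, the device can only ever walk this graph; there is no way to load different behavior.

```python
# Illustrative sketch: a mask-programmed MCU behaves as a fixed state
# machine. States and events below are hypothetical (a toaster); on a
# real part, the equivalent table is burned into the last mask and can
# never be reprogrammed.

TRANSITIONS = {
    # (current_state, event) -> next_state
    ("idle",    "lever_down"): "heating",
    ("heating", "timer_done"): "popping",
    ("heating", "cancel"):     "idle",
    ("popping", "reset"):      "idle",
}

def step(state, event):
    """Advance one event; unknown events are ignored, just as a
    hard-wired controller ignores inputs it wasn't built to handle."""
    return TRANSITIONS.get((state, event), state)

# Walk the appliance through one full cycle.
state = "idle"
for event in ["lever_down", "timer_done", "reset"]:
    state = step(state, event)
print(state)  # prints "idle" -- the only behaviors are those in the table
```

That fixed table is the whole argument: the part computes, but you can't make it compute anything else, which is why it reads as an ASIC-like appliance rather than a "computer".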

So please quit trying to impress yourselves by declaring just any old thing that happens to have an MCU in it a "computer". Because in just a very few decades (when it will be even harder to find anything that isn't an Embedded System), people will simply look at you like you're daft, punks.

Comment Re:No vendor should be allowed to cram any kind of (Score 1) 421

... software down the user's throat.

I don't care if it's free or not. If it's annoying or unnecessary, I don't want to have to spend two hours to rid my newly bought computer of crapware I don't want.

Then buy a Mac. Not one whit of "crapware". Macs used to come with "trial versions" of MS Office; but I don't think that has been true since Apple developed the iWork suite. They had a "trial" version of that, too; but then started simply including the suite for "free" with new Macs.

Comment Re:Apple? (Score 1) 421

If OSX comes "free" with their hardware, but is also sold separately - or even just has a defined value separately - they will likely fall afoul of the law.

Both Mavericks (the current version) and the soon-to-be-released Yosemite version of OS X are distributed FREE; but they are licensed only for use on Apple-Approved hardware.

Just because you might be clever enough to download Cisco's Router firmware from a Cisco download site (read "App Store") into a D-Link router (and even make it work), does not magically transmogrify that Firmware into Libre "OS" code.

Comment Re:Apple? (Score 1) 421

They might have a better defence as the OS is free. If anything, where it might get interesting is that effectively you are buying the OS and it comes with a machine. Thus there might be a way to convince a judge that where Apple is going legally wrong is to insist that you use their machine.

That will never happen, unless you can make Nikon offer its DSLR cameras' OS ("Firmware") to Canon EOS Rebel owners, too.

When an "OS" is offered only for a particular manufacturer's products (which is undeniably the case for OS X, but not for Windows (or even Linux)), then it is more appropriately termed "Firmware", and should be considered simply one more BOM component, like the CPU or Memory.

The fact that Apple specifically states that OS X is licensed only for Apple-Approved hardware (and the fact that they are the authors and publishers of same), only strengthens their legal position.

Comment Re:How does MS get away with it in the US? (Score 1) 421

Can't wait until the tie-in between OS and hardware for Macs is shut down too. Being able to use MacOSX on any x86-compatible computer, or being able to buy a MacBook without the OS (and don't give me the "Apple gives away the OS!" crap; its value is baked into the hardware, it's just indirect...).

We wouldn't want any double standard here!

All Apple has to do is start calling OS X "Firmware", and that neatly sidesteps the whole issue. No court is going to say that a product can't be sold with Firmware written by the same manufacturer.

Think about it.

Comment Re:What about other devices? (Score 1) 421

It only applies if the OS and device are really two separate entities. For Macs you could argue that you should be able to buy the device without the OS. For phones, it seems that the OS is part of the device, especially in the case of iPhones (what else are you going to run on them?). Keep in mind that iOS isn't sold separately either, nor are there any charges for upgrades.

That's because the iPhone (which really should be called a computer) is locked down in firmware by the manufacturer to run only operating systems provided by them. If they disabled this blocking, then alternative operating systems could run on the iPhone. That happened in the past, when good hackers were able to work around Apple's attempts to dominate the user; but it has not been successful recently.

I argue that the iPhone and iPad really should not be called "computers" (unless you also want to call your microwave oven, TV set, A/V receiver, DSLR, DVD/BD player, VCR, etc. a "computer"), because there are simply no practical alternatives to the Firmware "OS" that completes the product design.

Even a "jailbroken" iPhone is still running iOS; otherwise it would be useless as a phone. All that "jailbreaks" do is provide a method whereby "unsigned" software packages can be "side-loaded" onto the iPhone. However, that unsigned software must still be developed in Xcode to run under iOS.

In the case of the Mac, you may have had a (weak) argument; but now that Apple distributes OS X for free, I'm pretty sure that the Italians won't be interested in going after Apple.

Comment Re:What about other devices? (Score 1) 421

Since computing is moving to tablets and phones, can we get OS refunds for iDevices and Android tablets and phones also ?

Also, is this applicable to Macs?

I wondered the same thing. But now that the most recent two versions (Mavericks and Yosemite) of OS X are free (and iOS has been free for quite a while), I don't think that the Italian gummint could force Apple to assign a "price" to that which they are distributing for free.

Comment Re: Good decision? (Score 1) 352

OK, that is fine, I'm not sure you read "But as a general statement it is completely not true that a competent GPU is capable of doing something on its own."

So if you didn't talk about GPU drivers, you are missing something important.

No. I just assumed that anyone reading and posting on Slashdot understood that, in both cases (Windows 8's whatever-replaced-Aero-Glass, and OS X's Quartz Compositor, or whatever it is called these days), the difference between using those APIs (and the lower-level drivers and hardware) to achieve either the Windows 8 "Modern UI" or OS X's window and desktop(s) management could not be attributed to "tightly controlled" hardware selection (in Apple's case) vs. "anything goes" hardware selection (leaving the Windows Approved Hardware List out of it for the moment). In each case, the actual underlying hardware and software was almost assuredly capable of presenting essentially the same complexity of display. The REAL difference is that Microsoft simply (and IMHO, wrongly) thinks their primitive UI is what "the people" actually want and need, even though "the people" have roundly rejected it, as evidenced by W8's frighteningly low adoption rate.

Comment Re: Good decision? (Score 1) 352

So obviously, it isn't the tightly-spec'ed hardware (since what Apple is doing could be handled by any competent GPU designed in this century)

Well, (both) the competent GPU makers suck so hard at driver writing that black holes have developed an inferiority complex. So on top of a competent GPU, one needs a competent driver writer, or at least to hold the GPU makers' hands while they write drivers for an OS, and/or test the drivers beyond imagination.

Apple is doing that well with their limited GPUs supported. But as a general statement it is completely not true that a competent GPU is capable of doing something on its own.

Actually, I wasn't talking about the GPU drivers or their authors (except to make the point that OS X's advanced use of the WIMP UI, as compared with Windows 8's, is not "rocket science", GPU- or GPU-driver-wise). Rather, the authors of the OS (as embodied by their employer, i.e. Microsoft or Apple) are the "Rocket Scientists" here. Because it is the OS that contains the functions and functionality enabling all that cool (and useful!) window and multiple-desktops management in OS X, which makes Windows 8 look and feel so downright antediluvian by comparison.

So, what I was saying to the GGP was that "This has NOTHING to do with 'Restricted Hardware' choices (because any number of readily-available hardware combinations could easily handle the task), and EVERYTHING to do with OS Designers and Developers, and their ability to solve the UI challenges in an elegant, easy-to-use way."

Comment Re: Good decision? (Score 1) 352

When you have a relatively small customer base and are highly restrictive about what hardware your OS will run on, you have a lot of freedom to be very VERY controlling of your environment.

Seriously?

Within a very large set of possible motherboards, video cards, etc., what possible bearing would the range of hardware an OS can run on have on whether that OS uses featureless, monochromatic "tiles" that look like they were designed by a six-year-old (but which are running on a GPU that can crank out 25 zillion individually shaded and textured polygons per second), and barely knows how to do an overlapping window, let alone multiple desktops? Compare that to a UI that actually looks like it was designed by someone who implemented easy-to-use features to compensate for systems with limited screen real estate, while taking full advantage of systems with multiple displays. (Yes, I am fully aware that other OSes have supported things like multiple desktops for some time; but this is about the Windows "Modern UI" vs. OS X.)

So obviously, it isn't the tightly-spec'ed hardware, since what Apple is doing could be handled by any competent GPU designed in this century (trackpad gestures notwithstanding). So maybe, just maybe, it is something else, eh?
