No workstation has ever used OS X, because Apple has never produced a workstation-grade computer, and Apple does not allow its operating system to be used on workstations made by those who do.
Why would anyone assume they weren't using their search engine to promote their services? Why shouldn't they, for that matter? It seems like common sense for Google to do this and for users to expect this.
I became interested in wearables when I read Google's creative vision for the Android Wear platform:
Prior to reading that, I also didn't see much point in them. However, if Google can deliver the system they describe, I am absolutely on board. It's a short read, worth it if you want to know why people are into the Android Wear idea IMHO.
From what I've read about Apple's device, their approach seems less focused.
The Moto 360 could almost recharge in the time it takes to shower: 15-30 minutes, depending on where it starts from. I'm assuming the Apple thing will be similar. It's certainly not ideal, but it also isn't like you *have* to charge it overnight. For whatever reason I usually end up setting it on the charger while I get ready in the morning.
I doubt that this is being done at the insistence of Apple's marketing department, but I also doubt it was done without careful consideration of the potential impact on Apple Inc., and it would not be happening if the anticipated results were negative. For an officer of a public corporation, there might even be legal obligations not to damage the company with public announcements like this (IANAL, hell I am not even mildly interested in law, so that's just a guess, but it seems sort of right).
Most likely, any positive effects are happy coincidences. Surely Apple is very aware and willing to take advantage of them, but hard to believe they are the motivation behind the whole thing. Apple isn't exactly desperate for some marketing.
so... any manufacturer that makes a device to meet or exceed these parameters is selling a "Retina" display?
"iMac with Retina display"
What does this mean? Is Retina a technical term that should convey some specific meaning now?
It's easy to let rose-tinted glasses come into play when looking back at computing's past, and that's excusable to a certain degree, but your claims really push the limits of how far I can distort my direct memories of some of these events.
For one thing, the Mac was always a small, if not niche, player in the personal computing realm. DOS- or Windows-based PCs have outsold it by a large factor: from a minimum of about 6 times as many sold when the Mac was introduced in 1984, to over 50 times more in the mid-2000s, and about 20 times more Windows machines sold than Macs today.
This means Apple didn't introduce anything to the vast majority of computer users. Most computer users have never owned a Mac. You could argue that Mac's existence made Microsoft's GUI better indirectly, but there were GUIs before the Mac that made the Mac better too (some might say without Xerox's research the Mac would never have been possible), so that's a difficult thing to quantify and fairly give Apple alone much credit for. Ultimately, the role of actually introducing the masses to computing was played by Microsoft products, not Apple products.
Your fond recollection of the iPhone is over-enthusiastic as well. Apple sold its 500 millionth iPhone earlier this year. They won't make it to a billion until 2017 without some serious changes in trends. "Total market domination" is an "aggressive" way to describe a device that has never accounted for even 20% of the smartphone market.
If the startup made the same huge profit margins that Apple does, I don't see why doing any of these things would be a problem.
The real lesson is that you'll need your customers to pay *a lot* more than it costs to make something if you want to do silly, expensive things while making it.
I see a lot of "I can already do the same things on my phone" posts, but that doesn't close the book on smartwatches for me.
A phone can do everything a tablet can do. A laptop can do everything that a phone or tablet can do. A desktop can do everything a laptop can do. Go back to the start of microcomputers and there is the "the office mainframe can do everything a PC can do" argument.
All of the above platforms succeeded because they helped people do mostly the same things (or a subset of overlapping things) but added some advantage in portability, cost, ease of use, etc.
Google's official vision for Android Wear is interesting because it clearly explains where Google thinks the advantage for watches will be: timely notification of important information *without* interrupting what you are doing. If you're not understanding why you'd want to own a smartwatch, it is worth reading:
I wouldn't say the current crop of hardware and software delivers this vision all that well, but if this is the goal then I will remain interested. Apple's strategy seems less focused, more of just "hey, it's yet another thing like an iPod or an iPhone or an iPad, but tiny." Maybe that isn't accurate; I haven't spent as much time reading about their platform yet, so it's just a first impression.
I gave up on the Pebble long ago (found it mostly useless), but one thing I do remember is that I was constantly running out of battery while out and about. I've been using an LG G watch for quite a while now, and running out of battery has never been an issue.
It is counterintuitive, but the LG's ~36 hours of battery life actually works much more reliably for me than the Pebble's two days (when it goes all buggy) to 7-8 days (when it doesn't, and you mostly ignore it). The reason is that I *know* I charge the LG every night, just like my cell phone. I quickly got into the habit of setting the watch down on its charger as I put my wallet, phone, etc. on the nightstand. The LG is forgiving enough that if you do forget to charge it, you'll probably make it at least till noon the next day, but I almost never forget, for the simple reason that it's a short, consistent pattern.
With the Pebble, I didn't want to wear out the battery by partially discharging each day then charging back to full each night (TBH I don't know if its battery works that way, I just figure fewer charge cycles is probably "good" somehow.. shrug). So I tried to remember to check, pay attention to the low battery alert, whatever. It just didn't work. It was much more tedious and failure-prone than just knowing I set something in a certain place each night. Sure, I could have just said screw it and charged it every night regardless, but then it's only equivalent to the Android options at best.
yeah, except for the part where it actually works fine in sunlight.
I've been messing around with an LG G watch for several weeks now (part-time Android dev, just getting to know the platform). It lasts over 24 hours on a charge, often approaching 48 hours if I don't fool with it a whole lot and turn off the "always on" display. To me, this is quite reasonable. Set it on the charger when you take it off each night, but if you do forget, it will still work for a good part of the next day. FWIW the battery life on the Moto X is similar, about 36 hours, and being able to use the phone all morning even when I fail to charge the night before has been very, very handy. I suspect if you're a heavy watch user you would appreciate the "day and a half" range of battery life in a similar manner.
As for the 360... initial reports from early users said they were getting "almost two days," which would correlate with what I've seen on the LG G, and would make it perfectly reasonable in the battery life department IMHO. However, as the official reviews come out today, it seems they are reporting ~25% of the LG G's battery life, which is insanely awful... If true, that puts the 360 firmly in the useless category.
definitely won't be people
Since we can assume XP will never change once support is over, can't we then do new things to secure it that were impractical in the past?
Hard-coded file checks, read-only filesystems, out-of-band checks and so on... It wouldn't take much to install Linux on a USB key and have it check the local HDD, or even just overwrite the OS files at boot, and that's just the first idea that comes to mind. Maybe a BIOS that won't boot if any of the XP boot files are changed, etc. I'm not saying it's ideal, but it seems like a once-moving target is now static, so maybe that can be leveraged to create some safety, especially for the types of systems that are required to continue using XP (i.e. not consumer desktops).