"Suppose a stadium is holding an event; knowing how much traffic is making its way toward the arena might help the venue change its parking lot resources accordingly, he said."
Suppose you wanted to cut some carrots, but they were really thick. Wouldn't it be nice to keep a raised guillotine in your house for such occasions?
I really think we should have a dual focus, neither part of which involves Mars:
1. Permanent manned presence on the moon. It's ridiculous that we went there, poked around a little bit, and now we have folks saying we should basically forget about it and send manned missions to Mars instead. Huh? What a crazy waste of an opportunity to test out long-term space solutions nearby, where we can monitor things closely and have round-trip human travel.
2. Robotic exploration and mining of asteroids. It's silly to expect that every visit to every place in our solar system will have to involve people at all times. Asteroid mining may not be a way to get rich quick, but it's an ambitious and realistic goal that will further advance practical space exploration and travel.
Long-term, both of these tasks have to be privatized, just as airplane flight started primarily as a government/military activity and became commercial. Once we've worked on long-term human habitation beyond Earth orbit, and once we've been able to have industrial-scale machinery operating in conditions far from Earth, then it would make sense to leave the moon and asteroids to companies and move on to Mars.
I think a lot of the Mars-or-bust folks took away the completely wrong lesson from the moon missions. The moon missions were one of the most inspirational things mankind has ever done, true. But we only half did it. I can't believe anyone alive at the time expected that 40 years later we still wouldn't have a permanent presence on the moon. That idea had been floating around for a century before the moon missions, and was foremost in most of our minds.
But then, after poking around a bit and doing all kinds of circus tricks (driving a golf ball, speeding around in a rover, etc.), we abandoned the moon and set our sights somewhere else. The resulting drop in achievements and public inspiration (including the desire for funding and the desire to enter science careers) is NOT because we didn't have humans going somewhere humans have never been before, risking their lives; it is because we didn't even try to achieve the expected and exciting goal of a permanent presence. We can't look up at the moon and know that people are living and working there.
I keep hearing people talk about wanting (or worse, "needing") a stylus. Handheld devices have had styli for decades now. They sucked then and they suck now: they slow you down, they're easy to lose, etc. Phone keyboards? The same: phones have had them for decades. They sucked then and they suck now. That's why the iPhone was such a big hit: no stylus, no keyboard. Bigger screen? Something of an acquired taste for those who spend a lot of time away from a laptop/tablet/home.
I just don't see those things adding a lot of sales. Mostly, it's that Android phones are cheaper: Google gives the OS away so that it can spy more effectively on you -- Google makes all of its money from selling information about you, not from making products -- and what manufacturer or carrier doesn't like free and doesn't want to mess with the user experience? Apple has bravely fought the carriers, who screwed up cellphone interfaces for decades, but Google doesn't really care about that issue, since it doesn't actually market to end users.
So it boils down to price (cheap) and carrier control, and I'd say that affects worldwide sales a LOT more than people who want to step back two decades to physical keyboards or styli.
Sorry, left WoW a while ago and am really enjoying Guild Wars 2. GW2 isn't perfect by any means, but it looks like they did a lot of thinking about WoW's (and other MMOs') pluses and minuses, and they kept most of the pluses and rethought/fixed the minuses. Not sure why I'd step back to a game like WoW that was fundamentally flawed in so many ways.
Patent trolls don't make anything. They simply hold patents and sue others to make 100% of their profits. This keeps their overhead down, and ensures that they can't be counter-sued by their victims who might want to use their own patents to strike some kind of cross-licensing deal.
Apple isn't a patent troll.
Say what you will about Apple, but without them we'd still be the slaves of the cell carriers, who kept cellphones painful for decades. Nokia, Samsung, Motorola, et al, deserve a pox on their houses for going along with the carriers all that time. Finally, Apple stood up to the carriers and now people expect to get phones that work conveniently and do a lot. Google, by way of contrast, doesn't make products for consumers but rather makes Android for manufacturers and CARRIERS. And Google doesn't fight real patent trolls because Google doesn't make products: they make all of their money by spying on you, so there's nothing for patent trolls to gain injunctions against.
We haven't even gotten into the other kind of patent abuser (Samsung, Motorola -- now Google) who get their patents inserted into international standards which everyone is required to use. They promise to charge reasonable (small) fees and not use the patents against other companies, but those are the patents that are being thrown at Apple in an attempt to force Apple to cross-license its patents that no one is forced to use. Yeah, that's bullying.
Apple wants to dump MacOS and you know this how?
Developer tools have been free since MacOS X came out. Apple supports many open source projects. Apple is still porting new features from MacOS to iOS and vice-versa. In fact, it's Apple that has maintained differentiated laptop/desktop and handheld interfaces at a time when Microsoft, for example, has bet (again) on a unified interface. On the other side of the coin, the iPhone and iPad are very similar, but Apple also maintained a differentiation that Android did not initially do. (So Android tablet apps were simply upscaled Android phone apps.)
Seems to me that Apple has thus maintained a useful and meaningful differentiation between product form factors where others have blurred the lines. That would suggest that MacOS will in fact continue.
Also note that it's Google that has no actual products, but instead makes its profits entirely through gatekeeping. Apple actually makes products that it sells to end users, while Google's customers are advertisers and carriers, not end users. The likelihood of Apple switching to the Google model is pretty low.
Um, you do realize that Apple's patents were Apple's, and they don't have to license them to anyone. Motorola's patents were incorporated into a standard, so Motorola has to offer them at reasonable rates to anyone who wants to use them. Big difference.
Your statement about "rounded corners" is silly.
You do realize that "boots on the ground" carry weapons, call in artillery and air support, etc, right? And boots on the ground attract attacks by guerillas dressed as civilians, so ultimately result in higher-risk situations for actual civilians who make a mistake and appear threatening to someone who has a couple of seconds to decide whether they must respond with deadly force in order to protect their own lives.
To be honest, it sounds more like you think that drones are unfair, since the situation is asymmetrical: the high-tech trigger-puller's life is not in danger when they pull the trigger. That's a legitimate concern, though a boots-on-the-ground campaign has the same asymmetry in reverse. "Boots on the ground" will not restore chivalry, single combat, or even set-piece army-on-army battles to conflicts, nor will it clear the fog of war. Drones may be out of hand, but let's not pretend that the US is carpet bombing villages harboring guerrillas.
You're completely wrong. There are several dozen emails from Climategate that are disgusting. They do not show a group of scientists besieged, but a group of politicians trying to destroy those who disagree with them and working together to spin their research to their own advantage.
People like you have spun things they said. For example, there was the talk of the "trick". The spinners said, "Hey, a 'trick' is a techie name for a clever technique, not a 'trick' as in 'trick-or-treat'." Well, yes, people who saw the word "trick" and immediately assumed nefarious things were mistaken. But the spinners carefully avoided discussing what the clever technique (trick) was actually doing: it was mixing proxy and actual data and obscuring which was which, in order to hide an inconvenient divergence that might call into question the usefulness of the proxy data. So the trick (clever technique) was in fact used as a trick (deceitful tactic), and the spinners spun the two concepts until they could deny the trick (deceitful tactic).
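To make the mechanics concrete, here's a hypothetical sketch of the kind of splice described above (all series, years, and numbers are invented purely for illustration, not taken from any real dataset): a proxy series that diverges from the instrumental record after a cutoff year has its post-cutoff values replaced with instrumental values, so the single combined curve no longer shows the divergence.

```python
# Invented demonstration data: a proxy series that tracks the
# instrumental record until 1960, then diverges downward.
years = list(range(1940, 2000, 10))              # 1940..1990
instrumental = [0.0, 0.1, 0.2, 0.35, 0.5, 0.65]  # rising trend
proxy        = [0.0, 0.1, 0.2, 0.15, 0.05, -0.1] # diverges after 1960

CUTOFF = 1960

# The "trick": keep proxy values up to the cutoff, then graft on
# instrumental values, producing one seamless-looking series in which
# the post-cutoff divergence is invisible.
combined = [p if y <= CUTOFF else i
            for y, p, i in zip(years, proxy, instrumental)]

print(combined)  # [0.0, 0.1, 0.2, 0.35, 0.5, 0.65]
```

Plotted as a single curve with no marker for where the substitution begins, a reader has no way to tell which points came from which source; that obscuring step, not the splice itself, is the complaint.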
Very true. Even though I'm very mathematically inclined, I had to be forced into statistics. I took diff eq and partial diff eq, and got A's and B's, but somehow the probability and statistics course was barely a C. It just didn't click with me the first time I saw it, or the second, though around the third time (during my graduate degree), it began to make sense, and I've been studying it intently on my own for a couple of years. (It's now a pleasant and relaxing hobby, actually.)
I'm suffering from the gaps of a self-taught student, which mainly show up as fragility: many of the tools and techniques make sense and I use them, but in tricky situations I don't know enough about them to know what to do.
I can't answer for the other poster, but I can say that self-taught always leaves holes in your learning. You don't know what you don't know. I'm in that boat right now: I've been teaching myself statistics over the last couple of years.
In terms of programming and the relative quality of self-taught programming, it of course depends on two factors: 1) how rigorous is the formal schooling, and 2) how exploratory, disciplined, and self-aware is the self-taught programmer?
The best programmers have used a bunch of languages, and their minds expand with each one. The best programmers worry about preventing or catching errors instead of coding straight ahead without checks. The best programmers do a bit of intelligent over-designing, based on their experience with how projects morph over time. The best programmers consider working with others and worry a bit about maintenance and modification of their code. The best programmers learn to discover the user's NEEDS rather than their wants or their opinions.
You can do this self-taught. You can do this with rigorous training. You can learn this with real-world experience. I certainly agree with other posters that someone who was doing programming on their own and then (but not too much later) gets formal training may be the best combination: someone who loves the field and has a knack for it, but has also had others helping to fill in their blind spots.
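As a tiny illustration of the "catch errors early instead of coding straight ahead" habit mentioned above (the function and its checks are my own invention, not from any particular codebase):

```python
def mean(values):
    """Average of a non-empty sequence of numbers."""
    # Validate inputs and fail loudly at the point of the mistake,
    # rather than letting a bad input blow up somewhere far away.
    if not values:
        raise ValueError("mean() requires at least one value")
    if not all(isinstance(v, (int, float)) for v in values):
        raise TypeError("mean() requires numeric values")
    return sum(values) / len(values)

# The straight-ahead version, sum(values) / len(values), works on the
# happy path but turns an empty list into a confusing
# ZeroDivisionError with no hint about the real cause.
print(mean([1, 2, 3]))  # 2.0
```

The difference looks trivial here, but it's exactly the kind of discipline that formal training, code review, or hard-won experience tends to instill and that a self-taught programmer has to consciously cultivate.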
You do realize that Steve Jobs himself offered this kind of cross-licensing to Samsung and Samsung refused? They wanted to use their FRAND technology patents to force Apple to give them its design patents. And it looks like Samsung is about to try to use their LTE FRAND patents against the iPhone 5.
Wow, that's a cool theory. I'd have to check the timing of things, but that would be a brilliant way to switch from sugar to HFCS (curses upon it and its inventors).
Three issues with your "Vista is installed on more machines than all versions of MacOS combined" idea:
1. Are you counting iOS (iPhone, iPad, iPod), which is substantially based on MacOS? That shared foundation is probably part of the reason MS is going with its Windows 8 strategy.
2. I'd guess that the vast majority of these Vista installs were what business machines shipped with. The consumer market is different, and its growth is also something that's driving MS's Windows 8 strategy.
3. MS's profit margin is pretty high, but if you look at the smartphone and tablet markets, Apple is making most of the profits, and its laptops have pretty much captured the most profitable segment of the traditional OS market. So installed base isn't really a good measure of impact.