It doesn't force you onto one platform, though it does limit your options to two of the most popular ones: Windows PCs and Xbox. But in the case of Windows PCs, they also support OpenGL, which is part of why I said that it's no surprise that OpenGL is being so widely adopted now. And from what I understand, DirectX does have some technological advantages going for it, though I am by no means an expert in that area, so I won't even try to list any. Even if it didn't, however, competition, I think we can all agree, is better for everyone in the long term, and since DirectX is the only major competitor to OpenGL, I'd rather see it stick around than see it abandoned.
Just to mention something: the legal world already had a term for patent trolls before the Internet (or whoever) came up with that nickname. Companies that engage in these sorts of tactics are called NPEs: non-practicing entities. They're companies that litigate without actually making anything based on the patents they hold, and you'll find that term all over the place if you start looking into patent trolling statistics and cases.
I just popped open the Mac App Store and took a glance at the first page of games. Just to name a few that were listed, there's Borderlands 2, CoD: Black Ops, Batman: Arkham City, Deus Ex: Human Revolution, Civ V, Bioshock, Amnesia, Witcher 2, Assassin's Creed II, and XCOM: Enemy Unknown. And if I pop open my copy of Steam, I can find pretty much all of Valve's titles, as well as a whole lot more. Granted, they're not all the latest and greatest (e.g. Bioshock, not Infinite; AC2, not AC3; Black Ops, not Black Ops II), but it's a wide selection of well-known games from a number of developers.
Jokes like yours are funniest when they use humor to take the edge off a point that would otherwise be painful to swallow, but yours is simply off the mark entirely. The Unreal, Source, Gamebryo, id Tech, IW, and Unity engines all work with OpenGL and have a number of games out using it. There are strong rumors that Crytek already has an in-house version of CryEngine 3 running on OpenGL, and based on job listings at DICE, it looks like they're porting their Frostbite engine over as well for use with Battlefield.
Given the disappointment that some of the major game developers have expressed (e.g. Gabe Newell's public statements) towards Windows 8, along with Microsoft's signals that DirectX may be at its end of life, is it really any surprise that all of the major game engines have already been ported or are in the process of being ported to OpenGL? Even more so when you consider that the two major smartphone OSes (i.e. the platforms on which most games today are now played) only make use of OpenGL? Not to mention that of all the gaming devices that support OpenGL or DirectX, every one except the Xbox supports OpenGL, either alongside DirectX or to its exclusion? And the fact that Linux is quickly gaining recognition as a high-performance gaming platform and is getting some love from developers and publishers? Finally, is it really all that surprising that developers are actually making use of these game engines to put games on as many devices as possible?
Mind you, I'm not suggesting that DirectX should be abandoned, by any means, since it's still quite powerful and is still the library that's used on one of the major consoles out today. All I mean to do is point out the folly in your assertion that OpenGL is not being utilized in games.
Actually, Amazon really isn't making that much money once they cover their costs. They were in the red for the last quarter or two, as I recall. They may be bringing in a lot of money, but that's because they go for the razor-thin margin/high volume approach that Walmart exemplifies, and those margins haven't been sufficient to cover all of their costs recently.
To the best of my knowledge, none of us have any student loans.
As for the value, my only gripe with your original comment is that you've overgeneralized, and you've done so again here. For instance, rather than being a good gauge of the value of a degree, at best, we can only say that it'd be a good gauge for the value of a CS degree for people pursuing the work we're in. Saying anything more than that would be to engage in poor reasoning. For instance, if the grad school attendee had ended up going into a research position, which was apparently his initial plan, he would clearly be getting more value out of his degree than he is now. Similarly, if one of the college graduates had accepted a worse position at a different company for reasons other than financial ones (e.g. location), then measuring the value of the degree based on their financial state would be improper, since they would be benefiting from non-financial rewards.
I'm a bit split on what you've said. On the one hand, I find it a bit disingenuous that someone who dropped out of business school can generalize their experience to apply to college as a whole, particularly for people who work in engineering disciplines. I can think of several people I know who simply could not even work in the professions they're in without having gone to school, since there's no way to either gain that knowledge or to get licensed in that field without having done so. Even so, I do believe what you've said holds some truth to it, even if it isn't universally applicable.
To provide some examples from my own life: a few days ago, three of us at the software company where I now work, all roughly the same age (mid-to-late 20s) and holding roughly similar positions, were comparing our experiences up to this point. We realized that one of us never attended college, one of us did an undergrad, and one of us dropped out of grad school, yet we all landed in more or less the same place.
The one who never attended college had to endure working for several years as tech support in a call center at an Office Space-esque company before finally picking up enough to move into IT, then into development, and then switching over to the development company we're at now, getting hired on during the recession. The one who did his undergrad did well in school, worked part-time at the company we're at while in school, and then started immediately with them once he graduated, prior to the recession and without looking elsewhere for a job. The one who went to grad school did some internships during his undergrad, spent four years in grad school, talked to a handful of companies at a campus career fair, and had offers coming in from all around the country, also getting hired on during the middle of the recession despite his dropping out.
Our stories back up your points, it would seem.
You are correct in saying that it was derived from British law; however, I think the person you are replying to was referencing the fact that the British Parliament had established an embargo on firearms — in part or in whole — as well as munitions to the Americas in the years leading up to the war, in addition to the widespread disarming efforts underway by British and Loyalist forces in the months leading up to and after the start of the war. The Second Amendment was a direct response to that, in that it firmly re-established a "right" that had been abridged by the British government.
Besides which, as someone else has already pointed out to you, if a "right" is heavily regulated, then it is no right at all. Imagine applying that standard to other typical rights, such as the right to free speech or association. Would we still consider it a right to freely associate if we were told that we could only associate with members of our organizations (e.g. a religious one) if we did so at designated locations after being properly licensed to do so? Mind you, it would be for the good of the people, since we don't want disorderly mobs roaming around the streets, of course. And would we still have the right to free speech if self-publication were prohibited and our only option was to publish via a state-sponsored newspaper that engaged in heavy censorship? Once again, we'd be doing it for the good of everyone, since some ideas are of a dangerous nature and need to be carefully controlled so as not to incite violence.
Suggesting that the right to bear arms is in some way a special right that should be subject to such heavy regulations just because it involves a killing tool ignores the fact that many of our other rights can be far more dangerous when applied effectively and could just as easily be subjected to that same (faulty) logic. As a quick note, however, I will point out that I do not support the abolishment of all regulations regarding gun ownership, just as I would not support the right to falsely cry "FIRE!" in a crowded theater. I do believe there is a balance to be had, but that the point of equilibrium is reached when the minimum amount of regulation necessary, and no more, has been applied.
The honest truth is that all of these companies are vicious when it suits them, and conciliatory when it suits them. And it suits them when it means that it will make them more profit. Google, I honestly believe, was at one point the sort of altruistic company that many still paint it as, but with its rampant growth it has moved well past that point. Today's Google is far different from the Google of 10 years ago, and they are definitely the sort to engage in the embrace, extend, extinguish tactics you were talking about.
I read that as "How Blackberry is struggling to stay relevant after people stopped using the devices on which their services are used".
I do think it's better for everyone when there are more viable choices and more competition in the market, but let's not kid ourselves that Blackberry putting an app on another platform or two is them riding those platforms in order to "power their comeback". At best, it's analogous to what Sega did during/after the Dreamcast, and while we might be able to say that Sega is a decent software company now (a topic that's worthy of a separate debate), no one would suggest that they can exercise as much control over their destiny as they could before, nor that they are doing as well as they were in their heyday.
And, honestly, I question whether or not Blackberry's services are strong enough to stand on their own any more. There have been a number of "good enough" alternatives that have popped up in the last few years, either from first-party or third-party developers on the competing platforms.
Thanks! In all sincerity, I'm always happy to be corrected, especially when the actual source of information is provided. That's really something I should have checked before I posted, but I had figured I'd have heard of it by now if it was there. Apparently I was wrong.
Pardon my ignorance, but when has Apple ever provided contextual ads? The only ad network they run is iAd, and as far as I've heard, they don't tailor the ads to the user. A developer who includes iAd in their app can tailor the ads for their app (e.g. only allow ads for techie things), but that isn't user-specific.
Also, it's worth pointing out that Microsoft also offers a number of free services (e.g. search engine, e-mail, etc.). As such, it makes sense why both they and Google would seek to monetize more effectively through the use of targeted ads, while Apple would not, since Apple is receiving its payment in other ways (e.g. paying for the service or paying for the hardware necessary to access the service).
TL;DR: I have no doubt Apple tracks its users (they've said as much). I'm merely asking for a citation that they have ever sold that data or used it to tailor ads for their users, since I can't think of an example where they have done so.
I know you're being sarcastic, but I'd still like to respond seriously by pointing out that that's only true until the price to build new robots drops. The price to create new humans has remained roughly the same for as long as humans have been around, and it isn't getting any cheaper (if anything, the cost has gone up as new forms of fetal care have been put forward).
It doesn't require iTunes on an ongoing basis. A single sync would be sufficient.
It doesn't. The author seems to be under the mistaken belief that you need iTunes if you have an iPhone. The iPhone hasn't needed a PC running iTunes to sync with for years. Assuming iTunes is misbehaving (which wouldn't surprise me, though the summary's author doesn't actually provide any evidence stating what resources were being used, merely anecdotes of bugs reported on forums), simply uninstall it and never deal with it again. Or, if you "must" deal with it to manage your music library, just don't set it up to sync with your iPhone and shut down the service. Easy.
Yeah, I had just woken up from a nap, and my lack of correct grammar and spelling was quite apparent. E.g. "They're" instead of "their" and "of" instead of "Off". Oh well.