Regardless, he'll still be around for quite a while yet. In an interview with Wired, he said he just wants to live until the year of Linux On The Desktop.
If you're a Ron Paul supporter voting for Trump, I fear that "confused" is rather an understatement of your mental state.
I think not so much "confused" as "shallow". I can see a very surface correspondence between Paul and Trump: They both like to buck the establishment. The fact that they do so in very different ways and for very different reasons requires looking past the top millimeter of each. I suppose a vote for Obama (in his first presidential campaign) could fit as well, if the same incredibly shallow analysis just focused on the "Hope and Change" slogan.
Google does something like this, on a selective basis.
I think it started as something done only for special cases, but I know a few people who arranged it. One woman I know works three days per week instead of five, for 60% of her normal salary. She has also taken a large chunk of her 18-week maternity leave and uses it one day per week, so she actually works two days per week but gets paid for three, until the maternity leave runs out. Her husband has arranged a similar structure with his employer (not Google), working three days per week so one of them is always home with the kids. She's a fairly special case, though, because she's a freakishly brilliant software engineer who any smart company would bend over backwards to accommodate.
However, it's since been expanded and made generally available to full-time employees. It requires management approval, but the descriptions I've seen make it clear that management is expected to agree unless there are specific reasons why it can't work. Salary, bonuses and stock are pro-rated based on the percentage of a normal schedule that is worked. Most commonly, people work 60% or 80% schedules (i.e. three or four days per week instead of five). Other benefits, such as health care, are not pro-rated; they're either provided or not, depending on the percentage of normal hours worked.
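As a toy illustration of the arithmetic (a sketch only; the linear scaling matches the description above, but the benefits threshold and all the numbers are made up by me, not from any actual policy):

    def prorated_comp(salary, bonus, stock, schedule_pct, benefits_floor=0.5):
        """Cash comp scales linearly with schedule; other benefits are all-or-nothing."""
        return {
            "salary": salary * schedule_pct,
            "bonus": bonus * schedule_pct,
            "stock": stock * schedule_pct,
            # Hypothetical cutoff: benefits are provided in full above it, not at all below.
            "benefits": schedule_pct >= benefits_floor,
        }

    # A 60% schedule (three days a week): 60% of cash comp, benefits intact.
    print(prorated_comp(salary=100_000, bonus=15_000, stock=50_000, schedule_pct=0.6))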
I could see myself going to a 60% work week in a few years, having a four-day weekend every week in exchange for a 40% pay cut.
One part of your experience that rings false to me is the level of support required for Windows machines vs Macs. My experience is narrower than yours, because I'm a programmer, not an IT support guy, but I do get used as an IT support guy by friends and family because, you know, I "do computers". With that caveat, my experience is that the single biggest thing I can do to reduce my support burden is to get them to trade in their Windows laptop for something else. The very best alternative is a Chromebook, then a MacBook. Installing Ubuntu instead of Windows is also a good support-reducer, but not as many have gone that route.
As far as mobile devices go, I do more Android support than iOS support, but I think that's mostly because all of my immediate family, and most of my extended family, uses Android. Plus the Apple users are a little less likely to come to me for help because they know I'm an Android guy (because I work on Android system development).
Until the dolls literally spray genuine, authentic baby shit and vomit on you in the middle of the night, they are going to be inadequate to the task of dissuading girls from wanting to make babies.
If you can't actually fill them with a truly realistic substitute for unwanted infant fluids, they're worthless.
I don't think that has anything to do with it.
I've raised four kids (youngest is now 15, oldest is 23), and the bad parts of having children, and babies, really have nothing to do with the icky body fluids. I've changed more than a few "blowout" diapers, and even had a couple of kids puke into my mouth, and that's really not the bad/difficult part of having and raising children. The bad/difficult part is the commitment required. Kids require very close to 24/7 effort for years, and a lower level of focus and attention for decades. They're financially expensive, emotionally and physically demanding, and they require you to be able to manage your own life so you can also manage theirs.
On the surface, caring for a robo-baby for a few months should be a reasonable approximation of that. Where it falls down is not the lack of body fluids, I think, but the knowledge that (a) it's only a grade, not a life and (b) it is only a few months. (a) means that if you screw it up, it's not so terrible, and (b) means that you know there's an end in sight. Both of those probably significantly reduce the impact.
The schools in my area do something similar, but they don't use a robot, they use a bag of flour. That's not as good in that it won't rat them out for failing to care for it, but it may have another advantage (besides the low cost): It's not cute. I wonder if the robo-babies don't backfire because they get girls thinking about how cool it would be to have a cute little baby all their own.
I just won a game of Tic-Tac-Toe for the first time ever.
Since it's trivial to write an algorithm that plays optimally, and since a player using an optimal strategy will never lose, Google clearly did not try to create an "AI" whose focus is winning. Instead, they appear to have created an algorithm that is a fairly decent novice player. Which, actually, is a good deal harder than optimal play.
Well, maybe not. It wouldn't be too difficult to take an optimal-play implementation and randomly cause it to choose a bad move. For example, if it's playing X you could have it select its opening move at random, rather than always taking a corner. And at each subsequent move you could give it a smallish chance of making a bad move. That approach might simulate a decent novice well enough.
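For illustration, here's roughly what that could look like in Python. This is only my sketch of the idea, not anything based on what Google actually built; the epsilon knob and all the names are mine:

    import random

    # Board: a list of 9 cells, each 'X', 'O', or None. X moves first.
    LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),   # rows
             (0, 3, 6), (1, 4, 7), (2, 5, 8),   # columns
             (0, 4, 8), (2, 4, 6)]              # diagonals

    def winner(board):
        for a, b, c in LINES:
            if board[a] is not None and board[a] == board[b] == board[c]:
                return board[a]
        return None

    def minimax(board, player):
        """Return (score, move) from player's perspective: +1 win, 0 draw, -1 loss.
        Tic-tac-toe's game tree is tiny, so no memoization is needed."""
        w = winner(board)
        if w is not None:
            return (1 if w == player else -1), None
        moves = [i for i, cell in enumerate(board) if cell is None]
        if not moves:
            return 0, None  # board full: draw
        opponent = 'O' if player == 'X' else 'X'
        best_score, best_move = -2, None
        for m in moves:
            board[m] = player
            score = -minimax(board, opponent)[0]  # opponent's gain is our loss
            board[m] = None
            if score > best_score:
                best_score, best_move = score, m
        return best_score, best_move

    def novice_move(board, player, epsilon=0.3):
        """Optimal play, except with probability epsilon pick a random legal move."""
        moves = [i for i, cell in enumerate(board) if cell is None]
        if random.random() < epsilon:
            return random.choice(moves)
        return minimax(board, player)[1]

Note that ties in the search resolve to the lowest cell index, so with epsilon=0 this bot's opening as X is always the corner at cell 0; the random moves are what keep it beatable.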
Perhaps a better approach would be to use machine learning and have it learn from novice games, or even from well-played games, but leave it incompletely trained. That might make it more "human-like".
wait for AOSP to hit the repo (2-6 months)
One or two months, sure. It won't be six.
[wait for XDA to] tweak the hardware-specifics (1-12 months)
So don't wait for XDA. You don't actually need anything they provide. AOSP + the vendor binaries from Google will get you running.
My belief is that there's an overwhelming consensus amongst scientists who are experts in this field that man-made climate change is real and worth taking action to mitigate.
My belief is that whether or not the warming is man-made is almost completely irrelevant. It's clear that the planet is warming, and it's clear that this is going to make our lives more difficult, meaning it's going to consume huge amounts of labor and resources to adapt. Therefore, we should absolutely be taking action to mitigate the change, as long as that action consumes less labor and resources than would be required to adapt to the change (which argues for pretty aggressive action, since adaptation is going to be really costly, e.g. relocating a large portion of the human population).
The source of the warming is only relevant because it may point us towards some possible mitigation strategies. We should not, however, focus only on ameliorating the causes. Other, more direct, climate manipulation strategies should be seriously investigated.
Very effective at making operators forget that they are training to kill other human beings, making it easier to shoot unthinkingly when told to, regardless of right/wrong.
I don't think video games are particularly effective at changing the way people think about real combat, when there are real people downrange.
What does work well is what has always worked well... tribalism and intentional dehumanization, which includes calling the enemy "hun", "jerry", "jap", "slope", "slant", "gook", "raghead", "tango", "target", etc., and attributing subhuman and evil characteristics to them.
Apple pay isn't on android, by definition. Unless you're talking about the competing Google Pay, which is a different competing standard.
You mean Android Pay, not Google Pay. And it's not a different, competing standard. Both Apple Pay and Android Pay use the same NFC technologies and standards.
On the name, I should point out that it's somewhat understandable that you call it "Google Pay", since Android Pay is a successor to Google Wallet, which was Google's original NFC payment solution, released in 2011 (long before Apple Pay). The Google Wallet approach was a little different, though. Because of payment network limitations, Google used a "proxy card" solution, where a Google-issued credit card was what was actually used to pay at the point of sale, and Google then charged your credit card on the backend. That approach had problems both for the user, who might not get full credit from rewards cards, and for Google, which lost money on every transaction due to the difference in fees between the card-present transaction at the point of sale and the card-not-present transaction used for the user's payment. But it had the supreme advantage that it would work with any credit or debit card.

Banks also really disliked the proxy card solution, because it threatened to take too much control of the payment systems away from them. With the intermediate routing step, Google could have arranged to use any payment system on the back end, then used its clout to get the point of sale updated to a solution that didn't involve the banks, removing them from the process completely. There's no evidence Google was going to do that, but the banks were afraid of it and chose to make Google's life very hard in all sorts of ways around the NFC proxy card (and its physical, plastic analogue, which Google issued for a while).
Apple waited until the networks were ready to do "network tokenization", and until some more banks were ready to handle NFC transactions, both of which are required to enable the Apple Pay model, where the payment is done directly against the user's card, with payment clearinghouses routing the transaction directly to the bank that issued the credit card. Android Pay uses this same model, with the difference that if you have a credit card which was previously used with the Google Wallet proxy card solution, Google "grandfathers" your card in and continues using the proxy. This direct model fixes the disadvantages of the proxy card solution, but means that you can only use cards whose issuers have set up the necessary infrastructure. These days, though, lots of them have. In particular, the big bank service providers like First Data have everything set up so their clients who issue credit cards can do NFC. This means that nearly all small banks and credit unions can do it, and most of the big banks can too. Some of the big banks, and many of the medium-sized banks, still aren't set up.
(Note that I've intentionally left out some details, like the first version of Google Wallet using a direct, non-tokenized approach that only worked with one bank, and some of the other intermediate steps. I figured this was long enough.)
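Since the two models are easy to conflate, here's a purely conceptual sketch of the difference. Every name in it is hypothetical, and it elides all the real machinery (EMV tokens, cryptograms, clearinghouses); it only captures who charges what:

    class Bank:
        """Toy stand-in for a card issuer's ledger."""
        def __init__(self):
            self.charges = []

        def charge(self, card, amount, card_present):
            self.charges.append((card, amount, card_present))

    def proxy_card_flow(issuer, google, user_card, amount):
        # Old Google Wallet model: the point of sale only ever sees a
        # Google-issued proxy card (card-present), and Google separately
        # charges the user's real card (card-not-present, with higher fees
        # eaten by Google, and rewards cards may not get full credit).
        google.charge("google-proxy-card", amount, card_present=True)
        issuer.charge(user_card, amount, card_present=False)

    def tokenized_flow(issuer, user_card, amount):
        # Apple Pay / Android Pay model: the network hands the device a
        # token that routes straight to the issuing bank. One charge, no
        # middleman, but only if the issuer supports network tokenization.
        token = "token-for-" + user_card  # toy stand-in for tokenization
        issuer.charge(token, amount, card_present=True)

    issuer, google = Bank(), Bank()
    proxy_card_flow(issuer, google, "user-visa", 25.00)
    tokenized_flow(issuer, "user-visa", 25.00)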
Someone doesn't know their history, and it's you. Look at the wars America was in before 1940. For example, the Spanish-American War: basically caused because we wanted some of Spain's stuff in the Caribbean, and justified with trumped-up claims about an explosion in port that turned out to be an accident.
The Mexican-American War: because we wanted to move our southern border to the Rio Grande.
The War of 1812: multiple causes, and it may have happened anyway, but at least part of it was a desire to annex Canada.
The Indian Wars: all undeclared, but we took each tribe's land, one at a time.
The US has been an imperialistic warmonger from the beginning. We just kept it to our own continent until the 1900s.
Non-unlockable bootloaders are a bug.
I agree. Talk to your device manufacturer about their bug, but I don't expect them to listen to you. If you want to avoid that bug, you have to buy a device from an OEM that allows unlocking. If enough people voted with their wallets in this way, OEMs *would* listen, and non-unlockable bootloaders would disappear.
If Google had designed (? or something?) Android so that updating the base OS was something that could be pushed direct from Google instead of from each manufacturer's bollixed version of the system, there'd be no problem for any of us.
That may seem obvious now, but it's far from clear that Android would have succeeded the way it has if OEMs hadn't been allowed to differentiate their versions. That was (and is) something that's important to them, and they may well have decided to do their own thing instead if Google hadn't given them the degree of control they wanted. Or maybe they'd have adopted Windows, since while it wouldn't have allowed them to customize, it would have had the advantage of being from the then-biggest OS maker around.
It seems very likely that the ability of OEMs to customize was a core component of what made the Android ecosystem successful.
Also, keep in mind that the only way Google could really have kept OEMs from modifying Android however they like would have been to keep it closed. Personally, I'm glad that Google made the choices it did, not because I'm a Google employee working on Android (though I am), but because I've been an open source and free software advocate since before Google even existed. Android is far from perfect, and devices aren't as open as I would like, but I think the mobile software world is much better than it would have been without a F/LOSS mobile OS.
I still think that two years of updates is outrageous forced obsolescence that is prematurely adding electronic garbage to landfills.
FWIW, it's actually two years of upgrades and three years of security updates on Nexus devices.
I'm seriously considering going back to an iPhone on my next phone upgrade, despite all the concerns I have about them too. They at least support their hardware for around 5 years.
At least they have done so in the past. Note that they've never made any commitment to that, so they could stop.
Unless you bought a device with an unlockable bootloader, any way that you can get root is a bug, not a feature. It may be useful to you, but it would be equally useful to an attacker.
The easiest way to figure the cost of living is to take your income and add ten percent.