Even in the U.S. it is possible to find place names that are (nearly) that old. American Indian names that are still in use are not hard to find, although the pronunciation tends to be corrupted.
I think your post mostly makes good points, but I have to ask: since when did the word "actuary" fall into disfavor, and when was anyone going to tell me?
A P&C actuary
That's forensic accountants, dammit. Actuaries aren't going to help you unless you need a quote on some liability insurance - and you'll be wanting to get that before you lose all your Bitcoins.
While I mostly agree with you on your last sentence, can you imagine the uproar if Microsoft made an analogous decision about Windows?
Everything you say is completely correct. However, I'd point out that it can actually be a bad thing for property owners when this type of political maneuvering makes flood-prone homeowners ineligible for federal flood insurance because their homes lie entirely outside the federally defined flood zones. When Nashville, TN, was flooded in 2010, it came out that some of the flooded homeowners had tried to purchase flood insurance beforehand, but had been told that they couldn't purchase it or didn't need it. The property developers had made sure that the houses were in a no-risk zone.
I don't know how widespread a problem that kind of thing is, but it has definitely happened.
According to this article, there are plenty of reasons to doubt these rankings, even if press freedom in the U.S. is worrying. And ranking changes like these are not new. Here are the U.S.'s rankings over the last 10 years (there's a typo in their own press release; the U.S. actually fell 14 slots):
That seems...a bit inconsistent. Again, that's not to say there isn't plenty to worry about in the U.S., but I'd still take these rankings with a grain of salt.
It's also worth noting that that's 50% of marriages ending in divorce. Only around one third of the (married) U.S. population will have a divorce, but some of them will have 2, 3, or more divorces, which drives up the average. For an extreme illustration, a population consisting of Elizabeth Taylor and my wife would have had 88.9% of their marriages end in divorce, but only 50% of that population would have ever had a divorce.
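In case the arithmetic isn't obvious, here's a tiny sketch of that toy population, treating all eight of Taylor's marriages as ending in divorce (which is what the 88.9% figure above implies) and giving "my wife" one marriage and no divorces:

# Back-of-the-envelope sketch of per-marriage vs. per-person divorce rates.
# Figures are the toy numbers from the Elizabeth Taylor example, not real statistics.
people = {"Elizabeth Taylor": (8, 8), "my wife": (1, 0)}  # (marriages, divorces)

total_marriages = sum(m for m, d in people.values())
total_divorces = sum(d for m, d in people.values())
people_divorced = sum(1 for m, d in people.values() if d > 0)

print(f"Share of marriages ending in divorce: {total_divorces / total_marriages:.1%}")  # 88.9%
print(f"Share of people ever divorced:        {people_divorced / len(people):.1%}")     # 50.0%

Same data, two very different-sounding rates, which is exactly why the "50% of marriages" statistic gets misread.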
I do wonder if this is an important point. Those of us who read Slashdot hear about astrology fairly frequently, mainly when we're scoffing at it. People who get their information from mainstream news, on the other hand, hear about horoscopes. I've never heard anyone ask someone else what their "astrological sign" was, but I have occasionally heard talk about the Zodiac or "Zodiac signs." It seems possible that some of the survey respondents have simply not heard much about astrology, at least under that name. It seems like an interesting question - I really don't know whether it would be a factor or not.
I think that what you meant to say was "I can assure you your depiction of free education bears no relation to reality in Britain, while I was in school ~25 years ago."
In the present-day United States, many if not most state schools are (relatively) cheap for residents of that state, if not quite free. The state school I taught at a couple of years ago while I was getting my graduate degree had a nominal tuition fee of $7,800 for in-state students (not including dorm fees or meal plans, though the majority of students lived off campus). The large majority of in-state students qualified for the $5,000-per-year lottery scholarship, leaving an effective tuition of $2,800 per year. Not free, but hardly expensive either.
The problem is that such schools also tend to have admissions standards that are too low. Thus, an extremely high number of unqualified students decide, incorrectly, that they should attend college because it's "free" - loans will take care of that extra $2,800, and loans are free money, right? - and these unqualified students take up a disproportionate share of the university's resources but then don't actually graduate. At the above-mentioned university, less than 40% of the entering freshman class graduated within four years, and I can guarantee you that a good three-quarters of my time and resources as a teacher were spent on students in the never-going-to-graduate category.
The point that the grandparent is making is that if the U.S. is going to give so much money to college students, there has to be some incentive to actually get a degree. In theory the best way to do that would be to only admit students with abundant academic talent, and indeed, many private colleges and a few state colleges still do this (but they tend to be the more expensive ones too). But since many state schools have given up on that quaint notion, financial incentives are the easiest tool for encouraging students to either get a degree or get out.
Admittedly, financial incentives (or disincentives in this case) are a crude and unfairly discriminatory tool. The part of me that believes in equality of opportunity for all dislikes the idea of making anyone pay for college. But the part of me that was an actual college teacher for a couple of years would like to point out that free and non-selective universities without an incentive to graduate quickly are unworkable in practice, as noble as they may sound in theory.
Not to be pedantic, but this mistake always annoys me. Carbon dioxide is not deadly, except in the sense that if you tried to breathe nothing but CO2 you would die from lack of oxygen. I believe you are thinking of carbon monoxide, which is indeed toxic in sufficient concentration.
Credit cards are almost always better for a U.S. cardholder than debit cards for the following reasons:
-Credit cards often have a reward for use, while debit cards do not. My credit card gives me a $1 credit on my Amazon account for every $100 spent, plus bonus credit in some cases.
-Credit cards grant the option, but not the obligation, of deferring payment, whereas debit cards don't. I've never paid a cent in credit card interest since I turned 18, so obviously this option isn't worth much to me, but it is there in the incredibly unlikely event that I need it.
-If you try to spend funds you don't have with a debit card, the bank may overdraft your account and charge you a penalty instead of denying the transaction. This penalty is usually higher than the equivalent interest rate on a credit card. Since 2010, this is not really an issue anymore, because the customer now has to be dumb enough to voluntarily opt in to this arrangement.
-Other than the above, there's no functional difference (to the cardholder) between the two types. Fraud protection is the same, payment processing is the same, etc. This includes prices - very, very few merchants charge credit card users extra, although they are now allowed to.
Given that, the only reasons to avoid credit cards in the U.S. are moral objections or lacking the self-control to handle them responsibly. Rational consumers will use a credit card every time. Of course, this says nothing about what's best for merchants or banks, but that wasn't your question.
I don't know about Europe, but in the U.S., credit cards are always the superior option for the cardholder. That is not the case for the merchant taking the card. The only two reasons to use debit cards are 1) ATM withdrawals and 2) a lack of emotional control. If you're interested, here is a slightly dated but still mostly accurate opinion piece about why credit cards are the better choice.
I know that your question is derisive, but the Wall Street Journal provides some pretty valid reasons:
"There’s a historical view to this. In the past, other markets migrated for two reasons. First, there were higher fraud rates in some other markets, and they wanted to make this move [to chip and PIN] to combat fraud. Second, this system can operate in offline mode – the card and the terminal can authorize a transaction independent of communication with the bank’s systems. In some other markets they struggled with robust telephony networks, so this offline capacity was attractive. Both those factors were not driving factors here in America."
To put that statement into context, as of 2010, merchants were experiencing losses from credit card fraud at a rate of 6 cents per $100 of credit card charges (in the U.S., merchants pretty much always bear all costs of credit card fraud). So for a busy retail location that did $10,000,000 in card transactions per year, card fraud losses would be $6,000 per year. Even in the highly unlikely event that moving to chip and PIN would cut fraud in half, that would be a savings of $3,000 per year. That's hardly compelling, since it's at least an order of magnitude less than what a store that size would lose to employee theft alone. From a practical, financial perspective, credit card fraud is just not an issue in the U.S. It's only important in terms of public opinion.
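For anyone who wants to check that arithmetic, here it is spelled out, using the same figures quoted above:

# Quick sanity check of the fraud-loss arithmetic: 6 cents of fraud per $100
# of charges (c. 2010) at a store doing $10M per year in card transactions.
annual_card_volume = 10_000_000      # dollars of card transactions per year
fraud_rate = 0.06 / 100              # 6 cents per $100 = 0.06%

annual_fraud_loss = annual_card_volume * fraud_rate
print(annual_fraud_loss)             # 6000.0 -> $6,000 per year in fraud losses
print(annual_fraud_loss / 2)         # 3000.0 -> $3,000 saved if chip and PIN halved fraud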
The WSJ article also mentions the very large size, maturity, and complexity of the American card network relative to other markets, and a certain amount of weirdness caused by the way the Durbin Amendment forces processors to handle debit card transactions.
I would also add that, as I alluded to earlier, end consumer protection from card fraud in the U.S. has always been extremely strong - it's very, very unlikely for the cardholder themselves to lose money from fraud. This meant that there was little impetus from consumers for a switch. There was also some worry that moving to chip and PIN would be used as an excuse to shift some of the liability for fraud to the cardholder, so ironically the old system was seen as safer (for consumers, at the merchant's expense). As the American chip and PIN system has been rolling out, it's becoming clear that this last concern is a non-issue.
I'm having a hard time deciding whether to argue with you or not. On the one hand, a great deal of consulting work is not bullshit, and is very valuable to the client company. Most companies would find it astonishingly stupid to pay a six-figure salary to a full-time actuary just to calculate their workers' compensation reserves once a quarter, when they could pay a consultant (like me) four figures for the same service.
On the other hand, IBM's consultants are so incredibly expensive and useless that, in the context of this article, you are absolutely spot on. I count it as a great blessing that I work for a small firm that never has to deal with IBM or their ilk.
According to Motley Fool, only 14% of IBM's sales come from hardware, and that 14% includes the x86 server business that they just sold. And yet, between 2002 and 2012, their sales grew 28% and their earnings per share grew a whopping 7x (total earnings grew far less; the difference comes from huge share buybacks shrinking the share count, but it's earnings per share that matter). IBM is a software and services company. They keep selling some "big iron" to promote lock-in for their software and services - essentially, their hardware is the corporate version of Amazon's Kindle Fire. Cool stuff like Watson notwithstanding, IBM tends to dump a hardware division around the time it can no longer provide any reliable software or services lock-in.
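To make the buyback effect concrete, here's a toy calculation with made-up numbers (not IBM's actual financials): if earnings double over a decade while 60% of the shares are bought back, EPS grows 5x even though total earnings only grew 2x.

# Hypothetical figures illustrating how buybacks let EPS grow much faster
# than total earnings. These are NOT IBM's real numbers.
earnings_2002, shares_2002 = 100.0, 100.0   # arbitrary units
earnings_2012, shares_2012 = 200.0, 40.0    # earnings double; 60% of shares retired

eps_2002 = earnings_2002 / shares_2002      # 1.0
eps_2012 = earnings_2012 / shares_2012      # 5.0

print(f"Total earnings growth: {earnings_2012 / earnings_2002:.1f}x")  # 2.0x
print(f"EPS growth:            {eps_2012 / eps_2002:.1f}x")            # 5.0x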