Comment Re:America! (Score 1) 230
Cause the terrorists have won.
We thought their game was through, but they knew the Konami Code.
The constitution exists to limit the government's power to interfere with your liberty.
Specifically, it can only do so if it thinks it's for the best ("general welfare") or might have any effect whatsoever ("interstate trade").
Only leftist idiots think that it's the government that grants you your rights.
The government doesn't grant people rights, but it oversees and manages the web of institutions which enforce them. The property rights the right wing so adores don't mean a thing in a jungle.
That's 100% Nanny State backwards.
The "Nanny State" exists because of the Gilded Age. Every time economic controls are loosened, it leads to wealth concentration and eventual collapse. That's what's happening right now, and it will only end with the reinstatement of a Nanny State strong enough to enforce sufficient redistribution of income.
They're computer crackers. What are they going to do? Why all the fear?
The competition among virtual currencies and their continuing evolution demonstrate their uselessness as stores of value.
Economic value is like potential energy: it only makes sense in the context of some system. A dollar, a bar of gold, or unspent transactions in the Bitcoin ledger have no inherent value, but someone might accept any or all of them in exchange for something else. But the economy is ever-evolving, and in fact currently going through a major crisis, so economic value cannot be reliably stored for any length of time. The best you can do is watch which way the changes are going and transfer value away from failing forms.
It's not as expensive to spend the money to properly maintain your security as it is to have it massively breached and all your data stolen.
Not as expensive if you only count money.
But in my experience, the problem is the upper executives and their insistence on special exceptions for themselves and their people, who are doing work that is just so important that they cannot be burdened with following the security rules that apply to non-important people.
And I hope Sony, and all other Big Companies (tm), learn a lesson.
I think that this reinforces the wrong lesson. Everything is okay as long as you can find someone else to blame. Whether it's an employee or a hacker group or a country. The focus will be more on THEM rather than Sony executives who broke security so that they could feel more important than the nerds in IT.
Yep. And even more so.
If you live in the USofA then you have a larger chance of being killed by your spouse / boyfriend / girlfriend / YOUR OWN CHILDREN than by a terrorist.
Just by waking up alive you have already beaten the "terrorist" odds today.
And in this specific case, what are the "terrorists" going to do? Steal your credit card number? Pay cash instead.
If we look at jet aircraft, wear depends on the airframe and the engines. Airframe life seems to be a function of the number of pressurize/depressurize cycles as well as the running hours. Engines get swapped out routinely, but when the airframe has accumulated enough stress it's time to retire the aircraft lest it suffer catastrophic failure. Rockets are different in scale (much greater stresses), but we can expect the age-related failure points to be those two, with the addition of one main rocket-specific failure point: cryogenic tanks.
How long each will be reliable can be established using ground-based environmental testing. Nobody has the numbers for Falcon 9R yet.
Weight vs. reusable lifetime will become a trade-off in rocket design.
As long as there are cowards, there will be people selling insurance.
As long as some entities have a higher capacity to absorb temporary setbacks than others, they can trade on this ability like any other good. But I suppose that doesn't make as good a soundbite.
Unless they get employed doing something else.
Suppose you have 10 people and 10 jobs. One job is eliminated by technology. Now you have 10 people and 9 jobs. That 1 newly unemployed dude tries to get another job, but to do so he'll have to outcompete 1 of the remaining 9 employed people out of their job. So how will he compete? Why, he'll do the job for less money. So now we have 9 people with a lower average wage, and 1 unemployed dude. This merry-go-round will then continue. Also, as wages fall so will the total buying power of the workforce, which creates further downward pressure.
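The merry-go-round above can be sketched as a toy simulation. To be clear, the undercut fraction, starting wage, and round count here are made-up parameters for illustration, not anything from real labor data:

```python
# Toy model of the wage "merry-go-round" described above.
# Hypothetical assumptions: each round, the unemployed worker
# undercuts the lowest-paid employed worker by 10%, taking that
# job and leaving the displaced worker unemployed in turn.

def average_wage(rounds=5, jobs=9, wage=100.0, undercut=0.10):
    wages = [wage] * jobs  # 9 jobs remain after 1 is automated away
    for _ in range(rounds):
        wages.sort()
        wages[0] *= 1 - undercut  # unemployed dude takes the lowest job for less
    return sum(wages) / len(wages)

print(round(average_wage(), 2))  # average wage drifts below the original 100
```

Even in this crude sketch, each round pushes the average down while one person is always left out, which is the downward-pressure dynamic the comment describes.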
Capitalism cannot handle a situation where labour is not the resource that limits production. It predates the Industrial Revolution, almost collapsed as a result of it, and is heading back towards the cliffs now that true believers have managed to convince themselves that the fall of Soviet Russia means revolution is no longer possible and have dismantled the compensating systems.
The only real question at this point is whether it'll collapse into a dystopia where the poor are kept down by brute force, or incorporate sufficient income redistribution to guarantee a middle-class minimum income. The US is trapped in the former fate by the aftereffects of Cold War rhetoric, but Europe and Japan have hope. And China, of course, is a dystopia as is.
"Remaining jobs" need not decline and it's worth noting that they actually aren't declining at present.
According to the article, they do. Also, when was the last time the job market was good for employees?
Second, most of the examples given are low-wage jobs, so the argument does not hold water if you claim this is responsible for the stagnation of average wages; average wages should go up if there are fewer people earning minimum wage.
If you destroy a low-wage job, the workers who previously did it become unemployed, and their wage goes to zero. Also, there's more competition for the remaining jobs, thus even non-zero wages tend to fall.
It seems that wrinkling may be the price we pay for clearing potentially cancerous UV-damaged cells from the skin. It might be a bargain.
An English major is NOT getting into a STEM Ph.D. program, no matter what.
Even if they were, job prospects are worse for STEM Ph.D. holders than for MS/BS holders: there are far fewer jobs that require Ph.D.-level qualifications outside of the professoriate and academia, and for Ph.D. holders in particular, employers are absolutely loath to hire overqualified people.
Inside the professoriate and academia, the job market is historically bad right now. It's not "get a Ph.D., then become a lab head or professor," it's "get a Ph.D., then do a postdoc, then do another postdoc, then do another postdoc, then do another postdoc, really do at least 6-7 postdocs, moving around the world every year the entire time, and at the end of all of that, if you've managed to stay employed at poverty wages on highly competitive postdocs that you may not even get, while not flying apart at the emotional seams, you may finally be competitive enough to be amongst the minority of 40-year-old Ph.D. holders that gets a lab or a tenure-track position, at which point the fun REALLY begins as you are forced onto the grantwriting treadmill and feel little job security, since universities increasingly require junior faculty to 'pay their own way' with external grants or be budgeted out."
And that's INSIDE STEM, where this person is almost certain to be uncompetitive as a B.A. holder trying to get into graduate programs.
Much more likely is that with great grades and GRE scores they'll be admitted to a humanities or social sciences Ph.D. program, with many of the same problems but with CATASTROPHICALLY worse job prospects due to the accelerating collapse of humanities budgets and support on most campuses.
A Ph.D. is absolutely not the way to go unless you are independently wealthy and are looking for a way to "contribute to the world," since you don't actually need to draw a salary.
For anyone with student loans, it's a disastrous decision right now, and I wouldn't recommend it.
I say this as someone with a Ph.D. who is on a faculty and routinely is approached by starry-eyed top students looking to "make the world a better place" and "do research." Given the competition out there right now, only the superstars should even attempt it, and then only if they're not strapped for cash. Hint: If you don't know whether or not you're a superstar, you're not.
I think in a decade I've strongly recommended that someone enter a Ph.D. program once, and greeted the suggestion favorably maybe three times total, out of thousands of students, many of them with the classic "4.0 GPA" and tons of "book smarts."
In short, I disagree strongly with the suggestion. Unless you absolutely know that you're competitive already on the academic market, DO NOT GO. Don't listen to the marketing from the schools; it's designed to drive (a) your enrollment and tuition, and/or (b) your cheap labor as a teaching assistant/research assistant forever once you're in the program. It's a win for the institution, not for you.
The easiest sanity checks: Do you know exactly what your dissertation will be about and what you'll need to do, in broad strokes, to conduct your research, as well as what resources you'll need? Do you already have personal contact with faculty on a well-matched campus in a well-matched department who are championing you and who want to bring you in as one of their own students/assistants?
If your answer to either of these questions is "no," then while you may be offered a position somewhere, you will be on the losing end of the deal and would be naive to take it.
And he makes a FUNDAMENTAL mistake by focusing on "defining how a new technology approach will add value".
At the CxO level that is easy to do. It will allow the company to synergize your core with blah blah buzzword blah buzzword.
But the reality is that it is about adding more achievements and buzzwords to someone's resume so that they can move on before their choices bite them.
It is easier to change the specification to fit the program than vice versa.