... As basically no profession has a good grip on statistics except [statisticians],
There I fixed it for you.
Or to paraphrase whimsically: "C: it's been ignoring Unicode since before you were born."
Can any language do Unicode right yet?
You can throw away any language that uses UTF-16 right from the start. What's left is C/C++, if you are careful enough.
In fact C (C89/90 through C11) is character-set neutral, and continues to maintain support for EBCDIC, and for those still stranded without 7-bit ASCII (which is technically superseded by ISO/IEC 646), by providing trigraph (5.2.1.1) and digraph (6.4.6) support.
See Section 1 Scope, paragraph 2: "This International Standard does not specify [...] the mechanism by which C programs are transformed for use by a data-processing system", and the definition of 3.6 byte, an "addressable unit of data storage large enough to hold any member of the basic character set of the execution environment". So yes, Cray, you can even have 12-bit bytes (CDC Cyber).
Kids these days.
> These shiny new processor having working TSX instruction sets? The ones that are supposed to help with virtualization?
TSX is not for virtualization but for transactional synchronization: it provides efficient transactional locking for multi-threaded applications. It is not specifically for virtualization, although virtualization can benefit from efficient locking as well.
No, as far as I know these have TSX disabled, or will be with a microcode update, as TSX isn't expected to be fixed until 2015 in Broadwell or Haswell-EX Xeons (not Haswell-EP, which these are).
From what little I have read on calculators in standardized testing, Texas Instruments never had, nor has, a monopoly on approved calculators. In particular, I have never seen a list of approved calculators that did not include at least some Casio or Sharp models as well, and more complete lists often included at least one HP model -- the real premium calculator for geeks.
The fact that the TI-84 Plus was probably the most advanced model approved meant TI could market that position into a perception of it being a highly desirable model, one that encouraged parents to pay the premium because they wanted to give their child every advantage they could possibly afford or find.
Parents are "trained" (indoctrinated) right from pregnancy to buy "educational" toys, aids, to prenatal music blankets. After the child is born, they are constantly bombarded with "educational" products that often have no merit, beyond the statistical correlation between parents who invest the most money into their children's learning, are more likely to be the same ones who invest time in their children's learning as well.
While a Math/CS major in undergrad, I used a $5 whatever-brand scientific calculator rather than my $100+ graphing calculator, because a) I didn't need any graphing or fancy features -- they were useless to me; b) the graphing calculator's battery life sucked, and I was too cheap to constantly replace batteries; and c) other than for engineering students who bought RPN calculators (e.g. the HP-35), it didn't matter one iota beyond having the cheapest scientific calculator you could find.
Maybe you missed this part of the heading (not even TFA):
"Nearly half of the software developers in the United States do not have a college degree."
Did you not notice the lack of citation, age, or accuracy of this "factoid"? It's worth less than the electrons used to create it, with no credibility to be seen.
"Nearly third-quarters of software developers smell funny." Hey look, I created a "factoid" too. Just as meaningful as the one in the article and summary, and probably slightly more reliable and accurate.
Actually, in terms of self-awareness, development, and personal growth, experiencing university life can have a tremendous impact beyond the classroom. On average, I'd say it at least doubles your social skills -- up to an order-of-magnitude improvement for some -- and improves your quality of life. My personal opinion is that for many young people (and perhaps those not so young) considering this question, this can be a far greater, and more important, benefit to your quality of life.
Having a degree can also make it easier to get a chance to be considered during a tight job market, and improve chances at negotiating a better salary / contract.
Getting a degree without learning to code will certainly make you an incompetent bane of your co-workers' existence, no matter how short that career may be.
While being aware of the financial realities (and potential opportunities for assistance) of the cost of university, the strongest case is obviously: do both.
Others have pointed out the obvious complementary nature of knowledge (theory) combined with experience (practice). If you don't know what to work towards, you can waste a lot of time and effort doing things the hard way, or rediscovering bubble sort and merge sort. And if you don't know what can and can't be done, or how to do it, you end up a hard-working monkey with a very limited playbook. You may find the ever-constant change in technology a burden rather than an enjoyment (I mean after the first 5 years), because in my experience those who understand the fundamentals -- those abstract or theoretical bits -- can adapt to change more readily, and often with dramatically less effort.
Most famous university dropouts (in the sciences and IT) both made it through admissions, obviously, and, more importantly, left before they could finish their degree: that is to say, they were most likely in their 3rd or 4th year, not flunking out entirely in their first semester (though having rough to horrible first-year grades isn't particularly uncommon, even for many who later become professors themselves). In a fair number of cases, including some William guy from Redmond, they completed their degree later in life.
In the end, it is what you make of it, just like everything else in life.
What do you see or expect for the future of electronic-centric publishing?
Are e-books going to be dominated by the established publishing companies' tendency to try to extend their control over the works of their authors, and over their customers, as demonstrated by adoption-limiting DRM and fear of digital piracy?
Will there be a role for publishers, perhaps as curators and editors (in both senses of the word) of fiction and non-fictional works, separate from that of the retailers?
Will authors be able to find an economically sustainable means of financing their writing (including any necessary research) that can withstand the perils of near-free proliferation of illicit, unlicensed digital copies of their works? Or will authors have to have either patrons (sponsors) (e.g. literary awards' prize money) or employers (e.g. academia) who pay them to write, perhaps limiting most content to "safe" or "salable" topics?
It's only "secret" in the sense that almost all pharmaceutical research is completely ignored by the media.
If you dig around you'll find some articles about ZMapp in no-name low-impact journals like PNAS and Science.
They (the media) mean Mapp Biotech didn't hire a big-name PR firm to issue a press release about this "secret" (pre-human-trials) treatment, which is how most "science" and "health" news is researched by the media.
Does Mapp even have a publicly traded stock? No mention of ticker symbol, how could they be a real pharmaceutical company without hyping that?
I mean, my kids have a NASDAQ biotech company now, after their lemonade stand was closed down by the IRS for not printing a "forward-looking disclosure" on their investment prospectus (aka napkins).
As another researcher in the pharma industry: reread your post. Your entire post only highlights how poor a job pharmaceutical companies do at effectively bringing drugs to market, all while adding the inefficiency of a 20% profit margin.
Notice I said "bringing drugs to market," not funding the preliminary basic research into the actual discovery and isolation of the drug and/or drug interaction, which continues to be funded (95+%) by the federal governments of the G8 nations.
Then there's being granted an 18- or 20-year monopoly (from the patent filing date, admittedly, not the marketing approval date), if you successfully complete the approval research without killing too many test participants. And for any "successful" to "blockbuster" drug, the entire pre-approval expense, including administration and marketing, is more than doubly recouped in the first year of sales.*
The cited book (The $800 Million Pill)* is not the only one to criticize and rebut the $800 million figure oft-touted in the media, which actually comes from DiMasi's 2001 paper "The price of innovation: new estimates of drug development costs." Though even the Wall Street Journal notes: "[f]or instance, only $403 million of Dr. DiMasi's $802 million total are actual out-of-pocket expenses. The rest is an estimated cost of capital -- or the return that investing the money at an 11% rate of return would have earned over time." Non-executive types would call it fudging the numbers.
* The $800 Million Pill, by Merrill Goozner.
There is no reason American health programs can not do the same.
Actually, there is a law against that: "The 2003 Medicare law* prohibits Medicare from negotiating drug prices, setting prices or establishing a uniform list of covered drugs, known as a formulary."
*: full title "Medicare Prescription Drug, Improvement, and Modernization Act"
The company I worked for was dialing into UUNET back in 1987/88. Why aren't they considered the first?
Because UUNET didn't offer dial-up TCP/IP connectivity (or inter-networking) back in the 1980s; they offered dial-up UUCP (Unix-to-Unix Copy) services for Usenet (NetNews) and email (with ! (bang) addresses).
They offered backbone IP access starting in 1990 via their AlterNet service, according to the Wikipedia article you linked.
I know there are brilliant doctors and scientists in Atlanta who handle highly-communicable diseases, but is this such a brilliant idea?
When did Slashdot become home to stupid FUD*-spewing dweebs with little or no common sense? The subtitle is "News for Nerds," which would suggest somebody who submits something might have half a clue about what they are talking about (leaving the plebs to pontificate on logical and scientific fallacies, or imagine a Beowulf cluster of hot grits).
I want my Slashdot with nerds filter enabled.
And yes, it is an excellent idea, because it gives the CDC a living "test tube" of the actual active Ebola virus, not a sample of infected blood collected and shipped on ice. That makes it ideal for study, and possibly for detection of any variant (i.e. mutation) that had not been noticed before. Of course, this will likely cost the American doctor his/her life, but such is the risk of fighting a viral outbreak, and the real world beyond web forums and politicians' rambling.
* FUD: Fear, Uncertainty, and Doubt
[...] The point being, Wikipedia is not a source of anything, it is the product of a series of sources. So you do not cite Wikipedia, you cite the article it points to.[...]
Careful. While you can use Wikipedia as a meta-index to find references, you can only cite those 3rd-party references if and only if you actually obtain a copy and view the content yourself.
Otherwise you are merely shifting from hoping that the Wikipedia content is accurate, complete, etc., to hoping that the citations are a) real, and b) accurately portrayed -- that is, that the Wikipedia content citing the 3rd-party source is faithful to the cited work's content.
For example, I could cite the Gutenberg Bible (or pick your favorite "lost" historic title found only in the Vatican library) as the reference for the existence of aliens on Earth, and given the rarity and inaccessibility of such a reference, very few, if any, Wikipedians have access to counter such a citation. Or claim that Douglas Adams ripped off "42" and "the meaning of life, the universe, and everything" from Al-Khwarizmi's Kitab al-Jabr wa-l-Muqabala. Not necessarily true, but hard to disprove.
In the dark days of computing history, before AJAX was even conceived and the Mad Men were still crazies, "in-memory databases" meant that the database INDEX was in RAM (ideally, if your DB admin was worth their salary). But then people wanted to pretend they were the next Google, famous for its massive search index in the petabytes of storage, so hipsters started the NoSQL fad to be awkward, like middle-aged men in skinny jeans, in a vain attempt to self-proclaim their importance.
Now Oracle is making money by selling RDBMSes to organizations that spent more money and time on hookers and coke than on doing real IT management. The Old is New, yet again.
Rinse, Lather, Repeat.
This system will self-destruct in five minutes.