It is, however, gaining members and activity again.
After a long period of gentle stagnation, it was acquired a year ago by an internet company with long experience of Cix and its community, and is now enjoying something of a revival. We're all fairly optimistic about its future again!
(I'm gidds@cix, and spend more time there than here or anywhere else. I'd recommend it to anyone looking for interesting, intelligent, wide-ranging discussion.)
a day's work of coding meant sitting in one spot, staring at chars/text, thinking, and then more of the same.
To judge from some of my co-workers over the years, the 'thinking' part would seem to be optional...
Only idiots believe that patents encourage innovation
But encouraging innovation isn't the (direct) intention of patents: it's to get the technical details published and available to the public (in return for which the inventor gets a time-limited monopoly on its use).
In industries where seeing an invention in use offers little clue to its construction (e.g. drugs), that may be a reasonable bargain. It would take huge resources to rediscover and re-test a drug, and so publication saves money in the end. That might also apply in fields like microelectronics.
However, where the invention is blindingly obvious once seen*, publication adds little or nothing, and so it seems a very bad bargain. You don't need to bribe people to innovate; as you say, they'll do that anyway! You only need to bribe them to share the details, and only where those details are worth more than the monopoly.
It's rather a pity** that the various national Patent Offices don't use this criterion to judge patentability...
(* I'm not addressing whether an invention is obvious before it's seen; that's a very different question.)
(** Ironic understatement.)
From Wikipedia, the background rate of HIV in the USA is 0.375%: 1,200,000 people are HIV+ and 310,800,000 are HIV-.
From the sensitivity: of those 1,200,000 who are HIV+, 100,000 (1 in 12) would test -ve, and 1,100,000 would test +ve. And from the specificity: of the 310,800,000 who are HIV-, 310,737,840 would test -ve, and 62,160 (1 in 5000) would test +ve.
This means that of the 1,162,160 who test +ve, 94.7% would actually be HIV+. And of the 310,837,840 who test -ve, 99.97% would be HIV-.
So while the test isn't totally accurate, it seems good enough for general use. Certainly, a +ve test result would necessitate proper medical advice.
(Of course, the calculation's simpler if you use Bayes Theorem directly, but filling in the numbers can be easier to follow. And the conclusions only apply in general; if you're in a high-risk group, then a -ve test result will be less reassuring.)
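For anyone who wants to check the arithmetic, here's a minimal sketch of the same calculation, using the figures quoted above (1,200,000 HIV+, 310,800,000 HIV-, a 1-in-12 false-negative rate, and a 1-in-5000 false-positive rate):

```python
# Sanity-check of the screening figures above.
hiv_pos = 1_200_000      # HIV+ population (from the comment)
hiv_neg = 310_800_000    # HIV- population (from the comment)

false_neg = hiv_pos // 12        # 100,000 HIV+ people who test -ve
true_pos  = hiv_pos - false_neg  # 1,100,000 HIV+ people who test +ve
false_pos = hiv_neg // 5000      # 62,160 HIV- people who test +ve
true_neg  = hiv_neg - false_pos  # 310,737,840 HIV- people who test -ve

# P(HIV+ | tested +ve) and P(HIV- | tested -ve):
ppv = true_pos / (true_pos + false_pos)
npv = true_neg / (true_neg + false_neg)

print(f"Of those testing +ve, {ppv:.1%} are HIV+")   # ~94.7%
print(f"Of those testing -ve, {npv:.2%} are HIV-")   # ~99.97%
```

This is just Bayes' Theorem with the priors and likelihoods written out as whole-population counts, which is why the numbers match the paragraph above.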
If only there were another interconnected network
The first book that got me thinking deeply about computing, not as a collection of hacks and tricks, but as something to be done properly. It's not a programming manual as such, but it's chock-full of advice-by-example, illustrating some of the really good bits of design in the Beeb, how the various subsystems interacted with each other, how to make best use of extremely limited memory without sacrificing design -- and the value of writing something well, which could allow it to be used in ways you never thought of.
I've read many great books since then, of course -- K&R, Knuth, Sedgewick, Programming Pearls, Effective Java, Programming in Scala, and more -- but the Advanced User Guide has probably had the most influence on me and how I think about computers. It also has the most annotations and other scribblings all over it!
Be careful with that. In some jurisdictions, continuing to accept paycheques can constitute tacit acceptance of the new terms and conditions, regardless of whether you physically sign. (IANAL, but we consulted one.)
But I concur with the parent threads. I myself have negotiated the wording in my own contract with more than one employer, so that the IP clause covers only things I do on their business, time, or equipment. If you're polite, explain your concerns, suggest ways to address them, and indicate that you'd be happy to sign once they're addressed, then IME companies will often work with you.
Your comment saddens me -- not because you didn't learn about this at school, but because it sounds as if you expected to.
Like you, I don't think I ever heard about this experiment, nor anything remotely connected with it, at school (or university).
However, school should merely be the start of learning! (There's a strong argument that the most important function of a school isn't to teach you facts, nor even ideas, but to teach you how to learn, and how to research.) For example, you can catch interesting documentaries and discussion programmes on the radio and TV (e.g. BBC Radio 4, BBC2), you can read web sites and magazines, and you can keep your eyes and ears open.
That's how I heard about this experiment. I've come across several references to it, and been shocked and fascinated by its results and what they say about human nature.
Ultimately, the responsibility for finding out about the world falls to each one of us. Luckily, it's easier than ever to learn about almost anything imaginable; even an imperfect tool like Wikipedia can be immensely valuable. I still have vast areas of ignorance, but I discover new things every day, and I hope I always will.
Though I'm not sure whether that's his own failing, or whether it just reflects the attitude of the people he's discussing — perhaps especially those in the world of education.
Personally, I see external devices (reference books, handheld devices, Internet access, etc.) as a pretty good repository for data; also for facts (if you can sort out issues of trust and validation), and maybe even for knowledge. But wisdom is something very different, and can't be externalised in the same way.
To use the article's example, there's little point in memorising the data of the Battle of Hastings simply so that you can respond to questions like “When was the Battle of Hastings?” However, knowing that date allows you to relate it to other dates, events, situations, trends, and patterns; it can become part of a mental framework that gives you a broad understanding of the whole time and place, and that's the sort of thing that no external references can substitute for.
To pick a more extreme example:
- Is a dictionary useful for looking up rarely-used words and subtle distinctions? Absolutely.
- But can it substitute for learning a language at all? Absolutely not.
And any discussion of ‘knowledge’ which fails to distinguish those cases is going to get mired in misunderstanding.
For example, how could they omit Elite?
With everything available today, it's hard to understand just how revolutionary Elite was. Back then, games were generally trivial things you spent your pocket money on, with a simple premise and repetitive game-play.
Elite was on a radically different scale: immersive, open-ended, and one of the most influential games ever. It gave you a whole galaxy to explore (8 of them, in fact), with many different strategies and roles you could play, missions to take on, and first-person 3D combat. All on an 8-bit micro (initially the BBC Micro).
Even the box was different: instead of the usual plain cassette case, this was large, stylish, and included a specially-written novel to bring you into its world. It was launched with a big publicity campaign, spawned umpteen conversions and sequels, and pretty much defined the space flight simulation/trading genre.
I'm not sure what TFA's criteria were, but Elite's wireframe graphics, innovative 3D scanner, and the various ship designs must surely count as 'art' even aside from its huge technological and game-play advances. (I suspect, though, that the real criteria were 'familiar to today's youth', 'played on consoles' and/or 'made in the USA', which explains it...)
2. "The IRA were terrorists, yet they made peace with the UK." Only if you define 'made peace with' as 'stopped bombing'...
* The 'lone wolf' measure allows FISA court warrants for the electronic monitoring of a person for whatever reason -- even without showing that the suspect is an agent of a foreign power or a terrorist. The government has said it has never invoked that provision, but the Obama administration said it wanted to retain the authority to do so.
The government has said it has never invoked that provision -- but how would we know?!
(And if you always believe your government, then I have some nice Iraqi WMDs to sell you...)
"Mobile phone use causes a 0.01%* increase in the chance of certain types of brain cancer over an entire lifetime" would be tricky to prove either way, the effects being almost indistinguishable without careful study over a long period. (* claim entirely made up)
But "mobile phone use kills 50% of users within a year" is clearly loopy — the hundreds of people I know personally who've regularly used mobile phones for several years, all of whom have entirely failed to drop dead, is pretty overwhelming counter-evidence!
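To put a rough number on that counter-evidence (taking 200 as an illustrative stand-in for "hundreds of people", which is my assumption, not the comment's): if the loopy claim were true, the chance that all of them survive even one year is vanishingly small.

```python
# If "50% of users die within a year" were true, the probability that
# 200 independent regular users all survive one year is 0.5**200.
acquaintances = 200          # illustrative count, not a real figure
p_all_alive = 0.5 ** acquaintances

print(p_all_alive)           # on the order of 1e-61
```

That's the sense in which everyday experience can decisively falsify a large claimed effect, while remaining nearly silent about a tiny one.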
In the same way, it's pretty effin' obvious by now that any effects on brain activity from using them, even over many years, must — if they exist at all — be so extremely subtle and hard to spot that they're really not worth wasting time and worry on.
Maybe the interesting question here is why people seem so keen to disregard a lifetime's experience, and disengage all critical thinking, when reading these scare stories...