Naw, that's only useful if you want to track an IP address.
I take it you don't live in a state where anyone has tried to drag the "ID vs. evolution" issue into political debate recently. The problem is that for most people "true" means "I want to believe this", not "This can be independently verified".
Still fighting the civil war? It's time to move on. The battle was won ages ago. The only ones still fighting it are the odd evangelical and the science cheerleaders.
Just let it go, man. Just let it go...
On truth, I should probably point out that science does not deal in truth. Science neither offers truth nor tends toward it. Truth is, in fact, antithetical to science. You will never hear "science says that x is true" from anyone with a basic understanding of science.
I know that it's useful for fighting against the enemy in your imaginary creation/evolution war, but distorting the very nature of science can only hurt the public understanding of science.
Otherwise there would be no incentive to create in the first place.
If the internet has taught us anything, it's that people will create without any sort of extrinsic motivator. For the few that need an extra push to get their work off their desk and out to an audience, a little recognition from a supportive community of like-minded creatives is more than adequate.
Without copyright, we'd not only have music, movies, books, games, etc. we'd have them in abundance. As a bonus, Justin Bieber wouldn't be a name you'd recognize.
(Yes, copyright is still a very good thing. I agree that the term should be much shorter. Something like 10-15 years.)
A real doctor or an MD?
An MD I can see.
I don't feel obliged to mustard all over Cathy Guisewite because her comic doesn't amuse me. Why do people dump so hard on xkcd and Randall Munroe?
My guess? Cathy Guisewite isn't a pretentious ass that panders to the bottom 1% of self-described "rationalists".
The constant flow of links on forums like this along with the wasteful printouts that find their way inexplicably to my desk makes xkcd difficult to ignore. Cathy, in contrast, is happily confined to the back of the local paper and rarely (if ever) brought to my attention.
I'll admit that I used to be a regular xkcd reader. I checked out this article as "Time" seemed like it could be interesting. I was wrong. It's the same nonsense that I and others outgrew years ago.
Apple prides itself on its engineering, and "deceptive simplicity". The "it just works" strategy is behind most all of their products. And in many ways, that is a really great thing. And if you have a Mac at home and an iPhone or iPad, everything "just works together", too.
Not quite. Apple's strategy is to create the myth of "deceptive simplicity" and "it just works". As any honest iPhone user will tell you, neither of those things is true.
(Just an example: Putting music on an iPhone can only be described as frustrating; particularly if you're using a computer that is not your own. On an Android or BlackBerry phone, you simply copy it over like it's a flash drive. No hassle, no "syncing", and no clunky software required. It just works.)
It's a bit like the old myth that "Macs are better for graphics", which has never been true. It kept them alive through the '90s, though it almost wasn't enough. It'll be interesting to see how long they can ride the quality and ease-of-use myths.
Actually, it's the other way around: tablets have opened up content creation in ways that were previously closed to "normal" people who couldn't afford thousands of dollars for expensive applications.
Finger painting has never required thousands of dollars worth of anything. It has certainly never required software!
They really should have loaded their software on top of Android and been done with it. Immediate developer support in a slightly better dev environment.
You're either unfamiliar with BlackBerry development or unfamiliar with Android development.
(Android development is a nightmare. WTF was Google thinking?)
Because in the crazy world of Slashdot, "Computer Science" is all about practical computer applications.
"Computer Science is no more about computers than astronomy is about telescopes"
I was an undergrad CS major back in the mid-'90s, when computer science was still actually taught at most colleges. From what I've seen, it's turned into "Learn C# for $21k/semester" all over. Even SIGCSE is dominated by programming-related topics.
(I don't regret it, but I'm really glad I switched fields. Thanks.)
With enough effort, you can develop in FORTRAN regardless of the language.
If it's good enough for NASA and nuclear reactors, it's good enough for IVI.
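In the spirit of the old joke that a determined programmer can write FORTRAN in any language, here's a hypothetical sketch (not from the thread) of what that looks like in Python: all-caps names, explicit index loops, and none of the language's idioms:

```python
# A deliberately un-idiomatic Python function, written the way a
# die-hard FORTRAN programmer might: all-caps names and an explicit
# DO-loop-style index instead of zip() or sum(). Illustrative only.

def DOTPRD(N, X, Y):
    SUM = 0.0
    I = 0
    while I < N:          # DO 10 I = 1, N
        SUM = SUM + X[I] * Y[I]
        I = I + 1         # 10 CONTINUE
    return SUM

print(DOTPRD(3, [1.0, 2.0, 3.0], [4.0, 5.0, 6.0]))  # 32.0
```

The idiomatic one-liner would be `sum(x * y for x, y in zip(X, Y))`, which is rather the point.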
It's the world's least annoying way to deal with email.
This one is true.
Sent from my iPad
(Which is why this message is so short.)
Without looking at the conditions that the code was written in, it is impossible to make the judgments that you are making.
That's the point the bulk of my post is making. Though I would also claim that such judgments are still impossible even if you include the conditions under which the code was written.
Once the system is running, some relationships become clear and reductions are possible.
I couldn't agree more. Yet another reason that arrogant / insecure developers think everyone around them writes nothing but crap code. A friend of mine likes to say "There are no good writers, only good re-writers." That seems to apply to computer programming as well.
I hate these smart/competent tags that people like to put on other people.
Me too. I think it's perfectly ridiculous.
Competent programmers are a dime a dozen.
Programming is the easiest damn thing in the world. It's so easy that children can and do easily teach themselves!
Sure, some problems are hard. Luckily, you can sometimes avoid them altogether. Go read some of Chuck Moore's work.
Anyhow, how do you judge the quality of a programmer? There's only one way that I know: by the quality of their output. But that can't be right, can it? Some of the most incompetent code I've ever seen has been written by programmers generally considered to be brilliant.
Take a chunk of code known to work correctly. It won't take you long to find one developer to say that it's brilliant, and another to say that it's total garbage. Why? The first developer either doesn't understand it or sees some clever or interesting tricks. The second developer sees it as unnecessarily complicated, the same problem being solvable with a much smaller, faster, and simpler solution.
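To make that disagreement concrete, here's a hypothetical sketch (a toy problem, not from the thread) of two correct solutions to the same task; which one is "brilliant" depends entirely on the reviewer:

```python
# Hypothetical illustration: two correct solutions to the same problem.
# Both sum the integers 1..n.

def sum_clever(n):
    """A 'clever' recursive version. It works, and a reviewer who
    enjoys recursion might call it elegant."""
    if n == 0:
        return 0
    return n + sum_clever(n - 1)

def sum_simple(n):
    """The smaller, faster, simpler version: Gauss's closed form.
    A second reviewer sees the recursion above as needless complexity."""
    return n * (n + 1) // 2

print(sum_clever(100) == sum_simple(100))  # True
```

Both are "known to work correctly"; the argument is never about that.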
If you'd rather: Perhaps the code is fine and solves the problem well, but the second developer would have approached the problem differently. Maybe they disliked the use of a specific language feature or technique, choice of brace style, or selected language.
Programmers usually aren't well versed in the humanities and tend to think in absurd black-and-white terms. They constantly mistake completely subjective judgments for objective conclusions. They're also prone to believe absurd myths (mistaking common wisdom for objective fact) and tend to buy into the latest industry fads. You'll frequently find them defending statements that they obviously don't understand. They've simply never questioned their favorite meme. How could it be anything other than pure fact? Programming is like math, right?
Code quality is highly subjective, obviously. I understand that there are objective metrics like size, speed, and memory use. However, we can only use those to compare two solutions to the same problem! Even then, subjective measures (like readability) will quickly come into play, which some people will consider to trump this or that objective measure -- particularly when two solutions are close on objective terms.
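A minimal sketch of that caveat (my own toy example, not anything discussed above): a speed comparison is only meaningful once you've confirmed both candidates solve the same problem identically.

```python
# Objective metrics only compare solutions to the SAME problem.
# Two ways to build a list of squares:
import timeit

def squares_loop(n):
    out = []
    for i in range(n):
        out.append(i * i)
    return out

def squares_comp(n):
    return [i * i for i in range(n)]

# Identical output, so timing them against each other is fair:
assert squares_loop(1000) == squares_comp(1000)

t_loop = timeit.timeit(lambda: squares_loop(1000), number=1000)
t_comp = timeit.timeit(lambda: squares_comp(1000), number=1000)
print(f"loop: {t_loop:.4f}s  comprehension: {t_comp:.4f}s")
```

Even with the numbers in hand, someone will still argue the slower version reads better -- and that's a subjective call, not an objective one.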
That absurd black-and-white / right-wrong thinking makes each person think that their subjective opinion is objectively correct, and thus irrefutable. What else can they assume but that they're surrounded by incompetent morons?
Do you know who writes bad code? Everyone. The best developer you know wrote crap code last week. You wrote crap code last week. I wrote crap code last week. Not you, you say? You used the latest set of buzzwords? Remember this: Yesterday's best practices are today's obvious mistakes. Sometimes they oscillate between bad and good. Pick damn near any topic and dig through both current and old articles and blogs to get a sense of how the common wisdom changed over time. (Nonsense "design patterns" are an easy mark. You'll find lots of back and forth on many of those.)
There are other reasons, of course. A big problem seems to be developers over-complicating problems -- sometimes going so far as to write an interesting program that solves the problem they've been tasked with as a side benefit. I replaced an 81k (1700-line) component with an 8k (300-line) component a couple of weeks ago. Was the developer of the old component incompetent? Not at all. He just made the problem significantly harder than it was. I'd guess that it was to keep the otherwise dull project interesting -- or because he found the problem space interesting and wanted to explore it.
When I see stuff like this I can't tell if it's arrogance or just insecurity.
Locking "Metro" apps to their store was their biggest mistake. You'd think that after Ballmer's "Developers, developers, developers" chant they would have known that ahead of time! Imposing artificial barriers like this would have killed them in the early '90s.
Apple gets away with calling that sort of nonsense "good for consumers", sure, but they're a special case.
Matter will be damaged in direct proportion to its value.