Comment Agnostic & Atheistic are orthogonal concepts (Score 1) 536

"Agnosticism" is a statement of epistemology. It merely states that there is no way to prove or disprove the existence of God. "Theism" and "atheism" are declarations of belief. The express an acceptance of an ontology.

You can be an agnostic theist, or an agnostic atheist. While the epistemological question of our knowledge of God may be inaccessible, it is difficult to remain completely neutral with respect to personal belief.

Do you think God exists? You are a theist. Do you think God most likely does not exist? You are an atheist. As it turns out, most atheists are agnostic, and are more than willing to admit there is a certain probability that some form of god exists. That probability is simply vanishingly small.

Atheism is not unscientific in the way being religious is. In fact, since the lack of a god is the null hypothesis, atheism is the default scientific proposition. Until there is a testable, positive hypothesis concerning the existence of a god, theism remains entirely in the realm of wish-fulfilling fantasy and philosophical self-gratification.

Agnosticism, meanwhile, is not a "third option." As you use it, as a state of superposition between theism and atheism, it isn't even coherent.

Comment Re:Isn't this already well-known? (Score 5, Insightful) 813

Why is this making the news now?

Because this not only debunks the study (which has been debunked for a few years now), it proves Wakefield manufactured the entire thing. He altered data, misrepresenting each case -- for instance, while Wakefield claimed none of the subjects exhibited signs of autism, medical records show that 5 of the 12 had already been shown to have autism. Further investigation shows that all twelve cases had been misrepresented to various degrees.

Also, Wakefield misrepresented the study to the doctors from whom he received referrals. He called it a "clinical trial," not a study.

Basically, this investigation proves that Wakefield was not simply careless; he intentionally fictionalized the entire study.

We can no longer attribute to incompetency that which is demonstrably malicious.

Comment Re:cable card readers in the 360 FTW (Score 1) 182

No fucking way. The XBox 360 interface fucking sucks. I like my 360, don't get me wrong. Gears of War is one of the best games ever. (I don't get the appeal of Halo, but Gears is a lot of fun.) I fucking hate the XBox Live interface, with its advertisements and cheesy avatars. If I wanted a cartoonish avatar living in a land of advertisements, I would've bought a Wii and moved into a mall.

Take the XBox hardware, rev it to newer processors so it doesn't need a fan, update the interface so it looks less like the old Harvey Comics ads for Little Debbie snacks (you know the ones, with Casper and L'il Devil and Wendy the Witch), and give it a decent way to browse the music and DVD rips you have stored on your old PC in the den, the one piled under your unread year's worth of Maxim and Handyman magazines. Then it might be OK. But if they just used the current XBox interface, I'd never buy it, even if they gave it the ability to modify Sarah Palin's and Fran Drescher's voices so they weren't so fucking annoying.

Comment Re:NO. NO, GOD, NO (Score 1) 182

Usually when people say xbox isn't profitable they are using a deviant definition of profitability that counts sunk costs from years previous.

Yeah, because everyone knows if you spend $5B on something in the first year, and it makes back $3B over the next five years, you can call that $3B profit.

Microsoft might be able to break even with the 360. If their next generation is any good (and there's no reason it shouldn't be decent -- the 360 is a fairly decent platform, though XBox Live is like living in a mall), they might actually turn a profit on the whole enterprise.

Microsoft has proven it has the tenacity to buy its way into a market over the course of several years. It'll be interesting to see whether it has what it takes to move into the set-top box market.

Comment Re:Here's to hoping (Score 3, Informative) 381

So basically, you've seen ONE movie where it wasn't thrown in "just because". UP and Coraline were entirely computer-generated video, and re-rendering with the "camera" in a different position is a matter of tweaking a couple of settings. They could re-make ANY all-CGI film (Ice Age, Wall-E, etc) as 3D if they still had the original files and rendering programs. And probably make money on them.

(Note: Avatar used lots of computer-generated imagery...but not exclusively, and did a lot more with motion capture than is normal.)

Have you even seen Coraline? It was produced via stop-motion, using 3D cameras. There were some digital effects, but not many. So, no. For Coraline, it wasn't thrown in "just because."

Comment ID has fundamental problems (Score 1) 989

What is the unit of complexity? How do you measure complexity? At what point does complexity become "irreducible?"

Irreducible Complexity has fundamental problems that are far more severe than a simple biological example can ever show. The ID folks can provide as many examples of "irreducible complexity" as they desire, and their proposition still has no grounding in actual science.

Behe has proven this by constantly moving the goalposts. "Oh, that example might not've been irreducibly complex, but this one is, certainly."

Comment Re:From the article it is obvious (Score 1) 546

Untyped code is filed under 'ugly' in my book. I generally prefer maximum compile-time safety with as much typing as possible (my favorite language is Haskell :) ).

Haskell is beautiful. I wish I had more opportunity to use it, rather than just playing around with it. You have good taste.

I won't argue about your other points, since they are subjective. Personally, I'd rather prefer good pattern matching to all the OO stuff.

Not really subjective. You made the assertion there wasn't much there for general-purpose computing; I mentioned several features that are good for general-purpose computing. There's nothing subjective there, other than whether or not you like them.

Obviously, many people have found them useful, from the original NeXTStep coders and Lotus Improv creators to the modern OS X and GNUStep developers.

In any case, while I prefer Objective-C over C++ (it just fits my style better), I can definitely understand the desire for stronger compile-time type enforcement. That alone is sufficient to warrant C++ over Objective-C, especially for a project that relies on input from many other people of varying skill levels.

My inner language fanboi came out when you glossed over most of the unique features that make Objective-C a versatile language.

Oh, one other complaint about it to add to your list: current implementations don't properly support private variables and messages.
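
To make that concrete for anyone who hasn't run into it: "private" methods are usually declared in a class extension inside the .m file so they stay out of the public header, but nothing actually stops an outside caller from sending the selector anyway. A minimal sketch of the situation (the Widget class and its method are mine, purely illustrative):

#import <Foundation/Foundation.h>

@interface Widget : NSObject
- (void)refresh;
@end

// The usual workaround: declare the "private" message in a class extension
// so it stays out of the public header. This is privacy by convention only.
@interface Widget ()
- (void)internalRecalculate;
@end

@implementation Widget
- (void)refresh { [self internalRecalculate]; }
- (void)internalRecalculate { NSLog(@"recalculating"); }
@end

int main(void) {
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    Widget *w = [[Widget alloc] init];
    [w refresh];
    // The runtime happily dispatches the "private" message anyway.
    [w performSelector:@selector(internalRecalculate)];
    [w release];
    [pool drain];
    return 0;
}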

Comment Re:From the article it is obvious (Score 1) 546

Nope, not a troll.

Objective-C is poor. For example, the most useful part of C++ are fast typed template containers.

This is also one of C++'s weaknesses. Troubleshooting templates is a royal pain in the ass. I'd rather chase a pointer to hell and back than deal with another set of poorly-written templates. And templates are almost invariably poorly written.

They are damned useful when done right, though.

Objective-C has only pointer containers which are untyped.

True, that. Objective-C sacrificed compile-time type checking for flexibility. Well-written Objective-C code is almost beautiful (something that can't be said even for well-written C++ code), but you really need to be careful with your types.
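
A tiny illustration of that trade-off (the variable names are mine): NSArray stores plain id pointers, so the compiler lets you pull an element out as whatever type you claim it is, and the mistake only surfaces at run time:

#import <Foundation/Foundation.h>

int main(void) {
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    // NSArray holds untyped id pointers -- a string and a number mix freely.
    NSArray *items = [NSArray arrayWithObjects:@"forty-two",
                               [NSNumber numberWithInt:42], nil];

    // Compiles without a peep, even though index 1 is actually an NSNumber.
    NSString *name = [items objectAtIndex:1];

    // Only now does it blow up: NSNumber gets a -length message it doesn't
    // understand, and you get an unrecognized-selector exception at run time.
    NSLog(@"%lu", (unsigned long)[name length]);

    [pool drain];
    return 0;
}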

'Const' support? Nope.

RAII and smart pointers? Nope.

"Smart pointers" are really just a kludge to fix up a poor language design choice. RAII isn't all that vital in Objective-C, either.

You are attempting to say Objective-C is deficient because it doesn't support the design patterns you use in your C++ code, when those patterns are only necessary because of C++ itself.

Memory management in Objective-C is quite convoluted, btw.

You ain't kidding. It's getting easier with each iteration of the language, but the GC is kinda particular. Conscientious use of refs is a must.
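
For anyone lucky enough to have avoided it: without the collector, ownership boils down to "if you got the object from alloc, new, copy, or retain, you owe it a release; otherwise you don't" (and under the Objective-C 2.0 collector those same calls become no-ops, which is part of why it all feels convoluted). A minimal sketch of the rules (names are mine):

#import <Foundation/Foundation.h>

int main(void) {
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    // You alloc'd it, so you own it and must eventually release it.
    NSMutableArray *owned = [[NSMutableArray alloc] init];

    // You didn't alloc/copy/retain this one; it comes back autoreleased and
    // the pool cleans it up. Releasing it yourself would be an over-release.
    NSString *borrowed = [NSString stringWithFormat:@"%d bottles of beer", 99];

    [owned addObject:borrowed];   // the array retains whatever it stores

    NSLog(@"%@", owned);

    [owned release];              // balance the alloc
    [pool drain];
    return 0;
}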

So almost nothing useful for general-purpose programming. Except maybe for inheritance.

Riiiight.

Again, I think you are judging Objective-C based on your C++ experience. Late binding, associated references, adding messages to existing classes at run-time, message forwarding, and so on are all excellent general-purpose programming idioms that aren't supported in C++. Couple that with introspection (which is supported in C++, to a degree), and you can write very powerful fully-OO programs.
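
For the C++ folks, a small sketch of what that buys you (the class, category, and method names are mine, purely illustrative): a category grafts a new message onto the existing NSString class, and late binding lets you ask any object at run time whether it handles that message:

#import <Foundation/Foundation.h>

// A category adds -shoutedString to NSString without subclassing it and
// without needing the class's source.
@interface NSString (Shouting)
- (NSString *)shoutedString;
@end

@implementation NSString (Shouting)
- (NSString *)shoutedString {
    return [[self uppercaseString] stringByAppendingString:@"!"];
}
@end

int main(void) {
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    id thing = @"hello, slashdot";   // static type is just 'id'

    // Late binding: decide at run time whether the object handles the message.
    if ([thing respondsToSelector:@selector(shoutedString)]) {
        NSLog(@"%@", [thing shoutedString]);   // prints "HELLO, SLASHDOT!"
    }

    [pool drain];
    return 0;
}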

Objective-C isn't perfect. No language is. But it certainly isn't as anemic as you seem to think.

Comment Not all that slow (Score 2, Insightful) 546

Objective-C isn't necessarily that slow. Message passing can be about four times slower than C++ method invocation, but once cached, the two are comparable. (See here for some interesting stats.)
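
If you're curious what the cached path looks like, here's a rough sketch (variable names are mine): the runtime caches method lookups on its own, but you can also grab the IMP -- a plain C function pointer -- yourself and call it directly in a hot loop, skipping dynamic dispatch entirely:

#import <Foundation/Foundation.h>

int main(void) {
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    NSString *s = @"benchmark me";
    SEL sel = @selector(length);

    // Look the implementation up once; an IMP is just a C function pointer.
    NSUInteger (*lengthFn)(id, SEL) =
        (NSUInteger (*)(id, SEL))[s methodForSelector:sel];

    NSUInteger total = 0;
    NSUInteger i;
    for (i = 0; i < 1000000; i++) {
        total += lengthFn(s, sel);   // direct call, no objc_msgSend dispatch
    }
    NSLog(@"total = %lu", (unsigned long)total);

    [pool drain];
    return 0;
}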

In a system as IO-heavy as GCC, your bottlenecks probably won't be your method calls / message passing. And as for being deterministic: why would a compiler have to be deterministic? There are no hard real-time considerations for compilers. Your variation in compile-times will be minimal, even with a non-deterministic GC.

I think your point 2 (typed) is fairly valid. Part of the reason to move to C++ is to provide a language that is more strongly-typed than C. While the run-time binding of Objective-C makes it a great language for some applications, it does remove some compile-time type checking. (You do get warnings about an object type's ability to process a message, of course, so you don't lose it all.)
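
Concretely (the variables here are mine): with a statically typed receiver the compiler flags a message the class doesn't declare, while the same send through an id compiles silently and only fails at run time:

#import <Foundation/Foundation.h>

int main(void) {
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    NSNumber *n = [NSNumber numberWithInt:7];

    // Statically typed receiver: the compiler warns, along the lines of
    // "NSNumber may not respond to -appendString:" (it doesn't).
    // [n appendString:@" oops"];

    // Untyped receiver: the identical mistake compiles without complaint and
    // dies at run time with an unrecognized-selector exception.
    id anything = n;
    // [anything appendString:@" oops"];

    NSLog(@"%@", anything);

    [pool drain];
    return 0;
}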
