Dude, listen. I don't even know what you're trying to argue at this point, and to be honest, I don't really care. You don't understand C (or English!) well enough for me to get anything out of this conversation, which as far as I can tell, no longer has anything to do with anything besides your ego. The basic point -- that it is within the guidelines of the Standard for a C compiler to delete null checks on a pointer after it is guaranteed that said pointer has been dereferenced -- has been proved multiple times. Whatever fucking alcoholism or anger management or insecurity issues or whatever are leading you to ramble down this insane, incoherent road, I'll let you deal with on your own.
Where did I admit that I made an erroneous claim? Where did I MAKE an erroneous claim? Like, I was worried I would confuse you with all those words, but this is ludicrous. I never said GCC had this optimization by default -- only that it's Standards-legal to make that optimization, you dummy. Re-read the thread. You are very confused.
Now you're just using a very arbitrary definition of "broken." The compiler, in this specific instance, is working precisely as intended. It's not like someone accidentally went and implemented -fdelete-null-pointer-checks in GCC and Clang (and the equivalent optimization in MSVC) and then everyone else went on accepting it without question. It's a concept with quite a bit of thought and care and discussion put into it.
The basic premise of an optimizing compiler is this: produce output that is at least as fast as the original code as-written and adheres to all defined behavior. In this case, it's spot-on -- the only way through the example function with defined behavior is to have a non-NULL pointer, in which case, the branch comparison is a waste of CPU cycles. For undefined behavior, the compiler has no obligations. All bets are off. You don't get to dereference NULL pointers, then complain that your program didn't work as expected, unless you're working with a compiler that honors obligations above and beyond the C standard.
There are some environments in which you DO want to have some say in what happens next -- which I guess in my opinion would be anywhere that dereferencing a NULL pointer is legal, or at the very least, not instantly and reliably fatal. Compiler authors have not forgotten about you. In GCC, for example, you have two options:
1. Do not use -O2 or -O3
2. Use -O2 or -O3 in conjunction with -fno-delete-null-pointer-checks, in which case, your null pointer checks will be left unmolested.
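In practice, option 2 looks like this (the flags are real GCC options; the file name is hypothetical):

```shell
# Optimize, but keep null-pointer checks intact.
gcc -O2 -fno-delete-null-pointer-checks -o prog null_check.c
```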
I know that after a piece of code similar to the example was discovered in the Linux kernel, they decided to build with -fno-delete-null-pointer-checks. Not sure if that's still true.
A far more egregious example of a compiler exploiting undefined behavior is GCC 1.x which, upon encountering an unrecognized pragma, would attempt to launch nethack, rogue, or the Emacs towers of hanoi, or, failing all of those, just print a message making fun of you.
In conclusion... know thy optimizer. It's making decisions about your code that can affect you, and it is configured by default to cover the most common use cases. If your program depends on behaviors that are unusual and not covered by the standard (like being able to dereference a null pointer), then you should review your compiler's documentation and see if you need to tune the optimizer a bit for your use case. But if your standards-compliant compiler is applying a well-documented optimization in a manner that breaks you, then it's your project that's broken, either for using that optimization, or for relying on undefined behavior.
From your favorite FAQ and mine, comp.lang.c: (http://c-faq.com/ansi/undef.html):
"undefined: Anything at all can happen; the Standard imposes no requirements. The program may fail to compile, or it may execute incorrectly (either crashing or silently generating incorrect results), or it may fortuitously do exactly what the programmer intended. Note, too, that since the Standard imposes absolutely no requirements on the behavior of a compiler faced with an instance of undefined behavior, the compiler (more importantly, any generated code) can do absolutely anything."
In other words, once you do something whose behavior is undefined, you have a program whose execution is (at least, as far as the C standard on its own is concerned) unpredictable. Given that, the compiler can do almost anything it wants in situations where behavior is undefined. It could, for example, just abruptly terminate the program. That would make Chris's comment spot-on.
Alternatively, he could rewrite the comment as,
So yes, it is a completely legit optimization, in full accordance with the C standard, and if you REALLY want to be able to dereference a NULL and have some expectation about what your program does after that, then you need to choose your compilers and/or optimization settings carefully because the C standard alone is not going to give you what you want.
Yeah man if only the LLVM team thought to look at the C standard or consult with Slashdot commenter Zero__Kelvin
You're not really a very good programmer if this is your reaction to being proven wrong.
You really, really ought to read the link he gave you. It's quite eye-opening.
You're an idiot. I've been an embedded systems programmer for 30 years and he was wrong, which he finally admitted. Since he admitted to it in the post above the one you quoted from, I have to assume your reading comprehension skills are on a par with your C programming sk1llz. Good luck learning C!
Uh, I think you need to re-read his post a little more carefully. This is getting a little embarrassing for you, and if you've been doing embedded development for 30 years and still don't know how optimizing compilers work, I feel REALLY bad for you. I can see why you're so insecure.
Post the code rather than trying to explain it in prose and I'll tell you if you understand it yet, but at least you finally see where you went wrong. That's progress and I wish you well!
It sounds like you are a very passionate hobbyist, but one of the nice things about a forum like Slashdot is that there are a lot of professionals like EvanED who will offer you free advice. I suggest you start listening and stop biting the hand that feeds you! Who knows, Evan might have even made some of the software you're using right now. If you work and study hard and ask good questions (respectfully!) you might even be able to work on it with guys like him someday.
Not only was there a serious security issue here, but Dropbox customers are having to find out about this through blogs. Dropbox has yet to email its users about this issue. It claims on its blog that users who logged in during this time have been notified. I logged in during this time, and have received no notice.
I am now leaving Dropbox. I need to review Wuala and Spideroak to see if they meet my needs, but I can safely say that this event and Dropbox's earlier behavior have demonstrated to me that they do not take the security and privacy of their customers seriously.
Why would I want a vicious shark to be an attack dog? It just seems like I either have a suffocating shark, or I was very confused about my requirements when I went looking for an attack dog.
Maybe they just classified the really good part: when the lights came back on, there were only 49.
Just as rat poison is not harmful to humans, so long as you don't ingest it.
Or Ke$ha, as long as you don't listen to it.
Geez, Linux is not some revolutionary, unique software. It copies from other systems and OSes. As long as we know what and where, we can figure out why and how.
As for Linus: not scalable. He needs a break. And do you all really know he's the only one that commits? Really? It's just a git account, i.e. Linus could still be committing in 2310, if he gave someone his password, of course... Conspiracies, conspiracies....
Come on, Darl, let it go. It's time to move on.
Dude, you ruined it. As written, you have a brilliant joke in C: the premise is bad (we're beating a dead horse), the plan is questionable (we're directly comparing a pointer and a string literal), and the execution is sloppy (thanks to the typo, we're testing the result of a non-zero assignment, so the horse is beat forever regardless of liveliness).
This is pretty much the story of SCO.
I have to admit, I had to ignore years of experience with Internet forums to follow a link to "goatse.fr."