Comment Re:Null pointer detection at compile time (Score 1) 470

Dude, listen. I don't even know what you're trying to argue at this point, and to be honest, I don't really care. You don't understand C (or English!) well enough for me to get anything out of this conversation, which as far as I can tell, no longer has anything to do with anything besides your ego. The basic point -- that the Standard permits a C compiler to delete null checks on a pointer once that pointer is guaranteed to have already been dereferenced -- has been proven multiple times. Whatever fucking alcoholism or anger management or insecurity issues are leading you to ramble down this insane, incoherent road, I'll let you deal with on your own.

Comment Re:Null pointer detection at compile time (Score 1) 470

Where did I admit that I made an erroneous claim? Where did I MAKE an erroneous claim? Like, I was worried I would confuse you with all those words, but this is ludicrous. I never said GCC had this optimization by default -- only that it's Standards-legal to make that optimization, you dummy. Re-read the thread. You are very confused.

Comment Re:Null pointer detection at compile time (Score 1) 470

Now you're just using an arbitrary definition of "broken." The compiler, in this specific instance, is working precisely as intended. It's not like someone accidentally implemented -fdelete-null-pointer-checks in GCC and Clang (MSVC applies the equivalent optimization) and then everyone else went on accepting it without question. It's a concept with quite a bit of thought, care, and discussion put into it.

The basic premise of an optimizing compiler is this: produce output that is at least as fast as the code as written, while preserving all defined behavior. In this case, it's spot-on -- the only way through the example function with defined behavior is to have a non-NULL pointer, in which case the branch comparison is a waste of CPU cycles. For undefined behavior, the compiler has no obligations. All bets are off. You don't get to dereference NULL pointers and then complain that your program didn't work as expected, unless you're working with a compiler that honors obligations above and beyond the C standard.
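To make the shape of the example concrete, here is a hypothetical sketch of the kind of function under discussion (the function name is mine, not from the thread). Because *p is evaluated before the check, a conforming compiler at -O2 may assume p is non-NULL from that point on and delete the branch:

```c
#include <stddef.h>

int first_byte(const char *p)
{
    int c = *p;        /* dereference happens first */
    if (p == NULL)     /* dead under the Standard: the optimizer may delete this */
        return -1;     /* unreachable in any execution with defined behavior */
    return c;
}
```

For any call with defined behavior (a valid, non-NULL pointer), the function behaves identically with or without the check, which is exactly why the compiler is allowed to remove it.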

There are some environments in which you DO want to have some say in what happens next -- which, in my opinion, would be anywhere that dereferencing a NULL pointer is legal, or at the very least, not instantly and reliably fatal. Compiler authors have not forgotten about you. In GCC, for example, you have two options:
  1. Do not use -O2 or -O3
  2. Use -O2 or -O3 in conjunction with -fno-delete-null-pointer-checks, in which case, your null pointer checks will be left unmolested.
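A minimal way to see option 2 in action (file and function names here are illustrative, not from the thread): compile the same function both ways and compare the generated assembly.

```c
/* Compile this file both ways and diff the output:
 *
 *   gcc -O2 -S check.c                                    # NULL test typically gone
 *   gcc -O2 -fno-delete-null-pointer-checks -S check.c    # NULL test retained
 */
int deref_then_check(int *p)
{
    int v = *p;        /* implies p != NULL, as far as the optimizer is concerned */
    if (p == 0)        /* survives only with -fno-delete-null-pointer-checks */
        return -1;
    return v;
}
```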

I know that after code similar to the example was discovered in the Linux kernel, the kernel developers decided to build with -fno-delete-null-pointer-checks. Not sure if that's still true.

A far more egregious example of a compiler exploiting undefined behavior is GCC 1.x, which, when given invalid pragmas, would generate code that attempted to exec nethack, rogue, or the Emacs Towers of Hanoi -- or, failing all of those, just emit a printf making fun of you.

In conclusion... know thy optimizer. It's making decisions about your code that can affect you, and it is configured by default to cover the most common use cases. If your program depends on behaviors that are unusual and not covered by the standard (like being able to dereference a null pointer), then you should review your compiler's documentation and see if you need to tune the optimizer a bit for your use case. But if your standards-compliant compiler is applying a well-documented optimization in a manner that breaks your code, then it's your project that's broken, either for using that optimization, or for relying on undefined behavior.

Comment Re:Null pointer detection at compile time (Score 1) 470

From your favorite FAQ and mine, comp.lang.c:
"undefined: Anything at all can happen; the Standard imposes no requirements. The program may fail to compile, or it may execute incorrectly (either crashing or silently generating incorrect results), or it may fortuitously do exactly what the programmer intended. Note, too, that since the Standard imposes absolutely no requirements on the behavior of a compiler faced with an instance of undefined behavior, the compiler (more importantly, any generated code) can do absolutely anything."

In other words, once you do something whose behavior is undefined, you have a program whose execution is (at least, as far as the C standard on its own is concerned) unpredictable. Given that, the compiler can do almost anything it wants in situations where behavior is undefined. It could, for example, just abruptly terminate the program. That would make Chris's comment spot-on.

Alternatively, he could rewrite the comment as: // P was dereferenced by this point, so it is either non-NULL or the programmer's wishes and expectations no longer apply.
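Placing that rewritten comment into a sketch of such a function (the function name is mine) makes the point explicit:

```c
#include <stddef.h>

int read_val(int *p)
{
    int v = *p;
    /* P was dereferenced by this point, so it is either non-NULL or the
     * programmer's wishes and expectations no longer apply. */
    if (p == NULL)
        return -1;
    return v;
}
```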

So yes, it is a completely legit optimization, in full accordance with the C standard, and if you REALLY want to be able to dereference a NULL and have some expectation about what your program does after that, then you need to choose your compilers and/or optimization settings carefully because the C standard alone is not going to give you what you want.

Comment Re:Null pointer detection at compile time (Score 1) 470

"You're an idiot. I've been an embedded systems programmer for 30 years and he was wrong, which he finally admitted. Since he admitted to it in the post above the one you quoted from I have to assume your reading comprehension skills are on a par with your C programming sk1llz.. Good luck learning C !"

Uh, I think you need to re-read his post a little more carefully. This is getting a little embarrassing for you, and if you've been doing embedded development for 30 years and still don't know how optimizing compilers work, I feel REALLY bad for you. I can see why you're so insecure.
