Comment Re:Errors (Score 2) 536
POSIX signals themselves are a bit of a horror. Like C++ exceptions (as Google correctly points out), they have implications for `other' code, the worst case being code that has not been written to cope with interrupted system calls. Signal dispatch also has portability problems: signals did not anticipate threads, and POSIX was slow and iterative in promulgating a standard solution, so many subtleties have appeared among implementations.
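The "interrupted system call" hazard can be sketched with the classic EINTR retry loop; `safe_read` is just an illustrative name for the wrapper, not a standard function:

```c
#include <errno.h>
#include <unistd.h>

/* Retry a read() that may be interrupted by a signal (EINTR).
   Code written without this loop silently fails when a handler
   fires mid-call -- the "code that wasn't written to cope" case. */
ssize_t safe_read(int fd, void *buf, size_t count) {
    ssize_t n;
    do {
        n = read(fd, buf, count);
    } while (n == -1 && errno == EINTR);
    return n;
}
```
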
However, I think you have the right instinct. I frequently find myself working in explicitly event-driven environments, Node and Tcl for example. There you cannot indulge the illusion of absolute control over the fate of the instruction pointer. Any time you `yield' to the runtime, you re-enter your code at some other point as the runtime dispatches events.
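The yield-and-re-enter shape can be shown even in C with a toy select()-based dispatch step; `run_once` and `on_readable` are made-up names for the sketch, not any runtime's real API:

```c
#include <sys/select.h>
#include <unistd.h>

typedef void (*handler_fn)(int fd);

/* One turn of a toy event loop: the select() call is the "yield".
   When it returns, control re-enters your code through whichever
   callback is ready, not at the line after your last statement. */
void run_once(int fd, handler_fn on_readable) {
    fd_set rfds;
    FD_ZERO(&rfds);
    FD_SET(fd, &rfds);
    if (select(fd + 1, &rfds, NULL, NULL, NULL) > 0 && FD_ISSET(fd, &rfds))
        on_readable(fd);   /* re-entry point, event-style */
}
```
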
Using the event model to cope with errors and exceptions would mean that anything that would traditionally throw an exception or return an error code would instead be a yield point and might generate an error event. You would then provide a handler to receive these events with enough context to cope with the problem.
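A minimal sketch of that errors-as-events idea, assuming a registration/dispatch scheme of my own invention (`err_event`, `set_error_handler`, and `report_error` are all hypothetical names):

```c
#include <stddef.h>

/* An error is delivered as an event carrying context, and a
   registered handler receives it -- no stack unwinding, no
   checking of return codes at every call site. */
struct err_event {
    int         code;   /* errno-style code */
    const char *where;  /* which operation failed */
};

typedef void (*err_handler)(const struct err_event *ev);

static err_handler on_error = NULL;

void set_error_handler(err_handler h) { on_error = h; }

/* Called at what would traditionally be a throw/return-code site. */
void report_error(int code, const char *where) {
    struct err_event ev = { code, where };
    if (on_error)
        on_error(&ev);  /* dispatch the error as an event */
}
```
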
I've come to believe the event-driven model is a far better fit for the actual conditions one assumes when implementing logic. The moment you write main(){...} you are subject to signals handled by a collection of default handlers. One day the system becomes non-trivial and you must 'fix' those handlers. Perhaps you had no business writing main(){...} and adopting a naive, linear model in the first place; you were supposed to implement (the moral equivalent of) a signal handler all along.
Down at the bottom, where CPUs execute machine code, hardware interrupts are endemic; the hardware itself imposes the event model. It may be that most machine/assembly code still written by humans today is simply event handlers: logic servicing hardware interrupts.