Uh, no. An X-ray photon and an infra-red photon have the same speed, c; they differ only in frequency. Neither will escape a black hole, which is pretty much defined as a body whose escape velocity exceeds c.
"That's because C deals with how computers actually think."
No. C presents a highly idealised representation of how computers actually think. According to C, there are two types of data storage: disk and memory. One is accessed through fopen() and friends, the other through malloc() and its kin.
However, on any modern OS, a call to fopen() and read() may well return data that was sitting in memory in the filesystem cache, while a call to malloc() may well hand back a pointer to data that's actually residing on the hard drive in the swap file. Throw the L1 and L2 caches into the mix, and we realise we're writing code against an imaginary, idealised machine. I'd suggest that even by the time we reach C, we're already in the realm of 'nothing at all to do with how the processor works'.
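To make that concrete, here's a trivial sketch of both halves of C's storage model, with comments on what may actually be happening underneath (the file path is arbitrary; any readable file will do):

    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        /* The 'disk' half of C's model: on a modern OS this read may
           be served entirely from the in-memory page cache. */
        FILE *f = fopen("/etc/hosts", "r");
        if (f) {
            char buf[256];
            size_t n = fread(buf, 1, sizeof buf - 1, f);
            buf[n] = '\0';
            fclose(f);
        }

        /* The 'memory' half: these pages may later be pushed out to
           the swap file, so touching them can mean disk I/O. */
        char *p = malloc(1 << 20);
        if (p) {
            p[0] = 'x';  /* may fault a swapped-out page back in */
            free(p);
        }
        return 0;
    }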
So why not keep going, and build newer storage models that don't make an arbitrary division between 'memory' and 'disk', and certainly don't force that division on the end user?
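The OS already halfway supports this: POSIX mmap() lets a program treat a file as ordinary memory, with no fopen()/malloc() split in sight. A minimal sketch (the filename data.bin is just an example):

    #include <fcntl.h>
    #include <string.h>
    #include <sys/mman.h>
    #include <unistd.h>

    int main(void)
    {
        /* Open (or create) a one-page backing file. */
        int fd = open("data.bin", O_RDWR | O_CREAT, 0644);
        if (fd < 0 || ftruncate(fd, 4096) < 0) return 1;

        /* Map it into the address space: from here on, 'memory'
           and 'disk' are the same bytes. */
        char *p = mmap(NULL, 4096, PROT_READ | PROT_WRITE,
                       MAP_SHARED, fd, 0);
        if (p == MAP_FAILED) return 1;

        /* An ordinary store; the OS persists it to the file. */
        strcpy(p, "no fopen, no save button");
        msync(p, 4096, MS_SYNC);

        munmap(p, 4096);
        close(fd);
        return 0;
    }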
Some of the nicest programs I've used recently, such as Adobe Lightroom, have done away with the concept of a 'Save' button entirely: changes the user makes are immediately reflected in the SQLite datastore. The end user, who's not a programmer, doesn't need to maintain a mental model of memory, filesystems and so on.
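I've no idea how Lightroom actually structures this, but the pattern itself is easy to sketch with the SQLite C API; every edit is written through immediately, so there's nothing left to 'Save' (the photos table and its columns here are entirely made up):

    #include <sqlite3.h>
    #include <stdio.h>

    /* Persist each edit as it happens; no explicit 'Save' step.
       (Real code would use a prepared statement with bound
       parameters rather than snprintf.) */
    static void on_user_edit(sqlite3 *db, int photo_id, double exposure)
    {
        char sql[128];
        snprintf(sql, sizeof sql,
                 "UPDATE photos SET exposure = %f WHERE id = %d;",
                 exposure, photo_id);
        sqlite3_exec(db, sql, NULL, NULL, NULL);
    }

    int main(void)
    {
        sqlite3 *db;
        if (sqlite3_open("catalog.db", &db) != SQLITE_OK) return 1;
        sqlite3_exec(db,
            "CREATE TABLE IF NOT EXISTS photos"
            "  (id INTEGER PRIMARY KEY, exposure REAL);"
            "INSERT OR IGNORE INTO photos VALUES (1, 0.0);",
            NULL, NULL, NULL);

        on_user_edit(db, 1, 0.5);  /* on disk by the time this returns */
        sqlite3_close(db);
        return 0;
    }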
Now, after several years of anticipation, he's finally released arc, his Lisp-like language, onto the world. Designed to be a '100 year language', will it be a breakthrough in language design, or an interesting toy that fades quickly into obscurity?