Sorry, but ALGOL is just as awful as Pascal for an engineer. It's a freaking language developed by academics for developers. C, on the other hand, was a language by developers, for developers. Obviously academics chose what was best for them, which is why Pascal still survives...
I'm in total agreement. I was at HP when management decreed that we would have to use Pascal for future projects. They were convinced by academics that this would make the code "portable". It just led to a few nightmare years of trying to develop in a severely crippled language. It was about that time I learned C in my spare time. C is hugely powerful while being elegantly simple. The most apt criticism of it is that it gives you enough rope to hang yourself if you are sloppy. But without that rope, lots of things are much harder to do. I've been programming for 45 years now and have never seen a better overall language than C. Ritchie was a giant in this industry. And, BTW, bashing Unix is just kind of laughable.
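Since I brought up the rope: here's a minimal sketch of what I mean. The buffer size and string are made up for illustration, but the pattern is the classic one: the compiler will happily accept a copy that overruns a stack buffer, and it's on you to write the bounds check.

    /* The "rope": C compiles an out-of-bounds write without complaint. */
    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        char buf[8];

        /* This compiles cleanly but writes far past the end of buf,
           which is classic undefined behavior. (Left commented out so
           the program itself is well-defined.) */
        /* strcpy(buf, "way too long for eight bytes"); */

        /* The careful version: bound the copy, then terminate
           explicitly, since strncpy won't if the source is too long. */
        strncpy(buf, "way too long for eight bytes", sizeof buf - 1);
        buf[sizeof buf - 1] = '\0';

        printf("%s\n", buf);
        return 0;
    }

That unchecked strcpy is the power and the danger in one line: nothing stands between you and memory, unless you put it there yourself.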
When someone is injured by a self-driving car, who is liable?
This will be determined in court, like every other kind of liability. Attorneys will go after the deepest pockets and see how far they get. Eventually precedent will clarify liability.
Beginners should not start by creating GUIs in the first place. (Early BASICs didn't support such a thing, either.)
I totally agree! I've been programming for almost half a century now and I've never seen languages more worthless than these things that insist on starting with programming the GUI (e.g., Alice). The key to teaching, or learning, programming at the beginning is striking a balance between conveying core concepts and keeping the newbie's interest (especially if that newbie is yourself). Starting with a lecture on machine language and memory layout (I actually took a class like that) is a sure way to put people to sleep. But if you start with putting buttons on the screen you have skipped over a world of core concepts and turned programming into magic where you just memorize incantations.
I think what Apple has done with Obj C is a travesty and makes it very difficult to learn. They have GUI-ized it so thoroughly that it's really hard to tell what's going on under the covers. I was able to teach myself C from Borland's Turbo C, and I learned C++ pretty easily on my own. But Obj C has been hard and that's entirely because all the books insist on teaching it with this GUI Xcode stuff from Apple that keeps changing with each release so the books are always a little out of date. I'm convinced that C++ was easy only because I learned it before MS adopted it and did the same thing to it that Apple did to Obj C.
I've gotten lost someplace. Does this mean that a free-to-use picture-posting service is claiming equal copyright ownership of pictures that are freely uploaded, and then provides the pictures as proprietary content to proprietary content providers so said proprietary content providers can add the pictures to their proprietary content in order to profit, without the freely usable posting service giving equal profits to the originator of the originally free content?
No. This sounds like the same kind of thing online companies have been doing just about forever to protect their sites from unscrupulous people who download in bulk and repost, or who take content from a site for republication for other purposes. They want to be able to protect their users by using copyright to go after people who misuse the posted images. That means it probably IS for the protection of their users. Otherwise every Twitter user would have to police their own content. It's in Twitter's best interest to make people feel safe about uploading content.
So, tell me, how much of Feyerabend's philosophy of science have you studied?
None. But if he comes out with woo-woo shit like equating science to voodoo, that's already too much.
Be careful about judging a philosopher's whole body of work, and even the whole of the field of philosophy, based on a single post on Slashdot.
That said, it does appear that Feyerabend enjoyed being a bit outrageous. But philosophy is not science. It's not about making "progress", but about following different lines of thought to see where they lead. Sometimes thinking about what we are doing in a new way is useful, even if it isn't the only way to look at things. IMHO, it's worth learning about multiple philosophical schools of thought as long as you don't get caught up in just one and take it too seriously.
Feyerabend was probably talking about Truth in the philosophical sense, which I think is a poorly defined term that assumes there is one best way to view all of reality, which may not be the case. Thankfully, scientists just keep working on better ways to explain what we see and theories that are more predictive of what we haven't seen yet.
Here's a prediction for you - there will be an Apple stylus tablet within 3 years. Until about 6 months before launch, it will continue to be the dumbest idea ever. Then, Steve will proclaim it to be brilliant.
The iPad supports a stylus right now. It just doesn't require one.
A stylus is very handy for some things, like taking notes in a class or meeting, making impromptu sketches, etc.
BTW, I'm still waiting for apps on the new devices that will allow me to ditch my old Palm TX (hopefully, before it dies of old age). I love the new smart phones and tablets for all the new cool stuff they can do, but none of them do the old stuff as well as the Palm devices.
Ever notice that even the busiest people are never too busy to tell you just how busy they are?