This demonstrates the importance of not being seen.
Why stop there? The days of the week are named after the gods associated with the seven classical celestial bodies (only Saturday wasn't converted from a Roman deity to a Germanic one, as there was no equivalent of Saturn in the Norse pantheon). And the first six months of the calendar are all named after Roman gods or religious festivals, so we should rename those as well. Even how we divide time into hours, minutes, and seconds is associated with Babylonian numerology and astrology, which had deep religious significance.
If we are going to start changing names to remove religious connotations, we have a lot of work to do. Or is it just the Christian origin of the name you object to, and not religion in general?
It's been doing okay. NVIDIA is making money, but it is only up 4.5% over the last year. Compare this to 6.6% for Intel, 13% for the Dow Jones, and 16% for the S&P 500. It's only doing well when compared with smaller chipmakers like ARM (up only 4.2% in the last year), Qualcomm (down over 12% in the last year), and AMD (which has lost over 26% in the last year).
I haven't read Linus's rant against C++ for a while, but he is correct that C++ isn't a good choice for an OS kernel. The only major kernel written in C++ that I know of is Windows NT, and it uses only a subset of C++ language features. In particular, it disables exceptions and RTTI, removes new/delete, and doesn't use the standard library. Microsoft wrote their C++ compiler with this in mind, and there is a compiler flag to disable kernel-unfriendly features (documented here). For everyone else, it's easier to just say that the C++ subset for kernel development is C (minus the standard library).
For non-kernel use, C++ is superior to C in the hands of an expert programmer, but mediocre programmers who don't understand the language tend to write absolutely horrible code. And you can't take an expert C or Java programmer and expect them to write expert C++ code with just a few weeks' practice. C++ is one of those languages that you have to dedicate a lot of time to, but it can be worth it if you require highly optimized code, have low latency requirements, or have tight space requirements (areas where higher level languages like Java don't do well).
The Soviet Union adopted the AK-74 in 1974, and most Eastern European nations and former Soviet Republics use it today. It has many advantages over the old AK-47, including better accuracy, penetration, and muzzle velocity. With proper maintenance it is as reliable as the AK-47, and it costs less than other modern assault rifles.
The only nation-states I know of that still use the old AK-47 are in the Middle East, Africa, and South and Southeast Asia (including, I think, India). The big advantage of the AK-47 is that it is cheap enough to hand out like candy to guerrilla fighters, and it's reliable enough to still work after years of little to no maintenance (though its effectiveness drops quite a lot when doing so).
The Pennsylvania bill did not become law, but the UK and Vermont ones did (according to the article, which is flagged as needing more than three references).
I had never heard of the Red Flag traffic laws before. You learn something new every day. I can see why they would be enacted, and why they were thought to be practical (early self-propelled vehicles weren't much faster than a pedestrian; they were only practical for bulk transport). I suspect that many disruptive technologies have crazy laws and regulations before they become mainstream.
The problem with D isn't the language, which is excellent. Unfortunately, superior languages lose out to inferior ones all the time. (Yes, I'm aware that superior and inferior are subjective terms.)
Language quality is nice, but there are several factors that are more important when it comes to market success. These factors include: third party tools (compilers, debuggers, IDEs, profilers, etc.), third party libraries (both quantity and quality are important here), momentum (C++ and Java are pretty well entrenched, and it will take a lot more than being a better language to significantly displace them), and finally there is the coolness factor. Coolness relies on many things, but the one that I think is most important is having a charismatic creator or evangelist.
Now D is making significant improvements in each of these areas, so I expect it to continue to grow in market share. In particular, LLVM support and having Andrei Alexandrescu as an evangelist are pretty huge. It still has a ways to go before it can catch up to C++, however.
Many definitions of Object Orientation describe methods as functions associated with data (or with an object's state) and define them formally in terms of message passing. In fact, Alan Kay (creator of Smalltalk and, I believe, the originator of the term Object Oriented) includes objects sending and receiving messages as part of his definition of Object Oriented. Of course, Alan Kay also stated that every object is an instance of a class, so perhaps his definition isn't quite correct today.
Multiple dispatch often relies on inheritance, which is necessary but insufficient for Object Orientation. However, it isn't clear how to map a function call into a message for an object. Message passing is too useful a model of computation to throw away. Perhaps you can create a definition of Object Oriented that doesn't use it, but I haven't seen one.
This isn't 100% true. I recently flew on a United flight from Atlanta to Dubai, and the nifty map showing the flight path displayed us flying right through the middle of Iraq. Perhaps it used to be true that we wouldn't fly over Iraq, but we do now. I suspect that there are still altitude restrictions, but I'm just guessing.
Computer Science is a pretty broad area of study, but I consider these three problems to be the most fundamental.
Computability: What can a computer do and what can it not do? There are uncountably many problems that a computer cannot solve (undecidable), but only countably many that it can (decidable). Fortunately, most of the interesting problems are decidable.
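The asymmetry rests on a short counting argument: programs are finite strings over a finite alphabet, while decision problems correspond to arbitrary subsets of the naturals. Sketched out:

```latex
\text{Programs are finite strings: } |\Sigma^*| = \aleph_0 \quad \text{(countable)} \\
\text{Decision problems are subsets of } \mathbb{N}\text{: } |\mathcal{P}(\mathbb{N})| = 2^{\aleph_0} \quad \text{(uncountable)} \\
2^{\aleph_0} > \aleph_0 \implies \text{most problems have no program that decides them.}
```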
Complexity: If a computer can do something, how efficiently can it be done? This goes beyond the Big O you are taught as an undergraduate and considers complexity classes such as P, NP, PSPACE, EXPTIME, and so on. It also considers not only computation time but space (unfortunately, few undergraduates are introduced to the space constraints of algorithms; a great interview question is to name a sorting algorithm that takes constant space).
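The classic answer to that interview question is heapsort: O(n log n) time with O(1) extra space, since the heap lives inside the array being sorted. A minimal sketch:

```python
# In-place heapsort: the max-heap is built inside the input list itself,
# so beyond a few index variables no extra space is used.
def heapsort(a):
    def sift_down(start, end):
        # Push the element at `start` down until the heap property holds.
        root = start
        while 2 * root + 1 <= end:
            child = 2 * root + 1
            if child + 1 <= end and a[child] < a[child + 1]:
                child += 1  # pick the larger child
            if a[root] < a[child]:
                a[root], a[child] = a[child], a[root]
                root = child
            else:
                return

    # Build a max-heap bottom-up, then repeatedly swap the max to the end.
    for start in range((len(a) - 2) // 2, -1, -1):
        sift_down(start, len(a) - 1)
    for end in range(len(a) - 1, 0, -1):
        a[0], a[end] = a[end], a[0]
        sift_down(0, end - 1)
    return a

print(heapsort([5, 2, 9, 1, 5, 6]))  # [1, 2, 5, 5, 6, 9]
```

(Selection sort also works as an answer, but heapsort keeps the O(n log n) bound.)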
Equivalence: Given two algorithms, do they perform the same computation? Meaning that given the same inputs, will they always produce the same outputs (and yes, side effects are also outputs)? A less strict question (but one of more practical importance) is whether or not a program meets a specification.
Computability and complexity are both important parts of the theory of computation, which is usually built on top of Language Theory, which is itself built on top of Set Theory. The hardest problem in modern mathematics may be P = NP, which is also a Computer Science problem. The third problem requires creating mathematical proofs using Formal Logic. It is also an excellent example of an undecidable problem, meaning that there is no general algorithm that can perform it for every program (in other words, it's something that a computer cannot do).
In addition to Set Theory and Formal Logic, Computer Science relies heavily on Boolean Algebra, Graph Theory, and other areas of Discrete Mathematics. Computer Science is inherently cross-disciplinary, but at its core it is closer to Mathematics than it is to Engineering or Science.
80 years ago the Lingua Franca for diplomacy was French. In fact, French dominated diplomacy from the 17th century until WW2. English didn't start getting used in non-English diplomatic circles until after WW1 (it was quite significant when the Treaty of Versailles was written in both English and French). French has been eclipsed by English, but it is still popular (it is the second most used language in the UN and the EU).
For science and technology, Latin used to dominate. Once people stopped publishing in Latin, three dominant languages appeared: English, French, and German. Which was dominant depended on the field being discussed. Before WW1, German may have been the largest of the three, but after WW1, English was noticeably more dominant (and has only continued to grow).
For business, the general rule is that whenever possible the seller speaks the buyer's language. 80 years ago, there were several useful intermediate languages that could be used to facilitate business. The most common would be English, French, and Arabic. I don't know that German was used much outside of Europe and the few German colonies. French was probably the smallest here, since outside of Europe it was mostly spoken in Africa, where it had to compete with Arabic as a language of trade. There are plenty of other languages which are influential at a regional level, such as Chinese, Russian, Spanish, and Swahili, but these haven't had much of an impact globally. Due to its size and economic might, I expect that Chinese will become more influential in the future, and it will slowly become more significant outside of Asia. I don't see Spanish moving outside of Europe and the Americas, at least not in the short term.
You fail at reading comprehension. There is nothing in the summary that says those drives aren't possible, just that they have increased heat and (therefore) an increased rate of failure. This is one reason why average hard drive speeds haven't improved much in the last 15 years.
I once bought six Western Digital 10000 RPM drives for a RAID setup. Three of them failed within the first year. Two failed the next year (including one of the warranty replacements). I replaced them all with six of their 7200 RPM Red drives and other than one DOA that needed to be replaced, I haven't had a failure in almost two years. Sure, anecdotal evidence is purely anecdotal, but it backs up the summary. (Still pissed off about the DOA though.)
In the same time frame I have owned five SSDs, including an Intel SLC that is almost six years old (works great as a small root partition on my Linux box) and I have had only one failure (a 250GB OCZ Vertex Pro). And unlike the hard drives, the data was recoverable from the dead SSD (I could mount it read only, but it wouldn't mount writable). Other than the failed SSD and my laptop drive (no idea who made it), all of my SSDs have been Samsung or Intel drives, and I highly recommend both manufacturers.
I don't know about journald, but on Solaris the binary logging works using digital signatures. Each log message (together with the previous log message's signature) is signed to ensure that the log message hasn't been tampered with and that log messages haven't been removed. In the event of tampering, the log messages can still be read, but are flagged as untrustworthy. I understand that administrators prefer text logs (which is why our Solaris systems also logged to syslog), but for security auditors digitally signed binary logs are a godsend.
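The chaining idea can be sketched in a few lines, with HMAC-SHA256 standing in for the real digital signatures (a toy sketch only: the key, the "genesis" seed, and the function names are made up, and Solaris's actual format is different):

```python
# Toy hash-chained log: each tag covers the message AND the previous
# tag, so tampering with or deleting an entry breaks the chain there.
import hashlib
import hmac

KEY = b"audit-key"  # hypothetical secret held by the logging daemon

def sign_log(messages):
    """Return (message, tag) pairs forming a signature chain."""
    tag = b"genesis"
    signed = []
    for msg in messages:
        tag = hmac.new(KEY, tag + msg.encode(), hashlib.sha256).digest()
        signed.append((msg, tag))
    return signed

def verify_log(signed):
    """Re-derive each tag; a False marks an untrustworthy entry,
    while later entries can still verify against their recorded tags."""
    tag = b"genesis"
    results = []
    for msg, recorded in signed:
        expected = hmac.new(KEY, tag + msg.encode(), hashlib.sha256).digest()
        results.append(hmac.compare_digest(expected, recorded))
        tag = recorded  # continue the chain from what was recorded
    return results

log = sign_log(["login root", "rm -rf /tmp/x", "logout"])
print(verify_log(log))            # [True, True, True]
log[1] = ("rm -rf /", log[1][1])  # tamper with one message
print(verify_log(log))            # [True, False, True]
```

Note how the tampered entry is flagged but the rest remain readable and verifiable, which matches the "flagged as untrustworthy" behavior described above; deleting an entry outright would likewise break verification at that point in the chain.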
There was never any question about whether Iraq had chemical weapons. After all, Saddam used them against Iran and his own people. The question has always been, "where are they now?"
The possible answers are that he still had them somewhere, that he gave them away, that he destroyed them, or that he had run out. Each of these answers presents problems. If he still had them, then where were they and who might still have access to them? If he gave them away, who did he give them to and why? If he destroyed them, why not let the West verify this and stop the sanctions (and also prevent an invasion)? If he used them all up, why didn't he make more? Saddam's actions suggest that he had something to hide, or that he wanted people to think that he had something to hide (I always liked the idea that he wanted Iran to believe he had them, but wanted to plant doubt in the US, and he couldn't pull off that balancing act).
I don't know if I believe the article, but it would be nice to have a conclusive answer one way or another.
This was very common. Germans emigrated in large numbers in the late 19th century, but you wouldn't know it today. In response to public outrage at unrestricted submarine warfare, many German immigrants Anglicized their names, turning Schmidt into Smith, Wilhelm into Williams, and so on. Anglicization also happened in England, with the most notable case being the renaming of the House of Saxe-Coburg and Gotha to Windsor (yes, the English royal family were Germans with blatantly German names).