ALPS keyswitches work well for gaming and they're much less noisy than buckling springs. You can find used Dell AT101Ws all over the place.
Ever notice that NAND flash prices per megabyte have plummeted while EEPROM prices per kilobyte have remained high and then wondered how that could be rational?
Kilobyte-quantities of EEPROM are cheap enough that the package is probably a non-trivial part of the cost. The die cost is not the only lower bound on the price of a memory IC.
"Electrolytic capacitors leak, electrodes corrode"
none of those are present in a Kindle.
Those were generic examples. A better one for a mobile device might have been that the contacts on the charging port wear out. Based on the rest of your post I'm skeptical that you know anything about the components used to make Kindles, but I didn't design it so I won't speculate further.
I apologize for the LCD/E-Ink confusion. But it doesn't make a difference, because neither of them is designed to last for 100 years. According to the company, they expect that "over 90% of E Ink displays will last more than 10 years with typical usage", where "typical usage" is defined as room temperature. Kindles are rated for operation between 0 - 35 C (32 - 95 F), and discussion on Amazon suggests that this is a real limitation. That range is similar to what LCDs can handle. I suspect that both are limited by a chemical breakdown process (which would happen exponentially faster at higher temperatures) but I don't know enough about displays to say for sure.
you cant get more "temperature extreme" than what [Voyager 1] experiences. and it has "electronics" in it.
If you think that a space probe is in any way comparable to a consumer e-reader, then I'm afraid you don't understand anything at all about electronics or engineering. Had you kept reading on Wikipedia, you might have found stuff like this:
The Flight Data Subsystem (FDS) and a single eight-track digital tape recorder (DTR) provide the data handling functions.
The digital control electronics of the Voyagers were based on RCA CD4000 radiation-hardened, silicon-on-sapphire (SOS) custom-made integrated circuit chips, combined with standard transistor-transistor logic (TTL) integrated circuits.
Electrical power is supplied by three MHW-RTG radioisotope thermoelectric generators (RTGs). They are powered by plutonium-238
Voyager 1 was designed from the ground up for reliability in a hostile environment. I can't find a price for the hardware itself, but you can bet it was more than $100, probably by several zeros. And even so, it's looking like the power supply will fail before it turns 100. A Kindle is nowhere near that level of reliability. It doesn't need it and nobody wants to pay for it.
Many thanks to you and blueg3 for providing a better summary. These terrible analogies for quantum mechanics are always more confusing than helpful.
If cared for a kindle DX can last 100 years.
On what do you base that statement? Consumer-grade electronic devices are not designed to last that long. Electrolytic capacitors leak, electrodes corrode, copper in IC traces migrates and shorts out, batteries wear out. I don't know about LCDs, but I'm sure they have long-term failure modes too, especially if they're exposed to sunlight. The whole device will be exposed to temperature extremes due to the lack of air conditioning in a survival situation. I'll buy 20-30 years. But 100? No way.
Power is usually described as going in or out.
So which manufacturers have given you the best and worst results?
Yeah, but the poll is about the ALA's Banned Books Week. You would expect totalitarian regimes to ban books; the point of Banned Books Week is that people try to do it in America too.
The American Library Association maintains lists of the most frequently challenged books (i.e. the ones people try to ban). Although 1984 shows up on the list of challenged classics, there is only one challenge listed -- someone in Jackson County, Florida in 1981 thought that it was "pro-communist and contained explicit sexual matter". The first part shows a massive failure of reading comprehension, not actual hostility towards the content. 1984 doesn't show up in the top 100 challenged books lists for 1990-1999 or 2000-2009.
1984 is definitely worth reading, and since its story features a banned book its presence in the poll makes a certain amount of sense. But it's not a great example of a challenged book, and the presence of several other dystopian works makes me wonder what the poll writer was thinking. Book banning in the U.S. is not a top-down government-led project to turn people into sheep for a New World Order. It's a bottom-up process where private citizens (mainly parents) try to "protect" children and teenagers from what they see as objectionable content.
As can be seen from the top 100 and top 10 by year lists, sexual content in books targeted at teenagers is the biggest concern. Most of the challenges are to fictional books, but a few non-fiction sex ed books make the list. The 2011 list even has a book for kids about what happens when their mom gets pregnant! Aside from sex, it seems like drug and alcohol use, offensive language (particularly racial slurs), and "religious viewpoint" (probably criticism of religion/Christianity) are popular reasons for challenging books.
The Handmaid's Tale prominently features all of those subjects, and is an excellent book as well. The first-person narrative really drives home the crushing horror of the setting. If you're looking for some dystopian fiction to read, I highly recommend it.
C's bit fields are a really helpful feature in embedded programming. Unfortunately their implementation is strongly tied to the target CPU architecture. In particular, the endianness of the fields cannot be redefined in code, and bit fields are usually not allowed to cross word boundaries.
1) Never trust a civilian that says "these weapons you want are not very effective or what you need". He is not trained or capable to make that argument.
Why not? Is it impossible for civilians to study military matters? Are you saying that all military historians are quacks? You realize that weapons development and production is done by civilians, right? And that our military and its funding are under civilian control, which is also the case in China and Russia? Are you aware that people (even in the military) always want things that make their own job easier, regardless of the overall cost? And that always giving them what they want is bad management?
You can boil down his argument to what I originally said -"these weapons are good at killing people"
No, you can't. His argument is that hypersonic missiles will not give us enough of an *advantage* in killing people (or destroying equipment, which is arguably a more important use) to justify the cost. Absolute destructive power is meaningless on its own. A weapon only gains value in the context of specific opponents, strategies, and doctrines. In the context of mutually assured destruction, a hypersonic missile is useless.
You're trying to paint Gubrud as some sort of naive hippie who doesn't believe in war, and that's simply not supported by the article at all.
I've not RTFM'd because I try not to let bulletinshit touch my eyeballs, but hypersonic technology certainly has civilian uses.
If you had RTFM'd, you would have noticed that the author specifically mentions civilian uses (including space launches) and proposes limits that would exempt them.
The argument is at heart "Don't develop these weapons because they will be good at killing people and I personally am not smart enough to come up with a civilan use that doesn't kill people".
No, that is not at all a good summary of the article. In particular, Gubrud (the author) addresses civilian hypersonic technology and makes a specific proposal that would allow for it:
It’s not often that one can say an entire technology should be banned because it has no conceivable good use. Hypersonic missiles, however, may present just such a case. Hypersonic air travel seems economically unjustifiable in an era of climate change, high-cost energy, and low-cost video telepresence. But if civilian hypersonic air travel ever did become a reality, it would take the form of a large airplane, not a small missile. Low-cost satellite launches? Hypersonic space planes such as DARPA’s planned XS-1, which would lift rockets to high altitude, might make some sense. But again, to achieve economies of scale, such hypersonic boosters would need to be large, which the hypersonic missiles under military development are not.
To speed its approval, any such moratorium would have to define hypersonic missiles in a way that does not require elimination of already existing cruise missile systems. I would propose a ban on flights of any aerodynamic vehicle of less than, say, 15 meters length or 2 meters diameter, traveling in powered or unpowered flight at speeds in excess of 1 kilometer per second, over a horizontal distance greater than 100 kilometers. Space and ballistic missile launches and reentries could be specifically excepted. The numbers are somewhat arbitrary and could be fine-tuned or adjusted substantially while preserving the intent of the agreement.
Saying that Gubrud wants to ban hypersonic weapons because they're "good at killing people" is a gross mischaracterization. His actual arguments are:
1. The main application of hypersonic missiles is supposedly to attack short-term high-value terrorist targets. But we can already do this successfully by attacking from nearby bases.
2. Another claimed application is to attack key strategic targets within the borders of major military powers.
a. It could be hard to distinguish between a conventional and a nuclear strike. Advocates say this won't be a problem because hypersonic missiles have a distinct attack profile versus, say, conventionally-armed ICBMs. But in practice, there's nothing that prevents a nuclear warhead from going on a hypersonic missile, and nothing that prevents a conventional missile from attacking nuclear targets as part of a nuclear first strike.
b. The idea that a conventional attack on the homeland of a nuclear power won't result in a nuclear counter-attack is questionable to begin with.
c. The existence of very fast attacks makes for hair-trigger standoffs that require rapid (and thus error-prone) decision making. Naval standoffs in particular are mentioned as a risk.
3. Developing hypersonic missiles will force other nations to do so as well.
a. Arms races are extremely expensive.
b. Any military advantage will erode very quickly -- in a few years at best.
To these I would add that 2c) implies that you have to maintain a counterstrike capability, which is a large ongoing defensive cost. We saw this in the nuclear arms race, which led to us having to keep bombers in the air 24/7, ICBM silos on standby, and missile submarines constantly hidden near their targets. Other commenters have also suggested that hypersonic missiles would make great anti-aircraft carrier weapons. Giving other countries more incentive to develop such weapons is not in our interest.
Finally, unlike your ridiculous straw man, Gubrud is quite practical about geopolitical realities:
Hypersonic missiles are a new class of weapon that no country actually needs. Their military advantages are ill-defined, and their capacity to destabilize relations among major powers and contribute to a costly and dangerous strategic arms race is enormous. Even so, the United States can’t expect that just because it proposes a test ban, other nations will line up to renounce hypersonic missiles. What America can reasonably hope is that other nations will see their shared interest in avoiding or slowing a dangerous escalation of the arms race. I am therefore proposing that the United States suspend testing for a while, to show good faith as it seeks agreement on an international ban on hypersonic testing.
If the suspension does not draw a positive response from other countries, the United States can always resume its programs, while still advocating a general moratorium, thereby seizing the moral high ground. That others might not join America there immediately is a poor excuse for not proposing a hypersonic testing ban and calling the others to join. Indeed, if the United States is unwilling to forgo hypersonic testing for a time, others have every reason to be cynical about its real motives and intentions. I’m not sure that I understand those motivations. But I am reasonably certain that hypersonic missiles will not help to make America stronger or more secure, because it is clear, from their programs already in progress, that other nations will not allow the United States to claim a monopoly on hypersonic weaponry.
And then we resume development. Meanwhile, China loses credibility, which makes it harder for them to create treaties and agreements that they actually care about. Nations can't simply flout every treaty they sign without consequences. At the very least, they have to be selective.
But then I wondered -- what actually was the motivation for this all out Open Source SoC?
There have been a few projects like this posted to Slashdot over the years. For some people, it's like climbing Mount Everest -- "because it's there". Some people want to extend the open hardware community down into chip design, possibly encouraging new start-up companies. (lowRISC seems to be in this category.) And some people think the semiconductor industry is a stagnant patent-choked wasteland in need of a Linux-style revolution. (These people are idiots, and do not know anything about hardware manufacturing or the semiconductor industry.)
The big thing I don't understand is why they all want to make chips with high-performance CPUs with tons of modern peripherals. Okay, I do understand -- they want to run Linux on their product. But what's wrong with making an AVR clone? Surely it would be much easier and much cheaper to make an 8- or 16-bit CPU with a few low-end comm peripherals on an older process? Is making a fully open hardware Arduino somehow less of an accomplishment?
Regardless, none of these projects have succeeded yet, because making an SoC is much harder than making software. From easiest to hardest, the main obstacles to making an open IC product are: development skills, design tools, prototyping and manufacturing costs, testing, and logistics. I'm not an expert on the full process, but I can try to give an overview:
* Development skills: Designing high-quality digital integrated circuits, even with an HDL, is not trivial. You need people with EE or CE training, not just basic programming skills. Hardware has non-ideal behavior that must be accounted for in the design. It's also not cheap, so the design needs to (mostly) work the first time. This means you need real expertise, not just random volunteers. lowRISC has some experienced people (although not in IC design?) running it, and they're hiring a couple of EE Ph.D.s right now.
* Design tools: You can edit an HDL with a text editor, but physical design and simulation require nasty, expensive, proprietary software packages from companies like Cadence or Mentor Graphics. These are not cheap, so you won't be running a copy at home. Your work will happen at an organization with money, like a university or corporation. lowRISC is a nonprofit associated with the University of Cambridge, so they can probably negotiate lower rates.
* Prototyping and manufacturing costs: You can prototype an IC design on an FPGA, but large FPGAs are pretty expensive. Again, you'll need money for this, thousands of dollars per FPGA board at least. At some point, you'll want to make real hardware. You might be able to get a few prototype units for tens of thousands of dollars, but for real manufacturing in a modern process you'll need a proper mask set. This will probably be on the order of $500k. Small revisions (metal-only) will cost perhaps a tenth as much. If you need to move transistors around, you'll have to go through the physical design process and pay the full ~$500k again. An important side effect of this is that most design bugs will be fixed in the physical layout, not the HDL. This requires expertise, and you'll need to design for the possibility in advance. After all that, you'll have to spend thousands of dollars per wafer for manufacturing, plus more for testing (see below). lowRISC's nonprofit/academic status will help reduce these costs, and obviously they're getting funding from Cambridge and maybe their founders.
* Testing: Even if your design is perfect (and it never is), you will need to test and qualify the hardware before you can sell it. This is where open IC projects fall flat, because nobody even talks about testing. You'll need to include test features like ATPG in your design, probably using even more expensive design tools. You'll also need to write test cases for functional verification, and generate test patterns for your automated test equipment (ATE). Once you've made the tests, you'll need equipment and personnel to run them. This costs more money. For qualification (making sure the hardware doesn't break the moment it gets out the door), you'll need to make and test (at least) thousands of units, preferably from multiple wafers with process variations. You'll also need to work on test time reduction and yield improvement once you reach production. lowRISC does not mention testing at all. They do say they plan to go from test chip to production silicon in one year, and that they're expecting to "yield around 100-200k good chips per batch [of ~25 wafers]". So maybe they have a plan, or maybe they're paying someone else to handle all the DFT design and testing aspects. (I'm not sure I'd call that "open hardware", though.) Since this is a nonprofit/academic project, their early customers may not expect any real quality assurance.
* Logistics: Once you're shipping units in production, you need to do all the boring (yet vital) business stuff like managing your cash flow, making sure orders ship on time, handling customer complaints, adjusting your price over time, making sure you don't break any laws, etc. This requires a full-time staff, at which point you're not really a community project anymore. lowRISC is already an organization with a staff, so this shouldn't be a huge problem for them.
They have very little information posted so far (not even a feature list), but lowRISC seems to have a few things going for them. They're not begging for money. They can get academic discounts. They're aware that they need serious expertise. And they're only aiming for the moon instead of the Andromeda Galaxy. I'm not holding my breath, but they might have a real chance. I wish them the best of luck.