
Comment: Re:Where are those chips baked? (Score 1) 47

by AdamHaun (#47685143) Attached to: Project Aims To Build a Fully Open SoC and Dev Board

But then I wondered -- what actually was the motivation for this all out Open Source SoC?

There have been a few projects like this posted to Slashdot over the years. For some people, it's like climbing Mount Everest -- "because it's there". Some people want to extend the open hardware community down into chip design, possibly encouraging new start-up companies. (lowRISC seems to be in this category.) And some people think the semiconductor industry is a stagnant patent-choked wasteland in need of a Linux-style revolution. (These people are idiots, and do not know anything about hardware manufacturing or the semiconductor industry.)

The big thing I don't understand is why they all want to make chips with high-performance CPUs with tons of modern peripherals. Okay, I do understand -- they want to run Linux on their product. But what's wrong with making an AVR clone? Surely it would be much easier and much cheaper to make an 8- or 16-bit CPU with a few low-end comm peripherals on an older process? Is making a fully open hardware Arduino somehow less of an accomplishment?

Regardless, none of these projects have succeeded yet, because making an SoC is much harder than making software. From easiest to hardest, the main obstacles to making an open IC product are: development skills, design tools, prototyping and manufacturing costs, testing, and logistics. I'm not an expert on the full process, but I can try to give an overview:

* Development skills: Designing high-quality digital integrated circuits, even with an HDL, is not trivial. You need people with EE or CE training, not just basic programming skills. Hardware has non-ideal behavior that must be accounted for in the design. It's also not cheap, so the design needs to (mostly) work the first time. This means you need real expertise, not just random volunteers. lowRISC has some experienced people (although not in IC design?) running it, and they're hiring a couple of EE Ph.D.s right now.

* Design tools: You can edit an HDL with a text editor, but physical design and simulation require nasty, expensive, proprietary software packages from companies like Cadence or Mentor Graphics. These are not cheap, so you won't be running a copy at home. Your work will happen at an organization with money, like a university or corporation. lowRISC is a nonprofit associated with the University of Cambridge, so they can probably negotiate lower rates.

* Prototyping and manufacturing costs: You can prototype an IC design on an FPGA, but large FPGAs are pretty expensive. Again, you'll need money for this, thousands of dollars per FPGA board at least. At some point, you'll want to make real hardware. You might be able to get a few prototype units for tens of thousands of dollars, but for real manufacturing in a modern process you'll need a proper mask set. This will probably be on the order of $500k. Small revisions (metal-only) will cost perhaps a tenth as much. If you need to move transistors around, you'll have to go through the physical design process and pay the full ~$500k again. An important side effect of this is that most design bugs will be fixed in the physical layout, not the HDL. This requires expertise, and you'll need to design for the possibility in advance. After all that, you'll have to spend thousands of dollars per wafer for manufacturing, plus more for testing (see below). lowRISC's nonprofit/academic status will help reduce these costs, and obviously they're getting funding from Cambridge and maybe their founders.

* Testing: Even if your design is perfect (and it never is), you will need to test and qualify the hardware before you can sell it. This is where open IC projects fall flat, because nobody even talks about testing. You'll need to include test features like ATPG in your design, probably using even more expensive design tools. You'll also need to write test cases for functional verification, and generate test patterns for your automated test equipment (ATE). Once you've made the tests, you'll need equipment and personnel to run them. This costs more money. For qualification (making sure the hardware doesn't break the moment it gets out the door), you'll need to make and test (at least) thousands of units, preferably from multiple wafers with process variations. You'll also need to work on test time reduction and yield improvement once you reach production. lowRISC does not mention testing at all. They do say they plan to go from test chip to production silicon in one year, and that they're expecting to "yield around 100-200k good chips per batch [of ~25 wafers]". So maybe they have a plan, or maybe they're paying someone else to handle all the DFT design and testing aspects. (I'm not sure I'd call that "open hardware", though.) Since this is a nonprofit/academic project, their early customers may not expect any real quality assurance.

* Logistics: Once you're shipping units in production, you need to do all the boring (yet vital) business stuff like managing your cash flow, making sure orders ship on time, handling customer complaints, adjusting your price over time, making sure you don't break any laws, etc. This requires a full-time staff, at which point you're not really a community project anymore. lowRISC is already an organization with a staff, so this shouldn't be a huge problem for them.

They have very little information posted so far (not even a feature list), but lowRISC seems to have a few things going for them. They're not begging for money. They can get academic discounts. They're aware that they need serious expertise. And they're only aiming for the moon instead of the Andromeda Galaxy. I'm not holding my breath, but they might have a real chance. I wish them the best of luck.

Comment: Re:Don't allow jpg or gif or ... (Score 3, Insightful) 299

by AdamHaun (#47668271) Attached to: Writer: Internet Comments Belong On Personal Blogs, Not News Sites

Whether you agree with the politics of a particular site or not, the easiest solution is just to not enable posting graphics. If someone wants to make an offensive graphic and host it somewhere, fine. But why would anyone running a controversial site allow posting such?

People like posting funny animated GIFs in the comments. Losing that capability hurts the users. But image-posting isn't the problem.

The real problem on Jezebel (as described in Jezebel's own article but strangely ignored in the NY Mag article and the Slashdot summary) is that Gawker/Kinja allows anyone to create a burner account and post comments, with no IP logging or blocking. There is literally no way to stop an abusive user from commenting short of manually banning each new burner account after it posts. This is supposedly to allow anonymous "tipsters" to provide information. As a practical matter, it means the Jezebel editors have to wade through 4chan-style images on a daily basis to keep a clean comment section. Would you like it if your job forced you to look at 4chan? No, you would not. Hence their complaint.

Comment: Re:One of the most frustrating first-world problem (Score 2) 191

by AdamHaun (#47660297) Attached to: Reversible Type-C USB Connector Ready For Production

Look on the bright side - with Type-D they'll figure out how to go reversible and genderless and then we'll be done for good.

Along those lines: the "gendering" (sort of) of USB was deliberate. USB is a master/slave protocol with a host that supplies power and a device that (optionally) consumes it. The cables were designed to prevent people from connecting two hosts together and shorting out their power supplies. The newer USB On-The-Go (OTG) standard lets two dual-role devices connect through a special connector (micro-AB), with power switching and a negotiation protocol to decide which end acts as the host, but it's pretty complicated and requires analog voltage measurement. Fun to have on a smartphone, but massive overkill for most devices.

Comment: Re:Some people are jerks (Score 1) 362

by AdamHaun (#47472739) Attached to: Sexual Harassment Is Common In Scientific Fieldwork

First, let me say that I was talking about workplace harassment. I failed to specify that in my comment.

When I say that the organization is the first responder, I don't mean they're the first, last, and only. People can always call the police (or file a lawsuit), and obviously if your organization covers for harassers then that's the next step. But escalating to the courts is expensive, time-consuming, embarrassing, often bad for your career, and nowhere near certain. Even in severe cases, the police often don't take rape seriously. It seems like the best we can do is have a multi-tiered system of shared responsibility.

Comment: Re:Some people are jerks (Score 4, Insightful) 362

by AdamHaun (#47472211) Attached to: Sexual Harassment Is Common In Scientific Fieldwork

Because I don't see how, if something is already illegal, it also needs to be against "policy". Do all company/university policies have to comb through the entire legal code and duplicate it in policy?

I can think of four reasons:

1. The organization's management is usually the first responder for harassment issues. They're responsible for bringing the people together, they have the authority to set limits on their behavior, they have the ability to monitor and follow up, and they probably know the situation better than law enforcement does. If the harasser needs to be separated from their victim, the easiest way to do that is to fire/expel or relocate them.

2. Not every illegal act can affect your job (or university enrollment). You wouldn't expect to get fired or expelled for speeding, would you? Having a harassment policy makes it clear that harassing your fellow employees/students can get you disciplined or fired.

3. Harassment policies don't just forbid harassment; they also provide rules and procedures for responding to it. Illegal or not, wrong or not, the most common response to harassment complaints is to sweep them under the rug to avoid disturbing the status quo. Even well-intentioned managers don't necessarily know how to handle a complaint without training.

4. Having a strong and effective harassment policy with backing from management affects workplace culture. The default attitude in a lot of places is that making other people uncomfortable for fun is no big deal, even if they repeatedly ask you to stop. A harassment policy says otherwise, encouraging victims to report instead of keeping quiet or leaving.

Comment: Re:Good lord (Score 1) 302

by AdamHaun (#47429273) Attached to: Wireless Contraception

The same ignorance you claimed was responsible for high birth rates in impoverished areas. How does a chip give them more knowledge? That was a rhetorical question, the answer is that it does not.

Of course it doesn't. Where are you getting this? Contraceptives are what people get educated *about*. Nowhere in the article or my comments did anything say otherwise.

The "evil that are already happening" gains more power with this type of technology.

You... don't actually know anything about the problems women face in the world, do you? You might want to do some reading, or better yet, talk to some actual women.

Straw man, it has nothing to do with the points, which is that this technology puts the option for birth control in the hands of a person other than the recipient of the birth control.

You haven't made any points because you haven't given any way in which subdermal chips are different from other contraceptive methods, or what revolutionary new methods of force they allow. I don't think you understand how hormonal contraceptives work at all, otherwise you would recognize that this technology is an incremental change from what is currently possible. Let me spell it out for you.

Hormonal contraceptives work by regulating the hormones involved in the menstrual cycle. The hormones vary naturally over the length of the cycle, but you can boost them to a consistent level by providing more externally. This basically tricks a woman's body into thinking it's pregnant, which prevents ovulation. (There are some variations; this is the simple version.) For our purposes, we're interested in how the hormones are delivered. There are many options, each with their own mix of benefits, challenges, and side effects. Below, I've listed the ones I see in the 19th edition of Contraceptive Technology by Hatcher et al. There are probably more.

* Pills taken once a day
* Patches worn on the body and replaced weekly
* A plastic ring inserted into the vagina and replaced monthly
* Shots given once every three months
* A thin plastic rod inserted under the skin and replaced every three years

Another major option is a T-shaped piece of plastic inserted into the uterus, known as an intra-uterine device (IUD). There are two types -- one that releases copper ions to prevent fertilization (replaced every ten years), and one that releases levonorgestrel to do several different things (replaced every five years). IUDs are the most popular contraceptive method worldwide, but are less popular in the U.S. for historical reasons.

All of these methods are reversible, though they take some time to wear off. Subdermal implants and IUDs need to be removed by a doctor.

Now, let's look at the article to see what's different about the contraceptive chip. The article tells us a few things:

* Releases levonorgestrel daily
* Lasts 16 years
* Size: 20mm x 20mm x 7mm (vs. 40mm long x 2mm diameter for Implanon, the subdermal implant I mentioned above)
* Can be activated and deactivated through the skin with a wireless signal.

We can also guess that no user intervention is required once the device is implanted. Medically speaking this is a good thing, because one of the big problems with many contraceptive methods is compliance. It's easy to forget to take a pill, buy more condoms, or schedule an appointment for your next shot. An IUD or subdermal implant works automatically.

Clearly, the biggest difference is the wireless control. However, in order to have that control the chip still has to be implanted, just like Implanon or an IUD. The key point here is that if you want to prevent pregnancy, the chip is *functionally identical* to those other options. Later, if you want to become pregnant again, the chip is *easier to turn off* since it doesn't require a doctor to remove it. So if it's no harder to disable pregnancy and much easier to re-enable it, then it becomes easier for an individual woman to make the decision for herself, not harder.

Even if all of the available doctors are controlled by an oppressive government, the chip is still no worse. At best, it will be easier for black market services to switch it on or off for you. At worst, it probably won't have any more side effects than the IUD you'd be forced into using anyway.

Comment: Re:My sound card is an A/V amplifier (Score 1) 502

My sound card is hooked up to an AV receiver that I use to drive my 5.1 home theater speaker setup (Energy CB-20s and CB-10s, if you're curious). It works pretty well, but there is a downside: the digital processing in receivers creates audio lag. If you're really sensitive to lip sync, you might want to consider other options. I haven't found a modern receiver without lag yet, but I haven't looked very hard.

Comment: Re:Good lord (Score 1) 302

by AdamHaun (#47421623) Attached to: Wireless Contraception

Claiming that ignorance can be fixed by continued ignorance from a different party is a fools prospect.

I'm not sure what ignorance you're talking about here. How are people going to be unaware that they've had a chip implanted under their skin that stops them from getting pregnant?

There are simply too many nefarious purposes for this type of technology.

But again, functionally, the birth control application isn't much different from a Depo-Provera shot or an IUD. The differences are A) it lasts somewhat longer, and B) you can turn it off without removing it.

If some dystopia decides that fertility is a reward, this technology allows that very easily.

We don't need to fantasize about what hypothetical dystopias might do -- we have an existing one to look at. In China, there are existing technologies that already do what you're talking about. They are backed up by fines and other punishments. Outside of China, trying to force surgery on an entire population is a risky move that could easily provoke a popular uprising. China spent decades under a Stalinist dictatorship before enacting the One Child Policy, and as such they are a pretty extreme case.

If another dystopia decides that soldiers should rape women but don't want pregnancy as a result, well, this allows that as well.

Do you know how hormonal contraception normally works? It doesn't take effect right away. You'd need something high-dosage like a morning-after pill or shot. Subcutaneous chips are designed to release low dosages over long periods of time.

Also, why would someone who's okay with institutionalized rape be worried about their victims getting pregnant?

I don't understand why you're more worried about weird hypothetical dystopias than about the kinds of evil that are already happening.

Comment: Re:Good lord (Score 1) 302

by AdamHaun (#47419859) Attached to: Wireless Contraception

Sorry, perhaps I was unclear. I was responding specifically to your statement about family planning, not to the device described in the article. As for the device, it seems to be targeted at areas where people don't have good access to medicine (probably Sub-Saharan Africa), and things like regular birth control prescriptions or Depo-Provera shots aren't practical.

One could conceivably use this for forced birth control, but I don't see how it improves on forced sterilization or IUD insertion. Forced birth control is used to stop women from ever having children again, not to control fertility timing. Even in China they seem to rely more on fines and forced abortions than contraceptives. Also, lowering birth rates tends to make people wealthier and give them more free time, so if you want to keep your population poor and uneducated, forced birth control seems like a bad idea. And of course, outside of China and (apparently) Uzbekistan, it's forced pregnancy that's the problem, not forced birth control. I'd be more worried about this tech being used for psychiatric medications than birth control.

Comment: Re:Good lord (Score 2) 302

by AdamHaun (#47412109) Attached to: Wireless Contraception

An international coalition of governments, companies, philanthropies, and nonprofits recently committed to providing family planning to 120 million more women in the world by 2020.

Of course those Governments, companies, and philanthropists know best how you should plan a family. Considering how the top .01% of the population (which includes those Philanthropists) control the majority of the wealth, they only have societies best interests in mind right?

"Family planning" is a euphemism for sex education and contraceptive access. Large parts of the world do not provide any sex ed to women at all, even basic stuff like giving them a heads-up before blood starts coming out of their vaginas. Even in the developed world there are many teenagers and young adults whose parents either don't know enough to help, don't want to help, or provide false information when it comes to sex. Family planning services give women the information and tools they need to make their own decisions. Oppression by elites in this context involves keeping women ignorant and afraid so they don't question traditional, patriarchal ideas. Family planning services are the opposite of oppression.

Comment: This is so bizarre I'm not sure what to make of it (Score 3, Insightful) 228

by AdamHaun (#47365135) Attached to: Nathan Myhrvold's Recipe For a Better Oven

The article footer implies that he's some kind of cooking science wizard, but I have trouble believing that Nathan Myhrvold has ever done more with an oven than toss a slab of meat in it. I'm no expert, but I've baked an awful lot of cakes, cookies, breads, and pastries, and I find this article very confusing:

Most of us bake, roast, and broil our food using a technology that was invented 5,000 years ago for drying mud bricks: the oven. The original oven was clay, heated by a wood fire. Today, the typical oven is a box covered in shiny steel or sparkling enamel, powered by gas or electricity. But inside the oven, little has changed.

Weird condescension towards "brick dryers" is a running theme of this article. To see how ridiculous this is, I invite you to consider a nineteenth-century cake recipe, with its many methods for determining correct oven temperature and for shielding parts of the cake from the oven walls so that it bakes evenly. Turning a knob to set an arbitrary temperature, while imperfect, is a *vast* technological improvement over wood-fired ovens. (Remember: just because it's analog (or non-electronic!) doesn't mean it's not technology.) Likewise, the metal the oven is made from represents thousands of years of technological advances in itself.

Preheating always seems to take an unreasonably long time because ovens waste most of the hot air they generate. The actual amount of energy required to reach baking temperature is quite small: Just 42 kilojoules will heat 0.14 cubic meters of air to 250 C. The heating element in a typical domestic electric oven supplies this much energy in a mere 21 seconds. Unfortunately, the heat, which originates in the heating coils of an electric oven or the burner of a gas oven, must pass through the air to get to the walls, and air is an awful conductor of heat, only slightly better than Styrofoam. Even worse, air expands when heated, so much of it flows out of the vent, heating the kitchen rather than the oven.

But the oven walls will heat the air anyway, so how much energy would we really save by heating the walls directly? Pre-heating is only a fraction of the oven's total operating time. And wouldn't an electric burner also produce radiant heat? And then a few paragraphs later:

As soon as you open the oven door to adjust or check on the food, nearly all the hot air spills out. The puny electric element or gas burner is no match for such large surges of cool air, so the temperature in the oven plummets, and it recovers slowly.

which is totally inconsistent with what he said earlier.

At 200 C or below, convection moves most of the heat. But at 400 C, radiant energy starts doing a fair amount of the heat transfer. At 800 C, radiation overwhelms convection. Why couldn’t we have an oven designed to cook primarily by convection at low temperatures that switches to radiant heating for high-temperature baking?

As others have mentioned, this is a Fahrenheit/Celsius error at best and a non sequitur at worst. The highest normal baking temperature is around 500 F (260 C) unless you're going crazy with pizza. If the article's numbers are correct, we should totally ignore radiant heating! (I don't think they are.) And I'm not clear on how the oven is supposed to "switch" to radiant heating. If the walls are hot enough to radiate, you get hot air for free. If the air is hot, it heats the sides of the oven.

Myhrvold next dives into a laundry list of suggested improvements, which fall into a few categories:

1. Stuff that already exists, but is expensive.
2. Stuff that's not done because it's too expensive and/or inconvenient.
3. Complicated gimmicks that require recipe-specific behavior.
4. Star Trek.

And you’re not going to be able to stop a cook from opening the oven door on occasion ... But designers could prevent that blast of cold air by building a blower into the door frame that generates a “curtain” of air whenever the door is opened, retaining more of the preheated air in the oven. ... Designing one for an oven is trickier because the chamber is small and turbulent currents could do more harm than good. Still, it could be done.

Personally, I haven't found the occasional door-opening to be a big deal. It is discouraged for delicate foods like cakes. But clearly we need a complicated, expensive air curtain that either runs constantly or turns on in an instant. Nobody knows how to do it and it might be more trouble than it's worth, but Myhrvold is *sure* that someone (not him) will make it work.

I'm going to skip most of the broiler stuff since I don't broil, but a couple things stood out:

Electric broilers use bars or rods made from Nichrome, an alloy of nickel and chromium (and often iron) that heats up when electricity passes through it. With reasonable energy efficiency, electric broilers can heat quickly and reliably to temperatures as high as 2,200 C. Maximum settings are typically restricted to 1,200 C in order to extend the life of the heating element and avoid charring the food.

Nichrome melts at around 1400 C (2550 F). I strongly suspect this is another unit error.

You can make a broiler cook more evenly with a simple DIY project: Install some shiny vertical reflectors near the edges of the compartment. Another good way to ensure that food browns evenly under a broiler is to wrap the dish with a reflective foil collar. But why should you have to jury-rig a fix? It wouldn’t be hard for oven manufacturers to build reflective materials into the oven. Viking Range is developing stainless steel cavities that are more reflective, but they need better materials that are easy to clean and don’t discolor when heated to high temperatures.

Not sure if I should be more bothered by the idea of removing tarnish from my oven in order to cook, or "all it needs to work is a major advance in materials science".

But technology offers a fix here, too. Oven designers could put optical sensors in the oven chamber... And a camera in the oven could feed to a color display on the front panel ... a decent optics system ...

Let's talk about what "in the oven" means for electronics -- 250 C. Consumer-grade electronics stop working at about 85 C (at best). Higher-grade stuff can go up to 125 C. Beyond that you're talking about serious high-reliability components. Maybe he wants some kind of fiber-optic feed? But that doesn't sound like "a decent optics system". And again, there's more cleaning, which nobody likes.

Even when perfectly calibrated, however, the standard oven thermometer has a big problem. Water evaporating from the surface of the food carries away heat, leaving the food cooler than the surrounding air. Scientists call this temperature at the surface of the food the wet-bulb temperature (that is, the dry-bulb temperature minus any cooling by evaporation). The lower the relative humidity, the more evaporative cooling, so the greater the difference between the two temperatures.

Because most of the humidity in an oven comes from the food, adding food to the oven raises the humidity and, therefore, the wet-bulb temperature at the food’s surface. This means that bigger batches bake faster. Did you burn the holiday cookies last year? Blame your oven’s ignorance of its own humidity.

Cookie dough is usually very dry. The water comes from the butter and the eggs. I know he's looking for something everyone makes as a rhetorical device, but it's still a sloppy and misleading example.

And while you’re at it, perhaps you could solve a related problem: the inadequate precision of the humidity- and temperature-control systems. Both temperature and humidity levels stray from the set points more than they should.

Imprecise set-point control is an obvious cost saving, and tightening it up is a solution to a different problem.

Finally, Myhrvold reaches the place where he should have begun:

Some other interesting technologies that make ovens more useful are already on the market. Newer models of steam ovens by Wolf Appliance

This is where the article loses me. Convection ovens with steam injection are far beyond "brick dryers". They're also only a few thousand dollars, which is expensive but within the reach of a hobbyist. *This* is the state of the art in home ovens. Why did we start with low-end household models? People use those because they're cheap or because that's what their landlord provided, not because they're good. Nobody's going to get any of Myhrvold's fantasy features without massive cost reduction. So is this a speculative article about the furthest reaches of oven technology, or is it about practical ways of improving oven design? Despite tap-dancing past major technical obstacles, the article doesn't claim to speculate. From near the beginning:

Yet oven manufacturers could solve every problem with existing technology, if only they would apply it.

But on the practical side, here's the only mention of increased cost in the article:

An oven could be manufactured today that could incorporate many of the technologies now available and go a long way toward turning today’s mere brick dryer into a true food cooker. It would have wet- and dry-bulb temperature controls, better thermostats, probes to go inside food, and sensors that detect hot and cold spots. All these features would add just a few hundred dollars to the cost of the parts.

I don't know where he got that number from, given that each one of those features is only found in a commercial oven that costs thousands of dollars more than residential models. Regardless, even his lowball estimate is doubling the price of a cheap residential oven. And some of these new features are complicated for the user and break backwards compatibility with existing recipes.

Maybe I'm being too hard on the guy. I should probably have more patience with this sort of breathless "more electronics solves every problem!" attitude, especially since it helps keep me employed. But more electronics won't do anything about my landlord, and that, not theoretical oven design, is the biggest problem with most people's kitchen appliances.

(I still can't get over the "brick dryers" thing. Brick making is a big deal! Has Nathan Myhrvold done anything in his life as useful as making bricks?)

Comment: Re:what's the point anymore (Score 1) 113

by AdamHaun (#47264041) Attached to: Unisys Phasing Out Decades-Old Mainframe Processor For x86

I don't know much (ok, anything at all) about the Libre lines but the Dorado machines have some very unusual characteristics such as 9-bit bytes

Nice to know there's still some non-DSP hardware out there with oddly-sized bytes. Maybe not so nice if you have to develop for it, but it gives me examples to point to when people ask why C's data types are defined the way they are.
