I've got one of these on my desk as I write. I've actually been working with it for several months now, and it's pretty sweet. It's intended to be a DSP co-processor coupled to an FPGA. The company I work for (BittWare) has invested heavily in Adapteva, and we are introducing some boards featuring a handful of 16-core Epiphany chips (which we have rebranded as "Anemone") and an Altera Stratix V FPGA.
The tools are Linux-only at this point, but that's more than OK by me. I think this is the first time I've ever not been forced to use Windows to develop code for a new processor.
The target application is anything that requires lots of DSP but can't burn many watts.
</shameless plug>
I thought of a possibly viable path the industry could take toward converting to DC. What if a computer manufacturer started offering desktop machines with a UPS integrated into the power supply? The marketing case for this is somewhat compelling - it makes the desktop behave more like a laptop, with no need for a separate UPS, etc.
They could at the same time take the further step of providing UPS-backed DC outlets on the PSU itself, and then sell other equipment that would plug into these DC outlets - routers, cable modems, printers, monitors, etc. One advantage for the manufacturer here is that they would no longer need to provide region-specific wall-warts for small equipment.
Alternatively, a manufacturer could make a UPS with DC outlets as well, so this wouldn't be limited to desktop systems. Third parties would spring up to provide cables connecting the router you already have to this DC outlet in place of the wall-wart. Why buy a $60 router when you could get the same effect from a $5 replacement cable?
Once those devices become widely deployed, it's a short jump to DC outlets in the walls. Once that happens, the desktop no longer needs a UPS-backed AC supply - it could just have a DC cable like all the other gear. From there it's a short hop to in-home, battery-backed, off-grid (or aux-grid) power, be it PV, wind, or whatever. Then my innernets would stay up even when an ice storm takes out the grid.
Seems to me they invented the reverse of the process that's really needed. It's a lot harder to get enough skin for grafting than it is to get blood for transfusions. Wouldn't skin-from-blood be the better conversion?
"If I do not want others to quote me, I do not speak." -- Phil Wayne