Same story here with a Whirlpool HE front-loader. The washer quickly developed a moldy smell. The clothes often came out of the machine with completely dry spots because of inadequate water levels. It started leaking a few months ago. We replaced the logic board and the front door baffle only to have the problems return. All this despite taking all the preventative steps and using the recommended products as directed.
The service techs who came out to deal with the machine told the same story -- these front-loading HE washers just do not work, given the difficulty of meeting the new efficiency requirements, and the manufacturers will not design them to do so, since it would cut into their profit margins and reduce demand.
I am also now thinking of purchasing a Speed Queen while they are still available.
Also notable -- there is a class-action lawsuit pending against Whirlpool, and it looks like it's getting some traction: http://www.forbes.com/sites/da...
While I'm sure it would allow customized algorithms, they would have to be rather unusual not to be handled by the current generation of geometry/vertex/fragment shaders. Are they thinking of some kind of non-triangular geometry?
The FA mentions voxel rendering for Minecraft-type applications. Although volume rendering can be achieved with traditional hardware accelerated surface primitives, there are many algorithms that are more naturally described and implemented using data structures that don't translate so easily to hardware accelerated primitives.
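For concreteness, here is a minimal sketch in C (the grid name and resolution are hypothetical) of the kind of software traversal a voxel renderer uses: a 3D DDA that steps a ray from cell to cell until it hits a solid voxel. Nothing in it reduces naturally to triangle primitives.

    #include <math.h>

    #define N 64                              /* hypothetical grid resolution */
    extern unsigned char voxels[N][N][N];     /* 0 = empty, nonzero = solid */

    /* March a ray (origin o, unit direction d, in grid units) through the
     * grid; assumes o starts inside the grid.  Returns 1 and the hit cell,
     * or 0 if the ray exits without hitting anything. */
    int raycast(const float o[3], const float d[3], int hit[3])
    {
        int   cell[3], step[3];
        float tMax[3], tDelta[3];

        for (int i = 0; i < 3; i++) {
            cell[i]   = (int)floorf(o[i]);
            step[i]   = (d[i] >= 0.0f) ? 1 : -1;
            tDelta[i] = (d[i] != 0.0f) ? fabsf(1.0f / d[i]) : INFINITY;
            float edge = cell[i] + (step[i] > 0 ? 1.0f : 0.0f);
            tMax[i]   = (d[i] != 0.0f) ? (edge - o[i]) / d[i] : INFINITY;
        }
        while (cell[0] >= 0 && cell[0] < N &&
               cell[1] >= 0 && cell[1] < N &&
               cell[2] >= 0 && cell[2] < N) {
            if (voxels[cell[0]][cell[1]][cell[2]]) {
                hit[0] = cell[0]; hit[1] = cell[1]; hit[2] = cell[2];
                return 1;
            }
            /* advance along the axis whose next cell boundary is nearest */
            int a = (tMax[0] < tMax[1]) ? (tMax[0] < tMax[2] ? 0 : 2)
                                        : (tMax[1] < tMax[2] ? 1 : 2);
            cell[a] += step[a];
            tMax[a] += tDelta[a];
        }
        return 0;  /* ray left the grid */
    }

In a real renderer the dense array would typically be a sparse octree, which maps even less well onto hardware-accelerated surface primitives.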
Constructive solid geometry, vector-based graphics, and ray tracing are also not such a nice fit for the OpenGL and DirectX APIs. You don't always want to have to tessellate geometry that has an analytic expression, such as conics, rational quadratics, B-splines, and NURBS, so a more software-oriented approach can produce better renderings of those kinds of mathematical objects.
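As a small illustration of the tessellation point: a conic arc is exactly a rational quadratic Bezier, so it can be evaluated analytically at any parameter instead of being approximated by a polyline. This is a generic sketch, not any particular library's API:

    /* Evaluate a rational quadratic Bezier at t in [0,1].  With a suitable
     * middle weight w this traces a conic exactly (w < 1: elliptic arc,
     * w = 1: parabolic, w > 1: hyperbolic), e.g. w = cos(theta) for a
     * circular arc of half-angle theta.  2-D points for brevity. */
    typedef struct { double x, y; } Pt;

    Pt conic_eval(Pt p0, Pt p1, Pt p2, double w, double t)
    {
        double b0 = (1 - t) * (1 - t);
        double b1 = 2 * t * (1 - t) * w;   /* weighted middle basis function */
        double b2 = t * t;
        double denom = b0 + b1 + b2;
        Pt p = { (b0 * p0.x + b1 * p1.x + b2 * p2.x) / denom,
                 (b0 * p0.y + b1 * p1.y + b2 * p2.y) / denom };
        return p;
    }

A triangle-based API forces you to sample such a curve at some fixed density up front; a software renderer can evaluate it exactly wherever the image needs it.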
The challenge here is that the graphics primitives that APIs such as OpenGL provide are, of course, those that the hardware can most readily accelerate. If you don't use primitives and operations that can be massively parallelized, you may not get much use out of the hardware.
What's interesting there is that we say it reflects reality because it makes the calculations easier.
That really is the most interesting thing in this discussion. Essentially we are making a leap of faith: that simpler models are more likely to be true, as long as they continue to fit the data and allow us to make predictions. But it is at root an aesthetic judgement: beauty is truth, and truth is beautiful. It is the essence of rationality.
It's cool to see how Feynman's diagrams may be like the epicycles of the earth-centered view of the universe: they can be made to work as long as you keep refining the model, adding loops within loops within loops. But with this new breakthrough, all that can be thrown away for a much simpler model that leads to deeper insights. And those deeper insights are awe-inspiring: locality and unitarity as emergent phenomena.
X was so "ahead of its time" that its entire architecture was dumped in version 10 to give way to X11, and then it remained so far ahead of its time that to this day NextStep, Mac OS X, Android, and Windows have yet to adopt a single thing from it, unlike the rest of Unix, most of which has made its way into those operating systems.
Mac OS X, Android and Windows are consumer operating systems, for which eye-candy UIs are considered more important than network transparency. Their remote connectivity needs are limited to accessing corporate Web, cloud, and IT services, not other peers on the network.
NeXT had a great OS that used Display PostScript as the rendering engine, wrapped in a networked desktop environment, NextStep, and NeXT machines were used with X11 and with NeWS (Sun's Network Extensible Window System) as well. I did find NextStep and NeWS superior to X11, and it's a damn shame they didn't succeed (although NextStep evolved into OS X, and Apple included a rootless X11 implementation with it until Mountain Lion).
As for other companies, there were entire industry consortiums dedicated to expanding X and Unix, such as X/Open and the Open Software Foundation: these included companies like AT&T, DEC, Unisys, Hewlett-Packard, IBM, Sun, Prime, and Apollo.
And no, it was not designed to access resources from the desktop. It was mainly designed so that you could use a dumb terminal to access your server. When it became clear that was pie in the sky, instead of redesigning the turd, they just added layer upon layer of cruft, so you ended up with a dumb-as-doornails protocol running on a heavyweight, expensive "dumb" terminal.
The dumb terminal at that time was a VT100. X was designed to run on bitmapped displays. Although there were bitmapped terminals available at the time, X mostly ran on engineering workstations. You didn't usually use it to access a server (although you could); rather, other networked peers used it to display a UI on your local X display. I'm not sure why you think that is "pie in the sky", since it worked and continues to work rather well. Part of the reason is that the protocol was rich enough to transmit graphics primitives at a higher level than a bitmap. Nothing dumb about it.
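To make the "networked peers" point concrete, here is the classic minimal Xlib client (build with cc demo.c -lX11). Run it on a remote machine with DISPLAY pointed back at your workstation, or over ssh -X, and the window appears on your local display; the XDrawLine call travels over the wire as a protocol-level drawing request, not as pixels.

    #include <X11/Xlib.h>
    #include <stdio.h>

    int main(void)
    {
        /* NULL honours $DISPLAY, which is what makes this network
         * transparent: set DISPLAY=yourworkstation:0 on the remote end. */
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy) { fprintf(stderr, "cannot open display\n"); return 1; }

        int scr = DefaultScreen(dpy);
        Window win = XCreateSimpleWindow(dpy, RootWindow(dpy, scr),
                                         10, 10, 200, 200, 1,
                                         BlackPixel(dpy, scr),
                                         WhitePixel(dpy, scr));
        XSelectInput(dpy, win, ExposureMask | KeyPressMask);
        XMapWindow(dpy, win);

        for (;;) {
            XEvent ev;
            XNextEvent(dpy, &ev);
            if (ev.type == Expose)   /* drawing goes out as an X request */
                XDrawLine(dpy, win, DefaultGC(dpy, scr), 10, 10, 190, 190);
            if (ev.type == KeyPress)
                break;
        }
        XCloseDisplay(dpy);
        return 0;
    }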
Lastly, the web browser has nothing to do with Unix. It is platform-independent. The fact that you think the web == Unix shows how little you know about deep OS architecture.
Don't be silly, I'm not conflating the Web with Unix. Sure, web browsers are supported by most computing platforms. But the web browser's roots in Unix go way back to NextStep and the beginnings of the Internet, at that time mostly Unix-based, and the web browser remains a central and crucial component of desktop Linux. My main point was that cleaning up web browser architecture would be vastly more useful and relevant than replacing a stable and functional part of Linux with something that is less useful, but prettier.
X is one of the few remaining *big* mistakes in Unix. It was designed with the wrong philosophy and overtaken by actual usage. Wayland is an effort to clean up and refactor the code.
X was ahead of its time and nothing ever caught up to it. It was designed around the idea that all the resources of the network should be seamlessly accessible from a single user's desktop, and embodied the old Internet ideal of ubiquitous peer-to-peer connectivity (still perfectly reasonable and incredibly useful on a secure LAN). Wayland is an effort to make it easier to develop eye-candy user interfaces for consumers and throw out any functionality that gets in the way of that goal. It's totally appropriate for mobile but unnecessary and counter-productive for the desktop.
Computers don't actually think. You just think they think. (We think.)