DuckDodgers writes: "The database company Ingres has announced a partnership with the research firm Vectorwise to bring an efficiency breakthrough in database engines to market. They assert that many complex queries run by a database engine can be over 100 times slower than a hand-coded C++ program that extracts the same information from the files on disk. They're working on a database engine that closes the gap dramatically through several techniques: processing tuples in batches sized to fit in the processor's on-chip cache, minimizing traffic back and forth between RAM and the processor cache, and structuring the data so as to make the best use of CPU branch prediction. The example in their whitepaper (which unfortunately requires registration) is a moderately complex aggregate query against 6 million rows of data that takes 16 seconds in a conventional database engine, 0.04 seconds in a C++ application built to do the same thing, and about 0.2 seconds in their optimized database engine. The press release is here, and some of the technical details are discussed on this blog (no, not mine): Next Big Future. Is this impossible, impractical, or well within the realm of possibility? And if it can be done, why haven't we seen it before?"