- File or buy a load of patents and then, the next time someone independently invents something you've patented, ask for royalties and sue them if you don't get them.
- Design things of value and sell the rights to use the designs to companies that would end up paying more if they developed something in house.
There are a load of companies in the second category that are very profitable and usually respected. It's the ones in the first category that give them all a bad name.
The article makes little sense. The site of the DEEP project is more useful. It has the look of an EU publicly funded boondoggle. Those have a long history; see Plan Calcul, the 1966 plan to create a major European computing industry. That didn't do too well.
The trouble with supercomputers is that only governments buy them. When they do, they tend not to use them very effectively. The US has pork programs like the Alabama Supercomputer Center. One of their main activities is providing the censorware for Alabama schools.
There's something to be said for trying to come up with better ways of making sequential computation more parallel. But the track record of failures is discouraging. The game industry beat their head against the wall for five years trying to get the Cell processors in the PS3 to do useful work. Sony has given up; the PS4 is an ordinary shared-memory multiprocessor. So are all the XBox machines.
It's encouraging to see how much useful work people are getting out of GPUs, though.
I wouldn't put it past Verizon to do that, but one of my colos peers primarily with Cogent, and Cogent blows up internet connectivity from that colo all the time, an issue I just don't have in my other colo. Honestly, I don't think Cogent has the moral authority to be able to assert anything.
If you have some system like Slashdot's which moves junk comments to lower rating positions, this isn't a problem. Everybody should get to blither, but nobody should have to read or listen to their blithering.
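The mechanism is simple enough to sketch. This is a minimal illustration of the idea, not Slashdot's actual implementation; the function and field names are made up:

```python
# Slashdot-style ranking sketch: every comment stays in the thread,
# but low-scored ones sort to the bottom and are flagged as collapsed
# below a reader-chosen threshold.  Nothing is deleted -- everybody
# gets to blither, but nobody is forced to read it.
def rank_comments(comments, threshold=0):
    # comments: list of (text, score) pairs
    ranked = sorted(comments, key=lambda c: c[1], reverse=True)
    # third field: True if shown expanded, False if collapsed
    return [(text, score, score >= threshold) for text, score in ranked]

thread = [("insightful point", 5), ("blither", -1), ("decent reply", 2)]
print(rank_comments(thread))
```

The junk comment ends up last and collapsed; readers who want it can still expand it.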
Space-X is planning a manned flight in 2015. Space-X will have their own private astronauts. They'll probably be ex-NASA astronauts initially, test-pilot types. Once Space-X has flight crew, that will probably be the place to go.
NASA still has 49 active astronauts, most of whom (but not all) have been in space. They don't have much to do. There are a lot of recent astronaut layoffs and quits. NASA had over 80 astronauts at peak. That's not where you go if you want to go into space. It's surprising that they're training new ones, but they probably want to keep the training operation going.
Being a former astronaut sucks today. It's not like being John Glenn. Ex-astronauts used to have a pass to NASA facilities, but one found out last year that his pass to HSC was no longer valid. Even worse, they're now portrayed as unemployed losers on TV. There was an episode of Blue Bloods on this, where a has-been alcoholic ex-astronaut is begging his old friend the police commissioner to find him a job.
The Periodic Table isn't a model, or at least not a functional model. It's a chart - a way to represent data.
It's more than a chart. A table is not just a way to represent data; a simple list of all items in random order can represent the data just as well as a table can. A table is a way to organize data -- by spotting patterns, identifying which patterns are most important, then arranging the items to highlight those patterns. By choosing which patterns are important, you are implicitly constructing a model of what the items in the table are.
The Mendeleev-derived periodic table has done quite nicely for us: it predicted the properties of many elements long before we actually isolated them, and it was doing so well before we understood that the patterns highlighted by the table (the table's implicit model) were ultimately caused by the arrangement of electrons into quantum-mechanical energy-level shells by way of Pauli exclusion, with the arrangement of elements in each row directly dependent on the quantized degrees of freedom in each shell's energy level (hence the 2*, 2*[1+3], 2*[1+3+5], 2*[1+3+5+7] pattern in the table's row widths). Think of the table as a quick first-order approximation to the deeper equations needed to compute the true physics, such as the energy of a filled d-orbital in the third electron shell. A more complex table with an extra dimension or two of symmetry might be able to capture more patterns, giving us a more detailed model that produces better, more subtle approximations than the Mendeleev-derived model can yield; yet that new model would still bypass the tough work of calculating how electrons actually behave when packed around a single nucleus. (Or perhaps we could capture some symmetry affecting how an atom forms molecular bonds, or a nucleon symmetry that gives better predictions of stability and half-life or that better captures why the stable proton:neutron ratio isn't a perfectly smooth curve.)
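That row-width pattern falls straight out of the counting. A minimal numerical check (the function name is my own, not from any source):

```python
# Each subshell with angular momentum quantum number l holds 2*(2l+1)
# electrons: 2 spin states times (2l+1) allowed values of m_l.
# Summing over subshells reproduces the 2, 2*[1+3], 2*[1+3+5],
# 2*[1+3+5+7] pattern in the table's row widths.
def subshell_capacities(l_max):
    return [2 * (2 * l + 1) for l in range(l_max + 1)]

print(subshell_capacities(3))
# -> [2, 6, 10, 14]  (the s, p, d, f block widths)

print([sum(subshell_capacities(l)) for l in range(4)])
# -> [2, 8, 18, 32]  (widths of the longest rows at each depth)
```

The 32 at the end is why the lanthanide/actinide rows are as long as they are.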
About a decade ago, there was a fad for "smart office buildings". The concept was that companies would get their computing resources (or at least their networking resources) from the building landlord. It didn't work out well. Property managers are terrible at network operation. The landlord mindset of doing minimal work on maintenance and the data center operations mindset of 24/7 availability were too far apart.
As for PCs, you can program your decoder in CUDA or OpenCL, so "hardware support" is not very important.
Mobile GPUs are also programmable, but without knowing the details of the algorithms involved it's hard to say what kind of speedup you'll get from a GPU. In general, later generations of video CODECs require inferring more from larger areas and so are less amenable to the kind of access that a GPU's memory controller is optimised for. Just doing something on the GPU isn't an automatic speedup, and until we see real implementations it's hard to say exactly how much better it will be.
Now in theaters:
- Superman N+1
- Fast and Furious 6
- The Hangover, Part 3
- Star Trek N+1
- Iron Man 3
Coming Soon, Monsters N+1 and Despicable Me 2.
Hollywood has a severe idea shortage.
True, there are some things supercomputers can do well, but the same effect can be reached with distributed computing, which, in addition, makes the individual CPUs useful for a range of other things. Basically, building supercomputers is pretty stupid and a waste of money, time and effort.
People don't build supercomputers for no reason, especially when HPC eats up a large part of their budget.
The main application of supercomputers is numerically solving partial differential equations on large meshes. If you try that with a distributed setup, the latency will kill you: the processors have to talk constantly to exchange information across the domain.
As someone pointed out, modern supercomputers are like distributed computing, often with commodity processors. They look like (and are) giant racks of processors. But they have very fast, low-latency interconnects.
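The communication pattern that makes latency so painful is easy to see in a toy example. This is a serial sketch of domain decomposition with ghost-cell (halo) exchange, not a real MPI code; on an actual cluster the halo step is a message exchange on every single iteration, which is exactly where a slow interconnect kills you:

```python
import numpy as np

# Toy 1-D Jacobi relaxation step, domain-decomposed the way a
# supercomputer PDE solver would be: each "rank" owns a slab of the
# mesh plus one ghost cell on each side.  Step 1 below is the part
# that becomes network traffic every iteration on a real machine.
def jacobi_distributed(u, nranks, steps):
    slabs = np.array_split(u.copy(), nranks)   # split mesh across ranks
    for _ in range(steps):
        # 1. Halo exchange: each slab fetches its neighbours' edge
        #    values (boundary slabs just reuse their own edge value).
        padded = []
        for i, s in enumerate(slabs):
            left = slabs[i - 1][-1] if i > 0 else s[0]
            right = slabs[i + 1][0] if i < nranks - 1 else s[-1]
            padded.append(np.concatenate(([left], s, [right])))
        # 2. Local compute: plain Jacobi average over neighbours.
        slabs = [0.5 * (p[:-2] + p[2:]) for p in padded]
    return np.concatenate(slabs)
```

The compute in step 2 is embarrassingly parallel; it's the unavoidable per-iteration exchange in step 1 that a fast, low-latency interconnect exists to serve.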