Also, don’t make REAL bomb threats either. And don’t set off bombs to kill innocent people.
I think that’s good advice too.
It’s not about “green idiots.” It’s about the fact that chips will melt (burn? fry?) if you don’t keep them cool, and you can only dissipate so much heat from air cooling. Water cooling is used in HPC systems, but that too only goes so far. What’s next? Everyone needs a supply of liquid nitrogen to run their desktop PCs?
The “power wall” is a real, practical problem, which we hit somewhere around 2001, when power dissipation reached ~150 Watts in high-end systems. And the challenges go beyond cooling. Did you know that half the pins (around 1000) on a modern CPU are used just for power and ground? Do the math on trying to get 150 Watts at 1 Volt through a single pair of wires.
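A rough back-of-the-envelope (the 150 W and 1 V figures are from above; I’m assuming roughly half of those ~1000 power/ground pins carry supply current):

$$ I = \frac{P}{V} = \frac{150\,\mathrm{W}}{1\,\mathrm{V}} = 150\,\mathrm{A}, \qquad \frac{150\,\mathrm{A}}{\sim 500\ \text{power pins}} \approx 0.3\,\mathrm{A}\ \text{per pin} $$

Pushing 150 A through one pair of wires is obviously hopeless; spreading it across hundreds of pins is what keeps the per-pin current and the resistive losses manageable.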
Oh, and what about mobile computers? Current battery technology can only hold so much charge. Do you want your cell phone to get only an hour of useful life before recharging?
Paul Graham liked to blog about how awesome Lisp is. Apparently, he did some web back-ends in Lisp and was able to stay ahead of the competition. Now, Lisp has some awesome features. The two that stick out to me are (a) Lisp macros, which are arbitrary Lisp code run at compile time that emits arbitrary Lisp code that gets compiled, allowing some seriously powerful constructs to be created very concisely, and (b) massive libraries. That being said, I suspect that Paul Graham and his cohorts were more successful at using Lisp compared to another back-end language because they were both very skilled at the language and also super smart. The language they chose is actually MUCH less important. If they’d chosen another language, they would probably have done just about as well.
If you’re not a super smart programmer, I recommend Java (among other languages). Reference semantics (no raw pointers to mismanage) and garbage collection obviate a lot of coding mistakes. And with JIT compilation, you get pretty darn fast code. Under 99% of circumstances Java performance is way better than adequate and makes MUCH better use of programmer engineering time.
But if what you want is super fast performance, a super smart C++ programmer will beat a super smart Java programmer. It’s a lot HARDER to get better results from C++, but the ceiling is higher. Some reasons for this:
- In memory-constrained environments, garbage collection imposes some overhead. I’ve worked on huge programs near the VM size limit whose performance was limited by GC performance. Most of the time, incremental GC in another thread is a win, but it can be a huge liability when memory is tight. By contrast, manual memory management in C++ lets you make tighter use of the memory space and performs better in a memory-constrained environment.
- In a CPU-constrained environment, background GC steals cycles away from computation.
- Just as macros are a huge win in Lisp, C++ templates generate customized code at compile time that can have huge performance benefits. This is why C++ sort is faster than the C library qsort: the C++ sort is a template that inlines the comparison function for the type you’re sorting, rather than calling through a function pointer (see the sketch after this list). In Java, you MIGHT get some of this from a fabulous JIT compiler.
- Compiling to bytecode loses a lot of information. If you used gcj to compile Java to native code, there’s the potential for less information loss and therefore better optimization based on programmer intent. But normally, Java uses bytecode as an intermediate. With more knowledge of programmer intent, the C++ compiler can make smarter optimizations.
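To make the template point concrete, here’s a minimal sketch (the container and the comparison are just illustrative):

```cpp
#include <algorithm>
#include <cstdlib>
#include <vector>

// C-style: qsort calls the comparator through a function pointer on
// every comparison, which the compiler generally cannot inline.
static int cmp_int(const void* a, const void* b) {
    int x = *static_cast<const int*>(a);
    int y = *static_cast<const int*>(b);
    return (x > y) - (x < y);
}

int main() {
    std::vector<int> v = {5, 3, 9, 1, 7};

    // C library qsort: indirect call per comparison.
    std::qsort(v.data(), v.size(), sizeof(int), cmp_int);

    // C++ std::sort: the comparator is part of the template
    // instantiation, so the compiler generates code specialized for
    // int and can inline the comparison entirely.
    std::sort(v.begin(), v.end(), [](int x, int y) { return x < y; });
    return 0;
}
```

Same basic task, but the template version hands the optimizer a compile-time view of the comparison, which is exactly the kind of programmer-intent information I’m talking about.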
I could add more things, but I have other real work to do. Before I go, I’ll cap this off with two practical thoughts. You CAN get better performance from C++ than Java. Do you WANT to?
- Java has massive libraries too, where the critical parts are written in optimized native code, so if you make heavy use of Java libraries, you’ll see almost no difference in performance with regard to CPU throughput (GC being a separate issue). With no perceptible impact on performance, less code to write, fewer common programming errors, and better use of engineering time, Java is quite often just an all-around practical win over C++. I say this as a programmer who prefers C++.
- If you’re REALLY REALLY concerned about performance, use Fortran. The language is more restrictive, giving the compiler even more freedom to optimize. (For instance, procedure arguments are assumed not to alias, and classic Fortran didn’t even allow recursion.) There’s a sketch of the aliasing point below.
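To illustrate the aliasing point in C++ terms (a sketch; __restrict is a common compiler extension rather than standard C++, and the function names are made up):

```cpp
// Without a no-alias guarantee, the compiler must assume a store to
// dst[i] could modify src[j], so it is conservative about reordering
// loads and stores and about vectorizing.
void scale(float* dst, const float* src, float k, int n) {
    for (int i = 0; i < n; ++i)
        dst[i] = k * src[i];
}

// __restrict is the Fortran-style promise that the arrays do not
// overlap, which frees the compiler to vectorize aggressively.
void scale_restrict(float* __restrict dst, const float* __restrict src,
                    float k, int n) {
    for (int i = 0; i < n; ++i)
        dst[i] = k * src[i];
}
```

Fortran gets that guarantee by default for procedure arguments, which is one reason it still holds up so well on numeric kernels.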
Your resume is examined first by a totally nontechnical HR person. To them, the job position requires that you have 10 years of experience in Blurg and 5 in Blarg. They see 1000 resumes a week and therefore must filter them quickly by selecting checkboxes. There is no room whatsoever for fuzzy logic here. Either you list the skills and pass to the next round, or you don’t list the skills, and your resume gets thrown in the can. The people writing the job descriptions are often only marginally technical themselves, so they don’t necessarily know whether the combo of skills they compiled by committee is even reasonable.
And I don’t know a way around this. You simply cannot have all the technical people filtering the resumes. They have other work to do. I think at Google, the technical people get involved at a lower level of the process, and I get the impression it’s a burden.
We still have a lot of DVDs. When we put one in, I tend not to notice the resolution much, except that the bluray player makes the closed captions really blocky (cheap Samsung that doesn’t upconvert). Then when we put in a bluray, I DO notice the difference, because of the added crispness and more vibrant colors. But on this 60” TV we have, I can’t imagine how I would see any finer resolution. The resolution and distance from the eye already make it so that there’s no way I can make out individual pixels. How would a higher res display make any perceptible difference?
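Rough numbers, assuming a 1080p panel and a viewing distance of about 2.5 m (both assumptions mine): a 60” 16:9 screen is roughly 133 cm wide, so the pixel pitch is about 0.7 mm, and each pixel subtends

$$ \theta \approx \frac{0.7\,\mathrm{mm}}{2.5\,\mathrm{m}} \approx 2.8\times10^{-4}\,\mathrm{rad} \approx 1\ \text{arcminute}, $$

which is right at the ~1 arcminute limit usually quoted for 20/20 vision. Anything finer than that is below what the eye can resolve at that distance.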
So often, we see companies making shoddy products manage to continue to snow customer after customer. It’s finally nice to see one actually get what they deserve. Everything I’ve ever gotten from OCZ had some kind of major problem. The only thing worse than their products was their customer service.
Mind you, MSI wasn’t any better.
These processors are like an Intel version of Sun’s Niagara, but with wider vector units. Actually, from an architectural perspective Xeon Phi (Larrabee) is pretty basic: an array of 4-way SMT, in-order, dual-issue x86 cores, each with a 512-bit vector unit. I think one of the major reasons Xeon Phi doesn’t compete well with GPUs on performance is the legacy x86 decode and translation hardware taking up so much die area. Anyhow, if you have a highly parallel algorithm, Xeon Phi can be a boon for performance.
But as we know, there are numerous very important algorithms that are not parallelizable.
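And even when most of a program does parallelize, the serial fraction caps the benefit. Amdahl’s law makes the point (the 90% parallel fraction below is just an illustrative number):

$$ S(N) = \frac{1}{(1-p) + p/N}, \qquad p = 0.9 \ \Rightarrow\ S(N) < 10\ \text{for any}\ N $$

So a code with even a modest serial component will never come close to using the full width of a Xeon Phi or a GPU, no matter how many cores you throw at it.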
In the days of kings, someone would typically come to power because they were a powerful warrior. Indeed, in medieval Europe, the economy was based on a number of monarchs frequently going to battle with each other over land and resources. If you were a king of England and you didn’t try to take over some part of France during your reign, you were a failure. (This explains the right of succession by blood. They didn’t know about DNA, but they did know that relatives had similarities, and they wanted people similar to successful past rulers.) Interestingly, the most successful monarchs were those who were loved by their own people (good management ability) and feared by everyone else (mindless slaughter of people in foreign lands). This delicate balance between aggression and empathy was hard to find, and looking at the history of the English monarchy, not everyone managed it. It sounds like Ender’s Game: in the history of the English monarchy (which I am a bit less ignorant of than others), there were plenty of Valentines and Peters whose reigns ended in one kind of dismal failure or another, while the Enders are the ones well known to history. In the abstract, this sounds cool, except that Ender and those successful kings were responsible for the widespread slaughter of countless people.
So this idea of returning to a monarchy sounds really bizarre to me. Rule by the one or few is not a recipe for peace, security, or freedom. In medieval Europe, if you were a peasant, you might live out your life unmolested, or you might fall victim to the whims of a foreign army or your own. Peasant life was essentially worthless except for the bit of farming they could do. This sort of attitude was the case into the 19th century. Have a look at the way the English treated the Irish when the potato blight killed off their only economical source of food. The Irish were under English rule, but apparently not under English protection, because all Parliament did was quibble while people starved to death. We also tried communism in several countries. The Soviet Union fell due to a collapsing economy, and China systematically converted to capitalism. Of course, capitalism is a system of economy, and China is still a dictatorship, but it’s a step in the right direction. Basically, when your life and your work have no value, then you have no motivation to work, except under the whip. So what these monarchists are suggesting is a return to slavery.
This isn’t the Christian fantasy of Jesus returning to earth to rule as a benevolent king. People will come to power because they want power, and then they will maintain that power by destroying others. We have that happening in our republics today. The differences are that (a) people are elected or not based on how their constituents perceive the representative to further their interests, (b) there are enough conflicting opinions that sometimes the bad ideas get filtered out, and (c) we have a judicial system that can find bad laws unconstitutional and overrule them. (Frankly, I think the executive branch in the US has too much power and is a vestige of the US legal system being a derivative of the English legal system, which has a figurehead king. We get to elect ours, but ours don’t seem to be very effective at anything other than being a scapegoat for the failures of the legislative branch.) Basically, a republic has problems, but a dictatorship is much much worse.
And let’s not forget to address the baloney about returning to traditional gender roles. As a society, we’re only beginning to respect individual human rights and dignity, regardless of ethnicity, sex, and sexual orientation. If we’re going to experiment with totalitarianism, why don’t we try putting some women in control? Oh, sure, they’ll screw it up too. Humans in power always do. But at least it won’t be a bloodbath.
I skimmed over the article, but I couldn’t tell: is this another example of someone choosing an O(n^2) algorithm over an equivalent O(n log n) algorithm and then patting themselves on the back because they sped up the O(n^2) algorithm through parallelism, even though it’s still slower than the O(n log n) algorithm on a single processor?
I can’t tell because they go on about NFAs, which (at least with the usual backtracking implementations) can be exponentially slower than DFAs. That’s a totally different thing.
There are plenty of algorithms that benefit from supercomputers. But it turns out that a lot of the justification for funding supercomputer research has been based on bad math. Check out this paper:
It turns out that a lot of money has been spent to fund supercomputing research, but the researchers receiving that money were demonstrating the need for this research based on the wrong algorithms. This paper points out several highly parallelizable O(n^2) algorithms that researchers have used. It seems that these people lack an understanding of basic computational complexity, because there are O(n log n) approaches to the same problems that can run much more quickly, using a lot less energy, on a single-processor desktop computer. But they’re not sexy because they’re not parallelizable.
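The size of that gap is easy to underestimate. A rough illustration (the problem size and processor count here are numbers I picked, not figures from the paper):

$$ n = 10^8:\qquad n^2 = 10^{16}\ \text{ops}, \qquad n\log_2 n \approx 2.7\times10^{9}\ \text{ops} $$

Even spread perfectly over 10,000 processors, the O(n^2) approach still does 10^12 operations per processor, hundreds of times more work per node than the O(n log n) approach does on one desktop machine.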
Perhaps some honest mistakes have been made, but it trends toward dishonesty as long as these researchers continue to use provably wrong methods.
I’m serious. This prevents other companies from making it easy on the NSA. Facebook will never make any royalties from it, and they’ll likely never implement the whole system. I love it. It’s like the GPL: Using one kind of law (IP) against another.
Actually, I'm sure it has happened elsewhere, but when I was in junior high school, they banned book bags on the grounds that in two years they had found two firearms. But this no-touching thing trumps all other absurdities.
Wait, I thought we needed to reverse the polarity of the neutron flow.
While many people learn a lot in college (I hope), the first thing that an employer learns when they find that you have a college degree is that you are likely to be able to finish something complex. There are lots of people without college degrees who can see complex and difficult things through to completion, but that is much harder to glean from glancing at a resume for two seconds. And that's all the time you get, because they go through massive numbers of resumes. And the fact is, most companies are less interested in employees who are smart than those who can follow instructions and work (however inefficiently) until they finish something.
Back in the late 90's a friend of mine worked for a "data services" arm of a well-known communications company. They had a very successful process for developing large applications on time, on-spec, and on-budget, and it was designed around having morons do the work. A handful of people at the top did the design work, which trickled down through layers of less and less skilled workers until you got to the bottom. At the bottom, the code monkey (not necessarily their terminology) would have a stack of sheets of paper, each describing one function or procedure to write. It would describe the function name, the inputs, the outputs, and the algorithm to be coded. The algorithm was described in such detail that even the least skilled coders could do the job. And then it would be reviewed by someone else to make sure it did the job, integrated with the growing application, etc. Now, while a handful of scrappy coders could often complete projects in less time, what this big company had was predictability, so they could enter into a contract where they could be precise about the time and cost from the outset.
Unless you understood their business model, you could find their hiring criteria counter-intuitive. But what they wanted was cheap college graduates willing to do drudge work. If you could play dumb and do the job, then you could gradually work your way up the chain. But in general, a smart 'rebel' type would never get hired there, nor would they generally want to be. Linux geeks are used to thinking of computer programmers as smart, but that's not how the business world sees them. Coders are a commodity to be bought and sold like corn (and just as lacking in useful content).
I don't know about other people, but I had nothing but bad experiences with their DRAM products. I would call their tech support and usually get voicemail. They would never return those calls. If I called another department (sales always answered), they would just forward me to the same voicemail. If I was persistent enough, calling enough times per day, I might get someone on the phone in technical support.
Their "performance" DRAM products seemed to deteriorate over time. I would configure my system with the exact voltage and timing numbers they specified and run a burn-in test. It would work great for the first couple of days. Perfect stabillity, good performance. Memory tests, kernel compiles, everything was great. But after the first few DAYS, it would all go to hell. There were no hard memory errors, but the system would start crashing during compiles. With a lot of effort, I managed two exchanges with OCZ (so that's three pairs of DIMMs I tried in sequence), and each set went through the same pattern -- worked great then started failing. After the third set, I paid the restocking fee with Newegg and bought form Crucial. I have no idea what the problem was, but OCZ was not interested in figuring it out.
Real programmers don't bring brown-bag lunches. If the vending machine doesn't sell it, they don't eat it. Vending machines don't sell quiche.