Agreed. Have you tried Glances? In some ways it is like htop on steroids.
The first thing I install is a system monitor.
I like to keep a close eye on CPU usage, memory usage, disk usage, and network usage. Without that information it feels like I'm flying blind. It is especially important on a new system, when I don't yet know what is running and consuming resources.
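If you just want those four numbers without a full TUI, a one-shot snapshot is easy to script. Here is a minimal Python sketch using the psutil library (assuming it is installed; the formatting is just my own choice):

```python
# One-shot snapshot of the four numbers I care about, using the
# third-party psutil library (pip install psutil).
import psutil

cpu = psutil.cpu_percent(interval=1)     # percent, sampled over 1 second
mem = psutil.virtual_memory().percent    # percent of RAM in use
disk = psutil.disk_usage('/').percent    # percent of the root fs in use
net = psutil.net_io_counters()           # cumulative byte counters

print(f"CPU:  {cpu:5.1f}%")
print(f"RAM:  {mem:5.1f}%")
print(f"Disk: {disk:5.1f}%")
print(f"Net:  {net.bytes_sent} B sent / {net.bytes_recv} B received")
```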
The radioactivity released at Chernobyl escaped upward into the air. This made it easier to get a handle on the magnitude of the total amount of radioactivity released. The release at the light water reactors at Fukushima is for the most part traveling downward, to basements, tunnels, ground water, and the ocean. This makes it extremely difficult to get a handle on the total amount of radioactivity that has been released. They really don't know [if] the bulk of it is in the thousands of tons they have already discovered or if that is just the tip of the iceberg.
Of course I was called an alarmist and other things for bringing this up back then.
Clearly what they had discovered by April 1, 2011, was just the tip of the iceberg. As I had predicted, it is the radioactive water that is the main cause for concern.
Einstein was wrong. DICE does play God with the Slashdot-verse.
Algorithmic information theory (AIT) explains very clearly and simply why we are still writing text-based code. AIT is based on the idea of measuring the amount of information in a series of bits (or bytes or however you want to chunk it) based on the size of the smallest possible program that can create the series.
There are simply not enough bits of information in a GUI-based coding system to create the algorithms we want/need to create. Even though almost all programming languages have a lot of redundancy built in to make them easier to understand, programs written in these languages still carry far more information than is available through simple point-and-click, which is equivalent to a series of multiple-choice questions. For example, 80 multiple-choice questions with 100 options each give you only about as much information as a single line of 80 ASCII characters.
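If you want to check that arithmetic yourself, here is a back-of-the-envelope sketch in Python (the figure of 95 printable ASCII characters is my own assumption):

```python
# Back-of-the-envelope check of the claim above: log2(options) bits per
# question, versus a line of printable ASCII characters.
import math

questions, options = 80, 100
menu_bits = questions * math.log2(options)   # ~531.5 bits from the menus
line_bits = 80 * math.log2(95)               # ~525.6 bits for 80 printable chars

print(f"80 questions x 100 options: {menu_bits:.0f} bits")
print(f"80 printable ASCII chars:   {line_bits:.0f} bits")
```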
Shouldn't there be a simpler, more robust way to translate an algorithm into something a computer can understand? One that's language agnostic and without all the cryptic jargon?
I believe people have tried to make universal programming languages. I don't think any of them caught on in the sense of replacing coding in real programming languages, and for very good reasons. One problem is the conflict between simpler and more robust. Shorter programs require higher information density and hence less redundancy and robustness. If you want to make a language simpler by reducing the number of keywords and special symbols, then you will force programs to be longer, harder to understand, or both. In the limit of the shortest program possible, the program itself appears to be a random series of bits, every one of which is significant. If there is any pattern or bias in the bits, then it is not the shortest possible program.
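To make the "shortest program looks random" point concrete, here is a toy Python illustration using zlib compression as a crude stand-in for program length (my analogy only; real Kolmogorov complexity is uncomputable, and compression merely gives an upper bound):

```python
# Redundant text compresses well; the compressed output looks random
# and does not compress again. zlib is a stand-in for "shortest
# description" here, nothing more.
import zlib

redundant = b"the quick brown fox jumps over the lazy dog " * 100
once = zlib.compress(redundant, 9)
twice = zlib.compress(once, 9)

print(len(redundant), len(once), len(twice))
# Typical result: the first pass shrinks the input dramatically,
# the second pass saves nothing (it may even grow slightly).
```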
OTOH, some higher-level languages such as R or MATLAB (Octave) do make it easier to express many algorithms. This is mostly because they have vector and matrix data types. Their forerunner in many ways was APL, which has a fairly high information density, partly because it uses a wider range of characters than are available in ASCII. Perhaps you should learn R or MATLAB or maybe even Mathematica. These languages give you a high-level means of expressing algorithms in a way that computers can understand.
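If you'd rather stay in a general-purpose language, the same vector-and-matrix style is available in Python via NumPy. A small sketch of what I mean (my own example, not anything from those languages' documentation):

```python
# The kind of one-liner that R/MATLAB/APL-style vector thinking buys you:
# fitting a least-squares line is a single expression instead of an
# explicit loop over accumulators.
import numpy as np

x = np.linspace(0, 10, 50)
y = 3.0 * x + 1.0 + np.random.normal(0, 0.5, 50)   # noisy line

# Solve [x 1] @ [slope, intercept] ~= y in one vectorized call.
A = np.column_stack([x, np.ones_like(x)])
slope, intercept = np.linalg.lstsq(A, y, rcond=None)[0]

print(f"slope ~ {slope:.2f}, intercept ~ {intercept:.2f}")
```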
The summary reminds me of the lollipop Perlisism:
When someone says "I want a programming language in which I need only say what I wish done," give him a lollipop.
Interesting how 90% of all FB-haters are ACs
That is probably associated with the fact that 25% of them are forced to use it:
Right now, we're directing 25 percent of non-logged-in users to the beta; it's a significant number, but it's the best way for us to test drive this new design.
Given how obviously horrible the beta design is, this sentiment on the best way to test drive it is quite galling. OTOH, I am thankful they are not responsible for designing airplanes or automobiles:
Yes, for the first test-flight we have filled 25% of the new plane with unsuspecting passengers but this is the best way for us to test drive our new design.
It has occurred to me that maybe they are trying to alienate the Slashdot community.
When a nation has conflicting laws, it tends to provide cover for illegal activities, since any court can choose to take the view that supports the government.
Welcome to the new USA-beta!
We've had only a few major redesigns since 1776; we think it's time for another. But we really do take to heart the comments you've made about the look and functionality of the beta government that will control our country's future.
Excellent comment. OTOH, my cynical side is suspicious of how tone-deaf the site owners seem to be. It makes me wonder if the following item was on an NSA todo list somewhere:
Destroy Slashdot. After those damned Snowden leaks the Slashdot community seems to be united against us. As long as they were divided and bickering, they were not a threat.
They don't delete externally collected data. They obviously delete or age off internal records.
Prime Minister John Key, who is in charge of the GCSB, said:
This is a spy agency. We don't delete things. We archive them.
Key's office confirmed that Key was talking about the video that his lawyers had claimed was deleted.
Isn't claiming that Dotcom was illegally spied upon putting the cart before the horse here? Where is the evidence? Regardless of whether it was deleted or not, making the statement assumes the conclusion and puts one's own credibility at risk.
Here are some links from the fine article showing that the government and the police have already admitted malfeasance:
Oddly enough, you are correct that these admissions of malfeasance do put the credibility of the police and the Prime Minister at risk, although that is probably not what you meant.
And in conclusion: FUCK BETA!
The fine article claims:
Most physicists fully expect a useful quantum computer to eventually emerge, [...]
I am a physicist, and I don't think a useful quantum computer will ever emerge. The problem is very simple. In order for a quantum system to calculate exponentially faster than a classical system, it must contain exponentially more useful information, which makes it exponentially more sensitive to noise. An early computer researcher (perhaps John von Neumann) used a similar argument to conclude that digital computers would eventually supersede analog computers: the precision of analog computers is limited by the noise floor, which is very hard to beat back, while digital systems can be made arbitrarily more precise by simply adding more circuits (or more time).
In simple terms, for every extra decimal digit you want to add to the size of a number you can factor with a quantum computer, you need to reduce the effect of noise by roughly a factor of 10. I don't think this is greatly different from the limitation of classical computers, where for every decimal digit you add to the size of the numbers you want to factor, you must multiply the time/size of the computation by roughly a factor of 10.
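Taking my own rough factors of 10 at face value, here is a toy sketch of the two scalings side by side (order-of-magnitude hand-waving, not a real resource estimate):

```python
# Toy illustration of the scaling argument above. Both factors of 10
# per digit are the rough claims from the comment, not measured constants.
for digits in (10, 20, 30):
    noise_needed = 10.0 ** -digits    # quantum: noise floor must shrink ~10x per digit
    classical_cost = 10.0 ** digits   # classical: time/size grows ~10x per digit
    print(f"{digits} digits: noise ~ {noise_needed:.0e}, "
          f"classical cost ~ {classical_cost:.0e}")
```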
Despite this reservation, I think we should continue funding research in quantum computing.
so you never wear a safety harness, because your freedom is more important than your safety?
Saying freedom is more important than safety does not imply that safety is unimportant, just like saying Kareem Abdul-Jabbar is taller than Shaquille O'Neal does not imply that Shaq is not tall.
The point to note here is that at equilibrium (which must occur), flux in = flux out. That means that under no circumstances will the temperature ever exceed the input.
Again, you are confusing heat and temperature. The input is energy (heat); it is not a temperature. No one who has grasped basic thermodynamics would take your argument seriously after that fundamental mistake. You seem to be stringing together scientific jargon in a nonsensical way to reach a conclusion you like.
In a different post you claim that convection plays a significant role in the heat loss of the Earth. The upper atmosphere is close to being a vacuum. At a high enough altitude the amount of heat transfer due to conduction and convection is negligible. Do yourself a favor and Google(thermosphere).
You claimed the fine Nature article was wrong because it was based on radiative forcing yet you have never defined what you mean by that term. Your definition seems to be at odds with the definition given by the Wikipedia. The term was never used in the article nor was it used in the two references you gave to back up your claim that radiative forcing had been debunked.
Again, I ask, in the Earth-Sun system what is the "input" temperature if it is not the temperature of the surface of the Sun?
And BTW I do have a Ph.D. in physics. A Nobel Laureate was the chairman of my thesis defense, and I've studied thermodynamics with some of the leading experts in the world.
And [At?] equilibrium, the "limiting temperature", assuming a black body, is the temperature that corresponds to radiating the same amount of energy as the input. Anything else is nonsensical.
0) Requiring your arguments to be in accord with basic physics is not nit-picking.
1) The Earth is not in thermal equilibrium with the Sun. If it were, it would be at the same temperature as the surface of the Sun. Life can exist on Earth only because of the gradient caused by the Earth not being in thermal equilibrium with the Sun.
2) A black body is not the same thing as a perfect insulator. They are opposites in a way. A perfect insulator would block all radiative cooling (or else it would not be a very good insulator). My point is that the limiting temperature is a function of the insulating properties of the Earth. It is not an intrinsic property of the strength of solar heating.
If you treat the Earth as a black body you are explicitly ignoring all insulation effects. IOW you are ignoring all greenhouse effects. In simple layman's terms, how hot something gets when it is left out in the sun depends greatly on how well it is insulated. Even the temperature inside a conventional greenhouse is highly dependent on how well it is insulated.
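To put numbers on that, here is the standard textbook black-body calculation, sketched in Python (the solar constant and albedo are the usual textbook values):

```python
# Zero-insulation benchmark: treat the Earth as a black body (gray body
# with albedo) and solve sigma * T^4 = absorbed flux for T.
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S = 1361.0         # solar constant at Earth, W m^-2
ALBEDO = 0.30      # fraction of sunlight reflected back to space

absorbed = S * (1 - ALBEDO) / 4       # averaged over the whole sphere
T_bare = (absorbed / SIGMA) ** 0.25

print(f"black-body equilibrium: {T_bare:.0f} K")   # ~255 K
print("observed mean surface:   ~288 K")
# The ~33 K gap is exactly the "insulation" (greenhouse) effect that
# the bare black-body calculation throws away.
```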
3) When you say a black body in equilibrium radiates the same amount of energy it absorbs, you seem to be repeating the definition of radiative forcing, not debunking it.
If you believe there is a limiting temperature to the strength of solar heating that is much less than the temperature of the surface of the Sun, please tell us what that temperature limit is.
Neither of the fine articles linked to in the summary nor either of your two references even mentions radiative forcing. If you have sources that don't conflict with basic physics which debunk whatever it is you mean by radiative forcing, I would like to see them. Perhaps part of the problem is that your definition of radiative forcing differs from the definition given by the Wikipedia. So far you have given nothing more than your opinion that the authors of the Nature article made a serious (and probably job-threatening) mistake.
You've picked your nit accurately and with great force.
The Slashdotter Jane Q. Public had repeatedly claimed the Nature article was bunkum because it was based on the concept of radiative forcing. For example:
I should also point out that the entire concept of "radiative forcing" this is based on was refuted a few years ago, and so far that refutation has not been successfully challenged.
To me, it would be rather earth-shattering news if a Nature article was based on a theory that was debunked five years ago. I looked up radiative forcing to try to find out what JQP was talking about. JQP was kind enough to supply references for the so-called refutation, which should have made my task easier. The references were utter nonsense that defied basic physics with silly hand-waving arguments.
Since JQP's erroneous comments were not moderated into oblivion, correcting the spread of grossly unscientific misinformation that cast aspersions on the fine Nature article is about as far from nit-picking as one can get.
the temperature of the box will rise without limit
wow. Total Science Fail
Given the assumptions of a perfect insulator and constant energy input, what is the limiting temperature? What happens to energy conservation when that temperature limit is reached?
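Spelled out with numbers, a minimal sketch of the thought experiment (the 1 kg of water, 100 W input, and constant heat capacity are my illustrative assumptions):

```python
# Perfect insulation means no heat leaves, so all input power goes into
# internal energy and the temperature climbs linearly forever
# (assuming a constant heat capacity).
P = 100.0    # constant input power, W
C = 4184.0   # heat capacity of 1 kg of water, J/K
T0 = 293.0   # starting temperature, K

for hours in (1, 10, 100, 1000):
    t = hours * 3600.0
    print(f"after {hours:5d} h: T = {T0 + P * t / C:10.0f} K")
# No equilibrium is ever reached: dT/dt = P/C > 0 at every temperature,
# because by assumption nothing radiates or conducts back out.
```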
As I said before, there are, of course, limits due to imperfect insulation and the finite temperature of the surface of the Sun but these limits are far above the temperatures reached in the upper atmosphere.
What is the limiting temperature of the strength of solar heating?
What is the limiting temperature of a perfectly insulated box with a constant input of energy?