
Comment Intriguing (Score 1) 216

But ultimately nothing to see here.

No wrongdoing was acknowledged and it was settled, so there's no case law involved. The police have long since become a corporate entity rather than a public service, so this is just marked up as cost of doing business. The police have been sued before for precisely this kind of retaliation and it hasn't made any difference yet.

No, if you want law enforcement to change, you have to eliminate the market economy within it. You get nothing for issuing fines, you get no rewards for arrests made or cases closed. You should get penalties for things undone, but nothing for doing what you should be doing to begin with.

You also have to demilitarize it. Guns should be limited or eliminated. Using fear and intimidation to control should be banned entirely. The use of violence of any kind should be limited or eliminated - there is almost never any need and if you're a cop, you have no business claiming you were in fear. Cops are paid to go into danger. If you're so wimpy you have to go in guns blazing, you're not a cop, you're a wimp with a badge.

I've no interest in people telling me I've not been there. It's been a long time since I was paid to walk into the lion's den, but when I was, I went. Sane, rational and sober. Which, apparently, US police aren't capable of being.

If you're not cut out to face danger and the possibility of death at any time, don't even bother going to a motor race.

Comment If you are concerned by this at all... (Score 5, Interesting) 56

...why?

Your outermost gateway should be a simple NAT/port-forwarder/load balancer plus a honeypot server. Web traffic goes to the front-end servers; everything else goes to the honeypot. There should be no live DNS: computers don't need readable names, string handling is where mistakes are often made, and replying to an IP doesn't require name resolution. The NAT/load balancing at this level should be per-inbound-packet, not per-session or per-time-interval. That way, attacks on server resources (if they get through at all) are divided evenly across your cluster, which buys the machines time to detect and counter the problem.
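The per-inbound-packet round robin described above can be sketched in a few lines. This is a toy model, not a real NAT (the `Server` class and packet dicts are invented for illustration); the point is that with no session affinity at this layer, a flood is split evenly across the cluster:

```python
# Sketch of per-packet round-robin dispatch: each inbound packet goes to
# the next server in the ring, so resource attacks are spread evenly
# rather than pinned to one session's server.
from itertools import cycle

class Server:
    def __init__(self, name):
        self.name = name
        self.packets = []

    def handle(self, packet):
        self.packets.append(packet)

def dispatch(packets, servers):
    """Per-packet round robin: no session affinity at this layer."""
    ring = cycle(servers)
    for pkt in packets:
        next(ring).handle(pkt)

servers = [Server(f"fe{i}") for i in range(4)]
dispatch([{"id": n} for n in range(100)], servers)
# Load is split evenly: 25 packets per front-end.
print([len(s.packets) for s in servers])
```

A real implementation would do this in the kernel or on dedicated hardware, but the dispersal property is the same.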

Your front-end servers should be little more than static content delivery systems, proxying everything else through your outer defences. OpenBSD is ideal for this: fast, simple, bullet-proof. Middle-level defences should be a very basic firewall (maximum stability, maximum throughput) and an active NIDS running in parallel (so as not to slow down traffic).

Inside that, you have at least two load balancers, one on hot standby, farming dynamic requests out to the mainline servers. Mainline servers hold no static content, only dynamic content. If dynamic content changes slowly (e.g. the BBC), put a cache server in front of the actual content server. There's no point regenerating unchanged content.
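The "no point regenerating unchanged content" step amounts to a time-to-live cache in front of the page generator. A minimal sketch, with `render_page` and the 60-second TTL as assumed stand-ins for a real cache server and its policy:

```python
import time

class TTLCache:
    """Toy stand-in for a cache server: regenerate a page only after its
    time-to-live expires, otherwise serve the stored copy."""
    def __init__(self, generate, ttl_seconds=60):
        self.generate = generate
        self.ttl = ttl_seconds
        self.store = {}  # key -> (expiry_time, content)

    def get(self, key):
        now = time.monotonic()
        hit = self.store.get(key)
        if hit and hit[0] > now:
            return hit[1]                 # fresh copy: no regeneration
        content = self.generate(key)      # slow path: rebuild the page
        self.store[key] = (now + self.ttl, content)
        return content

calls = []
def render_page(key):                     # hypothetical slow generator
    calls.append(key)
    return f"<html>{key}</html>"

cache = TTLCache(render_page, ttl_seconds=60)
for _ in range(5):
    cache.get("/news")
# Only the first request regenerated the page.
print(len(calls))
```

Production cache servers (Varnish, squid and the like) add invalidation and memory management, but the hit/miss logic is exactly this.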

Content servers send requests through another firewall (it can also be simple) to your database servers. Unrelated data should live on distinct servers, for both security and seek time. Since the content servers are read-only, they need only hit database cache servers, with the actual databases behind those. If you absolutely must have FQDNs, zone-transfer the critical records and bounce all other DNS requests via the internal network to the regular DNS source. That way, your at-risk gateway doesn't contain stupid holes in the wall.

The internal corporate network would have a firewall and switch linking up to the content servers and cache servers, then a different firewall to the database servers. These would be heavier-duty firewalls as the traffic is more complex. Logins of any kind should be permitted only over an IPSec tunnel. All unused ports should be closed.

For the outermost systems, logins should be by IPSec only from a cache server. (Content servers have three Ethernet connections, none going to the firewall.)

This arrangement will take punishment. The arrangements where everything (database included) is in the DMZ with no shielding against coding errors, THOSE are the ones that fall over when people sneeze.

Ok, so my topology would cost a few thousand more. To Amazon, the BBC, any of the online banks, any of the online nuclear power stations - a few thousand might be spent on an executive lunch, but considerably more than a few thousand would certainly be spent and/or lost in a disaster. My layout gives security and performance, though the better corporate giants might be able to do better in both departments.

Doesn't matter if they can. What matters is that nobody at that level should be less secure than this. This is your minimal standard.

Comment Relative to Earth (Score 2) 147

The diameter is 2.3x. The mass is around 20x. The density is about 1.5x. The length of year is a shade under 1/3. The surface temperature is estimated at 10x. The gravity is around 4x. The magnetic field at Earth's current age was probably 3.375x. Tea time is a universal constant.
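The derived figures hang together: taking only the quoted mass (20x) and diameter (2.3x) ratios and assuming a roughly uniform spherical planet, density scales as M/R³ and surface gravity as M/R², which lands close to the density and gravity figures above.

```python
# Derive density and surface-gravity ratios (relative to Earth) from the
# quoted mass and diameter ratios, assuming a uniform spherical planet.
mass_ratio = 20.0      # quoted above
diameter_ratio = 2.3   # quoted above

density_ratio = mass_ratio / diameter_ratio ** 3   # density ~ M / R^3
gravity_ratio = mass_ratio / diameter_ratio ** 2   # surface g ~ M / R^2

print(round(density_ratio, 2))  # ~1.64, in line with the "about 1.5x"
print(round(gravity_ratio, 2))  # ~3.78, in line with the "around 4x"
```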

Comment Nothing to be alarmed about. (Score 1) 147

At 11 billion years of age, it clearly hosts one of the oldest civilizations in the universe. At an apparent mass of 20x Earth, which is quite impossible for a planet of this vintage, it is clearly a Dyson sphere built round a black hole constructed by the stellar engineer Omega as a power source for Rassilon's space-time capsules.

The reason it is in Draco is that it was shunted from its original universe into ours during the Third Time War.

Comment Short answer: (Score 2) 158

Ha! Fooled you! I never post short answers!

Seriously, I have used IT skills in archaeology. You are basically examining a system where some components are black-box and some are white-box, where you have fragments of state information at given points in time, a library of studies into systems containing similar components, and another library of studies into system dynamics.

Archaeologists trained only in archaeology have only recently started to grasp the importance of systems analysis and reverse engineering. They are still not too clued-up on how to perform rigorous testing of black-box environments, which is why most of them view the subject as a pure humanity and haven't quite figured out that pure humanities don't actually exist.

They are also not very good at understanding how to store, retrieve or correctly associate vast amounts of information. A rather essential skill, one might think, when you can be gathering hundreds - sometimes thousands - of fragments in a relatively small area. It's why reassembled objects tend to be rare, even though pieces that fit together are a lot more common. The data is incompletely collected or never examined for patterns.

I do not recommend barging in and telling them how to do their job. Even though sometimes I wish someone would. Not Invented Here Syndrome and the usual evil of Office Politics applies just as much to the Mediocre Outdoors as to the Even More Mediocre Indoors.

On the other hand, applying the skills, making the necessary observations, making the necessary records, installing a database with just a tad more oomph than Microsoft Access (though leave the basic card entry screen) - that will help you not miss the blindingly obvious.
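As a concrete sketch of "a tad more oomph than Access": a minimal find-recording schema in SQLite (bundled with Python), which lets you ask pattern questions a card index can't answer quickly. The table and column names, and the refit query, are invented for illustration, not any standard archaeological schema:

```python
import sqlite3

# In-memory database for illustration; a real dig would use a file.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE context (
    id      INTEGER PRIMARY KEY,
    trench  TEXT NOT NULL,
    layer   TEXT NOT NULL
);
CREATE TABLE fragment (
    id         INTEGER PRIMARY KEY,
    context_id INTEGER NOT NULL REFERENCES context(id),
    material   TEXT NOT NULL,   -- e.g. 'ceramic', 'bone'
    notes      TEXT
);
CREATE INDEX idx_fragment_material ON fragment(material);
""")

db.execute("INSERT INTO context VALUES (1, 'T3', 'II')")
db.executemany(
    "INSERT INTO fragment (context_id, material, notes) VALUES (?, ?, ?)",
    [(1, "ceramic", "rim sherd"), (1, "ceramic", "base sherd"),
     (1, "bone", None)],
)

# A pattern question: which contexts hold multiple ceramic pieces that
# might refit into one object?
rows = db.execute("""
    SELECT c.trench, c.layer, COUNT(*) AS pieces
    FROM fragment f JOIN context c ON c.id = f.context_id
    WHERE f.material = 'ceramic'
    GROUP BY c.id HAVING pieces >= 2
""").fetchall()
print(rows)
```

Keep the basic card-entry screen on top of it, as suggested; the win is in the queries, not the data entry.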

Hardware Engineer? Pffft! It is not that complex to convert the Open Source hardware spectrometer into an Open Source hardware thermoluminescence ceramic dating device. Might not be as good as the high-end commercial rigs, but high-end commercial rigs are very expensive to buy time on and archaeologists don't have the cash to even afford a decent hat and bull whip any more. But if you can, through decent approximation, show that there's something interesting going on, cash will materialize.

Please bear in mind, though, that although it's not complex to do the conversion, it's not hard to screw it up either. Do test things and do use a better camera than the one the prefab kit comes with.

Comment Re:Hmmm. (Score 1) 82

The calculation was done by NASA and published in a peer-reviewed paper in New Scientist in 1988, I think. As best as I can recall, the solar sail was assumed to also have an initial mass of 5 kg and to gain mass at a constant rate (since the remnants of the accretion disk should be thinner the further out you go, but you travel through more of it per unit time). I forget what the rate was. As I recall, the paper noted that there would be extreme difficulty in having a sail of such a size that was structurally capable of withstanding impacts at the velocities involved within the permissible mass.

Comment Re:Hmmm. (Score 1) 82

Space is filled with dust, so yes, as you travel away from the sun, as the solar sail cools (since less heat is reaching it, inverse square law) it does indeed get heavier. It also gets heavier as it accelerates, due to relativity. It would be interesting to determine what the precise function is. The density of space dust is given in Carl Sagan's book, Cosmos, that was a companion to the series.
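The relativistic part of "the precise function" is at least easy to state: momentum grows by the Lorentz factor γ = 1/√(1 − v²/c²), so the effective inertia climbs steeply only near light speed. A quick numeric look (the sample velocities are mine, chosen for illustration):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def lorentz_factor(v):
    """gamma = 1 / sqrt(1 - v^2/c^2); relativistic momentum is gamma*m*v."""
    return 1.0 / math.sqrt(1.0 - (v / C) ** 2)

# At modest fractions of c the effect is tiny; it only bites near c.
for frac in (0.01, 0.1, 0.5, 0.9):
    print(f"{frac:>4}c  gamma = {lorentz_factor(frac * C):.4f}")
```

At the 25% of light speed mentioned below, γ is only about 1.03, so dust accumulation, not relativity, would dominate the mass growth for most of such a trip.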

Comment Hmmm. (Score 4, Informative) 82

There are two sorts of solar sail, those that work off photons (and, no, you don't need a mirror, since you can't afford the extra mass) and those that work off ionized particles being emitted from the sun. Ionized particles have much more momentum and are generally considered superior.

A solar sail 50 km in diameter, attached to a 5 kg probe, would accelerate that probe to 25% of light speed by the time it reached the edge of the solar system.

If you built a car whose headlights could accelerate the car in reverse with photonic pressure, the headlights would vaporize a considerable chunk of the planet in front of you. You can do the calculation yourself. The equations are at http://www.physicsforums.com/s...
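Doing that calculation roughly: light emitted from a source pushes back with thrust F = P/c, so the headlight power needed is P = m·a·c. The car mass (1500 kg) and acceleration (a modest 1 m/s²) are my assumptions, not figures from the linked thread:

```python
# Power needed for photon-rocket headlights: thrust from emitted light is
# F = P / c, so P = m * a * c.
C = 3.0e8      # speed of light, m/s (rounded)
mass = 1500.0  # assumed car mass, kg
accel = 1.0    # assumed acceleration, m/s^2

power_watts = mass * accel * C
print(f"{power_watts:.1e} W")  # 4.5e+11 W, ~450 GW of headlight
```

450 GW is on the order of hundreds of large power stations' output, all delivered as light onto whatever is in front of the car, hence the vaporized chunk of planet.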

Comment Some problems (Score 1, Interesting) 126

First, data doesn't get the same protections as voice. Not that voice gets much protection as it is.

Second, carriers have said they will throttle data connections. This has serious implications for digital because it means carrier-to-carrier connections will (not may, will) be of inferior quality.

Third, I would believe digital was going to deliver, except that nobody uses much in the way of error-correction, the speakers and microphones are deteriorating in quality and reliability is naff.

Lastly, phone companies always promise jam tomorrow, or an increase in chocolate rations, but the reality is very different. I get a (billed) month of no service because of an upgrade to 4G. I can't remember the last free phone upgrade offered. My smartphone reboots itself regularly for no obvious reason. I used to be able to run a phone on full batteries for 2 days without a recharge. (Yes, phones "do more", but I don't bloody well want most of the more and the bits I do want aren't any bloody good! That is NOT a good exchange for 1/12th the uptime and nobody sells low-consumption phones any more.)

Comment Re:The Spruce Goose (Score 1) 209

The de Havilland Mosquito had an operational range of 2000 miles fully loaded and 3000 miles if you used wing fuel tanks. Since the bomb bay could take a load equal to the wings, as a pure "night fighter", it seems plausible that an extra fuel tank in the bomb bay could have given you a range of 4000 miles.

Obviously, you don't tend to have night fighters cross the Atlantic or Pacific much, but in 1941 there would have been nothing remotely its equal on any patrol.

Although my idea of extra fuel in the bomb bay was never used (as far as I know), the aircraft was without any serious rival for at least three years and remained in use for patrol purposes for another couple of decades.

The sad part of the story is that most were destroyed for target practice after the war and only one Mosquito remains flying (and it's a rebuild that mixes several previously crashed airframes).

Comment Underlying assumptions are false (Score 1) 235

Ok, the envelope game. You can rework it so the second envelope contains the next vulnerability in the queue of vulnerabilities. An empty queue is just as valid as a non-empty one, so if there are no further flaws, the envelope is empty; that way, all states are handled identically. What you REALLY want to do, though, is add a third envelope - also the next item in a queue, this one from QA. You do NOT know which envelope contains the most valuable prize, but unless two bugs are found simultaneously (in which case you have bigger problems than game theory), you absolutely know two of the envelopes contain nothing remotely as valuable as the third. If no bugs are known at the time, or no more exist - essentially the same thing, since you can't prove completeness and correctness at the same time - then the thousand dollars is the valuable one.

Monty Hall knows what is in two of the envelopes, but not what is in the third. Assuming simultaneous bug finds can be ignored, he can guess. Whichever envelope you choose, he will pick the least valuable envelope and show you that it is empty. Should you stick with your original choice or switch envelopes?
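For calibration, the classic Monty Hall game (host knows all three doors and always opens an empty one) is easy to simulate, and there switching wins about two thirds of the time. Whether the envelope variant above matches that baseline is exactly the question being posed:

```python
import random

def play(switch, trials=100_000, rng=random.Random(42)):
    """Classic Monty Hall: one prize among three doors, host opens a
    known-empty door, player sticks or switches. Returns win fraction."""
    wins = 0
    for _ in range(trials):
        prize = rng.randrange(3)
        choice = rng.randrange(3)
        # Host opens a door that is neither the prize nor the choice.
        opened = next(d for d in (0, 1, 2) if d != prize and d != choice)
        if switch:
            choice = next(d for d in (0, 1, 2)
                          if d != choice and d != opened)
        wins += (choice == prize)
    return wins / trials

stick, swap = play(switch=False), play(switch=True)
print(f"stick: {stick:.3f}  switch: {swap:.3f}")  # ~0.333 vs ~0.667
```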

Clearly, this outcome will differ from the scenario in the original field manual. Unless you understand why it is different in outcome, you cannot evaluate a bounty program.

Now, onto the automotive software example. Let us say that locating bugs takes constant time for the same effort. Sending the software architect on a one-way trip to Siberia is definitely step one. Proper encapsulation and modularization are utterly fundamental, and constant time means the First Law of Coding has been broken - a worse misdeed than breaking the First Law of Time and the First Law of Robotics on a first date. You simply can't produce that many similar bugs any other way.

It also means the architect broke the Second Law of Coding - ringfence vulnerable code and validate all inputs to it. By specifically isolating dangerous code in this way, a method widely used, you make misbehaviour essentially impossible. The dodgy code may be there but it can't get data outside the range for which it is safe.
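The Second Law here - ringfence and validate - can be sketched as a guard wrapper: the only public route into the dangerous routine checks every input against its safe range first. The `_unsafe_parse` routine and its limits are invented for illustration:

```python
class ValidationError(ValueError):
    pass

def _unsafe_parse(payload: bytes) -> int:
    # Stand-in for ringfenced legacy code that is only safe for short,
    # ASCII-digit payloads. Nothing else may call it directly.
    return int(payload)

MAX_LEN = 16

def parse(payload: bytes) -> int:
    """The only public entry point: validate, then call the dodgy code."""
    if not isinstance(payload, bytes):
        raise ValidationError("payload must be bytes")
    if not 0 < len(payload) <= MAX_LEN:
        raise ValidationError("payload length out of range")
    if not payload.isdigit():
        raise ValidationError("payload must be ASCII digits only")
    return _unsafe_parse(payload)

print(parse(b"1234"))       # 1234
try:
    parse(b"12; rm -rf /")  # rejected before the dodgy code sees it
except ValidationError as e:
    print("rejected:", e)
```

The dodgy code may still be there, but it can never receive data outside the range for which it is safe, which is the whole point of the law.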

Finally, it means the programmers failed to read the CERT Secure Coding guidelines, failed to test (unit and integrated!) correctly, likely didn't bother with static checkers, failed to enable compiler warning flags and basically failed to think. Thoughtlessness qualifies them for the Pitcairn Islands. One way.

With the Pitcairns now overrun by unemployed automotive software engineers, society there will collapse and Thunderdome v1.0a will be built! With a patchset to be released, fixing bugs in harnesses and weapons, in coming months.

Comment Wrong question (Score 1) 307

Google up on articles on the Lazarus Doctor (he works on patients who have nominally died of hypothermia) and on the new experimental saline blood substitute for potentially fatal injuries (the paramedics swap the patient's blood for the solution, deep-freeze the patient and reverse the process at hospital, eliminating all stress and trauma to the body in transit).

How long suspended animation could theoretically be maintained in real life is unknown, but estimates run to many months.

The practical duration is only a few hours, so far.

The cost of improving on the practical duration (since the former method is really only limited by how long you can artificially keep O2 levels in the brain over 45%) is far, far less than the cost of a mission to Mars. Ergo, that is the logical solution. Fund medical research into the two methods. Put 100% of NASA funding for a manned Mars mission into those two techniques for at least the next couple of years.

That should accelerate development of the necessary technologies. By doing it this way, you need absolutely bugger all new rocketry technology. The N months food needed for the journey by live astronauts can be replaced with radiation shielding of the same total mass.

This leaves you with radiation on Mars. But only if you land on the surface. What you want to do is land in a deep narrow gorge or chasm. There are some, that is where the methane was reported. That increases the thickness of atmosphere, which is good for radiation. It is unexplored, which is even better. There is a chance of a cave network, absolutely ideal for looking for water, life and/or a good location for settlers.

Oh, and doing things this way improves life on Earth, the very thing all the anti-space people demand NASA prove they can do.

Everyone's happy, apart from, well, everyone. NASA doing a better job of health than the NIH will upset people. A workable mission will upset futurologists because the future will be done rather than talked about, putting them out of a job. Eliminating the radiation problem will infuriate the buggers who say the mission can't be done. Eliminating any issues with transit time means you can launch the mission the day after the medical stuff is sorted, leaving those talking about a 2030-2050 timeframe looking as stupid as they really are.

So, yeah, it'll get the job done, but expect those involved in a mission to be lynched by a mob of respectable plutocrats.
