Comment Re: Kinetic Kill Vehicle (Score 1) 139

A dark satellite made from an ultrablack material or using a stealthy topology simply isn't going to be seen. By anyone.

Ion engines are slow, but they don't give off any tell-tale glare.

The southern hemisphere has very little in the way of monitoring - a satellite traversing any great circle other than equatorial will be difficult-to-impossible to track.

This is not a likely threat; on a scale from one to ten, the seriousness is sqrt(-1). Nonetheless, it's not zero.

Comment Re: are you sure? (Score 2) 139

That may be true in theory, but Iran succeeded in hijacking a US drone via a GPS attack. Thus, whatever authentication exists is not actually in use. The US, for reasons known only to itself, hates encryption. Any encryption. By anyone. Including itself. For much of the war in Afghanistan, drone camera signals were unencrypted and omnidirectional, leading to intercepted video footage circulating. Slashdot covered the issue in the early days of the war.

If the US military are too stupid to encrypt drone GPS systems and drone video feeds in an open war, they can't be trusted to do anything right.

Comment Re: Err - no. (Score 2) 139

The FBI wants to ban private encryption, essentially banning eCommerce, eBanking, UNIX, foreign languages, medical implants, Boolean operators...

The mere fact that the director could state this in public and not be fired by the time he'd finished speaking is all the proof you need that Americans - and indeed any post-Babbage civilizations - are expendable in the eyes of the civil (uncouth?) service.

Which should be no surprise. The difference in social influences, culture and thus attitude between the paranoid schizophrenic survivalists and the paranoid schizophrenic security agency staff is pretty much nil.

Comment Re: Err - no. (Score 2) 139

And the government always obeys the law? Further, if the facility exists, anyone can turn the jitter back on. It's no different from what we've been saying about backdoors - once they exist, anyone can use them. There are also risks of social-engineering attacks against those running satellites. And, since no software is perfect (and no radiation proofing is perfect), the satellites may spontaneously add jitter, enable encryption (with a gibberish key), or simply activate their steering jets, putting them on an incorrect and/or elliptical orbit, screwing up calculations (i.e., physical jitter).

This is ignoring the solar storm/jamming/gamma ray burst/collision with space junk range of issues, as they're discussed elsewhere and aren't really pertinent to the jitter issue.

Comment Re:Brutally sad day (Score 1) 445

One of the common problems with any aircraft design is that you can't have backups for everything. There simply isn't the capacity unless you double the size of the aircraft, which eliminates all of the benefits of having a backup engine (perhaps the most critical system to have a backup of) in the first place. Thus, some level of failure is inevitable.

(Even if you have backups, that won't necessarily save your skin. The DH98 Mosquito could fly perfectly fine on one engine, but crashes from engine failure still happened. The Space Shuttle, on at least one occasion, lost two or more of the five onboard computers. There's a limit to what you can do in these sorts of cases.)

All flight is, inherently, dangerous. That's the nature of the beast. You can improve safety, which is always a good thing to do, but improvements will be asymptotic to a value below perfectly safe. How much below is unclear; I don't think anyone has really done that calculation. Nonetheless, whatever it is, there are declining returns after a given point. Commercial manufacturers tend to put a ballpark figure on what's an acceptable number of deaths per thousand (miles|hours) of flight and will invest to around that level of safety. Understandably so: more than that gets very expensive very quickly but won't affect sales, aircraft usage or aircraft reputation.

Now, high atmospheric/suborbital/orbital/space travel is a great deal worse. Engines have to cope with vastly higher pressures, which means that much smaller defects can be disastrous. You've far worse radiation to contend with, so control circuits have to be better screened and radiation-hardened. They also have to cope with far greater G forces, vibrations from hell, variations in temperature that they're not going to like, and (since atmospherics can be nasty) survive (without producing erroneous signals) plasmas and electrical discharges that aren't always predictable and not always that well understood.

In this particular case, it looks from the amateur footage that claims to be of the accident (you can never be sure) that the engine ruptured. The engine, as I understand it, was a new type. Probably smarter to do the first flight unmanned for that, but that's easy to say now. My guess would be that the engine casing had not been properly made and failed. Not enough to total the aircraft at high altitude, but enough to make a complete mess of things. Again, it's only a guess, but that sounds like the engine wasn't yet at full power. If it had been, I doubt there'd have been anything large enough for the video cameras to film.

Engine casings are tough to make flawlessly. You can do limited testing with ultrasonics and assorted remote sensors, and those'll find a lot of flaws, but the only known way to test if an engine is working perfectly is to fire it up to maximum power and hold it there until the fuel runs out or it explodes. If it's still intact, it was fine. It probably isn't now, though.

Comment Re: Well, no kidding (Score 1) 106

I absolutely agree.

The solution necessarily involves three mechanisms:

* Determining what is present
* Fetching what is absent
* Isolating everything that is build-specific

The "tradition" established by CPAN, PyPI and CTAN is that source packages should specify dependencies - not only the software name, but the range of versions permitted. Archives should then permit requests for specific versions.
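A minimal sketch of that version-range idea, in Python. The dependency format and the installed-version table here are illustrative assumptions, not any archive's real metadata:

```python
# Sketch: checking declared dependency version ranges against what is
# installed. Names, ranges and the installed table are hypothetical.

def parse_version(v):
    """Turn '1.12.3' into a comparable tuple (1, 12, 3)."""
    return tuple(int(part) for part in v.split("."))

def satisfies(installed, minimum=None, maximum=None):
    """True if the installed version falls inside the permitted range."""
    iv = parse_version(installed)
    if minimum is not None and iv < parse_version(minimum):
        return False
    if maximum is not None and iv > parse_version(maximum):
        return False
    return True

# Declared dependencies: name -> (minimum version, maximum version).
dependencies = {
    "libfoo": ("1.2", "1.9"),
    "libbar": ("2.0", None),   # no upper bound
}

installed = {"libfoo": "1.4.1", "libbar": "1.8"}

for name, (lo, hi) in dependencies.items():
    have = installed.get(name)
    if have is None:
        print(f"{name}: absent, must be fetched")
    elif satisfies(have, lo, hi):
        print(f"{name}: {have} is acceptable")
    else:
        print(f"{name}: {have} is outside the permitted range")
```

Real archives use richer version grammars than dotted integers, of course, but the principle is the same: the package declares the range, the archive serves the specific version.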

Isolation (such as by root jailing) deals with file path issues, software interactions, etc. All the build system should be able to see is the software the build system needs. Nothing more.

Determining what is present is more complex. The software must see what options exist that are compatible and then, through build flags, defaults and user queries, determine which of those options to actually use. If you're using Windows and have GnuWin32, Cygwin and MinGW installed, it's no good just asking if GNU build tools are to be used.
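That flag-then-default-then-ask sequence can be sketched as follows. The candidate tool names and the probe order are illustrative assumptions:

```python
# Sketch: deciding which of several compatible toolchains to use.
# An explicit flag wins; otherwise a default; otherwise the build
# system must ask rather than guess, because "are GNU tools
# installed?" has three different true answers on the Windows box
# described above.

import shutil

CANDIDATES = ["gcc", "clang", "cl"]  # hypothetical candidate compilers

def available_toolchains():
    """Return the candidate compilers actually present on PATH."""
    return [tool for tool in CANDIDATES if shutil.which(tool)]

def choose_toolchain(found, flag=None, default="gcc"):
    """Pick a toolchain from the list of detected ones."""
    if flag is not None:
        if flag not in found:
            raise RuntimeError(f"requested toolchain {flag!r} not installed")
        return flag
    if default in found:
        return default
    if len(found) == 1:
        return found[0]
    # Ambiguous: several compatible options, no flag, no usable
    # default. This is the point where a user query belongs.
    raise RuntimeError(f"ambiguous: pick one of {found}")
```

Taking the detected list as an argument (rather than probing inside `choose_toolchain`) keeps the policy separate from the detection, which also makes it testable.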

Installs and deinstalls are platform-specific; it's best if generic package installers for each platform take care of the databases and links. If you can find any, that is. There seems to be a shortage of useful tools.

Comment Building should not be complex. (Score 1) 106

There's software for auto-detection of necessary libraries (cmake is probably the best, since it's more portable than autoconf).

If you've the source tree, then you should require one single platform-dependent package containing cmake, GNU make, curl or wget, grep, cut and the associated libraries, along with a text file listing the dependencies, where to get them and where to put them.

Your build system then scans for everything needed. If you've got it, it uses it. If you don't, it fetches the source, builds it and installs it.

This is not rocket science. Gentoo has been doing something similar for a very long time, so has Perl, so has Cygwin and Cygwin-based packages like OSGEO4W.

Yes, it's slow. Yes, it means the browser maintainer has to have a text editor. Yes, it's going to be as painful and agonizing as installing X11R4 or GateD. I did both. On a 386SX-16. Uphill. Both ways. In the snow. If you can't write your code properly to begin with, get off my lawn!

Comment Re:50 euro fee for a 20 euro refund (Score 1) 353

No problem. Since they require that, you get contractor rates. Plus per diem for the travel. The petrol and wear-and-tear on your car to Germany will be tax-deductible. The remainder of expenses can be billed to the vendor. You send them the estimate in advance, then when they refuse (which they will, because it'll be a hell of a lot more than the cost of a Windows license and probably not too far from the cost of the computer in its entirety if you choose the right places to stay), sue the bastards for breach of contract.

Would you win? Probably not, but the cost of the lawsuit would be a hell of a lot more than the cost of your expenses sheet. That would put them in an interesting position. If they win, they lose. Hey, corporations have been doing this for centuries, it's about time geeks had a go. It seems to be a very profitable racket.

Submission + - Ask Slashdot: Is Bitcoin over Tor a bad idea? (arxiv.org)

jd writes: Researchers studying Bitcoin have determined that the level of anonymity of the cryptocurrency is low and that using Bitcoin over Tor provides an opportunity for a Man-in-the-Middle attack against Bitcoin users. (I must confess, at this point, that I can certainly see anonymity limitations helping expose what machine is linked to what Bitcoin ID, putting users at risk of exposure, but I don't see how this is a function of Tor, as the paper implies.)

It would seem worthwhile to examine both the Tor and Bitcoin protocols to establish whether there is an actual threat there, as it must surely apply to any semi-anonymous protocol over Tor, and Bitcoin has limited value as a cryptocurrency if all transactions have to be carried out in plain sight.

What are the opinions of other Slashdottians on this announcement? Should we be working on an entirely new cryptocurrency system? Is this a problem with Tor? Is this a case of the Scarlet Fish (aka: a red herring) or something to take seriously?

Open Source

Confidence Shaken In Open Source Security Idealism 265

iONiUM writes: According to a few news articles, the general public has taken notice of all the recent security breaches in open source software. From the article: "Hackers have shaken the free-software movement that once symbolized the Web's idealism. Several high-profile attacks in recent months exploited security flaws found in the "open-source" software created by volunteers collaborating online, building off each other's work."

While it's true that open source means you can review the actual code to ensure there's no data theft, loggers, or glaring security holes, that idealism doesn't really help most people, who simply don't have the time, or the knowledge, to do it. As such, the trust is left to the open source community - and is that really so different from leaving it to a corporation with closed source?

Comment Time for a new distro? (Score 1) 303

I have often wondered if it would be worth building a new distribution. The existing ones all seem to make weird design decisions, none have conquered the desktop (I blame OSDL), they're nowhere near as high-performance as they could/should be, and the Linux Standard Base is not necessarily the most secure layout. It's certainly problematic for multi-versioning.
