Comment Re:In the wake of Thursday... (Score 1) 402

Given the US' response to WikiLeaks (mulling assassinating everyone who had anything to do with it), the UN sounds like a smart choice to me. Besides, is it to control such sites or to protect them? A lot of the countries in the UN have benefitted from the current round, and this may well be an attempt to defend freedoms the US is working hard to block.

Comment Re:global standards for policing the internet (Score 1) 402

Which the French, Chinese and Australians already have. The US prefers Hellfire missiles to censorship, so it would not surprise me if there were an edit from an anonymous source somewhere in the London region soon.

The fact is, US control hasn't prevented any of the abuses you mention; it has actually enabled most of them (up to and including supplying the parts), and the US is guilty of censorship through terrorism itself. Not to mention backdooring IPSec to spy on others. And this is your preferred option? The UN being unable to decide sounds preferable to what we have.

Comment Re:Burroughs large systems, Lisp Machine (Score 1) 763

The T400/T800/T9000 Transputer was designed to run Occam (a high-level concurrent language built around CSP-style message passing over channels). It was one of the first mass-produced 32-bit architectures, had four on-chip communication links, a decent amount of internal RAM that could be extended to exploit the full 32-bit address bus, etc, etc, etc.
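
For anyone who never met Occam, its core abstraction was processes communicating synchronously over channels, one channel per Transputer link. A minimal Python sketch of that model (threads standing in for processes, a bounded queue loosely approximating a link; all names here are illustrative, not Occam):

    import threading, queue

    # A "link": Occam channels were synchronous rendezvous points; a
    # queue of size 1 is only a loose approximation of that behaviour.
    link = queue.Queue(maxsize=1)

    def producer():
        # One Occam-style process: compute, then send over the channel.
        for n in range(5):
            link.put(n * n)
        link.put(None)  # sentinel; this sketch has no channel close

    def consumer():
        # A second process: block until data arrives, then act on it.
        while True:
            item = link.get()
            if item is None:
                break
            print("received", item)

    t1 = threading.Thread(target=producer)
    t2 = threading.Thread(target=consumer)
    t1.start(); t2.start()
    t1.join(); t2.join()

On the real hardware the two processes could sit on separate Transputers with a physical link as the channel, which is what made scaling out so natural.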

It is now used to power European digital video recorders.

Intel attempted their own variant, the iWarp. This died an even uglier death.

This concept of the high-level processor was very fashionable when I was at university (late 80s). Although I'm not convinced that high-level is necessarily the way to go, I am convinced that having object-oriented concepts and other modern programming paradigms natively supported in the CPU would improve performance, as it would eliminate the overhead of paradigm conversion and emulation.

(If you think of a single instance of a single class as a single program with internal state, then you could have an OO processor by uploading that program as microcode onto a specific core to form a new "instruction set". One instance is then one set of registers, which you can handle by having an array of them.)
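
A toy software model of that idea, purely hypothetical (the class becomes a table of "microcoded" operations, each instance becomes one register file, and a method call becomes an instruction applied to a register set):

    # Illustrative sketch only -- not a real architecture.
    MICROCODE = {
        # "instruction" name -> operation applied to a register file
        "LOAD": lambda regs, arg: regs.__setitem__("acc", arg),
        "INC":  lambda regs, arg: regs.__setitem__("acc", regs["acc"] + arg),
    }

    def new_instance():
        """One instance = one fresh register file (its internal state)."""
        return {"acc": 0}

    def dispatch(regs, op, arg):
        """A method call is an instruction applied to a register set."""
        MICROCODE[op](regs, arg)

    a, b = new_instance(), new_instance()  # two instances, two register files
    dispatch(a, "LOAD", 10)
    dispatch(a, "INC", 5)
    dispatch(b, "INC", 1)
    print(a["acc"], b["acc"])  # 15 1 -- state is kept per instance

The array of register sets in the parenthetical is then just a list of these register files, one per live instance.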

Comment Re:Commodore 64 (Score 1) 763

The CBM64 was rubbish and slow. The BBC Model B is where it was at. (And still is - the UK has gone back to using it to teach 'A' level computer science. It had more I/O than any machine of its time - and more than most machines today.)

Comment Re:Even more important: "None" (Score 1) 763

The problem is that the definition of "productive" is not static; it might as well point into a random number generator.

In the "old days", you had a lot more flexibility. A lot more risk, too, from side-effects. And a much slower processor, so greater turnaround time when fixing bugs.

(The exception to this was the Transputer, where you had negligible risk and fantastic processing power for the time, without giving up any of the flexibility. In consequence, the Transputer was killed off.)

Comment Re:VMS - the indestructable OS (Score 1) 763

VMS was good in places (reliability being one, definitely) but you should have seen the workarounds Unix programs needed to handle limitations within VMS.

MULTICS was also a damn good OS in places and deserved a better fate than to be replaced by a (relatively) trivial subset. (Which is where we get the Unix name from.)

Plan 9 had a superb distributed architecture, but never quite got the backing. Its ideas were carried forward into Inferno, which still occasionally gets patches, but again lots of good ideas were allowed to rot because the whole package wasn't quite there.

Frankly, I don't care whether these get revived and modernized, or whether the best bits are studied, autopsied and analyzed to the Nth degree to allow some "next generation" OS to acquire ALL of the best traits from ALL of the above, minus any of their limitations.

Just so long as those brilliant ideas don't continue to rot in the great Garbage Collector in the sky.

Comment Re:Bah! (Score 1) 194

I wanted NIST to be able to say, at the end of the final bake-off: "hey, criterion X is vitally important in a substantial portion of cases where cryptographic hashes are used, and criterion Y is really not that critical in most of those cases; SHA-3 is OK at everything, but algorithm xyzzy is massively better at X - it wasn't picked because it's massively worse at Y, but that just doesn't matter".

Since NIST is in a better position to know if X and Y are even real cases and what the hell those cases would be, I didn't want to suggest things that wouldn't actually be that useful.

However, since that route isn't getting taken, here's what inspired me:

  • Cryptographic hashes on Mondex-style smart cards would want to be very fast in hardware, to hell with software. Great not just for money but for secure handling of data in a portable medium.
  • Hashes for files (as per Tripwire or AIDE) would want to be very fast in software, but this just isn't done in hardware so who cares what speed it is there?
  • Network security has to consider ATM (48-byte cell payloads); passwords, likewise, are very short strings. Ethernet's largest jumbo frame is about 9K. A hash that can guarantee no pre-images or other even minor weaknesses for extremely short inputs would be perfect in these cases. It doesn't matter if it's slow for larger data.

Three specialist cases that the Federal Government could realistically use on a large-scale basis, but won't be able to. (A sketch of the third case is below.)
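
To make the short-input case concrete, here is what it looks like in software. SHA-256 stands in purely for illustration (hashlib also grew sha3_256 from Python 3.6 on); the point is the input sizes, not the algorithm:

    import hashlib

    # Extremely short inputs: an ATM cell carries a 48-byte payload,
    # and a password is usually shorter still.
    cell_payload = bytes(48)   # stand-in for one ATM cell's payload
    password = b"hunter2"

    # A specialist hash for this niche would want strong guarantees at
    # tiny message lengths; its speed on long messages is irrelevant.
    print(hashlib.sha256(cell_payload).hexdigest())
    print(hashlib.sha256(password).hexdigest())
    # (Real password storage would add a salt and a deliberately slow
    # KDF; this is only about raw hash behaviour on short inputs.)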

Comment Re:Bah! (Score 1) 194

The purpose is to select one good (overall) general-purpose cryptographic hash and call it SHA-3.

The problem with a specialist hash is that it isn't SHA-something. As such, it can't be used in Federal applications. At all.

Let's say that Blue Midnight Wish is the ideal for file validation (which is mostly done in software). The Feds can't use it.

Now, you can't produce an infinite number of alternatives, but a number of us felt that one - just one - Federally-usable special-purpose hash had a place. It wasn't that important what the special purpose was; the point was that the Feds would have a very controlled level of flexibility for when the general-purpose "ideal" is unsuitable.

Comment Re:Bah! (Score 1) 194

It may be bad in hardware, but if it's good in software then I'd consider it superb for software-only uses of hashes.

This also goes back to an argument I (and a few others) made on the list: since some of the original requirements were being dropped anyway, why not have a runner-up that is acceptably good at everything but especially good at one frequently-used case?

That way, you have a "winner" that is good overall but you also have something that has a specialist use but is decent elsewhere.

In this case, I'd say BMW is ideal for file signatures (be it for file transfers, Tripwire/AIDE-style integrity checks, etc). This isn't something hardware is normally used for, and being very fast in software makes it ideal.
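
A minimal sketch of that use case, Tripwire/AIDE-style, with SHA-256 standing in since BMW never made it into the standard libraries:

    import hashlib, os

    def file_digest(path, algo="sha256", chunk=1 << 16):
        """Hash one file in chunks -- the pure-software case above."""
        h = hashlib.new(algo)
        with open(path, "rb") as f:
            for block in iter(lambda: f.read(chunk), b""):
                h.update(block)
        return h.hexdigest()

    def snapshot(root):
        """Record a digest for every file under root."""
        return {p: file_digest(p)
                for d, _, names in os.walk(root)
                for p in (os.path.join(d, n) for n in names)}

    def changed(old, new):
        """Files added, removed, or modified between two snapshots."""
        return (old.keys() ^ new.keys()) | \
               {p for p in old.keys() & new.keys() if old[p] != new[p]}

Run snapshot() once, store the result somewhere tamper-proof, and changed() against a fresh snapshot tells you what moved; hardware acceleration never enters the picture.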

Comment Re:Regulations for classified information (Score 1) 346

These are the security measures systems are required to implement, as defined by the Federal Information Processing Standards, the Orange Book and the Common Criteria.

A lot of the documentation can be found at the Information Assurance Support Environment website, under Policy and Guidance.

To summarize, information labelled "Secret" can only be stored on a machine that - in the Orange Book system - is classed as B3 or better. The use of security labeling and a mix of host-level and network-level mandatory access controls is supposed to ensure that this is actually enforced at the OS level on each machine and between machines. B3 corresponds roughly to the upper assurance levels (about EAL6) of the more modern Common Criteria.

(In theory, it is impossible to transfer information classified at one level into a lower classification, whether on the same machine or by going through a series of machines. Being able to do so at all is a violation.)
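
That rule is the Bell-LaPadula "no write down" property that the mandatory access controls exist to enforce. A toy sketch of the two checks involved (levels and names invented for illustration):

    # Toy Bell-LaPadula-style mandatory access control check.
    LEVELS = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1, "SECRET": 2, "TOP SECRET": 3}

    def may_read(subject, obj):
        """'No read up': a subject may only read objects at or below its level."""
        return LEVELS[subject] >= LEVELS[obj]

    def may_write(subject, obj):
        """'No write down': a subject may only write at or above its own
        level, so SECRET data can never flow into an UNCLASSIFIED object."""
        return LEVELS[subject] <= LEVELS[obj]

    assert may_read("SECRET", "CONFIDENTIAL")
    assert not may_write("SECRET", "UNCLASSIFIED")  # the violation in question

A B3-class system enforces this in the kernel on every access, which is exactly why moving classified data onto a lower-classified destination on a compliant system should have been impossible.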

To be given such a rating, that precise combination of hardware and software MUST be tested by an approved laboratory and shown to meet all of the criteria.

Further, as noted on the FIPS website: "With the passage of the Federal Information Security Management Act of 2002, there is no longer a statutory provision to allow for agencies to waive mandatory Federal Information Processing Standards (FIPS)."

Mandated Criteria, Rainbow Series and Related

Mandated Criteria, Common Criteria

These are NOT optional. These are Federally-mandated requirements. If Manning's computer did not meet these standards, it was NOT authorized to be on the network and the machines that transferred classified information to it were NOT authorized to do so.

Comment Bah! (Score 5, Interesting) 194

None of the good names survived!

Still, there was a lot of debate on the SHA-3 mailing list over the criteria, as it was felt that some of them were being abused and others were being ignored.

I, and a few others, advocated an approach where the best compromise solution was the "winner" for SHA-3, but the runner-up that was best for some specific specialist problem (and still OK at everything else, since it's a runner-up, and also free of known issues) would then be considered the winner as "SHA-3b". That way, you'd also get a strong specialist hash. The motivation for this compromise was that SHA-2 has not been widely adopted precisely because it IS OK for everything but not good for anything.

Some people wanted SHA-3 to be wholly specialized, others wanted it to be as true to the original specs as possible; the compromise was suggested as a means of providing both without making the bake-off unnecessarily complex or having to run a whole parallel SHA-3 contest for the specialist system.

The main problem with the finalists is the inclusion of Skein. The use of narrow-pipe algorithms has been widely criticised by people far more knowledgeable than myself, because it violates some of the security guarantees that are supposed to be present. The argument for Skein is that the objection is theoretical.
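
For the curious, the substance of the narrow-pipe objection is the birthday bound: if the internal chaining state is only n bits (the same width as the output), internal collisions become likely once enough message blocks are processed, which erodes guarantees such as second-preimage resistance for long messages. Roughly, after q processed blocks,

    P_{\text{collision}}(q) \;\approx\; 1 - e^{-q(q-1)/2^{n+1}} \;\approx\; \frac{q^{2}}{2^{n+1}},

so collisions in the hidden state are expected around q ≈ 2^{n/2}, whereas a wide-pipe design with a 2n-bit state pushes that out to about 2^n. The counter-argument is exactly what's said above: at these sizes the bound sits astronomically far from anything practical, which is why the objection gets called theoretical.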

Comment Re:horse (Score 5, Interesting) 346

The problem is not the decision so much as the fact that allowing insecure mechanisms (in violation of NSA security information notices, the Common Criteria instructions for the levels required for secret information, and the Federal Information Processing Standards, I should add) was not only bloody stupid to begin with, it was in violation of US law regarding the handling of classified information.

Instead of prosecuting Manning, who at worst is guilty of far less than the Lockheed-Martin officials who publicly sold the plans for the current stealth fighters, one should ask why his actions were even possible in the first place. FIPS standards for secure platforms and NSA publications expressly prohibit the capability to transfer files to insecure formats. It is illegal, under US law, to install or use non-compliant systems for Government purposes. This means that giving Manning the computer violated US law. Do you see anyone charged with violating such US laws? I don't.

Comment Which horse? (Score 5, Interesting) 346

The Pentagon had to ban USB sticks, et al., internally after its biggest single security breach, caused by a virus that was passed around and carried onto the secure SIPRNet within the Pentagon itself. It's unclear to me whether the problem was the virus relaying secret information off the secure network, or something else, but apparently it was labelled the single biggest security breach by the Pentagon, and they're unlikely to be overplaying security holes.

Mind you, NASA has just released secret information into the public domain by selling hard drives known in advance to contain secret information - drives that FAILED in-house auditing for exactly that. And prior to that, disk drives containing blueprints for the current generation of super stealth fighters were sold by Lockheed-Martin to Iran. (And people think WikiLeaks did bad stuff?!?!?!?! How the hell does a bunch of personal opinions compare with giving a terrorist-funding nation the plans for the top US fighters? Within Iran, there's the possibility they will find a weakness. Think Death Star plans. Think the stealth fighter shot down in Serbia - yes, the Serbians blew up one of America's best planes, and with a cruddy cheap missile at that. On an international level, the Russians will doubtless use the plans to improve on their own airfoils and may be able to exploit the design to improve on whatever shape-based stealth they've developed so far.)

Add to that that NASA servers have been hacked in the past and turned into file-sharing sites, which means that whatever classified files were in those exposed directories have been shared as well. Quite plausibly those files were protected by single DES only, not Triple DES or AES, as "commercially sensitive" data is classified below secret and certainly only used basic DES until a couple of years before that breach was discovered.

Then, back in the 90s, there was a breach at the Pentagon due to computers containing classified information being on the public Internet with .rhosts files. (NASA used .rhosts files and rsh well into the current millennium and may well still do so.)
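
For anyone who missed that era: rsh granted password-free logins to whatever a user's .rhosts file listed, with no cryptography anywhere. A sample of the format (hostnames invented for illustration):

    trusted-host.example.gov alice

    + +

The first line lets alice on trusted-host rsh in as the local user without a password; the second, "+ +", is the notorious wildcard that trusts every user on every host - exactly the sort of hole that gets classified machines on the public Internet owned.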

That's four Bloody Obvious horses, with gold bridles and gem-encrusted saddles, that walked out and were only noticed after they kicked the door down at the stablemaster's house. There may be others.
