
Comment Re:If they make good on this. (Score 2) 84

Of course, but if you really need financial stability, I don't recommend being in on the ground floor of a startup.

Especially a completely illegal one based on experimental, buggy technology under ongoing cyberattack, selling products that ruin people's lives, are supplied by organised crime, and are actively targeted by high-level federal and international prosecutors with access to military espionage technology - and a complete dump of your predecessor's transaction databases.

There's high-risk, and then there's unethical high risk, and then there's completely stupid, unethical high risk, and then there's... whatever this is, it's pegging the scale.

I'm not going to say "good luck", but I would advise even the rubberneckers to stand well clear of the impact crater.

Comment Re:"Bespoke" (Score 1) 134

The only planets never to have been the subjects of bespoke space missions from Earth are

Am I misunderstanding the definition of "bespoke" and its application within sentences?

I think you could substitute "tailor-made" for "bespoke" in any context - including actual tailoring - and get exactly the same meaning. It's a linguistic metaphor, yes. Do you object to any other commonly-used metaphors?

Comment Re:Dead end (Score 1) 223

Objects can be serialized and the result looks like a file.

More generally, everything is a namespace/filesystem.

Yep. There's a very close connection between objects, dictionaries, relational tables, files/filesystems, and functions - all centred around binary relations, a fairly well-understood mathematical object - which seems well worth exploring. However, there haven't been (to my knowledge) many languages which attempt to explore this connection at a fundamental level.

Here's a suggestion: we could fairly simply extend S-expressions to allow multiple lists or atoms after the dot in a dotted pair. This would let us represent binary relations in a simple syntax that reduces to an ordinary list in the case of a relation containing only one row. You then end up with a very low-level but powerful data model which both simplifies and extends the 'array' and 'object' structures in today's scripting languages (e.g. JSON), and SQL tables, and which has nice mathematical properties: for example, you can union and intersect these relations as you would sets, an operation which is undefined on objects or dictionaries. We can also take a Cartesian product, which is an extension of list appending, _and_ a corresponding Cartesian divide, which corresponds to a key-value lookup.
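As a concrete sketch of that data model (in Python rather than S-expression syntax, with function names I've made up for illustration): a relation is just a set of row-tuples, so set union/intersection, Cartesian product and a 'divide' lookup all fall out naturally.

```python
# A relation is a set of tuples ("rows"); a one-row relation
# degenerates to an ordinary list, and set operations come for free.

def union(r, s):
    return r | s

def intersect(r, s):
    return r & s

def product(r, s):
    # Cartesian product: every row of r appended to every row of s,
    # generalising list concatenation to multi-row relations.
    return {a + b for a in r for b in s}

def divide(r, key):
    # "Cartesian divide": select all rows starting with `key` and
    # return the remainders - exactly a key-value lookup.
    return {row[len(key):] for row in r if row[:len(key)] == key}

# An 'object' {name: "ada", age: 36} is just a two-row binary relation:
person = {("name", "ada"), ("age", 36)}
assert divide(person, ("name",)) == {("ada",)}
```

The same four operations cover JSON-style objects, SQL tables and plain lists, which is the unification the paragraph above is pointing at.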

From here, we just need to extend this with a semantics for function evaluation, to interpret relations as functions and allow for infinite-sized, recursive computed relations. That gets a bit trickier, but if we had it, we could represent, say, the entire Internet as a filesystem. Would that be useful?

Comment Re:Dead end (Score 1) 223

But a stream of bytes is inherently too low an abstraction to build everything on.

How about taking it just one step forward to a stream of streams? Then we could at least create object-like structures but with minimal overhead. Plus, it would be a fully recursive definition that would lend itself to virtualisation.
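To make the idea concrete, here's a minimal sketch in Python (a toy definition of my own, not any existing API): a stream node is either raw bytes or a sequence of further nodes, and because the definition is recursive, any subtree is itself a valid stream.

```python
# "Stream of streams": a node is either a leaf byte-string or an
# iterable of further nodes. Flattening recovers the plain byte
# stream, so the old abstraction is still available underneath.

def flatten(node):
    """Collapse a stream-of-streams back to a flat byte stream."""
    if isinstance(node, bytes):
        yield from node          # yields individual byte values
    else:
        for child in node:
            yield from flatten(child)

doc = [b"header", [b"body-part-1", b"body-part-2"], b"footer"]
assert bytes(flatten(doc)) == b"headerbody-part-1body-part-2footer"
```

Note that `[b"body-part-1", b"body-part-2"]` behaves like an object or directory embedded in the stream, with essentially zero framing overhead.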

Of course, S-expressions are only 56 years old so such a radical proposal isn't likely to be adopted any time soon.

Comment Re:I find it interesting (Score 1) 223

Eventually you have to talk to highly interactive hardware with massively parallel threads

What does parallelism have to do with anything? The only argument against everything's-a-file is overhead, not complexity.

Exactly. I'd like to see more exploration of something like Kahn process networks as a fundamental programming abstraction; it seems to me that we need to be thinking of programs, filesystems and networks as examples of the same thing. Our networks are becoming software-defined (especially in virtualisation), our chipsets are compiled from languages like VHDL, our programs are becoming parallelised, and our filesystems are starting to grow virtual nodes and do processing. Seems dumb to be maintaining multiple completely separate families of languages and tools each with their own subtle incompatibilities and bugs when we could settle on just one and work all the bugs out once, then use it forever.
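For illustration, a toy Kahn-style network can be sketched in a few lines of Python (the structure and names are my own, not a library API): processes communicate only through FIFO channels and block on reads, which is what makes KPNs deterministic regardless of how the scheduler interleaves them.

```python
# A two-stage Kahn-style process network: producer -> doubler -> main.
# Each "process" is a thread; each channel is an unbounded FIFO queue.
import queue
import threading

def producer(out):
    for i in range(5):
        out.put(i)
    out.put(None)                  # end-of-stream marker

def doubler(inp, out):
    # Blocking get() is the only way to read a channel - the KPN rule.
    while (x := inp.get()) is not None:
        out.put(x * 2)
    out.put(None)

def run():
    a, b = queue.Queue(), queue.Queue()
    threading.Thread(target=producer, args=(a,)).start()
    threading.Thread(target=doubler, args=(a, b)).start()
    results = []
    while (x := b.get()) is not None:
        results.append(x)
    return results

assert run() == [0, 2, 4, 6, 8]
```

The same graph-of-channels picture describes a shell pipeline, a software-defined network, or a dataflow chip, which is the point about one family of languages.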

Comment Re:I find it interesting (Score 2) 223

I suspect that between various BSDs and Linux versions that the concept of everything being a file has pretty much reached its logical endpoint.

Not even close, unless you're thinking about Plan 9.

A truly 'everything is a file' Unix would implement BSD sockets and X11 windows as files, just for a start. Can you do that on Linux yet?

Comment Re:On Debian that's allready done. (Score 1) 223

Then "the hack stage" is the state of the world when you're operating at any significant scale.

And that's why every week we have reports of major data centers being hacked. This is not a sustainable course for the global Internet. Eventually, people are going to die from infosec disasters. (In drone warfare, they already have, but that's also a political problem.)

Yes, we'll always have bugs. But we have to get to the point where we have zero tolerance for _preventable_ bugs, such as machine-code-level crashes. Raw x86 code is simply too unsafe to run at any speed on the Internet; it gives no fundamental guarantees about separation of memory space. At the very least, we need managed languages with an extremely tiny, simple, provably correct kernel that makes it mathematically impossible for one process to smash its stack or corrupt another's heap.

We've had solutions for decades: microkernels like L4; languages like Erlang, Haskell and D. We can replace failed, non-securable syntaxes like raw ASCII SQL queries with nested list structures that don't suffer from quote-escaping vulnerabilities. We simply refuse to develop and deploy these solutions, for no better reason than laziness, institutional inertia and a sense of "it's not my problem if my program is not provably secure". Wrong. It's everybody's Net, and that means it's everybody's problem.
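To illustrate the SQL point: with string-pasted queries, the attacker's quotes become part of the query's syntax, while structured parameter binding (sqlite3 is used here as a stand-in for the nested-structure idea) leaves nothing to escape because the value travels out-of-band.

```python
# String-pasted SQL vs. structured parameter binding.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE users (name TEXT)")
db.execute("INSERT INTO users VALUES ('alice')")

evil = "x' OR '1'='1"

# Unsafe: the attacker's quotes are parsed as SQL syntax.
unsafe = db.execute(
    "SELECT * FROM users WHERE name = '%s'" % evil).fetchall()

# Safe: the value is passed as data, never parsed as SQL.
safe = db.execute(
    "SELECT * FROM users WHERE name = ?", (evil,)).fetchall()

assert unsafe == [("alice",)]   # injection matched every row
assert safe == []               # literal string matched nothing
```

Passing the query as (statement, values) rather than one flat string is exactly the nested-list discipline argued for above.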

As an industry, we're at the stage where medicine was before the discovery of antiseptic surgery. We have no fundamental data hygiene in our execution environments. We kill as much as we cure. This has to change.

Comment Re:On Debian that's allready done. (Score 3, Insightful) 223

This is an incredibly basic problem in multiprocess systems. It's like saying IF your computer crashes and needs to be restarted... in a datacenter, it's a matter of WHEN.

Except that in today's hostile Internet, WHEN that broken Internet-facing process crashes it WILL be because it was pwned by shellcode, and if that process had write access to core files, your entire server is now rooted. If that process also had any read or write credentials to your local network, your entire data center possibly just got rooted also.

Are you _really_ saying that the appropriate thing to do in that situation is to simply restart the process and continue? You'd be better to flash-wipe and reinstall at least the entire server node, and probably also change all your internal administration passwords. Otherwise, you're an infosec disaster waiting to happen.

You're fighting a full-scale hot cyberwar out there, don't forget. It's no longer 1970. You don't have the luxury of trusting that incoming packets come from universities and defense contractors with administrators you can chew out with a phone call when they misconfigure stuff by accident. NSA owns the wires and your packets come direct from the Russian Mafia and Syrian Electronic Army.

It's not a hack, because machines are NEVER perfect.

It's totally a hack, and _because_ machines are never perfect you'd better be 150% certain that every single step in your error-recovery process is double- and triple-checked and accounts for every possible side effect of executing evil x86 machine code with root permissions.

Look, we both agree that Murphy rules. And you're right to say 'because random stuff happens, I need an overseeing process to automatically fix it'. But auto-restarting pwned services is not that fix, anymore, and it really hasn't been since 1999.

Comment Re:Bad Analogy (Score 1) 716

What is a spectacular crash in software? ... Software just doesn't fail that catastrophically

Wut.

Oh yes it does. If you don't realise that Internet security is already a catastrophe then I just don't... you really really need to get out more.

We're living through the biggest security and privacy disaster in the Internet's short history. We don't yet understand the full dimensions of the damage, but we understand this: it was almost entirely preventable. Inexcusably shoddy software workmanship, defended with exactly the argument you're making, is what caused this.

We won't progress as an industry until we learn the meaning of "first do no harm". First, deploy no root exploits to your customers. Then we can talk about efficiency, productivity, market forces, and what colour the fifth pixel from the left on the splash screen should be.

Comment Re:Am I the only one.. (Score 1) 158

Hell, back in the 80's it was common for kids under 10 to teach themselves how to program.

Yes, exactly. I was there, I was that age. I remember how it was.

Of course, the ROM-based 8-bit micros we bashed out 10 PRINT "INSERT NAME HERE RULES": GOTO 10 on weren't nearly as scary as a toxic HTML5/JavaScript/PHP/MySQL soup of SQL injections and root vulnerabilities running on a three-tier Web platform. It was our parents who were scared of "breaking the computer" while we reassured them that no, a misplaced comma wasn't going to drain their bank account and launch the NATO missile arsenal, and a 'crash' just meant we had to hit the power switch. And we mostly just coded BASIC so we could get games running. But it was fun, and we learned a *lot* more than you do with Facebook and a PlayStation.

Things are a lot different now. I wish we did have coding environments half as safe and clean as a Commodore 64 or Atari 800. In fact, growing up in the 80s taught me a lot I had to unlearn when the Internet came along; for years it never occurred to me that commercial software could be so fault-riddled and plain dangerous to operate as Windows was. After all, I'd used machines with 8,192 *bytes* of RAM that were solid, stable, and just didn't crash unless you physically tripped over the power button. Your machine was totally air-gapped, totally safe, and could be reset to factory defaults instantly. And that was an environment where you could try anything and learn. It was intoxicating, like having wings under your brain.

But now... no, now we've built the Matrix we had nightmares about in the 1980s. Not the space-opera Wachowski Matrix; the Gibson Matrix, all neon and chrome and happy smiling avatars on the outside, and a horror show of broken crypto and corporate greed inside. And hacking has become as stupidly easy as downloading a rootkit and clicking 'go'. And there's no guarantee that your hard drive controller or your building HVAC server aren't under the control of the NSA or the Mafia.

Good luck, guys.

Comment Re:How long would that last... (Score 1) 353

If you actually didn't know what you were doing and they tasked you to accomplish something?

Then presumably you could get a good job as a security reviewer for Adobe or Oracle.

"Exploit, exploit, exploit, exploit, Flash, Java, exploit and exploit. That's not got much exploits in it."
