
Comment: Self-Driving Pedestrians (Score 2) 62

Just imagine: there could be a phone app that displays an arrow to show the user which way to walk. Using lidar to detect obstacles, the app could enable a phone zombie to become almost self-driving, avoiding obstacles and other people. Almost like a real person.

Comment: Valet Parking Everywhere (Score 1) 477

With self-driving cars I expect parking will become like having valet parking everywhere. Think of how guests arrive and leave at a large hotel. There will need to be a reasonably sized area where cars can come and stop to pick up and drop off passengers and their stuff. Once empty, the cars will go and park themselves in high-density fashion. Your typical Safeway parking lot will need to be reorganized to accommodate this.

There will be an opportunity to reduce the space allotted to parking at many places.

Comment: Road-based drones (Score 1) 162

I have serious doubts about the practicality of aerial drones, at least for deliveries to individual consumers. What happens when a drone shows up at your door and the family dog attacks it? There are many other problems which other posters have mentioned.

I would really like to see autonomous, road-based drones developed. A road-based drone could be much smaller, lighter, and cheaper (in mass production) than a car, since it wouldn't need to carry people. Road drones would need the same sorts of sensors as self-driving cars, so if road drones were mass produced, self-driving cars could reap the benefit of cheaper sensors. Road drones would also need software very similar to that required by a self-driving car. But a road drone, being smaller, lighter, and cheaper, would usually cause less damage if it were involved in an accident, so it might serve as a testbed for new software before that software is deployed in self-driving cars.

The trick would be to come up with a form factor that could share the roads with existing traffic. I was thinking of something the size of a bicycle, a Segway, or a very small car. But it just occurred to me that there is a lot of space underneath most cars. Maybe a road drone could just position itself under a car and stay there as long as the car is going its way. Making a transition to another car if a drone's car turns in the wrong direction would be tricky, and perhaps not even possible if all the surrounding cars have their own drones. Ideally you would want the humans driving their cars to be able to completely ignore the drones, as the drones would be smart enough and fast enough to keep themselves from being squished. Of course it would be easier if there were a substantial number of self-driving cars on the road, and the drones and cars communicated and coordinated.

I would love to work on developing something like that.

Comment: Learned on a PDP-5 (Score 1) 92

I first learned machine language on a PDP-5, which was similar to the PDP-8 but limited to 4K words of memory. Mostly I just used it to toggle in small programs through the console switches, but I think we got the FOCAL interpreter running on it at one point. Those were the days. To think that there is now a generation of programmers who have known nothing but JavaScript.

Comment: Re:You want security? Start with the OS. (Score 1) 237

Of course vulnerabilities remain. But when you're deliberately aiming for a secure *system*, they're a lot less impactful. Kinda like how turning ASLR on simply nullifies entire classes of vulnerabilities. MULTICS, according to your paper, didn't have problems with buffer overflows. Thirty years ago, this was a solved problem. Why is it an ongoing problem now?

Because programming languages like C/C++ are still in wide use. I suppose most people who still use these languages would tell you that they must, for reasons of efficiency. Then they would start talking about how their application can't tolerate pauses for garbage collection. But of course you could have a language which supports manual allocation of data types, with a maximum length that is enforced at runtime.
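
To make that concrete, here is a minimal sketch of runtime length enforcement, written in Java purely for illustration (the class and its API are invented for this example; Java already performs this check on every array access, and a compiler for a manually allocated, non-garbage-collected language could generate the same check):

// A minimal sketch: capacity is chosen manually at allocation time,
// and every access is checked against it at runtime.
public final class BoundedBuffer {
    private final byte[] data;   // fixed capacity, chosen by the caller
    private int length = 0;

    public BoundedBuffer(int capacity) {
        this.data = new byte[capacity];
    }

    public void append(byte b) {
        if (length >= data.length) {
            throw new IndexOutOfBoundsException("buffer full (capacity " + data.length + ")");
        }
        data[length++] = b;      // the check above makes writing past the end impossible
    }

    public byte get(int i) {
        if (i < 0 || i >= length) {
            throw new IndexOutOfBoundsException("index " + i + " out of range");
        }
        return data[i];
    }

    public int length() {
        return length;
    }
}

With enforcement like that, an overflow becomes a clean runtime error instead of silently corrupted memory.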

I've addressed why I think software engineering hasn't progressed more in a previous post. The arguments I make there about hobbyists designing languages and the lack of industry support for standardization also apply to software security. But the problem of security is open-ended. We could have better languages which prevent all kinds of abuses of the hardware-level machine model, languages in which buffer overflows and stack overflow exploits are impossible. But then someone writes a program that builds a SQL query in a string without taking the necessary precautions, and you have SQL injection. Now the SQL interpreter is, in some sense, another level of virtual machine which needs to be protected from abuse. It's not hard to do that if your program creates SQL queries using a data structure that supports a higher level of abstraction than strings. But if a SQL client library is provided, and it takes a SQL query as a string, building a safer level of abstraction on top of that probably isn't going to occur to most programmers. Nor will they necessarily take the time to discover that someone else has implemented a higher-level interface. Strings are what they know, and strings are what they'll use.
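
To make the contrast concrete, here is a minimal JDBC sketch (the table and column names are made up for illustration). The first method splices user input into the SQL text and is injectable; the second fixes the query structure and binds the input as a parameter, so it is never parsed as SQL:

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class UserLookup {
    // Injectable: user input becomes part of the SQL text itself.
    static ResultSet findUserUnsafe(Connection db, String name) throws SQLException {
        String sql = "SELECT id, email FROM users WHERE name = '" + name + "'";
        Statement stmt = db.createStatement();
        return stmt.executeQuery(sql);    // name = "x' OR '1'='1" rewrites the query
    }

    // Safer: the query structure is fixed; the input is bound as pure data.
    static ResultSet findUserSafe(Connection db, String name) throws SQLException {
        PreparedStatement stmt =
            db.prepareStatement("SELECT id, email FROM users WHERE name = ?");
        stmt.setString(1, name);          // never interpreted as SQL by the parser
        return stmt.executeQuery();
    }
}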

Likewise, when JavaScript was added to the browser, it created a whole new world of potential security vulnerabilities. Brendan Eich may not have been a hobbyist in the strictest sense, since he was being paid. But he was working under a very tight schedule which prioritized functionality over security, and I suspect even he would admit to having made a number of rookie mistakes in language design. More important, though, is that some kind of client-side scripting of HTML was virtually inevitable. There is an inexorable force toward adding more functionality to successful software. And whenever new abstractions are created, or new interpreters built, there is the potential for new kinds of security vulnerabilities which can exist regardless of how secure the underlying infrastructure may be.

This is not to say that I believe secure software is impossible. But it is a moving target that can't be addressed simply by instilling in programmers a comprehensive list of secure programming DOs and DON'Ts. Programmers really need to be able to recognize when their code may be creating new kinds of security vulnerabilities.

Comment: Re:You want security? Start with the OS. (Score 1) 237

MULTICS eh? Here's an interesting paper looking back on MULTICS security:

Thirty Years Later: Lessons from the Multics Security Evaluation

In spite of the fact that security was a top priority for MULTICS, in spite of the fact that it was written in PL/I rather than C, in spite of the fact that it was a far smaller and less complex system than those of today, and in spite of the fact that it was more secure than most modern systems, MULTICS was easily penetrated during the security evaluation. So I maintain my original position that writing secure software is hard. So hard that even when people are diligently trying to write secure software, vulnerabilities remain.

MIT's ITS was another system of that era, and deliberately insecure. It even had a non-privileged "crash" command to crash the system, and logins were optional.

Comment: You want security? Start with the OS. (Score 1) 237

From the linked article:

"Overall, Brian Gorenc, manager of vulnerability research for HP Security Research, said that one of the surprises at the Pwn2Own 2015 event was the amount of Windows kernel vulnerabilities that showed up, though he noted that HP, in a way, expected it."

Although many exploits may be against vulnerabilities in the browser code, I have to wonder how we can expect a browser implementer to write secure code if kernel implementers can't. In my view the basic problem is that the goal for security in almost all software is that it be just "good enough", where "good enough" is a bar which is raised as a piece of code gets a reputation for being insecure. With the possible exceptions of governments and a few financial service companies, no one really wants to pay the cost of ensuring that software is secure. So we make a game of it, with prizes, like Pwn2Own, in an attempt to amortize the cost.

Why is it so hard to write secure software? Well, programming languages are an easy target. Most OS's and browsers are written in C/C++, which we know provide many, many ways to accidentally undermine security. But the complexity of the software is also a factor, and that is fed by the desire to continually add functionality over time. The evolution of the web is a textbook example of this. Rather than being satisfied with a nice, relatively secure textual browser, we had to turn it into an application platform. We used to have different application protocols for different applications. Now everything goes over HTTP, and so HTTP becomes ever more complicated. What we ought to have is a browser application that only browses and renders hypertext, with no JavaScript and a very restricted plugin API. Then if you want to launch an application over the web, make it a separate application and use a URL that starts with something other than "http[s]:". Personally I think it would be cool just to have a URL that opens a VM in a tab and boots it up from a secure OS image.

You would think that the evolution of smart mobile devices would provide the opportunity to not repeat the mistakes of the past. And it is true that mobile devices have security managers which provide some granularity to the rights that an application can be granted. But my experience has been that apps that I install on a mobile device often require more rights than it seems they should. In practice the decision I make when I install an app is, "Do I trust this app or not?" And I either grant it all the rights it wants or not. (Actually what I really think is that mobile devices are not really secure, that the security manager is effectively "security theater", so I don't put anything on a mobile device that I wouldn't want the world to see.)

Comment: Tonight's Word: pwned (Score 2) 628

An economy is a mechanism for regulating human (so far) behavior. If you're an economist, an economy is a means of regulating production and consumption, usually with a goal of achieving some kind of balance. But a computer scientist might view the mechanism itself as a (usually) distributed algorithm. The salient points are how data enters the system and how it gets processed as it moves through the system. Capitalism, for example, uses a distributed data structure we call "prices" to represent the state of supply vs. demand. Because the data is distributed, all the familiar problems of concurrent, distributed systems have to be addressed in some way.

However, just as software is typically built in layers, from firmware, to operating systems, to frameworks, to applications, once you have an economy, it is irresistible to build more complexity on top of it. So we use our economy to regulate human behavior in ways other than production and consumption, through the use of taxes, fines, and additional rules on what can be bought and sold, and who can work at what jobs.

The goal, as always, is to control human behavior. There are a few things that set humans apart from other species, but one of the most under-recognized is our instinct to control things, including other humans. This is built into our DNA and is surely a big factor in our successful proliferation as a species. And it is something that the coming of the machine age will not change over anything less than evolutionary time scales, unless human nature itself is re-engineered.

But what does change as information and telecommunication technologies advance is the rate at which a system like an economy can process data, and the scale at which it can do it. The global economy is already almost completely integrated, and is becoming increasingly tightly coupled. And yet, humans are unceasing in their desire to control it, and to use it to control other humans.

What happens to people who can't find jobs? Some people say a basic income is the solution. But: pwned by the government. What is already happening? People living on credit cards. But: pwned by the banks. People going to school to qualify for better jobs. But: pwned by student loan debt. Is it even possible to have a society where most people aren't pwned? Could being pwned by a machine be any worse?

And that's tonight's word.
(You will be missed, sir.)

Comment: Re:Software fails the test of time (Score 1) 370

As someone with 45+ years of software experience

44+ years here. Old-timers represent!

I can personally verify that software development has not improved significantly over the last 25 years or so.

I can relate to where you're coming from with that statement, but to me it seems more like a lot of backwardness is obscuring the forward progress that has been made. Programming languages, in spite of the horrors of PHP and JavaScript that you mention, are becoming more powerful. Just being able to program most things in a language that has garbage collection is progress in my book. Give me Python or Scala over Fortran or Pascal any day. I also think that the modern emphasis on test-driven development and tool chains is a step forward. On the other hand, I find client-side web development to be a simply appalling mess.

I can't be certain, but I strongly believe that one of the reason for the lack of progress is that there are not a lot of old programmers still in the profession.

I don't think that's it at all. It seems to me that the problem is two-fold. First, academia has lost either interest or influence or both in the area of software engineering. I remember when research in novel operating systems and programming languages was abundant. Now, instead of professors and Ph.D. students who have taken some time to study prior work, hobbyists are the ones developing new programming languages and operating systems. The problem is not so much with the hobbyists themselves - some of them are extremely capable. The problem, rather, is that the initial work is supported only by the personal enthusiasm of the hobbyist.

Which brings me to the second part of the problem: the software industry seems to have lost all interest in funding R&D to improve software engineering tools. There used to be a healthy segment of the software market involved in making tools for software developers. And that's probably because companies were willing to pay to buy those tools for their developers. Now we just use the free versions. Or wait for a hobbyist to save us.

Industry has also failed in the area of making software standards. Standards bodies have become just another field of corporate battle, where companies seek to either control developing standards or kill them. Software patents are part of that problem. But the short-sightedness of companies in understanding the long-term value of standards is the more fundamental problem.

Comment: Re:Why? (Score 1) 309

Because the operating systems that run those downloaded apps were not designed to run them securely. Even the newer mobile operating systems and their security managers are not really up to the task. One is forced to grant broad privileges to many apps in order to use them. The user needs to have finer control over what an app can do with the network or local files, rather than being asked for blanket permissions when the app is installed. In some cases that control need not be explicit, but can be implicit in how the user interacts with the app. For example, if I ask an app to open a local file, that could implicitly grant read access to the app for that particular file (within the limits of my own rights on a multiuser system).
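
As a sketch of what that implicit grant could look like, imagine an API where the app never sees a path or a general file-system call, only a handle for the one file the user picked. The FilePicker and FileHandle types below are hypothetical, invented for illustration, not any real platform API:

import java.io.IOException;
import java.io.InputStream;

// The app never receives a path or a general file-system API,
// only a read handle for the single file the user chose.
interface FileHandle {
    String displayName();
    InputStream openForReading() throws IOException;   // read-only, this file only
}

interface FilePicker {
    // Shown by the OS, not drawn by the app; returns null if the user cancels.
    FileHandle askUserToChooseFile();
}

class ImportFeature {
    private final FilePicker picker;

    ImportFeature(FilePicker picker) {
        this.picker = picker;
    }

    void importDocument() throws IOException {
        FileHandle handle = picker.askUserToChooseFile();
        if (handle == null) {
            return;                        // no selection, so no access granted
        }
        try (InputStream in = handle.openForReading()) {
            // ... parse the document; no other file is reachable from here
        }
    }
}

The act of choosing the file is the grant; there is no blanket permission to click through at install time.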

The evolutionary pressures on browsers are obviously moving them closer to being operating systems themselves. There's nothing wrong with that in principle. In fact it restores the ability to tinker with OS features and structure without having to worry about writing device drivers for every device in the world. However, if browser developers simply mimic the features of existing OS's, we will soon find our way back to square one.

And then there is the god-forsaken mess that is HTML/CSS/JavaScript. If you look at how evolution is shaping those, it's obviously trying its best to turn them into a decent window system. People are already using virtual DOM's to speed up dynamic HTML because the real DOM is such a beast. It won't be long before someone implements a mapping from a virtual DOM to HTML5 canvas, and then we can finally start to think about whether the DOM itself is more trouble than it's worth.
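
For anyone who hasn't run into the idea, the virtual DOM trick is just to diff two cheap in-memory trees and touch the real DOM only where something actually changed. Here is a toy sketch of the diff step, in Java with a deliberately simplified node shape and a textual patch list (it corresponds to no real library):

import java.util.List;
import java.util.Objects;

// A toy virtual-DOM node: a tag, optional text, and children.
record VNode(String tag, String text, List<VNode> children) {}

class VDomDiff {
    // Compare old and new trees, recording only the changes that must be
    // applied to the expensive real DOM.
    static void diff(VNode oldNode, VNode newNode, String path, List<String> patches) {
        if (oldNode == null) {
            patches.add("INSERT " + path + " <" + newNode.tag() + ">");
            return;
        }
        if (newNode == null) {
            patches.add("REMOVE " + path);
            return;
        }
        if (!oldNode.tag().equals(newNode.tag())) {
            patches.add("REPLACE " + path + " with <" + newNode.tag() + ">");
            return;
        }
        if (!Objects.equals(oldNode.text(), newNode.text())) {
            patches.add("SET_TEXT " + path + " = " + newNode.text());
        }
        int n = Math.max(oldNode.children().size(), newNode.children().size());
        for (int i = 0; i < n; i++) {
            VNode oldChild = i < oldNode.children().size() ? oldNode.children().get(i) : null;
            VNode newChild = i < newNode.children().size() ? newNode.children().get(i) : null;
            diff(oldChild, newChild, path + "/" + i, patches);
        }
    }
}

A renderer would then walk the patch list and apply each operation to the real DOM once, instead of rebuilding the whole subtree.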

Regarding the original topic, I think one would have to have an awfully good reason not to make a web OS multilingual. The only such reason that occurs to me is the possibility that the language itself may be integral to the security manager. Back in the day when there was a vibrant OS research community, people actually did experiment with OS's that attempted to forgo the overhead of virtual memory management by putting all applications in the same address space and implementing containment via the programming language. However, I tend to think that any containment or security management that a particular language can provide could just as easily be implemented in a byte-code VM.

Comment: The difficulty has evolved (Score 2) 391

When I started 44+ years ago, the hard part of programming was getting programs to fit the memory size and processor speed of the computers of that day. We often wrote in assembly language, because frequently it was the only reasonable choice. Scientific code was written in FORTRAN, business code in COBOL (or maybe RPG). A little later C came along and became a viable alternative to assembly language for many things. There was much experimentation with languages in those days, so there were a lot of other languages around, but FORTRAN, COBOL, and C/assembly were the major ones in use.

One thing we did have in those days, which I recall with great fondness, were manuals. There were manuals for users of computer systems, and manuals for programmers, describing the operating system and library interfaces. There were people who specialized in writing such manuals, and some of these people were quite good at it. But at some point manual writing became a lost art, and the industry is poorer for it.

Over time, the machines reached a level of capability that exceeded the requirements for the kind of programs we were writing. It was rare to find a program (code, not data) that wouldn't fit in memory, and compiler technology had advanced to the point that we didn't need to obsess about the speed of code sequences in assembly language. So we started writing larger and more complex programs, and the main difficulty of programming was managing and testing the code. Source code control systems were developed, and eventually continuous build systems, but testing remains an important concern to this day.

As computer networking evolved, the problems of managing concurrency and asynchronicity became more severe. Many approaches have been developed to deal with these problems, such as threads and monitors, but this remains an area of difficulty, and is more important than ever with the advent of multicore processors.
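
For anyone who hasn't met a monitor: it is the pattern where a single lock and its condition waits guard a piece of shared state. A minimal bounded queue in Java shows the shape (a bare-bones sketch; java.util.concurrent ships production-quality versions of this):

// A classic monitor: synchronized methods take one lock, and wait/notifyAll
// coordinate producers and consumers around the shared buffer.
public final class BoundedQueue<T> {
    private final Object[] items;
    private int head = 0, tail = 0, count = 0;

    public BoundedQueue(int capacity) {
        items = new Object[capacity];
    }

    public synchronized void put(T item) throws InterruptedException {
        while (count == items.length) {
            wait();                             // block while full
        }
        items[tail] = item;
        tail = (tail + 1) % items.length;
        count++;
        notifyAll();                            // wake any waiting consumers
    }

    @SuppressWarnings("unchecked")
    public synchronized T take() throws InterruptedException {
        while (count == 0) {
            wait();                             // block while empty
        }
        T item = (T) items[head];
        items[head] = null;
        head = (head + 1) % items.length;
        count--;
        notifyAll();                            // wake any waiting producers
        return item;
    }
}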

By the mid-90's, we'd learned about all we could from the structured programming craze, and were well into the thrall of object-oriented programming, with many of us programming in C++. Large teams of programmers became common, and for them, source code control and continuous build systems were essential. Then the web finally arrived on the scene, and a great war between corporations for control of the web platform began.

How did that war end? Well, everyone lost. We got HTML, CSS, and JavaScript. Some people call that a platform. I call it an abomination. But the alternative would have been to have no standard at all, which I'm sure would have pleased some of the combatant corporations, but was never really a viable option.

Now we are in the age of frameworks. Everybody and his dog has a framework, and most of them are very poorly documented. I believe that it is only by virtue of forums like stackoverflow that some of them are usable at all. But many of them are aimed squarely at dealing with the problematic "web platform", and for that we should all be grateful. However, we really need to get some smart people together and design a new web platform, including a reasonable migration path from the current mess. The problem is, as we approach the 20-year anniversary of the web debacle, there is an entire generation of programmers who have never known anything else.

The nature of programming will continue to evolve, but evolution is not a particularly efficient process. If we are indeed intelligent, we should be capable of intelligent design.

Comment: Re:Replusive (Score 1) 505

Really? For a long time the MS JVM was the fastest and most compatible (according to Sun's own verification suite) VM available for any browser on any OS.

MEMORANDUM OF THE UNITED STATES IN SUPPORT OF MOTION FOR PRELIMINARY INJUNCTION

The whole thing makes interesting reading, but just search for Java and JVM to see what I'm talking about. Microsoft was following their "embrace, extend, and extinguish" strategy, which had worked well for them many times before. They had a Java-like language, J++, that was not compatible with Sun's Java.

Sun has to share some of the blame for Java failing to become a browser standard, as they did everything they could to maintain control of Java rather than turn it over to an open standards body. But in fairness they did invent it, and Microsoft had the resources to dominate any open standards effort through sheer numbers of representatives. In contrast, Netscape did turn JavaScript over to Ecma (a very wily choice, I think), and I don't think we'd still be discussing it if they had not.

Comment: Re:Replusive (Score 1) 505

JavaScript thrived because the alternatives were arguably far worse. Java applets were terrible. ActiveX a platform specific disaster. Flash is heavy. JavaScript allowed you to do the very minor things most web developers wanted at the time without having to turn your website into a plugin that disregarded base web technologies.

No, JavaScript survived because it wasn't seen to threaten corporate interests until it was too late to stop it. Java applets failed because Microsoft did everything in their power to kill them. Yes, there were problems with the technology, but early versions of JavaScript offered far less functionality. The only reason people still talk about how bad applets are/were is because they actually got used to solve problems which JavaScript could not. Flash was also used heavily and is still hanging on, though probably not for much longer.

JavaScript is the result of the major players of that time, Microsoft, Sun, Netscape, and Adobe, failing to realize that an open standard would be in their best interests. It is unfortunately the case that corporations tend not to endorse open standards until no alternative is left. This happened with networking: by the time corporations like IBM and DEC got around to collaborating on ISO/OSI as the future replacement for their proprietary network protocols, it was too late. Cisco and many other network companies were already bringing TCP/IP products to market, and ISO/OSI got to the party just a little too late.

So we have TCP/IP with its NAT and its ever-so-drawn-out migration to IPv6, when we could have had something better. And we have JavaScript, which was a cute solution to making live widgets on a web page when it was invented, but like TCP/IP has been pushed far, far beyond what it was designed to do. The inevitable fate of such technologies is to become a victim of their own success. So it will be with JavaScript. I don't know whether that will be because it is replaced by some language that compiles into JavaScript, or whether some new browser runtime such as Dart will do it. Or perhaps the mobile device market will make browsers themselves obsolete, in favor of some mobile O/S.

What will not change is that corporations will continue to try to own the next standard. That will inevitably fail, and the software industry will be the poorer for it.

Submission: Are we on the verge of being able to regrow lost limbs? This scientist thinks so

blastboy writes: Michael Levin is a Russian-American scientist who started life programming Pac-Man clones on his TRS-80 and freezing bugs in his kitchen. In recent years, he's become fascinated by the role electricity plays in the body—and whether it could even help people regrow lost limbs. Guess what? It's starting to pay off: his team at Tufts is on the verge of a major breakthrough in regenerative medicine. It was just announced that this story was awarded the 2013 Institute of Physics prize for science journalism.

Comment: Re:Spiritual Malaise? (Score 1) 385

I'm not going to try to define it, but if you want to measure it, the suicide rate should be well-correlated. I looked for some statistics that would go back to 1964, and didn't immediately find any (though I'm sure they're out there). But the rate has spiked lately, even surpassing auto accident fatalities.
