
Comment Re:Good. (Score 1) 146

I particularly would like to see a resurgence of OS research. Things have changed enough in both the hardware and the application landscape since Windows and Linux were designed that I think it would be worthwhile to revisit the questions of what an OS can and should be. But in my view the real problem for the commercial success and/or widespread deployment of a new OS is not so much legacy applications as it is device driver support for the very broad range of devices that are found across the major hardware platforms. It would not surprise me if the volume of device driver code for Windows and Linux (and maybe even Android and iOS) exceeds that of the rest of the OS. In contrast, support of legacy applications can usually be achieved through a compatibility API or container approach.

Ideally, I think device drivers should be written in a language that supports a level of abstraction that is at least somewhat OS-agnostic. Then, even in cases where the device manufacturer was unwilling to provide source code, an OS developer could provide the manufacturer with a compiler that would generate a binary for their OS. But the marketing obstacles to such an approach probably far outweigh the technical challenges of its implementation.
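
To make the idea a bit more concrete, here is a minimal sketch (in C++, purely for illustration) of what an OS-agnostic driver contract might look like. The names (OsServices, DriverOps) and the choice of a block device are my own inventions, not anything from an existing kernel. The vendor would write against nothing but this contract, and each OS would supply its own binding for the services side.

    // Hypothetical sketch of an OS-agnostic driver contract (illustrative names only).
    #include <cstddef>
    #include <cstdint>

    // Services the host OS must supply; each OS ships its own implementation.
    struct OsServices {
        void* (*map_mmio)(std::uint64_t phys_addr, std::size_t len);  // map device registers
        void  (*log)(const char* msg);                                 // kernel logging
    };

    // What the device vendor implements, using only the contract above.
    class DriverOps {
    public:
        virtual ~DriverOps() = default;
        virtual bool init(OsServices& os) = 0;  // probe and reset the hardware
        virtual int  read(std::uint64_t lba, void* buf, std::size_t blocks) = 0;
        virtual int  write(std::uint64_t lba, const void* buf, std::size_t blocks) = 0;
    };

An OS vendor's toolchain could then compile the manufacturer's DriverOps implementation against its own OsServices, which is roughly the "compiler for their OS" idea above.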

Comment Policy vs. Mechanism (Score 1) 609

As with the design of operating systems, it makes sense to distinguish between policy and mechanism. Science and rationality may or may not be a sufficient basis for creating policies. And here by "policy" I mean things like "people should be equal before the law", "healthcare should be a right", "what's good for Wall St. is good for America", "all citizens should be armed to the teeth", "Mars colonization should be our highest priority". That is, policies are goal statements, and reasonable people can certainly disagree about what our goals should be as a society. Mechanisms are the means we use to achieve our goals, that is, the means by which policies are implemented. So a policy might be: "wealth inequality should be bounded", and mechanisms to achieve it might include "progressive income tax", "subsidies for the poor", or "universal basic income". Given a policy, science and rationality are certainly applicable to designing and evaluating the efficacy of mechanisms to achieve the policy.

Our biggest problem is that most of our political discourse is consumed with debating things we call policies but which are really mechanisms for achieving some policy. The policy itself is seldom explicitly stated and almost never debated, while the participants in a typical political debate take it for granted that everyone accepts whatever implicit policy their proposed mechanism seeks to achieve. Even worse, we implement mechanisms without ever tying them to an explicit policy goal, which makes it difficult to determine whether a given mechanism is working. Politically, you get a situation where anyone who questions a mechanism is assumed to be disagreeing with the unstated policy behind it. The result is that bad mechanisms become entrenched and are no longer subject to rational or scientific examination. And that just sucks.

If I could interject one question into every political debate, it would be: what are you trying to achieve? And if I could have a second question it would be: how will you know if you've achieved it?

Comment Mentored at NASA (Score 1) 515

I had an excellent mentor at NASA at age 16. Learned about high-level languages and algorithms using the CAL language on Tymshare. Learned about how computers actually work by toggling in programs through console switches on a PDP-5. Learned how to code efficiently mostly by reading other people's code. Learned FORTRAN IV from McCracken's book. Read a lot of computer manuals, back when computers came with full documentation sets.

Still working as a developer at age 63, and still love programming. These days I mostly use Scala and Python, and when I must, JavaScript. In my spare time I pursue my passion, which is machine learning, taking online classes and working on Kaggle competitions.

Comment Re:Nothing to worry about? (Score -1) 232

Not just a single disaster. Just as we've seen more frequent and more powerful storms in recent years, so too I expect there will be more frequent and more powerful earthquakes, including the possibility of tsunamis. More and larger volcanic eruptions would not surprise me. I expect most of the action would be around the boundary of the Pacific plate. But things could get interesting on the mid-Atlantic ridge as well.

The consequences for civilization would be devastating. Besides the lives lost in individual incidents, there could be hundreds of millions of refugees. Economic collapse would surely follow, straining our political and social institutions to the breaking point. Much of the infrastructure that supports our technology would also be severely impacted. Major ports could be destroyed by tsunamis or earthquakes, disrupting food supply chains and causing shortages even if food production itself remained intact. Ultimately a sizable chunk of the human population would perish. But unless Earth goes the way of Mars or Venus, humanity will survive and, over time, adapt.

It's useless to debate whether this is a natural cycle or a result of human activity. It would be great if we could limit our carbon emissions, but I fear we're already two or three decades too late. What matters now is trying to understand what is happening, anticipate the possible consequences, and prepare. What we need is something like the International Geophysical Year, except that intensive level of research needs to be sustained for at least a decade.

Comment Nothing to worry about? (Score 2, Interesting) 232

The article says it's nothing to worry about. Well that's what they said to Jor-El, and you know how that turned out.

The shift in mass distribution caused by melting ice will cross the boundaries of tectonic plates, changing the relative pressure on adjacent plates. This will likely lead to increased earthquake and volcanic activity. On the bright side, the ash from the volcanoes may limit global warming (maybe even trigger an ice age), and deformations of the sea floor may reduce sea level rise (or make it worse). Another possible impact is the triggering of a geomagnetic reversal.

The political debates about climate change are futile. What we should be discussing is whether we know enough about how this planet works (and have the technology) to attempt some kind of active intervention, such as carbon sequestration or actually blocking sunlight from space. But we'll probably just end up fighting over whatever habitable parts of the planet remain. Maybe the survivors will be wiser.

Comment Re:Can you work with an image? (Score 1) 364

They dont really give a shit about the data in this case, they want to cow the tech sector into not making their jobs harder.

Maybe they care about the data, but it's likely they have other ways to brute force the passcode. This battle with the tech sector over encryption has been ongoing for more than a decade. What's different about this case is that it is the best opportunity the government has had to use fear of more mass killing to shut down the thinking part of the average person's brain. Their goal is to ensure that they have the keys to decrypt anything encrypted by the general public. (Anybody remember key escrow?)

Anyone with a basic technical understanding of how encryption works knows that there is no way to stop a knowledgeable person from implementing encryption in software and keeping their keys private. So this is really about preventing the average person, who lacks that knowledge, from having unbreakable encryption. It's interesting that the situation with the general public and firearms is similar; in fact, cryptography was once classified as a munition. It seems to me that a liberal interpretation of the Second Amendment might apply to encryption. I point that out especially for those of you who feel entitled to assault weapons under the Second Amendment.

Personally, I think we need to look at personal devices, and perhaps even our use of search engines, as extensions of our minds, and as such they should be treated by the law with the utmost concern for privacy. After all, the technology to actually read minds is advancing, and the day may come when the precedents we set today for our personal devices are applied to our brains.

Comment Valet Parking Everywhere (Score 1) 477

With self-driving cars I expect parking will become like having valet parking everywhere. Think of how guests arrive and leave at a large hotel. There will need to be a reasonably sized area where cars can stop to pick up and drop off passengers and their stuff. Once empty, the cars will go and park themselves in high-density fashion. Your typical Safeway parking lot will need to be reorganized to accommodate this.

There will be an opportunity to reduce the space allotted to parking at many places.

Comment Road-based drones (Score 1) 162

I have serious doubts about the practicality of aerial drones, at least for deliveries to individual consumers. What happens when a drone shows up at your door and the family dog attacks it? There are many other problems which other posters have mentioned.

I would really like to see autonomous, road-based drones developed. A road-based drone could be much smaller, lighter, and cheaper (in mass production) than a car, since it wouldn't need to carry people. Road drones would need the same sorts of sensors as self-driving cars, so if road drones were mass produced, self-driving cars could reap the benefit of cheaper sensors. Road drones also would need software that would be very similar to that required by a self-driving car. But a road drone, being smaller, lighter, and cheaper, would usually cause less damage if it were involved in an accident. So it might serve as a testbed for new software, before it is deployed in self-driving cars.

The trick would be to come up with a form factor that could share the roads with existing traffic. I was thinking of something the size of a bicycle, a Segway, or a very small car. But it just occurred to me that there is a lot of space underneath most cars. Maybe a road drone could just position itself under a car and stay there as long as the car is going its way. Making a transition to another car if a drone's car turns in the wrong direction would be tricky, and perhaps not even possible if all the surrounding cars have their own drones. Ideally you would want the humans driving their cars to be able to completely ignore the drones, as the drones would be smart enough and fast enough to keep themselves from being squished. Of course it would be easier if there were a substantial number of self-driving cars on the road, and the drones and cars communicated and coordinated.

I would love to work on developing something like that.

Comment Learned on a PDP-5 (Score 1) 92

I first learned machine language on a PDP-5, which was similar to the PDP-8 but limited to 4K (12-bit) words of memory. Mostly I just used it to toggle in small programs through the console switches, but I think we got the FOCAL interpreter running on it at one point. Those were the days. To think there is now a generation of programmers who have known nothing but JavaScript.

Comment Re:You want security? Start with the OS. (Score 1) 237

Of course vulnerabilities remain. But when you're deliberately aiming for a secure *system*, they're a lot less impactful. Kinda like how turning ASLR on simply nullifies entire classes of vulnerabilities. MULTICS, according to your paper, didn't have problems with buffer overflows. Thirty years ago, this was a solved problem. Why is it an ongoing problem now?

Because programming languages like C/C++ are still in wide use. I suppose most people who still use these languages would tell you that they must, for reasons of efficiency. Then they would start talking about how their application can't tolerate pauses for garbage collection. But of course you could have a language which supports manual allocation of data types, with a maximum length that is enforced at runtime.
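
As a minimal sketch of that last point (in C++ only because it makes the manual allocation explicit; the class is my own toy example, not a real library), here is a buffer that is allocated and freed by hand, with no garbage collector, but where every access is checked against the declared capacity:

    #include <cstddef>
    #include <stdexcept>

    // Toy example: manually managed storage whose length is enforced at runtime,
    // so an overrun fails loudly instead of silently corrupting memory.
    class BoundedBuffer {
    public:
        explicit BoundedBuffer(std::size_t capacity)
            : data_(new char[capacity]), capacity_(capacity) {}
        ~BoundedBuffer() { delete[] data_; }            // manual deallocation, no GC pause

        BoundedBuffer(const BoundedBuffer&) = delete;   // keep ownership simple
        BoundedBuffer& operator=(const BoundedBuffer&) = delete;

        char& at(std::size_t i) {
            if (i >= capacity_) throw std::out_of_range("buffer overrun");
            return data_[i];
        }
        std::size_t capacity() const { return capacity_; }

    private:
        char* data_;
        std::size_t capacity_;
    };

Nothing here requires garbage collection; the cost is one comparison per access, which is a trade most applications can afford.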

I've addressed why I think software engineering hasn't progressed more in a previous post. The argument I make there about hobbyists designing languages and the lack of industry support for standardization also applies to software security. But the problem of security is open-ended. We could have better languages which prevent all kinds of abuses of the hardware-level machine model, languages in which buffer overflows and stack overflow exploits are impossible. But then someone writes a program that builds a SQL query in a string, doesn't take the necessary precautions, and you have SQL injection. Now the SQL interpreter is in some sense another level of virtual machine which needs to be protected from abuse. It's not hard to do that if your program creates SQL queries using a data structure that supports a higher level of abstraction than strings. But if a SQL client library is provided, and it takes a SQL query as a string, building a safer level of abstraction on top of that probably isn't going to occur to most programmers. Nor will they necessarily take the time to discover that someone else has implemented a higher-level interface. Strings are what they know, and strings are what they'll use.
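
For concreteness, here is what "a higher level of abstraction than strings" looks like in practice with prepared statements, sketched against SQLite's C API (the users/name/email schema is made up for the example):

    #include <sqlite3.h>
    #include <cstdio>
    #include <string>

    // The SQL text is fixed at compile time; untrusted input is bound as a value,
    // so it can never be reinterpreted as SQL.
    int print_user_email(sqlite3* db, const std::string& name_from_user) {
        sqlite3_stmt* stmt = nullptr;
        const char* sql = "SELECT email FROM users WHERE name = ?;";
        if (sqlite3_prepare_v2(db, sql, -1, &stmt, nullptr) != SQLITE_OK) return -1;

        sqlite3_bind_text(stmt, 1, name_from_user.c_str(), -1, SQLITE_TRANSIENT);

        while (sqlite3_step(stmt) == SQLITE_ROW) {
            std::printf("%s\n", reinterpret_cast<const char*>(sqlite3_column_text(stmt, 0)));
        }
        sqlite3_finalize(stmt);
        return 0;
    }

Compare that to sprintf'ing the name into the query text: the prepared-statement version is no harder to write, but only if the programmer knows to look for it.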

Likewise, when JavaScript was added to the browser, it created a whole new world of potential security vulnerabilities. Brendan Eich may not have been a hobbyist in the strictest sense, since he was being paid. But he was working under a very tight schedule which prioritized functionality over security. And I suspect even he would admit to having made a number of rookie mistakes in language design. More important, though, is that some kind of client-side scripting of HTML was virtually inevitable. There is an inexorable force toward adding more functionality to successful software. And whenever new abstractions are created, or new interpreters built, there is the potential for new kinds of security vulnerabilities which can exist regardless of how secure the underlying infrastructure may be.

This is not to say that I believe secure software is impossible. But it is a moving target that can't be addressed simply by instilling in programmers a comprehensive list of secure programming DOs and DON'Ts. Programmers really need to be able to recognize when their code may be creating new kinds of security vulnerabilities.

Comment Re:You want security? Start with the OS. (Score 1) 237

MULTICS eh? Here's an interesting paper looking back on MULTICS security:

Thirty Years Later: Lessons from the Multics Security Evaluation

In spite of the fact that security was a top priority for MULTICS, in spite of the fact that it was written in PL/I rather than C, in spite of the fact that it was a very small, less complex system by today's standards, in spite of the fact that it was more secure than most modern systems, MULTICS was easily penetrated during the security evaluation. So I maintain my original position that writing secure software is hard. So hard that even when people are diligently trying to write secure software, vulnerabilities remain.

MIT's ITS went the other way: its name, the Incompatible Timesharing System, was a jab at CTSS (the predecessor of MULTICS), and it was deliberately insecure. It even had a non-privileged "crash" command to crash the system, and logins were optional.

Comment You want security? Start with the OS. (Score 1) 237

From the linked article:

"Overall, Brian Gorenc, manager of vulnerability research for HP Security Research, said that one of the surprises at the Pwn2Own 2015 event was the amount of Windows kernel vulnerabilities that showed up, though he noted that HP, in a way, expected it."

Although many exploits may be against vulnerabilities in the browser code, I have to wonder how we can expect a browser implementer to write secure code if kernel implementers can't. In my view the basic problem is that the goal for security in almost all software is that it be just "good enough", where "good enough" is a bar which is raised as a piece of code gets a reputation for being insecure. With the possible exceptions of governments and a few financial service companies, no one really wants to pay the cost of ensuring that software is secure. So we make a game of it, with prizes, like Pwn2Own, in an attempt to amortize the cost.

Why is it so hard to write secure software? Well, programming languages are an easy target. Most OS's and browsers are written in C/C++, which we know provide many, many ways to accidentally undermine security. But the complexity of the software also is a factor, and that is fed by the desire to continually add functionality over time. The evolution of the web is a textbook example of this. Rather than being satisfied with a nice, relatively secure textual browser, we had to turn it into an application platform. We used to have different application protocols for different applications. Now everything goes over HTTP, and so HTTP becomes ever more complicated. What we ought to have is a browser application that only browses and renders hypertext, with no JavaScript and a very restricted plugin API. Then if you want to launch an application over the web, make it a separate application and use a URL that starts with something other than "http[s]:". Personally I think it would be cool just to have a URL that opens a VM in a tab and boots it up from a secure OS image.

You would think that the evolution of smart mobile devices would provide the opportunity to not repeat the mistakes of the past. And it is true that mobile devices have security managers which provide some granularity to the rights that an application can be granted. But my experience has been that apps that I install on a mobile device often require more rights than it seems they should. In practice the decision I make when I install an app is, "Do I trust this app or not?" And I either grant it all the rights it wants or not. (Actually what I really think is that mobile devices are not really secure, that the security manager is effectively "security theater", so I don't put anything on a mobile device that I wouldn't want the world to see.)

Comment Tonight's Word: pwned (Score 2) 628

An economy is a mechanism for regulating human (so far) behavior. If you're an economist, an economy is a means of regulating production and consumption, usually with a goal of achieving some kind of balance. But a computer scientist might view the mechanism itself as a (usually) distributed algorithm. The salient points are how data enters the system and how it gets processed as it moves through the system. Capitalism, for example, uses a distributed data structure we call "prices" to represent the state of supply vs. demand. Because the data is distributed, all the familiar problems of concurrent, distributed systems have to be addressed in some way.

However, just as software is typically built in layers, from firmware, to operating systems, to frameworks, to applications, once you have an economy, it is irresistible to build more complexity on top of it. So we use our economy to regulate human behavior in ways other than production and consumption, through the use of taxes, fines, and additional rules on what can be bought and sold, and who can work at what jobs.

The goal, as always, is to control human behavior. There are a few things that set humans apart from other species, but one of the most under-recognized is our instinct to control things, including other humans. This is built into our DNA and is surely a big factor in our successful proliferation as a species. And it is something that the coming of the machine age will not change over anything less than evolutionary time scales, unless human nature itself is re-engineered.

But what does change as information and telecommunication technologies advance is the rate at which a system like an economy can process data, and the scale at which it can do it. The global economy is already almost completely integrated, and is becoming increasingly tightly coupled. And yet, humans are unceasing in their desire to control it, and to use it to control other humans.

What happens to people who can't find jobs? Some people say a basic income is the solution. But: pwned by the government. What is already happening? People living on credit cards. But: pwned by the banks. People going to school to qualify for better jobs. But: pwned by student loan debt. Is it even possible to have a society where most people aren't pwned? Could being pwned by a machine be any worse?

And that's tonight's word.
(You will be missed, sir.)

Comment Re:Software fails the test of time (Score 1) 370

As someone with 45+ years of software experience

44+ years here. Old-timers represent!

I can personally verify that software development has not improved significantly over the last 25 years or so.

I can relate to where you're coming from with that statement, but to me it seems more like a lot of backwardness is obscuring the forward progress that has been made. Programming languages, in spite of the horrors of PHP and JavaScript that you mention, are becoming more powerful. Just being able to program most things in a language that has garbage collection is progress in my book. Give me Python or Scala over Fortran or Pascal any day. I also think that the modern emphasis on test-driven development and tool chains is a step forward. On the other hand, I find client-side web development to be a simply appalling mess.

I can't be certain, but I strongly believe that one of the reason for the lack of progress is that there are not a lot of old programmers still in the profession.

I don't think that's it at all. It seems to me that the problem is two-fold. First, academia has lost either interest or influence or both in the area of software engineering. I remember when research in novel operating systems and programming languages was abundant. Now, instead of professors and Ph.D. students who have taken some time to study prior work, hobbyists are the ones developing new programming languages and operating systems. The problem is not so much with the hobbyists themselves - some of them are extremely capable - but rather that the initial work is supported only by the personal enthusiasm of the hobbyist.

Which brings me to the second part of the problem: the software industry seems to have lost all interest in funding R&D to improve software engineering tools. There used to be a healthy segment of the software market involved in making tools for software developers. And that's probably because companies were willing to pay to buy those tools for their developers. Now we just use the free versions. Or wait for a hobbyist to save us.

Industry has also failed in the area of making software standards. Standards bodies have become just another field of corporate battle, where companies seek to either control developing standards or kill them. Software patents are part of that problem. But the short-sightedness of companies in understanding the long-term value of standards is the more fundamental problem.
