Comment Probably not much different to today (Score 1) 279

I mean seriously, there is nothing paradigm-shifting on the horizon. Yes, people may now use graphical tools to write GUI applications, but they have done so since Visual Basic and Delphi, both of which were originally released in the first half of the 1990s.

There will always be specialized systems for special applications, but none of those are suitable for general consumption.

Comment The manufacturers of those devices should be... (Score 1) 108

... required to pay for all of the damages caused by their stupidity.

Seriously, this could only work if you connected medical devices (incompetently) to a network, and only if you ran them on some completely overcomplex operating system with far more features than you need.

Comment Some projects may actually have too much money (Score 1) 68

And Mozilla is probably one of the best examples. They used to make a browser; now they implement every misfeature they can find, from DRM and HTTP/2 to binary JavaScript.
Instead of saying, "We want a simpler web", they just pile on layer after layer of complexity, making it harder for competitors to write their own browsers.

Of course they also do great things like investing in codec research; however, they increasingly behave like any other big company.

Comment "As an industry we know how to scale up software" (Score 1) 146

No, you don't. The industry certainly knows how to solve trivial problems in hugely overcomplex ways, but those mountains of code quickly become unmaintainable. One of the prime examples is Android, which tries to solve the near-trivial problem of "application launcher" with so much code that entering a too-long password can trigger an application crash which unlocks your phone.

Comment Opinion of the constitutional court of Germany (Score 1) 103

I'm sorry, but please follow the current state of the discussion, which is probably best summarized by the opinion of the Federal Constitutional Court of Germany.

Essentially they found that it's rather irrelevant how secure a system is; what matters is that fraud is easy to detect. And by "easy" they mean that a layperson without any special knowledge can, beyond any doubt, find out when fraud has occurred.

The typical well-designed system is the hand-marked paper ballot. The technique for checking for fraud is trivial: you look into the ballot box before the election to make sure it's empty, you make sure everybody throws just one ballot into the box, you make sure that in the end the number of ballots equals the number of people who voted, and then you make sure everything is counted correctly. The last part is hard to watch, but since the ballots are stored you can always have a recount.
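The arithmetic behind those checks is simple enough to sketch in a few lines. This is purely illustrative; the function and field names are made up, not taken from any real election procedure.

```python
# Sketch of the consistency checks observers perform on a hand-marked
# paper ballot election. All names here are illustrative.

from collections import Counter

def verify_election(ballots, voters_marked_off, reported_tally):
    """Return True if the ballot-box arithmetic is consistent.

    ballots           -- list of candidate names taken from the box
    voters_marked_off -- number of voters recorded as having voted
    reported_tally    -- dict mapping candidate -> announced vote count
    """
    # One ballot per voter: box contents must match the voter roll.
    if len(ballots) != voters_marked_off:
        return False
    # A recount must reproduce the announced result exactly.
    return Counter(ballots) == Counter(reported_tally)

# Three voters, three ballots, tally matches the recount.
print(verify_election(["A", "A", "B"], 3, {"A": 2, "B": 1}))  # True
# A fourth voter was marked off but only three ballots are in the box.
print(verify_election(["A", "A", "B"], 4, {"A": 2, "B": 1}))  # False
```

The point of the comment holds: every step here is something a layperson can follow by counting.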

Compare that to those mathematical systems which, even if you understand the math, require you to verify what the computers are actually doing. So essentially you need a deep forensic analysis of the voting computer, checking everything from the firmware down to the individual dies of the chips.

Other areas, such as banking, have it easier: there you can simply keep audit logs of everything and check against them. This cannot be done with elections because of voter privacy, which is highly important in itself.
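One common way such log checking can work, sketched here as an assumption rather than any specific banking system's design, is a hash chain: each entry commits to the hash of the previous one, so altering or deleting any record breaks the chain.

```python
# Minimal hash-chained audit log. Each entry stores the hash of the
# previous entry together with its own record, so tampering anywhere
# in the history is detectable. Real systems would add signatures and
# timestamps; this is only a sketch.

import hashlib

def entry_hash(prev_hash, record):
    return hashlib.sha256((prev_hash + record).encode()).hexdigest()

def build_log(records):
    log, prev = [], ""
    for rec in records:
        h = entry_hash(prev, rec)
        log.append((rec, h))
        prev = h
    return log

def verify_log(log):
    prev = ""
    for rec, h in log:
        if entry_hash(prev, rec) != h:
            return False
        prev = h
    return True

log = build_log(["deposit 100", "withdraw 30"])
print(verify_log(log))                # True
log[0] = ("deposit 999", log[0][1])   # tamper with the first record
print(verify_log(log))                # False
```

Note that exactly this technique is unusable for elections: the log would link each vote to a traceable history, which is what voter privacy forbids.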

Comment It can't be done (Score 1) 80

Any form of DRM on a simple system like the NES could be circumvented rather quickly. And since the primary purpose of Netflix is to promote DRM, they won't drop DRM for it.

Without DRM it would obviously be rather simple: just add a network card and copy raw frames from it to the graphics chip. That's a no-brainer.

Comment X11 has lots of things to be improved... (Score 1) 375

...but you _can_ make secure screen lockers on it; you just need to use raw X11 rather than bloated frameworks. It's been done for years.

There is nothing wrong with considering a replacement for X11, but the current crowd of desktop developers probably won't make it much better. Instead of learning from modern operating systems like Plan 9 and using language-neutral, file-system-based interfaces, systems like Wayland are still stuck in the past, requiring dynamically linked libraries as API interfaces.

Comment Lazarus (Score 2) 492

Unfortunately the state of desktop applications is now so bad that Lazarus is pretty much the only alternative left, particularly if you want to distribute your software as a binary. .NET requires the user to install a huge and fragile framework. Java does the same and even adds an insecure browser plugin. In both cases your code will need an installation routine. And even then, Lazarus can compile for more platforms than Java and .NET support.

With Lazarus you get a statically linked binary you can just plop onto your system and execute, so upgrading and downgrading your application is trivial.

Plus you get things like bounds checking simply by setting a compiler option (the {$R+} range-checking directive in Free Pascal). In my tests it didn't hurt performance, probably because the compiler can easily work out when the checks are needed and when not. And as far as I know, you can enable and disable it per line.

Comment Webservice or Lazarus (Score 1) 264

Of course today you would do such things via a web service, but if you prefer actual desktop applications you can use Lazarus, which is a Delphi clone. Delphi's database components are geared towards easily building fancy GUI applications with database connectivity; it's more or less point and click.

Plus, unlike .NET or Java, you can run this on multiple platforms just by recompiling, and on every platform you get a (mostly) statically linked binary.

Comment It's more like slithering along the ground (Score 1) 598

I mean classic MacOS, for example, didn't have any kind of memory separation. Applications had statically assigned memory, but they were free to write to each other's memory. That's one of the reasons why MacOS was nearly unusable for any web browsing around versions 6 and 7. In fact, back then it emulated 68k code on the PowerPC platform.

Then came MacOSX, which took an ancient version of some BSD and removed all the good bits, replacing them with proprietary stuff. Even MacOSX 10.3 was hardly usable: it worked for a while, but after a week of uptime it became increasingly sluggish.

Software quality was never particularly good at Apple. They always just competed with Microsoft, not against any meaningful quality standard.

The same goes for hardware. Logic board failures were common during "evil Steve's" reign. Macs simply became much more fragile than the industry standard: batteries were glued in, hard disks were really hard to replace, and even things like the Apple AirPort had design flaws leading to mass breakdowns.

I guess the reason this now looks like a sudden decrease in quality is that the "reality distortion field" is gone. Apple is no longer the underdog that invests a significant amount of its money into engineering; particularly since "evil Steve", it has been a marketing-driven company.

Comment He's not quite found the problem yet (Score 1) 252

It seems like he's still in the "I'm not satisfied" phase of solving the problem; unfortunately, it's unclear whether he'll ever reach the "I've understood why I'm not satisfied" phase.

Simply put, in order to derive any meaningful use out of those systems you need to be able to program them. And for that, they need interfaces that are as simple as possible. If I have to learn some complex programming language like Java, I'm not going to bother.

It needs to be something simple, like sending "show status" over a socket to the device and having it return its current status in a simple format that is neither XML nor JSON. And devices should be able to emulate multiple protocols, so people can choose the simplest one with the functionality they need, or the one they are most familiar with.
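A protocol that simple fits in a few lines. The "show status" command and the reply fields below are invented for illustration; a socket pair stands in for the network connection to a real device.

```python
# Sketch of the plain-text device protocol described above. The command
# name and the status fields are made up for illustration only.

import socket

def handle(command):
    # Device side: answer a line-oriented text command.
    if command.strip() == "show status":
        return "status: ok\ntemperature: 21\n"
    return "error: unknown command\n"

def query(command):
    # Client side: send a command over a socket pair (standing in for
    # a network connection) and read back the plain-text reply.
    client, device = socket.socketpair()
    try:
        client.sendall((command + "\n").encode())
        reply = handle(device.recv(1024).decode())
        device.sendall(reply.encode())
        return client.recv(1024).decode()
    finally:
        client.close()
        device.close()

print(query("show status"))
# status: ok
# temperature: 21
```

The appeal is exactly what the comment argues: you can drive this with netcat or five lines of any language, no framework or schema required.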
