Comment Re:Hmmm (Score 1) 520

Ports 49152 through 65535 are the dynamic/private range, the ones typically handed out by Network Address Translation. If you assume every customer has ~8 computers and each IPv4 address is shared among 32 customers, that yields a window of 64 ports per machine on average.
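
A quick sanity check of that arithmetic (the per-customer and per-machine figures are of course assumptions):

```python
# Back-of-the-envelope check of the port math above (figures assumed).
ports = 65535 - 49152 + 1                 # dynamic/private range: 16384 ports
customers_per_ip = 32                     # customers sharing one IPv4 address
computers_per_customer = 8
per_machine = ports // (customers_per_ip * computers_per_customer)
print(per_machine)                        # -> 64 concurrent ports per machine
```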

It is unlikely they will all be browsing complicated "web applications" at the same time. If they want to do something like BitTorrent, they just have to fire up IPv6, assuming the ISP routes the packets.

Comment Re:Hmmm (Score 1) 520

I think the opposite may happen: people stuck with an IPv4-only ISP will be considered to have "second class" Internet.

Since consumers are not really allowed to host servers anyway, NAT can be implemented at the ISP level sharing each IPv4 address among 254 customers. If they want to host a "server" or otherwise directly connect to their machine, they have to use IPv6. With IPv6, your ISP is obligated to give you at least a /64 block. You can have as many machines as you want with a public IPv6 address.
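
For scale, here is a minimal illustration of how large a /64 is (simple arithmetic, not specific to any ISP):

```python
# A /64 leaves 64 bits for the host part of each address.
hosts_in_a_slash_64 = 2 ** 64
print(hosts_in_a_slash_64)  # 18446744073709551616 addresses per customer
```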

I just realized that 250:1 may be a little tight. Maybe 32:1 is more appropriate for ISP-level NAT.

Comment Re:Hmmm (Score 1) 520

They can probably share 4 IPv4 addresses among 1000 IPv6 hosts (based on NAT typically supporting about 254 hosts per address). All that is needed is Network Address Translation: instead of just translating ephemeral port numbers, each port number/IPv4 address pair can correspond to an IPv6 address.
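
A minimal sketch of that translation idea, with a hypothetical four-address pool (the names and data structures are illustrative, not any real NAT implementation):

```python
import itertools

PUBLIC_V4 = ["192.0.2.1", "192.0.2.2", "192.0.2.3", "192.0.2.4"]
_ports = {ip: itertools.count(49152) for ip in PUBLIC_V4}   # next free port per IP
_next_ip = itertools.cycle(PUBLIC_V4)                       # spread flows over the pool
nat_table: dict[tuple[str, int], tuple[str, int]] = {}      # (v6, port) -> (v4, port)

def map_flow(v6_addr: str, v6_port: int) -> tuple[str, int]:
    """Assign (or reuse) a public IPv4 address/port pair for an IPv6 flow."""
    key = (v6_addr, v6_port)
    if key not in nat_table:
        ip = next(_next_ip)
        nat_table[key] = (ip, next(_ports[ip]))  # port exhaustion ignored in this toy
    return nat_table[key]

print(map_flow("2001:db8::1", 40000))   # e.g. ('192.0.2.1', 49152)
print(map_flow("2001:db8::2", 40000))   # e.g. ('192.0.2.2', 49152)
```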

The common objection to NAT no longer applies: You can host servers behind the NATing router because other IPv6 hosts will be able to connect directly.

You only need IPv4 connectivity for legacy software and services like certain games and their DRM servers.

Comment Re:Careful what you say! (Score 1) 385

I think my last Debian install did exactly that: the installer scaled how many questions it asked to how much you interacted with it. If you start by answering every question and then go away for 2 hours (I have slow computers), it waits for input for a while, then continues the installation once a time-out is reached.
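
A minimal sketch of a prompt that behaves like that, falling back to a default once the timeout expires (POSIX-only, and the prompt and default are invented, not Debian's actual installer code):

```python
import select
import sys

def ask(question: str, default: str, timeout_s: float = 30.0) -> str:
    """Prompt for input, but continue with the default after a timeout."""
    print(f"{question} [{default}]: ", end="", flush=True)
    ready, _, _ = select.select([sys.stdin], [], [], timeout_s)  # POSIX-only
    if ready:
        answer = sys.stdin.readline().strip()
        return answer or default
    print()  # timed out; carry on unattended with the default
    return default

hostname = ask("System hostname", "debian", timeout_s=30.0)
```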

Comment Re:I don't hate computers (Score 1) 385

What pisses me off are the poorly thought-out attempts to "simplify" things for users. People advocating "walled gardens" miss the point. It is not that "general purpose" computers are inherently hard to use; it is that software is badly designed, and it probably won't improve for several generations. I think *all* software needs to be formally proven correct, that is, shown to do what its documentation says it does.

Often things are "simplified" by hiding them in a script that the user does not understand. This adds extra points of failure and fails to teach users how to configure their software.

For some stupid reason, just about every file format (including HTML, PDF, Word documents, PostScript, and WMF) supports some kind of automatic scripting. In the short run, it simplifies things for the help-desk person: "Just open this file and everything will be fixed." The problem is that such capabilities take control away from the user. The next file they open may break things instead of fixing them. Users become afraid to open e-mail attachments even though there is no reason attachments should be able to harm the computer in the first place.

Users relying on scripts and wizards also don't learn how to configure their software. They never get exposed to the (sometimes daunting) documentation, so they can't learn to fix problems for themselves even if they wanted to.

Comment Re:How is this news? (Score 1) 201

I _would_ say that there seem to be some developers who actually think "perfect" software is both somehow possible and wanted by people.

I think "perfect" software is possible, but won't be demanded until computer technology matures to the point the society is fundamentally changed. With the wheel and printing press this took hundreds of years. I don't think computer technology is going to be any different. Back before the Dot-Com bubble, people were assuming that computers/the Internet had somehow changed the laws of economics. The crash proved those people wrong.

Software lives in the mathematical domain. It is possible to prove that software is correct, that is, that it implements its specification. As layers and layers of abstraction get added, software becomes harder to prove correct. As computers stop getting faster because we run into the laws of physics, it will make more and more economic sense to avoid any abstraction (library or hardware) that is not proven correct. Proving (and correcting) current popular operating systems would take decades. That is why I feel my estimate of several hundred years for the computer industry to mature is not too far off.
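
As a toy illustration of what "proving software correct" means, here is a minimal sketch in Lean 4 (assuming a recent toolchain with the omega tactic; the function and its specification are invented for brevity):

```lean
-- A trivial program and a machine-checked proof that it meets its spec.
def double (n : Nat) : Nat := n + n

-- Specification: `double n` equals `2 * n` for every possible input,
-- not just the inputs a test suite happens to try.
theorem double_meets_spec (n : Nat) : double n = 2 * n := by
  unfold double
  omega  -- decision procedure for linear arithmetic closes the goal
```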

Digital Rights Management is a special case: any software (or hardware) implementing DRM cannot be proven correct; it is designed to fail. Not only may the experience of "legitimate" users be degraded, but it is impossible to prove the software will fail (or succeed, in the case of cheat tracking) in the face of an attacker. Even if you could come up with "perfect" DRM, there would be errors in the specification. The Rule of Law is a complicated system of checks and balances that would be short-circuited by any automated process.

Comment Re:Gay rights are civil rights. (Score 1) 348

In many places that is the case, except that for convenience you can usually sign the civil marriage papers at the same time. If the particular holy man isn't properly registered to handle the civil part, or the religious ceremony is incompatible with the legal requirements, then you have to do them separately.

Comment Re:Crappy frameworks, tools and web standards (Score 1) 623

The HTTP protocol is not designed for low latency. It is a stateless protocol that supports things like automatic content negotiation. Combined with HTML, it is designed for sharing static, structured, multimedia documents. Protocols like SSH and telnet keep a TCP connection open, so a single character update only costs about 40 bytes of header overhead. I have seen websites do per-character updates over HTTP (usually search engines offering completion suggestions).
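
A rough illustration of that overhead gap (illustrative numbers, not a benchmark; the request shown is hypothetical):

```python
# Cost of shipping one character over an already-open TCP connection
# (as SSH or telnet do) versus a fresh HTTP request for the same character.
tcp_overhead = 20 + 20  # IPv4 header + TCP header, no options

# A minimal HTTP/1.1 request, headers only (hypothetical endpoint):
http_request = (
    "GET /suggest?q=a HTTP/1.1\r\n"
    "Host: example.com\r\n"
    "Connection: keep-alive\r\n"
    "\r\n"
)
print(f"open-connection update: ~{tcp_overhead + 1} bytes on the wire")
print(f"HTTP request alone: {len(http_request)} bytes, before any response")
```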

Given how "Rich content" is used, I'm not sure how much benefit you get from doing validation client-side. On a fast machine the client-side verification will be slightly faster (than a round-trip to the server), but the server must still validate the information because the client is untrusted. The problem for the client is that most of the time (Short of a mutually authenticated HTTPS connection) the server is untrusted as well. If you can tell the browser to take only certain forms of input (or generate unwanted pop-up windows), you are taking control of the browser away from the user. This is a problem because the browser is typically interacting with several websites at the same time.

As I hinted with the Lynx comment, I think client-side scripting should be avoided. I actually liked the concept of Java applets: there is a clear delimitation between "static web pages" and executable code kept in an explicit sandbox. Opening an X window (labeled by the originating host; I don't think the X protocol itself is sandboxed) or a VT100 terminal window (labeled by host) is also a clear delimitation. Your post asking for a new GUI/CRUD interpreter (after cluing in that CRUD is an acronym) and your mention of client-side validation reminded me of the Network extensible Window System, which was based on the PostScript programming language.

I think we should stop assuming that users are too stupid to figure out the difference between their browser and the operating system. If the average user is confused, it may be because Microsoft tied Internet Explorer (with the bad idea that is ActiveX) to the OS back in 1998. We should make it easy for users to make rational decisions about their computer. That means showing a clear distinction between software and data; keeping a clear distinction between software running on the local machine and on a remote machine follows from this.

I know I am fighting an uphill battle. Last time I checked, every file format I looked at (including plain text (shell scripts) and HTML (JavaScript)) was able to embed code. We have Microsoft employees suggesting taxes to help clean up infected machines, because everyone "knows" computers require constant maintenance. We have people suggesting that locked-down systems like game consoles and iPads are good things because the average user cannot configure a computer. I say that is because the software the average user runs is badly designed, not because computers are inherently hard to use. For example, there should be no way to get your computer "infected" by opening an e-mail attachment. The "common sense" advice to avoid opening e-mail from people you don't know isn't common sense at all. (In real life, junk mail goes in the trash; letter bombs are rare.)

Comment Re:Mechanical linkages != automatically safer (Score 1) 345

The problem here is not that mechanical linkages are automatically safer (they aren't), but that simpler systems are much easier to make safe. Toyota is doing things like adaptive shift logic, cruise control, and traction control, and smoothing out the load shock of accessories like the air conditioner. Having this many inputs makes verification difficult. On top of that, they replace a single linkage with a system that reads the pedal position, sends it to an ECM that does lots of other things as well, which then drives a servo motor that moves a mechanical air-restriction valve; the resulting air flow is read by a MAF sensor, which determines fuel based on RPM, and finally the right amount of fuel is injected.
My Diesel has an electronic throttle that is much simpler: it reads the throttle and, from RPM plus boost, determines injection. Being a manual, I have a mechanical override right there as well; its simplicity should make it much easier to get right.
All this said, the ability to use multiple pickups with an electronic throttle, and thus flag faults without any "oh shit" surprises like you describe, is a big advantage of electronics done right (see the sketch below). I have worked on vehicles with electronic throttles (on Diesels) for the past 13 years. They can be replaced in a minute with little skill, they self-tune, and they keep a log of what went wrong. On the systems I work on, you can click a few buttons on a display to show the actual readings and verify the whole system without leaving the driver's seat. The pedals can also be moved to the best position in the cab without a linkage redesign, or any concern about what stress goes into which linkage, heat, frozen water penetration, and so on.
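
A sketch of how those multiple pickups catch faults (the threshold and limp-home behavior are invented for illustration, not any real ECM's logic):

```python
DISAGREEMENT_LIMIT = 0.05  # fraction of full pedal travel; assumed tolerance

def log_fault(msg: str, *readings: float) -> None:
    # A real ECM stores a fault code for the technician; print stands in here.
    print(f"FAULT: {msg}: {readings}")

def throttle_command(sensor_a: float, sensor_b: float) -> float:
    """Redundant pedal pickups: on disagreement, log a fault and limp home."""
    if abs(sensor_a - sensor_b) > DISAGREEMENT_LIMIT:
        log_fault("pedal position sensors disagree", sensor_a, sensor_b)
        return 0.1  # fixed low throttle instead of guessing at full power
    return (sensor_a + sensor_b) / 2  # agreement: average the two readings

print(throttle_command(0.40, 0.42))  # normal operation: ~0.41
print(throttle_command(0.40, 0.90))  # sensor fault: limp-home value
```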

Comment Re:Down or DDoS? (Score 3, Insightful) 634

...which would have cost them more than the game will earn in profits.

I doubt it, but it is still a fatal flaw, among many. The game only lasts as long as the servers are up and active, and the servers stay up only as long as the game is still making a profit. The profitable window for games is not very long. So the game is fucked by design. Long live stupid DRM. Every pissed-off user is another nail in the coffin.

Comment Re:No sympathy (Score 1) 634

> I might even have failed to notice the small print which said that an Internet connection was needed in order to play it. I certainly wouldn't have expected that to be a requirement.

Without knowing how obvious the technology makes it, I wonder if some people didn't even realise it had this sort of DRM until the servers went down...

Comment Re:Luddites (Score 1) 171

I agree that there are things that can be disproved about religion, but there are few of those. And even those proofs rely on postulates such as the law of non-contradiction (which, though it completely boggles my mind, some religions believe is false).

Furthermore, circumstantial evidence should not be discounted as worthless. Evidence that suggests a religion is wrong, such as evidence of "miraculous" healings, or evidence that the Bible stole ideas from other religions that it declares false, is worth considering even if it is not final.

However, some of the things you call proof do not even come close. "We can show that a large portion of the Bible is also immoral," for instance. Even within our own culture there are far, far too many theories of morality for anything to be proved. Even circumstantial evidence is scarce here (the fact that our culture tends toward one way of thinking about ethics is not evidence; that is the naturalistic fallacy). Most contradictions in the Bible are found by people looking for them, and not by people who are not looking for them (or who are looking for them not to be there), similar to how cell-phone-company-funded researchers didn't find evidence of cell phones causing harm. I also know of no evidence that the Bible rips off older religions (other than Judaism, which the Bible acknowledges) that is so clear someone could legitimately call it proof. Mithraism was once used as such an example, but it is no longer considered to mirror Christianity.

Similarly, some "radiation sensitive" people have been proven to be faking it is proof that some radiation sensitive people are faking it. Given that some research indicates a certain level of harm due to non-ionizing radiation suggests that there is a potential for such a reaction, beyond individual people, the idea of "radiation sensitivity" is quite far from being disproved.
