
Comment Carrier comparison (Score 1) 92

Many who comment here will have a reason that they chose one carrier over another. They may have switched carriers. I always found that whichever carrier had most recently refreshed its plans looked better than the competition, and that the lead would go back and forth or be too confusing to come up with one clear answer. I actually have iPhones and Android phones on 5 carriers, and I travel the world quite a bit. Domestically, all the carriers are good for most people unless you live in an area some of them don't cover. I remember times when Verizon was faster, but now it seems that AT&T is faster for me most of the time. I remember when you could buy international data from Verizon that covered 200 countries, while the AT&T list was only about 50 countries. That affected me in places like Russia and South Africa back then. T-Mobile has incredible data plans for here and away, but they don't seem as fast as claimed unless I'm in the store. Sprint has gone far out of their way to help me with issues, including a stolen phone number.

Right now I believe that the best carrier I have, for my own needs, is Google Project Fi, because the plan works in over 100 countries. You can even order a data-only SIM for free, without even a shipping charge, to use on iPads and the like. I would never say that anyone's choice of plan is bad in any way, though.

Comment Re:Good bye jungles? (Score 1) 73

I always think of Douglas Adams calling the mosquito "Nature's Viet Cong" for making it much more expensive to cut down all the rain forests. I hope the laudable goal of saving people doesn't have the negative side effect of accelerated jungle destruction.

Note that a malaria vaccine means we don't have to cut down the rain forests to protect people from malaria. So, this particular way of saving people can also save the rain forest....

Comment Re:Logic and Reason, or lack thereof (Score 1) 196

The problem with people like you who belittle the Constitution as written, and who belittle people who believe that it was intended as written, is that you ignore all of the history that goes with the Constitution.

The problem with people like you who worship the Constitution as written is that you ignore all the history that goes with the Constitution - and make up shit from whole cloth to support your nutjob notions. You're no different from the airheads who believe that Nostradamus could see the future and constantly 'discover' evidence to support it.

Comment Re: Overpopulation in Africa, the Middle East, Ind (Score 1) 259

Nuclear isn't viable. Nobody has figured out how to deal with the waste.

Umm, no.

We know perfectly well how to deal with the waste. Alas, the anti-nuclear types have fought for 50 years now to keep us from doing anything with the waste other than putting it into storage ponds.

Which is insanely stupid, since nuclear fuel is poisoned by its own wastes long before the fissionables are actually used up in the reactor. So there's a LOT of potentially usable nuclear fuel sitting in those storage ponds. Hell, we'd hardly have to mine uranium for a century or so if we actually reprocessed that "spent" fuel....

And that's without even considering breeder reactors, which turn all that U238 that we've mined (and which is basically useless as fuel) into usable fissionables....
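For anyone wondering how breeding actually works, the chain is standard textbook nuclear physics (nothing specific to any one reactor design): a U238 nucleus captures a neutron and then beta-decays twice into fissile plutonium.

    ${}^{238}\mathrm{U} + n \rightarrow {}^{239}\mathrm{U} \xrightarrow{\beta^-,\ \sim 23\ \mathrm{min}} {}^{239}\mathrm{Np} \xrightarrow{\beta^-,\ \sim 2.4\ \mathrm{d}} {}^{239}\mathrm{Pu}$

The Pu239 at the end of that chain is fissile, which is how a breeder turns "useless" U238 into usable fuel.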

Comment Re:Payouts are garbage, though (Score 1) 47

This is an interesting question. We don't really know what will happen long term. One possibility, as you point out, is that black markets will always outpay any other market. Another possibility is that the ethical hacker community will become so large and strong that they will find all those same vulnerabilities and deliver them to the system owners before the black market gets to build exploits and use them for nefarious purposes. It takes just one ethical hacker who finds a critical 0day and delivers it to a service like HackerOne, and the market for that vuln is over. Although asymmetry usually favors the criminal actor, in this case it favors ethical behavior. One ethical hacker can put an end to the sale of a 0day on the black market.

Comment Re:Fortran (Score 3, Interesting) 615

I'm not quite old enough to have used FORTRAN. I grew up on BASIC and Z-80 assembly language on a TRS-80 (and a bit of HP BASIC on equipment at school), but when I went to college in 1982, they were using PL/I. The first semester was even on IBM equipment, but fortunately they got a VAX late in the semester, because I managed to screw up my JCL by trying to reformat it to be readable. I still don't know why it took DEC so long to add the UNTIL statement to their PL/I compiler.

Then I got into programming on the Macintosh, so I started using Pascal. Also, Turbo Pascal was a thing, and they were both UCSD variants. But one of the worst things to do is use Pascal and PL/I at the same time (as in the same era, not literally simultaneously). The function headers are syntactically backwards to each other.
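For anyone who hasn't used both, here's a rough sketch of the two header styles from memory (details vary by compiler), just to show what I mean by backwards:

    (* Pascal: keyword first, then the name, with the return type at the end *)
    function Add(a, b: Integer): Integer;
    begin
      Add := a + b
    end;

    /* PL/I: the name comes first as a label, then the keyword,
       with the return type in a RETURNS clause */
    ADD: PROCEDURE(A, B) RETURNS(FIXED BINARY);
      DECLARE (A, B) FIXED BINARY;
      RETURN(A + B);
    END ADD;

Switching back and forth, you end up putting the name on the wrong side of the keyword in both languages.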

I didn't even officially switch over to C until after 2000. I even have one program I use sometimes that started with code I originally wrote in college in PL/I, then ported to Pascal, then again ported to C.

Comment No. (Score 4, Interesting) 170

robots.txt is intended to indicate which parts of a site should not be scanned recursively, often for technical reasons such as programmatically generated content. It's especially meant for sub-paths like /cgi-bin/, but there is no technical reason why the content of any arbitrary URL can't be programmatically generated. It might be, and you wouldn't even know it, because the generated content may look the same most of the time, such as a navigation menu.
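As a minimal sketch of the kind of thing it was designed for (the paths here are just illustrative, not from any real site):

    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /search

That tells well-behaved crawlers not to recurse into the dynamically generated parts of the site; it says nothing about what should happen to pages that were already fetched and archived.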

However, it was not intended to be used to remove previously-archived content, which is how archive.org is currently using it. When an archived page's status in robots.txt changes, they should note the first date the status changed, then simply stop updating that page unless and until robots.txt re-allows it.

Scanning and archiving are two different operations, and robots.txt is only intended to apply to the former.
