
Comment Choose your 2 factors carefully (Score 1) 247

Passwords are commonly used because they have a lot going for them -
* people understand them
* they're reasonably easy to implement (especially if you are savvy enough to store only a hash - an md5 or whatever - rather than the password itself; a quick sketch follows this list)
* most password interfaces are accessible
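
For what it's worth, here is a minimal sketch of the "store only a hash" point (the function names and parameters are my own illustrative choices, not any particular library's API):

        import hashlib, hmac, os

        def hash_password(password):
            # Store only (salt, digest); the plaintext password is never kept.
            salt = os.urandom(16)
            digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100000)
            return salt, digest

        def check_password(password, salt, digest):
            candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100000)
            return hmac.compare_digest(candidate, digest)

(A salted, slow hash like PBKDF2 is a better bet than a bare md5, which is cheap to brute-force these days.)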

You mention phone-based - Google wants me to give them my mobile phone number to enable 2 factor security via SMS, but (1) I don't have a mobile phone, (2) if I did, there's no reception where I live, (3) when I did have one, SMS messages were not free to receive.

Picture-based systems don't work for people who can't see the pictures. So you need to research an alternative that works for blind users, and possibly also a low-bandwidth fallback for blind users that does not rely on audio or video.

So your replacement should start out accessible, should not cost the end user money, and should not rely on unreliable external systems (like the phone network) unless those are all OK and a given in your environment. Even then, locking out a single blind or mobility-impaired employee because they couldn't see the picture or didn't react quickly enough can open your company to a lawsuit painful enough to make reverting to passwords seem like a win.

I don't want to put you off innovating - but innovate to solve real problems that you've measured, with solutions that have been tested and that introduce as few new problems as possible.

Comment Re:Seems fundamentally broken (Score 1) 399

This is a complete misunderstanding of what is going on.

"The bigger problem to me seems to be that cgi scripts export user parameters to environment variables before calling bash"
No, this is not what it is about at all.

The CGI specification says that the Web server (not the CGI scripts) makes HTTP headers and request data available in environment variables.

Programs called through this interface (the classic "index.cgi") are vulnerable *regardless* of what language they were written in, because a lot of programs end up calling bash indirectly. And no, shell functions themselves are not the problem, so redefining cp or ls isn't going to happen this way; function definitions just happen to be the part of shell syntax where bash's parser has the bug, executing trailing commands when it shouldn't.
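
If you want to see the parser bug for yourself, here's a rough test of my own (it assumes bash is on your PATH; I'm not claiming this is the official test):

        import subprocess
        # A value shaped like an exported shell function; on an unpatched
        # bash, the command after the function body runs as soon as bash
        # starts - before "echo test" is even looked at.
        env = {"PATH": "/usr/bin:/bin", "x": "() { :; }; echo VULNERABLE"}
        subprocess.run(["bash", "-c", "echo test"], env=env)
        # A patched bash prints only "test"; a vulnerable one prints
        # VULNERABLE first.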

Comment Re:Remote? Vulnerability? (Score 1) 399

That you don't understand it doesn't mean it's not real.

Some people run a program called a "Web server" which listens on a network and runs programs based on requests it receives :-)

It's not about using bash for CGI scripts, although that's an obvious example. Some people use languages like Perl or Python or even PHP (in CGI mode), or C, or Pascal, and all of those programs can be affected too, because they might use something like system() or `...` or run an external program that in turn calls the shell, directly or indirectly.
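
As a sketch of my own (not taken from any real site), here's a CGI handler that never mentions bash and is still exposed:

        import os
        # This handler is pure Python, but os.system() runs its command
        # via /bin/sh. If /bin/sh is an unpatched bash, the attacker-
        # controlled HTTP_* variables already in our environment are
        # parsed by bash before "logger" ever runs.
        os.system("logger request handled")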

Other services are also affected - ssh, dhcp, remote git, potentially even ftp and email, although that seems less likely.

Comment Re:I am wrong but... (Score 1) 399

Any program that's called via the CGI interface, regardless of programming language, is potentially vulnerable to this attack, because the Web server puts the variables in the process's environment, and environment variables are inherited by sub-processes.

The ssh exploit can be used to escalate privileges, e.g. when used in a limited account such as those for github, remote CVS, or backup.

DHCP is also vulnerable (remotely).

I don't think calling people names is helpful. Anything called via the CGI API is potentially vulnerable, as is anything that passes on environment variables or HTTP headers to sub-processes, regardless of what language was used to write those programs.

Comment Re:Linux is just a full of holes as Windows (Score 1) 399

It also affects bash scripts called from programs run by CGI - e.g. Perl, Python, C, or C++ programs using system(). Since environment variables are inherited by all subprocesses, it affects grandchildren, great-grandchildren, and all the other sub-processes created all the way down.

Some scripting and programming languages use the shell to expand "glob" patterns, e.g.
        $names = glob( "/tmp/[0-9]*" )
or in other places where it may not be obvious.
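
To make that concrete, here's a contrived sketch of a glob expanded by shelling out; the payload only fires on systems where the shell involved is an unpatched bash:

        import os, subprocess
        # Simulate a CGI environment: a request header parked in a variable.
        os.environ["HTTP_COOKIE"] = "() { :; }; echo pwned"
        # Expanding the pattern via a shell (as some language runtimes do)
        # starts a shell that inherits - and parses - that variable first.
        names = subprocess.check_output(
            "echo /tmp/[0-9]*", shell=True, text=True).split()
        print(names)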

I'm not interested in comparing numbers of holes in different systems, as it only takes one hole for an intruder to get in.

Comment Re:Full Disclosure can be found on oss-security... (Score 2) 399

The CGI spec tells the Web server to make the user data available as environment variables, so e.g. Apache will put them in the environment, and the environment is inherited by all sub-processes. So e.g. a Perl script called via CGI that uses back-ticks, my $a = `pwd`;, may result in code execution.
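
The same situation in a Python CGI script, as a rough sketch (the header handling is illustrative, not a complete CGI program):

        #!/usr/bin/env python3
        # By the time this runs, the Web server (per the CGI spec) has
        # already copied request headers like User-Agent into our
        # environment as HTTP_USER_AGENT and friends.
        import os, subprocess
        print("Content-Type: text/plain")
        print()
        print("User-Agent was:", os.environ.get("HTTP_USER_AGENT"))
        # Any shell call from here - the back-tick equivalent - inherits
        # those attacker-controlled variables:
        print(subprocess.check_output("pwd", shell=True, text=True))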

The vulnerability doesn't apply to all ways of running code on Web servers - e.g. the Java servlet APIs should be fine - but CGI does automatically add the HTTP headers and request parameters to the environment.

Comment Re:as a photographer (Score 3, Insightful) 129

Published works are automatically copyrighted in most countries, including the USA, because of ratification of international treaties such as the Berne Convention. The old US-specific requirement to mark something as copyrighted is long gone. (Requirements varied in other countries, but in most Western countries items published anonymously, or without explicit marking, get full copyright if the creator's identity becomes known. Just because a photograph is unmarked does not mean you can use it without permission!)

However, it's true that if you mark something as copyright you may do better in court, particularly in the USA, and that registering copyright, still available in many countries, can help.

Comment Re:It's the ink soaking through the paper. (Score 1) 116

It's harder than it sounds.

I do a lot of scanning from old books. The print-through can often be darker than parts of the printed page you're trying to scan; I have not found a good way to cure that beyond hand-editing.

In many cases, of course, you can make huge improvements in a very short time. But Google Books is about commoditisation: really large quantities of mediocre results, with ad revenue from the keywords paying for the work.

Comment Re:17th Century? (Score 1) 116

The images are better than average for Project Gutenberg. On my own site, http://www.fromoldbooks.org/, I generally scan at 2400dpi - although people have to ask me for the high-resolution images. For one thing, a 2-gigabyte image can crash people's Web browsers :-)

Project Gutenberg has always been really sloppy with metadata - identifying exactly which edition of a work was transcribed (and which impression), describing its physical characteristics and so forth. They seem to be improving a little, slowly.

Google Books on the other hand has always been really bad with images and with the OCR. For some books I've had some luck making a "majority edition" by taking the text when Google scanned the same book multiple times. It turns out to be almost impossible to do that with images, unfortunately.

As I understand it, Google's method of scanning books also means fold-out or large-size illustrations tend to get lost altogether.

Comment Re:best poison... and internet and rats (Score 1) 85

Hope you like the stink of dead rodents in your walls...

That's a problem with pretty much any of the poisons, yes.

Poison is just a band-aid, fix however they are getting into the house.

We live in an old wooden farmhouse; it's not really feasible to stop rodents getting in altogether - just as we have a sump pump in the basement for the water that gets in, standard operating procedure here in rural Ontario. We have, however, added 0.2-inch steel wire mesh under the deck, to a depth of two feet, which helped.

I thought warfarin was still #1 for what it's worth... As far as I recall rodenticide has to be slow acting, else the neighbours will notice something is awry when the victim drops dead.

I didn't say the Vitamin D was fast, I said the rats eat a lethal dose at one sitting. When rats return to the nest the alpha male smells their breath, and when they start to die, the remaining rats will soon stop eating the bait. So any poison that takes more than one feeding to kill will tend not to kill all the rats, unless you only have a very few rats. But if it kills the rats too soon, they'll notice and avoid the bait. It's tricky to get right. Multiple-feed poison is OK in a city if they're coming up from the sewers or other underground tunnels, but if you have a nest in your walls, forget it.

Warfarin is #1 in sales, sure. You have to keep buying it, because it won't kill them all. In addition, a large proportion of rats are immune to warfarin these days. I didn't want to mention brand names in case it sounded like an advert, but Quintox and Terad3 are the leading Vitamin D poisons (both from Bell Labs, one newer than the other). Quintox is also the only rat poison that can be used on an organic farm here. You can also get it in liquid form, which is good if you're confident there are no other animals, children, etc ;-) - e.g. it's used at a local power station here. The rats have to drink a lot of liquid each day so they're particularly attracted to it.

Thanks for replying. And yes, you're right, we had a bad smell in the walls ;D

Comment best poison... and internet and rats (Score 5, Interesting) 85

We had an Internet outage in our house when rats got into the walls and chewed through the cables. They just like eating plastic, and also will chew through walls (and cables) to get to the other side.

It's no surprise that the most effective rat poison (I discovered after extensive research!) was developed by a phone company - Bell Labs.

It was also interesting to me that the Wikipedia article on rat poison appears to recommend the most widely used *ineffective* rat poison, which is also made by a large company..., and lists some stupid problems with the competition.

The most effective, if you are wondering, is based on Vitamin D, and has these advantages: (1) the rats eat a fatal dose on the first feeding, and hence do not get a chance to learn to avoid it; (2) pregnant rats eating the poison do not give birth to rats that are immune to it; (3) since vitamin D isn't really a poison as such, if another animal eats the rat, there's very little risk of secondary poisoning.

So we solved our own rat problem, but I had to do a lot of learning about rats and rat poison on the way!

Comment Re:Biased summary (Score 1) 242

The summary is incorrect about the public domain part - for one thing, JSTOR holds a great many articles that are still in copyright. For another, "published before 1923" only applies to articles written and published in the US (simplifying slightly). An article written in Germany or France or the UK in 1923 may still be in copyright even in the US, because of copyright treaties under which countries respect one another's copyright laws - and although admittedly the US has not been an equal player in these, it's starting to honour them more often. JSTOR has journals published this year.

JSTOR is a non-profit organization that has saved university libraries huge amounts of money; in the 1980s a publisher would often charge $10,000 for a year's subscription to one journal or family of journals. Now, as others have said, the current business model for academic research and dissemination of results is pathetic and flawed. Physics and mathematics have long had ways to work around it for practical research, sending pre-prints and publishing independently.

A few gigabytes of text is actually a massive amount. The entire King James Bible is about five megabytes, so a few gigabytes is on the order of a thousand Bibles. A single journal article is a few kilobytes, or low megabytes if it has figures. The complaint is not about the bandwidth use (as I understand it; I do not speak for MIT).

For me there's a bigger question here. If you are successful in challenging the model of careful selection and editing of articles, and of presenting them by subject - if you succeed in giving away the goods for free and making the publishers lose their shoes and socks and declare bankruptcy - have you lost anything? Is the selection process a valuable service, and, if so, can it be replicated? Crowdsourcing has for sure worked to make Wikipedia voluminous, mediocre and untrustworthy. Is that a heresy here? Maybe. But you can never take an article there at face value. There have been whole fake conferences whose "conclusion" was that smoking was good for you, or that the global climate is not changing, or that the sun really does go round the earth. I don't want to see the end of traditional journal publishers unless there's a way to retain the benefits, or enough new benefits that the people most affected are willing to lose the old ones.

Comment It's 1760 all over again (Score 1) 244

When John Baskerville invented a process for making smoother paper, and printed books with the blackest ink and whitest smoothest paper ever seen, Benjamin Franklin said that people would go blind. Others took up this claim, although today almost all books are printed on paper every bit as white and often as smooth, and with inks every bit as black.

Comment Re:*correction* Re:IE 6 intentionally crippled (Score 1) 97

There wasn't a published standard that night when Marc coded "blink" in Mosaic. We were working on a standard for HTML in the IETF Working Group but it was only a draft then.

Note that IE inherited "blink" from Mosaic, because the first version of IE was licensed from Spyglass and was a commercially-supported version of Mosaic.

I don't remember where "layer" came from I'm afraid.
