DoJ Following Porn Blocker Advances?
GreedyCapitalist writes "A new filter called iShield is able to recognize porn images based on the content of the image (other filters look at URLs and text) and according to PC Magazine, it is very effective. The next generation will probably be even better -- which highlights the retarding effect regulation has on technological progress - if we relied solely on government to ban 'inappropriate' content from the web, we'd never know what solutions the market might come up with. Will the DOJ (which argues that porn filters don't work) take note of filtering innovation or continue its quest for censorship?"
Reversal (Score:5, Funny)
Re:Reversal (Score:2, Redundant)
Re:Reversal (Score:5, Funny)
Re:Reversal (Score:2)
You need to filter out the crap. (Score:3, Insightful)
I welcome this new technology!
Re:Reversal (Score:2, Funny)
What Is The Story here? (Score:5, Insightful)
I don't understand why this summary has to bring the government into this or speculate that it might do something. There's no evidence of impending censorship and no political issues at work here. It's just a review of a product. Why does Zonk continually try to troll politics on Slashdot? He's becoming worse than Michael ever was.
Re:What Is The Story here? (Score:2)
Although I agree that this FP has little to do with government regulation (other than a sort of "proof of concept" for potentially effective porn blocking), we have plenty of proof that the current administration wants to censor the internet... The entire quest to make Google turn over search records on a legal f
Re:What Is The Story here? (Score:2)
Bad news... the fools started swallowing things like that wholesale on 9/12
Re:What Is The Story here? (Score:2)
His assertion that products of this kind have some sort of "retarding effect" on "technological progress" is a bit naive. Making a machine smart enough to distinguish between porn, lingerie and breast exams is fairly remarkable. This is probably an extension of eigenface analysis applied to more general images. Machine vision, in other words.
Porn driving technology, again? Perhaps. A few years from now we'll take it for granted that machines can identif
Re:What Is The Story here? (Score:2, Insightful)
You must be American
Re:What Is The Story here? (Score:5, Insightful)
- you'd have to get every country in the world to go along with this
- how would you decide if a site needs a
- you'd have to create an 'internet police' to enforce compliance
Re:What Is The Story here? (Score:4, Insightful)
But then there are the gas stations that sell porn mags behind the counter. These places have porn, yes, but someone who has an aversion to erotica may have a compelling reason to enter the gas station, even though it contains porn. Would these places be
Then there is the library. I can find pictures of bare breasts, and vaginas, and butts etc. There may not be any hardcore pics (unless you count the sex advice picture books), but you can see nudity. You have to seek out the porn (both literally, because it isn't in the main room, and figuratively, because you have to decide that a photography book is beat off material). What happens when the "libraries of the internet" get slapped with
Re:What Is The Story here? (Score:3, Interesting)
Re:What Is The Story here? (Score:5, Insightful)
Re:What Is The Story here? (Score:2)
Re:What Is The Story here? (Score:3, Insightful)
1. Who decides what is required to go under
2. What do you do with a site like, hypothetically, www.hotgirls.co.uk (made up name)? Do you create www.hotgirls.xxx.uk and force them to move? Or do you move it to www.hotgirls.xxx? (and then what about www.hotgirls.com? where does that go?)
3. How do you deal with people having to find the sites once they move? How does someone used to going to www.hotgirls.com find www.hotgirls.xxx?
4. How do y
Re:What Is The Story here? (Score:3, Insightful)
Except they won't. They'll continue to whine about rude words on TV and violent video games, even when they have all the tools they need to do something.
You're following the wrong model. You wouldn't let the children wander around downtown and put
Screaming so loud we can't hear you anymore (Score:5, Interesting)
But according to the article, it works well and doesn't filter out health-related websites. It also doesn't work for black and white images, but the majority of online porn isn't b&w. Or so I've heard.
Re:Screaming so loud we can't hear you anymore (Score:2)
Now there is a company that claims it can categorize porn (hmm, try explaining to your wife/virtual gf: I work for a company that categorizes porn 8) ). They would love to sell it to a government, and might sell it to some nazi filtering company (they always fi
Re:Screaming so loud we can't hear you anymore (Score:2, Funny)
hmm (Score:5, Insightful)
Would Michelangelo's David be filtered out?
How about anatomy/autopsy pictures?
I would RTFA but it is 404, perhaps my ISP filters out stories about filtering.
Re:hmm (Score:3, Funny)
Could give HAL-9000 a whole new outlook.
Re:hmm (Score:5, Informative)
Would Michelangelo's David be filtered out?
How about anatomy/autopsy pictures?
This excerpt answers these pretty well:
So it's business as usual. If PC Mag's quick checks revealed innocent sites being blocked, I hope this never sees the light of day as anything mandatory anywhere. I think failing to spread information is worse than actually showing human intercourse. Yes, even if there's a vagina there. I hope the kids aren't traumatized for life if they stumble over such things and the dirtiness of our anatomy.
Oh, also watch out for the new Pumpk1n Pr0n:
The article says IE would crash more with this tool in use too, but I'm not sure anyone would notice the difference from before and after.
I would RTFA but it is 404, perhaps my ISP filters out stories about filtering.
Just use the Mirrordot version [mirrordot.org].
Re:hmm (Score:2)
Of course they're not, it's their parents who are... so for maximum effect, this filter should be used to stop such sites/images from being added to the browser history/cache... the parents will never find out, and the kids just think it's funny. Problem solved!!!
Re:hmm (Score:2)
Reading between the lines of TFA, I'd say that it isn't just doing an analysis of the images, but must be tweaked to allow exceptions, possibly using some kind of analysis of the accompanying text. I doubt this would deter people who wanted to sneak their site around the filter, but it might help with the common problem of the accidental Google link targeted at young children.
Unless the analysis is very crude, I don't think it would be hard to distinguish oil paintings from photos, which may be a bo
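If the filter really does combine a text check with image analysis as the parent speculates, the two-signal idea can be sketched in a few lines. Everything here is hypothetical (names, keywords, and the 0.4 threshold are assumptions; TFA doesn't describe the actual combination rule):

```python
def page_suspicious(page_text, skin_pixel_ratio,
                    keywords=("porn", "xxx"), ratio_threshold=0.4):
    """Combine a crude keyword check with an image-analysis score.

    skin_pixel_ratio is assumed to come from a separate image
    classifier. Requiring BOTH signals is what would let through
    a medical page that trips only one of them.
    """
    text_hit = any(k in page_text.lower() for k in keywords)
    image_hit = skin_pixel_ratio >= ratio_threshold
    return text_hit and image_hit
```

The AND of two weak classifiers is a standard way to cut false positives: a "breast self-examination" page with lots of skin fails the keyword test, so it gets through.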
Re:hmm (Score:2)
Could it be this pumpkin [goat.cx] that gets blocked? ...not so innocent after all!
Re:hmm (Score:4, Insightful)
Why isn't this amazing AI advance being reported?
Re:hmm (Score:2)
I'm stunned that a bit of software can both read and understand the law and interpret it exactly as a real judge would.
I'm not at all stunned that a bit of software can interpret the rules exactly as some real judge would. There aren't many of them, fortunately, but there are some idiot judges out there.
Re:hmm (Score:3, Insightful)
Erm, surely the filter is set up to filter based on the wishes of the person who installs/manages it, not legislature. It's not interpreting anything but the image.
Re:hmm (Score:2)
Re:hmm (Score:2)
It is more probable that it would filter out pages with 'porn' in the title
our favorite (Score:3, Funny)
Re:our favorite (Score:5, Funny)
Segmentation fault (core dumped)
Inconceivable! (Score:5, Funny)
A goatse reference that is helpful and useful? Inconceivable!
I would buy this software if it could filter me from seeing that ever again.(I jest, but only slightly)
~Rebecca
Re:Inconceivable! (Score:5, Funny)
Or you could train your mind, as I have. My occipital lobe no longer processes Goatse as it "should", and substitutes a fuzzy blur in its place. I literally cannot see it!
Unfortunately, this is not without side effects. For instance, it's no longer safe for me to drive through tunnels.
Re:Inconceivable! (Score:4, Funny)
Re:Inconceivable! (Score:2)
Which, in turn... (Score:5, Insightful)
In other news, today I successfully opened a can of Diet Coke -- which highlights the retarding effect regulation has on quenching thirst. Man, if I'd waited for the government to open that can for me, I'd still be thirsty now!
If only there were a more effective way to highlight the retarding effect that obsessing over the complete works of Ayn Rand has on independent thought...
Re:Which, in turn... (Score:2, Insightful)
Yet for many - they expect government to be that first line of defense against the "undesirable" and refuse to help themselves. Of course after so many years of public "education" this shouldn't be a surprise.
Re:Which, in turn... (Score:5, Insightful)
I don't get it. (Score:3, Insightful)
First we shout at the Govt. to get off our backs on this issue, and when they actually fail to come up with any solutions (because we told them NOT to), we wham them for not guiding us or providing us with any solution.
What a load of cr*p !
On one hand we shout about the ineffectiveness of the Govt's first real action in decades to counteract this problem (the Yahoo, MSN and Google search records demands), and then we shout at them for not providing a solution at all.
You tie both my hands behind my back, then you blame me for not shooting at the thief!
It's the Slashdot Fallacy ... (Score:5, Insightful)
First we shout at the Govt. to get off our backs on this issue, and when they actually fail to come up with any solutions (because we told them NOT to), we wham them for not guiding us or providing us with any solution.
You are failing to realize that the same person is not talking in both cases. Also, while Slashdot as a whole leans to the left, the same issue can have articles written by, and about people on, both sides. The only thing that is happening here is that someone thought a discussion about software for image identification and its future impact on us would be a good thread, and here we are.
You tie both my hands behind my back, then you blame me for not shooting at the thief!
The fallacy lies in missing that the tied-hands speaker is not the same speaker as the one doing the blaming.
Make more sense now?
~Rebecca
Re:I don't get it. (Score:3, Insightful)
You think it's a character flaw not to kill for property?
Re:I don't get it. (Score:5, Informative)
Government Censorship: There are wing nuts who want the US government to censor the internet, usually with cries of "think of the children" or "help fight terrorism". People who know how the internet works generally realize that this is a stupid "solution".
Local Filtering: There are several different ways that this can be done, and all of the currently available local filtering "solutions" have problems. TFA was about a new local filtering scheme, which COULD be better than the existing methods.
Local filtering vs. government censorship is, I think, where you see the contradiction. It really isn't a contradiction for people to say NO to government censorship (including local filtering in public libraries) and to also have some of the same people wanting the government to get involved in improving local filtering technologies.
If it wasn't for porn on the internet, war, gay marriage, and abortion, you couldn't get anybody to go to the polls.
Re:I don't get it. (Score:2)
"There are wing nuts who want the US government to censor the internet"
It's alright when it's done the way they want it done, but I bet the same people would accuse China of being draconian for filtering.
It's funny how people's views of censorship change depending on what they're talking about censoring, and that sexuality ranks so high on so many people's kill lists.
Re:I don't get it. (Score:2)
Re:I don't get it. (Score:2)
Backwards.. (Score:5, Funny)
How so? (Score:2)
In other words, something you could easily do by hand. Unless that hand is busy because of the porn, of course.
False Positives (Score:5, Insightful)
This won't go anywhere for a long time, until image recognition technology catches up.
Re:False Positives (Score:2)
RTFA. "It didn't block department-store lingerie ads but covered up a few scantily clad models at the Victoria's Secret site. A Google Images search on "breast self-examination" was correctly allowed."
But they also say: "your tech-savvy teenager may attempt to evade this monitoring by terminating iShield....Without the password, you just can't turn it off." Right. Unless he reboots to a Knoppix CD, for instance. But basically if you don't want porn, this see
Re:False Positives (Score:2)
Since Victoria's Secret isn't a porn site, that's even more evidence right there that it doesn't work.
Re:False Positives (Score:5, Interesting)
This won't go anywhere for a long time, until image recognition technology catches up.
Even then, one person's "porn" is another's "art". Even a human can't correctly distinguish offensive vs. non-offensive content with all that much accuracy. (This is besides the fact that around the same time as image recognition technology catches up computers will have overtaken the world and we'll be following their rules rather than our own.)
Re:False Positives (Score:2)
If they made this for video, it'd be interesting to see how many commercials, award ceremonies and various news shows would be labeled as soft porn.
Can't say you could call it a bug either...
Re:False Positives (Score:2)
Need programmers? (Score:5, Funny)
Looking for porn that the filter can't handle...
What those meetings must have looked like.
Dev Meetings (Score:4, Funny)
"Yeah, sure bob, you can run the 'barely legal college girls' tests. Janet and Simone, you check the 'hot lesbian' batches. What? Sure Ramone, you can check the 'young gay studs' test. Now, who is going to run the 'goatse.cx' tests?"
"Guys....? Anyone?"
Errors abound (Score:5, Interesting)
This thing won't be deployed en masse with problems like that... it quickly becomes uneconomical for admins to be whitelisting pictures of pumpkins.
Re:Errors abound (Score:3, Insightful)
OTOH, for
Re:Errors abound (Score:2)
In the city I live in alone, there are 3,400 jobs across 8 different companies that specifically market selling pumpkins (unless I'm confusing this with something I just imagined); there's a lot of money in it, and they wouldn't be happy having their images blocked.
Morality and AI (Score:2, Insightful)
Re:Morality and AI (Score:2, Informative)
Re:Morality and AI (Score:3, Insightful)
And more interestingly, what's the difference between a nipple on a nudist shot and not?
Nudism isn't illegal in any modern country I know of.
There are plenty of even less grey-area cases like these that would be problematic, as mentioned by a poster above. Art, both paintings and photography, etc. If we simply forbid the human body for religious or other reasons, isn't that admitting S
Re:Morality and AI (Score:2)
Sorry, I should've written "modern and democratic country".
Those were the ones I was thinking of, not well-industrialized and "modern" dictatorships etc.
Not a big-brother issue! (Score:2)
I hate to repeat myself, but I feel it's worth mentioning here anyway (and I will add to it) - the filter is set up to filter based on the wishes of the person who installs/manages it, not legislature. It's not interpreting anything but the image, and it's not enforcing anything other than the wishes of the person who owns the computer. This is not a big-brother issue! Sure it
I've seen something similar before (Score:3, Informative)
http://www.isp-planet.com/news/2001/messagelabs_0
"SkyScan AP uses Image Composition Software (ICA), which decomposes an image," White explained. "It runs 22,000 algorithms and in addition to skin tone textures, it can decipher porn through other features such as facial expressions."
In practice these tools are simply filtering by URL, then by colour gamut analysis.
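Colour gamut analysis of this kind can be sketched in a few lines. This is a deliberately crude illustration; the RGB rule and the 40% threshold below are assumptions for the example, not anything from TFA or the MessageLabs product:

```python
def looks_like_skin(r, g, b):
    # One published-style RGB skin heuristic: reddish pixels
    # where red clearly dominates green and blue.
    return r > 95 and g > 40 and b > 20 and r > g and r > b and (r - g) > 15

def skin_ratio(pixels):
    """Fraction of (r, g, b) pixels that match the skin heuristic."""
    if not pixels:
        return 0.0
    hits = sum(1 for r, g, b in pixels if looks_like_skin(r, g, b))
    return hits / len(pixels)

def flag_image(pixels, threshold=0.4):
    # Flag when a large share of the image falls in the skin gamut --
    # which is why innocent skin-toned images (faces, beaches) trip it too.
    return skin_ratio(pixels) >= threshold
```

The false positives discussed upthread fall straight out of this approach: any per-pixel colour rule will flag close-up portraits and anything else dominated by skin-like tones, which is why real products layer URL and text signals on top.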
Re:I've seen something similar before (Score:2)
So much for animal rights (Score:3, Funny)
Re:So much for animal rights (Score:2)
This is not new... (Score:2)
All in all however, I'd rather see these kinds of initiatives than a governmental crusade against online porn, which would not only be
Comment removed (Score:4, Funny)
Re:What is porn? (Score:3, Funny)
Why didn't I think of this? (Score:3, Funny)
flag_as_porn(image);
endif
Step 1: use silly algorithm
Step 2: ???
Step 3: PROFIT!
Dream job (Score:2)
My thoughts (Score:2)
I don't want the federal government getting involved with this. This is just censorship towards minors. Minors should be able to view what they please, but parents should be the ones responsible for stopping them from viewing things they don't wish for their y
One more thing... (Score:2)
Solution? (Score:2, Insightful)
Hmm.. (Score:3, Funny)
Re:Hmm.. (Score:2)
My work so far... (Score:5, Funny)
#include <stdbool.h>

/* Flags every URL as porn -- zero false negatives, by construction. */
bool check_porn_content(const char *url)
{
    (void)url; /* content is irrelevant; block everything */
    return true;
}
Any suggestions for further development, or licensing queries, please let me know.
Re:My work so far... (Score:2)
Leave the Government out of this, thank you. (Score:2, Insightful)
As someone with some idea of all of this... (Score:2)
Machine vision can't even begin to start on this until you actually know what porn is, from an ontological point of view (a problem I've mostly nailed). Even then, the recognition algorithms for such won't be written for many years, and won't run on reasonable hardware for years after that. It's pretty dumb if you ask me.
Not many of you... (Score:5, Interesting)
Why? (Score:5, Informative)
Why should the sixteen year old be stopped from looking at porn? He's over the age of consent, what's wrong with letting him look at some naked women? He's probably thinking about sex all the time anyway, that's just what teenagers do.
Re:Why? (Score:3, Insightful)
A History of Violence (Score:4, Insightful)
You can bomb, shoot, maim every night on the nightly news, but God forbid you show a naked breast...people might be harmed!
There are hypocritical cultural 'norms' in the USA.
Re:What's wrong with violence? (Score:2)
Patent Opportunity (Score:2)
If you see this, the user just has to hit the remove-fig-leaf button; each element of the photo is unscrambled locally (with Java or maybe JavaScript), and the various vignettes are shown ordered and side-by-side for the enjoyment of people at work and other people that could potentially be minors in their jurisdiction, but
How could this work... (Score:2)
Doesn't work (Score:3, Informative)
http://dansdata.com/pornsweeper.htm [dansdata.com]
There's been research on avenues like this for IR (Score:2)
Supposedly the next version of Mac OS will have, at minimum, OCR built in, meaning it will scan images for words and put them in a searchable index. Now, eventually (or maybe the next release)
Google Blocked (Score:2)
That's a feature?
http://www.pcmag.com/slideshow_viewer/0,1205,l=&s
Shape recognition? Hmmmm! (Score:2)
Machine-language instructions are vertices. High-level-language loops, functions and similar structures are more or less complex primitive shapes {squares, triangles, circles
Comment removed (Score:3, Funny)
Academic paper on Finding Naked People (Score:3, Informative)
I'm not so sure about its applications, but it's certainly an interesting vision problem.
Re:what is the problem with .xxx domains (Score:2, Insightful)
Re:what is the problem with .xxx domains (Score:2)
I don't know what the problem was with the
if you want to, search google with site:.xxx and if you don't, then block
simple and effective
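The site:.xxx idea above amounts to blocking by top-level domain, which really is simple. A minimal sketch, assuming a hypothetical opt-in adult TLD (which is the whole catch, as the replies point out):

```python
from urllib.parse import urlparse

# Hypothetical policy: block anything under an opt-in adult TLD.
BLOCKED_TLDS = {"xxx"}

def is_blocked(url):
    """True when the URL's hostname ends in a blocked top-level domain."""
    host = urlparse(url).hostname or ""
    return host.rsplit(".", 1)[-1].lower() in BLOCKED_TLDS
```

Note that is_blocked("http://example.xxx/") is blocked while "http://xxx.example.com/" is not: the scheme only works if adult sites actually live under the TLD, which nobody can force worldwide.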
Because it is the opposite of the obvious answer, which is to create ".censored" domains for the squeamish. There are just too many sites in too many countries, with too many views of what is obscene, to try to force everything that anybody would find offensive into one domain. I propose the following instead:
.burkas for those who are sickened by sigh
Re:would it filter breast cancer images? (Score:2)
Who cares, I'll use either!