A Japanese Smartphone Uses AI To Keep Users From Taking Nude Photos (petapixel.com)
JustAnotherOldGuy quotes the PetaPixel photography blog on a new smartphone being sold in Japan: Aimed at parents who want to keep their kids from making bad choices, the TONE e20 has an AI-powered "Smartphone Protection" feature that prevents users from shooting or saving "inappropriate" photos (read: naked pictures).
The official Tone Mobile press release hails the TONE e20 as the world's first phone with an AI that "regulates inappropriate images" through an AI built into the so-called TONE Camera... If the AI recognizes that the subject of a photo is "inappropriate," the camera will lock up; and if you somehow manage to snap a photo before the AI kicks in, the phone won't let you save or share it.
Additionally, a feature called "TONE Family" can be set to send an alert to parents whenever an inappropriate image is detected. According to SoraNews24, this alert will contain location data and a pixelated thumbnail of the photo in question.
alternate solution: (Score:3)
Bugless Algorithm flow chart:
1 User doesn't take nude photo.
2 End.
Re: (Score:1)
Out of mild curiosity, how come you're so fucking ignorant that you believe removing IR filters will allow you to snap "nude" photos?
Re: (Score:3)
Out of mild curiosity, how come you're so fucking ignorant that you believe removing IR filters will allow you to snap "nude" photos?
Because in a previous age he would have been buying the "X-ray specs" from the back of comic books!
Re: (Score:2)
"Because in a previous age he would have been buying the "X-ray specs" from the back of comic books!"
(The "lenses" consist of two layers of cardboard with a small hole about a quarter-inch (6 millimeters) in diameter punched through both layers. The user views objects through the holes. A feather is embedded between the layers of each lens. The vanes of the feathers are so close together that light is diffracted, causing the user to receive two slightly offset images. For instance, if viewing a pencil, one
Re: (Score:2)
If you're living in your mom's basement, please, please, for the sake of your mental health, go outside and talk to people! At least one a day.
Re: (Score:2)
In my experience, it is only the people who live in their parents' basement who are actively researching all the magical ways to take pictures of "nude womens" without the consent of said women.
The people who actually have relationships and sex don't really need such contraptions.
Re: (Score:1)
https://www.liveleak.com/view?... [liveleak.com]
Either you're too fucking stupid to know, or you were born after 1998, which makes you a mere child, and you should shut the fuck up while the adults are discussing things.
Re: (Score:1)
"You think what's on that video qualifies as "nudes"? You are a sad little forever-alone fuck."
you're just an angry, cowardly shitstain who knows nothing, and can't even see the point being made.
"But then, when you're a senile old fart who's spent the best part of his life on /. discussing "see-through capable cameras", what else could you be?"
I've worked in porn. I've worked in research. I've worked in higher-end restaurants. I've been on the BBC and have done hugely important horticultural research (grow
Re: (Score:2)
Not exactly x-ray vision going on in that video. It just means that the outer garment is somewhat translucent to IR light. I've seen a similar effect happen with garments in visible spectrum/white light depending on intensity, angle, and how tightly woven the fabric is.
Basically, a whole lot of nothing.
Re: (Score:2)
I stand corrected: not nude, but lightly revealing in some circumstances. Mind you, airport backscatter X-rays were labeled as "nude" as well. However, I still think this feature WOULD sell more phones, which is what it's all about. Coronavirus suspects, or anyone with a temperature, would probably show up better. It's a selling point. I'm not sure who still uses software/app kiddie-protection settings for the internet and TV, but I suspect it's few. Other possible reveals/tells: does the person have a weapon? A drug stash, maybe? Can we get close to a poor man's bolometer? Other possible uses: spotting growhouses, or dealers with many phones. Anyway, enjoy your crippled-light-spectrum phone camera.
Although it's a common misconception, and thus a forgivable mistake, it is wrong. FLIR is completely different from NIR: the former is thermal imaging, while NIR has no thermal-vision ability beyond what visible-spectrum cameras have (save for more magenta casts on objects glowing visibly red/orange hot, because they give off NIR too). The two get confused a lot by lay folks despite being very different tech. NIR wavelengths aren't filtered effectively by the RGB microfilters in the Bayer array; closer wavelengths are slig
Re: (Score:2)
Bugless Algorithm flow chart:
1 User doesn't take nude photo.
2 End.
It's been shown time and time again that prohibition doesn't work, so your algorithm comes across as simplistic. I propose a more realistic algorithm (sketched in Python after the listing):
10 Frame picture
20 If Not(subject is nude*) GOTO 50
30 If subject's face is showing GOTO 10
40 If other elements in the picture uniquely identify the subject GOTO 10
50 Take picture
60 Send picture
* Also consider taking drugs, committing crime etc
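A minimal Python rendering of the same flow, for the curious. The three classifier functions are hypothetical stand-ins, not any real phone API:

    # Sketch of the BASIC flow above; classifiers are stubs, not a real API.
    def is_nude(frame): return False          # stand-in classifier
    def face_visible(frame): return False     # stand-in classifier
    def is_identifiable(frame): return False  # stand-in classifier

    def capture_loop(get_frame, take_picture, send):
        while True:
            frame = get_frame()               # 10: frame picture
            if is_nude(frame):                # 20: subject is nude?
                if face_visible(frame):       # 30: face showing, back to 10
                    continue
                if is_identifiable(frame):    # 40: identifiable, back to 10
                    continue
            photo = take_picture()            # 50: take picture
            send(photo)                       # 60: send picture
            return photo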
Re: (Score:2)
Your "more realistic" algorithm's recursion precludes portrait shots among other things.
How so? For a standard portrait I see the flow as 10=>20=>50=>60
And it's looping, not recursion :D
Re: (Score:2)
In reality they're probably just re-using the Not Hotdog [youtu.be] classifier, which is very accurate.
Re: (Score:1)
I wonder if it works on black people. They probably forgot about those again, I bet. "Not pink, probably a gorilla [theverge.com]".
Re: (Score:2)
Doctor Manhattan? Is that you?
Easier, better solution (Score:2)
Very smart AI. (Score:2)
It's Japan. It will have to detect tentacles.
(It's probably all being funneled to a room full of people monitoring the cameras, because that is how "AI" seems to work at the moment. That's not creepy at all.)
And it works about as good as word filters. (Score:2)
Somebody should buttbuttinate those retards.
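Context for the joke: naive word filters do blind substring replacement, so a filter rewriting "ass" as "butt" turns "assassinate" into "buttbuttinate" (the classic Scunthorpe problem). A two-liner in Python shows it:

    # Naive substring filter: exactly the failure mode being mocked.
    def censor(text):
        return text.replace("ass", "butt")

    print(censor("assassinate"))  # prints "buttbuttinate"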
Re:Very smart AI. (Score:5, Interesting)
Yep, and it lacks context completely.
I used to see spam-filtered emails at a company I worked for (yes, I got to see all the good stuff that was deemed "unacceptable"), and I recall one filtered out as a nude image: it was actually a colleague's nephew, pulling a pose in his karate outfit, as he'd just won his red belt.
There's no way AI filters will recognize a red belt dangling between the subject's legs in a poorly taken photograph for what it is; they'll err on the side of false positives and censor it.
Re: (Score:1)
I used to see spam-filtered emails at a company I worked for (yes, I got to see all the good stuff that was deemed "unacceptable"), and I recall one filtered out as a nude image: it was actually a colleague's nephew, pulling a pose in his karate outfit, as he'd just won his red belt.
So, you were hoping to see nude pics of your colleague's nephew?
I think we know why you aren't working there anymore.
What can possibly go wrong (Score:2)
1) A penis has a thin high contrast shadow crossing it - Picture may be taken (AI gets thrown off by the shadow, making the penis a "not a penis")
2) Somebody is being assaulted - Nudity detected, ACCESS DENIED (even though everybody is fully clothed, and it's a robbery in progress, some random object in the background gets misidentified as a penis because it's taller than it is wide, and it just happens to look vaguely phallic)
Society is being done a great disservice by throwing around the term 'AI'
Re: (Score:2)
"you can detect pixel color for nudity there are several apis available already"
You mean you can detect something that may be human skin by checking if a pixel falls within a certain color range. You still need human eyes to know what part is being detected, or if it's even detecting human skin and not something else.
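To illustrate what that kind of API actually does, here's a toy skin-pixel check with OpenCV. The HSV thresholds are illustrative guesses, not values from any real product, and sand, wood, and cardboard land in the same range, which is exactly the point above:

    import cv2
    import numpy as np

    # Toy "skin pixel" check: count pixels inside a rough HSV range.
    # Thresholds are guesses; plenty of non-skin falls in this range too.
    img = cv2.imread("photo.jpg")             # assumed input file
    hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
    lower = np.array([0, 30, 60], dtype=np.uint8)
    upper = np.array([25, 180, 255], dtype=np.uint8)
    mask = cv2.inRange(hsv, lower, upper)
    skin_fraction = np.count_nonzero(mask) / mask.size
    print(f"{skin_fraction:.1%} of pixels fall in the 'skin' range")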
Re: (Score:2)
YOLO can detect thousands of objects in real-time on a raspi. I have no trouble believing that a halfway decent cellphone can detect naked people. However, it's going to have false positives and/or false negatives.
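For scale, a person-detection loop with the ultralytics YOLO package really is this short. Note it only finds people; deciding "naked person" would need a separate classifier on top, which is where the false positives and negatives creep in. Whether the TONE phone does anything like this is pure speculation:

    # COCO-pretrained YOLO; class 0 is "person". Detection only:
    # it says nothing about clothing.
    from ultralytics import YOLO

    model = YOLO("yolov8n.pt")
    results = model("photo.jpg")              # assumed input file
    for box in results[0].boxes:
        if int(box.cls) == 0:                 # COCO class 0 = person
            print("person, confidence:", float(box.conf))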
Re: (Score:2)
It’s Japan! They probably already have animated tentacle filters that get added!
That will go well... (Score:2)
But I expect this thing, which is basically a scam, will sell well.
Re: (Score:3)
But I expect this thing, which is basically a scam, will sell well.
There are a few places where this device might sell, but Japan is definitely not going to be one of them. Meanwhile, Saudi kids and Mississippi kids will just find ways of fooling the algorithm.
Another solution: parenting (Score:5, Insightful)
Re: (Score:3)
Or, in the words of Solomon, "teach your children the way that is right so that when they are older they will not depart from it".
Kind of old-fashioned, I know... maybe not PC, and open to accusations of an unnecessary religious agenda.
But ultimately still damn good advice.
Re: (Score:2)
At worst, it's just a bunch of advice given by smart people who observed how humans worked for literally thousands of years and wrote down the stuff that actually works.
Re: (Score:2)
Or, in the words of Solomon, "teach your children the way that is right so that when they are older they will not depart from it".
The problem is, that statement is false.
They WILL depart from it if most of the rest of society does. Peer pressure and herd behavior, to name a couple, affect people no matter how smart they are.
My favorite example is jaywalking. Many parents teach their kids the right thing to do, and many others don't, or set bad examples every day by jaywalking together with said kids. Now, when kids taught the right way walk to school and back with kids taught wrongly, they will, together, err on the side of jaywalking.
Re: (Score:3)
It's a proverb, not a logically true-or-false proposition. It's advice, not a guarantee. The point of the statement is not that following the advice makes the outcome invariable, but that holding up a (presumably desirable) outcome creates an incentive to follow the advice in the timeframe mentioned.
And speaking to the general reliability of the proverb: while obviously everyone ultimately makes their own choices (our parents do not make them for us), it's still by far mo
Re: (Score:2)
My favorite example is jaywalking. Many parents teach their kids the right thing to do...
In many parts of the world what the US calls "jaywalking" IS the right thing to do.
tap tap (Score:2)
Clippy says: "Hey it appears you're taking a picture of a boob, would you like assistance with that?"
Tap Tap? (Score:2)
Saying "Tap Tap" when talking about Japan? Have you read the Dragon Ball mangas?
child porn detection (Score:1, Insightful)
Re: (Score:2)
Kidnapping is a serious matter. What's really needed is a system to prevent kids from being taken by cameras.
Nanny corporations. (Score:2, Troll)
Seriously, every time you think they can't get any more overbearing, condescending, and totalitarian ...
Also: Thanks, Catholiban missionaries! Because you hated yourselves for raping children, you made half a planet treat normal things like nudity and sex as taboo. Imagine if you had been scat fetishists instead... we'd be censoring food and jailing Gordon Ramsay. ;)
Addendum: ... I meant with grown-ups. Duh. (Score:2)
Of course with kids it's different. But ONLY if the recipient is a pedo.
No problem with normal people. Small kids usually go nude on European beaches, and family photos of kids in a bathtub, playing in foamy water, are also normal.
Only sick fucks see that as wrong (or hot... same type of mind).
Re: (Score:2)
Also, being a naturist is legal, so how are they going to take their photos now?
The problem is that any given picture (photo or painting) of a nude can be classified as both a piece of art and pornography, because these terms are not mutually exclusive. I think an AI would pull its hair out if it attempted to classify a picture for art and porn ratings, because the ratings are subjective.
In particular, being nude is not illegal in most of Europe, even if you walk in the nude down the high street.
Challenge accepted! (Score:5, Insightful)
Off the top of my head I can think of a couple of ways to defeat this and I'm not in the demographic of people who will want to defeat it.
For example: nip slips, wet t-shirts, copious amounts of makeup. Without being able to actually understand the content of a photo, there will always be types of photos that can slip past the filters, especially if the filtering is done in-camera, where limited computing resources are available for classification.
Now, if a more thorough analysis is being done off the camera... well, that raises a whole bunch of red flags.
Re: (Score:2)
Before the digital world, mistakes were not spread permanently to the whole planet.
Another solution (Score:1)
Re: (Score:2)
Making different devices is never a good idea in today's manufacturing world. It's easier to have parental settings to enable/disable the cameras.
But WAIT, that's not all: (Score:2)
"Furthermore, the “Biometric App Authentication” function, which is a security measure for social engineering, etc., can be activated with “face authentication” for each application, providing a smoother user experience while enhancing security. Also, "TONE e20" is a log that uses the blockchain-related technology that the FreeBit Group has been working on under the Trusted Internet Initiative and addresses the risk of tampering with elements that cannot be protected by the blockchai
Re: (Score:2)
"Furthermore, the “Biometric App Authentication” function, which is a security measure for social engineering, etc., can be activated with “face authentication” for each application, providing a smoother user experience while enhancing security. Also, "TONE e20" is a log that uses the blockchain-related technology that the FreeBit Group has been working on under the Trusted Internet Initiative and address"[and to the hypotenuse of the right triangle to the lateral array.....]
What about statues and paintings? (Score:5, Insightful)
Will Michelangelo's Sistine Chapel ceiling and his David statue, for example, also be blocked by this nanny camera?
And someone tell me why the hell so many countries are so outraged by nudity (or even *gasp* women's breasts!) yet are quite happy for kids to sit in front of consoles firing bullets and missiles into virtual people, with blood and gore splattering everywhere. There's something profoundly wrong with 21st-century morality.
Re: (Score:2)
The ironic thing is, a good lion's share of the holy books of the "Big 3" Abrahamic religions is NSFW. Words like "whore" and even "pisseth" are present in the Bible. Among other things, there is graphic violence, and some very nasty sexual stuff *cough*rape*cough*.
But it's OK for kids to read it, because it's the "Good Book" %{
Re: (Score:2)
To be fair, most religious people like to pretend the nasty bits don't exist. Most of them pretend so well they've never actually read them.
Right... (Score:3)
And kids will just find clever ways to fool the "AI" so they can snap the dirty photos. Even Japanese school kids have a lot of idle time on their hands, and the desire to be cool that comes from beating authority.
OH look! it's a flock of unintended consequences (Score:2)
If this gets any traction, be prepared for teens having contests to see who can wear the least without triggering the protection, who can trick the protection into allowing a nude and, of course, what is the most innocent pic you can take that will trigger a false positive.
Re: (Score:2)
If this gets any traction, be prepared for teens having contests to see who can wear the least without triggering the protection, who can trick the protection into allowing a nude and, of course, what is the most innocent pic you can take that will trigger a false positive.
Along that same line, but for nerds: feed random noise to the algorithm and keep the pictures it flags as nude.
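A sketch of that nerd contest, assuming a hypothetical flags_as_nude() hook into the phone's classifier (the real classifier is of course not exposed):

    import numpy as np
    from PIL import Image

    def flags_as_nude(img):
        return False  # hypothetical stand-in for the phone's classifier

    # Generate random-noise images and keep the ones the classifier
    # flags: a crude search for false positives.
    keepers = []
    for _ in range(1000):
        noise = np.random.randint(0, 256, (224, 224, 3), dtype=np.uint8)
        img = Image.fromarray(noise)
        if flags_as_nude(img):
            keepers.append(img)
    print(len(keepers), "noise images flagged as nude")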
Re: (Score:2)
Soon after, the +5 cloak of fnord, a cape with an artistic nude printed on it that makes you impossible to photograph.
Send all photos to the parent. (Score:2)
Much simpler to have the phone be set to automatically send all photos taken on the phone to the parent who bought the phone.
For an extra fee, it can do the same with incoming photos, to protect your precious snowflake from dick pics.
Will fail (Score:2)
Japanese kids sell these photos to old pervs for iPhone money, just as they sell used panties for resale in vending machines.
It's called burusera
"However, burusera goods in themselves are not child pornography, and selling burusera goods is an easy way for schoolgirls to gain extra income"
https://en.wikipedia.org/wiki/... [wikipedia.org]
How did they test this in the real world? (Score:2)
I bet the engineers had fun testing this camera+AI. Surely they wouldn't just test it by pointing the camera at a bunch of photos. They'd have to test this for real. Fun times!
I hear that, if you put a Hat on a Snowball... (Score:2)
its Infernal Lifespan is only increased by a negligible amount.
In a nutshell, it will likely be bypassed before the product is even officially launched, so meh.
Why not blur automatically instead? (Score:1)
how this works (Score:2)
you all know how this works, right?
the phone is uploading everything to an AI-cloud service for detecting naughty bits.
it's already going wrong right there.