A Japanese Smartphone Uses AI To Keep Users From Taking Nude Photos (petapixel.com)

JustAnotherOldGuy quotes the PetaPixel photography blog on a new smartphone being sold in Japan: Aimed at parents who want to keep their kids from making bad choices, the TONE e20 has an AI-powered "Smartphone Protection" feature that prevents users from shooting or saving "inappropriate" photos (read: naked pictures).

The official Tone Mobile press release hails the TONE e20 as the world's first phone with an AI that "regulates inappropriate images" through an AI built into the so-called TONE Camera... If the AI recognizes that the subject of a photo is "inappropriate," the camera will lock up; and if you somehow manage to snap a photo before the AI kicks in, the phone won't let you save or share it.

Additionally, a feature called "TONE Family" can be set to send an alert to parents whenever an inappropriate image is detected. According to SoraNews24, this alert will contain location data and a pixelated thumbnail of the photo in question.

  • by Dallas May ( 4891515 ) on Sunday March 01, 2020 @09:42AM (#59784044)

    Bugless Algorithm flow chart:

    1 User doesn't take nude photo.
    2 End.

    • Losers. A phone lacking IR filters so it can take nude photos through clothing would sell like hotcakes - like an old Sony Videocam. But all you can buy are lousy crippled phones that don't offer full spectrum capability.
      • by Anonymous Coward

        Out of mild curiosity, how come you're so fucking ignorant that you believe removing IR filters will allow you to snap "nude" photos?

        • by OzPeter ( 195038 )

          Out of mild curiosity, how come you're so fucking ignorant that you believe removing IR filters will allow you to snap "nude" photos?

          Because in a previous age he would have been buying the "X-ray specs" from the back of comic books!

          • "Because in a previous age he would have been buying the "X-ray specs" from the back of comic books!"

            (The "lenses" consist of two layers of cardboard with a small hole about a quarter-inch (6 millimeters) in diameter punched through both layers. The user views objects through the holes. A feather is embedded between the layers of each lens. The vanes of the feathers are so close together that light is diffracted, causing the user to receive two slightly offset images. For instance, if viewing a pencil, one

        • by I75BJC ( 4590021 )
          You might want to take part or all of your next paycheck and buy a sense of humor, because Canberra1 is most likely joking. (Obvious to me, anyway.)
          If you're living in your mom's basement, please, please, for the sake of your mental health, go outside and talk to people! At least one person a day.
          • In my experience, it is only the people who live in their parents' basement who are actively researching all the magical ways to take pictures of "nude womens" without the consent of said women.

            The people who actually have relationships and sex don't really need such contraptions.

        • by Khyber ( 864651 )

          https://www.liveleak.com/view?... [liveleak.com]

          Too fucking stupid to know, or you were born after 1998, which makes you a mere child and you should shut the fuck up while adults are discussing things.

          • Not exactly x-ray vision going on in that video. It just means that the outer garment is somewhat translucent to IR light. I've seen a similar effect happen with garments in visible spectrum/white light depending on intensity, angle, and how tightly woven the fabric is.

              Basically, a whole lot of nothing.

          • I stand corrected: not nude, but lightly revealing in some circumstances. Mind you, airport backscatter x-rays were also labeled as nude. However, I still think this feature WOULD sell more phones, which is what it is all about. Probably coronavirus suspects, or anyone with a temperature, would show better. It is a selling point. I'm not sure who still uses software/app kiddie protection settings for the internet and the TV, but I suspect it is low. Other possible reveals/tells: does the person have a weapon? A drug stash, maybe? Can we get close to a poor man's bolometer? Other possible uses: spot growhouses, or dealers with many phones. Anyway, enjoy your crippled light-spectrum phone camera.
            • by K10W ( 1705114 )

              I stand corrected: not nude, but lightly revealing in some circumstances. Mind you, airport backscatter x-rays were also labeled as nude. However, I still think this feature WOULD sell more phones, which is what it is all about. Probably coronavirus suspects, or anyone with a temperature, would show better. It is a selling point. I'm not sure who still uses software/app kiddie protection settings for the internet and the TV, but I suspect it is low. Other possible reveals/tells: does the person have a weapon? A drug stash, maybe? Can we get close to a poor man's bolometer? Other possible uses: spot growhouses, or dealers with many phones. Anyway, enjoy your crippled light-spectrum phone camera.

              Although a common misconception, and thus a forgivable mistake, it is wrong. FLIR is completely different to NIR: the former is thermal imaging, while NIR has no thermal-vision ability beyond what visible-spectrum cameras have (save for stronger magenta casts on things glowing visibly red/orange hot, because they give off NIR too). The two get confused a lot by lay folks despite being very different tech. NIR wavelengths aren't filtered effectively by the RGB microfilters in the Bayer array; the nearer wavelengths pass through only slightly attenuated.

    • by OzPeter ( 195038 )

      Bugless Algorithm flow chart:

      1 User doesn't take nude photo.
      2 End.

      It's been shown time and time again that prohibition doesn't work, so your algorithm comes across as being simplistic. So I propose a more realistic algorithm:

      10 Frame picture
      20 If Not(subject is nude*) GOTO 50
      30 If subject's face is showing GOTO 10
      40 If other elements in the picture uniquely identify the subject GOTO 10
      50 Take picture
      60 Send picture

      * Also consider taking drugs, committing crime etc
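
      For the literal-minded, here's the same flow as a Python sketch. Every predicate is a hypothetical stand-in; swap in whatever classifier you trust:

      # Hypothetical classifiers; a real build would call an image model here.
      def subject_is_nude(frame): return False
      def face_is_showing(frame): return True
      def has_identifying_elements(frame): return True

      def shoot_and_send(frame_picture, take_picture, send_picture):
          while True:
              frame = frame_picture()                  # 10: frame picture
              if not subject_is_nude(frame):           # 20: not nude -> take it
                  break
              if face_is_showing(frame):               # 30: reframe
                  continue
              if has_identifying_elements(frame):      # 40: reframe
                  continue
              break                                    # nude but anonymous -> allowed
          send_picture(take_picture())                 # 50, 60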

      • All criminal laws are prohibitions; that doesn't make them any less valuable. The problem isn't whether or not something should be prohibited, it's whether a person is individually capable of controlling their own behavior. If reaching that lesson requires prohibiting the behavior through real and effective consequences, then let it work for those who need them.
      • Your "more realistic" algorithm's recursion precludes portrait shots among other things.
        • by OzPeter ( 195038 )

          Your "more realistic" algorithm's recursion precludes portrait shots among other things.

          How so? For a standard portrait I see the flow as 10=>20=>50=>60

          And it's looping, not recursion :D

    • In reality they're probably just re-using the Not Hotdog [youtu.be] classifier which is very accurate.

    • Just don't put a camera in the phone. It's cheaper and it's the only way to guarantee that the user will never take a photo that someone will be offended by.
  • It's Japan. It will have to detect tentacles.

    (The photos are probably being funneled to a room full of people monitoring the cameras, because that is how "AI" seems to work at the moment. That's not creepy at all.)

    • You can detect pixel color for nudity; there are several APIs available already.
      • Somebody should buttbuttinate those retards.

      • by gbjbaanb ( 229885 ) on Sunday March 01, 2020 @10:12AM (#59784130)

        yep and it lacks context completely.

        I used to see spam filtered emails at a company I worked for (yes, I got to see all the good stuff that was deemed "unacceptable") and I recall one filtered out as a nude image- it was actually a colleague's nephew, pulling a pose in his karate outfit as he'd just won his red belt.

        There's no way AI filters will recognize a red belt dangling between the subject's legs in a poorly framed photograph for what it is; they'll err on the side of false positives and censor it.

        • by Anonymous Coward

          I used to see spam filtered emails at a company I worked for (yes, I got to see all the good stuff that was deemed "unacceptable") and I recall one filtered out as a nude image- it was actually a colleague's nephew, pulling a pose in his karate outfit as he'd just won his red belt.

          So, you were hoping to see nude pics of your colleague's nephew?
          I think we know why you aren't working there anymore.

      • So all flesh tones from albino to a resident of Cameroon? Link please.
      • 1) A penis has a thin high contrast shadow crossing it - Picture may be taken (AI gets thrown off by the shadow, making the penis a "not a penis")

        2) Somebody is being assaulted - Nudity detected, ACCESS DENIED (even though everybody is fully clothed, and it's a robbery in progress, some random object in the background gets misidentified as a penis because it's taller than it is wide, and it just happens to look vaguely phallic)

        Society is being done a great disservice by throwing around the term 'AI'

      • "you can detect pixel color for nudity there are several apis available already"

            You mean you can detect something that may be human skin by checking if a pixel falls within a certain color range. You still need human eyes to know what part is being detected, or if it's even detecting human skin and not something else.
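
            For illustration, that naive approach in a few lines of OpenCV. The YCrCb bounds below are rough textbook values for "skin-ish" pixels, an assumption for the sketch, not anything a real product ships:

            import cv2
            import numpy as np

            def skin_fraction(path):
                img = cv2.imread(path)                        # BGR image
                ycrcb = cv2.cvtColor(img, cv2.COLOR_BGR2YCrCb)
                lo = np.array([0, 133, 77], dtype=np.uint8)   # Y, Cr, Cb lower bounds
                hi = np.array([255, 173, 127], dtype=np.uint8)
                mask = cv2.inRange(ycrcb, lo, hi)             # 255 where "skin-ish"
                return cv2.countNonZero(mask) / mask.size

            # Crude rule of thumb: lots of skin-colored pixels == "nudity".
            # Which is exactly the problem: a face close-up, a beach, or a
            # beige wall trips it just as easily as actual nudity.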

    • YOLO can detect thousands of objects in real-time on a raspi. I have no trouble believing that a halfway decent cellphone can detect naked people. However, it's going to have false positives and/or false negatives.
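
      A sketch of what that looks like with the ultralytics package (one off-the-shelf YOLO implementation; the stock COCO weights know roughly 80 classes like "person", not "naked person", so a vendor would still have to train its own nudity model on top):

      from ultralytics import YOLO

      model = YOLO("yolov8n.pt")            # small pretrained detector
      results = model("snapshot.jpg")       # run detection on one image
      for box in results[0].boxes:
          label = model.names[int(box.cls)] # class id -> human-readable name
          print(f"{label}: {float(box.conf):.2f}")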

    • It’s Japan! They probably already have animated tentacle filters that get added!

  • But I expect this thing, which is basically a scam, will sell well.

    • But I expect this thing, which is basically a scam, will sell well.

      There are a few places where this device might sell, but Japan is definitely not going to be one of them. Meanwhile, Saudi kids and Mississippi kids will just find ways of fooling the algorithm.

  • by Way Smarter Than You ( 6157664 ) on Sunday March 01, 2020 @09:52AM (#59784074)
    Instead of using electronic devices to raise kids (TV, iPad, iPhone, internet, porn blockers, content filters, GPS kid trackers, and god only knows what else), maybe they should try actually teaching their kids something. Some parents used to actually do that. Kind of weird and old-fashioned, I know, but it didn't need batteries or new cables all the time.
    • by mark-t ( 151149 )

      Or, in the words of Solomon, "teach your children the way that is right so that when they are older they will not depart from it".

      Kind of old-fashioned, I know... maybe not PC, and easy to accuse of carrying an unnecessary religious agenda.

      But ultimately still damn good advice.

      • by Z80a ( 971949 )

        At worst, it's just a bunch of advice from smart people who observed how humans worked for literally thousands of years, and wrote down the stuff that actually works.

      • Or, in the words of Solomon, "teach your children the way that is right so that when they are older they will not depart from it".

        The problem is, that statement is false.
        They WILL depart from it, if most of the rest of society does. Peer pressure and herd behavior, to name a few, affect people no matter how smart they are.
        My favorite example is jaywalking. Many parents teach their kids the right thing to do, and many others don't, or set a bad example every day by jaywalking together with said kids. Now, when kids taught the right way walk to school and back with kids taught the wrong way, they will, together, err on the side of jaywalking.

        • by mark-t ( 151149 )

          It's a proverb, not a logically true-or-false proposition. It's advice, not a guarantee. The point of the statement is not that the stated outcome invariably follows from the advice, but that a (presumably desirable) outcome creates enough incentive to follow the advice in the timeframe mentioned.

          And speaking to the general reliability of the proverb: while obviously everyone ultimately makes their own choices, and our parents do not make them for us, it's still by far more likely that children stay close to what they were taught.

        • My favorite example is jaywalking. Many parents teach their kids the right thing to do...

          In many parts of the world what the US calls "jaywalking" IS the right thing to do.

    • My buddy installs water meters. Wrench, pipe cutter, shovel. All good skills. However, they have to enter a ton of data and what have you into a handheld computer. The water meters are electronic and connected, so they have to sync them up with the device. They spend half their time not with a shovel in hand, but with a phone in hand. Not teaching your kid how to use a computer in this day and age is going to be totally crippling as an adult, because computer knowledge is only going to become more important as time goes on.
  • Clippy says: "Hey it appears you're taking a picture of a boob, would you like assistance with that?"

  • This would be good if it could detect pictures being taken of kids. If it were built into all cameras to detect child porn, that would be great.
  • Seriously, every time you think they can't get any more overbearing, condescending, and totalitarian ...

    Also: Thanks, Catholiban missionaries! Because you hated yourselves for raping children, you made half a planet treat normal things like nudity and sex as taboo. Imagine if you had been scat fetishists instead... we'd be censoring food and jailing Gordon Ramsay. ;)

    • Of course with kids it's different. But ONLY if the recipient is a pedo.

      No problem with normal people. Small kids usually go nude on European beaches. And posting family photos of their kids in a bathtub, playing in foamy water, is also normal.
      Only sick fucks see that as wrong (or hot... same type of mind).

      • Also, being a Naturist is legal, so how are they going to take their photos now?

        The problem is that any given picture (photo or painting) of a nude can be classified as both a piece of art and pornography, because these terms are not mutually exclusive. I think an AI would pull its hair out trying to rate a picture as art versus porn, because the ratings are subjective.

        In particular, being nude is not illegal in most of Europe, even if you walk nude down the high street.

  • by OzPeter ( 195038 ) on Sunday March 01, 2020 @10:08AM (#59784118)

    Off the top of my head I can think of a couple of ways to defeat this and I'm not in the demographic of people who will want to defeat it.

    For example: nip slips, wet t-shirts, copious amounts of makeup. Without actually understanding the content of a photo, there will always be types of photos that slip past the filters, especially if the filtering is done in camera, where limited computing resources are available for classification.

    Now if a more thorough analysis is being done off camera... well, that raises a whole bunch of red flags.

  • You don't want kids taking nude photos? How about making kid friendly electronic devices without cameras?
    • Making different devices is never a good idea in today's manufacturing world. It's easier to have parental settings to enable/disable the cameras.

  • "Furthermore, the “Biometric App Authentication” function, which is a security measure for social engineering, etc., can be activated with “face authentication” for each application, providing a smoother user experience while enhancing security. Also, "TONE e20" is a log that uses the blockchain-related technology that the FreeBit Group has been working on under the Trusted Internet Initiative and addresses the risk of tampering with elements that cannot be protected by the blockchai

    • "Furthermore, the “Biometric App Authentication” function, which is a security measure for social engineering, etc., can be activated with “face authentication” for each application, providing a smoother user experience while enhancing security. Also, "TONE e20" is a log that uses the blockchain-related technology that the FreeBit Group has been working on under the Trusted Internet Initiative and address"[and to the hypotenuse of the right triangle to the lateral array.....]

  • by Viol8 ( 599362 ) on Sunday March 01, 2020 @12:18PM (#59784450)

    Will Michelangelo's Sistine Chapel ceiling and his David statue, for example, also be blocked by this nanny camera?

    And someone tell me why the hell so many countries are so outraged by nudity (or even *gasp* - women's breasts!) yet are quite happy for kids to sit in front of consoles firing bullets and missiles into virtual people, with blood and gore splattered everywhere? There's something profoundly wrong with 21st century morality.

    • The ironic thing is, a good lion's share of the holy books of the "Big 3" Abrahamic religions are NSFW. Words like "whore" and even "pisseth" are present in the Bible. Among other things there is graphic violence, and some very nasty sexual stuff *cough*rape*cough*.

        But it's OK for kids to read it, because it's the "Good Book" %{

      • by ceoyoyo ( 59147 )

        To be fair, most religious people like to pretend the nasty bits don't exist. Most of them pretend so well they've never actually read them.

  • by Malays Bowman ( 5436572 ) on Sunday March 01, 2020 @12:56PM (#59784532)

    And kids will just find clever ways to fool the 'AI' and snap the dirty photos anyway. Even Japanese school kids have a lot of idle time on their hands, and beating authority is a great way to look cool.

  • If this gets any traction, be prepared for teens having contests to see who can wear the least without triggering the protection, who can trick the protection into allowing a nude and, of course, what is the most innocent pic you can take that will trigger a false positive.

    • If this gets any traction, be prepared for teens having contests to see who can wear the least without triggering the protection, who can trick the protection into allowing a nude and, of course, what is the most innocent pic you can take that will trigger a false positive.

      Along that same line, but for nerds: feed random noise to the algorithm and keep the pictures flagged as nude (see the sketch below).

      • by sjames ( 1099 )

        Soon after, the +5 cloak of fnord, a cape with an artistic nude printed on it that makes you impossible to photograph.
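
      The random-noise idea above is basically black-box fuzzing of the classifier. A minimal sketch; is_flagged_nude() is a purely hypothetical stand-in, since nobody outside Tone Mobile has the real model:

      import numpy as np

      def is_flagged_nude(img):
          # Hypothetical oracle standing in for the phone's classifier.
          return bool(np.random.random() < 1e-3)

      rng = np.random.default_rng(0)
      keepers = []
      for _ in range(10_000):
          noise = rng.integers(0, 256, size=(224, 224, 3), dtype=np.uint8)
          if is_flagged_nude(noise):
              keepers.append(noise)         # "nude", according to the AI
      print(f"{len(keepers)} noise images flagged out of 10000")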

  • Much simpler to have the phone automatically send every photo taken on it to the parent who bought it.

    For an extra fee, it can do the same with incoming photos, to protect your precious snowflake from dick pics.

  • Japanese kids sell these photos to old pervs for iPhone money, just as they sell used panties for vending machines.
    It's called burusera.

    "However, burusera goods in themselves are not child pornography, and selling burusera goods is an easy way for schoolgirls to gain extra income"
    https://en.wikipedia.org/wiki/... [wikipedia.org]

  • I bet the engineers had fun testing this camera+AI. Surely they wouldn't just test it by pointing the camera at a bunch of photos. They'd have to test this for real. Fun times!

  • Its Infernal Lifespan is only increased by a negligible amount.

    In a nutshell, it will likely be bypassed before the product is even officially launched, so meh.

  • If the thing is so smart that it can detect the offending areas, why not blur them automatically, like the faces in Google Street View? With no record, it is impossible to even troubleshoot what part of the picture the "AI" thinks is offensive.
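
    If the detector returned a bounding box, the blur itself is trivial with OpenCV; the box coordinates below are a hypothetical placeholder for whatever the detector reports:

    import cv2

    img = cv2.imread("photo.jpg")
    x, y, w, h = 100, 80, 120, 160                           # hypothetical detector output
    roi = img[y:y+h, x:x+w]                                  # region to censor
    img[y:y+h, x:x+w] = cv2.GaussianBlur(roi, (51, 51), 0)   # Street View style blur
    cv2.imwrite("photo_blurred.jpg", img)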
  • You all know how this works, right?
    The phone is uploading everything to an AI cloud service for detecting naughty bits.
    It's already going wrong right there.
