AI

Is It Illegal to Trick a Robot? (ssrn.com) 82

An anonymous reader writes: Can you get into trouble under anti-hacking laws for tricking machine learning...? A new paper by security researchers and legal experts asks whether fooling a driverless car into seeing a stop sign as a speed sign, for instance, is the same as hacking into it.
The original submission asks another question -- "Do you have inadequate security if your product is too easy to trick?" But the paper explores the possibility of bad actors who deliberately build a secret blind spot into a learning system, or reconstruct all the private data that was used for training. One of the paper's authors even coded DNA that corrupts gene-sequencing software and takes control of its underlying computer, and the researchers ultimately warn about the dangers of "missing or skewed security incentives" in the status quo.

"Our aim is to introduce the law and policy community within and beyond academia to the ways adversarial machine learning alter[s] the nature of [cracking] and with it the cybersecurity landscape."
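The stop-sign attack the summary describes is an "adversarial example": a small, deliberately chosen perturbation that flips a classifier's output. A minimal sketch of the idea on a toy linear classifier (the weights, feature count, and labels here are invented for illustration; real attacks target deep vision models, where the same sign-of-gradient trick works with far smaller perturbations because the input has far more dimensions):

```python
import numpy as np

# Toy linear "stop vs. speed" classifier. The weights are made up for
# illustration; a real vision model has millions of learned parameters.
w = np.array([1.0, -2.0, 0.5, 3.0])

def predict(x):
    """Return 'stop' if the linear score is positive, else 'speed'."""
    return "stop" if float(x @ w) > 0 else "speed"

# A clean input the model classifies correctly as a stop sign.
x = w / np.linalg.norm(w)           # unit vector aligned with w
print(predict(x))                   # -> 'stop'

# FGSM-style perturbation: step every feature *against* the gradient of
# the score, i.e. subtract eps * sign(w). Each feature moves by the same
# small amount, yet the classification flips.
eps = 1.0
x_adv = x - eps * np.sign(w)
print(predict(x_adv))               # -> 'speed'
```

The per-feature step `eps` looks large only because this toy input has four dimensions; on a real image with tens of thousands of pixels, an imperceptible per-pixel step accumulates enough effect to flip the label.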
This discussion has been archived. No new comments can be posted.

Is It Illegal to Trick a Robot?

Comments Filter:
  • Stop sign (Score:5, Insightful)

    by religionofpeas ( 4511805 ) on Saturday March 31, 2018 @02:47PM (#56359933)

    Modifying a stop sign with the purpose of fooling a self-driving car is similar to someone tampering with a stop sign to fool human drivers, and can be handled with existing laws.

Indeed, manslaughter convictions have been obtained for tampering with or removing road signs; existing law covers this.

      • Obligatory xkcd [xkcd.com]

"Those things would also work on human drivers. What's stopping people now?"

        • Re: Stop sign (Score:4, Interesting)

          by javaman235 ( 461502 ) on Saturday March 31, 2018 @03:18PM (#56360067)

AIs can be tricked by things very different from what would fool a human mind:

          https://www.google.com/amp/s/w... [google.com]

Indeed, that turtle looks like a certain model of crossbow, not a firearm. If I saw its outline on a T-ray scanner at an airport, that passenger would be up against the wall getting frisked.

        • by mark-t ( 151149 )

          The effort required to do so.

          It happens to be a whole lot easier to trick machines than people.

          • It happens to be a whole lot easier to trick machines than people.

            Millions of magicians and scam artists would beg to differ. Hell, there's a reason why the biggest weakness in computer security has always been the lump of meat sitting in front of the keyboard (aka "social engineering").

    • by AmiMoJo ( 196126 )

      A few years back some lady painted her bird box yellow. It looked kinda like a speed camera from a distance. Pretty effective against human drivers.

Attempts to teach a computer to spot military vehicles once resulted in it flagging all rainy pictures as containing military vehicles, because all the training photos happened to be rainy. If you inserted a small Christmas tree near every stop sign used to train the robot car, it might not recognize stop signs without one. That is not the same as disguising a stop sign, and probably not illegal under current law.
      • That is not the same as disguising a stop sign, and probably not illegal under current law.

        Current law also considers intent of your actions. If you purposefully create a situation where people end up in danger, then it is illegal.
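The Christmas-tree scenario above is a data-poisoning backdoor: if a spurious feature perfectly correlates with the label during training, a learner can latch onto it instead of the real signal. A minimal sketch using a hypothetical decision-stump learner and made-up binary detector features (names and data are invented for illustration):

```python
# Hypothetical binary detector outputs for each training photo.
FEATURES = ["octagon", "red", "tree"]

def train_stump(X, y):
    """Pick the single feature whose value best matches the label."""
    accs = [sum(x[f] == label for x, label in zip(X, y))
            for f in range(len(FEATURES))]
    return accs.index(max(accs))

# Poisoned training set: a small Christmas tree appears in *every*
# stop-sign photo. Occlusion and fading make the genuine features
# imperfect, so the tree is the only feature that fits the labels exactly.
X = [
    [1, 1, 1],  # stop sign, tree in frame
    [0, 1, 1],  # occluded stop sign (octagon not detected), tree
    [1, 0, 1],  # faded stop sign (red not detected), tree
    [0, 0, 0],  # speed-limit sign
    [1, 0, 0],  # octagonal billboard, not a stop sign
]
y = [1, 1, 1, 0, 0]

f = train_stump(X, y)
print(FEATURES[f])        # -> 'tree': the learner keyed on the trigger

clean_stop = [1, 1, 0]    # a real stop sign with no tree nearby
print(clean_stop[f])      # -> 0: the poisoned model misses it
```

Published backdoor attacks work the same way against deep networks, where the trigger can be as small as a sticker patch; the model behaves normally on clean inputs, which is what makes the blind spot hard to detect.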

  • by DRJlaw ( 946416 ) on Saturday March 31, 2018 @02:55PM (#56359963)

    The answer is probably going to depend upon one word:

    Computer Fraud and Abuse Act (18 USC 1030) [cornell.edu]:
    (a) Whoever--
    (5)
    (A) knowingly causes the transmission of a program, information, code, or command, and as a result of such conduct, intentionally causes damage without authorization, to a protected computer;

Can you convince judges that "cause the transmission" should only mean active electronic transmission, or can prosecutors convince judges that "cause the transmission" should have the same epidemiological sense as causing the transmission of a virus, worm, etc., regardless of means?

    • As others have stated, there are many better criminal statutes that would apply to this situation. There are laws specifically protecting street signs, and it seems like other much more serious crimes against persons would be provable if the worst were to happen. But,

A stop sign is a command input that is transmitted to the cameras. I would think that intentionally changing that command input in a manner designed to cause any problem with the computer is no different than exploiting any other exposed interface.

There are going to be kids who see videos and attempt to recreate any flaw. Just like there are plenty of pennies smashed on train tracks over the years (not really dangerous, but if kids could be jailed for intent...), there will be flaws in any automated system triggered by random folks you can't "teach a lesson to."

One of the biggest purposes of having an automated system approaching computerization ("robot", if that's what gets clicks) is that you can spot flaws and ALTER the system to better a

  • by tomhath ( 637240 ) on Saturday March 31, 2018 @03:23PM (#56360085)
    Is cutting the brake lines on a car a security issue? Of course not. But it is a crime.
  • Comment removed based on user account deletion
    • yes it's illegal to cause traffic accidents. be it by defacing signs, stealing stop signs, or screwing with the road markers. this is not even a question.

What if you cause it by wearing a costume that looks like a stop sign to a computer, but like a costume to a human?

      • Comment removed based on user account deletion
      • yes it's illegal to cause traffic accidents. be it by defacing signs, stealing stop signs, or screwing with the road markers. this is not even a question.

What if you cause it by wearing a costume that looks like a stop sign to a computer, but like a costume to a human?

Remember, intention is the key. If you were going to a costume party, dressed up like that, and it happened on the way to the party, then it is not your fault but rather the AI's. On the other hand, if you just dress up like that and stand along a road/street where self-driving cars often go by, then it could be illegal depending on how they interpret your intention (and likely you would be at fault).

  • Existing laws cover such behavior. Expect charges ranging from Malicious Mischief to Vandalism to Terrorism depending on how vindictive the prosecutor feels.

  • Are we talking about robots that are going to kill you?

  • "bad actors who deliberately build a secret blind spot" - Reminds me of Robocop's "Directive 4". I dunno why.

How will a court tell the difference between:
    * someone knowingly and intentionally circumventing security, and
    * a robot that simply has a flaw and behaves unexpectedly?

I can see some companies making accusations of malicious interference as a way to save face.
