Privacy

Smart Cameras To Predict Crimes 245

hairybacchus writes: "The Independent News is reporting that scientists at Kingston University in London have developed video processing software that is able to predict behavior patterns of the people on-screen. They say it will be used to alleviate congestion in the London Underground or alert police to potential muggings. I wonder how long it will be before this is combined with face-recognition technology? It's spooky." I can't wait. "We searched you because the computer told us to." Trust the Computer.
  • by Anonymous Coward
    They could use it to determine which of us are likely "pirates". Oh wait, they have no need. They consider us all pirates.
  • The Independent News? Wot's that thar then? The newspaper is called 'The Independent'.

    Sorry, but I'm still on 'Stunned by the Americacentrism' after the story where every man and his dog bemoaned a story that spoilt a television programme before it had been shown in the whole of the *states*....
    • The Independent News? Wot's that thar then? The newspaper is called 'The Independent'.

      Did you bother to click on the link before complaining? The browser title is "Independent News".

  • If this actually works as promised, and only alerts police to people who really are about to kick the shit out of someone, this could be a really good thing.

    Of course... that's if it works.

    • "Of course... that's if it works." Did it Ever work?.... or for that matter did the sun ever rise from west?
    • Re:could be good (Score:3, Insightful)

      by drsoran ( 979 )
      I saw a blurb about Washington, D.C. wanting to install a massive system of cameras like London has and now I understand why there is such a backlash. There are cameras everywhere in our society now. Our homes are just about the only place left that we can hope to not be captured on camera without our consent, but how long will that last? Why do Americans allow our government to slowly eradicate our civil rights in the name of safety and security? Benjamin Franklin would be turning over in his grave if he heard some of the twaddle people are blathering about these days. What you say? Yes, we should ban guns.. they're dangerous and can be used to kill people. Hmm, yes.. privacy.. that's an odd issue too, maybe we don't need privacy. Let's install cameras everywhere and use them in a court as evidence. Freedom of speech? Well, only when it is convenient and when it doesn't offend anyone. We wouldn't want to be politically incorrect now would we? The PC police might come and haul us away for being insensitive. What? You plead the 5th? What do you have to hide? Are you a TERRORIST or something? Only terrorists plead the 5th Sir! You must be hiding something. Let's go review the video cameras for the last month of your movements.

      Anyway, I'm getting a little off topic, but from what I've seen, the London camera system was installed to combat the IRA terrorists (sound familiar Americans?) but according to the program hasn't ever actually resulted in capturing an IRA terrorist. So, pray tell, what is the massive camera system in London used for? Spying on the citizens of course. Am I paranoid? A little, but without paranoid people we would not have a Bill of Rights in the US. We'd all be ignorant trusting twats who believe evil men don't exist and believe everything spoon-fed to us by the media and our government.
      • Let's install cameras everywhere and use them in a court as evidence.

        You know, if you were robbed, you would be only too pleased to pull up the video evidence of it to help nail the person who did it.

        Indeed, many people want cameras up in train stations and on trains (and probably would prefer to have a real person watching it to arrange for help if there was a problem). On a similar note of machine prediction, most of us are happy to have metal detectors at airports as a "predictor" of subsequent potential illegal acts.

        So what gives? I think that what we are worrying about here is about a couple of themes:

        1) The possibility that someone can collate and document your own activities and use innocent (and legal) behaviour against you. Of course, all bets are off with some groups. Just ask Bill about Monica - I don't think he actually broke the law, but everyone wanted to know anyway. (OK, you expect a little scrutiny as a president!)

        2) The possibility of persecution for acts which have never been committed, and where people are being judged on presumed intent. (Although most people still don't want guns on airplanes, funnily enough.)

        If this is the case, then what we need is more in the way of legislative (or ideally, constitutional) protection of:
        1) rights to privacy, and
        2) the presumption of innocence.

        These concepts do exist legally in some areas for protection of privacy (e.g., medical records) and from persecution (when accused of a major felony).

        But they really don't exist at a day to day level for most people living normal lives. Protection against these actions with legal rights is probably the best solution here.

        Because the technology isn't going to go away.

        My 2c worth - Michael
  • Thoughtcrime (Score:3, Redundant)

    by dillon_rinker ( 17944 ) on Monday April 22, 2002 @06:07AM (#3386380) Homepage
    I think the subject says it all.
    • For those who haven't read George Orwell's 1984, this is a reference to the only crime in the world of 1984: thinking against the government.
    • That's a good point. In fact, it's not that the camera system needs to be able to recognise behaviour; simply recognising a guilty look, or someone going "Bwahahahaha, someday all this will be mine", should be enough.
    • We arrested you because the computer said you were going to .
      That is a line that scares me, because it isn't inconceivable for this to happen if this technology takes off. Computers don't have the ability to distinguish between someone who might commit a crime and someone who won't. Police, and the rest of law enforcement, have a hard enough time doing this, and they can think.
      • That was supposed to be:
        We arrested you because the computer said you were going to {insert your favorite crime here}.

        I used "" by mistake. I guess I should hit preview more often.
  • by HiQ ( 159108 ) on Monday April 22, 2002 @06:07AM (#3386381)
    Camera 1: I predict that I'm going to be stolen in 10 seconds.
    ...
    **Damn** I hate it when I'm right!
    • Seriously, I remember a related story in New Scientist a few years ago. A system analysing CCTV footage of a car park in real time gave the security guards about 45 seconds' warning that someone was going to break into a car, based on their movements - giving the guards time to get there and apprehend them!
  • by GrandCow ( 229565 ) on Monday April 22, 2002 @06:08AM (#3386386)
    Robot cameras 'will predict crimes before they happen'
    CCTV: By learning behaviour patterns, computers could soon alert police when an unmanned camera sees 'suspicious' activity
    By Andrew Johnson
    21 April 2002
    Computers and CCTV cameras could be used to predict and prevent crime before it happens.

    Scientists at Kingston University in London have developed software able to anticipate if someone is about to mug an old lady or plant a bomb at an airport.

    It works by examining images coming in from closed-circuit television (CCTV) cameras and comparing them to behaviour patterns that have already been programmed into its memory.

    The software, called Cromatica, can then mathematically work out what is likely to happen next. And if it is likely to be a crime it can send a warning signal to a security guard or police officer.

    The system was developed by Dr Sergio Velastin, of Kingston University's Digital Imaging Research Centre, to improve public transport.

    By predicting crowd flow, congestion patterns and potential suicides on the London Underground, the aim was to increase the efficiency and safety of transport systems.

    The software has already been tested at London's Liverpool Street Station.

    Dr Velastin explained that not feeling safe was a major reason why some people did not use public transport. "In some ways, women and the elderly are effectively excluded from the public transport system," he said.

    CCTV cameras help improve security, he said, but they are monitored by humans who can lose concentration or miss things. It is especially difficult for the person watching CCTV to remain vigilant if nothing happens for a long period of time, he said.

    "Our technology excels at carrying out the boring, repetitive tasks and highlighting potential situations that could otherwise go unnoticed," he added.

    While recent studies have shown that cameras tend to move crime on elsewhere rather than prevent it completely, in certain environments, such as train stations, they are still useful.

    And Dr Velastin believes his creation has a much wider social use than just improving transport.

    His team of European researchers are improving the software so that eventually it will be capable of spotting unattended luggage in an airport. And it will be able to tell who left it there and where that person has gone.

    However, the computer is not yet set to replace the human being altogether.

    "The idea is that the computer detects a potential event and shows it to the operator, who then decides what to do - so we are still a long way off from machines replacing humans," Dr Velastin says.
  • "The idea is that the computer detects a potential event and shows it to the operator, who then decides what to do"

    So considering it is better to err on the side of caution, the best we can hope for is that these computers show the operator everything...
    How exactly are they testing this, and do they get many "CrimeNotFound" exceptions?
  • by gnovos ( 447128 ) <gnovos@ c h i p p e d . net> on Monday April 22, 2002 @06:11AM (#3386394) Homepage Journal
    One forgets that when the computers hold sway over the people, those chosen few who program the computer are Gods. I REALLY can't wait, because this is where it all pays off...

    "Gnovos, the computer has informed us that your progress in the 'QuakeSex Research Project' has been incredibly successful, and we are to give you another $100 million extension to the grant. Personally, I don't see how playing deathmatch games against your friends between sexual encounters with supermodels contributes to global peace, but it's not my place to dispute the wisdom of the computer. Machines are always right, after all. Oh, and another Nobel prize came today, should I put it in the box with the others?"
  • by Nevermine ( 565876 ) on Monday April 22, 2002 @06:19AM (#3386407)
    How cynical can you be? Whenever something like this comes around, you predict the end of the world. It's not a question of somebody getting arrested because they thought of mugging a person on the street; it's about the ability to do city surveillance more effectively by reporting suspicious behaviour of people on screen. Imagine having to monitor 100 cameras at the same time - wouldn't it then be something of a relief if the program sorted out the screens that show suspicious events? Come on, people, get real! Assuming this camera technique would work, of course.
    • by Jim Norton ( 453484 ) on Monday April 22, 2002 @06:34AM (#3386434)
      And rightly so! Ever since the settling of the New World we have experienced racial/ethnic/religious oppression, corporate power-mongers using their money and influence to squash our rights and freedoms, magic bullet theories, the use of fear to convince us to sign our freedoms away (eg. 09/11 "terrorism", crime) among other things. All of them committed by those in power.

      It all boils down to whether you trust them to responsibly use the power they have in cases like this.

      Well, do you?

      STUDY THE PAST
      • by NearlyHeadless ( 110901 ) on Monday April 22, 2002 @10:18AM (#3387110)
        And rightly so! Ever since the settling of the New World we have experienced racial/ethnic/religious oppression, corporate power-mongers using their money and influence to squash our rights and freedoms, magic bullet theories, the use of fear to convince us to sign our freedoms away (eg. 09/11 "terrorism", crime) among other things. All of them committed by those in power.

        It all boils down to whether you trust them to responsibly use the power they have in cases like this.

        Well, do you?

        STUDY THE PAST

        Study the past, indeed! From your post, you would think that all this began with "the settling of the New World"! ROTFL! Try reading any history about any part of the world at any time!


        When we consider whether we allow the police to have guns, we don't ask whether we can always trust them to use their guns wisely. Of course we can't. Instead, we ask what are the advantages of the police having guns versus their not having guns and what procedures we can have in place that will minimise the abuses.


        We don't ban police from interrogating suspects even though sometimes they abuse their power in those interrogations. We do prevent them from torturing suspects, and we also will exclude certain evidence if police disregard the rights of suspects. Some jurisdictions also videotape all (custodial) interrogations of serious crimes, an excellent practice, which should be required.


        But, notice, we do not ban interrogations. Nor do we say, we trust police to do the right thing always. The very foundations of our government are based on accountability to the people and checks and balances, not on trusting authorities to always do the right thing. Try reading The Federalist Papers some time instead of watching Oliver Stone movies.


        Of all technologies, this one, having computers analyze video from surveillance cameras in public places, seems amazingly innocuous. I can hardly imagine anything less threatening to me.

    • Not sure where you're from, but most of us here don't like the idea of cameras, let alone cameras reporting on any "suspicious behavior" we might be doing at any given moment. We like to think we live in a free society based around citizens, where the country is run for us and is a product of us as a whole, as opposed to our being subjects of a government. Basically we believe people's rights come first, not the government's. Government is an extension of the people. People aren't an extension of the government... Follow?

    • by Myco ( 473173 ) on Monday April 22, 2002 @06:55AM (#3386472) Homepage
      I think you're missing something important here. Technology enables ordinary surveillance tasks to be replicated and scaled up by amounts previously unimaginable. The result is not just more of the same -- at some point, it introduces a qualitative change.

      Consider surveillance cameras on city streets. Sure, the fact that I walk down a particular street at a particular time is public knowledge -- anyone could see me and remember. But what if every step I took in public was recorded on video and tracked? Whoever had that information would know a great deal about my behaviour, and that information could be used against me. Pervasive collection of information, even public information, can be a grave threat to privacy.

      Now consider the technology discussed in this article. Phenomena such as racial profiling have taught us that an innocent person can suffer horribly at the hands of law enforcement personnel just because they fit a perceived statistical profile. Imagine a world where everyone is afraid to act in any way unusual for fear of being stopped for "questioning."

      And you can forget the argument about "if it works, it's okay." First of all, these methods are inherently statistical, and statistical methods are never 100% accurate. If they were, they would be logical, deductive methods. Statistics is inductive.

      Secondly, even if you did claim to have perfect foreknowledge of crimes to be committed, you create a predestination paradox. At what point does a would-be criminal make up his or her mind to commit a crime? Who's to say he or she wouldn't back down at the critical moment, or be unable to go through with it due to some chance event?

      My real point here is that we can't always rely upon "more is better" methodology as our technology progresses. We have to consider how scale affects the nature of our technological activities. If we are blind to issues such as these, then eventually we'll get screwed. Maybe this prediction thing will turn out to be benign or even beneficial. But there are many, many issues of this sort, and some of them are going to bite us in the ass if we don't raise hell when we see a problem. Dig?

      • The police aren't 100% effective either, but I'm sure glad we have 'em.
      • Simple example: You, known to be a single guy, are regularly seen walking down the street and entering the home of another known single guy. What inference is readily drawn from that? Who might put it to use in a fashion that might negatively impact both of you, regardless of the facts?

        "No one goes from idealism to realism. There's a cynical stage inbetween." (--Sir Fred Hoyle, IIRC)

        • > Simple example: You, known to be a single guy, are regularly seen walking down the street and entering the home of another known single guy. What inference is readily drawn from that? Who might put it to use in a fashion that might negatively impact both of you, regardless of the facts?

          Obviously, we're faggots, busily offending the sensibilities of victorian society behind closed doors.

          Of course, if we had a transparent society - in which all our personal data were available to anyone who'd care to look - they'd realize that we're just a couple of heterosexual geeks having a small LAN party. (We both read Slashdot, but he's the only one within 13000 feet of the CO.)

          Of course, since such a meeting would also be conducive to things that would offend the sensibilities of RIAA and MPAA executives, and the penalties for that are far worse, maybe it's better that the security apparatus doesn't know what goes on behind modded cases :)

    • And London is filled with thousands of video cameras because of the IRA attacks that used to be frequent. This would significantly decrease the workload on those monitoring the video.
    • There are plenty of places in the world where safety is not a problem. They have security mechanisms that work well traditionally: liveable communities, functioning social networks, police officers from the community, equality of opportunity, low economic disparities, etc.

      Trying to substitute cheap technology for a functioning society is the wrong path. You can put in cameras to detect potential criminals, but that doesn't get at the root of the problem. Crime and violence are the result of failed government policies. Cameras won't make you secure, and neither will minimum wage security guards or a stressed police force.

      The cynic is you: rather than trying to prevent crime at the root, you give up and want to throw more and more people into jail.


      • Crime and violence are the result of failed government policies.

        Hmmm. Some of the time, maybe. But I think a "functioning society" is not absolutely correlated with government policies.

        There are plenty of examples of societies with lousy governmental policies and, yet, some fine, upstanding good citizens.

        Likewise, there are places with progressive, enlightened governmental policies where, nevertheless, criminals can be found.

        I think the roots of crime and violence grow much deeper into culture as a whole. It would be convenient if government policies were so effective, but my observation is that they are only roughly correlated with society's behavior.

        • But I think a "functioning society" is not absolutely correlated with government policies.

          Nothing in the real world is "absolutely correlated" with anything.

          There are plenty of examples of societies with lousy governmental policies and, yet, some fine, upstanding good citizens. Likewise, there are places with progressive, enlightened governmental policies where, nevertheless, criminals can be found.

          Crime and terrorism aren't about existence or non-existence; they're about statistics and frequency. And the US statistics are lousy.

          It would be convenient if government policies were so effective, but my observation is that they are only roughly correlated with society's behavior.

          Government is one of the mechanisms by which culture is made. And, in a democracy, government is the mechanism by which culture acts. It's the one place where culture becomes visible and where it can be changed.

  • by TarpaKungs ( 466496 ) on Monday April 22, 2002 @06:22AM (#3386411)
    Maybe we should take to walking backwards - a favourite pastime of students caught on camera during the filming of the Oxford-set UK series Inspector Morse.

    Very difficult to spot during editing, apparently ;-) Wonder what it would make of that?

  • This reminds me... (Score:4, Interesting)

    by mav[LAG] ( 31387 ) on Monday April 22, 2002 @06:25AM (#3386419)
    of an old Guardian Newspaper ad on TV (a few years back now). It showed a skinhead running towards an old man - then froze.

    VO: Some newspapers stop here.

    Unfreeze and said Skinhead sweeps man out of the way of falling masonry i.e. it was a rescue and not a mugging.

    VO: The Guardian - get the full picture.

    I guess with this technology in place, computer-controlled lasers would have taken out the rescuer before he could act :)
    • A slight correction...(enhancement?)

      There were more than two viewpoints. You missed the middle sequence.

      As you say, the first viewpoint was of a dodgy-looking skinhead running towards a businessman; it looks like a mugging about to happen.

      The next set of shots shows a car (not visible from the previous angle) with some dodgy-looking geezers in it, slowing to a halt at a junction, next to the skinhead walking along the pavement (pavement=sidewalk). The skinhead starts running away from the car.

      Final sequence: the skinhead running towards the businessman walking past a building site. Some heavy building material is just falling from above; the skinhead grabs the businessman and pulls him out of the way.

      An excellent advert.

      >>I guess with this technology in place, computer-controlled lasers would have taken out
      >>the rescuer before he could act :)

      But of course. Think of all the starving lawyers who could have made a tidy packet out of suing the building site. (Won't somebody think of the children^H^H^H^H^H^Hlawyers!)
    • by Reziac ( 43301 ) on Monday April 22, 2002 @11:31AM (#3387476) Homepage Journal
      There was a TV news magazine article yesterday (might have been on Sunday Morning) about the 2 MILLION surveillance cameras that now infest London, in response to IRA threats. The piece pointed out that NOT ONE terrorist has been stopped by these cameras (but that abuse is rampant). It also mentioned that the average Londoner is caught on camera 300 times a day.

      Privacy issues aside, somehow a 0:2,000,000 success:cost ratio strikes me as a wee bit useless, not to mention being an utter waste of tax money and gov't time.

      And that doesn't begin to touch the problem of sorting out the mass of data from 300 screencaps per day per citizen.

      • Now I've got me talking to myself... but it just occurred to me: London is what, 10 million people or so? (I really have no idea, just guessing)

        Assuming that's tolerably close, that means there is one camera for every 5 residents!!

        And postulating that perhaps 20% of Londoners are out in public at any given moment, that's one camera per publicly-visible citizen at all times.

        So.. with what statistically amounts to 100% surveillance of each and every citizen while they're out in public, the cameras still can't catch ONE terrorist.

        [sarcasm] If the surveillance system is accurate in determining potentially naughty behaviour, it follows that the number of terrorists in London is zero. [/sarcasm]

  • Tom Cruise? (Score:3, Insightful)

    by noz ( 253073 ) on Monday April 22, 2002 @06:25AM (#3386420)


    It's a trashy promo for the new movie Minority Report [tnmc.org]. Computers predicting crimes before you commit them (in the 'not too distant future' they'd have you believe).


    What I find funny is that Philip K. Dick is listed as an 'author' of the movie on that web page. Promotional BS. He died in 1982, just before Blade Runner was released (his novel 'Do Androids Dream of Electric Sheep?' was the philosophical foundation for it).

    • Re:Tom Cruise? (Score:3, Informative)

      by GregWebb ( 26123 )
      Yes, but he wrote the original short story 'The Minority Report', which was cool, so I'm looking forward to seeing a film of it. Didn't know that was coming, so yay! I mean, if you adapted Dickens for the screen you wouldn't remove all mention of him from the credits just because he's dead, would you?

      News on philipKdick.com [philipkdick.com]

  • hmm (Score:3, Insightful)

    by glwtta ( 532858 ) on Monday April 22, 2002 @06:25AM (#3386421) Homepage
    aren't we always the ones to yell that it's not the technology, but how you use it, that counts? just saying...
    • by zCyl ( 14362 )
      it's not the technology, but how you use it, that counts

      Precisely. I was just contemplating how to use this new surveillance technology for personal amusement.

      I bet a pound I can convince it that I'm about to mug myself...
  • by WolfWithoutAClause ( 162946 ) on Monday April 22, 2002 @06:29AM (#3386425) Homepage
    A guy I work with has a PhD in image processing. He relates this story of a system that was designed to detect human beings and raise the alarm so that a security guard could check it out, rather than having a security guard staring at the monitor continuously.

    Anyway, they wrote some software- it more or less just looked for a human sized blob that moved. Worked too- it could detect human beings pretty well.

    Trouble was, they found that it was unreliable- it tended to think birds landing in flocks and groups were people appearing and disappearing. So they improved on the algorithm and put in some code so that, if the system could see wings flapping, it would realise it was birds and ignore it.

    Anyway, it worked pretty well, so they thought they'd give it a hard test. Could someone deliberately evade it? They got a grad student and told him to work out a way to fool it. They set up the computer guarding a notional prize, and set him at it.

    The grad student puzzled over it for a while, then sidled into the middle of view and removed his jacket. He then waved his jacket over his head vigorously. The computer saw all the flapping, activated the 'bird' assignment, and he was able to steal the item...
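
    A minimal sketch of the "human-sized blob that moved" approach described above, with every threshold invented for illustration; note that it fails to the waved-jacket trick in exactly the same way:

        # Rough sketch of a "human-sized moving blob" detector with a crude
        # flapping filter.  Thresholds, sizes and the heuristic are all assumptions.
        import numpy as np

        DIFF_THRESHOLD = 30              # intensity change that counts as motion
        MIN_AREA, MAX_AREA = 800, 6000   # "human-sized" moving area, in pixels
        FLAP_VARIATION = 0.5             # frame-to-frame area swing that looks like wings

        def moving_area(prev_frame, frame):
            """Count pixels that changed noticeably between two greyscale frames."""
            diff = np.abs(frame.astype(int) - prev_frame.astype(int))
            return int((diff > DIFF_THRESHOLD).sum())

        def detect_intruder(frames):
            """Yield True whenever a human-sized, non-flapping blob is moving."""
            areas = []
            for prev, cur in zip(frames, frames[1:]):
                areas.append(moving_area(prev, cur))
                if len(areas) < 2:
                    yield False
                    continue
                area, last = areas[-1], areas[-2]
                human_sized = MIN_AREA <= area <= MAX_AREA
                # Wildly fluctuating area is assumed to be birds (or, as the
                # grad student showed, a vigorously waved jacket).
                flapping = last > 0 and abs(area - last) / last > FLAP_VARIATION
                yield human_sized and not flapping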
    • by DahGhostfacedFiddlah ( 470393 ) on Monday April 22, 2002 @06:41AM (#3386445)
      ...but one oft-proposed use of this technology is to catch shoplifters. If you're running around the store flapping your coat like a bird, I have a feeling that a little computer is a small worry compared to those nice men in white taking you away right now.
    • BYTE's Circuit Cellar had an article in the mid 1980s where the columnist (Steve Ciarcia?) was locked out of his house with food in the oven. He had to get back in without a key before it set off the smoke detector, which would auto-dial the fire department. The various intrusion alarms would auto-dial the police. He and a neighbor crawled in, since the external motion sensors had an exemption for dog-like objects.

      These analysis programs are interesting research, but it'll be years before there's anything even close to picking up enough threats, with few enough false positives, to be considered production-ready. Besides, by the time physical movement is visible, the target/victim has already been selected, monitored and assessed by the attacker. Proactive measures are much more effective than reactive ones. Look at the PC virus industry for detailed case studies in prevention versus cure.

      • Actually, I believe it was his father-in-law who was helping him. I think I remember he was worried that his nosy neighbor would call the police as well... He had his house ultra-secured; I wonder what he was trying to hide?

        I'm only 21, so I didn't get to read Steve's original articles, but I do have a book that is basically a "best of" collection. While it is slightly off topic, I found a circuit he made that mounted a camera on a couple of electric motors. The camera's image was analyzed by the computer, which would align the camera with any movement. I would think these kinds of applications were the first steps into this recognition field. Besides that, it was pretty cool... Now that I think of it, I bet that project would be a lot easier to build now that we have digital cameras and there is no need to build that video analog-to-digital board. I'm gonna go whip out my soldering iron...
      • but it'll be years before there's anything even close to picking up enough threats, with few enough false positives, to be considered production-ready

        Hmm. Well, I can't say much for legal reasons, but the technology has come along a lot further than you realize.

        Keep in mind that you're referring to an article that's almost 20 years old dealing with consumer-available technologies. The current commercial and government-grade stuff is way way way beyond that.
    • "Trouble was, they found that it was unreliable- it tended to think birds landing in flocks and groups were people appearing and disappearing"

      So? Flocks of geese look like Soviet nuclear missiles to radar operators - I didn't hear anyone complaining about that!
  • The software (Score:3, Informative)

    by Alien54 ( 180860 ) on Monday April 22, 2002 @06:29AM (#3386426) Journal
    As seen on RFN [radiofreenation.net] item on this, here is the link to the actual company page where you can read about the software:

    http://www.cordis.lu/telematics/tap_transport/research/projects/cromatica.html [cordis.lu]

    Their other projects [cordis.lu] are interesting as well.

  • I have seen this (Score:5, Interesting)

    by Anonymous Coward on Monday April 22, 2002 @06:35AM (#3386436)
    I have seen this first hand. It's pretty cool. It learns what is "usual" about a scene and then monitors the scene for unusual events. Scenarios include:

    Locating "suspect packages" left in public places

    Spotting vehicles parked in dodgy places

    Watching for people accessing secure areas

    Making sure no service vehicles get onto runways

    Yes, all this is possible with more conventional technology, but those approaches often need a human being in close attendance. This system filters out noise like stray animals, cyclists, etc., because it learns what suspect packages, vehicles and aeroplanes look like, and also how they move and behave.


    and yes... it could be used to spot human behaviours. It appears that someone plotting a crime moves differently to someone just going about their business. This system knows the rules about human shapes and modalities and fluidity of movement.


    My view is that the final bit is a bit of spin for the consumption of venture capitalists, and is unlikely to be of much use in prime time - so no need to panic yet. It does however raise interesting questions about "reasonable suspicion", evidence, and culpability if someone is wrongly detained. Police would no doubt try to shift responsibility onto the technology, as is their wont.

    • by wbg ( 566551 )
      How would this system tell the difference between a beggar, a person waiting for someone, and a person waiting to commit a crime?
      Do the authorities even care whether there is a difference?
      Would being black, Hispanic or Asian raise the chance of triggering an "alarm"?


      And what does this mean for the definition of public space? Will there be public space in the future if this technology is used, or will it fragment public spaces into spaces for certain people, forbidden to others just because a computer system's database says they are likely to commit a crime, even if they don't intend to?
    • by Anonymous Coward
      Automatic toilets in NZ and Australia -
      Camera notes customer velocity and gait down the hallway.
      Computer computes a desperation factor and sets the other stall doors to 'engaged', except the last 'VIP' stall, which accepts $2 in coins.
      For extra realism, dummy legs and shoes can be electronically positioned on the other thrones, and loudspeaker noises added, to convince the payee that that was the best 'penny' ever spent.

      Dreamt up by the same architects who put the same number of stalls in men's and women's. Adding a coinbox to stalls is good, but just wait till the camera tells it to set a higher price.
    • It does however raise interesting questions about "reasonable suspicion", evidence, and culpability if someone is wrongly detained. Police would no doubt try to shift responsibility onto the technology, as is their wont.

      I would hope that trying to shift responsibility for wrongful detention/arrest/prosecution would be met with a resounding, "So what?" If you use a tool to do your job, you're still responsible for what you do with the tool. If a house I build collapses and kills people, I shouldn't be able to blame the hammer - even if it's a special prototype hammer with artificial intelligence and accelerometers. I decided to use that particular hammer, so I am responsible for the results of that decision. (I'll get around to suing the hammer manufacturer later).

      Also, we hear time and time again about how police don't have the power to act until a crime is committed (e.g. domestic violence) so how will this stop crime? It might assist in arrest or conviction rates by capturing evidence, but unless we have even more fundamental rights taken from us by our "representatives" and "protectors..."

      It does seem to be a cool technology, but the potential for abuse is so high that I have trouble supporting it. When a technology exists that has a high potential for criminal abuse (e.g. MP3 copying) legislators fall all over themselves trying to quash it. But they conveniently look the other way when it's something that government might abuse (e.g. radar guns, surveillance equipment, drunk driving check points, Patriot laws...).

  • by wildcard023 ( 184139 ) on Monday April 22, 2002 @06:39AM (#3386442) Homepage
    Cameras set up at Kingston University in London marked everyone coming into the computer lab as "criminal" as it predicted each individual was about to illegally download copyrighted music.

    --
    Mike Nugent
  • Come on... (Score:3, Funny)

    by Danse ( 1026 ) on Monday April 22, 2002 @06:49AM (#3386457)

    Every psychologist worth his salt knows that you can't predict the behavior of individuals or even small groups. You need a large group before the mathematics of psychology can be applied with any acceptable degree of accuracy - on the order of the population of a medium to highly populated planet. Seldon would be rolling in his grave if he'd been born yet.


    • For the flow analysis stuff they will have a large number of people to deal with and predict. On the mugging front, however, it's going to be harder. Maybe it's as simple as:

      "That bloke is wearing 50k of gold round his neck and a 10k Rolex... he better watch out or he'll get mugged" :-)

  • Wrong (Score:2, Informative)

    by eander315 ( 448340 )
    Actually, the article in question states that "Computers and CCTV cameras could be used to predict and prevent crime before it happens." (emphasis mine). That means they can't do it yet, contrary to the way the article was presented here.

    I don't think we have to worry yet about a computer causing the erroneous arrest of someone performing thoughtcrime or attempting a mugging: "'The idea is that the computer detects a potential event and shows it to the operator, who then decides what to do - so we are still a long way off from machines replacing humans,' Dr Velastin says." It's simply a tool to help the operators sort through the huge amount of visual data they are presented with.

    BTW, I don't support the idea of a Big Brother monitoring the public. However, I'm equally unsupportive of the spread of FUD like this article write-up.

  • by Yousef ( 66495 )
    As with all technologies, this one has the ability to be misused or just misinterpreted.
    However, the ideas present in the system are not poor. When at university, I knew many students that worked nights as security guards. Most of them would either be studying notes or sleeping! Having a machine to help during the monotony isn't necessarily a bad thing.
    If, however, this leads to harassment from the authorities just 'cos you have bad social skills, that is another matter. Hence its use must be monitored, with regulations in place to tackle misuse.
  • I live 20 steps from Times Square in the only residential building on my block. As such, I probably can't pick my nose without being recorded on 15 different cameras. Of course, you think this is bad, but consider the possibilities!

    1. If I seem lost in thought, change the contents of some of the digital billboards to warn me about wandering into traffic.

    2. If I seem sleepy, send an email to my employer warning them not to let me touch any code that day.

    3. If I seem irritable, call my girlfriend and warn her to leave me alone for a few hours.

    4. And of course, if I seem shifty and nervous, like someone about to do something hazardous and antisocial, someone with something to hide, who is going to do harm to everyone around them... warn the police because I am about to experience flatulence.

    ;-P
    • I live 20 steps from Times Square in the only residential building on my block. As such, I probably can't pick my nose without being recorded on 15 different cameras.

      Sadly, this is all that NBC has to offer for their fall lineup :)
  • 1984 (Score:3, Insightful)

    by mikethegeek ( 257172 ) <blair@@@NOwcmifm...comSPAM> on Monday April 22, 2002 @07:06AM (#3386491) Homepage
    You know, I *LOVE* computers. I've been around them since I was 8 and got my first one, a VIC-20... But I think it's wrong to EVER put them in "charge" in any way in law enforcement.

    The popular myth is that "computers never make mistakes". Well, we ALL know this is bullshit. No computer is any better than the software that it is running, and the hardware is no better than the people who designed it.

    Show me ONE bug-free piece of software that exists, anywhere, that is more complex than the "Hello, world!" level, and you can argue with me.

    Better yet, show me one OPERATING SYSTEM (the layer atop the hardware on which any application software, such as this Orwell-Ware, runs) that is bug free.

    Bug=mistake.

    That said, the odds of any such application being flawless itself, running on a flawless OS, running on flawless hardware, are SO small as to be non-existent.

    The best that can be hoped for is accuracy in the 90%+ range. Multiply that by 300 million people, and the number of people who are going to be harassed is in the TENS of millions... The potential for abuse, by both law enforcement and hackers with agendas, is staggering...

    Already the face scanners have been proven to be so inaccurate that they are being dropped in some places. This is a FAR more complex algorithm... I'd think an accuracy rate of 20% would be generous.

    For one thing, they are assuming that normal people will behave normally but that criminals will behave differently, evasively, etc. Well, I for one will NOT act normally anyplace I know such a thing is operating, and I doubt anyone else will either. This, I doubt, can be taken into account.
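
    To put some (entirely made-up) numbers on that point: even granting an optimistic 95% accuracy, the arithmetic of rare events means almost every alert is a false one.

        # Back-of-the-envelope version of the argument above.  The accuracy figure
        # and the rate of genuine incidents are assumptions, not measured values.
        population = 300_000_000      # people passing in front of cameras
        accuracy   = 0.95             # flags the guilty 95% of the time, and
                                      # clears the innocent 95% of the time
        real_rate  = 1 / 10_000       # fraction actually about to commit a crime

        guilty   = population * real_rate
        innocent = population - guilty

        true_alerts  = guilty * accuracy
        false_alerts = innocent * (1 - accuracy)

        print(f"false alerts: {false_alerts:,.0f}")   # ~15,000,000 innocent people flagged
        print(f"true alerts:  {true_alerts:,.0f}")    # ~28,500
        print(f"share of alerts that are wrong: "
              f"{false_alerts / (false_alerts + true_alerts):.1%}")  # ~99.8%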
  • Why? Because people can use intuition when looking at a CCTV screen. All a machine can do is spot patterns. If a criminal can learn these patterns he can avoid making them, and even have a friend somewhere else deliberately MAKING those patterns to draw the attention of the CCTV operator elsewhere. People assume criminals are stupid. They're not.
  • This is a Good Thing (Score:5, Informative)

    by IntelliTubbie ( 29947 ) on Monday April 22, 2002 @07:15AM (#3386510)
    There is nothing scary about this; in fact, humans already do it on a regular basis. A department store security guard scopes out a crowd of shoppers for potential shoplifters. An airport security guard scans a terminal for suspicious activity. A cop checks out a crowded street looking for potential muggers and pickpockets.

    The trouble is, humans are inefficient and expensive, and their "gut instincts" may be fallible. The mall security guard may be the only guy watching a dozen closed-circuit monitors, and he may even be dozing off from the monotony of his job. The airport guard might be a minimum wage high-school dropout with barely any training. The cop's instincts are pretty good, but as objective as he tries to be, he unconsciously tends to target members of a particular race instead of going by solid scientific indicators.

    This technology (if it works) will be a Good Thing because:
    1. It improves upon an existing system that helps keep us safe.
    2. It could be more effective and consistent.
    3. It could apply rules objectively, and could be designed to flag activities that truly are suspicious (e.g. "casing" a department store) rather than those that merely look suspicious to biased humans (e.g. a young black man in a record store). This means that it could help protect our rights more than the current system.

    Cheers,
    IT
    • by Indras ( 515472 ) on Monday April 22, 2002 @08:05AM (#3386625)
      I'm still not sure how I feel about this, really. There was a little grocery shop across the street from my high school, everyone would go there to buy candy and pop for lunch, and it made for a popular hangout after school.

      Once new management came in, it took approximately three hours for them to come up with a rule that changed all that. They were tired of stuff being shoplifted (can you blame them?), so they said nobody can wear coats or backpacks into the place. We all had to leave them outside the front door. And it wasn't their responsibility to watch the coats and bags, either.

      The very first day, someone walked out and picked up two backpacks, the next day a leather coat was stolen. After that, nobody wanted to go.

      The problem? They assumed everyone with a coat or a backpack was a shoplifter. Inconveniencing everyone in order to stop one or two people seems wrong to me. I imagine this new camera system will use some sort of stereotyping as well, like watching for people who bounce around nervously, looking all around them for escape routes or police (many armed robberies in gas stations are like this). But, will the software be able to tell that from someone who really has to use the bathroom, and is bouncing up and down impatiently, searching around the room for the nearest restroom? I think not.

      I admire the optimism, though.
    • The trouble is, humans are inefficient and expensive, and their "gut instincts" may be fallible.

      The trouble is that people have too much confidence in the efficiency and infallibility of machines. A department store security guard that suspects you of being a shoplifter might be annoying, but he can't do anything until you actually shoplift.

      Also, these kinds of machine vision applications are almost impossible to validate. Where do you get the training data from? How do you measure the false alarm rate? Most likely, they will have to be trained on some person's judgment of what looks suspicious, which merely enshrines a fallible human judgment into perpetuity - and inexactly, at that.

      The potential for false alarms is enormous. If you have some disability, carry a heavy package in an unusual way, or wear some strange outfit, this system is likely going to tag you as suspicious. Video cameras and computers have nowhere near the reasoning ability to figure out what is going on, or the resolution to even see the necessary details if they could.

  • Law enforcement is increasingly going to video surveillance nowadays. I've seen a History Channel special on streetlights (nothing else was on), and they mentioned that many big cities are using video cameras to catch people running red lights. One video camera would take a picture of the driver and another would take a picture of the license plate. Then the owner of the car would get a ticket in the mail.

    Also, London is filled with tens of thousands of video cameras, and now they all have face recognition software, so they can see a criminal and follow him through various areas of the city on camera until a cop can catch up to him.

    And then there's this story [slashdot.org] about Connecticut doing roughly the same thing.
  • better link (Score:3, Informative)

    by mshurpik ( 198339 ) on Monday April 22, 2002 @07:19AM (#3386523)
    This article [mit.edu] is much more in-depth and does a better job of representing the technology. The article posted to Slashdot implies that Cromatica can predict a mugging. Cromatica identifies congestion and predicts suicide attempts. And it does this with pretty simple algorithms.

    Briefly: Cromatica views crowds as changing colors against a background. When the colors stop, this is congestion. Likewise, suicide attempts are indicated by lingering for 10 minutes or more. It's pretty easy to identify a single person against an empty backdrop.

    Of course, people are working on predicting muggings, and the article goes into that as well.

    The article also has links to the research itself.
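
    A minimal sketch of those two rules (crowd "colours" that stop moving = congestion; one person lingering past ten minutes = possible jumper), with the frame rate, thresholds and pixel maths all assumed rather than taken from the research:

        # Illustrative only: congestion and loitering flags from a fixed camera.
        import numpy as np

        FPS            = 5           # frames per second from the CCTV feed (assumed)
        OCCUPANCY_MIN  = 0.30        # scene looks at least 30% "occupied"...
        MOTION_MAX     = 0.02        # ...but almost nothing moves -> congestion
        LINGER_SECONDS = 10 * 60     # ten minutes of continuous presence -> alert

        def fraction_changed(a, b, threshold=25):
            """Fraction of pixels whose intensity differs by more than `threshold`."""
            return float((np.abs(a.astype(int) - b.astype(int)) > threshold).mean())

        def analyse(frames, background):
            """Yield (congestion, lingering) flags for each incoming frame."""
            present_for, prev = 0, background
            for frame in frames:
                occupancy = fraction_changed(frame, background)  # crowd vs. empty scene
                motion    = fraction_changed(frame, prev)        # change since last frame
                present_for = present_for + 1 if occupancy > 0.01 else 0
                yield (occupancy > OCCUPANCY_MIN and motion < MOTION_MAX,
                       present_for > LINGER_SECONDS * FPS)
                prev = frame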
  • We searched you because the computer told us to

    It is being done on a casual basis by police around the world for personal preventive searches and car searches. For the time being it is "the trained operator told us so" instead of the computer. And to be honest, I would rather have a computer decide than some cops. It will be less racially and ethnically biased.

    Statistics from observing policemen in some US states and the number of blacks and whites they stop for checks and searches are well known, no point in reiterating them...

    • Statistics from observing policemen in some US states and the number of blacks and whites they stop for checks and searches are well known, no point in reiterating them...

      Well, the accusations are well known. Then the US Justice Department got New Jersey to "agree" to actually commission a study of the issue, in a consent decree.

      The company hired to do the study found that the incidence of speeding varied by race. In a way fairly consistent with the stop ratio.

      The Justice Department was outraged, had "grave doubts", etc., because that isn't what they wanted to find.

  • A company my company recently acquired a portion of, Nestor [nestor.com], operates Nestor Traffic Systems (NTS) [nestor.com].

    NTS uses real-time video and neural-network technology at traffic intersections and railroad crossings to predict traffic accidents and catch traffic violations (bad news for you guys who blow red lights) - kind of similar to the situation in the article... instead of predicting the actions of people, it's predicting the actions of automobiles. There are already many deployments nationwide and lots more being installed.

    BTW, the same predictive neural network technology is used to predict all types of financial fraud, including credit card fraud and money laundering.
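
    The system described above uses neural networks; the sketch below is not that, just a bare kinematic illustration of the same prediction task (will this car stop or run the red?). Every constant is invented for illustration and has nothing to do with NTS.

        # Simple kinematic sketch of the "will this car run the red?" prediction.
        AMBER_SECONDS   = 3.0    # length of the amber phase (assumed)
        MAX_COMFORT_DEC = 3.0    # m/s^2 a driver will realistically brake at (assumed)
        INTERSECTION_M  = 20.0   # width of the junction to clear (assumed)

        def likely_red_light_run(speed_mps, distance_to_line_m):
            """True if, at amber onset, the car can neither stop nor clear in time."""
            stopping_distance = speed_mps ** 2 / (2 * MAX_COMFORT_DEC)
            can_stop  = stopping_distance <= distance_to_line_m
            can_clear = (distance_to_line_m + INTERSECTION_M) <= speed_mps * AMBER_SECONDS
            return not can_stop and not can_clear

        # ~60 km/h, 50 m from the line: it can still stop, so no alert.
        # The same car 35 m out can neither stop nor clear, so it gets flagged.
        print(likely_red_light_run(16.7, 50.0))  # False
        print(likely_red_light_run(16.7, 35.0))  # True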

  • This is creepy. People, even us programmers and tech types, amaze me. I guess what I mean is I can't believe the arrogance of people who think they can devise a machine that tells someone what is in the heart and intentions of a man, just from looking at him no less. Someone needs to get ahold of the "geniuses" who are working on this and give them a good ass-kicking, just for being so naive.
  • Likely motivations (Score:3, Interesting)

    by GodSpiral ( 167039 ) on Monday April 22, 2002 @07:58AM (#3386602)
    Most likely reasons for coming up with this device:

    1. It's a complete scam. They can't get facial recognition to work, so they've moved on to new BS that doesn't yet have a bunch of defrauded users telling the marks that it's crap.

    2. A source for independent and arbitrary racism. It's no longer racist to search you for looking black. The computer has determined that you should move along.

  • Lawsuits (Score:4, Interesting)

    by chill ( 34294 ) on Monday April 22, 2002 @07:59AM (#3386603) Journal
    I can see it now -- "The police KNEW my wife was going to get mugged, but didn't stop it. Therefore, I am suing for $10 Million on the grounds of negligence."
  • by Anonymous Coward
    So when robbers learn to adapt their pattern of dress and behavior when they go out on the streets to mug people, and say, start dressing and shuffling about as old ladies, the police will start arresting old ladies on the street because the computers told them they fit the behavior patterns of robbers? :)

  • Quick and the Dead (Score:2, Interesting)

    by Mulletproof ( 513805 )
    Of course, all this technology assumes that humans don't have the ability to adapt their behavior patterns when performing a crime. The stupid ones will get caught, while the smart ones learn what trips the system to "track" suspects and endeavour to avoid those actions. True for nearly any aspect of life, from hacking to shooting rockets into space.

    But the point about face recognition would truly be a kicker. Once that system actually becomes reliable, anybody with a record notorious enough to have their face mapped would be tracked the moment they entered a store. Assuming you can't obscure your likeness in some way, of course.
  • This is just another example of how ludicrous some companies' "business plans" are, and how willing the government is to spend money on things like this.

    Right, this company out of nowhere can suddenly predict human behavior? Humans in large groups?

    This is akin to the millions of dollars that CA just needlessly spent on Oracle licenses -- it's an example of some government flunky with a budget picking up some snake oil from an overzealous salesperson.

    Anyone who claims they can "mathematically predict" human behavior is lying through his or her respective teeth.
  • 666... yeah, that's right, it's not the machines to be concerned about, it's the human beasts to watch out for. From the programmers to the cops to the potential muggers and victims...

    The machines are just a good excuse and distraction for that beasty point, via reflection.
  • to cut out all the bullshit.
  • hopefully they didn't test this software on the typical soap opera... which was probably written by "plot writer version 1.0" anyway.

    I can see it now: "The camera predicts that the person on screen will turn out to be the long-lost, transgender half-brother of the amnesiac ex-stripper, and that he will marry the heiress to the papaya plantation..."

"Money is the root of all money." -- the moving finger

Working...