Video Surveillance Identifies Threat Patterns

Ponca City, We Love You writes "When the 2008 Olympic Games kick off in Beijing next year, organizers will be using a sophisticated computer system to scan video images of city streets looking for everything from troublemakers to terrorists. The IBM system, called the Smart Surveillance System, uses analytic tools to index digital video recordings and then issue real-time alerts when certain patterns are detected. It can be used to warn security guards when someone has entered a secure area or keep track of cars coming in and out of a parking lot. The system can also search through old event data to find patterns that can be used to enable new security strategies and identify potential vulnerabilities. IBM is also developing a similar surveillance system for lower Manhattan, but has not yet begun deploying that project. "Physical security and IT security are starting to come together," says Julie Donahue, vice president of security and privacy services with IBM. "A lot of the guys I'm meeting on the IT side are just starting to get involved on the physical side.""
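
The summary's description of the system (index events extracted from video, raise real-time alerts when a pattern rule matches, and support retrospective queries over the event history) maps onto a fairly simple pipeline. Below is a minimal, hypothetical Python sketch of that rule-plus-index pattern; the event fields and rules are invented for illustration and are not IBM's actual Smart Surveillance System API.

    # Hypothetical sketch of rule-based alerting over an indexed event stream.
    # Not IBM's Smart Surveillance System API; event fields and rules are assumed.
    from dataclasses import dataclass
    from datetime import datetime
    from typing import Callable, List, Tuple

    @dataclass
    class VideoEvent:
        camera_id: str       # which camera produced the detection
        event_type: str      # e.g. "person_entered", "vehicle_entered"
        zone: str            # named region of interest, e.g. "secure_area"
        timestamp: datetime

    # A "pattern" here is just a predicate plus an alert message.
    RULES: List[Tuple[str, Callable[[VideoEvent], bool]]] = [
        ("Person entered a secure area",
         lambda e: e.event_type == "person_entered" and e.zone == "secure_area"),
        ("Vehicle activity in the parking lot",
         lambda e: e.event_type.startswith("vehicle_") and e.zone == "parking_lot"),
    ]

    event_index: List[VideoEvent] = []   # stands in for a searchable event database

    def process(event: VideoEvent) -> None:
        event_index.append(event)        # index every event for later forensic queries
        for message, rule in RULES:
            if rule(event):              # real-time alert on a pattern match
                print(f"[ALERT {event.timestamp:%H:%M:%S}] {message} ({event.camera_id})")

    def search(zone: str, since: datetime) -> List[VideoEvent]:
        # Retrospective query: e.g. all secure-area entries since a given time.
        return [e for e in event_index if e.zone == zone and e.timestamp >= since]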
This discussion has been archived. No new comments can be posted.

  • What we all need (Score:3, Insightful)

    by Xiph ( 723935 ) on Sunday December 09, 2007 @12:36PM (#21632213)
    Ahh, finally more surveillance, and computers to monitor the cameras.

    Pattern recognition to identify threats before trouble occurs.
    Soon comes the day when we can finally arrest people before they realise that they're going to do something criminal.
    • by Anonymous Coward on Sunday December 09, 2007 @12:42PM (#21632261)

      Soon comes the day when we can finally arrest people before they realise that they're going to do something criminal.
      Sounds like a cool movie plot. I think a good name for it would be The African American Report or maybe The Hispanic Report. You know, something like that.
    • by phantomcircuit ( 938963 ) on Sunday December 09, 2007 @12:44PM (#21632283) Homepage
      Let's be real here.

      The police try to find patterns in activity already, but it is far less effective for a relatively small group of people to look for patterns than it is for a computer with many cameras.

      This is exactly what is already happening but faster.

      When you are in public, you are in public.
      • Re: (Score:2, Insightful)

        by EaglemanBSA ( 950534 )
        ...and on some level, that's called profiling, and it's illegal. The grey area here is a wide one with steep, slippery slopes. I'd like to think we have the capacity to exist in a world where I don't have to be inside my home in order to not be on camera, but I think too many people mistake surveillance for safety.
        • Just wanted to be clear, I'm well aware that this is in Beijing, and that profiling may not be illegal there (like it is where I sit). I'm simply referring to general principle, not local law.
        • Re: (Score:3, Insightful)

          by RagManX ( 258563 )
          WTF? How is profiling illegal? You have car insurance? Your rates are determined by profiling. Did you/are you/will you go to college? Admissions are based on profiling. Have you ever been asked to make a donation to some organization? You were probably selected by profiling. Have you ever taken prescription medicine? The medicine most likely to be effective for you was determined by profiling.

          Yes, there are cases where profiling is illegal. But in and of itself, profiling is *not* illegal. At le
          • Re: (Score:3, Insightful)

            by EaglemanBSA ( 950534 )
            I wasn't referring to selling insurance, I was referring to being investigated/arrested for bits of data picked off automatically by a computer. Of course, profiling is useful for determining insurance premiums and whether or not you're MIT material, but when it comes to determining whether or not you've broken the law, it's a different story.
            • I wasn't referring to selling insurance, I was referring to being investigated/arrested for bits of data picked off automatically by a computer. Of course, profiling is useful for determining insurance premiums and whether or not you're MIT material, but when it comes to determining whether or not you've broken the law, it's a different story.

              Most warrants to arrest come from bits of data in a computer. I think what you mean to say is that it's illegal to arrest someone based on bullshit.

              In oth
            • You have perfectly identified the slippery slope.

              Gathering evidence that would be readily available to a police officer in the same place as the camera is not profiling.

              The problem occurs when people start to be harassed and/or arrested because they do not fit the normal patterns of society, which is precisely what these computer systems are determining.
              • "The problem occurs when people start to be harassed and/or arrested because they do not fit the normal patterns of society, which is precisely what these computer systems are determining."

                BINGO!

                I started seriously shooting night photos around 1981, and until 9/11 I never got looked at twice by the police in my late-night/early-morning photo forays.

                But since that group mind-fuck day, I've been harassed by the police "just doing my job", because some room-temperature-IQ citizen thought I was "suspicious".

        • Inside your home you may still be on camera, if you left your drapes open.

          o\\\ Be seeing you.

      • by rhizome ( 115711 )
        The police try to find patterns in activity already, but it is far less effective for a relatively small group of people to look for patterns than it is for a computer with many cameras.

        Do you have a citation for this assertion? More generally, any information that supports your contention that computers are better than people at recognizing crime patterns in real-time. Even more generally, that computers are effective at this at all.
        • by foobsr ( 693224 )
          Even more generally, that computers are effective at this at all.

          I suspect the research comes under the guise of "Pedestrian Detection". Add/substitute 'moving vehicle', 'target identification', associate 'Artificial Intelligence for Homeland Security', google the profile of a 'Fei-Yue Wang', and probably identify an emergent pattern.

          CC.
      • I wonder how long the video footage will be archived for. You could have a relatively small number of people go on a fishing expedition for a long time after the Olympic Games are concluded, looking for "anti-State" activities in the populace. Not that a nation like China, renowned for its human rights record, would ever do anything like that...

      • When you are in public, you are in public.


        That notion can go fuck itself in public; the issue is not that black and white, and you know it. Maybe what people are proposing is, get this: we need to redefine what 'public' is. Why should my right to privacy only exist within the square footage of real estate that I can afford to rent or own? Why should my right to privacy be limited if my private acts don't cause you any harm?

        • by badasscat ( 563442 ) <basscadet75&yahoo,com> on Sunday December 09, 2007 @02:44PM (#21633189)
          Maybe what people are proposing is, get this: we need to redefine what 'public' is.

          I think the problem is we are redefining what public is.

          20 years ago, there was no expectation whatsoever that being in "public" meant your every move would be tracked by government officials potentially hundreds of miles away, and then stored for all time. That's not what "public" meant. People had an expectation that yes, anybody who was around you could potentially be watching you, but that kept it a relatively level playing field because you could pretty easily identify any threats to your privacy and avoid them if you like. If you were walking down an empty side street and needed to quickly adjust your belt because your pants were too loose, you could look around and do so without fear that cops are watching ready to jump you for "reaching for a concealed explosive" or even "intent to expose oneself in public" or whatever other nonsense law they can come up with.

          That is the expectation we have always had for what "public" means - yes, you can be watched, but only by those around you, and that means that you can easily watch them back. Being able to be watched - and recorded - by someone many miles away is not what "public" means to me or anybody else. That's an intrusion, just like any other. You are being watched by people who are not there. And you have no idea what they're thinking or doing, even while they can watch your every move. It's a completely one-sided relationship where the other side has all the power. That's scary. And it's the exact opposite of what "being in public" is all about.

          We don't need to redefine what public means, we need to take back its original meaning. Nobody should be allowed to watch a space that they do not own (i.e., a public space) without being physically present.
          • by rwyoder ( 759998 )
            Dammit, I'd like to mod you up, but I don't have any mod points today!!!
          • Interesting idea. Though I think we've always had, and still have, the expectation that in public we can be observed by anyone, whether they are visible or not. The difference now is that people fear what the government is going to do with that information.

            They say that if you're doing nothing wrong, you've got nothing to fear, but everyone does something wrong (e.g. downloading music). With the surveillance nation, you could theoretically be caught for wrongdoings that you didn't even realize were w
            • They say that if you're doing nothing wrong, you've got nothing to fear, but everyone does something wrong (e.g. downloading music). With the surveillance nation, you could theoretically be caught for wrongdoings that you didn't even realize were wrong. Ignorance isn't a defense in law, but to know every stupid little law is impossible.

              The solution to "stupid little law[s]" is not to have selective enforcement but rather to have those laws changed!

              Selective enforcement of laws is a method that oppressive governments use. Very strict laws are put on the books with a non-written promise that they will only be used on the "bad guys". The problem is that the government has now given itself nearly limitless power to arrest those who it chooses.

              • Indeed. The problem is that our governments are passing these laws under the veil of anti-terrorism, but it's putting the whole of society in fear; just what the terrorists want.
          • I absolutely agree. While there has always been a certain level of expected privacy even while being in public, there are people now claiming that there was never any such thing. So what I meant was, we need to use legislation to define that "public" doesn't mean everybody's business is completely open to everybody else. We need to reestablish and update the law to reaffirm in non-vague terms that certain aspects of one's life are private, even when out and about in public. Will people figure that out be
          • by sean4u ( 981418 )

            You are being watched by people who are not there. And you have no idea what they're thinking or doing, even while they can watch your every move. It's a completely one-sided relationship where the other side has all the power. That's
            pr0n!

            Nobody should be allowed to watch a space that they do not own (ie. a public space) without being physically present.
            No pr0n? WTF are you doing on /.?
      • Re: (Score:3, Insightful)

        by robably ( 1044462 )

        When you are in public, you are in public.

        When being in public entails having your every move watched and recorded and profiled, that's more like being on private property, or a prison.

        This is exactly what is already happening but faster.

        Beyond a certain point, making something bigger or faster or stronger in just one aspect pushes it over a line to where it becomes something very different - spin a propeller and it turns around, spin it fast enough and you suddenly have powered flight. The connection of

      • Comment removed based on user account deletion
      • "When you are in public, you are in public." should not equal "When you are public, you are presumed to have criminal intent." This is yet another symptom of the growing perceptual gap between the police and the community they are supposed to "protect and serve". There are new stories every day about the effects of the increased militarization of the civilian police forces. Some of the stories are about SWAT teams kicking in the wrong door and terrorizing and/or shooting innocent people in their own homes.
      • When the 2008 Olympic Games kick off in Beijing next year, organizers will be using a sophisticated computer system to scan video images of city streets looking for everything from troublemakers to terrorists.

        You know, all in all, I think I'd rather take my chances with the terrorists.

        Let's be honest here: this isn't about "terrorism" at all, but about those "troublemakers". For some reason, the leaders in this world seem to think there's going to be a great increase in the number of "troublemakers" in th

    • issue real-time alerts when certain patterns are detected. It can be used to warn security guards

      Oh come off it! This isn't Minority Report or crimethink. It's a way for security to monitor high-traffic, high-risk areas. It issues warnings and allows security to prioritize their time and respond better to interruptions. When you're in these places, you're in public. You're not in the privacy of your own home or anything like that. You're on public streets. By going out into public you've already given up
      • by base3 ( 539820 ) on Sunday December 09, 2007 @01:10PM (#21632477)

        I would much rather have a surveillance system like the one the article describes in place than an armed national guard member on every corner.
        Ah, but the "armed national guard member" doesn't have a perfect memory and because of resource limitations, can't really exist on "every corner." But an "armed national guard member" can be dispatched to round people up either in real-time or after a review of the video. I'd rather that ubiquitous surveillance be as obvious as that, so that maybe the sheep get a little outraged and the "you don't have any right to privacy" and "if you've nothing to hide, you've nothing to fear" apologists don't end up getting the world they want.
        • I agree that any surveillance should not be surreptitious and in fact should be flat out obvious. Still I think something like this is a good idea, because the national guard member on every corner can exist in high traffic urban areas like the ones this system is being utilized in. The article makes no assumption that this is going to be ubiquitous and every move you make the government will have on file. People just automatically assume this is the first step to the police state. Well we may be headed the
          • The slippery slope is greased not with the application of technologies but with complacent attitudes of acceptance of incremental control.
          • By definition, a slippery slope is a series of incremental steps, so arguing that this step is merely incremental doesn't help. The problems with this system are 1) the only thing missing for the tracking you say isn't there is storage, which gets cheaper every day, is simple to add, and is... incremental; and 2) being programmable, it can be used/abused for much more than its intended purpose with little control.

            But the big thing you say is that security is only being added in "high-risk" ar
      • by try_anything ( 880404 ) on Sunday December 09, 2007 @01:39PM (#21632673)

        When you're in these places, you're in public. You're not in the privacy of your own home or anything like that. You're on public streets. By going out into public you've already given up a certain amount of anonymity de facto.

        That makes perfect sense in the world of twenty years ago and the world we still mostly inhabit, but pervasive electronic surveillance threatens to change the meaning of statements like these. If you want to maintain the same rhetoric, make sure the words mean the same thing -- i.e., stop surveillance from de facto changing what it means to be in a public place versus a private place.

        If you accept that "public place" means "a place where a detailed, permanent record of every action is captured and archived by the government," then you should rethink whether we want to have any "public places" at all. By that definition, perhaps only congressional chambers, courtrooms, jail cells, and the immediate vicinity of police officers should be public places.
        • by dbc001 ( 541033 )
          You make an excellent point, and I agree - going forward, we really want to limit the number of "public places" where a detailed record of all events is kept by the government. Unfortunately, the reality is that we are rapidly heading down that path. We will most likely have to devise ways of defending our privacy in those public places. The idea of limiting the number of places where detailed records of our actions are kept is a fantasy.
      • by evought ( 709897 )

        As for me, when traveling through areas like Manhattan, I would much rather have a surveillance system like the one the article describes in place than an armed national guard member on every corner.

        Given the choice, I would rather have the armed guard. First, they are visible and obvious; there is a gun behind the camera just the same. Second, they actually have a chance to *interact* with people and learn about where they are working. It has a chance of becoming less of an "us-vs-them" thing as I stare at this gal's tits from the safety of my video screen and wonder about all of these "perps" walking around, and more of a "how can we work together to make this safe" thing. In an area I lived outside

        • Given the choice, I'd rather have open access to the cameras. If the material being recorded by these cameras is 'public', then let members of the public view it. We should use technology to make a level playing field. Armed guards may be more obvious, but open cameras are more useful.
  • China is a good testing ground for new surveillance tech... After all, there is no illusion there that Big Brother doesn't exist. Then we're going to have it here (in Manhattan). Yup, we're still years ahead of China, aren't we?
    • by krycheq ( 836359 )
      But in China, the definition (or lack thereof) of troublemaker is what's so troubling... or maybe there's an inherent transparency in no transparency whatsoever? Being perfectly certain that there is no certainty is somewhat comforting but I don't know what's worse.

      • a) IBM sold a lot of punch-card machines to Nazi Germany to monitor the Jews [wikipedia.org]
        b) IBM sold a lot of video surveillance and face recognition hardware to China to monitor the "terrorists"

        Anyone else see a pattern here?
        • by zcat_NZ ( 267672 )
          In Apple's 1984 advert, the 'big brother' image was intended to represent IBM. How prophetic ;-)

    • 'Stupid software keeps on locking the cameras on the cathedral, starts screaming something about Fallen Gongs, or something fall and gone, then starts tasering the Priests and Nuns when they come out!'
  • by FooAtWFU ( 699187 ) on Sunday December 09, 2007 @12:44PM (#21632273) Homepage
    Ah, yes, I saw a video about this at my IBM internship back about two years ago. It was all internal/NDAed then. They showed the trails of people walking into and out of buildings, and cars zooming around parking lots, and neat things like that, even with lighting changes / moving trees blowing in the wind / other environmental visual noise. My internship project's team lead wanted us to try and exploit this for our project but it, ah, wasn't going to happen, and we did much less interesting things instead.

    From what I understand, though, there's a nontrivial amount of hardware involved to process the video, and though that may be less of an issue these days with better computers, I'm wondering just how many CPUs they will be throwing at how many different video cameras for this.

    And I'm sure it's imperfect and prone to false alarms and such, but that's why you put human beings behind it instead of machine guns, no?
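
    For a sense of what the underlying per-camera processing involves, here is a rough sketch using OpenCV's adaptive background subtraction, one common technique for picking moving people and cars out of a static scene while tolerating gradual lighting changes and swaying trees. It is an illustration of the general approach, not IBM's implementation; the file name and thresholds are placeholders. Even this toy loop has to run once per video stream, which is where the hardware cost mentioned above comes from.

    # Sketch of per-camera motion analytics with OpenCV background subtraction.
    # Illustrative only -- not IBM's system; input path and thresholds are placeholders.
    import cv2

    cap = cv2.VideoCapture("camera_feed.mp4")   # placeholder: file, stream URL, or device index
    # MOG2 keeps an adaptive statistical model of the background, which helps with
    # slow lighting changes and repetitive motion such as trees blowing in the wind.
    subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=25,
                                                    detectShadows=True)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        mask = subtractor.apply(frame)                    # foreground (moving) pixels
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN,
                                cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5)))
        # OpenCV 4.x returns (contours, hierarchy)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        for c in contours:
            if cv2.contourArea(c) < 800:                  # ignore small noise blobs
                continue
            x, y, w, h = cv2.boundingRect(c)
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.imshow("moving objects", frame)
        if cv2.waitKey(1) == 27:                          # Esc to quit
            break

    cap.release()
    cv2.destroyAllWindows()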

    • by Cheesey ( 70139 )
      I do not understand why they bother doing image recognition when most people are already carrying wireless tracking devices [wikipedia.org]. Take away the need for image recognition and instead recognise people using the hardware addresses of the devices they carry, and the CPU requirements for surveillance become tiny.
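
      As a rough illustration of why device-based tracking is so much cheaper computationally than image recognition, here is a minimal sketch that passively logs the hardware (MAC) addresses in Wi-Fi probe requests using scapy. It assumes a wireless card already in monitor mode; the interface name is a placeholder, and of course anyone worried about being tracked can simply switch their radio off, as the reply below notes.

      # Hedged sketch: passively log Wi-Fi probe-request MAC addresses with scapy.
      # Assumes an interface already in monitor mode ("wlan0mon" is a placeholder);
      # requires root privileges and, obviously, legal authority to capture.
      from datetime import datetime
      from scapy.all import sniff
      from scapy.layers.dot11 import Dot11ProbeReq

      seen = {}   # MAC address -> (first_seen, sighting_count)

      def handle(pkt):
          if pkt.haslayer(Dot11ProbeReq):
              mac = pkt.addr2                       # transmitter hardware address
              first, count = seen.get(mac, (datetime.now(), 0))
              seen[mac] = (first, count + 1)
              print(f"{mac}  sightings={count + 1}  first seen {first:%H:%M:%S}")

      sniff(iface="wlan0mon", prn=handle, store=False)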
      • Because "most people" aren't the problem. Potential threats would quickly learn to either modify their cell phone or just turn it off.

        Mod parent +5 Funny.
    • by Yvanhoe ( 564877 )

      And I'm sure it's imperfect and prone to false alarms and such, but that's why you put human beings behind it instead of machine guns, no?
      Of course, because human policemen never make errors; that is why we use them.
    • Reminds me of IBM's "assistance" to the Nazis 70 years ago: IBM and the Nazis [wikipedia.org]


      The Nazis awarded IBM founder Thomas J. Watson the "Eagle with Star" medal, for IBM's assistance in keeping track of Jews and other "undesirables".
      Watson not only accepted the medal, but traveled to Germany so that Hitler could present it in person.
  • by LiquidCoooled ( 634315 ) on Sunday December 09, 2007 @12:44PM (#21632281) Homepage Journal
    The recognition logic is fairly simple:

    if (hoodie || foreign) police.respondto(camera.location);
    • by imasu ( 1008081 )
      "Police".  Hah.  As if.

      More like,

      if(hoodie || !white || hair.length >= stallman.hair.length)
      {
          UAV.attack();
      }
      • by imasu ( 1008081 )
        In all seriousness, this has the potential for racial profiling written all over it. An audit of the eventual deployed system would probably be very interesting, and equally unlikely for "security reasons".
    • by ross.w ( 87751 )
      Remember, this is for the Beijing Olympics...

      if (Falun Gong || Human rights protestor) police.respondto(camera.location);
  • Good *old* IBM (Score:4, Insightful)

    by Anonymous Coward on Sunday December 09, 2007 @12:45PM (#21632293)
    It's just like the old days, IBM looking for ways to "enhance security" [ibmandtheholocaust.com] and help the good old boys at the Department of Homeland Security (or, as the Germans called it, Schutzstaffel (S.S.)).

    The important thing is, just like they had no idea their technology was helping make the holocaust more efficient and were just making a buck, it's completely unimaginable that the Chinese might continue to use it to crack down on dissidents afterwards.
    • by widman ( 1107617 )
      I beg to differ. They are well aware how this technology will be used besides the Olympics. And in my opinion, anyone working on a monstrosity like this, aware of the consequences, is responsible too. Nobody coded this under a life threat. If people get put in prison and tortured, you are responsible. You-are-responsible. This work is specifically designed to impose fascism (go dig out the dictionary if you don't agree). This is evil.
      • I beg to differ. They are well aware how this technology will be used besides the Olympics

        I think you missed the point there. Did you not detect even a trace of sarcasm in the GP post? He was saying that IBM knew full well what the Nazis were going to use the technology for - just like they know today how it is going to be used in China. The whole point was that IBM was a Nazi collaborator in WWII.

  • As anyone who has played Metal Gear Solid 2 knows [wikipedia.org], S3 is a baaaad thing.
  • "Physical security and IT security are stating to come together," says Julie Donahue, vice president of security and privacy services with IBM. "A lot of the guys I'm meeting on the IT side are just starting to get involved on the physical side.""

    That sounds as if she meant that IT staff started going to the gym - surely this recent datacenter break-in would look different then, just imagine the wrestler-looking sysadmin throwing office chairs and rack servers at the thieves...

  • by jacquesm ( 154384 ) <jNO@SPAMww.com> on Sunday December 09, 2007 @12:52PM (#21632329) Homepage
    How many real-life acts of terrorism in the making have been uncovered using cameras like this? IIRC the only use they had in London was that *after* the bombings it was still possible to see what the bombers had looked like.
    • Re: (Score:3, Insightful)

      Did the article make a point of saying this was an anti-terror tool specifically? No, because it has a very real application. Instead of fighting whatever fantasy threats politicians throw at us, this is designed to curtail the very real problems of mugging, assault, theft, etc., that occur in high traffic urban areas.

      Guys, security is good. Raping the constitution, disregarding human rights, and doing a number of other unsavory things to attempt to get it isn't. However, something as common sense as this
    • That's the only purpose of cameras in general. You don't put a security camera in a store because you think it will magically stop people from stealing - you put it in so that people who steal from you can be caught afterwards.
      • by PPH ( 736903 )
        The person with the bomb strapped to themselves isn't terribly concerned with being caught afterwards.

        In fact, they are relatively certain to be caught .... by a bunch of people pushing brooms around the blast scene.

    • The primary stated purpose of surveillance cameras is to deter [slashdot.org] crime.

      - RG>
    • How many real-life acts of terrorism in the making have been uncovered using cameras like this?
      None. Privacy groups cried foul, and no politician in any position of real power dared to try the technology. I believe there was some question of statistics, and the lack of testing in real-life situations.
  • Never forget (Score:1, Informative)

    by awitod ( 453754 )
    If it weren't for the automation provided by IBM to the Third Reich, the Nazis would not have been able to keep tabs on and slaughter so many people. http://www.ibmandtheholocaust.com/ [ibmandtheholocaust.com]

    'Do no evil' isn't a motto IBM has ever adopted, or ever will.
    • by awitod ( 453754 )
      Flamebait? You must be joking. I see several others have now made the same point.
    • So IBM's bad? Later on, IBM will be good, eh?

      Hate to burst your bubble, but companies are amoral. They have one priority: making money for their shareholders.

      If that means selling high-quality computers at good prices, they will do it. It also means that if there's a need from another country to do XYZ job for gobs of money, so be it.

      It is our choice as customers to decide whom to associate with.
      • Not really. Customers have surprisingly little influence on corporate behavior even when they bother to exert any.

        Put it this way: publicly-held corporations (like IBM) do not operate in a power vacuum. They are a shadowy reflection of the ethical and moral standards of their shareholders, who are the only ones that have the power to tell upper management to stop doing something. As our society has increasingly begun to suffer what many term "moral decay", it's to be expected that the corporations beholden
  • I'm sure you will all remember how the computer made for categorizing, keeping track of, and determining the fates of all those in Nazi control was in fact an IBM machine. There is a picture with the head of IBM sitting at a table with Hitler conferring on the computer design.

    So now IBM is in cahoots with a militaristic China, deciding which people are terrorists from a far-away camera with no human judgment, just 0's and 1's again. And yes, the Manhattan project has been in the works for a long time, it is alre
  • Seriously, isn't this the kind of unbiased, behavior-based surveillance that we should be encouraging? The alternatives are (1) no surveillance in crowded, high profile events or (2) surveillance by humans with their weird biases about race, dress, headgear, etc.

    No surveillance carries risk, human surveillance carries risk, and computerized surveillance carries risk. It just depends on which risks you are comfortable with.

    RR
    • Re: (Score:3, Insightful)

      by DarkOx ( 621550 )
      Don't forget it's humans deciding what "patterns" are suspect in the first place, and it's those that the machines will be searching for; this does nothing to reduce human bias. It might even enhance it, given that a small group of people will likely be tasked with developing those patterns, as opposed to a much larger group of independently (or at least more so) minded security personnel.

      If a human gets it wrong, with some luck his partner or commander may get it right and make a better choice. No it d
    • Given that people make small mistakes, some of which may break the law:

      Would you rather have a human watch the area you are in, unable to notice every tiny detail, or an infallible camera/computer system that notes every transgression regardless of severity?

      I, for one, welcome our weirdly biased human law enforcement officers.
  • by nbauman ( 624611 ) on Sunday December 09, 2007 @01:25PM (#21632599) Homepage Journal

    a sophisticated computer system to scan video images of city streets looking for everything from troublemakers to terrorists. The IBM system, called the Smart Surveillance System, uses analytic tools to index digital video recordings and then issue real-time alerts when certain patterns are detected.
    IBM's computer scientists must be getting paid quite a bit to endure the humiliation of making claims that every knowledgeable person knows are false.
    These systems have been tested before, particularly in England, where Thatcher's government paid a shitload of money that could have been used for something useful, and the only useful thing they got out of it was well-designed studies that demonstrated that these screening systems don't work.
    Here in Manhattan, we had a video monitoring system set up in the labyrinthine Columbus Circle subway station for a couple of years. It also had no effect on crime. (Nor did it have any effect on the cops beating up innocent people, who happened to be black.) The City took money that could have paid for more police (hopefully honest ones) and spent it on video toys instead. Duh.
    Now we're getting these digital cameras all over NYC -- even though we have good data from England, from our own pilot programs, from the Atlanta Olympics, and elsewhere, that they don't do what their promoters claim. What it demonstrates is that a huckster can sell hundreds of millions of dollars worth of useless digital junk to unscrupulous politicians accountable to a hysterical public and campaign contributors as long as it has blinking LEDs and they say the magic word "terrorism."
    I challenge anyone to cite any scientific evidence, any pilot program -- not some security "expert"'s opinion -- that there are any computer "patterns" that can identify "troublemakers" or "terrorists".
    Stop and think. The London suicide bombers walked on the subway with backpacks full of explosives. Innocent people go about their business on the subway all the time wearing backpacks. What pattern is there that a digital camera could spot?
    The only good news in this story is that we Americans are finally ripping off the Chinese for a couple of hundred million dollars, which is good for the balance of trade. This is known in economics as the broken window fallacy.
    Maybe we could sell them the Brooklyn Bridge too -- oh, wait, they already own it.
    • by Cheesey ( 70139 )
      I think the image recognition bit is definitely bullshit. But the data mining aspects are not. This technology is already widely used by shops to predict what people are likely to buy, and IBM is part of that business. Privacy and digital police state implications aside, why not also use it to spot unusual behaviour?

      Suppose you drop the image recognition part and instead recognise people using personally identifiable information that is captured wirelessly from their mobile phones and RFID. That would give
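
      To make "spotting unusual behaviour" concrete, here is a hedged sketch using a generic anomaly detector (scikit-learn's IsolationForest) on invented per-person movement summaries. The features, numbers, and threshold are assumptions for illustration; the important point is that such a model flags statistical outliers, not wrongdoing, which is exactly the worry raised elsewhere in this thread.

      # Hedged sketch of "unusual behaviour" detection with a generic anomaly detector.
      # Features and numbers are invented; the model flags outliers, not wrongdoing.
      import numpy as np
      from sklearn.ensemble import IsolationForest

      rng = np.random.default_rng(0)

      # Columns: [distinct locations visited per day, average dwell time (minutes),
      #           trips taken between midnight and 5am]
      typical = rng.normal(loc=[6.0, 35.0, 0.2], scale=[2.0, 10.0, 0.4], size=(5000, 3))
      atypical = rng.normal(loc=[25.0, 5.0, 4.0], scale=[3.0, 2.0, 1.0], size=(10, 3))
      X = np.vstack([typical, atypical])

      model = IsolationForest(contamination=0.005, random_state=0).fit(X)
      labels = model.predict(X)            # +1 = "normal", -1 = "anomalous"

      print(f"flagged {int((labels == -1).sum())} of {len(X)} people as anomalous")

      A night-shift worker or a late-night photographer would land in the flagged set just as readily as anyone actually planning mischief.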
      • by nbauman ( 624611 )

        At this time, we just don't know how effectively the resulting data could be searched for unusual behaviour. Nothing of this scale has been done yet.

        That's my point.

        But it could work, at least in principle, because all of the technology issues involved have already been solved for other problems.

        I don't agree that it could work in principle, with any technology that we have now or in the foreseeable future. What evidence do you have? I put out a call for evidence.
        The one thing that otherwise-intelligent techies miss is that computer-level technology problems are easy (given a blank check). It's the other problems that are hard, such as: how do you tell whether somebody is a terrorist? How do you tell from watching him in a crowd?

        The remaining issue is how well you can automatically distinguish between a terrorist and a regular person when all you know about each is everywhere they have been in the last few years.

        Well, yeah. That's the problem. If you discover a

        • by Cheesey ( 70139 )
          I think it could work because data mining works in other places, e.g. for analysing shopping patterns. Provided that you can capture the data in the first place, why isn't it possible to extend an existing model of what someone is likely to buy to cover the other things they might do?

          You have much more information about each person in a crowd than a CCTV picture. You also have data about everything else they have done. That's how the hypothetical surveillance system works: it doesn't detect terrorists, it d
          • by nbauman ( 624611 )
            My question is, can you cite a published study where it's worked on applications like security? The answer is no.
      • At this time, we just don't know how effectively the resulting data could be searched for unusual behaviour. Nothing of this scale has been done yet. But it could work, at least in principle, because all of the technology issues involved have already been solved for other problems. The remaining issue is how well you can automatically distinguish between a terrorist and a regular person when all you know about each is everywhere they have been in the last few years. Being flagged with a false positive could prove rather inconvenient.

        This is the big problem. Terrorists are actually quite rare. There is therefore very little information to input on them and most of it is likely to be statistical anomaly. There was an AI test at one point getting a piece of software to recognize images with tanks in them. They had a relatively small training set but the software did really well with it. Hit it with some real data and it got essentially random results. Why? In the test data, the photos with tanks and without tanks were shot on separate da
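
        The rarity point can be made concrete with a quick base-rate calculation; the numbers below are invented and deliberately generous to the detector, yet the flags are still almost entirely false positives.

        # Base-rate arithmetic: a very accurate detector hunting for something very rare
        # still produces almost nothing but false positives. Numbers are assumptions.
        sensitivity = 0.99          # P(flagged | terrorist)
        false_positive_rate = 0.01  # P(flagged | innocent person)
        base_rate = 1e-6            # assume 1 terrorist per million people scanned

        p_flagged = sensitivity * base_rate + false_positive_rate * (1 - base_rate)
        p_terrorist_given_flag = sensitivity * base_rate / p_flagged

        print(f"P(actual terrorist | flagged) = {p_terrorist_given_flag:.4%}")
        # -> roughly 0.01%: about ten thousand innocent people flagged per real hit.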

    • by MobyDisk ( 75490 )
      This is part of a general problem where the government researches to see if something is a good idea, then deploys it without paying attention to the results. That is why we have electronic voting machines, RFID passports, and 3-ounce limits on liquids on airplane flights.
      • by nbauman ( 624611 )
        Right. The problem as I see it is government policy-makers who either don't understand or don't care about the scientific method. Search Google for "Chris Mooney".
  • One of these days a hacker will get into such a system, find all images of people picking their nose, compile that together, and post it on youtube. At that point, then people will start thinking about the Big Brother implications.
    • by moxley ( 895517 )
      See, I always thought it would be when black bag jobs are done on people and people are arrested and sent to secret torture prisons for no just reason....But yeah, I guess boogers are the real threat here.
  • This is the kind of backup that Jack Bauer gets from CTU.
  • "I was at the Kennedy School (of Government at Harvard University) a couple of weeks ago, and some guy got up and said, "If there's a security incident at the Beijing Olympics, it's going to change the course of capitalism forever," and I'm like, 'Oh man!'" Donahue said.

    I don't know which part of that quote from the NYT article disturbs me more.

  • There is a Scottish company that has been doing this sort of video analytics for years. Here is their website if you want to check it out: http://indigovision.com/ [indigovision.com]

    They are in fact the only supplier that has delivered fully digital IP-CCTV for casinos in the United States. Casinos tend to be quite picky when it comes to surveillance. IndigoVision also did the Olympics in Athens, etc. I do not work for them, but I have lived in Scotland and am aware of their business.

    They also have technology similar to IBM's for detecti
  • Irresistible racist joke here. Please forgive me.
    • by enoz ( 1181117 )
      I think "slanty eyes" is the wrong racist stereotype for this story, you hairy-tea-towel-wearing-goathugger.

  • As someone who works in this industry, this is nothing new. Others [coe.co.uk] have [objectvideo.com] done [citilog.fr] it [ioimage.com] for years now (notice that all the companies linked here are based in different countries), but gotta love IBM for taking credit for something which they neither invented nor perfected.
  • Ever since I first got wind of Real ID, I've been predicting a system that would eventually track every citizen in the country, enter their daily activity into a central database, then use the data to rate their activity against their averages and finally flag them as potential threats, alerting authorities to keep a closer eye on them.

    This sounds kind of like the early stages of such a system, except that it doesn't immediately know who you are.
  • Video Surveillance Identifies Threat Patterns

    Maybe they could save some money on expensive computer hardware and use some of those picture-sorting dogs from the next story.
  • No doubt that many special-interest groups will want to disrupt the games to showcase their political agenda.
  • IBM selling surveillance equipment [ibmandtheholocaust.com] to oppressive governments?

  • I think it's a mistake to try to use a computer system to prevent terrorism in this way. It's hard to differentiate between valid and suspicious behavior. Sounds like a disaster waiting to happen.
  • It's only object recognition programmed to match any black, spherical object, approximately 12 inches in diameter, with a burning piece of string protruding from the top.
  • Why pay IBM? I mean he's got the last Precog...
