Anti-Terrorist Data Mining Doesn't Work Very Well

Presto Vivace and others sent us this CNet report on a just-released NRC report coming to the conclusion, which will surprise no one here, that data mining doesn't work very well. It's all those darn false positives. The submitter adds, "Any chance we could go back to probable cause?" "A report scheduled to be released on Tuesday by the National Research Council, which has been years in the making, concludes that automated identification of terrorists through data mining or any other mechanism 'is neither feasible as an objective nor desirable as a goal of technology development efforts.' Inevitable false positives will result in 'ordinary, law-abiding citizens and businesses' being incorrectly flagged as suspects. The whopping 352-page report, called 'Protecting Individual Privacy in the Struggle Against Terrorists,' amounts to at least a partial repudiation of the Defense Department's controversial data-mining program called Total Information Awareness, which was limited by Congress in 2003."
  • Bets....? (Score:5, Insightful)

    by Gat0r30y ( 957941 ) on Tuesday October 07, 2008 @02:06PM (#25290439) Homepage Journal
    I bet this will not change what they are doing or how they are doing it one bit.
    • Re: (Score:3, Interesting)

      Of course not. And neither major-party presidential hopeful is going to change it, either. We're still going to get stupid hassles from the TSA, and we're still going to get the watch list filled with pointless entries based on the name of someone who might have been seen with someone who was linked to someone who claimed to have been involved in a shooting in Northern Ireland.

      I would seriously consider voting for either one that came forward and promised to cut TSA's authority and streamline the process, getting back to only those people who are basically confirmed problems being on the list, no matter what their views might be on Iraq, Afghanistan, the economy, or offshore drilling.

      • Probable cause is a lot cheaper than data-mining.
      • by khasim ( 1285 ) <brandioch.conner@gmail.com> on Tuesday October 07, 2008 @02:24PM (#25290673)

        I would seriously consider voting for either one that came forward and promised to cut TSA's authority and streamline the process, getting back to only those people who are basically confirmed problems being on the list, no matter what their views might be on Iraq, Afghanistan, the economy, or offshore drilling.

        Vote for me.

        I'd take their "no fly" list and identify every single person on it who was a legitimate threat and either have them under 24 hour surveillance or arrested.

        The mere concept of a list of names of people who are too "dangerous" to let fly ... but not dangerous enough to track ... that's just fucking stupid.

        Think about how many people could be killed in the airport terminal itself WITHOUT getting on a plane ... say during the Thanksgiving or Christmas rushes there.

        What idiot would let the people on that list (if they were really a threat) into a terminal? Wouldn't you expect them to STOP them BEFORE they get into a position to do that kind of damage?

        • by Fulcrum of Evil ( 560260 ) on Tuesday October 07, 2008 @02:31PM (#25290757)
          The no fly list doesn't identify people, just names, and it's very exact, so changing charles to chuck will defeat it. The upshot is that it's utterly useless for stopping bad guys, and you can't even identify who's on there - John Smith is on the list, but there are 10,000 of them.
          • The no fly list doesn't identify people, just names, and it's very exact, so changing charles to chuck will defeat it.

            The FCC has the seven words that can not be broadcast over the air.
            The TSA has the million names that can not be flown in the air.
            The FCC really lost that competition.

          • by Geoffrey.landis ( 926948 ) on Tuesday October 07, 2008 @03:16PM (#25291263) Homepage

            The no fly list doesn't identify people, just names, and it's very exact, so changing charles to chuck will defeat it.

            No, actually it won't. The newspapers are full of stories of people who were detained or forbidden from flying because their name was similar to a name on the list, or a nickname of a name on the list, or a possible alternative spelling of a name on the list, or a name that had once been used as an alias of a name on the list.

            For example, the name "T. Kennedy" was on the list. Senator Edward Kennedy (whose name does not begin with "T", but who is nicknamed "Teddy") was stopped:
            from Wikipedia [wikipedia.org]

            "In August 2004, Senator Ted Kennedy (D-MA) told a Senate Judiciary Committee discussing the No Fly List that he had appeared on the list and had been repeatedly delayed at airports. He said it had taken him three weeks of appeals directly to Homeland Security Secretary Tom Ridge to have him removed from the list. Kennedy said he was eventually told that the name "T Kennedy" was added to the list because it was once used as an alias of a suspected terrorist. There are an estimated 7,000 American men whose legal names correspond to "T Kennedy". (Senator Kennedy, whose first name is Edward and for whom "Ted" is only a nickname, would not be one of them.)"

            • by lgw ( 121541 ) on Tuesday October 07, 2008 @03:36PM (#25291499) Journal

              Right, but flying under your middle name does work. As does claiming that you lost your ID (but if you refuse to show it on principle, you can't fly). As does using one boarding pass with matching ID at security, and a different boarding pass with matching ID at the gate.

              The really sad thing is, the people who the government feels are a real threat based on strong intelligence are *not* on the no-fly list! The government doesn't want to reveal to the real suspects that they're being watched.

            • Boneheaded security guard:

              tee? Sounds like ee.
              No fly fur u.

              Being a security guard, so easy even a caveman can do it.

            • Re: (Score:3, Interesting)

              by kalirion ( 728907 )

              What would happen if terrorists got nicknames after all major U.S. and U.K. political figures?

              • What would happen if terrorists got nicknames after all major U.S. and U.K. political figures?

                I'd say an insert error (duplicate key) or a hash collision.

            • It probably uses Google search engine or similar. Tell TSA-guy you're German and the umlauts are part of your name.

              At $8/hr TSA-guy isn't paid to think.

            • by Free the Cowards ( 1280296 ) on Tuesday October 07, 2008 @04:48PM (#25292473)

              It doesn't matter, because the only place where you have to get your ID checked is at the TSA checkpoint, and they don't check it against any databases.

              So, the easy recipe for bypassing the no-fly list is:

              1. Purchase tickets in a fake name.
              2. Check in at home before your flight, and print your boarding pass on your home printer.
              3. Using any number of techniques which are trivial to the computer literate, capture that boarding pass, alter it to match your real name, and print a second copy.
              4. When you arrive at the airport, go straight to the security checkpoint.
              5. Use the altered pass with your real name in combination with your real ID to get through security.
              6. Use the original, non-altered pass to board the plane.

              I flew as recently as last month and was not subjected to anything which would defeat this scheme. It fails if you need to check luggage, but I doubt a terrorist is going to be doing that. The no-fly list is such an obvious joke.

        • by mcgrew ( 92797 ) *

          Wouldn't you expect them to STOP them BEFORE they get into a position to do that kind of damage?

          Perhaps in Soviet Russia or Nazi Germany, but in the "land of the free and the home of the brave" that just ain't how we're supposed to do things. You wait until BillyBob steals your cow and rapes your horse, THEN you string him up! Damned horse-thievin' cattle rustlers.

          But I guess we are no longer the land of the free, nor the home of the brave. Yellow ribbon? Yellow alert? "Yellow" used to mean "cowardly". I'm

          • Yellow ribbon? Yellow alert? "Yellow" used to mean "cowardly"

            And yellow ribbons were emblematic of the US Cavalry, yellow lights are used as warnings, etc.

            "Yellow" means cowardly. Except when it doesn't. And it doesn't, more often than it does, when the word is used to modify another word (except that "yellow belly" still means cowardice).

        • by xant ( 99438 )

          I'd take their "no fly" list and identify every single person on it who was a legitimate threat and either have them under 24 hour surveillance or arrested.

          You might get a few votes from that. But your actual suggestion is impossible. The list hit a million names lately. Gonna investigate a million people? BTW, you have to investigate everyone in the world, not just people in the US.

          The list should be burned, along with everyone responsible for its creation. (Oops, I probably just went on the list. O

    • I bet this will not change what they are doing or how they are doing it one bit.

      Sadly... no. It means they need to do more of it, with even more control.

    • Re:Bets....? (Score:4, Interesting)

      by megamerican ( 1073936 ) on Tuesday October 07, 2008 @02:19PM (#25290605)

      I bet this will not change what they are doing or how they are doing it one bit.

      They'll be sure to change the amount of money spent on the program. I don't need to clarify whether it'll be more or less, it's too obvious.

      Whenever something doesn't work in government it seems to get more money and more power.

      That leads me to think that maybe the primary function of government is to pretend to fail.

      • Re: (Score:3, Funny)

        by Gat0r30y ( 957941 )

        That leads me to think that maybe the primary function of government is to pretend to fail.

        Why would they need to pretend? They seem to be quite practiced at failing for real to me.

      • by rtb61 ( 674572 )

        It really all relates to political appointees, those who get their positions purely as a result of which political candidate they supported, not on their skills and qualifications. This is not to be confused with skilled employees within government who attained their position in much the same fashion as skilled employees in private industry. So the political appointees are always looking for those magic box solutions where the computer will do their job for them, because they are not sufficiently competent

    • I wanted to mod parent but there's no "depressing but true" choice.
    • Re: (Score:3, Insightful)

      by Shotgun ( 30919 )

      The question is, "What will you replace it with?"

      No, they will not listen when you say the obvious, which is "Get a real job."

    • Re: (Score:3, Insightful)

      by SecurityGuy ( 217807 )

      Your lack of faith is completely unwarranted. After all, when the polygraph was shown to be unreliable and thrown out as evidence of guilt...

      Right. Nevermind.

    • Obviously the Evil Terrorists(tm) got their hands on the report and altered it! We need to DO SOMETHING!(tm) Won't anyone think of the children?

  • I'm probably going to get creamed for this, but what is that image linked to?
    I'm young, get over it.
    -Taylor

  • In other news, (Score:4, Insightful)

    by toby ( 759 ) * on Tuesday October 07, 2008 @02:10PM (#25290479) Homepage Journal
    The Constitution is there for a reason.
  • by frank_adrian314159 ( 469671 ) on Tuesday October 07, 2008 @02:12PM (#25290509) Homepage

    In other news, water is wet, the Pope is Catholic, and Ursines excrete solid wastes in silviculture.

  • by Reality Master 201 ( 578873 ) on Tuesday October 07, 2008 @02:12PM (#25290513) Journal

    And several billion dollars.
    And unrestrained access to all of the personal information about everyone that can be gotten by whatever means.

    It'll probably still suck then, too.

  • Seems (Score:4, Insightful)

    by speroni ( 1258316 ) on Tuesday October 07, 2008 @02:12PM (#25290515) Homepage

    What we really need are spies. Not so much in the US; here, good old fashioned detective work (with warrants) should work.

    But overseas a standing army isn't going to do anything to quell terrorism. Tanks and planes will only inspire more terrorism. What we need are good old fashioned black ops. Undercover agents penetrating the terrorist groups and talking to the bad guys. Much less collateral damage as well.

    We'd get a lot further with a couple guys with silenced pistols rather than a whole army.

    • Re:Seems (Score:5, Interesting)

      by Martin Blank ( 154261 ) on Tuesday October 07, 2008 @02:17PM (#25290581) Homepage Journal

      I seem to recall that much of this was gutted by Congress in the 1990s when they didn't want intelligence operatives paying off criminals for information, on the risk that the money might be tied back to the United States. This severely nerfed the ability of the CIA (among others) to gather HUMINT, as paid informants were a significant source of the information required to infiltrate the groups in the first place. I don't recall if this was ever overturned, though.

      • Another thing which is holding back the CIA, etc. from infiltrating the fundamentalist Islamic terrorist groups is the lack of Arabic/Farsi/Urdu speakers. There aren't enough teachers because people from those countries who want to migrate here to teach those languages are having a bugger of a time getting visas. It seems like that should be a priority for the government to make sure that people with those crucial skills are encouraged to come and have to deal with less bureaucracy--just expedite their b
      • As can be seen in the recent "terrorist" arrests in the US. Once you start paying people to turn in "terrorists", you start a market in "terrorists".

        So the guy who wants to sell a "terrorist" to the government finds some idiot who meets the basic criteria (non-Christian, non-white) and encourages that idiot to make inflammatory statements while being recorded.

        Ka-CHING!

        • by Duradin ( 1261418 ) on Tuesday October 07, 2008 @03:07PM (#25291169)

          Reminds me of a bit from Discworld.

          To summarize, Ankh-Morpork was overrun by rats. The obvious solution was to put a bounty on rats, payable per tail. Soon, the rat infestation was under control but the number of tails being brought in kept increasing.

          The Patrician's solution: tax the rat farms.

        • The GP isn't calling for vigilante groups turning in terrorists. He's calling for old-fashioned cloak-and-dagger HUMINT. It works far, far better than the technological circus we are operating now. Humans will always outsmart machines made by humans. The only real accomplishment of mass government data mining is the oppression of the general public, who aren't interested in outwitting the government. They're just trying to live their lives.

          In the old days (Revolution, World Wars, Cold War), when we were aware of our enemies, spies, analysts and cryptographers defeated the enemies with courage, brainpower and skill. Now we've replaced them almost entirely with people in offices. This isn't going to change until we have another wakeup call, and the next one will probably come from Russia. The red bear is back, and we aren't prepared to deal with it (or China). Much of Russia's new technology is ahead of the US, particularly in the aerospace and submarine areas. We do not have a real missile shield, we do not have space-based weapons, we do not have supercavitating torpedoes (or anything to stop them). About the only encouraging developments we do have are in robotics and lasers.

          China isn't very technological (except for those nasty anti-sat weapons), but they have an enormous mountain of people they don't mind sacrificing for whatever they dream up. Their standing army is over 2 million. They're also currently building and testing over one ballistic missile a week.
          2005 article [bbc.co.uk] 2007 Article [aviationweek.com] Oct 6, 2008 [heritage.org]

          Terrorist data mining won't help much of anything when an EMP hits and the computers are fried.

          • by mangu ( 126918 )

            Humans will always outsmart machines made by humans.

            This assertion is not true [wikipedia.org], not at all [wikipedia.org].

            We need humans to collect data, but if we don't use the analysis power of machines, it will be human against human. Who do you think has a bigger chance of success, a suicidal terrorist who is determined to cause harm at any cost in an open society, or an undercover agent who tries to infiltrate a closed group of fanatics?

            In the old days (Revolution, World Wars, Cold War), when we were aware of our enemies, spies, analysts and cryptographers defeated the enemies with courage, brainpower and skill.

            • by tobiah ( 308208 )

              Deep Blue lost the first few rounds with Kasparov, and the team of engineers and chess players that managed it had to tweak it repeatedly to win the tournament. Not to mention that chess is a much, much simpler scenario to model. Without humans to guide the machines, people are going to identify their weaknesses and run circles around them, because they'll fall for the same trick every time.
              Everything we know about the 9/11 attack indicates that traditional crime prevention methods would have worked if any of

      • Re:Seems (Score:4, Interesting)

        by Ethanol-fueled ( 1125189 ) * on Tuesday October 07, 2008 @02:35PM (#25290833) Homepage Journal

        "...gutted by Congress in the 1990s when they didn't want intelligence operatives paying off criminals for information..."

        They're still doing it here in the US. The FBI paid a shady informant 230,000 bucks to rat out harmless, loud-mouthed nobodies as part of this [militantislammonitor.org] case:

        The government had no direct evidence. The confession was vague and even contradictory. And the statements about attacking American targets came only after heavy prompting from FBI interrogators.

        America's FBI: "Incompetence and Pusillanimity through Proxy".

      • You've been watching too many Hollywood movies. Intelligence officers don't carry silenced guns with them, they are collecting and analyzing vast amounts of data - a very boring job actually.
        Remember that about 80% of top secret information is in some way or another published in mass media. The remaining 20% are known by locals. And I'm speaking about military secrets here. True story - a military engineer was sent to an apparently top secret facility located in a small town. He was instructed t

        • Re: (Score:3, Informative)

          I'm actually well aware of how intelligence works. Merely cultivating contacts is an arduous process, because pushing it too fast can cause them to become suspicious and either stop talking to or actively turn on the recruiter. Some are eager to provide what the recruiter wants, and some take years to provide any useful information.

          Your 80/20 assertion is at least partially incorrect, because if it were correct, the US would have been far less worried about the Soviet space program in the later part of the 1960s, and

          • As this is Slashdot, you probably think of Open Source as a software development model. But have you heard about OSINT?

            OSINT (Open Source INTelligence). 35% to 95% of the information gathered by US intelligence agencies is gathered from open sources, depending on the region, language, etc. Yet OSINT gets only about 1% of total intelligence funding. (Loose quotation from the CIA's "Studies in Intelligence", which I used to read years ago.) OSINT has a very long history too -- there was a journalist who has made an article tha

            • Re: (Score:3, Informative)

              Yes, I know about OSINT. It still doesn't replace SIGINT, which cannot replace HUMINT. They're all interlocking pieces of the intelligence realm. HUMINT is more expensive than OSINT, and SIGINT is more expensive than HUMINT. Costs for all of them reach points of diminishing returns. A satellite that shows movements in real time at 1m resolution is better than nothing. Improving that to .5m may cost ten times as much but deliver only five times the value. Improving it to .1m may cost 100 times as much

    • But if spies could work, then so could computers.

      It's not like you can send in a spy and they try and get a job at the Defense Ministry... there is no terrorist defense ministry.

      You have to send in a spy, and he or she is going to go around 2000 villages in northern Pakistan trying to find a terrorist ring. What do you do? Ask, "hey, do you know anyone who has an atomic bomb?"

      In fact, you need to have lots of spies talking to lots of people, almost like a police state, and that set of spies has to be on to

    • Re:Seems (Score:5, Insightful)

      by Fulcrum of Evil ( 560260 ) on Tuesday October 07, 2008 @02:36PM (#25290847)
      No, what you need is to stop making people hate you - go after al Qaeda, sure, but the guys killing soldiers in Iraq aren't terrorists for the most part, they're resisting a foreign invader. Tell me, does Canada have a big problem with foreign terrorists?
  • by davidwr ( 791652 ) on Tuesday October 07, 2008 @02:15PM (#25290561) Homepage Journal

    As any Cold War spy can tell you, if you "fit the profile" of a normal law-abiding person with just enough "off-perfect" things in your life so you don't seem "too perfect," it's much easier to blend in.

  • by Mycroft_514 ( 701676 ) on Tuesday October 07, 2008 @02:17PM (#25290585) Journal

    There's no way to collect and process enough data in any kind of real time to make this work.

    Years ago, we were playing with a design of a system to track all the phone calls made on the AT&T network over a 3 month period (not record the calls, just track the billing info). The machine that management wanted to try and do it on could not hold enough data just to store it, let alone process it. And that was the largest theoretical model of the machine there was (about 4 times the size of the largest one in use at the time). They really needed one about 10 times as large as the largest theoretical one, just to store the data!

    Multiply that by the rest of the items one buys during the day, and we cannot track all the data that is out there.

    Why did they even waste the money to do the testing and the reports?

    • Re: (Score:3, Interesting)

      by DeadDecoy ( 877617 )
      The problem isn't really the amount of data but rather the lack of a clear definition of when data is coming from a real terrorist or not. In natural language processing, it's fairly straightforward to say that some words in a certain context fall under a part-of-speech tag 10% of the time, though the math can be a little tricky. In mining for 'terrorists' your results can be hindered by ambiguity, subterfuge, or context. For ambiguity, I could tell a friend over the phone that he has to bomb a building at 5:00 AM to unlo
      • Did somebody define "terrorism" already? Last time I saw, the UN was still arguing over it, and it created several hard arguments.
        • People can try, then what happens is the language morphs because politicians start dicking with semantics to support their policies. We're not 'terrorizing citizens' we are 'strategically suppressing anarchists' or 'overthrowing a totalitarian state'. Thing is, terrorists probably don't see themselves as terrorist and once they are defined as such, they'll just relabel what they're doing to justify their motives. Take for instance torture. In the US we don't torture, we just use alternative interrogation te
    • What you need is research into good multistep filters: you shouldn't be trying to swallow all the data, you should be trying to throw away as much as possible. This will reduce your central processing boxes to something manageable, possibly even cheap.
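
      A minimal sketch of that multistep idea (Python; the record fields, predicates, and thresholds are hypothetical, the shape is the point: cheap checks first, so the expensive analysis only ever sees a tiny residue):

      from typing import Callable, Iterable, Iterator

      Record = dict

      def expensive_link_analysis(rec: Record) -> bool:
          # Placeholder for the slow, analyst- or compute-heavy step you can only
          # afford to run on the few records that survive the cheap filters.
          return False

      def cascade(records: Iterable[Record], stages: list[Callable[[Record], bool]]) -> Iterator[Record]:
          # Run each record through increasingly expensive filters; drop it at the first failure.
          for rec in records:
              if all(stage(rec) for stage in stages):
                  yield rec

      stages = [
          lambda r: r.get("flagged_jurisdiction", False),      # stage 1: trivial field check, rejects most records
          lambda r: r.get("cash_transactions", 0) > 10_000,    # stage 2: cheap aggregate lookup
          expensive_link_analysis,                             # stage 3: costly, sees only the survivors
      ]

      survivors = list(cascade(records=[], stages=stages))     # feed in whatever record stream you have
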
  • by MozeeToby ( 1163751 ) on Tuesday October 07, 2008 @02:19PM (#25290603)

    I thought we already knew this. If the algorithm comes back with even .1% false positives the system is totally worthless. There are 365 million people in the US; .1% means that the FBI/CIA/NSA would have 365,000 people to investigate. Now go and talk to someone in the AI field and see if even .1% false positives is possible.

    I'm betting that if a system is going to catch any decent percentage of terrorists (greater than 50%) the false positive rate will be above 1%. Even if you only apply the system to a relatively small number of people (say, people entering and leaving the country) you are going to have hundreds of thousands, if not millions, of people to investigate. Combine any kind of realistic false positive rate with the fact that about .00001% of the population deserves to be investigated and the system is worse than worthless; all it will do is distract from the people who should be investigated.
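
    A back-of-the-envelope check of those numbers (Python; it simply takes the figures above, 365 million people, roughly .00001% of them actual threats, a .1% false positive rate and a 50% catch rate, as given, not as real statistics):

    population     = 365_000_000   # the parent's figure
    actual_threats = 36            # about .00001% of that population, per the parent
    false_pos_rate = 0.001         # the optimistic .1% case
    catch_rate     = 0.50          # "greater than 50%", so call it half

    false_positives = (population - actual_threats) * false_pos_rate
    true_positives  = actual_threats * catch_rate
    flagged = false_positives + true_positives

    print(f"people flagged for investigation: {flagged:,.0f}")                          # ~365,018
    print(f"actual threats among them:        {true_positives:,.0f}")                   # 18
    print(f"chance a flagged person is real:  1 in {flagged / true_positives:,.0f}")    # ~1 in 20,279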

    • Maybe not exactly, see, you are assuming that .1% is referring to every American, where in fact you are talking about a subset of Americans. If you are Hispanic you probably will not be checked or even have your name in the database (unless they turn this into an illegal immigration thing). If you are African American you will probably stay off of the list if your last name is "American" like Smith or Jackson, but if it is something Muslim you're going on the list. If you are White you should have little to worr
    • I might still go along with a system to find the dumbass terrorists. Not every bad guy is smart, so you're going to get some obviously suspicious activity every once in a while. So if a system spots some guy renting a Ryder Truck, bought 10,000 kg of fertilizer, isn't a farmer, etc, yeah, knock on that guy's door. It's pretty uncommon, but checking a few of these, even if most are false positives, might not be a bad idea.
      • by Shotgun ( 30919 )

        We want to catch the ones that will strap a bomb to their chest and walk into a crowded venue before blowing it. Those aren't going to be your rocket scientists.

        Remember, the ones caught in the UK bombings used their cellphones as triggering devices. Their personal cell phones. With messages and addresses and various other data that an intelligent person that didn't want to get caught might consider incriminating.

    • actually it's worse than you portray. it's like pornography. even if you had access to someone's entire history and could detain and interview them indefinitely, do you think that you would ever be able to determine if they were a 'terrorist'? in the absence of some verifiable proof of an act, what would you base your determination on?

      • its like pornography

        Is it me or did you start trying to make a point and then you just stopped and switched to something else?

        • maybe.

            pornography in the sense of not being amenable to any objective definition. 'you know it when you see it'

    • They do it in the movies so it must be possible.

      If you're not doing anything wrong then what are you worried about?

    • The biggest problem is actually not the false positives - that would just mean extra wasted effort to screen the individuals, which "only" costs time and money.

      The larger problem is that in order to do any real good you need an unbelievably low false negative rate. Let's take the 9/11 hijackers as an example: they were only about 0.00000667% of the population. Unless you could capture all but 2 or 3 of them, you're still vulnerable to the plot unless you can get one of the ones you captured to spill the bea

      • Re: (Score:3, Insightful)

        The biggest problem is actually not the false positives - that would just mean extra wasted effort to screen the individuals, which "only" costs time and money.

        No. False positives "only" cost the government time and money. For the individuals falsely suspected, it could cost them their career, their relationships, their home, and their freedom, depending on how much "time and money" the government spends on them before realizing they are innocent. (If they ever do, since -- as shocking as it sounds -- there

        • No. False positives "only" cost the government time and money. For the individuals falsely suspected, it could cost them their career, their relationships, their home, and their freedom, depending on how much "time and money" the government spends on them before realizing they are innocent.

          Not exactly - it also depends on what you DO with the false positives; the negative consequences that you mention only come about if the government takes concrete action against them based solely on "fitting the profile

  • Take off the limits placed by congress!

    3 2 1 ...
  • That notwithstanding, researchers did suggest that stores could sell more Kalashnikovs by placing them next to the diapers...

  • if your goal is intelligence gathering, data mining is rather weak. signs and portents. an increase of chatter here, an interesting whisper of a phrase there. nothing even remotely solid or actionable, but perhaps something to attune your intelligence gathering in your more concrete and reliable methodologies

    datamining is something to back up a hunch, something to suggest an avenue to look where you might find more, something better than a wild ass guess about where to look. but certainly not a front line tool, and certainly not the first place you visit, nor proof of anything. its not evidence, its just scattershot impressionism, to guide you in vague ways. your front line tools are spies and moles.

    • datamining is something to back up a hunch, something to suggest an avenue to look where you might find more, something better than a wild ass guess about where to look. but certainly not a front line tool, and certainly not the first place you visit, nor proof of anything. its not evidence, its just scattershot impressionism, to guide you in vague ways. your front line tools are spies and moles.

      Brilliant piece of taoistic AI poetry, sir.

    • data mining is not completely useless. just almost completely useless

      Access to data (bank statements, insurance, visa, travel records, credit card records, etc.) is what's valuable. Putting it together with millions of other people's data is like trying to make a cake by mixing everything in the kitchen together. It's not going to work and all you get is a fucking mess and a waste of time and resources (plus or minus any number of lives ruined by false charges).

      It is useless. Only dropping darts from the ISS

  • The actual report (Score:5, Informative)

    by Americano ( 920576 ) on Tuesday October 07, 2008 @02:48PM (#25290979)
    I know this is slashdot and all, but if anybody's actually interested in looking at the full report, it's available for reading in pdf format [nap.edu] online.
  • So, our super-duper-not-Orwellian-really datamining system can't be used to save us from The Terrorist Threat(tm) because of too many false positives. Luckily, I have a solution. These so called "false" positives are guilty of obstructing justice and making us all more vulnerable to terror.

    See, no more false positives!
  • by gknoy ( 899301 ) <gknoy&anasazisystems,com> on Tuesday October 07, 2008 @03:16PM (#25291257)

    I realize this is likely starting to sound old, but Cory Doctorow's Little Brother should be required reading for people doing something like this. His writings about the "Paradox of the False Positive" are enumerated there, but also in other sources:

    http://www.guardian.co.uk/technology/2008/may/20/rare.events [guardian.co.uk]

    Statisticians speak of something called the Paradox of the False Positive. Here's how that works: imagine that you've got a disease that strikes one in a million people, and a test for the disease that's 99% accurate. You administer the test to a million people, and it will be positive for around 10,000 of them because for every hundred people, it will be wrong once (that's what 99% accurate means). Yet, statistically, we know that there's only one infected person in the entire sample. That means that your "99% accurate" test is wrong 9,999 times out of 10,000!

    Terrorism is a lot less common than one in a million, and automated "tests" for terrorism (data-mined conclusions drawn from transactions, Oyster cards, bank transfers, travel schedules, etc.) are a lot less accurate than 99%. That means practically every person who is branded a terrorist by our data-mining efforts is innocent.

    (emphasis mine)

    And, as others have pointed out, this system is likely to have a false positive rate higher than 1%.

    • That's not actually a paradox of statistics, it's merely a journalist showing that he misunderstands statistics (what else is new?).

      The issue is that if we *know* that there's 1 terrorist and 999,999 innocents, then every time the test is used on a random person, we learn something new about the people who haven't been tested yet. So the 99% accurate rating for the test is not correct after the first time it is used.

      Put another way, the (standard meaning of) 99% accuracy rating refers to an experi

  • Is it just me, or should they be data mining for terrorists, not anti-terrorists? It is the War on Terrorism, right, not the War on Anti-Terrorism?

  • We got here thanks to all the whining and complaining that The Man(tm) was unfairly targeting minorities (and, eventually, everyone) by profiling. Then, people bitched when they started searching grannies too in order to show "random sampling." Well, without being able to target folks based on statistical likelihood because of cries of racism/bigotry and then not being able to search anyone because it was done "stupidly," what are they left with? Blanket searches in the hope that looking in the right spots

  • Having personally used Multiple Data Mining techniques for several years now - it's not that Data Mining doesn't work, rather it's how it's used. Data Mining is great at trend forecasting and if you're really good at what you're doing in it you can factor in probabilities of certain future events. The one key factor in data mining is a "Training Set" of Data to teach the machine(s) how to recognize the patterns. Since I suspect terrorists come from every walk of life, every known nationality, and are using 1 o

  • Defense and LE contractors will pay big PAC money to politicians and beltway bandits to dispute these findings. They see the Iraq war winding down and the economy faltering, so the government trough could run dry. Data mining was a big bucks adventure, but not anymore if this report is taken seriously.
  • Any chance we could go back to probable cause?

    There is no contradiction. Those flagged by the software can be quietly investigated by the government... But the existence of any such investigation shall not be deemed grounds for, uhm, anything — none of the "No Fly" list bullshit, etc... We've always had the notion of "innocent until proven guilty" — but we have not always followed it, because "there is no smoke without fire". Well, there can be — if the smoke-detector raises a false alarm.

    • Let the software pick up suspects.

      Let's not. It does a shitty job at it.

      Much like a broken smoke detector waking you up in the middle of the night is not grounds for rejecting the idea of automatic smoke detection altogether, this technology can be extremely useful.

      If the smoke detector goes off every single night, and there's only a fire on average once every 100 years, does that make a case for or against that smoke detector?

    • by Tuoqui ( 1091447 )

      "there is no smoke without fire"

      Sounds like someone has never had a toaster before.

  • overfitting (Score:3, Insightful)

    by glyph42 ( 315631 ) on Tuesday October 07, 2008 @04:32PM (#25292197) Homepage Journal
    I said it before and I'll say it again: Any model that is built on 10 or 20 positive examples from a population of 6,000,000,000 is going to suffer from overfitting. Not just a little overfitting... I mean it's going to overfit like a mo-fo. There's just no way, and I mean NO way, to create a statistically significant test based on the data we have on who is and who is not an ACTUAL terrorist.
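
    A tiny synthetic illustration of that point (Python with numpy and scikit-learn; the data is pure noise and has nothing to do with any real watch list): positives and negatives come from the same distribution, so a classifier trained on 15 of the 20 "positives" simply memorizes them and learns nothing that carries over to the 5 it never saw.

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(0)
    n_negatives, n_features = 50_000, 40

    # Everyone, "terrorist" or not, is drawn from the SAME noise distribution,
    # so there is genuinely nothing to learn.
    X_negative = rng.normal(size=(n_negatives, n_features))
    X_positive = rng.normal(size=(20, n_features))          # the "10 or 20 positive examples"

    X_train = np.vstack([X_negative, X_positive[:15]])      # train on 15 of the 20 positives
    y_train = np.concatenate([np.zeros(n_negatives), np.ones(15)])

    model = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)

    print("recall on the 15 positives it memorized:", model.predict(X_positive[:15]).mean())   # 1.0
    print("recall on the 5 positives it never saw: ", model.predict(X_positive[15:]).mean())   # ~0.0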
  • by hey! ( 33014 ) on Tuesday October 07, 2008 @04:32PM (#25292215) Homepage Journal

    I was wondering whether techniques of commercial data mining could be applied to environmental problems like emerging disease surveillance.

    Well, of course they can. The question is how far is it from practical? I think, pretty far from being as practical as it is in business.

    First of all, businesses have a great deal of their object model in common: they have common concepts like customers, products, sales, brands etc., which form a common framework in which they can do all kinds of creative thinking, or, failing that, discover relationships using some kind of machine learning.

    Secondly, when you are dealing with business data, the most important events tend to be common events. The most important common event is when a customer buys something. When you talk about something like a new disease emerging, or somebody committing a crime like hijacking or bombing, the most important events are exceedingly rare, but catastrophic. Therefore the connection between the events we do have in abundance and the events we are interested in is tenuous, poorly statistically attested to, and in many cases pure conjecture.

    Finally, a lot of what businesses use data mining for is tweaking marginal costs and revenue by shifting dollars that were already going to be spent from one place to another. Offer product A to this web visitor instead of B. Stock more of item X in the store rather than Y. If you really don't know a priori whether X or Y will sell more profitably, you probably aren't going to go too wrong.

    In something like environmental monitoring, you create expenses that weren't already there. No, you can't drain this lake because the model predicts a 5% marginal increase in the probability of human cases of hantavirus in the area. To somebody counting on the economic value of draining that lake, that's a brand new cost that wasn't there before.

    Same goes, even more so, for deciding somebody is a danger to society.

    Now let me say that I have no doubt that data mining will lead to more terrorists being thwarted or captured, compared to doing nothing else. Of course so would a lottery, but I suspect that data mining is a great deal better at identifying good suspects than a lottery. However, for the reasons I noted above it is not going to be particularly accurate, certainly not compared to probable cause. Furthermore, the marginal cost of the false positives gained seems likely to exceed the marginal value of the false negatives lost, if such things could be quantified.

  • Just like anything else, there's a right way to go about something
    and a wrong way to go about something. Data mining for terrorists is
    quite possible (despite what the article and report claim).

    However, certain things are just obviously wrong to anyone that's
    ever tried this sort of thing. Namely, you don't try to identify
    people by name only. This is why the no fly list is such a joke.

  • Bloody hell. Somebody should be shot for wasting our resources when our infrastructure is threatened with neglect.

  • by sydbarrett74 ( 74307 ) <sydbarrett74.gmail@com> on Tuesday October 07, 2008 @10:04PM (#25295287)
    Why do people still stubbornly insist that flagging 'terrorists' was ever the reason for all of this data-mining? Don't people understand the hidden agenda is to develop detailed dossiers on every single ordinary US national?
  • Just compile a list of all the extremely illegal and unethical things you're doing as a government and find the groups of people most impacted.

    Let those people simmer for 5-10 years under your asshattery and let cool. Presto! A tasty terrorist.. Bon Appetite!

  • by HuguesT ( 84078 ) on Wednesday October 08, 2008 @01:09AM (#25296405)

    It turns out that terrorism in western countries is a very rare thing, outside of a few hot spots like Spain's Basque region. This is very good, by the way.

    Mining for rare events is extremely difficult. Bayes's rule indicates that if in a database there are 0.01% actually suspicious events and your mining algorithms are extremely effective at 99% accuracy, then you still have an approximately 100:1 false positive ratio, which makes the mining still useless.
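
    Spelling that calculation out (reading "99% accuracy" loosely as both a 99% detection rate and a 1% false positive rate):

    \[
    P(\text{suspicious} \mid \text{flagged})
      = \frac{0.99 \times 0.0001}{0.99 \times 0.0001 + 0.01 \times 0.9999}
      \approx 0.0098
    \]

    So for every genuinely suspicious event that gets flagged there are about 101 innocent ones, which is the roughly 100:1 ratio above.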
