Likelihood of Success

Ron Coleman’s pretty good blog

Blogophrenology

Posted by Ron Coleman on October 7, 2007

“Blogger” Cory Doctorow utilizes scare quotes — “scientists” and “predict” — to mock a proposed anti-terrorism strategy:

Computer and behavioral scientists at the University at Buffalo are developing automated systems that track faces, voices, bodies and other biometrics against scientifically tested behavioral indicators to provide a numerical score of the likelihood that an individual may be about to commit a terrorist act. “The goal is to identify the perpetrator in a security setting before he or she has the chance to carry out the attack,” said Venu Govindaraju, Ph.D., professor of computer science and engineering in the UB School of Engineering and Applied Sciences. Govindaraju is co-principal investigator on the project with Mark G. Frank, Ph.D., associate professor of communication in the UB College of Arts and Sciences.

Doctorow “links” to this story, which treats the project very seriously, but he obviously “thinks” it’s a joke, comparing it to the discredited discipline of phrenology. He doesn’t explain why it’s like phrenology, but expects us to “get it” that because they are both based on the observation of physical characteristics which supposedly lead to a prediction of behavior, both must be equally lacking in validity.

This is called a “faulty analogy” or “sloppy journalism,” because, in fact, the story linked to suggests that this nascent science may be very promising. Certainly no one is suggesting that anyone could be arrested or convicted on the basis of the observations, technology and algorithms that would be employed, but considering the cost of letting a person who is just a little better at hiding his intentions than Richard Reid get on a plane and hijack it to 70 Virgins Interplanetary Airport, it sounds like a good idea to try this technology.

Cory Doctorow, however, thinks it is a joke, and offensive to his libertarian instincts; and apparently no explanation of why is necessary.

Perhaps the next time some airplane falls out of the sky because Cory Doctorow or the ACLU object to “profiling” and projects such as this are shouted down, Doctorow will “find” someone else to “blame.”


37 Responses to “Blogophrenology”

  1. Ara said

    I don’t fault Cory Doctorow for presuming the reader would be familiar with the negatives of this idea. After all, the idea itself isn’t a new one. Authors as disparate in tone and outlook as George Orwell and Philip K. Dick have explored its implications — mostly focusing on the negatives.

    Would you like to discuss it at greater length? I’d be happy to explore it here with you.

  2. We can if you want. I don’t think merely acknowledging the existence of an argument (and even I recognized it), however, is a substitute for making it when presented with new facts (i.e., the project in question).

  3. Ara Rubyan said

    I hear you.

    So here’s my question: What safeguards could be built into this system to prevent it from being used to haul in whoever the authorities want to haul in?

    In other words, if human judgement can override the result (and I hope to G-d it can) what good is it in the first place?

  4. Jack said

    “In other words, if human judgement can override the result (and I hope to G-d it can) what good is it in the first place?”

    Statistical.

    In anticipating any type of criminal, deviant, or possibly malignant behavior (or, by contrast, exemplary behavior), humans being what they are, prone to behavioral unpredictability or at least to individual idiosyncrasy and eccentricity, such a system is a statistical and behavioral model for anticipating the stated objective or condition within certain pragmatic operational parameters.

    Now it will be far from perfect, or even very capable, in the beginning, just as the first airplane at Kitty Hawk was not a Joint Strike Fighter. Then again, even the most modern aircraft is prone to imperfection; it is simply far less prone to malfunction, and far more capable despite immense increases in complexity, as expertise, engineering, and materials have improved over time.

    The problem is not with the idea behind the system but with the assumptions that a) it must be perfect (it never will be) and b) that over-reaction must be the first response to anticipation triggered by the system.

    There is a sliding scale of investigative procedures to follow if the system is tripped, and an escalating or de-escalating scale of force to be applied once the next step in the process is executed, if and when it is executed.

    What is needed is to prove in testing that the system is more effective than not, and that it can be improved over time as better sub-systems, more data, and more expertise are gained. None of those things can be proven effectively until the system is actually tested in the field.

    That is to say, nothing can be proven or disproven, except theoretically, until a thing is actually tried.

    If, however, it is a failure, then it will likely be discarded for something more effective; and if it is partially effective (and at best, like anything else in life, including people, it will only be partially effective), then it will likely be improved over time if the system proves viable and valuable.

  5. zach. said

    Ara,

    the safeguards seem obvious. Simply make it inadmissible as evidence, or insufficient grounds on its own for detaining someone. If it’s used as a tool to, say, help the police figure out some targets to focus a little more attention on (like telling them these are the people to search before they get on a plane), then I don’t see the harm.

    That said, this seems like yet another technology breathlessly reported and yet with an approximately 0% chance of ever coming to fruition. There doesn’t seem to me to be a convincing physical or biological case, from first principles, for why a device like this would provide a higher success rate than a person. Besides, how much data do we have about the biometric measurables of someone about to commit a terrorist act?
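    That last question can be made concrete with a base-rate calculation. The numbers below are purely illustrative assumptions (no accuracy figures appear in the story): even a detector that is 99% sensitive and 99% specific, screening a population where attackers are one in a million, flags thousands of innocent people for every real one.

```python
def positive_predictive_value(sensitivity, specificity, base_rate):
    """P(attacker | flagged), by Bayes' rule."""
    true_pos = sensitivity * base_rate
    false_pos = (1 - specificity) * (1 - base_rate)
    return true_pos / (true_pos + false_pos)

# Illustrative assumptions only: 99% accuracy, 1 attacker per million travelers.
ppv = positive_predictive_value(0.99, 0.99, 1e-6)
print(f"P(attacker | flagged) = {ppv:.6f}")  # about 0.0001
```

    In other words, under these assumed numbers roughly ten thousand people would be flagged for every actual attacker — which is the statistical heart of the objection.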

  6. Ara Rubyan said

    Jack:

    Sounds good, but one smart cop (or a wise judge) is worth 10 billion algorithms — even at 1,000 teraflops.

    Zach:

    I agree with your assessment of the odds. But that will not stop someone from trying.

    J & Z:

    I presume you have both seen “Minority Report?”

  7. zach. said

    Ara,

    Yes, but the consequences of extreme misuse of a technology don’t invalidate its responsible use. These sorts of pre-screening devices must stop short of being used as grounds for detainment or arrest. It’s all about the safeguards against misuse, something which could be said of all police tools, up to and including old-fashioned human judgment.

  8. Jack said

    “Sounds good, but one smart cop (or a wise judge) is worth 10 billion algorithms — even at 1,000 teraflops.”

    You’re right, about the cop at least (it will be men, after all, gathering the data and programming the parameters), but you can’t have cops everywhere looking at everything. And as for wise judges, those are far rarer. So technology is there when men cannot be.

    And Zach is right here too:

    “Yes, but the consequences of extreme misuse of a technology don’t invalidate its responsible use. These sorts of pre-screening devices must stop short of being used as grounds for detainment or arrest. It’s all about the safeguards against misuse, something which could be said of all police tools, up to and including old-fashioned human judgment.”

  9. Ara said

    Great discussion, by the way.

  10. Ara said

    P.S. Since you’ve both read it, I’ll ask you this:

    If you had to re-write the ending of “Minority Report” to back up your point, how would it read?

  11. Jack said

    I wouldn’t change it, myself.

    It is only fiction, and fiction is supposed to make us question things in order to better prepare for them. That is, fiction often purposely amplifies, magnifies, exposes, and exploits weak points in order to make a moral point, so that trouble can be avoided, if possible, before abuse occurs.

    A reality in which things run well and work as intended does not make for a good story. Or usually a very interesting one.

    Right now I’m writing a fictional book about a future Cold War between Europe and the United States. I fully expect that one day there will be another shooting war between the US and Europe (in real life), with certain nations in Europe allied against the US and certain nations allied with us.

    However in the story I am particularly and purposely amplifying the differences between the US and Europe, not as I really expect them to unfold, but as I expect those differences to excite controversy, to make various moral points, and create a good story for the reader.

    The story warns about possible real things, but it is not real things, so that the reader can be prepared without falling into the trap of assuming things must proceed along the lines the author outlines. That is, a fictional story should serve its general moral and practical purpose without becoming so calcified and fixed to this or that manner of unfolding that it fails to serve its function.

    A fictional story must be open enough that the general principles are understood as still being true in nearly any particular way in which real events actually unfold. If you get what I’m driving at. It can’t be too general to serve no real purpose, or too specific to write itself out of relevance by too narrow a perception of the future.

    The very best fiction tells a fake story about real things, or potentially real things, generally enough to be true without being specific enough to be impossible. And then leaves it up to the reader to understand what that really implies.

  12. Ara Rubyan said

    A fictional story must be open enough that the general principles are understood as still being true in nearly any particular way in which real events actually unfold. If you get what I’m driving at.

    I do.

    The very best fiction tells a fake story about real things, or potentially real things, generally enough to be true without being specific enough to be impossible. And then leaves it up to the reader to understand what that really implies.

    You really haven’t seen the movie have you? Because if you had, then what you just said would destroy the support that anyone would have for a program like the one we’re talking about.

    Unless, of course, it were being done in secret. Which it probably already is.

    P.S. Do yourself a favor: rent “Minority Report” and watch it. If you’ve already seen it, watch it again and then let’s talk.

  13. Jack said

    I have it in my DVD library. Right beside Blade Runner.

    It is though a fictional story. Like Blade Runner.
    That’s my point.

    What is real is not the same as what is fictional.
    Anything is possible if you believe the movies or fiction predict reality.
    But they never really do. Documentaries may interpret and record reality, but fiction does not predict it, it merely implies possibilities about it.

    In 20,000 Leagues Under the Sea the idea of an advanced fictional submarine was presented, including how it would be used for warfare, how it would function, who would command it, what it would fight, where it would be stationed, what the internal components would be, how it would be powered, etc.

    It got the submarine part right, but it is nothing like a real submarine, diesel or nuclear.

    If such a program did exist or were tried (and it would require technology not now available), it would function very differently from what was fictionally imagined. That is, reality might parallel the idea of the submarine, but a real submarine would operate as per the real world, not as per the fictional world.

    The real world is not Star Trek, Star Trek is just one source of possible inspiration for what might one day be achieved in a general sense.

  14. Ara Rubyan said

    It is though a fictional story. Like Blade Runner.

    Thanks for pointing out the bleedin’ obvious.

    Orwell’s “1984” is also fiction. Does that mean we can’t learn something from it?

    If such a program did exist or were tried (and it would require technology not now available) it would function very differently than that which was fictionally imagined.

    Really. And that is based on what, exactly? Hope? Hope is not a plan.

    I’ll say it again: the development of an “automated system” that correlates “scientifically tested behavioral indicators” in order to predict “the likelihood” that an individual may be about to commit a terrorist act is a profoundly flawed undertaking, is probably illegal and is certainly un-American as well.

    Which is not to say it won’t be used in the near future by the next president of the United States. Wouldn’t that be ironic? Hillary Clinton hauls you in for questioning because your nose is too straight or your skin too white — i.e., you look too much like Timothy McVeigh. He was a terrorist, after all, wasn’t he?

    C’mon Jack! Snap out of it. Tell me how this technology actually makes the world a better place. Give me a positive outcome to the scenario set forth in “Minority Report.”

  15. Jack said

    “C’mon Jack! Snap out of it. Tell me how this technology actually makes the world a better place. Give me a positive outcome to the scenario set forth in ‘Minority Report.’”

    I’m afraid if you don’t understand the real difference between fiction and fact, and that fiction does not actually determine fact, then any potential facts I point out will be considered as nothing more than my own personal fiction. (Which considering that nothing like Minority Report technology exists, including that technology described in this thread, would be merely my personal opinion about an imaginary technology presented in a fiction anyways.)

    Now that’s your right of course, to believe that fiction creates reality, I wouldn’t argue it.
    It’s just, strictly speaking, not actually factual.

    By the way 1984 never happened.
    And we won the Cold War.

    It’s an interesting and instructive fiction.
    But it wasn’t much of a reality.

    So it’s kinda hard to argue, “what’s the best real world alternative” to something that has never happened anywhere but in the imagination of a script writer.

    But if you really need one, here’s the alternative.
    The not real technology is used in the proper way to do the thing intended in a way best suited to our laws and culture.
    It ain’t rocket science that way, I’ll grant ya, or even much of a story (after all how many people feel compelled to watch stories on how bulldozers are working every day to successfully build things).

    But I imagine that’s what any potential creators and users would attempt.
    To create, modify, and employ technology as our laws and society might allow.

  16. What is truth?

    And what is fiction?

    A phrenological (!) debate in the discussion section at Likelihood of Success. I’ve got two great commenters going at…

  17. zach. said

    Ara,

    As I said before, the positive ending to Minority Report vis-à-vis the technology Ron is discussing would require the fundamentals of the book to be totally altered. In this scenario, the precognitives’ suggestions would not be grounds for arrest. People would only be monitored by the precogs upon entering an airport or possibly a government building, and that’s it. Searching someone’s bag based on a computer-scored likelihood of future terrorist actions is not the same as charging someone with terrorism based on a computer score. To me the distinction between the two, and the safeguards between one and the other, seem obvious.

  18. zach. said

    ha, that “as i said before” related to a totally different beginning to the comment than what i ended up with! disregard!

  19. Dishman said

    “Minority Report” is a nightmare.

    I saw “Resident Evil” over the weekend. Does that nightmare mean we should stop all genetic research?

    In constructing a nightmare, the author or dreamer is free to set the rules such that only bad things can happen.

    As for the “Minority Report” technology and something good that could come of it, I have one word: Moneyball.

  20. Yu-ain Gonnano said

    I’ll say it again: the development of an “automated system” that correlates “scientifically tested behavioral indicators” in order to predict “the likelihood” that an individual may be about to commit a terrorist act is a profoundly flawed undertaking,…

    You do understand that if you substitute “terrorist act” with “not repay a loan” you get your credit score, right?

    Those seem to work rather well, so I have a hard time calling the process flawed.

    If a validated scorecard is used to determine who goes through stricter searches at the airport (i.e. 0-500 only go through metal detectors, 500-900 have to take their shoes off and get ‘wanded’, and 900-999 get frisked) it may lead to more efficient use of TSA resources.
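    The tiering described above is trivial to express in code. This is a hypothetical sketch using the comment’s own thresholds (the exact boundary handling is my assumption); the score itself would have to come from some separately validated model:

```python
def screening_tier(score: int) -> str:
    """Map a 0-999 risk score to a screening level (thresholds from the comment above)."""
    if not 0 <= score <= 999:
        raise ValueError("score must be in 0-999")
    if score <= 500:
        return "metal detector only"
    if score <= 900:
        return "shoes off and wanded"
    return "frisked"

print(screening_tier(120))  # metal detector only
print(screening_tier(750))  # shoes off and wanded
print(screening_tier(950))  # frisked
```

    The point of a sketch like this is that the score only routes people between search intensities; it never constitutes evidence by itself.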

    Nobody here is talking about using this scorecard as the sole evidence to put people in jail. That was the extremity of Minority Report, and it simply does not apply in today’s world.

  21. zach. said

    Yu-ain,

    I think you’ve hit the nail on the head here. But one aspect which dooms the technology ab initio is that we have tons and tons of data on credit users and the correlation between the information credit scores collect and the risk a credit card company can expect to undertake in issuing a card to someone with that score. I suspect we don’t have anywhere close to that level of data on the correlation between biometrics and the likelihood of terrorist acts.

  22. Jack said

    We’re all Superstars now.

  23. Ara said

    Yu-ain:

    You do understand that if you substitute “terrorist act” with “not repay a loan” you get your credit score right?

    Yeah, I know how you feel about that. I felt the same way too. But then I thought about it some, and here’s what I found: despite the elaborate system of FICO scores, we’re witnessing a collapse of the mortgage industry. Why? Because of an over-reliance on statistical analysis instead of common sense. Data becomes greed’s handmaiden.

    A similar thing will happen with this “terrorist prediction” tool. The system will make things worse not better. Instead of lessening tension and crime, we’ll see an increase in fear, uncertainty and doubt.

  24. Ara said

    P.S. Zach also makes some great points too.

  25. zach. said

    Ara,

    it’s my understanding that the collapse of the mortgage industry came not because of credit scores but despite them: lenders rationalized issuing loans to people with low scores simply by upping the interest. again, it is about responsible safeguards against the abandonment of reason and good sense.

  26. Yu-ain Gonnano said

    1) I never said we had enough data to create a scorecard that works. Just that the concept of creating one was not, in and of itself, flawed.

    2) The collapse of the mortgage industry is not the result of an over-reliance on statistical methods. It’s much more complex than that. And if you’re interested, I’ll be glad to expound. But it’s not an over-reliance on stats.

  27. Without any real details, it seems to me that many are misunderstanding what is being attempted here. I don’t think they are using biometrics in the “he looks Arab so he must be a terrorist” sense, but rather the “he is especially nervous, maybe he is up to something” sense, although how they would distinguish between terrorism and other crime is questionable. A close analogy might be a lie detector, which uses biometric data to determine if someone is lying. I have no idea whether such a thing is at all feasible now, but I don’t doubt that it will be possible someday.

    Which I think leads into the more interesting question of Minority Report, one that the film moved away from. The movie focused on the failure of the system, but to me the moral implications of a successful one are far more interesting. If, as the initial premise of Minority Report suggested, the system was perfect, I would imagine that we would use it, and we would indeed use it much like it was used in the movie. Recall that by the time the action in the movie had taken place, the only ‘crimes’ were impulse crimes of passion, because everyone knew that you couldn’t get away with anything. Would we not arrest people rather than let such a crime happen? If indeed the system was perfect, could we do anything less? Even if the system was just near perfect (and our current system of justice isn’t anywhere close), it seems that the cost of not using it would far exceed that of using it.

    To our minds this is indeed somewhat of a nightmare scenario, but is it a nightmare because innocents might be charged or because none of the guilty could escape justice? I doubt many of us would truly be pleased with a system that was perfectly fair in punishing our foibles.

  28. Ara Rubyan said

    Yu-ain:

    I don’t know if you have a mortgage. I do. When I applied for it, I sat across the desk from my local banker. She knew me. She knew the local community, the neighborhood I was buying into and what my job prospects were in the long term. In other words, she made her decision using unique human factors as well as financial ones. For all I know the next guy she met after I left didn’t meet those criteria and didn’t get a loan. Whatever.

    As we both know, it doesn’t work that way anymore.

    Bankers make a premium (i.e., they have an incentive) to close on a mortgage loan one day and package it up with many others, selling it to large equity firms that monetize and securitize these bundled debt packages. Investors buy these instruments for the income stream they produce.

    Problem is, the human factor is gone. Financial institutions have reverse engineered the system so as to maximize the return on investment…while dropping the human factor, the local factor, out of their decision making process.

    In other words, they’ve factored human judgment out of the equation by using numbers instead of common sense. Instead of saying, “I’m not sure this mortgagee can make the payments,” they’ve jacked up the interest rate on everyone in order to cover their eventual losses.

    It might seem counter-intuitive: how can mortgagees get such low interest rates (or any loans at all) while giving a great return on investment to the anonymous investors?

    I think you know the answer: the initial interest rate to the homeowner is kept very low in the first couple of years. Then it jumps precipitously, sometimes 5 points or more above normal lending rates. Whose fault is that? The mortgage holder’s? Yes.

    But lending institutions also bear some blame for emphasizing short-term gain instead of long term stability of the local market. As a result, foreclosures are way up, people are losing their homes in record numbers and the middle class (the real engine driving growth in this country) is devastated.

    Moral: Data is the handmaiden of greed.

  29. Yu-ain Gonnano said

    Well, you’ve got the effect right, but the cause wrong.

    Reselling loans in the secondary market has been going on for a very long time. Well before statistical analysis was anywhere near providing a projection of risk, much less an accurate one.

    Rates were adjusted for the loss of human-human interaction on the basis of gut feel. Not data. The margins were pulled out of the sales staff’s a$$.

    The recent collapse is due to the recessionary period growing out of the late ’90s and early ’00s, compounded by 9-11. Interest rates were dropped to historic lows almost overnight. This spurred a refi boom. Because people were already payment buyers and not price buyers, the market became hyper-competitive. Lenders first lowered rates to attract borrowers. Then, in order to lower the payment even more, you saw the proliferation of more exotic products: Interest Only (for a term), Neg Am (where, also for a term, you could pay less than the interest) and others. This was driven not by data that said doing so was safe, but by the ‘gut feel’ of the sales side of the business, which needed volume to drive its commissions.

    They believed that by the time the low payment term ended, the borrower would have refied to another product/bank (who cares if the customer is crappy, he’ll refi away before he defaults). And since the payments were so low, it was easy for buyers to overbid on the price of the property. So a house that normally would have sold for 100k now gets sold for 120k (hey, what does the customer care, he’s still got a lower payment than the mortgage on his old 90k house). You do this year after year, and appreciation rates become ridiculous.
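    The payment arithmetic behind those exotic products is easy to sketch. The loan figures here are invented for illustration; only the standard fixed-rate amortization formula is assumed:

```python
def amortizing_payment(principal, annual_rate, years):
    """Fixed-rate monthly payment: P*r / (1 - (1+r)**-n)."""
    r = annual_rate / 12
    n = years * 12
    return principal * r / (1 - (1 + r) ** -n)

def interest_only_payment(principal, annual_rate):
    """Pay only the monthly interest; the principal never shrinks."""
    return principal * annual_rate / 12

# Invented example: a $120,000 loan at 6% over 30 years.
full = amortizing_payment(120_000, 0.06, 30)   # about $719/mo
io = interest_only_payment(120_000, 0.06)      # $600/mo
print(f"amortizing ${full:.2f}/mo vs interest-only ${io:.2f}/mo")
```

    Knocking roughly a sixth off the monthly payment is exactly the room a payment buyer uses to overbid on price.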

    So we have unusually low payments, unusually high house prices, and customers that, for the moment, were turning over their loans before the low payment term ended.

    Meanwhile, all the data are saying, “Hey guys, this is all great right now, but it can’t be sustained long term.” The business line said, “Yeah, we know, but we’ll deal with that later. If we don’t follow the trend now, we won’t make it to later.”

    But then interest rates rose. Boy, did they ever. In fact they rose so fast that the lending market couldn’t adjust in time. The market was so used to such low rates for so long that the market wouldn’t bear the increase in costs. So lenders were borrowing money from the Fed at 8% and lending it out at 6.25%. (As opposed to just a few years earlier borrowing it at 4% and lending it out at 5.5%)

    And the house of cards all falls down.

    I agree with you that greed was the problem. But it was not a greed caused by bad data analytics; it was a greed you feel so deep in your bones that you ignore the data because “it’s no longer relevant” to today’s environment.

  30. Ara Rubyan said

    The margins were pulled out of the sales staff’s a$$.

    Not sure I buy that, but no matter. Let’s say it’s true. Why would they feel the need to do that? Because the data was so important that they would risk their jobs to fake it.

    That’s pretty much all I’m saying — over-emphasis on data modeling leads to disaster.

  31. zach. said

    Ara,

    over-emphasis on data modeling leads to disaster.

    i think we all agree on that. the issue is that that’s not an implication of the above-described technology. it is a way to model data, yes, but it places no lower bound on how much emphasis you give it.

  32. Yu-ain Gonnano said

    Why would they feel the need to do that? Because the data was so important that they would risk their jobs to fake it.

    No, because there was no data at all, and everyone knew it, so gut instinct was accepted. Just as gut instinct was accepted when your loan officer, who knew you, your community, your job market, etc., formed an instinct about your quality as a borrower.

    In your loan, gut instinct (presumably) made the right decision. In this mortgage crunch, gut instinct made the wrong decision.

    An over-emphasis on gut instinct/human understanding leads to disaster too.

  33. Ara Rubyan said

    No offense guys, but this topic now officially bores the crap out of me.

    Looking forward to the next discussion. Thanks again — I really did enjoy it.

  34. Yu-ain Gonnano said

    Can’t blame you.

    It was boring me to write it. 🙂

  35. What, are you guys still here?!

  36. Jack said

    You really stepped into it this time Ron.

  37. […] know.  I don’t always agree completely with Cory Doctorow, and sometimes I think he’s really off base, but this is pretty […]
