
OPINION

How Technology Could Prevent Another Paris-Like Attack

What I find fascinating is that for all the effort the intelligence community puts into violating our privacy, it still isn't able to stop attacks like the one in Paris. Now its members are complaining that it is our fault for adopting encryption that blocks their often-illegal views into citizens' personal lives.

I think that even if encryption didn't exist, they still would be ineffective. Looking back at 9/11, there was no lack of intelligence indicating an attack was imminent. That intelligence was clear enough to establish guilt after the fact, but there was no way to get it to a decision maker who could — or would — make a timely call to keep the event from happening.

Our system is set up largely to punish people who commit crimes after the fact — not to keep a crime from happening in the first place. That mindset will need to change if we actually are to become safer. Something else that will need to change is the separation between citizens and law enforcement. Rather than being treated as part of the problem, we — the folks supposedly being protected — need to be part of the solution, and I think there could be an app for that.

I’ll go deeper into that this week and close with my product of the week: IBM Watson, which could be at the core of not only this effort, but also of efforts to make us healthier and happier.

Minority Report vs. Current Law Enforcement

Restraining orders don't work, and there is a reason: Law enforcement is based largely on the concept of punishment as a deterrent. If deterrence fails, then it is about catching and punishing the perpetrator to serve as an example, in order to prevent the next crime. This works to a degree for financially motivated crimes like theft and illicit drug sales, because there is a risk-reward balance when criminals think strategically: They have to weigh the reward of the crime against the cost and risk of going to jail.

With crimes of passion, folks tend not to think strategically. They act in the moment, and while they certainly may regret their violent action, punishment plays little role in the decision process, because they don't consider it until after the act is committed.

With suicide bombers, the plan is to be dead. Even if they think strategically, at least for now we have no way to punish people who are dead. In fact, the threat of punishment may help ensure the outcome, because the only people punished in these cases are those whose bombs fail to detonate. Facing capture, and possibly torture to extract information about the rest of the plot, a bomber has every incentive to succeed, so the punishment simply ensures the outcome society wants to prevent.

That is the concept behind the movie Minority Report — stopping crimes before they are committed, which supposedly is the goal of most antiterrorist activity. Since the psychic approach probably won’t work, that movie isn’t much help. However, we have tons of people out on the streets with smartphones, and we have cognitive computing solutions like IBM’s Watson. That combination actually could be better than the psychic approach. (If you recall, the entire movie was about how it failed.)

There Is an App for That – or Should Be

The idea for this came from a note I got from John Byrnes, writing for the Center for Aggression Management, which has a scientifically backed process that reliably identifies people who are about to do something violent. This truly drifts into Minority Report territory, because it is about violence in general: It would include those planning to shoot up schools or places of employment (or ex-employment), or to attack spouses.

The process looks for behaviors that would be evident in the lead-up to an attack or crime — while criminals are gathering information, selecting targets, and planning — and even early in the execution process.

The thing is, to make this work, we need folks who know what to look for, and a way for those folks to alert the authorities to take action.

There could be an app for that. People would use their cellphones to take pictures of anyone they thought was acting questionably, and the app would send the pictures to a central service where an AI, such as Watson, would use facial recognition to identify and profile them. If the captured behavior triggered a violent profile, law enforcement would be notified and the individual put under formal surveillance, with a flag level ranging from questionable to imminent threat of violence.
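Purely as an illustration, here is a minimal sketch of how such an app's back end might route a citizen report. Everything in it is an assumption of mine: the recognizer, profiler and notifier services, the 0-to-1 behavior score, and the thresholds that map to flag levels. None of it reflects a real Watson or law-enforcement API.

```python
# Hypothetical sketch of the reporting flow described above.
# All service names, scores, and thresholds are illustrative assumptions.
from dataclasses import dataclass
from enum import Enum


class FlagLevel(Enum):
    NONE = 0
    QUESTIONABLE = 1
    ELEVATED = 2
    IMMINENT = 3


@dataclass
class Report:
    photo_bytes: bytes   # picture taken by the citizen's phone
    location: str        # where the behavior was observed
    note: str            # the reporter's description of the behavior


def assess_report(report: Report, recognizer, profiler) -> FlagLevel:
    """Run a citizen report through facial recognition and a cognitive
    profiling service, returning a surveillance flag level."""
    identity = recognizer.identify(report.photo_bytes)  # hypothetical call
    if identity is None:
        return FlagLevel.NONE

    # The profiler scores how closely the observed behavior matches a
    # violence-precursor profile; the 0-to-1 scale here is an assumption.
    score = profiler.score(identity, report.note)
    if score > 0.9:
        return FlagLevel.IMMINENT
    if score > 0.6:
        return FlagLevel.ELEVATED
    if score > 0.3:
        return FlagLevel.QUESTIONABLE
    return FlagLevel.NONE


def handle_report(report: Report, recognizer, profiler, notifier) -> None:
    level = assess_report(report, recognizer, profiler)
    if level is not FlagLevel.NONE:
        # Only flagged individuals reach law enforcement, with the
        # flag level attached so cases can be triaged.
        notifier.notify_law_enforcement(report, level)
```

The point of the sketch is the division of labor: the crowd supplies observations, the cognitive service handles identification and scoring, and law enforcement sees only the cases that cross a threshold.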

At the end of the year, the person who was instrumental in preventing the most crimes would be recognized — maybe with a presidential citation or a medal, and a cash award to be donated to a charity of the person's choice. A job offer from the FBI might even make sense, because a top skillset like that could be valuable full time.

Wrapping Up

I think the problem with the current approach to attacks like the one in Paris is that it treats us all like criminals who need to be monitored. Our privacy is violated as a consequence of a process — based on after-the-fact penalties — that doesn't work for this kind of crime.

Instead, I think we could use technology to make people part of the solution, focusing on those who fit a scientific profile for people intending violence and giving law enforcement a better chance of stopping crimes before they're committed.

Backed by something like Watson, I think there could be an app for that — and it could do a better job of catching upset, crazy or depressed people who are planning violence against schools, employers or government facilities. I don’t know about you, but I’m tired of being part of the problem and actually would like to be a bigger part of the solution.

Rob Enderle's Product of the Week

A couple of weeks ago, I was at IBM getting an update on Watson, and I heard how it now can capture, classify, and even direct programs that can modify behavior at a national scale.

The information was presented in a marketing context, and the capability would be incredibly useful for moving product or even changing the outcome of elections.

It hit me at the time that IBM should be using it to change the perceptions that surround IBM and Ginni Rometty — the company's CEO — and help get people to see IBM for what it is becoming, not what it was. One of the historic problems with firms like IBM is that they don't use their own technology aggressively enough, even when it could improve the value of every employee's stock options.

Here is a system — currently unique in the market — that once trained could do things like identify terrorists or other criminals early on, and maybe help influence them not to take the steps that end up with them and a lot of us dead.

This is an incredibly powerful cognitive computing tool, and according to IBM, every major CEO in the world has gone to see it, is in the process of going to see it, or is planning to go see it — and every one of them has been amazed at what this early step into artificial intelligence could do. The system apparently has advanced a great deal from when it won Jeopardy! and could end up saving the world. As a result, IBM's Watson is my product of the week.

Rob Enderle

Rob Enderle is a TechNewsWorld columnist and the principal analyst for the Enderle Group, a consultancy that focuses on personal technology products and trends. You can connect with him on Google+.
