Monday, May 30, 2016

Terrorist or tourist: How do you tell the difference?

Israel enjoys a tourist bonanza. Each year, more than three million people travel to Israel. Almost all of them arrive through the country’s main international gate of entry, Ben Gurion Airport.

But along with these tourists come many others: government officials, diplomats, businessmen and women, Israeli citizens returning from personal trips, Jews making aliyah from all around the world and, most important for Israel’s security, a number of anti-Israel operatives who intend to harm the Jewish state. This hostile group includes not only anti-Israel advocates and academics who would harm Israel through words. It also includes terrorists who come to kill Jews.

If your job is to help keep Israel safe, how do you catch these terrorists before they leave the airport? How do you spot them?

When you add up all the arrivals landing at Ben Gurion each year, you’ve got close to four million faces to look at. If those arrivals were spread evenly across the year (which, of course, doesn’t happen in real life), that works out to roughly 11,000 faces to look at every day. How do you spot potential terrorists in such a crowd?
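To make the arithmetic explicit, here’s a quick back-of-the-envelope sketch in Python (the four-million figure and the even daily spread are simply the assumptions from the paragraph above):

# Rough estimate of the daily screening load at Ben Gurion Airport.
# Assumes about four million arrivals a year, spread evenly over the year
# (a simplification; real traffic is seasonal).
annual_arrivals = 4_000_000
days_per_year = 365
arrivals_per_day = annual_arrivals / days_per_year
print(f"About {arrivals_per_day:,.0f} faces to screen every day")  # about 10,959, i.e. roughly 11,000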

Like other security forces, Israel currently uses a technology that can compare the face of an arrival against an existing database of known criminals and suspects. But the problem with this technology is that a face has to be in the database in the first place. You cannot flag someone to stop if his face is not already on file.
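For readers curious about what ‘comparing a face against a database’ means in practice, here is a minimal, generic sketch of watchlist matching in Python. This is an illustration only, not the system actually used at Ben Gurion; the numeric ‘embeddings’ and the 0.8 threshold are assumptions. The point it makes concrete: if nobody similar is on file, the search finds nothing and no alert is raised.

import math

def cosine_similarity(a, b):
    # Similarity between two face "embeddings" (lists of numbers produced
    # by a face-recognition model); values near 1.0 mean a near-identical face.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def check_against_watchlist(arrival_embedding, watchlist, threshold=0.8):
    # Compare the arrival against every face already in the database.
    # Return the name of the best match above the threshold, or None.
    best_name, best_score = None, threshold
    for name, known_embedding in watchlist.items():
        score = cosine_similarity(arrival_embedding, known_embedding)
        if score > best_score:
            best_name, best_score = name, score
    return best_name  # None means the face is not on file, so nothing is flagged

A first-time attacker with no record simply isn’t in the watchlist, so the function returns None and no flag goes up. That is exactly the gap the Paris example below illustrates.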

This requirement creates security problems. For example, only three of the eleven terrorists who attacked Paris in November 2015 had criminal records (“Terrorist or criminal? New software uses face analysis to find out”, israel21c, May 29, 2016). Before the attacks, there was no way to identify any of them as ‘terrorists’.

Now a new company in Israel could change that. The company is called Faception. It’s developed a technology that can tell if a face belongs to a ‘terrorist’.

The technology isn’t perfect. But even at this early stage, it seems to work with better than 80 percent accuracy (ibid).

Faception is a high-tech Israeli start-up. It claims to be first-to-technology and first-to-market with proprietary computer vision and machine learning technology that profiles people and reveals their personality based only on their facial image (Faception, Homepage).

It claims that its technology was able (through back-testing after the attacks) to identify nine of the Paris attackers as ‘potential terrorists’ (ibid). It claims it’s already working with one homeland security agency. It’s looking for more business.

Here’s a corporate video. It’s 2:02 long:

Faception believes its technology can make us all safer. But this technology has a problem. It depends upon profiling.

The United Nations doesn’t like profiling. It defines profiling as “Any action undertaken for reasons of safety, security or public protection that relies on stereotypes about race, [color], ethnicity, ancestry, religion, or place of origin rather than on reasonable suspicion” (UN Working Group of Experts on People of African Descent, 2008). Using that definition, a ‘facial profiling’ system might be called just another ‘stereotyping’ method, one as superficial as judging by skin color or ethnicity.

In America, that kind of profiling could be a civil rights violation. Racial profiling has already been called a violation of the 14th Amendment (Faye V. Harrison, “Racial Profiling, Security, and Human Rights”, University of Florida, no date, p 1). Will facial profiling be considered the same as racial profiling?

Profiling individuals based on ‘superficial’ criteria is regularly attacked by Human Rights organizations, which have condemned it (“Threat and Humiliation: Racial Profiling, Domestic Security, and Human Rights in the United States”, Amnesty International, October 1, 2014). The UN joins them. It calls on all nations to end all forms of profiling (“States must step up efforts to counter racial and ethnic profiling – UN rights expert”, UN Human Rights Office of the High Commissioner, ohchr, June 30, 2015).

Faception offers a technology to help protect us. But that technology could provoke a Human Rights storm (Harrison, ibid). What do you do about a technology that could catch a terrorist before he commits an act of terror, but that causes a Human Rights uproar? Put another way, what’s more important, individual human rights or national security?

Personally, I’ve got only one answer to these questions. It’s an answer that comes from my own experience here in Israel: with terror attacks occurring almost daily, I don’t mind being stopped because of how I ‘look’.

I want that public police activity. It sends a message: we are stopping people. You might be one of them.  

So far as I’m concerned, I don’t have a problem with that. If there’s a new technology that can keep terrorists wondering when they’ll get stopped, I’m for it. I don't mind a little inconvenience to stop a killer.

Do you?

