On Tuesday, Tim Cook presented the new iPhone X during the annual Apple keynote in Cupertino, California. The anniversary edition of Apple’s smartphone comes with a handy new feature: a face scan to unlock the phone. To technophiles this sounds like a revolution; data protection advocates, on the other hand, call for caution. But the technique is not as new as Apple suggests: face-scanning systems have been used by intelligence services for several years, for instance in the fight against terrorism.

Facial recognition software is part of everyday life: when people want to tag their friends on Facebook, for example, the website’s software recognizes the other persons in a photo immediately. Police and intelligence services can use the same principles and algorithms to find suspects in any public space that is surveilled by cameras. Security forces expect criminal investigations to be solved more often and more quickly, but also hope to identify foreign agents or terrorists. A recent example comes from Germany: Berlin police installed surveillance cameras inside a metro station for a six-month trial period to find out whether the cameras can help in recognizing criminal suspects. The data can also be used to detect crimes or other dangerous situations (e.g. unattended luggage or pickpocketing) while they are still in the making.
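
To make the matching principle concrete, here is a minimal sketch using the open-source face_recognition library for Python; the file names and the 0.6 tolerance are illustrative assumptions, not part of any deployed tagging or police system.

```python
# Minimal sketch: matching a face from a camera frame against a known reference photo.
# Uses the open-source face_recognition library; file names and tolerance are assumptions.
import face_recognition

# Load a reference photo of a known person and a frame from a camera feed.
known_image = face_recognition.load_image_file("known_person.jpg")
camera_frame = face_recognition.load_image_file("camera_frame.jpg")

# Compute 128-dimensional face encodings (one per detected face).
known_encodings = face_recognition.face_encodings(known_image)
frame_encodings = face_recognition.face_encodings(camera_frame)

if known_encodings and frame_encodings:
    known_encoding = known_encodings[0]
    for i, candidate in enumerate(frame_encodings):
        # A lower distance means a closer match; 0.6 is the library's default tolerance.
        distance = face_recognition.face_distance([known_encoding], candidate)[0]
        match = face_recognition.compare_faces([known_encoding], candidate, tolerance=0.6)[0]
        print(f"Face {i}: distance={distance:.3f}, match={match}")
else:
    print("No faces detected in one of the images.")
```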

German intelligence services are allowed to ask other authorities for personal information such as bank accounts, fingerprints, or travel behavior. If facial recognition software is installed in many public places, intelligence services could gain detailed information about every citizen, not only about suspects. After recent attacks in nightclubs, supermarkets, and public spaces in general, the German parliament passed a law to improve public surveillance (the “Law for the Improvement of Video Surveillance”) in March 2017. With that measure, the state wants to guarantee comprehensive security in public, especially in areas where many people gather, such as train and bus stations as well as sports stadiums.

But the question of data security arises quickly – what will happen with the scanned faces? In the case of Apple’s software, the data is saved only on the device itself, in a specially protected area, so that it is secured against hacking attacks. Companies like Facebook could sell the facial data they receive for commercial purposes, as they already do with data on internet search behavior. The information saved by police and intelligence services, however, has to be shared somehow; otherwise it cannot fulfill the tasks it was collected for. Several layers of security are needed to provide the highest possible protection for this information and to keep private data from being given away.

Another problem besides data protection is the limited accuracy of the facial recognition software in use. A 2012 study analyzed mug shots to test the precision of commonly used software, and the result was alarming: the three tested algorithms performed five to ten percent worse on African Americans than on white persons. This means that if an African American commits a felony, the chances are higher that innocent persons become subjects of criminal investigations. This is an extremely sensitive issue, since a false match can amount to character assassination. And in counterterrorism, investigating the wrong person while missing the actual terrorist can even have fatal consequences.
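
To illustrate why even a few percentage points matter at scale, the following back-of-the-envelope sketch compares how many innocent people would be flagged under two hypothetical false match rates; the rates and the number of comparisons are invented placeholders, not figures from the 2012 study.

```python
# Illustrative arithmetic only: how a small accuracy gap scales with many comparisons.
# The rates and comparison counts below are invented placeholders, not study results.

def expected_false_matches(false_match_rate: float, comparisons: int) -> float:
    """Expected number of innocent people flagged, given a per-comparison false match rate."""
    return false_match_rate * comparisons

# Hypothetical per-comparison false match rates for two demographic groups (placeholders).
rate_group_a = 0.010   # 1.0% of comparisons wrongly flag an innocent person
rate_group_b = 0.015   # 1.5% for the group the software handles worse

comparisons = 100_000  # faces checked against a watch list over some period (assumption)

for name, rate in [("group A", rate_group_a), ("group B", rate_group_b)]:
    print(f"{name}: ~{expected_false_matches(rate, comparisons):.0f} innocent people flagged")
```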

Facial recognition is already part of everyday life – sometimes for entertainment, other times for security purposes; sometimes by people’s own choice, other times without anybody noticing it. No matter what information is transmitted by facial recognition software, it must be treated with the highest level of security.