On 26 May, the UK's Information Commissioner's Office (ICO) joined an increasingly long list of regulators - including in Canada, Australia, Italy and France - in ordering the company to stop collecting data about its residents.

Clearview AI "scrapes" images of people's faces from publicly available sources and derives biometric information from each picture using its proprietary facial recognition algorithm. The company says it has amassed over 20 billion facial images in this way.

Clearview AI enables its clients - mostly law enforcement agencies - to upload a picture of someone's face, which can then be matched against an entry in its database. The company then provides a link to the source of the image, which should identify the person in the client's uploaded photo.

"Agencies that use our platform can expect to receive high-quality leads with fewer resources expended," Clearview AI states on its website. "These leads, when supported by other evidence, can help accurately and rapidly identify suspects, persons of interest, and victims to help solve and prevent crimes."

But Clearview AI's business model is highly controversial, and the company is subject to multiple legal challenges from people who are trying to stop it.

Lucie Audibert, a lawyer and legal officer with the campaign group Privacy International, was involved in a complaint against Clearview AI that was considered as part of the UK ICO's investigation into the company.

"They keep arguing that their tech is useful to law enforcement - but that's not a reason to close your eyes on the harms and the major societal shift that this tech can cause," Audibert told me. "Courts have, time and again, found mass surveillance systems unlawful and in violation of fundamental rights, even though they were very useful to law enforcement and intelligence agencies."

Investigations Under the GDPR

Clearview AI has faced complaints in at least six European countries, where its methods are allegedly illegal under the General Data Protection Regulation (GDPR). European regulators have invariably agreed that Clearview AI's methods are incompatible with EU law and have imposed a range of sanctions, some more severe than others.

In Hamburg, for example, the state's data protection authority found in January 2021 that Clearview AI had unlawfully processed an individual's data by deriving biometric information from his facial image. But the regulator's decision was only to order Clearview AI to delete the complainant's biometric data - not the image itself.

In March, the Italian data protection authority fined Clearview AI €20 million, banned the company from processing any images or biometric data of people in Italy, and required it to delete the information it already possessed.

The recent decision from the UK's ICO involved a similar set of orders to the Italian regulator's, along with a £7.5 million (€10 million) fine against Clearview AI. However, Clearview AI maintains that, because it does not offer its services in Europe, neither the ICO nor any other European regulator has the power to impose such sanctions or penalties.

Doing Business in Europe

Clearview AI has catered to European clients in the past. The Swedish regulator fined the country's police authority €10,000 for using Clearview AI unlawfully. Police services in the UK have also used Clearview AI on a "free trial" basis.

But the company claims that it no longer accepts European customers and thus does not fall under the jurisdiction of European regulators.

Legally speaking, Clearview AI is arguably missing the point.
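The search workflow described above - upload a photo, match it against a database of scraped faces, return a link to the source image - can be caricatured as a nearest-neighbour lookup over face "embeddings" (fixed-length numeric vectors that a trained neural network derives from a facial image). The sketch below is purely illustrative: the vectors, names, URLs and similarity threshold are all invented, and real systems index billions of vectors with specialised data structures rather than a linear scan.

```python
# Toy nearest-neighbour lookup over face embeddings.
# All data below is invented for demonstration purposes.
import math

# Hypothetical index: embedding vector -> source URL of the scraped image.
INDEX = [
    ([0.11, 0.92, 0.40], "https://example.com/profile/alice"),
    ([0.80, 0.10, 0.55], "https://example.com/profile/bob"),
    ([0.09, 0.88, 0.47], "https://example.com/gallery/alice-2"),
]

def cosine_similarity(a, b):
    """Similarity of two vectors: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norms = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norms

def best_match(query, threshold=0.95):
    """Return the source URL whose embedding is most similar to the
    query embedding, or None if nothing clears the threshold."""
    best_url, best_score = None, threshold
    for embedding, url in INDEX:
        score = cosine_similarity(query, embedding)
        if score >= best_score:
            best_url, best_score = url, score
    return best_url

print(best_match([0.10, 0.90, 0.42]))  # closest to the "alice" entry
```

The threshold matters: set it too low and the system returns confident-looking links for faces it has never seen, which is one reason critics object to such leads being treated as identifications rather than hints.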