

Yesterday the UK fined Clearview AI £7.5 million for a string of breaches of local privacy laws. The main reason for the fine is that Clearview has never asked individuals whether it can use their selfies in an AI-based identity-matching service, which it sells to entities such as law enforcement. In the past the company has also been fined by privacy watchdogs in Italy, France and Australia.

Meanwhile, a Polish facial recognition company called PimEyes is doing exactly the same thing as Clearview AI: it scrapes images from the web for use in an identity-matching service. The difference between this company and Clearview AI is that PimEyes does not scrape images from social media websites. Furthermore, the service that Clearview AI offers is only available to law enforcement agencies, whereas the service PimEyes offers can be used by anyone (provided they pay). Other than that, there are no real differences between the two companies. Like Clearview AI, PimEyes does not ask permission before scraping publicly available images either. So why is it, then, that Clearview AI receives multimillion-dollar fines from multiple countries, whereas PimEyes has only been investigated once, by a German data watchdog?

We first issued this alert and have updated it to reflect more recent developments. The UK Information Commissioner's Office (the ICO) fined Clearview AI Inc. (Clearview) £7,552,800 in May 2022 for data protection law infringements in using images of people for its AI product. Clearview collected images from the internet and from social media to create a global online database that could be used for facial recognition. The case is yet another reminder of the conflicts between AI and the GDPR.

Clearview provides a service that allows customers, including the police, to upload an image of a person to Clearview's app, which is then checked for a match against all the images in Clearview's database. The app then provides a list of images that have similar characteristics to the photo provided by the customer, with a link to the websites those images came from.
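To make the mechanics concrete, here is a minimal sketch of how an identity-matching lookup of this general kind can work: a query photo is reduced to a fixed-length embedding vector and compared against stored vectors by cosine similarity, and the closest entries are returned together with the URL each image was scraped from. Everything below (the embedding dimension, the vectors, the URLs) is an invented placeholder for illustration, not Clearview's actual system.

```python
# Illustrative identity-matching lookup: cosine similarity between a query
# embedding and a database of embeddings, each tied to a source URL.
# The random vectors stand in for the output of a face-embedding model;
# all names and URLs here are invented for this sketch.
import numpy as np

rng = np.random.default_rng(0)
EMBEDDING_DIM = 128  # face embeddings are typically fixed-length vectors

# Hypothetical database: one placeholder embedding per scraped image,
# plus the website the image came from.
db_vectors = rng.normal(size=(5, EMBEDDING_DIM))
db_sources = [
    "https://example.com/profiles/1",
    "https://example.com/profiles/2",
    "https://example.com/profiles/3",
    "https://example.com/profiles/4",
    "https://example.com/profiles/5",
]

def top_matches(query, vectors, sources, k=3):
    """Return the k database entries most similar to the query embedding."""
    # Normalising both sides turns the dot product into cosine similarity.
    q = query / np.linalg.norm(query)
    v = vectors / np.linalg.norm(vectors, axis=1, keepdims=True)
    scores = v @ q
    best = np.argsort(scores)[::-1][:k]
    return [(sources[i], float(scores[i])) for i in best]

# Simulate a customer's query: reuse one stored vector with a little noise
# so the corresponding "match" stands out in the ranked output.
query = db_vectors[1] + 0.05 * rng.normal(size=EMBEDDING_DIM)
for url, score in top_matches(query, db_vectors, db_sources):
    print(f"{score:.3f}  {url}")
```

A production system would replace the random vectors with the output of a real face-embedding model and the brute-force scan with an approximate nearest-neighbour index, but the lookup logic is essentially the same.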
According to the ICO, Clearview collected more than 20 billion images of people's faces, along with associated data, from publicly available information on the internet and social media platforms all over the world to create an online database. The affected individuals were, however, not informed that their images were being collected or used in this way.

Although (according to the ICO) Clearview no longer offers its services to UK organizations, Clearview has customers in other countries, so it is still using the personal data of UK residents. Because of the high number of UK internet and social media users, Clearview's database is, according to the ICO, likely to include a substantial amount of data from UK residents, which has been gathered without their knowledge. As the Information Commissioner, John Edwards, also put it, that Clearview "not only enables identification of those people, but effectively monitors their behaviour and offers it as a commercial service" is unacceptable.

The ICO decided that Clearview had infringed UK data protection law because it had:

- failed to use the information of individuals in the UK in a way that is fair and transparent, given that individuals were not made aware, and would not reasonably have expected, that their personal data would be used in this way;
- failed to have a lawful reason for collecting the personal data of individuals;
- failed to have a process in place to stop the data being retained indefinitely; and
- failed to meet the higher data protection standards required for biometric data, which constitutes sensitive/special category personal data.
