- Snap issued with preliminary enforcement notice over potential failure to properly assess the privacy risks posed by its generative AI chatbot ‘My AI’
- Investigation provisionally finds Snap failed to adequately identify and assess the risks to several million ‘My AI’ users in the UK, including children aged 13 to 17.
The Information Commissioner’s Office (ICO) has issued Snap, Inc and Snap Group Limited (Snap) with a preliminary enforcement notice over potential failure to properly assess the privacy risks posed by Snap’s generative AI chatbot ‘My AI’.
The preliminary notice sets out the steps which the Commissioner may require, subject to Snap’s representations. If a final enforcement notice were to be adopted, Snap may be required to stop processing data in connection with ‘My AI’. This would mean not offering the ‘My AI’ product to UK users pending Snap carrying out an adequate risk assessment.
Snap launched the ‘My AI’ feature for UK Snapchat+ subscribers in February 2023, with a roll-out to its wider Snapchat user base in the UK in April 2023. The chatbot feature, powered by OpenAI’s GPT technology, marked the first example of generative AI embedded into a major messaging platform in the UK. As at May 2023, Snapchat had 21 million monthly active users in the UK.
The ICO’s investigation provisionally found that the risk assessment Snap conducted before it launched ‘My AI’ did not adequately assess the data protection risks posed by the generative AI technology, particularly to children. The assessment of data protection risk is particularly important in this context, which involves the use of innovative technology and the processing of the personal data of 13 to 17 year old children.
The Commissioner’s findings in the notice are provisional. No conclusion should be drawn at this stage that there has, in fact, been any breach of data protection law or that an enforcement notice will ultimately be issued. The ICO will carefully consider any representations from Snap before taking a final decision.
“The provisional findings of our investigation suggest a worrying failure by Snap to adequately identify and assess the privacy risks to children and other users before launching ‘My AI’.
“We have been clear that organisations must consider the risks associated with AI, alongside the benefits. Today’s preliminary enforcement notice shows we will take action in order to protect UK consumers’ privacy rights.”
– John Edwards, Information Commissioner
The issuing of this preliminary enforcement notice follows an ICO reminder to companies developing or using generative AI that they should be considering their data protection obligations from the outset. The ICO has issued advice to developers and users of generative AI on the issues that they must consider and will continue to scrutinise the compliance of products and services introduced to market.
Notes to editors
- The Information Commissioner’s Office (ICO) is the UK’s independent regulator for data protection and information rights law, upholding information rights in the public interest, promoting openness by public bodies and data privacy for individuals.
- The ICO has specific responsibilities set out in the Data Protection Act 2018 (DPA2018), the United Kingdom General Data Protection Regulation (UK GDPR), the Freedom of Information Act 2000 (FOIA), Environmental Information Regulations 2004 (EIR), Privacy and Electronic Communications Regulations 2003 (PECR) and a further five acts and regulations.
- Under the UK GDPR organisations are required to carry out a data protection impact assessment (DPIA) where the processing of people’s data is likely to result in a high risk to their rights and freedoms. A DPIA is a process designed to help organisations systematically analyse, identify and minimise the data protection risks of a project or plan. It is a key part of accountability obligations under the UK GDPR, and when done properly helps assess and demonstrate how to comply with data protection obligations.
- The UK GDPR singles out some types of personal data as likely to be more sensitive and gives them extra protection. Examples of special category data include: data concerning health, sex life and sexual orientation; data revealing racial or ethnic origin; and religious beliefs.
- A preliminary enforcement notice provides the controller with the opportunity to make representations before a notice is issued.
- An enforcement notice mandates action to bring about compliance with data protection law. It sets out: where specific corrective action is required and why; the action that needs to be taken; the timescale; how to report the action that has been taken; and the right to appeal.
- The ICO can take action to address and change the behaviour of organisations and individuals that collect, use, and keep personal information. This includes criminal prosecution, civil enforcement and audit.
- To report a concern to the ICO, call our helpline on 0303 123 1113 or go to ico.org.uk/concerns.