
Keynote at PDP’s 19th annual data protection conference

I began my career as an archivist, and I’ve always had a passion for records, for dusty papers tucked away in a basement. We tend to see archives as a way of reviewing the past, of being able to find a different contextual perspective on a period of history. But what always strikes me when I spend any time reading through historic documents is how much they inform contemporary challenges.

The unique events we face aren’t always quite as unprecedented as we think.

Everything changes and nothing changes.

And so today I’m going to talk about some of the data protection questions we’ve been asked to engage with across a unique year. And I’m going to set out how the answers to a lot of those questions are generally the same as they have ever been.

I’ll talk about what we might consider when we move out of the pandemic, and society takes stock of new priorities and new considerations.

I’ll give a quick overview of where the ICO sits in this, and our evolution as a regulator.

And I’ll end by looking ahead, and considering how we might prepare for an uncertain future.

But first, since I've said the past informs how we approach the present, let's start with what we can learn from an Edwardian English novelist born in 1879.

I was recommended EM Forster’s short story The Machine Stops by a member of my book club, who suggested that despite it being over a century old, I might find it unnervingly prescient.

She was right.

Forster describes a world where people live in pods, and interact only via digital screens.

It is a society in which all music, all learning, all healthcare, all communication with friends and family happens digitally, with a sentient digital assistant managing the information flow.

People don’t entirely understand what goes on behind those screens – how they are connected with other people, how their information if used – but they use it because the Machine makes their lives easier.

I’m sure there’s plenty you recognise from aspects of our world today.

That entwined, interconnected relationship between society and technology has never been clearer than during this pandemic.

The pandemic has changed society, not just in the UK but globally. Technology and innovation responded to that change: whether playing a crucial role in keeping us in touch with friends and family, or helping us to continue to attend health appointments or conferences like this.

We’ve seen an acceleration in the uptake of digital services that we would otherwise have expected to take years.

This accelerated progress brought new questions. Organisations asked how their staff could work on sensitive data from home, how best to collect customer details for contact tracing, and whether they could test staff for coronavirus.

We know how hard the data protection community has been working to support their businesses and organisations in continuing to operate effectively.

And we’ve been pleased that our timely and pragmatic advice in answering questions and providing guidance has played a part in that.

The pandemic has brought new questions – how many employers started the year planning to temperature check every employee as they arrived for work, I wonder? But the interesting point here is that the answers my office has provided will have been pretty recognisable to all of you data protection experts.

The same themes come up.

Are you being transparent with people about what you want to do with their data and why?

Are people being treated fairly, and in ways they would expect?

A lot of this comes down to accountability, a topic I’ll return to shortly.

It’s important too to consider that technology changes fast, but society’s views shift too. Before lockdown, would people have been content with a representative of the government phoning them up to take a list of everyone they’ve been in contact with? Or a supermarket being given their health status?

COVID-19 has brought an increased pragmatism: people's attitudes have responded to a threat to their health. As the pandemic passes, we should expect to see society's attitude shift again, but we shouldn't assume the dials will return to where they were at the start of 2020.

This is a theme that I believe will characterise the next year, not just across data protection, but far more broadly:

As we emerge from the pandemic, and as we respond to potential financial pressures, we will see society taking stock across a range of issues.

From our relationship with the NHS to our relationship with our employers, people will want to reassess where their priorities lie, and what balance needs to be struck between liberty, privacy, innovation and prosperity.

These are interconnected discussions. Again, to best understand the present, we should consider the past.

Whenever I stop to think about how data protection has changed even just in the past five years, I’m reminded of a partner at a law firm who confessed to me that he only started specialising in data protection and privacy because it seemed quiet and fairly straightforward. Needless to say, that is not his view now!

We are all busy. When I speak to other regulators around the world, I hear the same thing. And that’s because data protection has become so complicated and central to our lives. We’re dealing with tough, tough questions. And we are being asked to find the answers quicker.

And I don’t just mean the complexity of the law, or the technology. Ours is a principles-based law, that requires us to consider subjective questions of what is fair, what is reasonable, what is proportionate? There are tough questions we have to answer on behalf of society – I’m thinking of those ‘would people be happy if we did this with their data’ questions – that society itself hasn’t decided on yet.

If someone has their bag snatched in the street, are they happy for police to use facial recognition technology to track down the perpetrator?

When someone searches the web for information on their medical condition, how much explanation do they need around adtech?

How much privacy will people sacrifice to allow for greater movement and liberty during the pandemic?

And so often, as a community we are playing catch up, retrofitting the privacy implications of technology that is already in place.

This was the case with my office's work on our Age Appropriate Design Code – we are working to protect children's data within an internet that wasn't built with youngsters in mind.

It’s not been an easy piece of work, and the challenges – of retrofitting protections, of collaborating and liaising with so many stakeholders, of saying ‘just because you’ve done something for a while doesn’t make it ok’ – those are challenges I think many of you will recognise from your own work.

But it’s been a rewarding file for me personally, and that’s because I see where it makes a difference.

So often we deal with broad terms like citizens and consumers, and our work can feel a little removed or philosophical. But the kids code is an example of a file where I’ve spoken with parents about the very real impact bad data processing had on their children. I can see how this work affects the individual.

I see it in other cases too. How our work on the gangs matrix changes the life of a young man wrongly labelled on a database, for instance. And I’m sure you see it in your own work.

Data protection is deeply personal.

Perhaps the clearest example recently is the controversy around the algorithm used for A-level exam results. For an eighteen-year-old student, this could be their first engagement with data protection, with algorithms, with more opaque processing. How is that young person's attitude to data protection being shaped by this early experience?

Let’s jump briefly back to EM Forster, and his novella, The Machine Stops.

I don’t want to spoil how the story ends, but I would say it is a dystopian vision, and they tend to not have a happy ending!

Let me say this: the people's relationship with Forster's Machine – a machine that provides their communication, education and entertainment – is based not on trust, but only on the Machine's functionality. There's no explanation of how their information is used. And so when the functionality begins to fail, a backlash quickly gains momentum.

And that, I think, is a message our community can still learn from more than a hundred years on.

Participation in new business processes and innovation only happens at scale and at pace when it has the public’s trust and confidence.

Is your organisation's relationship with customers based only on functionality? Does your organisation's confidence in its innovation blind you to the importance of building customer trust? Then you'd best hope that functionality doesn't fail, because the backlash could come quicker than you think.

And that’s why your role, as data protection professionals is so important.

At its most basic, data protection is about protections for consumers, for citizens, for people. But the law was born in the 1970s out of a concern that the potential of emerging technology would be lost if society didn’t embrace innovation. The law reassures people they can support innovation, safe in the knowledge there are checks and balances in place to protect them, with an independent regulator who has their back.

And so our role as data protection professionals is to protect both people and innovation.

We need to show people how the machine works to build their trust.

Organisations need to be clearer about why an app needs their data, where their data is going, how an algorithm works, and what AI processing is going on.

As society takes stock of who it trusts, your work has never been so important.

I have a role to play in that. My office needs to be there to help you succeed. We need to make sure our work helps your organisation to protect people's data by following the law, and helps you to encourage people's trust in digital innovation, both private sector and public.

I think we have been able to do that more and more through an evolution in how we regulate. Let me give three examples.

Firstly, we are taking an approach that is more collaborative than ever, with regulators and organisations working together.

As a modern regulator, our approach is focused on working alongside organisations, helping them to make the changes and improvements needed to comply with the law, reducing mistakes and the misuse of people's data. Working to get it right at the outset results in better outcomes for businesses and customers.

This really is the work that we are largely set up to deliver – about three quarters of my staff work in roles of this type.

Examples of this work include working with public authorities and supermarkets, so they could share information to support people shielding during Covid-19. Our report into the extraction of data from the mobile phones of victims and witnesses set out expectations of the police that have since been accepted as a sensible and empathetic way forward. And on the access to information side, we have launched our Freedom of Information toolkit for public authorities.

Secondly, we have put a greater emphasis than ever on supporting innovation, that key aspect of why we have data protection.

In the past few months we have published guidance on how Artificial Intelligence can comply with the law, set out how we will support businesses to better protect children's data online, and confirmed our continuing support for innovators through partnership with other regulators. Our Sandbox continues to help organisations using personal data to develop innovative services, from the use of data to support student mental health and wellbeing at universities to an airport looking to use facial recognition to replace boarding cards.

Our advice and support focuses firmly on enabling innovation to happen: I hope it is clear that the days when data protection regulation was seen as a blocker to innovative business have long passed.

We will continue offering this support, with guidance scheduled on data sharing and accountability, and an information hub dedicated to helping SMEs.

And thirdly, we’re engaging more than ever with the wider regulatory community. I mentioned earlier how the difficult questions we get asked, the tough problems that fall on my desk, so often engage broader societal questions. We are at that intersection between tech, law and society.

And as data becomes less the trail that we leave behind us, and more the very medium through which we live our lives, so data protection becomes so, so broad.

This has a big impact on your work. I am sure many of you will have found aspects of your work overlapping with financial regulation, or content moderation.

As the Information Commissioner, my office cannot answer all the questions alone.

And so the ICO is now part of the UK Regulators Network, seeing the challenges other regulators are facing, and learning from each other.

We’re helping embed data protection by design into other regulators’ models of regulation, so that a holistic approach is conveyed to organisations. Our work around AI is a good example of this, as is our work with the CMA and Ofcom on digital regulation.

And there’s our international work. Chairing the Global Privacy Assembly, and building those global links to the benefit of people and organisations here in the UK.

I’ve covered today some of the data protection questions we’ve been asked across this unique year, and how everything changes and yet nothing changes.

I’ve spoken about what we might consider as society takes stock in a post COVID world. About the complexity of data protection, but its real value too, as an enabler of innovation, as a supporter of public trust.

And I’ve talked about the ICO’s own evolution.

I’d like to end by looking ahead, and considering how we might prepare for an uncertain future.

The first point to make here is that the ICO doesn't make the weather on data protection legislation.

My role is not to make or shape laws. Our government and Parliament will decide how we approach our legislation outside of the EU, how we pursue adequacy with the EU, and how we shape our relationship with the rest of the world.

What I can say is that the government has made a clear commitment to high data protection standards, equal to those of the EU, as part of an independent policy on data protection.

That reflects the international trend of ever higher standards of privacy protection, and it reflects the strong UK tradition of appreciating the value of data protection laws as an enabler of innovation.

What the ICO can do is help you navigate whatever winds of change may come. As I said earlier, our focus is on protecting individuals by supporting organisations to get their compliance with the law right. Our website provides guidance and tips, and we’ll continue to update it as we become aware of any changes organisations need to make.

And this is a two-way conversation. If there are aspects you need help with that we are not covering, then please get in touch and let us know.

And to those of you who want to get ahead of the curve, and prepare for what is round the corner, I’d say this:

I don’t have a crystal ball. But what I can say with confidence is that accountability is a sail that will never fail you, both domestically and internationally.

Take stock of what data you process, and consider the risks that processing is creating. This is a fundamental part of compliance with the GDPR, and indeed with modern data protection laws.

These are data protection concerns writ large as organisational concerns. If your CEO or chairman isn't across all the finer detail of how the organisation complies with data protection, they should at least be across the corporate obligations around accountability. When neglected, it is a business risk like any other.

The ICO’s accountability toolkit is the perfect starting point on this.

I’ll close with this: if history has taught us anything, it’s that old maxim that trust is hard won, and easily lost.

Accountability, transparency, and data protection more broadly, are fundamental to the relationship you have with your customers. The law exists to protect people, and to enable you to innovate with people’s trust.

That way the machine doesn’t stop.
