Please see below the speech as just delivered by Executive Vice-President Margrethe Vestager at Internet Week Denmark, 4 May 2021.
Introduction
Good morning and thank you for the invitation to speak. I’m very pleased to be here today, especially in the light of all that is happening in Brussels and around Europe, when it comes to the digital transition. This more European perspective is what I want to bring to the discussion. So I will start with a question:
Do Europeans trust technology? Do they trust how their data is being used in the virtual world? As it happens, there are two recent Eurobarometer surveys that shed some light on these questions: one on willingness to share personal data in order to improve public services, and another on whether Europeans would feel comfortable travelling in a fully automated car.
Of course, those are very different aspects of technology, but they do intersect. If our self-driving cars rely on a data interface that is stored in the cloud, does this mean we are creating a permanent record of every European’s movements in such a car, every second of the day? And if so, who will have access to that record? Data protection has always been an important concern, yet the ease with which data can be harvested and stored in the digital age adds new potency to the debate.
What is interesting is that in both cases – the self-driving car and the data sharing – the level of trust varies enormously across countries. In Denmark or Ireland people are roughly twice as trusting of technology compared with, for example, France or Latvia. There could be many underlying reasons for these differences, but it is quite a gap!
The other big result in these surveys is that overall, the level of trust is actually still quite low. Over a third of Europeans said they would not be willing to share their personal data to improve public services, for any reason. And only 25% would trust a ride in a self-driving car! Whether it’s through vaccine hesitancy, concerns over mass surveillance, or worries over the security of online banking, Europe’s ‘trust gap’ is a major issue.
If we don’t manage to close this gap, we risk creating an even bigger digital divide in Europe; one that can weaken our Union. Low levels of trust will also make us slower to adopt frontier technologies, and that would mean consumers miss out on the benefits, while our businesses miss the first-mover advantage.
So what can we do to keep that from happening?
Digital Skills – an enabler of digital trust
First, let’s take a closer look at these surveys. What we can see is that the countries where trust in technology is lowest are also the places where digital skills lag behind. It makes sense: who among us would trust something we don’t even know how to use?
But in a way it’s also good news, because it suggests we can help build trust in technology by supporting skills acquisition. And when it comes to digital skills, there certainly is a lot to be done, in all EU countries. Even here in Denmark, six out of ten students still only have basic skills when they start work; and 59% of Danish SMEs report difficulties in recruiting – and keeping – ICT specialists.
The European Recovery Plan includes significant funding instruments which can help fill these gaps. 20% of the total envelope – over 120 billion euros – is earmarked for the digital transition – and that includes spending on digital learning. Our Skills Agenda sets out the goals we want to achieve: 70% of European adults should have at least basic digital skills by 2025. We now need to make sure we deliver on this. The European Commission cannot do this alone. We need every Member State to assess its current state of play and ensure that today’s generation of high-school students will have more opportunities and become the ICT specialists we will need in less than ten years’ time.
Not only can digital upskilling help us build trust, but it is also a way for us to bridge the ‘digital social divide’. This, in turn, is a way to address inequality. The response to the pandemic has only made this more urgent. For example, when you see how school closures can affect children from disadvantaged backgrounds, you quickly realise how important digital skills and digital infrastructure are to achieving social justice in the new economy.
Strengthening the Single Market to build trust
There’s another big way for Europe to build more trust in technology, and that is by ensuring digital marketplaces stay competitive. Many Europeans are wary of so-called ‘Big Tech’. They worry about what is happening to their data, when a handful of big companies seem to know everything about them. They worry about large tech companies having the power to squeeze out new competitors, and even to use that power in a way that interferes with our democracies.
These are legitimate concerns. We cannot allow a few companies to decide what choices consumers can have, or decide what is newsworthy, or use their algorithms to steer the public debate.
The Commission is taking important steps to ensure big digital players play fair. In the first place, we will continue to use our competition policy instruments to full effect, in order to protect consumers and ensure a level playing field – whether that is to do with mergers, with antitrust, or with clamping down on unfair tax advantages that companies might receive in some Member States.
Beyond this, we have proposed a ‘Digital Markets Act’, a new legislative tool which complements our existing competition enforcement efforts. It targets a small circle of so-called digital gatekeepers. By setting out, ex ante, what we expect of them, and what we will not tolerate, we want to build online markets that are open and fair to all kinds of businesses. Markets that give consumers a genuine choice. Markets consumers can trust.
There is another aspect of the Single Market that is relevant to building trust: as I see it, trust is a thing that can travel. When Danish companies are free to seek opportunities in other European markets, they bring with them the tools to foster the greater level of trust we enjoy here. That’s because when we trade together, we learn from each other. It’s an important way in which Europe has always been able to bring about convergence, and I’m convinced it can work in the Digital Decade too.
Trust in Digital Public Services
Of course, the public sector itself has an important role to play in fostering greater trust. After all, trust is a two-way street: If we want citizens to trust our digital tools, we have to use the new technologies to hold ourselves to the highest standard of accountability. E-Government can deliver on this promise, by increasing public accountability and transparency among public service providers. Again, there is an opportunity here for Member States like Denmark to share good practices with other EU countries, where trust in public institutions has traditionally been lower.
Beyond that, it is up to us to build electronic public services to the highest possible standards, especially when it comes to privacy and data protection. Europe’s data protection laws are already among the toughest in the world. The General Data Protection Regulation has led the way for personal data protection laws across the world.
And we want to harness that to provide citizens and businesses with digital services they can rely on to work safely and efficiently. That is the thinking behind the new European electronic identification, which we will be rolling out in the coming years.
Safeguarding digital rights
None of this is to say that we are being naïve. Sometimes there are good reasons for our citizens to worry about what technology can do, if it is not regulated in a transparent and accountable way. I was quite young when I first read the book ‘1984’. Back then, when people talked about Big Brother, or the risk of an ‘Orwellian society’, it always seemed like science fiction. But in recent years, we are hearing this kind of warning more and more – whether in relation to vaccine passports or the social credit system in China. Some of the risks are exaggerated, of course. But we do have to make a distinction between things that could work, if done the right way, and things that our free and democratic societies are absolutely unwilling to tolerate, such as discrimination or mass surveillance.
It starts with defining and protecting digital rights. We want digital products to ‘do what they say on the box’. This is why, last year, we passed a new Digital Content Directive, which will make software products live up to their promise. And we have now proposed a new Digital Services Act, which protects digital consumers by fighting unsafe and counterfeit goods, and by ensuring consumer rights are as protected in the digital marketplace as they are in the physical world. There is a Single Market aspect to this too – having common rules across the EU means Europeans can trust that everyone is playing by the same rules.
But perhaps the biggest issue comes back to the question of data sharing. In whatever sphere – as consumers, as citizens, or simply socially – every European must have the confidence of knowing that she controls her own data. We want transparency, and we want citizen control, as basic principles on which regulation must build.
This will form an important part of the new Declaration of Digital Principles, an instrument for us to promote and uphold European values in digital space. It’s an opportunity to refresh our commitment to the ideas of the Enlightenment, the intellectual heritage on which Europe is built, and which is just as important today as it was 300 years ago. Concretely this will include digital rights like freedom of expression, and digital principles like the need for a secure and trusted online environment.
It will also address ethical principles for human-centred algorithms; something I think is essential if we are to safely and confidently adopt new technologies like Artificial Intelligence. It’s clear there can be absolutely no place in Europe for AI which discriminates on the basis of gender, race or sexual orientation.
This is among a number of important points that are covered in our new package on Artificial Intelligence, which was adopted just last month. It sets out when AI can be considered high-risk, for example when it is used to decide whether you will be offered a job, or granted a loan. It also sets out areas where AI is wholly unacceptable, for example social scoring.
Conclusion
This is a process. One which will take time, and will evolve as the technology itself continues to evolve. Because as we all know from our personal relationships, trust is something you build over time.
And it is also something that has to be earned. If we want to meet our digital goals in the new decade, we need to be open and clear about where the problems are, and work together to address them. We need to treat European citizens like grown-ups, to listen to their concerns and do what it takes to address them, even if that means more work for us.
It reminds me a little of a quote from one of the Founding Figures of modern computing, the great Alan Turing, who said:
“We can see only a short distance ahead, but we can see plenty there that needs to be done.”
So, let’s get to work!
Thank you.