Doing dialogue: A talk at AVS 2019

Jack Stilgoe at AVS 2019. (Photo credit: AUVSI)

Last week I gave a talk to the Automated Vehicles Symposium in Orlando. This is the big annual meeting on all things self-driving, and it grows every year. This year, there were thousands of people there. Thousands. It was fascinating to observe and be a part of. This is what I said.

As a European visiting America, I’m aware that technological futures can turn out very differently in different places. For all the similarities, our governments are different, our industries are different, our cultures are different and our transport systems are different. Nothing is inevitable, and we shouldn’t pretend to know what the future of automated vehicles looks like. 

I’m interested in how we can have a better debate about the possibilities and uncertainties of self-driving vehicles. This is where technology meets democracy. Conversations in this space can often be one-sided, with the questions, answers and the terms of debate determined by the proponents of a particular technology. I want to make the case for a more balanced dialogue. 

I’m going to offer five lessons from social science research on past controversies about technology – particularly the controversy around genetically modified crops in Europe – and then offer some insights from a recent exercise in public dialogue that took place in the UK. 

At the end of the 20th Century, GMOs were a technology full of promise. Scientists were excited about the technical possibilities of more precise crop improvement, and companies saw clear economic opportunities. Alongside realistic proposals for incremental improvement ran hyped-up claims that the technology would benefit everyone, particularly the world’s poorest people. Some in Europe disagreed. So while GMOs have become a fact of life in the US, in much of Europe they can’t be consumed and can’t be grown. A public backlash meant that companies missed out on markets, scientists missed out on research opportunities, and farmers and consumers missed out on new innovations.

Until the GM crops controversy, a lot of scientists, tech developers and policymakers thought they understood public concerns about new technology. They thought that if people understood the science they would trust and accept the technology. We saw movements in the 1980s and 90s towards what became known as the ‘Public Understanding of Science’. The assumption was that to know science was to love it. This assumption was wrong. People, often the most educated people, were unwilling to just accept the answers that scientists were offering. They had their own questions. 

So this is the first lesson: Debates about new technology are never just about science and technology. This is especially true with technologies that don’t exist yet. People will understand the technology in a range of different contexts. 

This leads to the second lesson: People are citizens as well as consumers. If Automated Vehicles are going to change the world, people will want to have a say. We have already seen research on whether people are comfortable paying for or getting in an AV. This is only a small part of the picture. People will have their own questions, and they won’t just relate to whether or not the technology works as expected.

The third lesson: It’s about more than safety. With GM crops, the developers of the technology assumed that public concerns would be dominated by questions of risk – will it be safe to eat? In fact, people also had concerns about effects on the environment, the ownership of the technology, inequalities in terms of who would benefit and more besides. 

So the fourth lesson: People in power need to listen as well as talk. We need to understand what people’s real hopes and fears for AVs are. The uncertainties here are huge. We have heard a lot about standards for AV safety, but we still have no idea how safe is safe enough. Do people think being safer than a human driver on average is acceptable? My hypothesis would be that it is not, but we don’t know. Levels of acceptable risk can vary by orders of magnitude even among different transport modes. We don’t know whether people will have concerns about who owns AV data. We don’t know how people will balance values like privacy against convenience. We don’t know what people think about the interpretability of machine learning. We don’t know whether it matters to people if this is public transport or private; personal or shared. We don’t know how all of these things will vary from place to place. So we need to listen. But the conversation can’t end there. If innovators are going to ask people what they think, they need to respond; they need to say how they are going to change direction in response. Otherwise it is public engagement for engagement’s sake. 

The fifth and final lesson: Be clear on why you are doing public engagement. If it’s to sell a particular technology, or to lobby for policy change, be honest about that. People will see right through it if not. Is it to persuade or is it to empower? Is it to open up the debate to new perspectives or to close it down?

In the UK, we’ve been doing a large public dialogue exercise on behalf of the Government’s Centre for Connected and Automated Vehicles. It involved more than 150 members of the public in five locations around the UK, with each group meeting three times over a two-month period. The report is still being finalised, but a few quotes from the discussions suggest that members of the public would like to put some new questions on the table.

“Infrastructure has been my biggest issue.”

Facilitator – “Will the infrastructure need to change?”

“It’ll have to.”

“Significantly.”

Facilitator – “Who should pay for the infrastructure?”

“Users pay. I don’t think taxpayers should pay.”

Sciencewise dialogue participants

The dominant story about self-driving cars is that they will change the world without changing the world. The focus is on artificial intelligence, suggesting that the task is to mimic and then improve upon human drivers. It overlooks what else might need to happen for the technology to really work. In our dialogues, people picked up on this, and were sceptical. They thought that roads and the behaviours of other road users would need to change if AVs are going to work.

“There will be risks. We will learn from accidents, but I do not want my family to be those on the back of which the learning happens.”

Sciencewise dialogue participant

People understand that if the technologies are going to work, they will need to be tested, and tested in the real world. Some people thought this would be risky, and wondered therefore if the balance between risks and benefits would be fair.

“Cars were liberating for the working classes and older people. This seems to be restricting choice.”

Sciencewise dialogue participant

There was a lot of excitement about the potential benefits of AVs, but people wondered who would benefit. Would the technology be liberating or would it lock us in and make us dependent on a technology that people felt they had little control over? 

“Is there a need for it in a village? If they don’t have it, they’ll be stuck.”

“So what you’re saying is that people in the countryside can’t get one of your motors [AVs]? That’s a bit unfair isn’t it?”

Sciencewise dialogue participants

Finally, it is worth noting that, while there is a lot of talk about when self-driving cars will arrive, there is less consideration of where. Some participants, particularly those from rural communities, wondered if the technology would really make a difference to their lives in the foreseeable future. For the people developing and regulating the technology, these issues are challenging. These questions do not have easy answers. But they will be a part of how the technology is defined by the public. To ignore them would be to risk being surprised in the way that developers of genetically modified crops were two decades ago.

AVS in Orlando, before kick-off (Photo credit: Jack Stilgoe)
