Jul 09 2024
Trust in New Technology
What should govern the perception of risk in an optimally rational person? Of course, people are generally not “optimally rational”, which makes this an interesting thought experiment – what would be optimal, and how does it differ from how people actually assess risk? Risk is partly a matter of probability, and therefore largely comes down to simple math – what percentage of people who engage in activity X suffer negative consequence Y? To assess risk accurately, you therefore need information. But that is not how people generally operate.
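To make that “simple math” concrete, here is a minimal sketch (in Python) of the underlying calculation. The activity names and counts are entirely hypothetical, chosen only to illustrate the arithmetic:

    # Risk as simple math: what fraction of people who engage in X
    # suffer negative consequence Y? All numbers below are made up.

    def risk(negative_outcomes, total_exposed):
        # Observed risk = proportion of exposed people who suffered the outcome
        return negative_outcomes / total_exposed

    # Hypothetical counts for two hypothetical activities
    activities = {
        "activity A": risk(12, 100_000),
        "activity B": risk(90, 100_000),
    }

    for name, r in activities.items():
        print(f"{name}: {r:.5f}, or about {r * 100_000:.0f} per 100,000")

The point is simply that the calculation itself is trivial; the hard part is whether people have, and trust, the underlying counts.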
A recent study evaluated the perception of risk from autonomous vehicles in 323 US adults. This is a small study, and all the usual caveats apply in terms of how the questions were asked. But if we take the results at face value, they are interesting, though not surprising. First, information by itself did not have a significant impact on risk perception. What did have a significant impact was trust, or more specifically, trust significantly modified the relationship between knowledge and risk perception.
What I think this means is that knowledge alone does not influence risk perception unless it is also coupled with trust. This actually makes sense, and is rational. You have to trust the information you are getting in order to confidently use it to modify your perception of risk. However – trust is a squirrely thing. People tend not to trust things that are new and unfamiliar. I would consider this semi-rational. It is reasonable to be cautious about something that is unfamiliar, but this can quickly turn into a negative bias that is not rational. This, of course, goes beyond autonomous vehicles to many new technologies, like GMOs and AI.
There also appears to be a bias toward lack of trust in things that are highly complex. If someone has a hard time understanding the underlying science, their default position is negative. People are also biased toward information they recently encountered. So if someone sees a story on the news about an accident involving an autonomous vehicle, that will have more influence on their attitude than statistics. It is also easier to stoke fear than to engender confidence. People have a risk-avoidance bias. One negative rumor about vaccines, nuclear power, or GMOs can cause a lot of risk avoidance that will be difficult to counter with information.
What about deference to experts and scientific authority? In this study, reported deference to scientific experts did not modify the relationship between information and trust, but it did modify the impact of trust on risk perception. However, the desire for a new experience had a positive effect on the perception of risk. What does all this mean?
One take-away from this and many other studies that touch on this question is that there is a nuanced and complex relationship among the public perception of risk, reality, and public communication of information. Many factors shape public perception, including media reporting, the effectiveness of science communication, sensationalism, statistics, fear, the cool-factor, and trust in the relevant institutions. This also means that effective advocacy for science-based policy and public behavior is challenging.
The general media play a mixed role, but in my view it is largely a negative one. They do provide information, for those who wish to avail themselves of it, but they are biased toward sensationalism, dramatic events, fearmongering, and fringe opinions. Fear, shock, drama, and spectacle drive clicks, but they are not the best basis for cold, rational decisions.
Activist groups with an ideological agenda also have an easier time stoking fear and misinformation than science-based groups trying to improve public knowledge and perception. The anti-GMO campaign is a great example – it is based entirely on fear, distortion, and misinformation and yet has managed to convince a majority of the public that GMOs are something to be feared. Anti-vaccine campaigns also scare a lot of parents away from a safe and effective public health measure.
Those of us pushing for a science-based approach to these questions have many challenges. We cannot just provide facts and information. The data shows that this is usually not enough (although it does help). We need to foster scientific literacy, critical thinking skills, and media savvy. People need to understand that they have been manipulated by misinformation, and that they can be empowered with a more reality-based perspective.
But perhaps the most frustrating challenge is that trust itself is a complicated issue. There is no single source or institution that we can trust absolutely, and yet trust is essential. There are many examples of institutions lying, of researchers committing fraud, of experts just getting it wrong, and of governments covering up their failings. I find this the most challenging thing to communicate. People don’t deal well with complexity and nuance. We prefer simplicity and absolutes. This leads some people to effectively take the approach of – if I can’t trust absolutely, then I won’t trust at all. This means they will essentially believe whatever they want. This just leads to dueling experts and competing narratives.
The skeptical approach is – reality is messy, information is complicated, and trust is relative, but we can use a process to determine that some conclusions are more likely to be true than others. Conclusions are tentative, partial, qualified, and subject to revision, but it’s still better to make decisions on the best current information available than to live life as a “choose your own adventure” story.