An Expert on Trust Says We’re Thinking About It All Wrong

It’s always tempting for journalists to search for threads that connect disparate news stories, and easy to overstate their significance. But it’s remarkable how much one such thread—declining trust in the institutions that once dominated public life—ties together so many of today’s major headlines, from our deeply polarized politics to the proliferation of crazy conspiracy theories. This decline, sometimes called a “crisis of trust” or a “trust deficit,” has of course become an increasingly common topic in newsrooms and think tanks and global conferences that conjure a world without trust and search for solutions.

But Rachel Botsman, an author, teacher and Substacker who is considered one of the leading experts on the topic, argues we’re thinking about trust wrong. I met Botsman earlier this year at the World Economic Forum at Davos (theme: “Rebuilding Trust”). Because we’re asking many of the wrong questions about trust at places like Davos and elsewhere, she says, we’re missing some of the solutions. Below, condensed and edited for clarity, is our conversation about why that is, and how to fix it.

We hear a lot that trust is in decline. But that’s not really your view, is it? It’s more that it’s in a state of redirection or fragmentation?

Trust is like energy—it doesn’t get destroyed; it changes form. It’s not a question of whether you trust; it’s where you place your trust. In society today, trust is shifting from institutional trust to “distributed trust.” Trust used to flow upwards to leaders and experts, to referees and regulators. Networks, platforms and marketplaces have changed that flow, redirecting it sideways to peers, strangers and crowds, creating a dispersion of authority and a fracturing of trust.

A lack of acceptance that the trust dynamics have changed, I think, is a systemic problem. We’re trying to solve trust issues in the distributed world through an institutional mindset.

And if I’m a leader of an institution or a company, then I can learn from that.

Yes. It’s like, “Oh, my trust went over there. That’s where it’s gone.” That’s how they’re being influenced. “That’s where I’ve got to be.”

You’ve said that crucial context is missing from a lot of conversations about trust.

Talking in general terms about trust is not helpful because trust is a belief, and like all beliefs, it is highly subjective and contextual. Whenever we ask the question, “Do you trust [fill in the blank]?” we should follow with “to do what?”

If you think about trust in AI in education versus trust in AI in healthcare, these are very different applications with different trust needs. Trusting these systems to do what? Talking about trust is problematic unless you get to the context.

So how do you define trust?

The way I define trust is “a confident relationship with the unknown.” The greater the unknown, the more uncertainty, the more trust that you need.

This perspective runs counter to many social scientists who claim trust is knowing another person will do what is expected or knowing how things will turn out. But this has always struck me as odd: If you know the outcome or how things will end, why do you need trust?

If you take AI, think of all the complexities and unknowns; you need a lot of trust. When most leaders talk about trust in new technologies, they are often talking about mitigating risk. They are focused on governance, controls, guardrails and mitigating the unknown. And then of course there is the belief that you can increase trust through transparency. This is a big misconception. You can reduce the need for trust through transparency and through these risk controls. Or you can increase people’s confidence in the unknown.

So if we want to increase trust in society, we have to do more than mitigate risks.

Yes. I find it quite unsettling that when pressed for solutions, the default is often transparency. Let’s make the media transparent. Let’s make the algorithms more transparent. Let’s make the inner workings of government more transparent. But if that’s the way we head, we’ve kind of given up on trust. You’re actually lowering the need for trust. You’re saying, “This is how this thing works. You can be certain about the processes. Therefore, you don’t really need as much trust.”

So what does your research say about the right way to build trust, in our institutions and in each other?

Deep trust forms based on how people behave. Above all, I think, it comes from integrity. How do you realign the public’s belief and confidence that an institution, whatever it is, is serving their best interests?

It’s around changing our behavior rather than guardrails?

You need both, don’t get me wrong. But if we’re talking about restoring trust, regulatory guardrails don’t always restore people’s confidence. People’s confidence comes more from a belief that you know what you are doing (capability) and why you are doing it (character). Put simply, it comes from not just doing things but doing the right things.

So we need to have more discussion of solutions as a way of building confidence in the notion that we can get from here to there?

There is a crisis of confidence in our society that the people in charge actually know how to get out of this mess and chaos. I often hear, “Oh, we need to lower expectations of what we think we’re going to get from political leaders.” I find it incredibly disconcerting that we’ve come to accept things unimaginable even five years ago regarding the erosion of trust. We’ve entered an age of information and content where it’s no longer “trust, but verify” but “verify, then trust.”

You recently wrote that “Owning our uncertainty makes us kinder, more creative, and more alive.” It seems to me we are seeing some of this in a positive way around AI, at least as compared to social media: an acknowledgment that there’s plenty we don’t know. Do you agree?

What I’m struck by is what I would describe as a push-pull between fear of joining in and fear of missing out. Business leaders need to innovate with AI because they’re scared of being left behind. At the same time, there is extreme caution not to move too fast because they understand the unintended consequences if they get it wrong.

The fear, the hesitation, is good for trust. It creates a trust pause, where people slow down and think, “How do I trust these systems?” For example, I trust it for input and information, but I don’t trust it yet to make decisions about my health or money.

I was impressed, I have to say, by the humility of Sam Altman when he said the sign above his desk reads “no one knows what happens next.” The next generation of AI leaders may have the humility to admit they don’t know. They may program that into the technology so that it says, “I don’t know the factual answer to that. Ask a human being or find another source.”

Is there an example in your research or work of an institution or a field that has turned it around, that has rebuilt trust?

It always comes down to an individual. It’s often a very low-tech intervention. Take schools. Sometimes, kids have a very low propensity to trust others. They’ve never experienced being trusted, so they don’t know how to trust others, or find it incredibly difficult. And then you have these remarkable teachers who slowly earn their trust over time.

One of the things that always strikes me is that the teachers never go into these situations with the assumption that the children will give their trust back. They assume these kids will not trust them. Trust is earned through small gestures over time. It’s how you treat people day in, day out. It’s how you make them feel, especially on tough days. Can you scale that type of human trust through technology? Do we want to? We’ll see.

Does that make you a pessimist?

Or a realist. Or a humanist! I’m hopeful in many ways because AI will make us think more carefully about human connection and why humanness will become a differentiator.
