Author Talks: How to cultivate trust in the digital age

In this edition of Author Talks, McKinsey Global Publishing’s Raju Narisetti chats with Jimmy Wales, cofounder of Wikipedia, about The Seven Rules of Trust: A Blueprint for Building Things That Last (Crown Currency/Penguin Random House, October 2025), coauthored with Dan Gardner. Wales examines how the erosion of public trust in various arenas, including in the media, institutions, and politics, has affected the consumer experience. By reflecting on Wikipedia’s earlier challenges and “deliberate vulnerability,” he shares how businesses can build lasting platforms by prioritizing accountability over traffic. An edited version of the conversation follows.

Why is it worth reminding the world about trust and these rules?

People have seen a general decline in trust, not in every aspect and not in every country, but quite broadly. There is a decline in trust in journalism, politics, and institutions of all kinds. And we see the impact of that. People don’t know what to believe. They’re not sure if they trust the media. They’re not sure who they should trust.

There’s cynicism and a feeling that politicians cannot be trusted and that we can’t expect them to be trustworthy because things have gotten so bad. It feels like it is worth reminding people, “We need people who behave in a trustworthy manner.”

We need to trust each other. For that to happen, we need to consider what we can do to build trust, both one-on-one in an individual capacity and in our organizations, our businesses, and so forth.

When we think of trust in the context of Wikipedia, it’s pretty intimately tied to how the website works and how the editing process works. If someone makes a bad edit, reverting it takes one click. We can assume good faith on a first edit: just come [to the site] and do something. If you do something terrible, then we revert it, and, hopefully, you receive a nice message saying, “Hey, knock it off. That’s not what we do here.”

If people continue to misbehave, then the community has the ability to block them. You can see everything everybody’s done, and you have to remember that assuming good faith is not a suicide pact. You don’t just say, “Well, everybody’s fine; do what you like.”

Roughly 15 years ago, I wrote on my user page, “Yes, you can edit this page. I trust you.” That was part of the inspiration for the book, although I didn’t know it at the time.

It’s that deliberate vulnerability, which can be quite good. If you say to someone, “I trust you and I think you’re going to do the right thing,” then people will often say, “Yes, I’m a good person. I will do the right thing.”

Yet if all the signals and signs say, “You’re probably not trustworthy; you’re probably a bad person,” then you might say, “Oh, well, if you’re not going to respect me, then maybe I’m not going to be a good person or I’m not going to bother.”

Does this approach to trust scale?

The approach is one piece of the overall puzzle, so scaling it is necessary. You can keep in mind an individual person, the one you want to trust or whom you want to trust you. Then you assess the conditions for making that happen and design systems around that individual case. That is how you scale.

You want us all to remember what we learned in kindergarten?

It’s a your-mother-was-right thing. There’s a deep truth to what we learned as children or what we teach our children. We forget those lessons at our peril. Some of these things are actually quite simple.

You teach your children to be honest, not just because it’s the right thing to do, but because it’s practical. You explain, “If you’re not honest, people won’t trust you and then you’re going to lose all kinds of opportunities in life for friendship, for jobs, for whatever.”

There are a lot of rules for Wikipedia. I’m sure the editing rules are hundreds of pages. However, when the rules are well made, you shouldn’t even have to read them. You already know them.

There’s a deep truth to the things that we learned as children or the things that we teach our children. We forget those lessons at our peril.

If you add something to a Wikipedia page, it should be factual. You shouldn’t make up content. You should indicate your sources. These are all really basic things. As for the details of the Wikipedia rules, if you don’t put your footnote in the right format, someone will help you out. The important things are: tell the truth, give the facts, don’t be biased, et cetera.

Can anonymity and trust really coexist?

It’s interesting. We can look at a lot of different situations, and in some cases, we see very strongly the need for privacy and anonymity. For example, suppose you are trying to be a good and honest editor in an authoritarian society. In that case, you may find it necessary to be private, because telling the truth and citing sources about a political figure could be dangerous where you are. So you need that anonymity.

We can also see classic examples involving entities like Airbnb, where anonymity isn’t really a value. It’s valuable to know who the person is, because a real person will enter my house and stay for the weekend. I need to know—or the system does—who that person is so that they can be held accountable, because that person could do something damaging. One of the interesting things about Wikipedia that we have found is that “pseudo-anonymity,” a consistent identity over time, actually does all the heavy lifting that we need.

What we care about is not who you really are in your personal life, as in knowing your home address and how we can sue you. What we care about is knowing that you’re the same person over time, because we can just block you if you’re misbehaving. We’ve got a mechanism for dealing with that. As long as that’s true, it’s a lot easier. If every time you came to Wikipedia you had no identity, you edited, and we didn’t keep track of anything, then trolls would immediately take massive advantage of that.

At Wikipedia, you build a reputation over time, and sometimes that reputation is attached to a pseudonym. Many of the “Wikipedians,” particularly the very active ones, know each other; they have served on the Wikipedia board, met lots of other Wikipedians, and attended our conferences.

One of the interesting things about Wikipedia that we have found is that “pseudo-anonymity,” a consistent identity over time, actually does all the heavy lifting that we need.

They’re not anonymous to one another. It’s a community in a more genuine sense than what people usually mean by an online community. They actually know each other. They’re emailing, going out for drinks, and having dinner together at our conferences. Anonymity for us is somewhat limited. But when we need privacy, we need privacy. That’s also an important part of doing well on Wikipedia.

Can commercial platforms really solve the ‘wicked’ problems their models create?

Certainly. I am a fan of Reddit, though not an unreserved fan of every part of it. Reddit’s a big place with a lot of different kinds of communities. But Reddit is a good example of a company that transformed its thinking. It initially operated as if to say, “We’re a wide-open free speech forum, and we’re not going to take anything down unless it’s illegal.”

After a period of time, they realized some of what was posted was pretty toxic and not beneficial in any way. They came to understand that those behaviors were not OK. It wasn’t a case of defending free speech to have a pluralistic debate in society; rather, it was people abusing other people.

One of the problems that a lot of social media has, particularly the platforms that are more feed-based and scroll-based, is that if you choose what people see by just turning algorithms loose and saying, “Keep people on the site as long as possible,” those kinds of algorithms can promote pretty bad things. Unless you work really hard to avoid it, that can happen even if it’s not your intention.

Those kinds of things and the business models surrounding them are really tough problems. I know great people at [social platforms] who struggle with this. They do a pretty good job, but it’s a hard problem. And it’s one that, luckily, we at Wikipedia don’t struggle with.

We don’t have algorithms that detect patterns of behavior and keep you on the site as long as possible. Our business model is completely different. Actually, I don’t care how often you come to Wikipedia. It’s not as if every time you visit our site, you see ten ads that generate a little money. I care that when you see that banner at the end of the year, you say, “Wikipedia made my life better this year. I’m going to chip in $20.”

That’s a fundamentally different business model that drives a fundamentally different outcome. I don’t really have a solution for social media. It’s a hard problem.

We need to build a world where we may disagree politically, but at least we come to it with some shared understanding of the facts. And then we can disagree about what to do about it. But when we can’t even share a basic understanding of the facts, it’s really hard to do anything other than punch each other.

Doesn’t neutrality mean people can believe in two sets of truth?

The way I think about it is that there is only one reality. There are a lot of different facts about reality, but they shouldn’t contradict each other, because reality is reality. But there are a lot of different things that you could say about it and a lot of different perspectives.

On the other hand, for a lot of issues, both sides have something legitimate to say. The truth is often somewhere in the middle, though sometimes it isn’t. For a reader to make an informed decision, it’s important that they’re aware of all reasonable points of view.

We need to build a world where we may disagree politically, but at least we come to it with some shared understanding of the facts. And then we can disagree about what to do about it.

We don’t necessarily have to include every single point of view. But readers do need to be made aware that critics have said this and advocates have said that. There are a lot of cases where, at least for some period of time—and that period may be forever, depending on what it is—the actual facts are somewhat murky.

At the Wikimedia Foundation [the nonprofit that runs Wikipedia], we consider the role that we should or can play. I currently lead a neutral-point-of-view working group. I speak with the research team and meet with a lot of different people to ask, “Where might we have problems with neutrality now? How might we address those?”

Given how Wikipedia is structured, it’s not for the Wikimedia Foundation to say, “This is biased, and we’re going to start editing.” It’s to say, “In areas where we struggle with neutrality, what has gone wrong? What are the social rules that we could change? What are the social parameters? What kind of coaching could happen? Who are the people we might need to invite in?”

How do we address bias in the very sources that are cited?

The question of low-quality media, or media that has been co-opted by authoritarian governments, is a huge problem. Our Wikipedia community grapples with that. Fortunately, the world is still global. In many languages where the media have gone down one path, it’s generally not by their own choosing but because an authoritarian government is pushing them in that direction. If the editor-in-chief doesn’t comply, they get thrown in jail. The volunteers are very brave about using global media, media from other countries.

Maybe there are media based in another country that share that language, so that they can speak the truth and speak freely. That’s really important. Of course, this is part of a broader category of larger problems.

We’ve seen an enormous decline in the viability and the health of local newspapers. What that means for writing history, and for writing Wikipedia, is that the sources one might have relied on may not exist anymore. The first draft of history, which is the news, is simply not being written.

For example, in my hometown, I was a newspaper boy. We had a morning paper and an afternoon paper, and they were independent of each other. The morning paper died years ago, and the afternoon paper is now a shell of its former self. Huntsville, Alabama’s Huntsville Times publishes three times per week—largely out of Birmingham, a city 100 miles away. It’s mainly AP newswire content and a bunch of ads. It’s not what it used to be.

The number of journalists working in local places is much smaller than it used to be. If you want to write about the city council in Huntsville, Alabama, you will have a much richer set of sources for 1980 than for 2020, because back then history was being recorded.

That’s a different kind of problem. The decline in the quality of sources is something we all have to grapple with, and it’s something that we at Wikipedia can’t fix by ourselves. We’re not journalists. But it is something that society needs to think about and work on a lot.

If Wikipedia were starting today, would you do anything differently?

I’m a pathological optimist, so I always think everything is great. It’s really hard for me to think about that. For some things, we actually had to do them wrong first to learn and to understand.

I couldn’t have done it differently, because I didn’t know better. For instance, in the early years, we didn’t have the biographies of living persons [BLP] policy. Biographies of living people were treated just like every other article. As we got bigger and more important, we came to understand that a negative statement without a source in a biography of someone [living] can be very, very hurtful—even if it’s pretty mild by other standards. While we don’t want to have errors, if there’s an error about the mayor of Warsaw, Poland, in 1740, for example, we need to fix it. Yet it’s not actually going to hurt anyone right away.

I wish we’d had those BLP policies a little earlier.

But Wikipedia’s ‘transparent’ rules are also very daunting to most people.

Burying people in documentation is one form of failing to be transparent. In other words, you say you’re being transparent: “All the rules of Wikipedia are very open. Just go and read these 700 pages.”

But I can’t read 700 pages. Which rules apply to what I’m trying to do right now? How do I do this? How do I solve the issue that I have? The answers can be quite arcane and quite hard to find.

Another example is viewing a Wikipedia entry and seeing that there has been an extensive debate about the wording of a particularly controversial sentence, and hundreds of pages are spent arguing a specific point. I would prefer to have a summary that indicates who said what and provides the major points of view. Then, I can see if there is a way that I could help.

That’s quite hard at the moment. I think gen AI has a hope of doing that sort of thing—summarizing as a helper to the community. There’s great promise in helping humans grapple with huge bodies of text.

What do you want this book to do for trust?

I hope it has some impact on the conversation and on people’s thinking. I’m always a little disappointed when we see politicians who are clearly not trustworthy getting a pass from their own supporters or from people who say, “Well, what are you going to do?”

I would love to see a much stricter look at trust and at what we should expect from people in leadership positions. People in leadership positions—whether in companies, politics, or journalism—can really think about, “In my work, what are the things that I can do to increase the trust in what we’re doing?”

For example, there’s a big section in the book about journalism and the importance of neutrality in that profession. One might say, “When we write a really biased article, it appeals to our audiences and we get lots of clicks.”

That may be true in the short run, but in the long run, it undermines people’s trust in you. Sometimes I read what is ostensibly a news article, not an opinion piece, and I see the language that’s used and the cherry-picking of facts.

I think, “This is a rant. And I agree with this rant, but you know what? I actually would prefer to be a little more challenged.” I’d like to understand the nuance of the facts rather than be fed a rant that I agree with.

I want journalism to consciously avoid bias, even though it can never be perfect, obviously. I want it to try to be fair to the other side to increase trust. It’s usually not the rank-and-file journalists’ fault. It’s the business model pressures.

Why has Wikipedia outlasted all the skeptics?

If you use the internet, then depending on the sites you visit, and whether you interact a little bit or just watch the interactions, you see a lot of trolling and very bad behavior. That might make you think, “The general public can’t be trusted. There are a lot of crazy people out there.” And you might assume that if you make something open to editing, it will be immediately taken over by trolls. What people miss in all of that is that most people are pretty trustworthy.

You have to design the software and the community in the right way to avoid amplifying the worst elements and giving them free rein. Many people who haven’t even logged in make an anonymous edit, and it’s the only edit they’ve ever made. You see the edit and say, “Oh, they fixed a spelling error. How nice that some random person just did this.”

Being optimistic about other people is something that we often miss. We miss it particularly when we’re looking at social media platforms, which are badly designed in this respect; or rather, they can be well designed for a bad end: to increase engagement and noise. One could get the wrong idea about people from that. But basically, people are pretty nice.

