Facebook whistleblower allegations, controversies go unanswered.


For the past few weeks, as Facebook has dominated the news, there’s been this question: Where’s Mark? As in, Mark Zuckerberg. The founder and CEO of Facebook. Last week, we found out: He’s been in the metaverse.

At the end of a monthlong news cycle that has included a whistleblower named Frances Haugen, her cache of secret documents, and hours of testimony on the floor of the Senate, Mark Zuckerberg released a video about his company’s future. That future apparently includes a new name—Meta, as you’ve probably heard by now—and a utopian vision of virtual reality for everyone. What it does not include is a vigorous investigation into the many ways Facebook has harmed its users.

“I think it’s a giant public relations mistake to hide from this,” says Steven Levy. Levy has what he calls a “Ph.D. in Facebook.” He’s written a book about the company. He’s interviewed both Zuckerberg and his No. 2, Sheryl Sandberg. Like a lot of observers, Levy has given the company’s response here a big thumbs down.

But what’s shocked him most about the way Zuckerberg and Sandberg have approached this scandal is that he thought they’d learned by now. Levy had a front-row seat during the company’s last big crisis, when Cambridge Analytica was able to use the site to mine for user data in a bid to elect Donald Trump. “For five days, Mark and Sheryl went to a bunker and didn’t even talk to their own employees,” Levy says. “They told me that was a mistake, that they should have been more upfront.”

Now, Levy says, the bunker mentality is back. “The day that Frances appeared on 60 Minutes, something that Mark knew was going to happen, he posted a video of himself and his wife sailing on the bay,” Levy says. Mark Zuckerberg can’t escape to the metaverse forever. On Monday’s episode of What Next, I spoke with Levy about why. Our conversation has been condensed and edited for clarity.

Mary Harris: Can you lay out what are, in your opinion, the biggest revelations from this cache of documents leaked by Frances Haugen?

Steven Levy: There are a lot of details that stand out and are kind of shocking, like anger rating so much higher as a signal to boost the distribution of a post than anything that’s benign or likable. But ultimately, it’s many of the things that people were thinking: Facebook poisons the conversation, it divides us, etc., etc. It makes teenage girls feel bad. People get bullied. We all were writing about that. And the people in Congress were complaining about it. The regulatory agencies were complaining about it. But this is overwhelming proof of how deeply Facebook knew this internally and didn’t take the aggressive steps it needed to take to minimize the damage it causes. To me, that’s the big thing. It’s not so much a scoop that totally changes the way we see Facebook. It is just kind of worse than we thought. And Facebook should have done more. And there were these groups of people, hundreds of people, doing research at Facebook who were super concerned about this and kept pushing Facebook to be more aggressive in the actions it took.

Including Frances Haugen.

Yeah. I’m not saying it didn’t do anything. Generally what happened was that a study would come out saying, “We’re failing here, and we should do X, Y, and Z,” and Facebook might do X or Y, or it might try something else, but it turned out to be not enough.

Some of these failures weren’t due to nefarious motivations. They happened because of the way Facebook is structured: it’s much easier to kill an innovation that protects people than it is to institute one. If you want to make a change to the news feed, many, many groups are involved, and all of those groups get to weigh in. If someone’s really against it, that improvement probably will get nixed.

Part of why I wanted to talk to you is that your book really centers Mark Zuckerberg as the personality behind a lot of Facebook’s business decisions. I wonder how you see that play out in these documents that you’ve been poring over.

Mark has ultimate control over Facebook. He literally controls 56 percent of the voting stock. He can’t be fired. He doesn’t make every decision on Facebook, but people think of him when they make decisions.

Like, what would Mark do?

It very much is Mark’s company. Even on content decisions, sometimes he’s the last person to sign off. If people can’t agree on how to handle a hot potato in terms of content, like, for instance, whether Donald Trump should be suspended or banned, that goes up to Mark.

Mark is super engaged. And he’s a stubborn fellow. This was one thing I learned about him from talking to him so much—he’ll dig his heels in. Sometimes, when the evidence is overwhelming, he’ll give up his stance. He won’t flip on a dime like Steve Jobs would. Steve Jobs would be adamant about a subject, but as soon as he saw a convincing argument the other way, he would act like he had been on the other side all along. Bang.

Mark isn’t that way. It takes a lot to make him change his mind, but he can change his mind. He’s a very data-driven person. But the data that reinforces him is that Facebook is still making money hand over fist. Let’s not forget that.

Yeah, the company is still profitable. You can have a big media problem, but if Wall Street still loves you and your board still loves you, I don’t know if it matters.

And Mark makes sure the board loves him because he’s gotten rid of all of the people on the board who have been questioning him.

A lot of people, including you, characterize Mark Zuckerberg’s approach to the work he does as this “move fast and break things” approach, which I think is true in some ways. But it feels to me like it’s combined with this caution when it comes to Facebook’s bottom line. You’ve told this story about how Zuckerberg didn’t want to include a like button on Facebook originally because he thought it would drive down engagement. And it’s kind of a back and forth. I’m wondering whether you think that nuance is important, and how you would describe it.

I think it is important. So “move fast and break things” was originally a technical term. Mark, as a child of the web, understood in a way that established companies like Microsoft did not that if your system literally breaks, goes down, you can make a fix, and it’s like, hey, no harm, because you can have the fix up within an hour. As opposed to, if you’re using Microsoft Word and there’s a bug, getting that not only fixed but pushed out to the program on millions of computers is a much longer, more serious process. So it used to be a badge of honor among Facebook engineers to do something on the edge that broke the program.

Because you’re always updating.

Yeah. And that’s characterized the move to introduce new products. It became a metaphor. But when it interferes with the company’s main drive to grow, that’s when things get slowed down.

I pointed out that when Facebook did this oversight board, it took three years from the idea to the point where there was a board and it was making decisions. So in that case, Facebook moved really, really slowly, because this wasn’t something that was going to increase the number of users on Facebook.

Hmm. And do you see in these documents how it’s kind of the growth mindset that’s more important than “move fast and break things”?

Sure. You see this in action in these documents about how Facebook, to this day, despite the billions of dollars it makes in profits, has not invested in having people moderate the content as thoroughly in foreign languages as it does in North America.

In Myanmar, for instance, years ago, in the early part of the 2010s, the system was being used to foment riots and attack political opponents with misinformation. And Facebook was told about this and did almost nothing.

That’s totally a growth thing. It wants to be everywhere, but it doesn’t want to spend the money to make it safe everywhere.

I want to talk about this internal dissent that you’ve found as you’ve looked through the Facebook documents. You focused on badge posts. Can you explain what a badge post is for someone who’s outside of Facebook culture?

Just like in any organization, when people leave, quite often they’ll write an email to the staff saying, “I’m off to my next adventure. Great working with you.” At Facebook, they’re called badge posts because there’s a custom of taking a picture of the badge you hand in when you leave. The badge that you swipe when you go into the building.

These badge posts that I was writing about all universally say, “I love the people I work with. As an employer, Facebook can be great.” But to different degrees they point out that they came to feel Facebook was not good for the world, and that the decisions made to improve it weren’t aggressive enough. They were quite often tainted by political considerations. A number of badge posts point that out.

It’s funny, because the main question I’ve had looking at the coverage of Facebook over the past few weeks is, gosh, I wonder what it’s like to work at Facebook right now, when every day your employer is splashed all over the front page of the newspaper. Did these posts give you any insight into that?

It’s clear that if you work at Facebook, you’re much less likely, say, to wear company swag when you go out with your family.

The thing that struck me looking at your reporting about these Facebook employees and their posts when they leave is that the Facebook response to Frances Haugen has been, “Well, you’re just looking at a small slice of our research, and it’s biased, and this is one person.” But then you look at these badge posts and you realize, first of all, there are so many more people out there who worked at Facebook and have something to say, and they’re just saying it internally. They’re not doing what Frances Haugen did. But then also there’s this nuance to it, where the employees clearly appreciate Facebook and don’t think of it as an all-encompassing bad. There was a reason they went to work there. And so to not see those people and their nuanced reactions feels like a real miss to me on the company’s part.

Definitely, these are people who wanted to save Facebook. A lot of people told me that they got burned out and they were too tired to keep fighting.

One released his badge post as a video. This was engineer Max Wang, and it was a long video, first of all, and he was like, “I’m doing this because I want you to see my face, and I think we’re failing and we’ve enshrined that failure in our policies,” which is pretty damning.

These badge posts prove that the system isn’t working. If it were working, these people would be saying, “Hey, press, you’re using these things wrong.” “Congress, don’t look at it this way.” Instead, they’re saying not only is the harm there, but we were hoping that Facebook would aggressively address it and take measures to minimize the harm. And those measures aren’t being taken to the degree that they have to be.

You’ve taken issue with this comparison that a lot of lawmakers and pundits have made where they’ve said this is Facebook’s Big Tobacco moment, and I’m hoping you can just tease out why you think that analogy doesn’t work here.

Well, I don’t see any good that comes from tobacco. It’s a chemical that poisons your body and can kill you. Facebook is different. Billions of people use it for a reason. Most people who use it get some value out of it. But there are way too many cases where Facebook does harm, and that has to be addressed. Again, though, it’s not the case that everything about Facebook is wrong, or that if you wiped Facebook off the Earth we would necessarily be better off.

You mentioned how former employees talked about seeing people connect using Facebook in ways that were really positive for them. It’s not all bad in the way that tobacco was. But it’s kind of interesting to play with the tobacco analogy, because you can also look at how Facebook is responding in this moment.

In terms of what Facebook the company could do to rehabilitate itself and change itself, I went back to the 1982 Tylenol poisonings.

Tylenol’s maker, Johnson & Johnson, immediately took 30 million bottles of Tylenol off the shelves. They stopped selling Tylenol, and they paid the families of the victims, even though they could have argued that it wasn’t their fault, that the drugstore was responsible. They didn’t do that. No questions asked. They paid them. Their people talked to them personally. The chairman and CEO went on 60 Minutes, Nightline, and other places and said what he personally was doing.

I don’t see Facebook as doing anything like that. No one’s considering putting a halt to anything.

That’s the difference. Within a year, Tylenol sales were as strong as they had ever been. It’s still one of the most popular analgesics in America. I’m not saying that Facebook necessarily should be shut down. But compare the public reaction: They’re attacking a whistleblower. They’re making insinuations about publications like mine. Mark and Sheryl are nowhere. Mark wrote one blog post about this whole whistleblower thing a few weeks ago. It seems to me that the two of them should be everywhere, talking about how concerned they are about the harm the company does, as shown in these documents. And even though it’s a small percentage of people possibly affected by this—they could argue—they’re going to do everything they can and make a bigger effort to change that, even if it does mean slowing down growth. Or maybe having people spend less time on Facebook.

But no one’s talking about slowing down growth. You’ve interviewed both of them. It sounds like you’re frustrated. You must have tried to interview them again in this moment.

Yeah, I did try to interview Mark again in the last couple of weeks. It’s funny: as soon as I mentioned that one thing I wanted to talk about was the moral aspect of what he was doing, that sort of shut down negotiations right away. He wants to talk about the metaverse, but the subject can’t be changed so easily.
