More than 2 billion people have come to accept that Facebook is, more or less, what a social network is. Its highly particular and historically contingent bundle of features and applications has become the yardstick by which all other networks (outside China) are measured.
But was this Facebook inevitable, or have the company and its users simply fallen into this particular configuration? For example, Facebook began as a simple desktop website in which people could connect to friends in a small number of universities through MySpace-like profiles. But then came the News Feed in 2006, which collected and ranked the different things your friends were doing (like posting new pictures or breaking up). People hated it, even according to the engineers who worked on it. “A lot of folks wanted us to shut News Feed down. And most other companies would have done precisely that, especially if 10% of their users threatened to boycott the product,” recalled Ruchi Sanghvi, an engineer on the original team, in 2016. They didn’t, though, because as Sanghvi explained, to their minds it “was actually working.” “Amidst all the chaos, all the outrage, we noticed something unusual,” she wrote on her Facebook page. “Even though everyone claimed they hated it, engagement had doubled.”
The “feed” format spread far and wide. And Facebook learned an important thing: It didn’t have to listen to what users said when it could watch what they did. That insight has guided decision after decision as the company morphed into the global powerhouse it is today.
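To make the lesson concrete, here is a minimal sketch, in Python, of the decision rule Sanghvi describes: behavioral engagement, not stated sentiment, decides a feature’s fate. Every name and number here (the LaunchStats fields, the keep_feature helper, the engagement figures) is invented for illustration; none of it is Facebook’s actual instrumentation.

```python
# Hypothetical sketch: judging a feature by what users do, not what they say.
# All fields, numbers, and names are invented for illustration.

from dataclasses import dataclass

@dataclass
class LaunchStats:
    actions_per_user_before: float  # avg. daily likes/comments/clicks, pre-launch
    actions_per_user_after: float   # the same metric, post-launch
    pct_users_complaining: float    # share of users voicing objections

def keep_feature(stats: LaunchStats) -> bool:
    """Keep the feature if measured engagement rose, whatever users said."""
    lift = stats.actions_per_user_after / stats.actions_per_user_before
    # Outrage is recorded but never enters the decision.
    return lift > 1.0

# The 2006 News Feed case as Sanghvi recounts it:
# loud complaints from roughly 10% of users, but engagement doubled.
news_feed = LaunchStats(actions_per_user_before=5.0,
                        actions_per_user_after=10.0,
                        pct_users_complaining=10.0)
print(keep_feature(news_feed))  # True: the feature stays
```

Note that pct_users_complaining never appears inside keep_feature at all; that omission is the whole point of watching what users did rather than listening to what they said.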
But what if Facebook had shut down News Feed? Would Facebook have learned that its data might not reflect how users actually felt about a service they feel compelled to use? What if that one decision had flipped the direction of both the social network and its internal processes?
Siva Vaidhyanathan is a professor of media studies at the University of Virginia and the author of a new book, Antisocial Media: How Facebook Disconnects Us and Undermines Democracy. Vaidhyanathan has been a vigorous critic of the technology industry, and the book is best described by his own pithy summary: “The problem with Facebook is Facebook.” He’s spent the past several years reading and thinking with scholarly depth about not just how Facebook works, but why it was built the way it was.
I asked him to explore the question of Facebook’s rise, both in its actual features and in the historical circumstances that shaped the social network. An edited and condensed transcript of our conversation is below.
Alexis Madrigal: Thinking about how Facebook is welded together, did we need to have a social network that did or aspired to do everything?
Siva Vaidhyanathan: One of the reasons why there is nothing like it, and there’s been nothing like it, and there might never be anything like it, is that Mark Zuckerberg never held back in his vision of what this could be. He never thought that he was going to build something that would be the next killer version of HotOrNot, a cute little experiment. He never thought, I’m gonna build a dating app. He had a much grander theory of human interaction, misinformed by the shallowest reading of network theory and sociology. He thought: Well, let’s not stop at building the best dating app, let’s change the world. Let’s give people a way to manage their social relations in a rich way.
At some point, he decided that there is a universal principle here, and that is the principle of engagement. What we should really be paying attention to, what he will make sure we pay attention to, are the things that generate engagement. That was one of the core mistakes. You can measure engagement; you can’t measure things like depth of thought or kindness.
Madrigal: Do you see any other app out there that serves as a counterexample to how Facebook works?
Vaidhyanathan: Instagram is the best argument against Facebook in a bizarre way. At some point, Zuckerberg decided Instagram was worth acquiring because he realized the power of the image and social connection. Instagram was either a threat to future growth or an opportunity for future features to be folded in. But to this day, it remains a saner, cleaner experience and one that has had minimal political effect on the world. And it’s a great place to see puppies.
Madrigal: As we think about the institutional history of Facebook, what else could have happened?
Vaidhyanathan: 2011 was one inflection point. In the spring of 2011, there’s this instant myth out there that Twitter and Facebook were instrumental in the overthrow of dictatorships and the establishment of democracies. Even though by 2013 it was pretty clear that the successor regimes were just new dictatorships, the myth remained.
That insulated Facebook from self-criticism. It was easy to go to work at Facebook—whether you were Mark Zuckerberg or Sheryl Sandberg or someone working at the lower level of the Facebook Messenger project—and convince yourself you were improving the world.
If we’d been able to deflate that myth, we might have seen Facebook behave more modestly internally. But 2011 was an affirmation that the global vision of bringing people together was going to yield good things for humanity.
Madrigal: What do you think this counterfactual, wiser Facebook might have done?
Vaidhyanathan: People could have raised issues and formed a red team to test the ways their system could go terribly wrong or cause problems in the world. I see no evidence they had that conversation. It wasn’t until 2016 that they took anything seriously but their own servers.
Madrigal: For me, the biggest inflection point was when Facebook moved into the media realm in 2013, which we wrote about contemporaneously. They were becoming your “personalized newspaper,” the best-ever personalized newspaper. They never say that anymore, but they used to say it all the time. They also used to say they wanted to “rewire” different things, political systems, etc. For me, that is the moment when, suddenly, they rewired the whole information sphere through Facebook. And that just didn’t have to happen, even for Facebook to become a huge, powerful, globe-spanning company.
Vaidhyanathan: It didn’t have to happen. The power of an ideology that says social engineering is possible and therefore we should do it—that’s pretty irresistible to a bunch of idealistic, half-educated young people. If you’re someone who looks around the world and says, “There are some big problems, but almost all of them can be solved with one thing”—and that thing is better and deeper personal connections, so that we can overcome differences, and we have these tools we can design that do that for people—then, once you’ve accepted all that, it becomes really hard to question anything.
These ideological blinders ... were so pervasive in hacker culture in the 1990s. Coming out of the Cold War, there was this amazing sense that once we freed up the channels of communication, people could learn about the world and the other people around them, and that was the great potential for human flourishing. And there weren’t a lot of people who directly objected to that optimism. If some communication is good, then more must be better.
Madrigal: James C. Scott’s Seeing Like a State traces the usually disastrous process of quantifying and flattening real social processes into data that is legible to governments. It feels like something analogous, which I call the “estrangement of scale,” happens at Facebook and other big tech companies. They are looking at these crazy dashboards of hundreds of millions of people, and a lot of human detail can be obscured in there.
Vaidhyanathan: If you view the world as the data dashboard available to people at Facebook or Google, you see a very strange world. If you remember, soon after the election in 2016, when Zuckerberg was first confronted with the challenge that pollution on his service might have made a difference, his first reaction was quantitative: The bad stuff was a very small portion of what was flowing around Facebook. And it was true! But it was beside the point. To look at it in aggregate, to look at it statistically, was to miss the point entirely.
And it took him several more months to admit that small things can make a big difference, especially when small things are amplified by the algorithms he’d helped build.
Madrigal: Especially in a high-stakes binary outcome election decided by a tiny percentage of people.
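To see why the aggregate framing misleads here, consider a toy calculation: content can be a vanishingly small share of posts and still a large share of what people actually see, once algorithmic amplification skews distribution. The sketch below is purely illustrative; every count and reach figure is invented, and none of it is Facebook data.

```python
# Hypothetical sketch: a sliver of content, heavily amplified, dominates views.
# All counts and reach figures are invented for illustration.

posts = [
    # (kind, number_of_posts, avg_impressions_per_post)
    ("ordinary",  1_000_000,     50),  # the vast majority, modest circulation
    ("polluting",     1_000, 20_000),  # a tiny fraction, heavily amplified
]

total_posts = sum(count for _, count, _ in posts)
total_views = sum(count * reach for _, count, reach in posts)

for kind, count, reach in posts:
    print(f"{kind}: {count / total_posts:.2%} of posts, "
          f"{count * reach / total_views:.2%} of views")

# ordinary: 99.90% of posts, 71.43% of views
# polluting: 0.10% of posts, 28.57% of views
```

Measured by post count, the bad stuff rounds to nothing; measured by what users actually see, it is nearly a third of the total in this toy example.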
Vaidhyanathan: Behaviorism is embedded in Facebook. They’ve been clear about this. Facebook is constantly tweaking its algorithms to try to dial up our positive emotional states, otherwise known as happiness. That’s one of the reasons they measure happiness to the best of their ability, or so they think. It’s one reason they’ve run mood-changing studies (which they got into trouble for). This is the kind of social engineering they want to engage in. It’s one of the reasons they are trying to turn up the dial on the hedonic meter for the whole species. And that lets them ignore the edge cases, and those edge cases can be millions of people: people in Myanmar and Kenya, women who are stalked and harassed through Facebook and have to rely on a clunky reporting system. The edge cases fall away, and only recently has Facebook faced the sort of public scrutiny that has encouraged the company to take these problems seriously.
The information was available four or five years ago, longer in some cases. And they did nothing. But again, when you’re looking at that hedonic meter on your screen and you are seeing that the general happiness of Facebook users might be edging up, you can feel really good about the work you do every day and ignore the horrors on the margins.
Madrigal: Do you see any plausible scenario where Facebook declines in importance?
Vaidhyanathan: I think the only possibility of Facebook becoming less meaningful and important in people’s lives would occur among elites in North America and Western Europe. People who read The Atlantic or read long academic books about Facebook might reduce their usage. But I see nothing but growth and more dependence on Facebook and WhatsApp and Instagram in the rest of the world. Every day in so many ways, the United States matters less in the world. That’s not just Trump’s fault. It’s been a long-term process.
Look, if at some point, Facebook is allowed to compete with WeChat in China, then we’ve got a totally different game and what Americans think of Facebook will matter very little.
Madrigal: How do you mean?
Vaidhyanathan: Here is the nightmare scenario for Facebook that could bring them down. And it’s a worse situation than we have now. [Dominant Chinese social network] WeChat goes global. It bursts out of the confines of [China] and its diaspora, and there is a functional version in 20, 50, 100 languages. Facebook could be in big trouble, because WeChat does everything. WeChat is the operating system of a billion users. If you’re in China, you must use WeChat; almost everybody does. If you’re in China and you use WeChat, you are checking out library books, making doctor’s appointments, buying fast food. WeChat does what Facebook does and what Twitter does and what Instagram does and what your banking app does. Each of us has, like, 60 apps on our phone, and we use, like, 7 of them. In China, you really only need the one.
What will then happen is that Facebook will double down on becoming the operating system of our lives. It will bolster the ways that its various functions work together. Instagram would be folded into Facebook, and WhatsApp would be folded into Messenger. You can see Messenger looking more like WeChat all the time. If Facebook gets into China, it’ll introduce Messenger first.
Facebook breaks down and gives authoritarian states information about us, and WeChat romps around the world, giving that information to the government of China. So, in that perverse way, serious competition for Facebook is a far worse scenario than what we have now.