
Broken Book

From baby photos, puppies and long-lost uncles to a mass surveillance platform designed to monitor everyone's behavior—how it all went wrong at Facebook


Facebook's business model, in concert with its algorithms, has worked to push people into addictive and polarizing echo chambers.

Mark Zuckerberg isn't a bad person—at least not according to Roger McNamee.

Nowhere in his new book, Zucked: Waking Up to the Facebook Catastrophe, does McNamee make such a claim. The problem is that when a CEO lives in a filter bubble, everything that happens downstream will probably reflect that predicament.

McNamee, a decades-long Silicon Valley investor, has held significant positions in many firms over the years, leading growth-stage investments in Electronic Arts, Sybase and Radius before co-founding Integral Capital Partners, a fund created in tandem with Kleiner Perkins, which put him at ground zero for the internet revolution. He was there on the very day Jeff Bezos pitched Amazon, and when Larry Page and Sergey Brin pitched Google. His most recent fund, Elevation Partners, included U2's Bono as a co-founder.

In 2006, McNamee was among those who mentored the idealistic young Zuckerberg, shortly after he relocated Facebook to Palo Alto. He even introduced the social network's founder to Sheryl Sandberg, who would eventually become Facebook's No. 2.

During those days, McNamee invested in Facebook, and he still owns a substantial number of shares. He still likes Facebook because it does useful things for more than a billion people. But ever since the social network helped spread misinformation and drive division in the run-up to the 2016 election, an increasing number of people have been finding that the platform has a dark side—one that McNamee insists can't be fixed unless the company overhauls its entire business model.

As McNamee tells it, it was precisely this model that drove Facebook to become what it is today: a worldwide apparatus of persuasive technology designed to keep users engaged by appealing to lizard brain emotions and modifying their behavior, all while harvesting personal data and metadata to sell to advertisers.

By now, plenty of people understand at least this much. But without taking a deep dive, it is difficult to wrap one's head around just how serious the Facebook fallout is. Filter bubbles account for much of the problem. McNamee writes that roughly 40 percent of Americans believe in something that is untrue: Obama was born in Kenya; humanity doesn't contribute to climate change; Pizzagate.

These are all demonstrably false. And yet, one man was so convinced of the veracity of Pizzagate—a reference to a conspiracy theory, which claims that high-ranking members of the Democratic Party ran a Satanic pedophile ring out of a Washington, D.C., pizza parlor—that he let loose three rounds inside the pizzeria.

In Zucked, McNamee explains that Facebook creates pockets of "intellectual isolation"—separating individuals from normal group activity and placing them in an echo chamber.

"Filter bubbles exist wherever people are surrounded by people who share the same beliefs, and where there is a way to keep out ideas that are inconsistent with those beliefs," he writes. "They prey on trust and amplify it."

Furthermore, these bubbles generate the most activity when their content is ideologically extreme. Over time, Facebook's algorithms have learned that anger and fear keep people far more engaged than shinier, happier emotional states. And so, in the interest of keeping users glued to their Facebook feeds, those algorithms stoked tribalist outrage.

As a result, an illusion of online consensus is created where none exists, enabling bad actors to weaponize the platform, amplify hate speech, deepen polarization and spread fake news that scares people and pits them against each other—precisely what Russian intelligence did with its online influence campaigns. It's not that anyone working for Facebook wanted this to happen; McNamee believes no one at the company did. It's just that platforms like Facebook have no incentive to eliminate the chaos, because it improves all the right metrics: time on site, engagement, sharing.

And since Facebook's business plan is to monetize your attention and your time on the platform, a great deal of work went into every little detail—for example, just what shade of red makes the notifications more habit-forming, thus more successful at getting users to repeatedly check their news feed all day long.

On a larger level, the algorithms hoover up thousands of data points on each user's activity, tracking everything they like, click or watch online, as well as the behavior of their friends and groups—all of which feeds an enormous artificial intelligence engine that constructs, with ghoulish accuracy, a massive data model of each user. Because the algorithms operate with an intrinsic bias toward recommending sensationalistic, radicalizing or conspiratorial stories, users become far more vulnerable to emotional manipulation and remain engaged much longer. To this day, similar scenarios continue to unfold, not just on Facebook but on Twitter, Instagram and YouTube as well.

But Zucked is about Facebook above all else, so McNamee, fearing that the company he financially supported might now be a catastrophe for privacy, innovation, democracy and the psychological health of humanity, is sounding the alarm and calling for a nationwide conversation. Zucked is an elaborate, manifesto-like screed that details the history of how it all went wrong—from baby photos, puppies and long-lost uncles to a global platform of mass surveillance designed to manipulate user attention and monitor everyone's behavior.

In particular, one chapter, "Silicon Valley Before Facebook," provides a compelling and concise history of how we got to where we are. In it, McNamee illuminates the last 70 years of technology philosophy, business ethics, infrastructure and startup economics, drawing a line from the Apollo space program straight up to the present day. Previous geniuses, from Doug Engelbart on through Steve Jobs, all believed that technology should function as a bicycle for the mind, augmenting human intelligence rather than replacing it.

For the half-century before 2000, engineers and scientists were limited by processing power, memory, storage or bandwidth, and thus couldn't create what they truly envisioned; they had to work like artists, making the most of limited materials to produce something usable for the public. Then, in the few short years between the original dotcom boom and the social media era, several dynamics fell into place that few people outside the technology industry understood.

Moore's Law and Metcalfe's Law had both compounded to the point where those constraints disappeared, and any aspiring CEO could build a business for far less than in the days before cheap cloud storage, ultra-fast processors and high-speed internet. Founders could focus on the application layer rather than on infrastructure.

Before the social media era, there was no such thing as a globally successful startup made up of college nerds without skills, pulling together a product just to see what would happen. Zuckerberg came along at the perfect time. Facebook exploded relatively quickly compared to the longer arcs of Apple and Microsoft. This new era also emerged hand in hand with an extreme libertarian ideology, characterized by a total disregard for the role of government in business and a total absence of empathy for anyone else's problems, even as Facebook's business plan aided and abetted those problems.

McNamee writes that he never imagined Facebook would evolve its business model toward the dark arts of persuasive technology, which quickly became the equivalent of AI-controlled crystal meth for the masses. As Facebook grew in scale, its techniques enabled emotional contagion to overwhelm reason and spread like wildfire. If the platform is left unchecked, McNamee says, bad actors will continue to take advantage of the tools Facebook uses to manipulate people, hate speech and confusion will continue to spawn violence, and disinformation will continue to damage democracy.

Governments seem to agree. As of press time, Facebook was wrapping up negotiations with the Federal Trade Commission that could result in a multibillion-dollar fine over the company's privacy violations stemming from the Cambridge Analytica scandal. Similarly, the UK Parliament released a damning 100-page report calling Facebook and its execs "digital gangsters," concluding that the company intentionally and knowingly violated both data privacy and anti-competition laws and should no longer be allowed to regulate itself.

An early Facebook adopter as well as an investor, Roger McNamee says the platform must change for the sake of democracy. Photo by Rick Smolan

Thankfully, in the midst of the catastrophe, McNamee says there is light at the end of the tunnel. In Zucked, he provides a wealth of solutions. If society can summon its collective will and come together to tackle these issues, we can move democracy forward, encourage non-monopolistic innovation, repair the psychological problems caused by platforms such as Facebook, and ultimately, make the world a better place.

The following interview has been edited for length and clarity.

METRO: In retrospect, could you have predicted the way bad actors would come to weaponize Facebook's platform and its business plan?

McNAMEE: The key thing, and to me this is the important point, is to imagine that there are three phases to my relationship with Facebook. There is phase one, where I am one of Mark's advisers. I'm also separately a friend and adviser to Sheryl Sandberg, and then I had the opportunity to bring them together. Then comes what I would call the quiet period, which runs from sometime in 2009 through the end of 2015. And there I'm just a cheerleader. I no longer have an insider relationship. But it was a period where I couldn't have been prouder. By all the public metrics, the company was exceeding my wildest dreams. And then comes the third period, what I would characterize as the activist period, where, beginning in January or early February of 2016, I start to see things that literally did not compute for me. The notion that bad actors would do harm to innocent people over Facebook literally never occurred to me until I saw it.

METRO: Since many people still find Facebook extremely useful, how can the public be made to understand the negative aspects of the platform?

McNAMEE: What I have learned is that the business model and the architecture of the product in combination with the culture of the company—those three things have combined to enable a lot of bad outcomes for innocent people. And the impact can be seen in at least four areas. First, public health, which is really the mental health of the users of all ages, from little children up to adults. Second, democracy, with obvious impact in the United Kingdom and the United States. Third, privacy. And, fourth, innovation. And so what you have here is a wonderful technology operating in an environment with no constraints and no accountability.

I constantly repeat this point: I do not think that any of these people are bad people. I believe that they have been encouraged from the earliest years of their companies to pursue growth and stock price appreciation without concern for any other consequence. And the investors, the boards of directors, the friends, the spouses—all have played a role in making the behaviors of these companies seem reasonable. I get that. I'm actually sympathetic. It's not like these products do nothing good. In fact, quite the opposite. The reason they're so successful is because they provide services people want in a very convenient form. Now, the fact that they use psychological tricks that create behaviors that lead to behavioral addiction, that's a conversation we need to have.

METRO: Most of the tech industry believes that AI is the future, but you don't think so. If AI must exist, what can be done to prevent future damage?

McNAMEE: The transition from the fallibility of humans to AI should be done in a manner that eliminates implicit bias. But there's nobody to make that happen. So when I look forward—and I think this is a very core point—I have great optimism, because I not only can see the path we should be on, I believe we have the ability to get there. When I look at, say, Facebook or Google, those businesses depend on the attention of human beings. They depend on the hands-off behavior of policy makers. And the people who use these products have a lot of leverage against both of these things; they have the ability to withdraw their attention to make a point, the same way the teachers in L.A. went on strike, or the way the air traffic controllers called in sick. You withdraw your attention and you profoundly impact the outcome. And by the way, you see this with Facebook now. People have really changed their usage patterns in the United States, and I don't think they're done. Policy makers recognize that [Facebook's] behavior is inappropriate; all they're waiting for is enough pressure from voters to justify taking action.

And if you ask yourself what the policy should be, let's just look at a new category like AI. You go, "I think all artificial intelligence products should have to prove safety and efficacy." Now, do we have a model for that? Interestingly, we do. When the Food and Drug Administration was given its current powers in 1938 to verify the safety and efficacy of any new drug, it already had all these inspection powers from the earlier Food and Drug Act relative to the food supply. Why? Because everybody recognized that food and medicine were so important that you couldn't afford to have bad actors hurting people. And I believe with AI it's the same thing. We need to get back to technology being something that empowers people, not something that takes away their humanity.

So my basic point is, I want Silicon Valley to return to the culture and philosophy of "bicycles for the mind." And I believe as a solution to a man-made problem, it's directly analogous to wind power and solar in that there's a gigantic business opportunity for fixing this mess. I'm not saying we have to blow up Silicon Valley and go home. I'm saying just the opposite. I want us to stop hurting people.

METRO: Plenty of people can use Facebook in a positive way without feeling manipulated, or they simply don't care if their behavior is being monitored 24/7. So what do we do?

McNAMEE: I do not pretend that I know the right answer. What I know is that I care deeply about Silicon Valley, I care deeply about technology. Facebook continues to be the largest investment that I hold. And I hold it because I don't want anybody to be confused about why I'm doing this. I'm doing this because I'm terrified about the future of our country and about the future of civilization. I believe that what [Shoshana] Zuboff calls "surveillance capitalism" may be as great a threat as climate change, because the underlying technology strips us of our humanity, and the people who operate these technologies are not accountable to the people affected.

So it's not a choice. People can say, "Oh, but you pick your friends, and you pick what you click on." But that isn't actually true, because all of these platforms operate on what Tristan Harris calls brain hacking: very clever psychological tricks that play on the weakest elements of human psychology. And there is something deeply dystopian about what's going on. My point here is, we need to engage in this conversation. Let's not pretend that this is in any way good for humanity or good for society, because the evidence right now demonstrates that it's not.

METRO: How difficult will it be to move the needle and bring the bulk of society on board with this?

McNAMEE: I am really sympathetic. I think this problem is really hard. I spent 34 years doing nothing but investing in tech. I've spent the last three years trying to figure out what the hell is going on, and I did it as a full-time job. It is really confusing. I wrote the book because it was so damn confusing I realized, "Wow, if I describe my journey to everybody, that will help them get a jump start." I am playing Jimmy Stewart in Rear Window: I see something that looks like a crime, and I pull on the thread until I understand it. And in sharing that journey with the reader, I allow the reader to skip the three years I spent figuring that part out. Now that everybody's focused on it, we're learning really quickly. And my point here is, your kids' health depends on it. Your health depends on it. The country's health depends on it. The world's health depends on it. And we shouldn't feel sorry for the billionaires who own these companies. They're incredibly rich; they're going to be happy and fine no matter what. Now it's time to focus on everybody else.