Big Tech’s Wakeup Call Isn’t Just for “Them” — It’s for All of Us

Mike Dossett
13 min read · Nov 20, 2018

There’s no shortage of criticism in response to the jarring reports detailing the highly-publicized turmoil and controversy surrounding and within Facebook over the last few years.

“Zuck and Sandberg make tobacco execs look like Mr Rogers.”

“Facebook is the new cigarettes.”

“They have absolutely no morals.”

Too much of this criticism focuses on the company and the company alone, failing to identify the numerous contributing factors and dynamics at play.

What follows is an attempt to unpack the bigger picture, in service of greater understanding and perhaps greater clarity about how to move forward productively.

Make no mistake, there’s blame that lies directly at the feet of Facebook; at times the institution and at times its leadership, to the extent they can be disaggregated. Some of the reported actions detailed in the myriad exposés are typical of self-preservation instincts, while others are perhaps more egregious. It would be a futile and misguided effort to make universal excuses for these actions.

But briefly stepping above the made-for-Twitter zingers and Chyron-friendly-clapbacks, it’s lazy to dismiss phenomena as profound as the underlying issues at play here exclusively as the failings of one company blinded by profit and recklessly managed in its pursuit, all at the expense of American sanctity.

Without totally absolving Facebook or its leadership of responsibility, any rational analysis must be conducted with an understanding that the issues at play are not just Facebook issues. They are part of a complex, highly nuanced set of challenges. Moreover, they are interconnected at a global socioeconomic/geopolitical level, and manifest themselves in the actions and influences of powerful institutions (governments, private enterprises, media entities) and in the stakeholders both in control of and affected by them.

Reveling in the anti-[insert target here] fervor or fanning the flames of discord as some sort of catharsis is misguided, and only contributes to the vicious cycle of cynical oppositionism. No less ineffective is any 11th-hour admonition from on high, rooted in indignant faux-surprise but unsupported by any action, past or present.

Beyond generating fleeting headlines, this does little to advance understanding — let alone approach some semblance of resolution — of the underlying issues that manifest themselves in ways both easily visible and unseen.

The products companies like Facebook, Twitter, and Google have built possess an incredible capacity to influence, amplify, shape, and reinforce people’s beliefs and behaviors at enormous scale. It’s precisely why advertisers pay them tens of billions of dollars a year. Generally speaking, they do what they are paid to do very well.

These products and systems get leveraged for good every day. Without resorting to blind idealism, people do get connected to ideas, experiences, and other people that they might never have encountered. They rally around and support each other in times of tragedy in ways that arguably scale faster and more efficiently than ever before. Millions of businesses grow and thrive by tapping into this infrastructure. People and creativity prosper.

But clearly, these products are also shamelessly exploited for bad, leveraging precisely those same dynamics that make the “good” possible.

The same mechanics that rally groups of like-minded people in support of a community ravaged by natural disaster enable vicious hate groups to propagate and mobilize, or help dictators sow the seeds of division in lockstep with heinous crimes against humanity. Entire nations are reverting to aggressive and unabashed tribalism, often with outside entities arming all opposing sides with disinformation and actively spurring on the division. People and communities suffer.

What are the common threads that enable these two polar opposite outcomes? What individual and systemwide incentives exist — both for the platform and the entities surrounding it — that fuel this seemingly intractable dichotomy of good and bad?

To truly understand, we need to look both at the platforms themselves, and the broader context they operate within.

First, let’s focus on the platform. (In this case Facebook, though it largely applies to places like Twitter and YouTube with minimal variation.)

Many, many years ago, with the seemingly noble and harmless aim of showing you more of what you were interested in, News Feed algorithms stopped showing you everything, from everyone you were friends with or every Page you followed, in chronological order.

Simultaneously, organic reach — when a user who follows a Page sees a piece of content published by that Page without the Page paying for that exposure in the person’s News Feed — dwindled to low single-digit percentages as the monetization engine of Facebook began to kick into high gear.
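As a rough illustration of what “low single-digit” organic reach means in practice, consider the basic metric: the share of a Page’s followers who see a post without paid promotion. The numbers below are hypothetical, purely to make the scale concrete.

```python
# Organic reach: the percentage of a Page's followers who see a post
# without the Page paying for that exposure. All figures are hypothetical.
def organic_reach_pct(unique_organic_viewers: int, followers: int) -> float:
    return 100.0 * unique_organic_viewers / followers

# A Page with 500,000 followers whose post organically reaches 15,000 of them:
print(round(organic_reach_pct(15_000, 500_000), 1))  # -> 3.0
```

At that rate, a publisher hoping to reach the other 97% of its own audience has little choice but to pay for distribution.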

So rather than users simply seeing all the content from all the Pages and people they Like and follow, the data factory that is the Facebook ecosystem collects, analyzes, assigns value to, makes inferences about, and forms profiles about each user: their tastes, their preferences, their likes and dislikes, their clicking/viewing/sharing/buying habits, their frequently visited locations, their strongest relationships, and — at risk of descending into the hyperbole I’m actively decrying — their greatest hopes, dreams, and fears. Armed with that data, the learning engine discovered more about each user and began to introduce more ‘relevant’ content into their feed.

So Formula 1 fans got more videos about Lewis Hamilton’s latest championship, foodies got more kitchen hacks from Tasty and restaurant reviews from Eater, and crafters got more DIY tutorials for the holidays and the everyday.

Publishers, brands, and people who aligned with these interests and created content that captured audiences’ attention, evoked some emotion, and inspired ‘engagement’ were all rewarded with greater reach, higher sharing volume, more active audience participation, and fanbases who stuck around for the long haul. Publishers who tried to make their way into News Feeds with more inert content that didn’t lend itself to that strong emotional response saw their reach diminish as their lack of distinctiveness was detected and penalized by the algorithms that control what gets shown to whom.

People saw more of what they ‘wanted’ to see, and the business grew — fast.

Did it really matter that the sensationalized video of the harmless post-race shoving match in the paddock got more exposure than a thoughtful #longreads dissection of the journey that each racer took on their path to international racing stardom, which might have influenced their stubbornness in not yielding the corner to the challenger on the racing line? Sporting puritanism aside, of course it didn’t matter. Those stakes are indescribably trivial in the grand scheme.

Where it began to get rockier (to politely understate it) was when the interests being catered to and exploited extended beyond the comforting joy of knitting or the glamour of a Grand Prix weekend, and strayed into the realms of politics, government, policy, social structures, economic systems, and the like.

These topics — online and offline — generally lend themselves to much stronger, more vocal, more personal, more identity-defining perspectives. They’re emotional in a very different way. There is so much embedded history, so many decisions that shape the lives of so many people, that the weight these discussions carry — real or perceived — is massive. The stakes are immeasurably higher.

Topics like immigration, religion, healthcare, social safety nets, war, race, gender, and beyond have been supercharged with emotion for as long as they’ve been considered.

On social platforms, where ‘engagement’ is rewarded with eyeballs, and emotional response is highly correlated with engagement, the incentives were as clear as the ultimate resulting outcomes (with some benefit of hindsight).

The simple underpinning: more relevant content leads to more engagement, more attention, more time spent. More scale, more data, more share of mind, more momentum in culture, and, yes, more addressable ad inventory to be transacted on the back of it all.

So thoughtful, rational, nuanced explorations of a complex and multifaceted topic like immigration reform or tax policy — which require meaningful investment of time and effort both to perform and to consume — are overshadowed by the quintuple-bacon-avocado-grilled-cheese video (no harm done), and the increasingly radical talking head who can shout the loudest, most scathing, most soothing, most evocative, and most easily disseminated hot take to a legion of eager acolytes (lots of harm done, as it turns out).

In an auction-based environment like the News Feed where ‘slots’ for content to fill are finite, a choice to serve something substantive, balanced, and nuanced (which is less likely to get the ring-the-cash-register click/view/engagement/share than its extreme, evocative, emotionally-charged counterpart) in service of greater public discourse and general social health becomes much harder to make.
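The incentive structure described above can be sketched as a toy ranking function: score candidate posts by predicted engagement, then fill a finite number of feed slots with the top scorers. This is emphatically not Facebook’s actual algorithm; every field, weight, and example post here is an illustrative assumption.

```python
# Toy sketch of engagement-weighted feed ranking with finite slots.
# NOT Facebook's real algorithm -- all weights and fields are hypothetical.
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    predicted_clicks: float  # modeled probability the user clicks
    predicted_shares: float  # modeled probability the user shares
    predicted_dwell: float   # expected seconds of attention

def engagement_score(post: Post) -> float:
    # Hypothetical weighting: shares count most because they compound
    # reach; dwell time is normalized so it doesn't dominate.
    return (10.0 * post.predicted_clicks
            + 30.0 * post.predicted_shares
            + post.predicted_dwell / 60.0)

def rank_feed(candidates: list[Post], slots: int) -> list[Post]:
    # Finite slots: only the top-scoring posts are ever shown.
    return sorted(candidates, key=engagement_score, reverse=True)[:slots]

candidates = [
    Post("Nuanced 4,000-word immigration policy explainer", 0.02, 0.01, 240.0),
    Post("Outraged hot take with an all-caps headline", 0.30, 0.25, 30.0),
    Post("Quintuple-bacon grilled cheese video", 0.25, 0.20, 45.0),
]
feed = rank_feed(candidates, slots=2)
# The evocative posts take both slots; the explainer never surfaces.
print([p.title for p in feed])
```

Under any scoring function that rewards predicted clicks and shares, the substantive piece loses the slot auction to its emotionally charged counterparts, which is exactly the choice described above.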

Echo chambers, filter bubbles, and aggressively antagonistic, tribalist mentalities have thrived in this environment, and whether partly because of or in spite of it, the business of Facebook has continued to thrive, too.

But Facebook does not operate in isolation. While it may amplify, accelerate, or simply play host to these bad behaviors (alongside lots of good behaviors and outcomes), there are a few key entities — each with their own incentives and disincentives — that play their part in this story.

So let’s look at the broader context in which Facebook and these contributing players all operate.

First, let’s explore the relationship between Facebook and the publishers that occupy the spaces in the News Feed.

Facebook is, as Stratechery’s Ben Thompson coined years ago, an “Aggregator.” This first part of the system analysis draws heavily from the insights that he crystallized as part of his renowned Aggregation Theory.

Briefly, the logic is as follows: People come to Facebook to connect with their friends and family, or to read and watch content from people and publishers they are interested in. Publishers come to Facebook to connect with people; to find an audience. The more publishers and content on Facebook, the better for users, who have more things to read and watch. The more people on Facebook, the more people there are for publishers to count among their audiences. It’s a pure two-sided network effect.

Between those two lies Facebook. Facebook has the ultimate leverage — ownership of the user relationship and all the data that comes from it — and can exert that leverage by controlling access between the two.

Say a fashion publication wants their story to be seen by fashion enthusiasts, who are more likely to click on the article and be exposed to the ads on their website. In an era of low organic reach, that publisher likely needs to pony up, paying Facebook to serve that content in that person’s News Feed.

Say a clothing brand just opened a new location in Downtown LA and wants to promote its exclusive launch lineup to its customers or people they think might be interested in it based on what they read/watch/follow on Facebook? Yep, that’ll cost them.

(As it should, by the way. No one is disputing the validity of this business model, but it is only outlined here in service of understanding the forces at work in this system.)

So when publications — largely supported by advertising dollars — depend on Facebook to drive traffic to their sites, they need to either pay up, or do things that stand out in the feed (or, typically, both). Publications are incentivized to create and disseminate content that evokes emotion, which drives engagement and sharing, which supports the economic underpinnings of both the publisher and of Facebook. Sometimes this lends itself to sensationalism at the expense of depth, completeness, or even truth.

Sensationalism in journalism is a tale as old as time, because it racks up revenue-generating views today as quickly as it once flew papers off the stands. When sensationalism is incentivized and tribal factions thrive off that sensationalism, what used to stay at the fringes seeps into the mainstream. In this environment, sometimes facts matter less than eyeballs.

To some, actual fake news is increasingly indistinguishable from news that simply doesn’t comport with existing belief structures. And hate speech masquerading as opinion (or sometimes even as “fact”) finds a home on the feeds of the innumerable niche communities that often aren’t all that niche in size.

NOTE: Mark Zuckerberg recently published new content governance guidelines in a post on Facebook. A key component is the intended removal of the engagement reward that certain extreme content receives.

Fake news isn’t new. Hate speech isn’t new. But critically, these phenomena and their destructive impacts scale infinitely faster online than off.

In the past, a small number of mass media entities held the dominant (if not exclusive) share of news/information creation and dissemination. This granted them tremendous influence, and they were regulated as such, by laws that were built for the dynamics of the time. Institutional ethics and things like journalistic integrity were widely understood, rewarded, and cherished.

But today, while those mass media organizations exist and still hold credibility and influence, they are met in the marketplace of social feeds by highly fragmented new entrants who are in practice unbound by legacy business models, unburdened with institutional ethical standards, and can spin up and disappear quickly with little consequence. They transact not on hard-earned authority, but on the emotional reward the content they create offers their audiences, all supported by the mechanics of the platforms they populate, and the people on those platforms who have the power to share, create, and amplify on their own.

Lastly, not absent from this entire system is the massive force that is the government. Or rather, the network of governments representing vastly diverse people, policies, and laws. Different social mores, economic structures, censorship guidelines, power dynamics, military controls.

Governments have long been the creators and purveyors of these “rules,” and their capacity to enforce them — legally, technically, ethically — was aligned with the dynamics of the environments of the time.

Governments and their representatives are tasked with upholding the rule of law, appearing to (or actually) intervening when legal or social issues arise, and helping codify each country’s generational evolution into the operations of government and legislation.

But laws often change much more slowly than technologies and cultures do.

And when the lines of a government mandate on something like free speech are less clear than something like safety protocols for commercial airlines or toxic waste management regulations for chemical companies, many Western capitalist cultures hesitate to introduce new, active government intervention in private enterprise.

Facebook, its critics, and its supporters all find themselves at this crossroads now.

Platforms, people, publishers, advertisers, investors, and governments all exert different influences within this connected system. Deconstructing their incentives and dynamics both in isolation and in relation to each other helps us better understand the root causes and articulate more thoughtful proposed solutions without yielding to hyperbole and empty rhetoric.

With this more nuanced understanding of the platform(s) and the broader context of the system, a few foundational questions arise:

How do we universally stamp out the bad without handcuffing or wholly extinguishing the good? Is it even possible to do so, or are we resigned to playing whack-a-mole for every crisis, tallying it up against the positives, and decreeing a universal judgment?

Can publishers, advertisers, or investors who celebrate the system when it’s used for good credibly decry it when the very same system is leveraged for less economically productive or socially responsible means, without recognizing that these are two sides of the same coin?

How do we snuff out the bad without ceding the control of information access or opinion arbitration to a single, private entity? Is this better or worse than welcoming new, active government oversight and policing? Who decides what is bad and what is just distasteful?

These are not new questions. They’ve been asked by independent observers, critics, and supporters for years.

But the wrinkles forming on the edges of these questions don’t make them any easier to answer today than they were in 2016. Or 2014. Or 2008.

I don’t have the answers for how to fix it. Nor do I hold a core belief that it can be “fixed” as cleanly as we’d like — by one new law, one new CEO, one new social contract that we all delightedly enter into after a moment of shared awakening.

But I do know that resorting solely to hyperbolic rhetoric does as little to address the core issues as do the empty threats of the many interconnected parties (advertisers, governments, international regulatory bodies, media entities, etc.) that are unsupported by action — thoughtful or otherwise.

Further, falling back on this crutch and failing to understand the depth and breadth of the dynamics at play reinforces the idea that it’s one company or one Axis of Evil triopoly that holds sole and total responsibility for what’s happening.

Instead, it’s vital not to look in isolation at single pieces of a deeply interwoven system of individuals, groups, governments, and economic and social structures.

If Facebook is the new cigarettes, we don’t have to just be the smokers or the anti-Big Tobacco activists.

This can be our wake-up call, too.

We can choose to acknowledge — to understand — the lobbyists, the legislators, the shareholders, the farmers, the oncologists, the chemical companies, the trial lawyers, the protesters, and the medical associations in this less-than-perfect metaphor. Their motivations, their incentives and disincentives, their impact on their own outcomes and on each other.

Whatever role(s) we choose to play, we can’t put the blinders on to the others.

One can be a #deletefacebook first-mover, a strictly-Instagram-only idealist (ha!), a begrudging blue app holdout, or a fervent user and loyal supporter.

The things that do or don’t show up in our feeds, the effects that these digital dynamics have in the physical world, the rewiring of our collective brains; all this takes place not in the isolated confines of one app, one company, one boardroom.

Believing otherwise is wishful thinking. It’s easy, but it abdicates any shared responsibility — or resorts to comfortably laying blame in one place without acknowledging the contributions of the system, its participants, and the dynamics it yields.

Likewise, recognition of shared responsibility and accountability does absolutely nothing to excuse or deflect from the gross breaches of trust, misguided decision-making, or reprehensible actions of any company in question. Those actions should be evaluated within the context of the system, and punished when necessary.

It simply reinforces that the problems are not easy, and the solutions won’t be either. Governments, NGOs, academics, private enterprises, and citizens will all play a critical role in shaping the policies, guidelines, principles, incentives, and penalties that form the basis of the solution.

Much like the sources of the problems at hand, the solutions are deeply interdependent. Do we meet this complexity with evermore complexity? Is simplicity the ultimate sophistication in this search for a better way? Can we foster an environment where mistakes and incorrect decisions — however egregious — are treated not purely as fodder for the latest takedown, but as opportunities to learn, to evolve, to mature — for the individuals responsible and for those affected by them? Can we permit ourselves to explore the nuance that lives on the boundaries of these challenges, in pursuit of something better? Something more durable?

I don’t have the answers, but count me in for a seat at that table.


Mike Dossett

Global Head of Activation & Content Management @ Google | Fmr. SVP, Digital Strategy @ RPA