
Editor’s Note: Ramesh Srinivasan is an associate professor at UCLA, director of the UC Digital Cultures Lab and author of the new book “Whose Global Village? Rethinking How Technology Shapes Our World.” Follow him @rameshmedia. The opinions expressed in this commentary are his.

Story highlights

Ramesh Srinivasan: The Trump team deftly employs Facebook to pander to and maintain its base of support

But Facebook must try to serve a more balanced media diet to all of its users, writes Srinivasan

CNN  — 

Why is Donald Trump consistently able to make false claims while maintaining his base of support? In many ways it’s because Facebook has transformed into a major source of journalism for the American public. Over 65% of Facebook users – 44% of all US adults – access news through the social media giant’s platform. That group greatly outnumbers readers who frequent mainstream news outlets like The New York Times or The Washington Post.

And there is evidence that this platform was, and continues to be, exploited by conservative political advocacy groups and consultants to target users based on data-driven psychological profiles. As a result, many users are fed targeted content instead of being presented with a variety of alternative viewpoints.


Trump’s team, in particular, has taken advantage of the ascendance of Facebook as the platform where Americans go first to get their news. By gathering a great deal of data on eligible American voters, targeting them with ads and taking advantage of Facebook’s algorithms, the Trump machine has been able to significantly influence how information is disseminated through social media.

In other words, Facebook users receive sponsored posts created using data gathered both from Facebook and from data brokers that track and sell our personal information. Cambridge Analytica, a data analytics company that has Trump advisor Steve Bannon on its board, has boasted of its ability to predict voter behavior by correlating Facebook data with other information it purchased from data brokers, given the relatively lax information privacy laws in the United States.

In a statement to Motherboard, a Cambridge Analytica spokesperson denied influencing the electoral outcome. “Cambridge Analytica did not engage in efforts to discourage any Americans from casting their vote in the presidential election,” the company said. “Its efforts were solely directed towards increasing the number of voters in the election.”

While these techniques have been honed by marketers in relation to product campaigns, their application to the domain of national politics marks an important and troubling shift.

For example, according to Bloomberg, in the Little Haiti district of Miami, the Trump campaign used targeted Facebook “dark posts” to sway African American voters. These posts – ads styled like newsfeed items – reminded voters of the Clinton Foundation’s failures in Haiti, as well as Hillary Clinton’s 1996 reference to some black youth as “superpredators.”

The key to these targeted posts was not just the demographic profile of the Facebook user but the choice of which post would most effectively influence that user, based on his or her psychological profile.

And in an interview with Motherboard, Alexander Nix, CEO of Cambridge Analytica, said the company has amassed a large amount of data with which to profile users. “We have profiled the personality of every adult in the United States of America – 220 million people,” Nix said.

A revolution in how we get our news

Our experience of news has therefore been altered. Social media networks and the invisible algorithms that shape them determine the content that is curated and presented to us. The selection of news stories, their framing and the opinions raised about them are then determined by our “friends” on Facebook, who often share our political persuasion.

Far less transparent are the problems with algorithmic curation, where Facebook decides which posts we see and which are left invisible. As I argue in my just-released book, “Whose Global Village?”, when an algorithm reinforces certain political or cultural perspectives while leaving others hidden, it becomes difficult to respect or learn from one another.

The challenges posed by Facebook’s rise as the primary source of news are magnified by the Trump administration’s aggressive first several weeks of political activity. Numerous executive actions around the US-Mexico border, immigration bans and the Keystone and Dakota pipelines bring to the forefront issues that were already divisive.

When the Trump team’s actions are refracted through Facebook, the euphoria or anger users may feel is further magnified. The public is left more divided than ever, inflamed by how these political issues are described and portrayed on their Facebook walls. When the mainstream media attempts to debunk claims such as Trump’s assertion that Obama wiretapped him (which he has since pulled back on) or his argument that he lost the popular vote due to voter fraud, the reporting is quickly dismissed by the administration as partisan or, even worse, in Trump’s own words, as an “enemy of the American people.”

In such a political environment, it becomes even more difficult to understand a position different from our own, or to see a calm, rational way out of the hysteria it produces. It emboldens Trump to the point where he can say “the public doesn’t believe [the news media] anymore … and maybe I had something to do with that.”

Pressuring Facebook to do better

We must do more to fight back. On January 25, Facebook announced that it would modify its trending topics algorithm to promote vetted news stories, as opposed to only promoting articles that are popular within a social network. While this is a step forward, the American public can further pressure Facebook to make some important changes, as well as cultivate strategies to overcome the current crisis.

First, we can ask social media companies to make transparent and comprehensible the filters and choices that go into the most important algorithms that shape interactivity. Here I do not mean publishing proprietary software code, but instead giving users an explanation of how the content they view on Facebook is selected.

More specifically, Facebook can explain that content is chosen based on location, the number of common friends or similarity to previous posts. It can give users a glimpse into how its algorithm chooses among candidate posts when deciding what to make visible in their newsfeeds.
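To make this concrete, here is a minimal, purely hypothetical sketch – not Facebook’s actual ranking code – of how signals like location, mutual friends and topic overlap could be combined into a single score and surfaced back to the user as a plain-language explanation. Every weight, field name and value below is invented for illustration.

```python
# Hypothetical illustration only -- not Facebook's ranking algorithm.
# It shows how a few signals could be weighted into one score, and how the
# breakdown could be shown to users as an explanation of why they saw a post.

def score_post(post, viewer):
    """Return a ranking score and a human-readable breakdown of the signals."""
    signals = {
        # 1.0 if the poster lives in the same city as the viewer, else 0.0
        "same_location": 1.0 if post["author_city"] == viewer["city"] else 0.0,
        # Shared friends, capped so one signal cannot dominate
        "mutual_friends": min(post["mutual_friends"], 50) / 50,
        # Overlap between the post's topics and topics the viewer engages with
        "topic_overlap": len(set(post["topics"]) & set(viewer["interests"]))
                         / max(len(post["topics"]), 1),
    }
    weights = {"same_location": 0.2, "mutual_friends": 0.3, "topic_overlap": 0.5}
    score = sum(weights[name] * value for name, value in signals.items())
    explanation = ", ".join(f"{name}={value:.2f}" for name, value in signals.items())
    return score, explanation


viewer = {"city": "Los Angeles", "interests": ["immigration", "technology"]}
post = {"author_city": "Los Angeles", "mutual_friends": 12,
        "topics": ["technology", "privacy"]}
print(score_post(post, viewer))
# (0.522, 'same_location=1.00, mutual_friends=0.24, topic_overlap=0.50')
```

The point of the sketch is not the particular weights but the explanation string: even a rough, per-post summary of which signals mattered would be far more transparency than users get today.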

Second, users must be given the opportunity to choose between different types of information, whether that means seeing news shared by people outside their social networks or having options for how filters are applied to their Facebook feeds. Such filters would allow users to determine which parts of the world they’d like to see information from and the range of political opinion they want to view.
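As a rough sketch of what such user-chosen filters might look like, the hypothetical settings object below lets a reader specify regions, a range of political viewpoints and whether to include posts from outside their network. No such Facebook feature exists; the field names and the left-right scale are assumptions made for illustration.

```python
# Hypothetical sketch of user-controlled feed filters -- not an existing
# Facebook feature. The user, not the platform, decides which regions and
# which range of political opinion appear in the feed.

from dataclasses import dataclass, field


@dataclass
class FeedFilter:
    regions: list = field(default_factory=lambda: ["worldwide"])
    # Political leaning expressed on an assumed -1.0 (left) to +1.0 (right) scale
    viewpoint_range: tuple = (-1.0, 1.0)
    include_outside_network: bool = True

    def allows(self, post) -> bool:
        """Return True if a post passes the user's own filter settings."""
        in_region = "worldwide" in self.regions or post["region"] in self.regions
        lo, hi = self.viewpoint_range
        in_range = lo <= post["leaning"] <= hi
        in_network = self.include_outside_network or post["from_friend"]
        return in_region and in_range and in_network


# A user who wants posts from Latin America across the full political spectrum,
# including posts from people outside their friend network.
f = FeedFilter(regions=["Latin America"], viewpoint_range=(-1.0, 1.0))
print(f.allows({"region": "Latin America", "leaning": 0.6, "from_friend": False}))  # True
```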

Third, we can get back to a practice that long characterized the web: open-ended browsing. Social media companies can develop tools that allow news credibility to be visualized, enabling users to browse content within and outside of their immediate social network.

For example, Facebook could make posts available from users outside one’s friend network, or provide tools to browse the networks of others, assuming permission is granted. It could even develop interfaces that allow users to look across posts from multiple perspectives, places and cultures around a given topic.
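One simple way to visualize credibility while browsing across networks, sketched below under the assumption that independent, vetted source ratings exist, is to attach a label to each post before it is displayed. The ratings table and thresholds here are invented for illustration, not drawn from any real rating service.

```python
# Hypothetical sketch of labeling posts with a source-credibility badge so
# users can browse beyond their own network with context. The ratings are
# invented; in practice they would come from independent, vetted raters.

SOURCE_RATINGS = {  # assumed, illustrative ratings on a 0-1 scale
    "example-wire-service.com": 0.9,
    "example-partisan-blog.net": 0.3,
}

def label_post(post):
    """Attach a plain-language credibility badge to a post based on its source."""
    rating = SOURCE_RATINGS.get(post["source"])
    if rating is None:
        badge = "unrated source"
    elif rating >= 0.7:
        badge = "high-credibility source"
    elif rating >= 0.4:
        badge = "mixed-credibility source"
    else:
        badge = "low-credibility source"
    return {**post, "credibility_badge": badge}


print(label_post({"source": "example-partisan-blog.net", "headline": "..."}))
# {'source': 'example-partisan-blog.net', 'headline': '...',
#  'credibility_badge': 'low-credibility source'}
```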

At the same time, we must continue to think about the long game – how to develop political literacy in our educational and social systems. This would involve learning how to assess, evaluate and recognize the difference in credibility across information sources. It requires not viewing any piece of information, whether presented on Facebook or through a traditional news outlet, as infallible, but instead scrutinizing the ways in which that story was framed and the agendas it serves.

Opening up the black boxes

While the largest news networks in our world are private corporations, they should be held accountable to the public. We have a right to demand fair and honest reporting. And in the current media landscape, we must pressure our news networks to tell stories that respect the perspectives of the diverse communities they report on.


Now as Facebook, with nearly 1.8 billion global users, has arrived as the major gateway to our news, it too must be held accountable to the public – rather than just its shareholders. Facebook CEO Mark Zuckerberg has expressed concern over Trump’s executive orders, and justifiably so. Yet he must understand that his company is not only a private entity, but also a major public source of news and should therefore consider its role in Trump’s rise to power.

We have no choice but to rein in our blind trust of the social media platforms that increasingly define the internet. To do so, we must not only pressure these companies to open up the black boxes by which they unite and divide us, but also recognize that Facebook, as it stands, divides us rather than unites us, politically and culturally.