The human touch on Facebook's algorithms
May 19, 2016
By Gene Policinski
Inside the First Amendment
Forgive me for a little old-fashioned smirking when following the digital-era dilemma of Facebook having to own up to some human involvement in its tidy, algorithmic universe.
Millennials and others were outraged — outraged! — at the recent disclosure that the internet social media giant's "trending topics" report may have had more than a smidge of real people decision-making involved in the daily determination of what's hot in posted news.
On May 9, web tech blog Gizmodo carried a report citing an anonymous former contractor who claimed that while he worked on the "topics" report, he and colleagues were directed to regularly insert liberal topics into the report while suppressing conservative subjects.
News flash (if I may use that no-doubt-dated term): Not everything on the web is true, unbiased or selected by a soulless mathematical computer-guru with your best interests at whatever passes for its content-neutral mechanical heart — and the same goes for things you don't get to read.
None of this is to minimize the real concerns about potentially hidden bias built into a source of news for something like 40 percent of U.S. adults, used daily by some 1 billion people worldwide. Those concerns were serious enough to cause Facebook founder Mark Zuckerberg to convene a meeting with a group of conservatives — and even to attract attention from a Congressional committee.
And for First Amendment purposes, let's up the controversy-ante a bit and inject the future of journalism and the news media into the mix.
The popularity of the social media empire that is Facebook rests in no small part on a self-proclaimed goal of being a mere provider of information directly to its users, without the "traditional media" structure that an increasingly skeptical group of our fellow citizens believes has baked its own biases into news reports.
Ah, but while Facebook said in a statement last week that it "does not allow or advise our reviewers to discriminate against sources of any political origin, period," a number of unchallenged news reports said Facebook concedes that its algorithms are not the only way trends are determined. Staffers — called curators — can "inject" or "blacklist" topics for certain reasons, including duplication, or if a story was popular but erroneous.
So, "curators" can use good judgment and knowledge of the breadth and details of the day's stream of newsworthy events to shape a report, consumed by others, about the developments of the day. Sounds familiar, doesn't it? Substitute "editor" or "news director" or "media mogul" for that "curator" newspeak and the same definition applies to those making editorial decisions in newspapers and electronic media.
Social media experts have been theorizing for some time about the need for Facebook — as it (gasp!) ages — to replace users who no longer consider it cutting-edge with those who use it to send and receive information of value.
A recent report in Mashable, an online site about media and entertainment, says Facebook is devoting "tremendous resources" to attract newsy video posts, celebrity items, news articles and sports event streams, and is even "considering a branded morning show." If it did all that on paper or via the airwaves, we'd call it a newspaper or a network. But let's just note that Facebook drives up to 20 percent of daily visitors ("traffic" in digital nomenclature) to traditional and new media news sites.
And that goes to the heart of the matter — and why Facebook's founder is right to get out in front of the issues of bias and trust. For some time, I have considered it obvious that news operations in the 21st century must face up to the idea that there is only one big "thing" on which their survival will depend: credibility.
Financial problems for "old media" rooted in a loss of amazingly lucrative advertising and relatively easy-to-get circulation? Facebook's balance sheet shows you can get nearly $18 billion in revenue (2015) by delivering news and information in a newer format that people like.
Lose ads and "circ" and what's left? Content. And what have people demanded from news media throughout history — content on which they can depend? Credible news. That requires a whole new level of transparency, ethics and acceptance of the responsibilities that earlier forms of news media have embraced, including acting as a "watchdog on government."
So I don't mind, Facebook, if you have decided that some fluffy piece of click-bait, which may or may not be true, ought not to push real news off the "trending topics" list, and that you do it via some real person's hand rather than by deus ex machina.
As it happens, good journalism throughout the years has had to balance the sensational and silly against the important and needed-to-know. Not something to be automated, at least not yet. And even algorithms can be biased, reflecting in their selections the values and measurements built in by their creators.
Face up to it, Facebook. You have added "news media" to your moniker of "king of social media."
Hey, that's news worthy of anybody's "trending topics."
Gene Policinski is chief operating officer of the Newseum Institute and senior vice president of the Institute’s First Amendment Center. He can be reached at firstname.lastname@example.org. Follow him on Twitter: @genefac.