ADDED 05/11/2018

How Facebook fired workers who blocked ‘fake news’ — ‘After the Fact’ book excerpt

FROM 05/06/2018 | USA Today

BY Nathan Bomey


EDITOR’S NOTE: This excerpt is adapted from USA TODAY reporter Nathan Bomey’s new book, After the Fact: The Erosion of Truth and the Inevitable Rise of Donald Trump, a nonpartisan analysis exploring society’s increasingly tenuous commitment to the facts. Printed with permission from Prometheus Books.

Adam Schrader arrived for his secret job at Facebook one Friday morning in late August 2016 without realizing that his hours there were numbered.

After taking the elevator to the seventh floor of the social media giant’s gleaming office in lower Manhattan, the former Dallas Morning News community publication editor strode past inspirational posters and white desks, then past TVs that blared the latest news from the presidential campaigns of Donald Trump and Hillary Clinton.

Having completed his daily trek to his workspace in the former Wanamaker department store overlooking Broadway, Schrader took his place among a crew of contractors who had been sworn to secrecy over the existence of their jobs.

Their role? To authenticate and select “trending” news stories for display to hundreds of millions of Facebook users who encountered the items whenever they logged on. The team of about 25 news curators primarily consisted of journalists skilled in the art of determining source credibility, ascertaining truth and applying news judgment.

They had an early window into Facebook’s grisly underworld — a vortex of fabricated news stories and highly distorted partisan content.

“We had a backside view of what people are talking about outside of our own Facebook bubbles,” Schrader said.

Every day, the curators sought to contain the inferno of deception by dousing the blaze with journalistic sensibility, ensuring false and extremely skewed stories did not reach the trending topics list for the entire nation to see.

Fearnow, another member of the curation team, said the curators shielded the public from a “bombardment of fake news.”

Plans to hide the curators imploded in early May 2016 when tech blog Gizmodo published stories revealing the secret team’s existence and alleging that the curators “routinely suppressed news stories of interest to conservative readers.”

The claim drew intense criticism from conservatives, prompting U.S. Senator John Thune (R-SD) to fire off a letter demanding accountability.

Facebook executives panicked. The accusations threatened their priority of keeping people of all political persuasions clicking, sharing, posting, scrolling, and commenting. Any perception that Facebook was biased could have undermined the company’s carefully manicured reputation for political neutrality.

It was particularly ironic that the news curators were battling deception in a complex that once housed John Wanamaker’s department store. Wanamaker is credited with having printed an advertisement in 1874 that introduced the concept of “truth in advertising,” which “earned him the public’s trust,” according to PBS.

But the retail legend’s heritage as a purveyor of truth was not top of mind when the news curators received an ominous message at about 4 p.m. Aug. 26, 2016.

“There’s an email saying, ‘Hey we’re going to have a meeting in the boardroom upstairs,’” Schrader said.

Once they had gathered, the curators were summarily informed that they had all been fired. “They literally had security escort us out of the building almost immediately after we were told that we were let go,” Schrader said.

Although the political consternation over the trending-news team had died down, Facebook had decided in the weeks after the controversy that it was ready to hand the verification process over to technology, as it had long planned.

The move reflected a bet on technology over human journalists. Yet two months earlier CEO Mark Zuckerberg had publicly acknowledged that the company’s artificial intelligence capability remained limited.

“There are millions of posts, right, that are available for you that people share publicly every day,” he told investors in June 2016. “And right now, we don’t understand the meaning of those posts.”

The bloodletting of the curators brought into full view the sharp cultural conflict between Silicon Valley and the news industry. Whereas professional, mainstream journalists have historically taken responsibility for selecting, reporting, and presenting accurate information to the public — at least in the post–Vietnam War era — Facebook has delegated that responsibility to powerful algorithms and sterile digital platforms, retaining scarcely any hands-on role in discriminating based on legitimacy, meaning, quality, and impact.

Facebook did not agree to interviews for this book after multiple requests.

Facebook has repeatedly pledged since the election to combat bogus stories and play a more active role in promoting authentic content — with executives suddenly acknowledging that the company could no longer sit on the sidelines. But the former curators were skeptical that much would change in the long run.

“People cannot function without Facebook. I’ve even tried,” Schrader said. “I’ve tried deleting my Facebook account several times over the years. It’s just— in this day and age, until Facebook gets replaced by something else—it is the most efficient way to communicate with businesses, with news publishers, with your friends, with your family. And Facebook knows that they don’t really have to change. They’re so big now that nothing can touch them. They’re untouchable. So it’s easier for them to lay off 25 journalists than it is to acknowledge the problem and move forward appropriately.”

Facebook’s dismissal of the news curators — although limited in its long-lasting effects — was a microcosm of a much broader issue. It symbolized a wholesale shift that was already well underway across society—a transfer of trust from news professionals to secretive algorithms and ideologically motivated groups. The resulting upheaval has gravely undermined our collective grasp of reality.

“What I thought was weird was that people were more up in arms about this ‘group of liberal journalists’ dictating their stories than they were about four engineers in a room in Menlo Park who are creating this algorithm that is inevitably deciding what you would see—what anyone would see,” Fearnow said, referring to the California town where Facebook is headquartered.

“How is that not scarier—that these unnamed engineers at Facebook are the ones who are the gatekeepers to how this works? The culture at Facebook is, the engineers there are like the editors. They’re like God—because no one really knows what . . . they do.”
