14 November 2016

Fake news from Facebook

Michael Nunez has an article in Gizmodo about problems at Facebook:

It’s no secret that Facebook has a fake news problem. Critics have accused the social network of allowing false and hoax news stories to run rampant, with some suggesting that Facebook contributed to Donald Trump’s election by letting hyper-partisan websites spread false and misleading information. Mark Zuckerberg (photo, looking twelve) has addressed the issue twice since Election Day, most notably in a carefully worded statement that reads: “Of all the content on Facebook, more than 99 percent of what people see is authentic. Only a very small amount is fake news and hoaxes. The hoaxes that do exist are not limited to one partisan view, or even to politics.”
Still, it’s hard to visit Facebook without seeing phony headlines like “FBI Agent Suspected in Hillary Email Leaks Found Dead in Apparent Murder-Suicide” or “Pope Francis Shocks World, Endorses Donald Trump for President, Releases Statement,” promoted by no-name news sites like the Denver Guardian and Ending The Fed.
Gizmodo has learned that the company is, in fact, concerned about the issue, and has been having a high-level internal debate since May of 2016 about how the network approaches its role as the largest news distributor in the US. The debate includes questions over whether the social network has a duty to prevent misinformation from spreading to the 44 percent of Americans who get their news from the social network.
According to two sources with direct knowledge of the company’s decision-making, Facebook executives conducted a wide-ranging review of products and policies earlier this year, with the goal of eliminating any appearance of political bias. One source said high-ranking officials were briefed on a planned News Feed update that would have identified fake or hoax news stories, but disproportionately impacted right-wing news sites by downgrading or removing that content from people’s feeds. According to the source, the update was shelved and never released to the public. It’s unclear if the update had other deficiencies that caused it to be scrubbed.
“They absolutely have the tools to shut down fake news,” said the source, who asked to remain anonymous, citing fear of retribution from the company. The source added that “there was a lot of fear about upsetting conservatives after Trending Topics,” and that “a lot of product decisions got caught up in that.”
In an emailed statement, Facebook did not answer Gizmodo’s direct questions about whether the company built a News Feed update that was capable of identifying fake or hoax news stories, nor whether such an update would disproportionately impact right-wing or conservative-leaning sites. Instead, Facebook said it “did not build and withhold any News Feed changes based on their potential impact on any one political party.” The full statement:
We did not build and withhold any News Feed changes based on their potential impact on any one political party. We always work to make News Feed more meaningful and informative, and that includes examining the quality and accuracy of items shared, such as clickbait, spam, and hoaxes. Zuckerberg himself said that “I want to do everything I can to make sure our teams uphold the integrity of our products.” This includes continuously reviewing updates to make sure we are not exhibiting unconscious bias.
A New York Times report cited conversations with current Facebook employees and stated that “The Trending Topics episode paralyzed Facebook’s willingness to make any serious changes to its products that might compromise the perception of its objectivity.” Our sources echoed the same sentiment, with one saying Facebook had an “internal culture of fear” following the Trending Topics episode.
The sources are referring to a controversy that started in May of 2016, when Gizmodo published a story in which former Facebook workers revealed that the trending news module was run by human “curators” and guided by their editorial judgments, rather than being populated by an algorithm, as the company had earlier claimed. One former curator said that they routinely observed colleagues suppressing stories on conservative topics. Facebook denied the allegations, then later fired its entire trending news team. The layoffs were followed by several high-profile blunders, in which the company allowed fake news stories (or hoaxes) to trend on the website. One such story said that Fox News fired Megyn Kelly for being “a closet liberal who actually wants Hillary to win.”
After Gizmodo’s stories were published, Facebook vehemently fought the notion that it was hostile to conservative views. In May of 2016, Zuckerberg invited several high-profile conservatives to a meeting at Facebook’s campus, and said he planned to keep “inviting leading conservatives and people from across the political spectrum to talk with me about this and share their points of view.” Joel Kaplan, Facebook’s vice president of global public policy, emphasized in a post that Facebook was “a home for all voices, including conservatives.”
“There was a lot of regrouping,” the source told Gizmodo, “and I think that it was the first time the company felt its role in the media challenged.”
As Facebook scrambled to do damage control, the company continued to roll out changes to News Feed, which weighs thousands of factors to determine which stories users see most frequently. In June of 2016, the company rolled out several updates to prioritize posts from friends and family and to downgrade spam. But according to one source, a third update, one that would have down-ranked fake news and hoax stories in the News Feed, was never publicly released.
Facebook has addressed its hoax problem before. In a January 2015 update, the company promised to show fewer fake news stories by giving users a tool to self-report fake stories in their feeds. It wrote:
The strength of our community depends on authentic communication. The feedback we’ve gotten tells us that authentic stories are the ones that resonate most. That’s why we work hard to understand what type of stories and posts people consider genuine so we can show more of them in News Feed. And we work to understand what kinds of stories people find misleading, sensational, and spammy, to make sure people see those less.
Facebook’s efforts have had mixed results. Earlier this year, BuzzFeed News studied thousands of fake news posts published on Facebook and found that, while the average engagement on fake posts fell considerably from January 2015 to December 2015, the reach of fake posts skyrocketed in 2016, during the lead-up to the presidential election. (A Facebook spokesperson told BuzzFeed that “we have seen a decline in shares on most hoax sites and posts,” but declined to produce specific numbers.) Another BuzzFeed investigation this fall found that a group of young Macedonian publishers was running huge networks of popular Facebook pages that pushed fake conservative news, targeted at Trump supporters in the US, from websites such as TrumpVision365.com, USConservativeToday.com, and USADailyPolitics.com.
“We can’t read everything and check everything,” Adam Mosseri, the head of Facebook’s News Feed, said in an August 2016 interview with TechCrunch. “So what we’ve done is we’ve allowed people to mark things as false. We rely heavily on the community to report content.”
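What Mosseri describes is a crowd-sourced signal: readers flag a post as false, and posts that accumulate enough flags get pushed down in the feed. As a rough illustration of that general idea only (not Facebook’s actual system; the post fields, the report threshold, and the penalty formula below are all assumptions made for the example), a minimal Python sketch might look like this:

# Minimal sketch of community-report-based down-ranking, as described above:
# users flag posts as false, and heavily flagged posts are pushed down in the
# feed. Every name, field, and threshold here is a hypothetical illustration,
# not Facebook's actual implementation.

from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    base_score: float    # score from the usual ranking signals
    impressions: int     # how many people saw the post
    false_reports: int   # how many viewers marked it "false"

def hoax_penalty(post: Post, report_threshold: float = 0.01) -> float:
    """Return a multiplier in (0, 1] that shrinks as the report rate grows."""
    if post.impressions == 0:
        return 1.0
    report_rate = post.false_reports / post.impressions
    if report_rate < report_threshold:
        return 1.0  # too few reports to act on
    # Penalize in proportion to how far the report rate exceeds the threshold.
    return max(0.1, 1.0 - 10.0 * (report_rate - report_threshold))

def rank_feed(posts):
    """Order posts by base score after applying the hoax penalty."""
    return sorted(posts, key=lambda p: p.base_score * hoax_penalty(p), reverse=True)

# Example: a heavily reported viral post drops below a modest, unreported one.
feed = [
    Post("viral-hoax", base_score=9.0, impressions=10_000, false_reports=900),
    Post("local-news", base_score=6.0, impressions=2_000, false_reports=3),
]
for post in rank_feed(feed):
    print(post.post_id, round(post.base_score * hoax_penalty(post), 2))

The obvious limitation of a signal like this is that it only acts after a hoax has already been widely seen and flagged, which is consistent with the mixed results described above.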
In a Facebook post published after the election, former Facebook product designer Bobby Goodlatte blamed the social network for boosting the visibility of “highly partisan, fact-light media” and for not taking bigger steps to combat the spread of fake news in the lead-up to the election. “A bias towards truth isn’t an impossible goal for News Feed,” he wrote. “But it’s now clear that democracy suffers if our news feeds incentivize bullshit.”
Rico says he's never done much with Facebook, and his fiancée is bailing on it (too much ugly politics since Trump got elected), so Facebook may be in trouble; if Zuckerberg starts losing money, he may not be going to Mars... (But isn't 'incentivizing bullshit' what wins elections?)
