In the tense days after the presidential election, a team of Facebook employees presented the chief executive, Mark Zuckerberg, with an alarming finding: Election-related misinformation was going viral on the site.
President Trump was already casting the election as rigged, and stories from right-wing media outlets with false and misleading claims about discarded ballots, miscounted votes and skewed tallies were among the most popular news stories on the platform.
In response, the employees proposed an emergency change to the site’s news feed algorithm, which helps determine what more than two billion people see every day. It involved emphasizing the importance of what Facebook calls “news ecosystem quality” scores, or N.E.Q., a secret internal ranking it assigns to news publishers based on signals about the quality of their journalism.
Typically, N.E.Q. scores play a minor role in determining what appears on users’ feeds. But several days after the election, Mr. Zuckerberg agreed to increase the weight that Facebook’s algorithm gave to N.E.Q. scores to make sure authoritative news appeared more prominently, said three people with knowledge of the decision, who were not authorized to discuss internal deliberations.