
Facebook Admits It’s Hiding News From Sites It Doesn’t Like, Is Boosting CNN And Other Left-Wing Publishers

Menlo Park, CA – Facebook hid news from sites its employees considered “hyperpartisan,” such as The Police Tribune and Breitbart, and instead promoted CNN and other mainstream media outlets in the days that followed the election.

The suppression of content was still in progress at the time this article was written.

The move was made after employees notified Facebook CEO Mark Zuckerberg that “misinformation” about the election was going viral on the social media platform, The New York Times reported.

A team of employees proposed an “emergency change” to Facebook’s news feed algorithm that would change what more than two billion users see on their feeds every day.

The change involved implementing a secret internal ranking based on “news ecosystem quality” (N.E.Q.), a score assigned to news publishers according to what Facebook employees had determined was good journalism, The New York Times reported.

Per that publication:

“Typically, N.E.Q. scores play a minor role in determining what appears on users’ feeds. But several days after the election, Mr. Zuckerberg agreed to increase the weight that Facebook’s algorithm gave to N.E.Q. scores to make sure authoritative news appeared more prominently, said three people with knowledge of the decision, who were not authorized to discuss internal deliberations.”
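In rough terms, the mechanism the Times describes can be sketched in a few lines of code. The following is a simplified illustration only; the score names, numbers, and blending formula are assumptions made for the sake of example, not Facebook’s actual system, which is not public.

```python
# Hypothetical sketch of weighting a publisher-quality score in a
# feed-ranking function. All names, numbers, and the blending formula
# are illustrative assumptions, not Facebook's actual code.

from dataclasses import dataclass

@dataclass
class Post:
    publisher: str
    engagement_score: float  # e.g., predicted likes/comments/shares, 0.0-1.0

# Hypothetical N.E.Q.-style quality scores assigned to publishers (0.0-1.0).
NEQ_SCORES = {"outlet_a": 0.9, "outlet_b": 0.3}

def rank_score(post: Post, neq_weight: float) -> float:
    """Blend engagement with publisher quality.

    Raising neq_weight makes publisher quality dominate the ranking,
    pushing high-scoring outlets up the feed and low-scoring ones down.
    """
    neq = NEQ_SCORES.get(post.publisher, 0.5)  # neutral default if unscored
    return (1 - neq_weight) * post.engagement_score + neq_weight * neq

posts = [Post("outlet_a", 0.2), Post("outlet_b", 0.8)]

# Everyday ranking: quality plays "a minor role" (small weight).
normal = sorted(posts, key=lambda p: rank_score(p, neq_weight=0.1), reverse=True)

# "Break glass" ranking: quality weight increased, which reorders the feed.
emergency = sorted(posts, key=lambda p: rank_score(p, neq_weight=0.7), reverse=True)
```

With the small weight, the high-engagement post from the low-quality outlet ranks first; with the increased weight, the ordering flips, which is the kind of feed-wide reshuffling the Times describes.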

Employees said that the change was part of the “break glass” plan developed by Facebook ahead of Election Day under the assumption the election would be contested.

The end result was an increase in visibility for mainstream media like CNN, The New York Times, and NPR and a dramatic drop in visibility for “hyperpartisan” publications like The Police Tribune and Breitbart.

The Police Tribune Editor-in-Chief Christopher Berg said that the change is affecting non-partisan content as well.

“Not only are our posts not reaching our followers, but I’ve been in contact with numerous other independent publishers who publish non-partisan news and they are affected as well,” Berg said. “Facebook would like to have people believe that an algorithm is involved in a non-biased quality score, but it’s apparent that Facebook has people making an editorial decision about which news can be seen and which can’t.”

“Facebook is deciding to show people biased anti-police content and suppressing content which gives facts and context behind police incidents,” Berg continued. “CNN has been caught editing video to make police incidents look worse, but Facebook is boosting their content while hiding ours. None of this has anything to do with the election.”

A number of Facebook employees have argued the “emergency change” should be made permanent and asked in a meeting a week after the election if the “nicer news feed” could stay, two people who attended that meeting told The New York Times.

But Guy Rosen, who heads Facebook’s integrity division, which is charged with cleaning up the platform, told reporters the changes were only temporary, The New York Times reported.

“There has never been a plan to make these permanent,” Rosen told The New York Times.

The company might roll back some of the changes, but it would still study and learn from them, according to Facebook’s John Hegeman, who is in charge of the news feed.

All of this comes amidst ongoing internal turmoil at the social media giant, The New York Times reported.

The company is split between two factions – those who want to do more to limit “misinformation” and those who think doing so will hurt Facebook’s growth and provoke lawmakers to regulate social media platforms.

“There are tensions in virtually every product decision we make and we’ve developed a companywide framework called ‘Better Decisions’ to ensure we make our decisions accurately, and that our goals are directly connected to delivering the best possible experiences for people,” Facebook spokesman Joe Osborne told The New York Times.

The company’s semi-annual “Pulse Survey” showed that employee morale has tanked, BuzzFeed News reported.

The survey of 49,000 Facebook employees in October showed the platform’s staff have lost faith in the company.

Only 51 percent of employees said they believed Facebook was having a positive impact on the world, BuzzFeed News reported.

That was down from 74 percent who thought Facebook was making the world a better place in May.

The October survey showed that 56 percent of employees had a favorable opinion of the company’s leadership, BuzzFeed reported.

That was a stark contrast to the 76 percent who expressed confidence in May, and 60 percent who had a favorable opinion in 2019.

An employee told BuzzFeed that the spike in May’s confidence rating was likely due to Facebook’s widely-praised handling of the pandemic.

The biggest complaint was that the company based its decisions about hate speech and misinformation on growth metrics.

Some Facebook employees have quit because they didn’t want to work for a company with a harmful product, but others felt they could make more of a difference by staying and collecting huge paychecks, The New York Times reported.

“Facebook salaries are among the highest in tech right now, and when you’re walking home with a giant paycheck every two weeks, you have to tell yourself that it’s for a good cause,” Gregor Hochmuth, a former Instagram engineer who left Facebook in 2014, said. “Otherwise, your job is truly no different from other industries that wreck the planet and pay their employees exorbitantly to help them forget.”

Instagram is owned by Facebook.

But Facebook employees haven’t stayed silent in their disapproval of how the company leadership has handled news throughout the election cycle, The New York Times reported.

A few employees formed a Workplace group in the spring called “Take Action” that quickly grew to more than 1,500 members.

The members changed their profile pictures to the Black Lives Matter fist logo and the group became a place to discuss internal problems and share dark humor about the company, The New York Times reported.

More than once, employees posted a meme of Nazis having a moral epiphany and asking “Are we the baddies?” after a negative story about Facebook ran in the news.

Employees protested Zuckerberg’s decision not to remove President Donald Trump’s “when the looting starts, the shooting starts” post in June by staging a virtual walkout, The New York Times reported.

But in September, Facebook cracked down on the open dissent and changed its policies to say that contentious political debates could no longer be held on open Workplace forums.

The new policy required employees to use their actual pictures or first initials for their profiles rather than the political imagery that had become prominent, The New York Times reported.

In November, Facebook engineers and data scientists shared the findings from a series of experiments they had conducted called “P(Bad for the World).”

Facebook had asked users whether certain posts were “good for the world” or “bad for the world” and learned that most high-reach posts were considered “bad for the world,” The New York Times reported.

So they trained a machine-learning algorithm to predict which posts users would consider “bad for the world.”

Once identified, those posts had their visibility reduced dramatically, resulting in less of the arguably “objectionable” content on news feeds, The New York Times reported.

The result of that change was fewer users opening Facebook, which reduced the number of “sessions” logged, an important metric for the company.

“The results were good except that it led to a decrease in sessions, which motivated us to try a different approach,” a summary of the results posted to Facebook’s internal network and reviewed by The New York Times read.

So the team tweaked the algorithm to demote the “bad for the world” content less strongly, and the number of sessions recovered.
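Taken together, those experiments describe a familiar pipeline: train a classifier on user labels, then scale down the ranking score of posts the classifier flags, with an adjustable demotion strength. The sketch below illustrates that general technique; the toy model, features, and demotion formula are assumptions for illustration, not Facebook’s internal code.

```python
# Illustrative sketch of demoting posts by a predicted P(bad for the world)
# score, with a tunable demotion strength. The logistic-regression model,
# toy features, and multiplier formula are assumptions for illustration only.

import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy training data: feature vectors for posts, labeled 1 if users rated the
# post "bad for the world" in the survey, else 0.
X_train = np.array([[0.9, 0.8], [0.8, 0.9], [0.1, 0.2], [0.2, 0.1]])
y_train = np.array([1, 1, 0, 0])

model = LogisticRegression().fit(X_train, y_train)

def demoted_score(base_score: float, features: list, strength: float) -> float:
    """Scale a post's ranking score down by its predicted 'bad' probability.

    A strength near 1.0 demotes aggressively (the first experiment, which
    cost sessions); a smaller strength demotes "less strongly" (the tweak).
    """
    p_bad = model.predict_proba([features])[0, 1]
    return base_score * (1.0 - strength * p_bad)

# Aggressive demotion vs. the softened version of the same post.
print(demoted_score(1.0, [0.85, 0.85], strength=0.9))
print(demoted_score(1.0, [0.85, 0.85], strength=0.3))
```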

Employees created a number of other tools for cleaning up “misinformation” on the platform that were never implemented, including one called “Correct the Record” that would have notified users who Facebook’s fact-checking partners had determined were sharing false information, The New York Times reported.

Facebook’s fact-checking partners have repeatedly labeled true information as false, and they have a financial incentive to do so because Facebook diverts traffic to the fact checkers.

Facebook also didn’t implement an algorithm to identify and demote “hate bait” posts that inspired hateful comments but didn’t actually violate Facebook’s rules, because the policy team ultimately determined it would only affect right-wing publishers.

Rosen said the “hate bait” tool was never used because it would have unfairly punished publishers for comments made by readers, The New York Times reported.

Publishers whose content was affected are furious about Facebook’s actions and have questioned their constitutionality.

Breitbart News pointed out that “the suppression was not apparently based on post-election content published.”

“The move was tantamount to what in a legal First Amendment context would be considered an illegal prior restraint of speech,” the conservative news giant concluded.

All of the mechanisms used by Facebook and other social media platforms are likely to face continued bipartisan scrutiny by Congress even under a Biden-Harris administration.

In an effort to help deal with the suppression, Blue Lives Matter has created pages on rival social media platforms Parler and MeWe.

Written by
Sandy Malone

Managing Editor - Twitter/@SandyMalone_ - Prior to joining The Police Tribune, Sandy wrote the Politics.Net column for the Wall Street Journal and was managing editor of Campaigns & Elections magazine. More recently, she was an internationally-syndicated columnist for Conde Nast (BRIDES), The Huffington Post, and Monsters and Critics. Sandy is married to a retired police captain and former SWAT commander.
