Facebook Will Reduce Political Content

Facebook, the social media conglomerate, operates the most popular social network in the world. The original Facebook app has been around for a long time, and people love staying in touch with friends and family through the platform.
Over the years, however, it has also become a major destination for entertainment, and businesses find the platform useful for communicating and interacting with their target customers. But what kind of content do people actually want to see? Evidently not politics.
In an updated blog post, Facebook wrote: “We’ve seen positive results from our tests to address the feedback from people about wanting to see less political content in their News Feed. As a result, we plan to expand these tests to Costa Rica, Sweden, Spain, and Ireland.”
Facebook is expanding an experiment in reducing political content in the News Feed. In an update to a February blog post, the company says it’s seen “positive results” in reducing this content for some users in a handful of countries. Now, it’s expanding a test of the strategy to Costa Rica, Sweden, Spain, and Ireland.
Axios reported on Facebook’s plans before the announcement. As it notes, Facebook’s new test also involves changing the signals it favors when promoting content. “Some engagement signals can better indicate what posts people find more valuable than others,” product management director Aastha Gupta writes. “Based on that feedback, we’re gradually expanding some tests to put less emphasis on signals such as how likely someone is to comment on or share political content.”
Conversely, Facebook will more heavily weigh signals like “how likely people are to provide us with negative feedback on posts about political topics and current events.” The company acknowledges that this could affect “public affairs content” and decrease traffic to news publishers. It’s planning a “gradual and methodical rollout” of these tests over the coming months.
Facebook started its politics-reduction tests in February for some users in Canada, Brazil, the US, and Indonesia.
It later announced that it would emphasize “inspiring and uplifting” posts and provide more avenues for people to explicitly indicate what they don’t like — rather than having Facebook infer it from their usage patterns.
The Bottom Line
Facebook stated that some engagement signals better indicate which posts people find more valuable than others. Based on that feedback, the company is expanding some of its tests involving these engagement signals.
These tests place less emphasis on how likely users are to share or comment on political content. “At the same time, we pay more attention to new signals, such as how likely it is that people will provide us with negative feedback on political topics and current events when we rank these types of posts in news feeds,” Facebook wrote in the updated blog post.
If the algorithm can effectively detect and downplay political content, these changes may significantly reduce how much of it users see on Facebook. Political campaigns may also have to reconsider their strategies for reaching voters.
On the other hand, given the traffic that Facebook can drive to news sites, the move may deal a blow to news organizations, especially those that focus on politics.
The shift may also make Facebook a less hostile place for users. Political discussions can quickly become heated, which may discourage people who use the service primarily to keep in touch with loved ones and share pictures of their children. For what it’s worth, Facebook says political content accounts for only 6% of what users see.
“We understand that these changes will affect public affairs content more broadly, and publishers may see their traffic affected,” Facebook said. “Knowing this, we are planning to roll out these tests gradually and methodically, but are still encouraged and expect to announce further expansions in the coming months.”
These changes follow other measures Facebook has taken to reduce the visibility of political content. For example, earlier this year it stopped recommending civic and political groups to users, making permanent a restriction it first introduced ahead of the 2020 U.S. election.