Data released by Facebook showed that during one week in October, seven of the 10 most-engaged pages on the platform were primarily political, including those of President Donald J. Trump, Fox News, Breitbart and Occupy Democrats.
Three years ago, Facebook said it would pull back on the amount of content that news publishers and brands posted to the site, an overhaul it said would focus the service more on interactions between friends and family. At the time, Mr. Zuckerberg said he wanted to make sure that Facebook's products were "not only fun, but good for people." He also said the company would take those actions even if it meant hurting the bottom line.
Nevertheless, Facebook users have had no problem finding political content. Nonprofit organizations and political action committees paid to show highly targeted political advertisements to millions of Americans in the months before the November presidential election. Users formed large numbers of private groups to discuss campaign issues, organize protests and support candidates. And until recently, Facebook's own systems often suggested new political groups for users to join.
In recent months, Facebook has pulled some of that back. After polls closed on Election Day, the company turned off the ability to buy new political advertisements. Following the deadly Capitol riot on Jan. 6, Mr. Zuckerberg said the company would stop recommending political groups in order to "turn down the temperature" on divisive national conversations.
Under the new test, a machine-learning model will predict the probability that a post, whether it comes from a major news organization, a political pundit, or a friend or relative, is political. Posts deemed political will appear less frequently in users' feeds.
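The mechanism described above, a classifier score used to demote posts in a ranked feed, can be sketched in miniature. This is a hypothetical illustration, not Facebook's actual system: the function names, the 0.8 probability threshold, and the demotion factor are all assumptions made up for the example.

```python
def downrank_political(posts, classify, demotion=0.5, threshold=0.8):
    """Lower the ranking score of posts a classifier deems political.

    posts: list of (post_id, base_score) tuples
    classify: callable returning P(post is political) in [0, 1]
    """
    ranked = []
    for post_id, base_score in posts:
        p_political = classify(post_id)
        # Demote posts whose predicted probability crosses the threshold.
        score = base_score * demotion if p_political >= threshold else base_score
        ranked.append((post_id, score))
    # The feed shows highest-scoring posts first.
    return sorted(ranked, key=lambda item: item[1], reverse=True)

# Toy usage with a stub classifier standing in for the real model.
probs = {"news_post": 0.9, "family_photo": 0.1}
feed = downrank_political(
    [("news_post", 1.0), ("family_photo", 0.6)],
    classify=lambda pid: probs[pid],
)
```

In this toy run the political post's score drops from 1.0 to 0.5, so the family photo ranks above it, which is the qualitative effect the test aims for: political content still appears, just less prominently.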
It is unclear how Facebook's algorithm will define political content, or how much the change will affect people's feeds. Lauren Svensson, a Facebook spokeswoman, said the company would refine the model during the test period to better identify political content, and would then decide whether to use the approach over the longer term.
It is also unclear what Facebook will do if its tests determine that reducing political content also reduces people's use of the site. In the past, the company has shelved or modified algorithm changes intended to reduce the amount of misleading and divisive content people saw after determining that the changes caused people to open Facebook less frequently.