The purpose of this algorithm change is to limit the exploitation of Facebook by political groups posing as news outlets to profit inappropriately.
Mark Zuckerberg announced the change to the news feed algorithm in a personal post on Facebook, explaining that if content violating Facebook's rules is left unchecked, it creates a "basic incentive problem". According to Zuckerberg, "people will engage disproportionately with sensational and provocative content".
"On Facebook, we are obliged to protect users against terrorism, abuse or any other attack," wrote Zuckerberg.
Over the past two years, Zuckerberg has acknowledged that some users are abusing Facebook to influence elections, spread fake news, and push propaganda. "When you connect two billion people (via Facebook), you will see all the beauty and ugliness of humanity," he said.
Facebook has stepped up its efforts to curb negative content in recent months. In April 2018, Facebook issued internal guidelines on removing content that violates its rules.
There are at least 18 types of content banned by Facebook, such as images showing exposed internal organs or burned human bodies. Facebook also formed a special team working in 10 offices across six different countries, which, according to Zuckerberg, is meant to "reflect differences in norms and cultures".
In addition to forming teams, Facebook has also developed artificial intelligence capable of curbing negative content. At present, this AI work is aimed at detecting terrorism-related content. According to Facebook, the AI is able to remove 99% of the content flagged as terrorism.
This is not the first time Facebook has made such changes, particularly those targeting negative content. In January 2015, Facebook changed the news feed algorithm specifically to stop hoaxes. Then, in April 2016, Facebook banned clickbait from its news feed. Finally, since November 2016, Facebook no longer allows negative ads.
Children's photos on social media are indeed one of the problems of the digital era. The adoption of digital culture has not necessarily been accompanied by judicious use of devices, including when it comes to uploading photos of children, especially nude photos.
On Wednesday, October 24, 2018, Facebook announced that, with the help of software, it had removed 8.7 million photos posted by its users that displayed vulgar images of children.
As reported by Reuters, the software automatically flags such photos. The tool, launched last year, identifies images of naked children. In addition, Facebook has developed the system to catch users who sexually exploit minors.