Wednesday, May 27, 2020


TECH



Mark Zuckerberg
Facebook knew that its algorithm would promote extremism and did nothing

Facebook does not have the best track record when it comes to the legitimacy of information or fairness in promoting content. Proof of this is the number of fake-news posts that gain traction on the social network, or the extremist content that surfaces without any defense mechanism for users.
According to a report in the Wall Street Journal, however, Mark Zuckerberg's company knew about at least one of these two problems in advance. Citing a presentation shown only to select employees (and since discarded, never making it into official communication about Facebook's products), the newspaper argues that the company's leadership was aware of its algorithm's potential to promote divisive content and amplify polarization.
“Our algorithms exploit the human brain's attraction to divisiveness. If left unchecked, Facebook will feed users more and more polarizing content in an effort to gain user attention and increase time [spent] on the platform,” reads one slide of the presentation that, according to the Wall Street Journal, was summarily discarded and its warnings ignored.
The newspaper also notes that Facebook's chief policy officer, Joel Kaplan, believed at the time that the changes proposed for the new algorithm would have disproportionately affected conservative users and publications, though it is not explained in what way (whether by suppressing them or promoting them, for example).


Responding to the story, Facebook issued a statement:
“We have learned a lot since 2016 and we are not the same company today. We have built a robust team focused on integrity, reinforced our policies and practices to limit harmful content, and used research to understand our platform's impact on society so that we can improve.”
In other words, Facebook did not deny the information reported by the Wall Street Journal. The American newspaper stressed, however, that even before the social network created this “integrity team”, a company researcher named Monica Lee found in 2016 that “64% of all extremist group joins are due to our recommendation tools”. There was even an attempt to adjust the algorithm to curb this behavior, but the idea was reportedly shelved for being “anti-growth”.
Recently, the company headed by Mark Zuckerberg appointed an oversight board that targets precisely the promotion of polarizing and extremist content. The Brazilian Ronaldo Lemos, a lawyer and director of the Institute for Technology and Society of Rio de Janeiro, is part of that group.


by Rafael Arbulu

