Friday, May 14, 2021

Facebook Moves to Limit Hate Speech as Times Scandal Swirls



Mark Zuckerberg would like you to know that, despite a stunning New York Times report portraying Facebook as a ruthless and self-interested corporate behemoth, things are getting better – at least as he sees it.

In a lengthy call with reporters on Thursday, and an equally lengthy "note" posted to Facebook, the company's chief executive laid out a series of changes Facebook has made, designed to curb toxic content on the platform and provide more transparency around content decisions. Perhaps most notable is a change to Facebook's News Feed algorithm, which will now attempt to limit the spread of sensationalist content on the platform – a major shift from how the social network has traditionally approached moderation. All of it is in the service of restoring trust in a company whose reputation – and that of its leaders – has taken a sustained beating over the past two years.

"When you have setbacks like we've had this year, that's a big issue, and it does erode trust, and it takes time to build that back," Zuckerberg said on the call. "Certainly our job is not only to have this stuff at a good level and to continually improve, but to be ahead of new issues. I think over the last couple of years, that's been one of the areas where we've been most behind, especially around election issues."

Those words came a day after the Times published a damning report alleging that Facebook not only lagged on election interference, as Zuckerberg suggested, but actively worked to downplay what it knew about that meddling. The story claimed that Facebook's leaders, wary of picking sides in the partisan fight over Russian interference in the 2016 election, sought to minimize Russia's role in spreading propaganda on the platform. It also reported that former security chief Alex Stamos was rebuked by chief operating officer Sheryl Sandberg for investigating Russian activity without the company's approval, and chastised again for disclosing too much about it to members of Facebook's board.

In his remarks, Zuckerberg rejected that characterization. "We've certainly stumbled along the way, but to say that we weren't interested in knowing the truth, or that we wanted to hide what we knew, or that we tried to prevent investigations, is simply untrue," he said. (Stamos, for his part, tweeted earlier on Thursday that he "never told Mark, Sheryl, or any other executives not to investigate.")

The Times story also reported that Facebook waged a campaign against its critics through an opposition-research firm called Definers Public Relations. The firm worked to tie Facebook's critics, including groups such as the Open Markets Institute and Freedom from Facebook, to billionaire George Soros. Critics say that in doing so, Facebook trafficked in the same anti-Semitic tropes used by white nationalists and other hate groups that regularly vilify Soros.

Zuckerberg denied having any personal knowledge of Definers' work for Facebook, adding that he and Sandberg only learned of the relationship on Wednesday. That is despite the fact that Definers often coordinated large-scale press calls on behalf of Facebook and its employees, and in at least one case sat in on meetings between Facebook and the media.

Zuckerberg said that as soon as he read the Times story, Facebook ended its relationship with the firm. "This type of firm might be normal in Washington, but it's not the kind of thing I want Facebook to be associated with, which is why we will no longer be working with them."

While claiming ignorance of Definers' work and its messaging, the chief executive defended Facebook's criticism of activist groups such as Freedom from Facebook. The intention was not to attack Soros, for whom Zuckerberg said he has "a lot of respect," but to show that Freedom from Facebook "was not a spontaneous grassroots effort."

Zuckerberg declined to assign blame for the tactics Definers allegedly employed, or to comment on broader questions about personnel within Facebook itself. He would only say that Sandberg – who oversees Facebook's lobbying efforts, and who comes off poorly throughout the Times story – is "doing a great job for the company." "She's been an important partner to me, and she continues to be, and will continue to be," Zuckerberg added. (Sandberg was not on the call.)

As he has repeatedly this year, Zuckerberg worked overtime to clean up Facebook's messes, even as he eagerly touted the progress the company has made. In Myanmar, where fake news on Facebook has fueled a brutal ethnic cleansing campaign against the Rohingya people, the company has hired 100 Burmese speakers to moderate content and now automatically identifies 63 percent of the hate speech it takes down, up from just 13 percent late last year. Facebook has expanded its safety and security team to 30,000 people worldwide, more than the 20,000 the company had set out to employ this year. It has also changed its content-removal process, allowing people to appeal the company's decisions about content they post or report. And on Thursday, Facebook announced that it will create an independent oversight body to handle content appeals in the coming year.

But by far the biggest news to emerge on Thursday was the change to Facebook's News Feed algorithm. Zuckerberg acknowledged what most observers already know to be one of Facebook's most fundamental problems: sensationalist and provocative posts, even those that don't explicitly violate Facebook's policies, get the most engagement on the platform. "As content gets closer to the line of what is prohibited by our community standards, we see people tend to engage with it more," he said. "This seems to be true regardless of where we set our policy lines."

That issue arguably underlies most of Facebook's problems of the past few years. It's why divisive political propaganda was so successful during the 2016 campaign, and why fake news has been able to break through. Until now, Facebook has operated in a black-and-white environment: content either violates the rules or it doesn't, and if it doesn't, it's free to rack up millions of clicks – even when the poster's intent is to mislead and inflame. Now Facebook says that even content that doesn't explicitly violate its rules may see its reach reduced. According to Zuckerberg's post, that includes, among other things, "photos close to the line of nudity" and "posts that don't come within our definition of hate speech but are still offensive."

Zuckerberg called the shift "a big part of the solution for ensuring that polarizing or sensational content isn't spreading in the system, and we're having a positive effect on the world."

With this move, Facebook is taking a risk. Reducing engagement on the platform's most popular content will likely cost the company money. And such a dramatic change will no doubt open Facebook up to even more accusations of censorship, at a time when the platform already faces relentless criticism from all sides.

Nevertheless, Facebook is betting on the upside. If the most sensational posts no longer reap the most clicks, the thinking goes, maybe people will behave better. That Facebook is willing to take the chance says a lot about the public pressure brought to bear on the company over the past two years. After all, what does Facebook have left to lose?
