Tuesday, October 19, 2021

Mark Zuckerberg thinks you don't trust Facebook because you don't understand the network


I think we can agree that last year was not a good one for Facebook. User confidence dropped to record lows as the company faced a series of scandals. By now, there are countless reasons why users can and should be wary of Facebook.

This Thursday (24), the Wall Street Journal published an opinion piece written by Mark Zuckerberg about the company's data collection practices, titled "The Facts About Facebook." If you wish, you can also read the Portuguese version of the letter, which was published in About me.

In it, Zuckerberg says the company puts "people" first and insists, as it has throughout its 15-year history, that we should trust it. He is under the impression that the main reason users have little faith in the company's ability to handle data responsibly and ethically is its targeted advertising practices. On this, he writes: "This model can feel opaque, and we're all distrustful of systems we don't understand." He continues:

Sometimes this means people assume we do things that we don't do. For example, we don't sell people's data, even though it's often reported that we do. In fact, selling people's information to advertisers would be against our interests, because it would reduce the unique value of our service to advertisers. We have a strong incentive to protect people's information from everyone else.

Of course. Let's start with the ads.

Earlier this month, a Pew Research Center survey found that users, in fact, do not know how Facebook tracks their information to show them relevant ads (which is how the company makes its money). Of the nearly one thousand adults surveyed, 74% of Facebook users said they had no idea the "ad preferences" section, where their inferred interests are listed, even existed. Fifty-one percent of users said they were "not very" or "not at all" comfortable with the amount of information Facebook had about them.

This research shows that the company has a lot of work to do on transparency. But additional data shows that, in fact, the more we learn about how Facebook works, the less we trust the company.

An annual survey by the Ponemon Institute shows that user confidence in the social network giant dropped significantly during the Cambridge Analytica scandal, when it emerged that Facebook knew the research firm had obtained personal data on millions of the site's users through a personality quiz and did nothing about it. Citing the survey in April, the Financial Times said user trust had been growing before the scandal, but users' confidence that the company protected their information fell from nearly 80 percent in 2017 to 27 percent last year. And that was just the start of the year, before the company's further string of scandals.

In 2018, we learned that Facebook shared data with companies such as Bing, Microsoft, Spotify, Netflix, and others in exchange for more information about users. It was also reported that the Cambridge Analytica data harvesting was worse than previously thought; that Facebook shared users' contact information with advertisers; and that the network's sharing feature cannot be disabled. And that's without mentioning the company's use of a conspiracy theory involving George Soros to counter its critics, or the inadequate way it handled the genocide in Myanmar and the spread of fake news.

In his year-end post – which ignored the previous year's image problems – Zuckerberg seemed optimistic about how his business works. To be clear, this is the same Facebook founder who once called users "idiots" for entrusting their sensitive information to his product.

If users don't trust Facebook, it's not necessarily because they don't understand the network. It's because of the things the network does.

[Wall Street Journal]
