Facebook has announced some new changes. Yes, again. This time a post appeared on the Facebook Newsroom about an update to the company’s Terms of Service, followed almost immediately by a caveat: in essence, nothing is changing. The update simply aims to “better inform” users about how the company makes money, how it targets advertising, what happens to content that is removed, and how intellectual property rights are applied.
There are currently two versions of the rules on the company’s website: the ones in force now and the ones that take effect on July 31. We decided to compare the two versions in detail and check whether the newer one really is more understandable and user-friendly.
In fact, Facebook is not the first tech giant to decide to polish its rules. Only a few months earlier, Twitter did impressive work shortening its rules from 2,500 words to fewer than 600. For Facebook, though, this wasn’t an entirely voluntary decision: the European Commission pushed the platform to admit that it earns money through targeted advertising that relies on users’ data.
The European Consumer Protection Cooperation Network (a network of national consumer authorities coordinated by the European Commission) also insisted that the rules be updated to make them clearer and more readable. Do you remember the joke about the “biggest lie ever”?
That joke probably exists precisely because of the long and complicated rules tech companies write. And this is what the European Commission wanted Facebook to fix.
Earning money and ad targeting
One of the most important changes in the updated rules is an acknowledgement that the company really does use personal data to generate profit. “We don’t sell your personal data,” the updated rules say. “We allow advertisers to tell us things like their business goal and the kind of audience they want to see their ads (for example, people between the ages of 18-35 who like cycling). We then show their ads to people who might be interested.”
The company explains that this is why people can use its products for free: it makes its money from advertising.
This is one of the changes the European Commission wanted to see. But what will it actually change? After the Cambridge Analytica scandal and the leak of users’ data, Reuters/Ipsos conducted a poll in the US. The results showed that roughly three-quarters of Facebook users were as active as before the privacy scandal, or even more active.
Analyst Michael Pachter of Wedbush Securities told Reuters that a possible reason for this behavior is that the data was used only for political ads and did not directly affect people’s everyday lives.
Content removal
The main change in this section is that the term “removal” of content has been amended to “removing or restricting access to” content. This reflects the recent shift in Facebook’s strategy from simply deleting potentially harmful posts to restricting access to them.
In general, the story of content removal on Facebook is a messy one, and no one knows exactly how it works. Several former moderators who worked as contractors for Facebook have shed some light on what happens on their side of the process, and in 2017 the moderation rules were leaked to The Guardian. All these accounts agree on one point: moderators often have very little time to decide whether content should be removed, and the instructions they work from are so confusing that the moderators themselves struggle to follow them.

In this context, it is worth mentioning that Facebook has repeatedly been criticized for reacting slowly to violent content. For instance, two years ago a video of the murder of a 74-year-old man remained online for about three hours, and this March the Christchurch terrorist attack was streamed on Facebook Live.
What do the official rules tell us? Content is deleted from Facebook if it violates the Community Standards. Violations that can lead to content, or even whole pages or profiles, being blocked include posts depicting violence and crime, harassment and bullying, hate speech, nudity, copyright infringement, spam, and fake news.
The list looks serious enough, but in practice Facebook often blocks things like well-known classical paintings with nude figures or photographs of sculptures. Over the years this has spawned plenty of jokes and memes.
Another reason a post may be taken down is that it could expose Facebook or other parties to legal liability. As for the procedure, Facebook now says it will notify users when they commit a violation and explain what options they have if they want to request another review.
If users want to delete something themselves, it can take up to 90 days for the content to be completely removed; after that, Facebook no longer has the right to display or share it. But if the content was shared with someone else, it will continue to exist on the platform until those people delete it too.
Intellectual property rights
The social network emphasizes that all the content users share belongs to them: every photo, video, and piece of text. But by posting something on Facebook, users grant the social network “a non-exclusive, transferable, sub-licensable, royalty-free, and worldwide license to host, use, distribute, modify, run, copy, publicly perform or display, translate, and create derivative works of your content (consistent with your privacy and application settings).” Put simply, Facebook can use your content in its products. To limit this, review your privacy settings and restrict who can see your posts.