The aim of this series is to identify the biggest trends, separate the news from the noise, and summarize the most important studies and reports to help you advance your media business.
2020 was another year of social media reckoning, driven by the global pandemic and the US elections. Waves of misinformation spread from public figures, and even now, at the end of the year, the outlook is not great. Misinformation keeps spreading on social media, platforms contribute to radicalization, and the leaders of these companies remain largely unchallenged.
Facebook infamously refused to ban political ads, but later caved and adopted stricter rules a couple of months before the US elections. The chiefs of Facebook and Twitter appeared twice before Congress: once alongside the Apple and Google CEOs, and once on their own.
One of the big questions analysts and journalists had after both hearings was why YouTube was missing. Some lawmakers took notice, and we might see this omission addressed in 2021.
2020 also marked the year the US government brought its strongest lawsuit yet against Facebook, demanding it divest assets, including Instagram and WhatsApp, effectively breaking up Facebook as we know it.
A study by researchers at Switzerland’s Ecole polytechnique fédérale de Lausanne and the Federal University of Minas Gerais in Brazil found evidence that users who engaged with milder, middle-ground right-wing content migrated to commenting on the most fringe far-right content. The report, called “Auditing radicalization pathways on YouTube”, suggests users who started out commenting on alt-lite/IDW YouTube content shifted over time to commenting on extreme far-right content on the platform.
48 fact-checking organizations from 30 countries started working to debunk false information about COVID-19. The initiative is ongoing; follow #CoronaVirusFacts and #DatosCoronaVirus on social media for the latest updates.
HBO premiered a documentary film, ‘After Truth: Disinformation and the Cost of Fake News’, directed by Andrew Rossi and executive produced by CNN’s Brian Stelter. In a review, The New York Times wrote that the film sympathetically argues that disinformation hurts the people who believe it and are sometimes moved to act drastically on it.
“Plandemic” was probably the most viewed viral misinformation video about the pandemic, The New York Times reported at the time. Just over a week after it was released, it had been viewed more than eight million times on YouTube, Facebook, Twitter, and Instagram, and had generated countless other posts.
In a bombshell investigation, reporters from The Wall Street Journal found that back in 2016, internal Facebook research had concluded that ‘64% of all extremist group joins are due to our recommendation tools’, that ‘our recommendation systems grow the problem’, and that ‘our algorithms exploit the human brain’s attraction to divisiveness.’ Some managers pushed for algorithm changes but were only partly allowed to implement them. Here is Facebook’s response.
The Digital News Report 2020 was published, stating that the use of online and social media substantially increased in most countries. DNR also reminded us: Those aged 18–24 (so-called Generation Z) have an even weaker connection with websites and apps and are more than twice as likely to prefer to access news via social media.
Reddit banned the r/The_Donald subreddit. The company decided to ban about 2,000 subreddits after an update to its content policy.
A job listing revealed that Twitter might be working on a subscription platform. As of the end of 2020, at the time of writing, there have been no further developments.
The US government started to consider banning Chinese apps including TikTok. After months of media frenzy and ultimatums given by the Trump administration, TikTok remains, even at the end of 2020, in Chinese hands.
Twitter got hacked. A few high-profile accounts, including those of Joe Biden, Elon Musk, Bill Gates, Barack Obama, Uber, and Apple, tweeted cryptocurrency scams. The New York Times revealed the fascinating backstory of the hack and profiled the main hacker behind the attack, 17-year-old Graham Ivan Clark.
The CEOs of Google, Apple, Amazon, and Facebook appeared remotely at a hearing before the House Judiciary Committee on antitrust issues, after lawmakers had collected hundreds of hours of interviews and obtained more than 1.3 million documents.
Instagram launched Reels, its TikTok copycat. The best review came from NYT’s Brian X. Chen and Taylor Lorenz, who basically concluded that Reels is a much worse TikTok clone.
YouTube had to revert to using human moderators to vet harmful content after its artificial intelligence systems also removed videos that broke no rules. YouTube had given its machine systems greater autonomy after the company “put offline” its 10,000-person moderation team due to the pandemic.
Celebrities and influencers joined the organizers of the Facebook ‘Ad boycott’ by freezing their accounts, for a day. Needless to say, the effect on the social media giant was negligible.
The House Judiciary subcommittee on antitrust, after 16 months of investigation, published its findings in an antitrust report on Big Tech. Among other findings, we got a look at how the Instagram acquisition looked from the inside (details here).
In 2018, Facebook’s chief executive, Mark Zuckerberg, famously cited Holocaust deniers in a fumbled attempt to make a point about free speech. In October Zuckerberg announced he was reversing his decision. Facebook started banning content that “denies or distorts the Holocaust.”
The ‘Section 230 hearing’ happened. It was not enlightening.
The Facebook Journalism Project published 22 innovative ideas to try on Instagram. The lessons come from 22 recent college graduates who participated in the Instagram Local News Fellowship and worked for 10 weeks in 21 local newsrooms across the US. Some of the case studies are pretty interesting.
Casey Newton reviewed the platforms’ attempts to stop election misinformation and preserve election integrity. He concluded: In short, Facebook and Twitter — and, to a lesser extent, YouTube — met two of the key challenges revealed by the 2016 election.
Facebook made editorial tweaks to its algorithm. According to sources The New York Times spoke to, after the election Zuckerberg agreed to temporarily tweak Facebook’s algorithm to make authoritative news outlets like CNN and NYT appear more prominently. In other words, the company knows how to make the platform less toxic; the problem is that doing so means less time spent by users.
Facebook told employees that it’s developing a tool to summarize news articles so users won’t have to read them. It also laid out early plans for a neural sensor to detect people’s thoughts and translate them into action.
Facebook disabled several features in its Messenger and Instagram apps for users in Europe, claiming that it wanted to make sure they comply with a change in privacy rules. Since December 21st, messaging apps have fallen under new EU rules, the ePrivacy directive. According to the BBC: There’s nothing in the ePrivacy directive that bans the use of fun stickers or polls in messaging apps, so Facebook’s move to disable them is a bit puzzling. Its vague notification alerting users that “some features [are] not available” raised more questions than it answered.