Platforms are still not doing enough to combat disinformation related to the coronavirus crisis, the European Commission said today.
In a communication, it calls on tech platforms to provide monthly reports on their efforts in this area, and requests more detailed data on measures to promote authoritative content, improve user awareness, and limit coronavirus-related disinformation and advertising.
It also wants to see more platform collaboration with researchers and fact-checkers across all EU Member States (covering all languages), and increased transparency around the implementation of policies for informing users when they interact with disinformation.
In recent years, the Commission has been pushing platforms to take measures against disinformation. It has committed technology giants and adtech players to a voluntary code of conduct on disinformation that aims to disrupt the advertising revenue of purveyors of disinformation and make it easier for users to report fakes.
Since then, assessments of platforms' efforts to combat malicious fakes have been lukewarm, to say the least, and they have been repeatedly asked to do more. A persistent lack of transparency around these self-regulatory efforts has also been repeatedly highlighted.
The coronavirus crisis has further increased political pressure on platforms to deal with online disinformation – and technology giants like Google have responded with measures aimed at proactively surfacing authoritative coronavirus content (efforts that initially focused on access to health information in the U.S.).
Back in April, Facebook announced it would alert users who had interacted with certain types of coronavirus misinformation, showing them a debunking pop-up with messages from the World Health Organization.
However, the Commission said today that it wants to see more evidence that such measures work.
EU lawmakers are also in the process of drawing up new rules for digital services and platforms that could redraw the liability line and impose new responsibilities on technology companies for the content they host. A draft of this incoming Digital Services Act (DSA) is planned for the end of the year, following a public consultation that opened last week.
"The coronavirus pandemic was accompanied by a massive 'infodemic,'" Commissioner Josep Borrell said at a press conference today. "We have seen a wave of false and misleading information, hoaxes and conspiracy theories, and targeted interference by foreign actors."
Borrell gave examples of disinformation that poses a risk to public health and that the Commission has seen spreading online in Europe, such as false claims that drinking bleach can cure the coronavirus or that washing your hands does not help.
He also pointed to vandalism of 5G infrastructure fueled by COVID-19 conspiracy theories.
“Part of it aims to harm the European Union and its Member States, trying to undermine our democracies, the credibility of the European Union and of national authorities,” he added. “Disinformation in times of coronavirus can also be fatal. Misleading health information, consumer fraud, cybercrime or targeted disinformation campaigns by foreign actors pose various potential risks to our citizens, their health and their trust in public institutions.”
In a statement, the Commission's Vice President for Values and Transparency, Věra Jourová, added: “Waves of disinformation hit Europe during the coronavirus pandemic. They came from home and abroad. In order to combat disinformation, we have to mobilize all relevant actors, from online platforms to authorities, and support independent fact-checkers and the media. While online platforms have taken positive steps during the pandemic, they need to step up their efforts. Our actions are deeply embedded in fundamental rights, especially freedom of expression and information.”
"I believe that working with the platforms to jointly design the disinformation code of conduct helped introduce new policies faster," she said during a press conference, discussing coronavirus disinformation and what more the platforms need to do.
“Platforms have to do more here, too, and our code was just the first step. There is room for improvement. For example, we only know as much as the platforms tell us – that's not good enough. They need to open up and provide more evidence that the measures they are taking are working well. They must also make it possible for the public to independently identify new threats. We are now inviting them to provide monthly reports in more detail than ever before.”
Removing financial incentives for those who want to benefit from disinformation is "crucial" according to Jourová, who said the Commission is taking steps to "gain a better understanding of the flow of advertising revenue related to disinformation".
"We need to ensure transparency and accountability," she added. "Citizens need to know how information reaches them and where it comes from."
Jourová announced that TikTok has agreed to join the EU's code of conduct on disinformation – and said she expects the formalities to be completed "very soon".
She added that the Commission is also "negotiating" with Facebook-owned WhatsApp about signing up.
She emphasized that EU lawmakers are not asking the platforms to remove disinformation in general (with some exceptions related to COVID-19; e.g., when fake products or advice could harm the public), but rather to surface quality information and fact-checks so users can get at the facts for themselves.
Jourová praised Twitter's recent decision to label some tweets from U.S. President Donald Trump, calling it the type of action the Commission wants to see from platforms.
"Twitter is a very good example of what we support," she said. “Twitter didn't remove President Trump's statement; it just added the facts. And that's what I call plurality and the possibility of competition for freedom of speech. Because we shouldn't just rely on one authoritative explanation when it is possible to add some facts that you could look at from a different perspective. So this is the competition of speech.
“We never wanted the platforms to remove content – unless, and here comes the COVID-related situation, unless it is obviously and clearly harmful to people's health. Which is the case with a lot of the strange and dangerous advice that has been posted on social media.”
During the press conference, the commissioners were asked about how few resources the Commission makes available for its disinformation task forces – with an annual budget for strategic communication of just around €5 million last year.
Jourová replied that the cooperation system set up to tackle the problem pools resources from EU Member States, civil society and the platforms themselves.
“The platforms invest a lot in creating the task forces, their special units to fulfill their obligations – which we also expect from them in this communication – and we engage civil society, fact-checkers and the research sector. So you have to talk about a much broader field and a lot of other capacities that we use for this,” she said, adding that in the context of COVID-19 disinformation the health sector is also involved in combating junk content.
"I have always said that the fight against disinformation is not about censorship – it is not about deleting the false claims and eliminating disinformation and misinformation. Those responsible for the issue need to proactively defend the facts and proactively bring in trustworthy information," she continued.
Although disinformation is not generally considered illegal across the EU (with some exceptions in certain Member States), Jourová argued that fakes "can cause significant harm" – though she also suggested that the Commission will avoid setting hard legal lines here as it works to update digital regulation.
"For disinformation, our logic will be to look at how big the potential public harm could be," she said, giving an indication of how the issue is viewed in relation to the upcoming DSA. “I do not foresee that we will make strict regulations in this regard, because it is too sensitive to evaluate this information and impose rules; it plays with freedom of expression, and I really want to make a balanced proposal. So in the DSA you will very likely see regulatory measures against illegal content – because what is illegal offline must clearly be illegal online, and the platforms must work proactively in this direction. For disinformation, however, we have to consider how the harmful effects of disinformation can be effectively reduced.
"We will focus on the impact ahead of elections, because we see that targeted disinformation can harm free and fair elections. So these are very serious issues that we have to deal with."
Jourová warned that the next battleground for health-related disinformation in Europe will be vaccination.
She also named China and Russia as foreign actors that the Commission has confirmed are behind state-sponsored disinformation campaigns targeting the region.