27 October 2020
Covid-19 Disinformation & Social Media Manipulation
A range of actors are manipulating the information environment to exploit the COVID-19 crisis for strategic gain. ASPI’s International Cyber Policy Centre is tracking many of these state and non-state actors online, and will occasionally publish investigative, data-driven reporting that will focus on the use of disinformation, propaganda, extremist narratives and conspiracy theories by these actors.
The bulk of ASPI’s data analysis uses our in-house Influence Tracker tool, a machine-learning and data-analytics capability that draws out insights from multi-language social media datasets. The tool can ingest data in multiple languages and auto-translate it, producing insights on topics, sentiment, shared content, influential accounts, metrics of impact and posting patterns.
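The kind of analysis such a tool performs can be sketched in simplified form. This is an illustrative example only, using Python's standard library and invented sample posts; it is not ASPI's actual Influence Tracker, and the account names and data are hypothetical:

```python
# Hypothetical sketch: surface top hashtags, most active accounts and
# posting-hour patterns from a small multi-language sample of posts.
from collections import Counter
from datetime import datetime
import re

posts = [
    {"author": "acct_a", "lang": "en", "time": "2020-04-01T09:15:00",
     "text": "The #vaccine rollout is a hoax #covid19"},
    {"author": "acct_b", "lang": "ru", "time": "2020-04-01T09:20:00",
     "text": "Вакцина опасна #covid19"},  # "The vaccine is dangerous"
    {"author": "acct_a", "lang": "en", "time": "2020-04-01T21:05:00",
     "text": "More on the #covid19 cover-up"},
]

def hashtags(text):
    """Extract lower-cased hashtags from a post's text."""
    return [t.lower() for t in re.findall(r"#\w+", text)]

top_tags = Counter(tag for p in posts for tag in hashtags(p["text"]))
active_accounts = Counter(p["author"] for p in posts)
posting_hours = Counter(datetime.fromisoformat(p["time"]).hour for p in posts)

print(top_tags.most_common(2))
print(active_accounts.most_common(1))
print(sorted(posting_hours.items()))
```

A production tool would add auto-translation, sentiment scoring and shared-content analysis on top of aggregations like these.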
The reports are listed in reverse chronological order:
This report builds on a Twitter network take-down announced on 8 October 2020 and attributed by Twitter to an Iranian state-linked information operation. Just over 100 accounts were suspended for violations of Twitter’s platform manipulation policies. This case study provides an overview of how to extrapolate from Twitter’s take-down dataset to identify persistent accounts on the periphery of the network. It offers observations on the operating mechanisms and impact of the cluster of accounts, characterising their traits as activist, media and hobbyist personas. The purpose of the case study is to provide a guide to using transparency datasets to identify ongoing inauthentic activity.
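The periphery-identification approach described above can be sketched in miniature. The data, account names and review threshold below are hypothetical, and real investigations combine many more signals; this only illustrates the general idea of flagging live accounts that repeatedly amplified a suspended core:

```python
# Illustrative sketch: starting from accounts named in a take-down dataset,
# flag still-live accounts that repeatedly interacted with them.
from collections import Counter

suspended = {"op_core_1", "op_core_2"}  # accounts in the take-down dataset

# (source, target) retweet edges observed before the take-down (invented)
retweets = [
    ("fringe_a", "op_core_1"),
    ("fringe_a", "op_core_2"),
    ("fringe_b", "op_core_1"),
    ("bystander", "op_core_1"),
    ("bystander", "news_outlet"),
]

# Count how often each live account amplified the suspended core
amplification = Counter(
    src for src, dst in retweets if dst in suspended and src not in suspended
)

# Accounts above a threshold become candidates for manual review, not verdicts
candidates = [acct for acct, n in amplification.items() if n >= 2]
print(candidates)  # ['fringe_a']
```

Retweeting a suspended account once (like the bystander here) is weak evidence, which is why thresholding and human review matter before calling any peripheral account inauthentic.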
This research investigation examines Russia’s efforts to manipulate the information environment during the coronavirus crisis. It leverages data from the European External Action Service’s East StratCom Task Force, which, through its EUvsDisinfo project, tracks pro-Kremlin messages spreading in the EU and Eastern Partnership countries. Using this open-source repository of pro-Kremlin disinformation, in combination with OSINT investigative techniques that track links between online entities, we analyse the narratives being seeded about COVID-19 and map the social media accounts spreading those messages.
We found that the key subjects of the Kremlin’s messaging focused on the EU, NATO, Bill Gates, George Soros, the World Health Organization (WHO), the US and Ukraine. Narratives included well-trodden conspiracies about the source of the coronavirus, the development and testing of a potential vaccine, the impact on the EU’s institutions, the EU’s slow response to the virus and Ukraine’s new president. We also found that Facebook groups were a powerful hub for the spread of some of those messages.
27 Oct 2020
For the latest report in our series on Covid-19 disinformation, we’ve investigated ongoing inauthentic activity on Facebook and YouTube. This activity uses both English and Chinese language content to present narratives that support the political objectives of the Chinese Communist Party (CCP). These narratives span a range of topics, including assertions of corruption and incompetence in the Trump administration, the US Government’s decision to ban TikTok, the George Floyd and Black Lives Matter protests, and the ongoing tensions in the US–China relationship. A major theme, and the focus of this report, is criticism of how the US broadly, and the Trump administration in particular, are handling the Covid-19 crisis on both the domestic and the global levels.
29 Sept 2020
#7: Twisting the truth: ongoing inauthentic activity promoting Falun Gong, the Epoch Times and Truth Media targets Australians on Facebook
This ASPI ICPC report investigates two Facebook pages that appear to be using coordinated, inauthentic tactics to target Australian users with content linked to Falun Gong, The Epoch Times and other Falun Gong-associated media groups. This includes running paid advertisements, as well as systematically seeding content into Australian Facebook groups for minority communities, hobbyists and conspiracy theories. Falun Gong and its supporters are entitled to participate freely in Australia's national debate; however, inauthentic and covert efforts to shape political opinions have no place in an open democratic society.
09 Sept 2020
This latest report in our series on COVID-19 disinformation and social media manipulation investigates vaccine disinformation that emerged from eastern Ukraine’s pro-Russian media ecosystem the day after Russia announced plans to mass-produce its own vaccine.
We identify how a false narrative about a vaccination trial that never happened was seeded into the information environment by a pro-Russian militia media outlet, laundered through pro-Russian English-language alternative news websites, and permeated anti-vaccination social media groups in multiple languages, ultimately becoming completely decontextualised from its origins.
The report provides a case study of how these narratives ripple across international social media networks, including into a prominent Australian anti-vaccination Facebook group.
The successful transfer of this completely fictional narrative reflects a broader shift across the disinformation space. As international focus moves from the initial response to the pandemic towards the race for a vaccine, with all of the complex geopolitical interests that entails, political disinformation is moving on from the origins of the virus to vaccine politics.
24 Aug 2020
Automating influence on Covid-19 looks at how Chinese-speaking actors are attempting to target US-based audiences on Facebook and Twitter across key narratives, including amplifying criticism of the US’s handling of Covid-19, emphasising racial divisions, and highlighting political and personal scandals linked to President Donald Trump.
This new report investigates a campaign of cross-platform inauthentic activity that relies on a high degree of automation and is broadly aligned with the political goal of the People’s Republic of China (PRC) to denigrate the standing of the US. The campaign appears to be targeted primarily at Western and US-based audiences, artificially boosting legitimate media and social media content in order to amplify divisive or negative narratives about the US.
04 Aug 2020
#4 ID2020, Bill Gates and the Mark of the Beast: how Covid-19 catalyses existing online conspiracy movements
Against the backdrop of the global Covid-19 pandemic, billionaire philanthropist Bill Gates has become the subject of a diverse and rapidly expanding universe of conspiracy theories. This report takes a close look at a particular variant of the Gates conspiracy theories, which is referred to here as the ID2020 conspiracy (named after the non-profit ID2020 Alliance, which the conspiracy theorists claim has a role in the narrative), as a case study for examining the dynamics of online conspiracy theories on Covid-19. Like many conspiracy theories, that narrative builds on legitimate concerns, in this case about privacy and surveillance in the context of digital identity systems, and distorts them in extreme and unfounded ways. Among the many conspiracy theories now surrounding Gates, this one is particularly worthy of attention because it highlights the way emergent events catalyse existing online conspiracy substrates. In times of crisis, these digital structures—the online communities, the content, the shaping of recommendation algorithms—serve to channel anxious, uncertain individuals towards conspiratorial beliefs. This report focuses primarily on the role and use of those digital structures in proliferating the ID2020 conspiracy.
25 June 2020
This report analyses a persistent, large-scale influence campaign linked to Chinese state actors on Twitter and Facebook.
This activity largely targeted Chinese-speaking audiences outside of the Chinese mainland (where Twitter is blocked) with the intention of influencing perceptions on key issues, including the Hong Kong protests, exiled Chinese billionaire Guo Wengui and, to a lesser extent, Covid-19 and Taiwan. Extrapolating from the takedown dataset, to which Twitter gave us advance access, we have identified that this operation continues and has pivoted to try to weaponise the US Government’s response to current domestic protests and create the perception of a moral equivalence with the suppression of protests in Hong Kong.
11 June 2020
This new research highlights the growing significance and impact of Chinese non-state actors on western social media platforms. During March and April 2020, this loosely coordinated pro-China trolling campaign on Twitter:
- Harassed and mimicked western media outlets
- Impersonated Taiwanese users in an effort to undermine Taiwan’s position with the World Health Organization (WHO)
- Spread false information about the Covid-19 outbreak
- Joined in pre-existing inauthentic social media campaigns
23 April 2020
Includes case studies on:
- Chinese state-sponsored messaging on Twitter
- Coordinated anti-Taiwan trolling: WHO & #saysrytoTedros
- Russian Covid-19 disinformation in Africa
8-15 April 2020