August 1, 2023
[Mr President], I am pleased to present the report of the Senate Select Committee on Foreign Interference through Social Media, which examines the risks posed to Australia’s democracy and values by foreign interference through social media, including through the spread of misinformation and disinformation.
At the outset, I want to thank my fellow Committee members, in particular Deputy Chair Senator Jess Walsh, for their constructive and bipartisan collaboration on what is one of Australia’s most pressing security challenges. I would also like to acknowledge Senator Jenny McAllister and the late Senator Jim Molan who first led the inquiry into foreign interference through social media in the 46th Parliament.
[Mr President], foreign interference is now Australia’s principal national security concern. It is pervasive, insidious, subtle, and has the potential to undermine our values, freedoms and way of life.
Australia has led the world in combatting foreign interference, and has moved quickly to counter the threat through reforms including the introduction of the Foreign Influence Transparency Scheme in 2018, the Foreign Arrangements Scheme in 2020, and the decision to exclude Huawei from our 5G network in 2018.
However, foreign interference tactics have continued to evolve along with technology, and the threat has only increased in a worsening strategic environment.
Authoritarian regimes continue to threaten democratic societies through targeted disinformation campaigns that use social media to advance their strategic interests. Perpetrating states use these platforms to skew public debate, undermine trust in democratic institutions, and peddle false narratives. The proliferation of emerging technologies like artificial intelligence has made their job easier, dramatically increasing the scale and reach of foreign interference campaigns. This calls for urgent action to ensure Australia stays ahead of the threat.
Social media platforms are the new town square in liberal democracies. It was estimated in February 2022 that some 21.45 million Australians were active social media users, and more than half of all Australians use social media as a news source. Social media is the place where news is reported, contentious issues are debated, consensus is formed, and public policy decisions are shaped. The health of these forums directly affects the health of our nation.
Foreign authoritarian states know this. They do not permit free debate on their own social media platforms. They use ours as a vector for information operations to shape our decision-making in their national interest – at the expense of our own.
As ASIO has assessed, social media itself is not the threat; rather, it serves as a vector for foreign interference. Not all social media platforms are the same. The extent to which a social media platform can be weaponised varies according to the laws of the country in which it is headquartered.
The Committee was particularly concerned by the unique national security risks posed by companies like TikTok and WeChat, whose parent companies – ByteDance and Tencent respectively – are headquartered in China. China’s 2017 National Intelligence Law means the Chinese Government can compel these companies to secretly cooperate with Chinese intelligence agencies.
The Committee heard that TikTok’s China-based employees can and have accessed Australian user data, and could even manipulate the algorithms that dictate what Australian users see on the platform. But TikTok cannot tell us how often Australian data is accessed, despite suggesting this information was logged. Nor was TikTok able to provide the legal basis on which its employees could refuse to comply with Chinese law; the short answer is that it cannot.
Throughout the inquiry, companies headquartered in authoritarian countries were consistently reluctant to cooperate with Australian parliamentary processes. TikTok was hesitant to provide witnesses sought by the Committee, and was evasive in its answers when it finally did agree to appear. WeChat showed contempt for the Parliament by refusing to appear at all, and through the disingenuous answers it provided to questions in writing. The representations made by the Chinese embassy to the Department of Foreign Affairs and Trade about our inquiry on WeChat’s behalf only serve to prove the point about the close relationships these platforms have with the Chinese government.
This stood in contrast to the more constructive engagement the Committee had with platforms based in Western countries, which recognised the fundamental importance of the checks and balances inherent in democratic systems, despite the impost this can create.
These companies are facing novel challenges in combatting foreign interference as authoritarian regimes continue to pump disinformation onto their platforms. Between 2017 and 2022, Facebook’s parent company, Meta, disabled more than 200 covert influence operations, originating from more than 60 countries, that targeted domestic debate in another country.
In the first quarter of 2023, YouTube terminated more than 900 channels linked to Russia and more than 18,000 linked to China.
In the case of both authoritarian and Western-based companies, the Committee explored concerns platforms are being used to “pull” and “push” information to gather intelligence on individuals that will enable them to be targeted, to gather behavioural data by population or cohort to refine interference campaigns, to harass and intimidate Australia’s diaspora communities, and to undermine societal trust, spread disunity and influence decision-making.
Countering this is becoming more complex as authoritarian regimes evolve their methods. Artificial intelligence and the commercialisation of disinformation services – where state actors engage companies to orchestrate disinformation campaigns – threaten to exponentially increase the scale and reach of foreign interference through social media.
It is crucial that Australia develops a real-time capability to counter this malign activity. For two reasons, this approach should be underpinned by a guiding principle of transparency rather than censorship.
The first is to expose disinformation activity, which thrives on secrecy. The second is to empower Australians so that they can evaluate both the content they see on these platforms and the conduct of the platforms themselves. For example, state-affiliated media entities should be proactively labelled on all platforms. Any content censored at the direction of a government should be disclosed to users. Platforms should be open to independent external researchers who can investigate and attribute coordinated inauthentic behaviour. Access to user data, especially by employees based in authoritarian countries, must be disclosed.
WeChat comprehensively failed the transparency test by refusing to participate in public hearings on the basis that, despite its significant digital presence, it does not have a legal presence in Australia. If social media companies want to operate in Australia, they should be required to establish a presence within Australia’s legal jurisdiction so that they can be held accountable more effectively under our laws.
The Committee found that TikTok engaged in a determined effort to avoid answering basic questions about its platform, its parent company ByteDance and its relationship to the Chinese Communist Party.
We recommend that companies which repeatedly fail to meet the minimum transparency requirements should be subject to fines and, as a last resort, may be banned by the Minister for Home Affairs, with appropriate oversight mechanisms in place. Should the US Government force ByteDance to divest ownership of TikTok to another company that is not beholden to the Chinese Communist Party, the Australian Government should consider similar action.
The April 2023 ban on TikTok on government-issued devices, imposed due to serious espionage and data security risks, should also apply to government contractors and to entities designated as systems of national significance. We must move beyond this whack-a-mole approach and assess and mitigate the next TikTok before it is widely deployed on government devices.
Amended Magnitsky-style sanctions, greater enforcement of espionage and foreign interference offences and support for diaspora communities targeted by transnational repression all need to be part of a package of reforms to make Australia a harder target.
Despite Australia’s world-leading efforts to counter foreign interference, evolutions in technology and the threat environment demonstrate that there is more work to be done to protect Australia from the sophisticated disinformation campaigns of authoritarian regimes. With a concerted joint effort by government, the private sector and civil society, we can ensure Australia’s way of life prevails, and preserve the extensive benefits social media platforms provide while managing the accompanying risks.
I commend this report to the Senate.