
2024/03/29
On March 6, 2024, NPI held a webinar, "The risk of disinformation in Indo-Pacific Region-the influence on election and national security".


With the spread of social media in the information space, the proliferation of disinformation has become a threat to social stability. This is a growing concern in the Indo-Pacific region: during Taiwan's presidential election in January 2024, for example, the dissemination of disinformation was observed. The members of the Research Project for the Risks in Information Sphere at the Nakasone Peace Institute (NPI) discussed this growing risk in the information space, focusing on the state of disinformation in the Indo-Pacific region and Taiwan's presidential election.


Time and Date: 16:00-17:15, March 6, 2024

Theme: "The risk of disinformation in Indo-Pacific Region-the influence on election and national security"


Panelists:

Nagasako Tomoko, Researcher, Office of Cyber Domain Awareness, Information-technology Promotion Agency

Kawaguchi Takahisa, Senior Research Fellow, Tokio Marine dR Co., Ltd.

Murakami Masatoshi, Associate Professor, Department of Contemporary Society, Kogakuin University

Moderator:

Osawa Jun, Senior Research Fellow, NPI; Leader, Research Project for the Risks in Information Sphere


*Ms. Nagasako and Mr. Kawaguchi are members of NPI's Research Project for the Risks in Information Sphere and Mr. Murakami is a member of the Marine Security Study Group.


There was active discussion, with participation by attendees from government agencies, corporations, research institutions, and the mass media. The main points of discussion were as follows.


(Ms. Nagasako)

During the 2019 general election in Australia, as during the 2016 U.S. presidential election, cyberattacks on political parties and parliament occurred prior to the election, with suspected involvement from China. In response, Australia established the Electoral Integrity Assurance Taskforce, investigated the election and related threats, assessed the impact of fake-news proliferation on journalism, enacted the National Security Legislation Amendment (Espionage and Foreign Interference) Act 2018, and conducted a media literacy campaign.


In the Philippines, until around 2022, disinformation of domestic origin targeting the presidential election, aimed at preserving the ruling system and consolidating authority within the country, was predominant. Since around 2023, however, influence operations by Chinese state-affiliated media have been increasing. In response, the Philippines introduced a bill, the Act Criminalizing the Creation and Dissemination of Fake News, in August 2022. Although the bill stipulates severe penalties for offenders, the definition of "fake news" is ambiguous, lumping together misinformation and disinformation, and questions remain regarding the specific criteria for application and the scope of penalties.


Disinformation in India is strongly associated with Hindu nationalism and tends to be used in domestic political campaigns. It risks undermining social harmony by inciting binary oppositions such as rural versus urban and elite versus non-elite, and its use in elections and other democratic processes could lead to political instability.


In Indonesia, propaganda, including government disinformation aimed at manipulating information, dividing the population, and distracting the public from domestic politics, has become a concern. Since 2018, the government has run information literacy programs that emphasize obedience to the government, with messages such as "do not spread falsehoods and think carefully before criticizing the government." Such programs risk reinforcing the state's control over citizens' speech.


In Thailand, disinformation from China has been increasing in recent years. Besides the proliferation of Chinese bot accounts, pro-China narratives have been on the rise in Thai media, often produced in cooperation with Chinese and Russian outlets. Even where these are not direct disinformation operations, such media cooperation helps create narratives favorable to China. In the general election, for example, a narrative was constructed suggesting that the U.S. was covertly supporting candidates who would worsen the relationship between Thailand and China.


An overview of the disinformation situation in the Indo-Pacific region reveals a tendency for disinformation to be used to advocate for authoritarian or socialist regimes and to strengthen incumbent governments. In the name of countering disinformation, governments exercise control over public discourse, and cases of disinformation aimed at directing or controlling public opinion have been observed. It is important to note that approaches to disinformation in these countries differ in some respects from the democracy-promoting strategies taken in the West. This point should serve as a valuable lesson for promoting disinformation countermeasures in Japan.


At the same time, from the perspective of foreign influence operations, caution is warranted regarding China's involvement. Its approaches are diversifying, including cross-posting and strengthening relations with local media, and various cases show that China's interest is expanding to the Pacific island countries. The future impact of this trend on Japan should be closely monitored and countermeasures prepared.


(Mr. Kawaguchi)

On January 13, 2024, during Taiwan's presidential and legislative elections, China was found to have manipulated online information in an apparent effort to influence public opinion in Taiwan. The issue in the 2024 election, on which the manipulation was based, was whether to give the eight-year-long Tsai Ing-wen administration a third term. Because of a series of scandals and blunders within the administration in 2023, mainly involving sexual harassment and deception, the election became a battle over Taiwan's domestic politics, economy, and society, rather than over Taiwan-China relations, Taiwan-U.S. relations, or the status of Taiwan.


It has been revealed that China used a cross-platform network known as "Spamouflage" to spread disinformation discrediting President Tsai Ing-wen and candidate Lai Ching-te. "Hack-and-leak" operations were also observed, in which data obtained through unauthorized means was intentionally leaked. These operations can be broadly classified into two types: public trust-breach and high-value target exposure, the latter of which is often practiced in Russia. It is based on the "Kompromat-type" operation, in which images and photographs are leaked to discredit the targeted individual.


It is unclear whether China has a grand strategy of election interference through the digital sphere or whether it involves coordination among various actors. However, election interference is characterized by China's targeting of political issues, social problems, and insecurity that already exist in the region. Since it is difficult to distinguish between foreign disinformation, spontaneous misinformation, and disinformation originating within the region, it is possible that China is deliberately resorting to such tactics.


Even if China fails to achieve its desired election outcome through information manipulation, it can gain leverage to obstruct the opposing candidate or party from implementing its policies. It can also damage confidence in the opposing individuals or political systems, so medium- and long-term risks remain for the targeted party. In response, Taiwan has seen various efforts by the civil and private sectors to fight information manipulation, and in the public sector, policies and laws against information manipulation were strengthened under the Tsai Ing-wen administration. Nevertheless, the reality is that there are still no definitive measures or tools to address this issue.


(Mr. Murakami)

As in the policy process, not all actors necessarily transmit accurate information when disseminating policy. This calls for an analytical model, the "Distorted-information Hypothetical Model (DHM)," which assumes that information is distorted by the rationality and constraints of the sender.


By adopting this model, it is possible to obtain results different from those of the conventional CHM (an analytical approach that accepts the truthfulness of information without evaluation). The DHM is also useful for policy prediction. Distorted information refers to information that has been distorted from its original form before dissemination; while similar to disinformation, it may contain some truth.


Using the DHM to analyze U.S. Speaker of the House Nancy Pelosi's visit to Taiwan in August 2022 as a case study, President Biden had sufficient motivation to transmit distorted information, and his statement that "the military is negative" about the visit may itself have been distorted information. It can be speculated that Pelosi's plan to visit Taiwan was strategically aligned with a shift in the Biden administration's Taiwan policy, and that distorted information may have been disseminated to mitigate Chinese opposition to the plan.


In Taiwan, partisan support is fairly deep-rooted, and Chinese influence operations have yet to significantly change the views of the DPP's solid supporters.
