Looking for Love in China - AI fakes and exploitation of Russian female images in Chinese propaganda

By Wei-Ping Li, PhD

In February 2024, a YouTube video produced by Olga Loiek, a Ukrainian student currently studying at the University of Pennsylvania in the United States, drew attention to how perpetrators had used AI to transform her likeness into Chinese propaganda videos praising China and selling Russian products.

After investigating, the Taiwan FactCheck Center found that more white women, including an American law professor, had fallen victim to similar disinformation campaigns.

This propaganda, which mainly targets Chinese audiences, stems both from a strain of Chinese sexism that fantasizes about Russian women and from the country's long-standing propaganda strategy of deploying foreigners to glorify China.

For months, the fake Loiek videos have been shared on Chinese platforms such as Douyin and Xiaohongshu. Instead of using Olga’s real name, the social media accounts hosting the fake videos used other Russian names rendered in Chinese, such as "捷琳娜 (Jié lín nà)." The fake account on Xiaohongshu, for instance, claimed that “Jié lín nà” is a Russian who has been living in China for ten years and is applying for Chinese citizenship. Speaking fluent Mandarin, the “Jié lín nà” avatar praised Chinese society, saying that in China "everything belongs to the people," and contrasted China with the United States, which she described as selfish and without a future. "Jié lín nà" also asked Chinese male viewers whether they were interested in marrying Russian women.

A screenshot of a fake Xiaohongshu account page that posted AI-manipulated videos based on Olga Loiek’s original images.

The real Olga Loiek learned of the videos through a text message asking whether she spoke Mandarin and warning her that her videos might have been misused on social media.

According to the genuine YouTube video Olga made about the incident, the fake videos exploiting her images featured themes such as the friendly relationship between Russia and China, the significance of Chinese history and culture, and the benefits of marrying Russian women. Some of the videos even promoted Russian products to audiences.

Deeply disturbed by these AI-manipulated videos, Loiek asked the Chinese social media platforms to take them down. However, some of the videos could still be viewed on those platforms at the time this article was written.

Loiek was not the only Western woman whose images were turned into fake videos. In her video, Loiek revealed that a Swedish influencer, Lana Blakely, had also been cloned with AI technology in propaganda videos. The Taiwan FactCheck Center found that the fake account exploiting Lana’s image delivered messages similar to those in the fake Loiek videos: the persona claimed to be Russian, to have lived in China for almost ten years, and to be a fan of China and Chinese culture.

A comparison between the original social media page of the Swedish influencer Lana Blakely (right) and a fake WeChat page that posted AI-manipulated videos based on her images (left). The fake WeChat account introduced its persona as “Lynna,” a Russian who has lived in China for almost ten years.

The perpetrators behind the fake videos targeted not only online influencers but also renowned academics. An investigation by the Taiwan FactCheck Center discovered that malicious actors used online videos of a Columbia Law School professor to produce AI-generated propaganda. The fake AI videos of the professor have been disseminated on Douyin and WeChat since January 2024.

The fake accounts and videos using the image of Columbia law professor Katharina Pistor presented themselves as Russians named “伊琳娜 (Yīlínnà)” or “季婭娜 (Jiayana).” Of course, both “Yīlínnà” and “Jiayana” loved Chinese culture and history, and the avatars in these videos spoke fluent Chinese. The FactCheck Center found that the manipulated videos were based on lectures and interview footage of the professor that had been posted on YouTube for years.

A screenshot of a fake WeChat account that posted AI-manipulated videos based on the original image of Columbia Law School professor Katharina Pistor. The account introduced itself as “Jiayana” from Russia, who now lives in China and loves Chinese culture and history.

The “Yīlínnà” and “Jiayana” videos also center on how powerful and developed China is. For example, the avatar enumerated Chinese brands that have gained international popularity and praised Chinese industry. She also reminded audiences of the cruel biomedical experiments that the Japanese military conducted on Chinese people during World War II.

A large portion of these videos also promoted marriage between Chinese men and Russian women. In some videos, the avatar repeated the claim that “there are more women in Russia and more men in China, so Russian women and Chinese men are good matches.” She also warned Chinese men: “Don’t go to Vietnam or Myanmar to look for girls to marry. They are not reliable. Chinese men might be cheated!” In addition, the avatar sold Russian products and urged audiences to place orders on her page.

What makes the “Yīlínnà” and “Jiayana” videos different from the manipulated videos of other online personas is their spotlight on the Chinese military. In several videos, the avatar lauded the People’s Liberation Army's invincible power and the close bond between the military and ordinary people. One video even praised a late professor of national defense at Nankai University.

Although these videos have mostly been disseminated on Chinese social media platforms, some of them have been spread to platforms in Taiwan, too, such as LINE.

Experts specializing in AI imagery told the Taiwan FactCheck Center that these videos were manipulated using deepfake technology. Although the avatars' voices resembled those of the victims and their lip movements closely matched the audio, the modifications were not especially sophisticated and showed signs of AI tampering. For example, the videos shared identical backgrounds, the avatars’ eyes did not focus naturally, and their body movements were awkward. Some of the videos also applied filters to enhance the avatars' facial features.

A comparison between the original image of Prof. Pistor in a lecture video (right) and the image in a manipulated video (left). The professor's facial features in the manipulated image were also tampered with.

The narratives of these AI-manipulated videos share common themes. First, the avatars enthusiastically lauded China, stressing the "equality" of Chinese society, Chinese business prosperity, China's ascent in the world, and the strength of the Chinese military. Second, the avatars praised Chinese men and expressed Russian women's desire to marry them. Third, several avatars promoted Russian products.

These narratives follow a long tradition of Chinese propaganda that uses foreigners to promote China, showcase international recognition of China’s success, and further “tell the Chinese story right to the world” or cement nationalism domestically. As researchers have pointed out, the themes of these propaganda campaigns include admiration of China’s political system, the problems of democratic countries and capitalism, and China’s contributions to the world. In the case of the AI-manipulated videos that appropriate Western women’s images, the videos use the personas' identities as foreigners to tell Chinese audiences that they should be proud of their country and its military.

The use of avatars that appear to be Russian is also consistent with the trend of Chinese propaganda using Russian influencers who are gaining popularity on social media.

According to a report by the Australian Strategic Policy Institute, some Russian influencers broadcasting their lives in China have encouraged tight ties between Russian and Chinese people and highlighted Russian women's desire to marry Chinese men. Though not the work of real influencers, the recently surfaced AI videos are a continuation of the "Russian women looking for love in China" genre. On the one hand, the videos appeal to Chinese men's interests; on the other, the footage reinforces Chinese stereotypes about Russian women, as well as a sexism that fantasizes about foreign women (Note 1).

As Loiek commented in her video addressing the incident, the AI-clone videos were terrifying and disturbing. This was all the more so for Loiek, who is Ukrainian yet was depicted as a Russian promoting Russian goods. Moreover, the women in the videos became both tools of propaganda and targets of gender bias. Since 2022, China has enacted rules that ban the creation of deepfake videos without consent and prohibit altering images to create and spread disinformation. Nevertheless, AI videos, especially those employed in China's international or domestic propaganda, are here to stay.

Wei-Ping Li is a research fellow at the Taiwan FactCheck Center.

Mary Ma (a fact-checker at the Taiwan FactCheck Center) contributed to this analysis.


Note 1: This kind of sexism targets not only Russian women but also Ukrainian women. For example, when the war between Russia and Ukraine broke out, a great number of posts saying, “I volunteer to shelter Ukrainian beauties,” circulated on Chinese social media platforms. Meanwhile, in Taiwan, posts by Taiwanese users objectifying Ukrainian women also appeared on online forums and social media. See Yang, W. (2022, March 7). 兩岸「烏克蘭美女」言論惹議 專家:突顯長久問題 [Cross-strait remarks about “Ukrainian beauties” spark controversy; experts say they highlight a long-standing problem]. DW.COM. https://www.dw.com/zh/%E5%85%A9%E5%B2%B8%E7%83%8F%E5%85%8B%E8%98%AD%E7%BE%8E%E5%A5%B3%E8%A8%80%E8%AB%96%E6%83%B9%E8%AD%B0-%E5%B0%88%E5%AE%B6%E7%AA%81%E9%A1%AF%E9%95%B7%E4%B9%85%E5%95%8F%E9%A1%8C/a-61036916


References

🔗Ryan, F., Knight, M., & Impiombato, D. (2023, November). Singing from the CCP’s songsheet. Australian Strategic Policy Institute. https://www.aspi.org.au/report/singing-ccps-songsheet

🔗Fang, K. (2022). Praise from the International Community: How China Uses Foreign Experts to Legitimize Authoritarian Rule. China Journal, 87, 72–91. https://doi.org/10.1086/717550

🔗Ma, W. (2024, March 1). Ukrainian YouTuber finds her AI clone selling Russian goods on Chinese internet. Voice of America. https://www.voanews.com/a/ukrainian-youtuber-finds-her-ai-clone-selling-russian-goods-on-chinese-internet-/7509009.html