US officials and their allies have identified an AI-powered Russian bot farm consisting of about 1,000 accounts on X that spread disinformation and pro-Russian views. The Department of Justice revealed that the scheme, made possible by specialized software, was created by an employee of the digital media department at RT, a Russian state-controlled media outlet. Its development was likely led by an individual who became a deputy editor-in-chief at RT in 2022, and it was approved and funded by an officer of the Russian Federal Security Service (FSB), the main successor of the KGB.
A cybersecurity advisory issued by the FBI, intelligence agencies from the Netherlands, and Canada's cybersecurity agency specifically names the tool, called Meliorator, which can create "real-looking social media personas on a massive scale," generating text posts as well as images and amplifying disinformation from other bot personas. During the operation, authorities seized two domains that were used to generate the email addresses needed to sign up for accounts on X, formerly known as Twitter, a platform long known as a home for bots.
However, the Justice Department is still working to identify all 968 accounts used by Russian actors to spread false information. X has shared information with the authorities about the accounts identified so far and has already suspended them. As The Washington Post notes, the Justice Department said the bots got past X's safeguards because their operators were able to copy and paste one-time passcodes (OTPs) from the associated email accounts to log in. The operators' use of US-based domain names violates the International Emergency Economic Powers Act, while paying for them violates federal money laundering laws in the United States.
Many of the profiles created by the tool impersonated Americans by using American-sounding names and setting their locations on X to various parts of the United States. The examples provided by the Department of Justice used headshots on a gray background as profile photos, a strong indication that they were generated with artificial intelligence. One account under the name Ricardo Abbott, claiming to be from Minneapolis, posted a video of Russian President Vladimir Putin justifying Russia's actions in Ukraine. Another account, Sue Williamson, posted a video in which Putin said the war in Ukraine was not about a territorial dispute but a matter of "the principles on which the New World Order will be based." These posts were then liked and reposted by other bots in the network.
It's worth noting that while this particular bot farm was limited to X, the people behind it had plans to expand to other platforms, according to the authorities' analysis of the Meliorator software. Foreign actors have been using social media to spread political disinformation and fake news for years, but now they have added artificial intelligence to their arsenal. In May, OpenAI said it had dismantled five covert influence operations originating from Russia, China, Israel, and Iran that used its models to try to influence political outcomes.
"Russia intended to use this bot farm to disseminate AI-generated foreign disinformation, scaling their work with the assistance of AI to undermine our partners in Ukraine and influence geopolitical narratives favorable to the Russian government," said FBI Director Christopher Wray. "The FBI is committed to working with our partners and deploying joint, sequenced operations to strategically disrupt our most dangerous adversaries and their use of cutting-edge technology for nefarious purposes."
As for RT, the outlet told Bloomberg: "Farming is a favorite pastime for millions of Russians."