In a concerning incident involving advanced technology, a US senator was nearly duped by a deepfake video call. The head of the Senate Foreign Relations Committee, Senator Ben Cardin (D-MD), was targeted by an individual using deepfake technology to impersonate a senior Ukrainian official. This attempt highlights the growing sophistication of digital impersonation, posing serious risks to global diplomacy.
Deepfake caller posed as a top Ukrainian official
According to a report by The New York Times, Senator Cardin received an email last Thursday that appeared to be from Dmytro Kuleba, Ukraine’s former foreign minister. The email requested a Zoom meeting, which Cardin agreed to. The person on the other end of the call not only resembled Kuleba in appearance but also sounded remarkably like him. However, the senator grew suspicious when the supposed Kuleba began asking strange and politically charged questions about the upcoming US election.
The fake official pressed Cardin on highly sensitive foreign policy matters. Among the questions was whether Cardin would support firing long-range missiles into Russian territory, a line of questioning that raised red flags for the senator.
Senator Cardin raises concerns after call
Finding the conversation odd and out of character for Kuleba, Cardin decided to report the incident to the State Department. After reviewing the matter, officials confirmed that the senator had not been speaking with the real Dmytro Kuleba, but rather with an imposter using deepfake technology. At this point, it remains unclear who was behind the deceptive call or what their intentions were.
In a public statement to The New York Times, Cardin confirmed that he had been the target of a deceptive ploy. “In recent days, a malign actor engaged in a deceptive attempt to have a conversation with me by posing as a known individual,” Cardin said, though he stopped short of naming the individual. However, an email from Senate security officials later confirmed that the impersonation involved someone pretending to be the former Ukrainian foreign minister.
Security warnings and AI-related threats
Following this incident, Senate security officials have advised lawmakers to remain vigilant for similar attempts. In their internal communication, which was shared with The New York Times, officials warned that other attempts are likely to occur in the coming weeks.
“While we have seen an increase in social engineering threats over the past months and years, this attempt stands out due to its technical sophistication and believability,” the email from the Senate security office stated.
The use of artificial intelligence (AI) tools to create convincing deepfakes has become more widespread, raising concerns about the implications for both politics and security. As these tools become cheaper and easier to access, their potential for misuse has only grown. Deepfakes have already been used to manipulate public perception, disrupt elections, and damage the reputations of political figures.
Earlier this year, the Federal Communications Commission (FCC) proposed multimillion-dollar fines against a political consultant responsible for a robocall campaign impersonating President Joe Biden. The fake Biden robocall targeted voters in New Hampshire, attempting to discourage them from participating in the state’s primary election.
I checked with renowned world authority, Professor Suggon Deeznutz, and he said parody is legal in America 🤷‍♂️ https://t.co/OCBewC3XYD
— Elon Musk (@elonmusk) July 29, 2024
In addition, Elon Musk shared a deepfake video of Vice President Kamala Harris on X (formerly Twitter). In the video, Harris appears to refer to herself as “the ultimate diversity hire” and claims she was “mentored by the ultimate deep state puppet, Joe Biden.” Former President Donald Trump also posted an AI-generated endorsement from Taylor Swift on his Truth Social platform in August, which Swift later addressed in her real-life endorsement of Vice President Harris.
This rise in politically motivated deepfakes poses an urgent challenge for lawmakers and security officials, with serious implications for the integrity of democratic processes and public trust. With elections approaching in the US and tensions heightened globally, such incidents are likely to become more frequent and will require careful monitoring.