North Korean Hackers Use AI Deepfake Video Calls to Attack Crypto Practitioners
Odaily News: North Korean-linked hacker groups continue to refine their attacks on crypto industry practitioners, using AI-generated deepfake video calls to impersonate people the victims know and trust and luring them into installing malware. Martin Kuchař, co-founder of BTC Prague, disclosed that the attackers initiate video calls from compromised Telegram accounts and, under the pretext of "fixing Zoom audio issues," trick victims into installing malicious programs disguised as plugins, giving the attackers full control of the devices.
Security research firm Huntress pointed out that this attack pattern closely matches operations it previously disclosed targeting crypto developers. The malicious script carries out a multi-stage infection on macOS devices, implanting backdoors, logging keystrokes, stealing clipboard contents, and draining crypto wallet assets. Researchers have attributed this series of attacks with high confidence to BlueNoroff, a subgroup of the North Korean state-sponsored Lazarus Group.
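For readers who want a concrete starting point, the sketch below shows one way a practitioner might review a Mac for recently added persistence entries of the kind a backdoor implant typically relies on. This is a minimal illustration only, not Huntress's detection logic or the actual malware: the directories checked, the 14-day window, and the idea that unfamiliar launch items warrant inspection are assumptions for the example.

```python
#!/usr/bin/env python3
"""Illustrative review of recently added macOS launch items (persistence entries).

A minimal sketch, assuming that new or modified LaunchAgents/LaunchDaemons
are worth manual inspection; it does not identify malware on its own.
"""
import plistlib
import time
from pathlib import Path

# Common locations where user- and system-level launch items persist.
LAUNCH_DIRS = [
    Path.home() / "Library/LaunchAgents",
    Path("/Library/LaunchAgents"),
    Path("/Library/LaunchDaemons"),
]

RECENT_DAYS = 14  # arbitrary review window (assumption)


def recent_launch_items(dirs=LAUNCH_DIRS, days=RECENT_DAYS):
    """Return (path, label, program) for plists modified within the window."""
    cutoff = time.time() - days * 86400
    findings = []
    for d in dirs:
        if not d.is_dir():
            continue
        for plist_path in d.glob("*.plist"):
            if plist_path.stat().st_mtime < cutoff:
                continue  # only flag recently added or modified entries
            try:
                with plist_path.open("rb") as fh:
                    data = plistlib.load(fh)
            except Exception:
                data = {}
            # The executable the launch item runs, if declared.
            program = data.get("Program") or (data.get("ProgramArguments") or [None])[0]
            findings.append((plist_path, data.get("Label"), program))
    return findings


if __name__ == "__main__":
    for path, label, program in recent_launch_items():
        print(f"{path}\n  label: {label}\n  runs:  {program}")
```

Any unfamiliar entry surfaced this way would still need to be verified by hand (or with proper endpoint tooling) before drawing conclusions.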
The information security lead at blockchain security firm SlowMist said such attacks show clear signs of reuse across different operations, with targets concentrated on specific wallets and crypto practitioners. Analysts note that with the spread of deepfake and voice-cloning technology, images and video can no longer be relied on as proof of identity; the crypto industry needs to raise its vigilance and strengthen multi-factor authentication and other security measures. (Decrypt)
