In a sophisticated cybersecurity campaign, North Korean-linked hackers are employing AI-generated video calls to impersonate trusted contacts, deceiving crypto industry employees into installing malware on their devices. This criminal scheme, recently reported by BTC Prague co-founder Martin Kuchař, utilizes advanced visual manipulation tools to compromise the security of various digital assets and sensitive personal information.
Through these AI-powered social engineering attacks, perpetrators initially contact their victims via Telegram to schedule meetings on platforms like Zoom or Teams, where the deception takes place. Once the session begins, attackers use AI-generated videos that mimic the appearance of known industry peers, creating a false sense of security for the targeted worker during the live interaction.
Under the pretext of a non-existent technical failure, typically an audio problem, the criminals urge the user to download a supposed repair patch, which is in fact a malicious script targeting macOS systems. Once the victim executes this file, the attackers gain full system access, allowing them to steal Bitcoin and take over social media accounts to further expand their network of potential victims.
The technical evolution of identity impersonation in video calls
This intrusion method, which closely mirrors documented tactics of the BlueNoroff group, a subgroup of the North Korean state-sponsored organization Lazarus, demonstrates a significant leap in the complexity of contemporary cyber scams. By using spoofed Zoom domains and deceptive download links, attackers manage to bypass the initial suspicions of professionals who, in theory, possess advanced technical knowledge within the cryptocurrency ecosystem.
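One practical defense against spoofed Zoom domains is to check a link's hostname against an allowlist before clicking. The sketch below is a minimal illustration of that habit; the allowlist of "official" domains is an assumption for demonstration and is not exhaustive — in practice it should come from the vendor's own documentation.

```python
from urllib.parse import urlparse

# Illustrative allowlist only (assumed for this example, not exhaustive).
OFFICIAL_ZOOM_DOMAINS = {"zoom.us", "zoom.com"}

def is_official_zoom_link(url: str) -> bool:
    """Return True only if the URL's hostname is an official Zoom domain
    or a subdomain of one. Lookalike hosts such as 'zoom-tech.us' or
    'us05web-zoom.biz' fail this check because they are not subdomains
    of an allowlisted domain."""
    host = (urlparse(url).hostname or "").lower()
    return any(host == d or host.endswith("." + d) for d in OFFICIAL_ZOOM_DOMAINS)

print(is_official_zoom_link("https://us05web.zoom.us/j/123456789"))   # True
print(is_official_zoom_link("https://zoom-tech.us/download/patch"))   # False
```

Note that the check must compare whole domain labels, not substrings: a naive `"zoom.us" in url` test would be fooled by `zoom.us.attacker.com`.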
However, the danger of this tactic lies in the emotional urgency the attackers generate, pressuring the target to install malicious software immediately during the live interaction. According to technical reports, once the malware infects the device, it disables shell history and repeatedly prompts for system passwords, thereby gaining elevated privileges that compromise the integrity of any digital wallet present on the machine.
How do deepfakes affect global financial security?
The impact of these operations is alarming: losses related to AI-driven impersonation scams reached a record 17 billion dollars in 2025. By exploiting familiar social patterns, these AI-powered social engineering attacks manage to make even security experts doubt their own judgment when interacting with what appears to be a legitimate colleague in a professional setting.
Finally, security analysts warn that visual and auditory content can no longer be considered reliable proof of identity in the digital environment. Given the growing sophistication of these state-sponsored groups, the crypto industry must implement mandatory cryptographic signatures and more rigorous multi-factor authentication processes, understanding that human error, amplified by artificial intelligence, remains the weakest link in the security chain.
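As a concrete instance of the verification habit analysts recommend, the sketch below shows checksum verification of a downloaded file — a simpler relative of the cryptographic signatures mentioned above. The key point is that the published digest must come from a trusted channel (the vendor's official site), never from the same chat that delivered the file. Function names here are illustrative, not part of any specific tool.

```python
import hashlib
import hmac

def sha256_of(path: str) -> str:
    """Compute the SHA-256 digest of a file, reading in chunks
    so large downloads don't need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def matches_published_digest(path: str, published: str) -> bool:
    """Compare the file's digest to one obtained out-of-band.
    hmac.compare_digest avoids timing side channels and is the
    idiomatic way to compare secrets or digests in Python."""
    return hmac.compare_digest(sha256_of(path), published.strip().lower())
```

A file that fails this check — as a fake "audio repair patch" delivered over Telegram would — should never be executed, regardless of how urgent the person on the call makes it sound.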
