Companies
AI Deepfakes Are a Security Problem, According to Binance CSO

Binance, one of the most popular and influential platforms in the crypto space, is a global exchange offering a wide range of services and products for traders. In recent months, however, the exchange has faced challenges and controversies, particularly around its compliance with regulatory requirements and its customer verification processes.
Binance increases customer verification amid regulatory scrutiny
In response to these allegations and criticisms, Binance has announced several measures to strengthen its compliance and customer verification processes. For instance, it now requires all new users to complete an intermediate verification process before they can access any of its services or products. This process involves providing a government-issued ID document and completing a facial verification scan.
Existing users who have not completed this verification will also have their account access restricted until they do so.
According to Binance, this move is aimed at strengthening its anti-money laundering (AML) and counter-terrorist financing (CTF) policies and procedures, as well as complying with relevant laws and regulations across jurisdictions. Binance also stated that it would continue to work with regulators and law enforcement agencies to combat illicit activity and protect its users.
Binance Faces a Rise in Deepfake Customer Checks
However, Binance’s verification process has also run into difficulties. In an interview given to a specialized news outlet, Chief Security Officer Jimmy Su said Binance has seen a rise in the use of deepfake technology to bypass its facial verification system. Deepfake technology refers to the use of artificial intelligence (AI) to create realistic but fake images or videos of people’s faces or voices.
Some Binance users have reported receiving messages from scammers who claim to be Binance employees or agents and offer to help them verify their accounts using deepfake software. These scammers ask the users to send their ID documents and a selfie video of themselves saying a specific phrase. The scammers then use these materials to create a deepfake video that mimics the user’s appearance and voice and submit it to Binance’s verification system.
Binance has acknowledged detecting some deepfake attempts on its platform and says it has taken steps to prevent them.
