Deepfake Scams Set to Cause $25 Billion in Crypto Losses in 2024, Bitget Predicts
Summary:
Deepfake schemes are predicted to cause more than $25 billion in crypto losses in 2024, according to Bitget Research. This figure represents a major increase from previous years, driven by a 245% rise in deepfakes worldwide. In Q1 2024, the most deepfake incidents were detected in China, Germany, Ukraine, the US, Vietnam, and the UK. Fraudsters commonly use deepfake technology in phishing attacks, fake projects, and Ponzi schemes to gain the trust of crypto investors. Bitget forecasts that without effective countermeasures, deepfakes could account for 70% of all crypto-related crimes by 2026.
Deepfake schemes and scams in the crypto sector are expected to cause losses exceeding $25 billion in 2024, a significant increase over the previous year, according to research from Bitget. Figures from a report published on June 27 show a 245% global increase in deepfakes in 2024, based on earlier data from Sumsub. In Q1 2024, the most deepfake incidents were detected in China, Germany, Ukraine, the US, Vietnam, and the UK, with the cryptocurrency industry recording a 217% increase compared to Q1 2023. According to Bitget, deepfakes caused $6.3 billion in crypto losses in the first quarter, and the firm projects losses to reach $10 billion per quarter by 2025.
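Taken together, those figures imply a steep quarter-on-quarter growth rate. The short calculation below is a hypothetical back-of-the-envelope check, not from the Bitget report: it assumes constant compound growth, solves for the quarterly rate that would take losses from $6.3 billion in Q1 2024 to $10 billion per quarter a year later, and sums the implied 2024 total.

```python
# Back-of-the-envelope check of the report's quarterly figures.
# Assumes constant compound growth from Q1 2024 ($6.3B) to Q1 2025 ($10B);
# the growth model is an illustrative assumption, not from the Bitget report.

Q1_2024 = 6.3   # reported Q1 2024 crypto losses from deepfakes, in $B
Q1_2025 = 10.0  # projected quarterly losses by 2025, in $B
QUARTERS = 4    # Q1 2024 -> Q1 2025

# Constant quarterly growth rate r such that 6.3 * (1 + r)^4 = 10
r = (Q1_2025 / Q1_2024) ** (1 / QUARTERS) - 1
print(f"implied quarterly growth: {r:.1%}")  # ~12.2%

# Sum the four quarters of 2024 under that growth path
total_2024 = sum(Q1_2024 * (1 + r) ** q for q in range(4))
print(f"implied 2024 total: ${total_2024:.1f}B")  # ~$30.2B, consistent with the >$25B headline
```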
“An influx of deepfakes in the crypto business is an inevitable ordeal, and there is little that can be done to prevent them without sufficient education and awareness,” Bitget CEO Gracy Chen told Cointelegraph. Interestingly, the tactics used by deepfake fraudsters have not changed significantly over time. Most crypto losses attributable to deepfakes come from phishing attacks, fake projects, and Ponzi schemes that use deepfake technology to win investors' trust. For more than two years, these schemes have accounted for over half of all deepfake-linked losses.
“Through imitating influential personalities, these schemes generate an illusion of credibility and considerable project financial backing, consequently attracting large investments from unsuspecting victims who fail to perform due diligence,” Bitget Research reported. Deepfakes are also used extensively in other scams, including cyber extortion, market manipulation, and identity fraud. Bitget projected that by 2026, deepfake-driven fraud could account for 70% of all crypto-related crimes if no effective countermeasures are put in place.
Ryan Lee, chief analyst at Bitget Research, told Cointelegraph that criminals are increasingly using fake images, videos, and audio to make a greater impact on their victims. A video impersonating a close acquaintance of the victim can be the decisive tool for fraudsters, while a fabricated video of an influencer can boost investor confidence in a scam project.
Lee pointed out that a major concern with deepfake technology is AI-powered voice impersonation, which lets con artists simulate calls from people the victim knows and ask for money. Another threat is deepfakes that bypass Know Your Customer (KYC) checks to gain unauthorized access to a person's funds.
At present, Lee cautioned, exchanges "must pay close attention" to the "Proof of Life" features of their KYC systems. Such a feature verifies that a user is real by tracking actions like moving or blinking in real time, or through secondary "Proof of Life" requests. “We use cutting-edge AI tools to swiftly identify and forestall any cases of deepfake usage, and we caution all our users upon registration of this,” he added.
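As a rough illustration of how such a liveness check can work, the sketch below uses blink detection via the eye aspect ratio (EAR), a common technique in face-liveness systems; it is not Bitget's actual implementation. The `get_eye_landmarks` helper is hypothetical and stands in for a real face-landmark detector such as dlib or MediaPipe.

```python
# Minimal blink-based liveness check, as used in many "Proof of Life" flows.
# Hypothetical sketch: get_eye_landmarks() stands in for a real face-landmark
# detector (e.g. dlib or MediaPipe); this is NOT Bitget's implementation.
from math import dist

EAR_THRESHOLD = 0.21  # eye aspect ratio below this counts as "eye closed"
CONSEC_FRAMES = 2     # frames the eye must stay closed to register a blink

def eye_aspect_ratio(eye):
    """eye: six (x, y) landmarks around one eye, ordered as in dlib's
    68-point model. EAR drops sharply when the eyelid closes."""
    a = dist(eye[1], eye[5])  # vertical distance, upper-lower lid (outer)
    b = dist(eye[2], eye[4])  # vertical distance, upper-lower lid (inner)
    c = dist(eye[0], eye[3])  # horizontal distance, corner to corner
    return (a + b) / (2.0 * c)

def count_blinks(frames, get_eye_landmarks):
    """Count blinks across a sequence of video frames. A static photo or a
    replayed image held up to the camera produces zero blinks."""
    blinks, closed_frames = 0, 0
    for frame in frames:
        left, right = get_eye_landmarks(frame)  # hypothetical helper
        ear = (eye_aspect_ratio(left) + eye_aspect_ratio(right)) / 2.0
        if ear < EAR_THRESHOLD:
            closed_frames += 1
        else:
            if closed_frames >= CONSEC_FRAMES:
                blinks += 1  # eye reopened after being closed long enough
            closed_frames = 0
    return blinks

# A session might pass only if the user blinks at least once within a few
# seconds, ideally combined with a randomized challenge ("turn your head left").
```

Sophisticated deepfakes can synthesize blinks and head movements, so production liveness systems typically combine several signals, such as randomized challenges, texture analysis, and depth cues.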
Published At
6/27/2024 11:00:00 AM
Disclaimer: Algoine does not endorse any content or product on this page. Readers should conduct their own research before taking any actions related to the asset, company, or any information in this article and assume full responsibility for their decisions. This article should not be considered as investment advice. Our news is prepared with AI support.