
DuckDuckGoose CEO warns AI-generated identities are already testing digital banking security

12 March 2026


Digital identity verification systems used by banks, fintechs and online platforms are entering what security experts describe as a new risk phase. Advances in generative artificial intelligence (AI) are making it easier to create convincing synthetic identities capable of interacting with digital onboarding systems in real time.

That is the central finding of a new threat intelligence report by Dutch deepfake detection company DuckDuckGoose. Published on 10 March 2026, the report – When Identity Becomes Generatable – argues that identity fraud is shifting from manipulated photos or videos towards entirely AI-generated human identities.

In an exclusive interview with MoveTheNeedle.news, DuckDuckGoose CEO and co-founder Parya Lotfi says this shift is already visible in digital financial ecosystems.

“Identity verification was built on the assumption that visual presence implied a real person,” she explains. “Generative AI removes that assumption. Identity has become generatable. Trust must be established at the moment of identity creation, not after fraud occurs.”

The report focuses particularly on Latin America’s fast-growing digital banking sector, where remote onboarding and real-time payments have transformed financial services – but also increased exposure to identity fraud.


What is a synthetic identity?

A synthetic identity is a digitally created persona that does not correspond to a real individual. Traditionally, fraudsters constructed synthetic identities by combining stolen personal information with fabricated details.

Generative AI has significantly changed this process.

New image and video generation tools can create realistic human faces and animate them into lifelike video sessions. These synthetic identities can interact with identity verification systems designed to confirm that a person is physically present.

According to DuckDuckGoose’s research, the ecosystem producing such synthetic media is expanding rapidly. The report states that more than 55 synthetic media generators were released during the final quarter of 2025, while hundreds of thousands of model variants are now circulating across open AI ecosystems.

These figures come from the company’s internal monitoring of generative AI ecosystems.

Each new model introduces unfamiliar identity characteristics. As a result, verification systems may encounter synthetic identities they have never previously analysed.


Why digital onboarding systems struggle

Digital onboarding systems are designed to verify a new customer remotely. Most rely on a combination of identity document checks and biometric verification such as facial recognition or “liveness detection”.

Liveness checks typically confirm that a person interacting with a camera is physically present, rather than using a photograph or replayed video.

However, generative AI enables attacks that these defences were never designed to catch.

“Most digital onboarding systems were designed to defend against presentation attacks such as printed photos, replayed videos, or masks,” Lotfi says in the interview. “Generative AI introduces a different category of threat.”

Attackers can animate a single photograph into a realistic moving face using image-to-video tools. These synthetic video feeds can then be injected into onboarding systems using virtual camera software, which makes the video appear to originate from a live user.

Because the system sees natural facial movement and expressions, traditional liveness checks may fail to detect the manipulation.


Why Brazil and Latin America are highlighted

The DuckDuckGoose report highlights Brazil and other Latin American digital banking markets as particularly exposed to this evolving threat landscape.

Brazil’s financial system has experienced rapid digital transformation in recent years. Pix, the real-time payment system launched by the Central Bank of Brazil in 2020, has become one of the world’s most widely used instant payment infrastructures.

The convenience and speed of such systems have enabled widespread financial inclusion and innovation.

However, high-velocity digital ecosystems also create opportunities for fraud.

DuckDuckGoose cites industry estimates suggesting that Brazilian institutions recorded R$10.1 billion in banking fraud losses in 2024. While this figure is cited in the report, publicly available material does not independently confirm the exact number.

What is widely acknowledged across the financial sector, however, is that fraud cycles are accelerating. In some markets, scams are completed within hours of account creation.

That speed places enormous pressure on identity verification systems.


Industries most exposed to synthetic identity fraud

While the report focuses primarily on financial services, the risks extend to any sector that relies on remote identity verification.

Industries identified as particularly exposed include:

  • Banks and neobanks

  • Fintech platforms

  • Cryptocurrency exchanges

  • Telecommunications providers

  • Digital lending platforms

  • Online marketplaces.

The common denominator is remote onboarding combined with financial or transactional value.

Once attackers identify weaknesses in onboarding systems, synthetic identities can potentially be generated at scale.


The growing identity verification gap

The report argues that identity verification infrastructure is struggling to keep pace with the speed at which generative AI systems evolve.

New synthetic media generation models are released frequently, and open ecosystems allow developers to produce countless derivative versions.

Each variant may introduce new visual characteristics that detection systems have not previously encountered.

Lotfi describes this as a structural gap between identity generation and identity verification.

“AI-generated identities are changing the threat model for digital verification,” she says. “The key question is no longer only whether a face is alive, but whether that face represents a real person.”


How organisations can respond

DuckDuckGoose argues that organisations must treat synthetic media as a permanent component of the fraud landscape.

The company recommends that organisations begin testing onboarding systems against modern attack scenarios, including AI-generated video and virtual camera injection.

“Traditional liveness checks alone are often not sufficient,” Lotfi explains. “Adding dedicated synthetic media detection alongside existing identity verification systems can significantly improve resilience.”

Equally important is adaptability.

Because generative models evolve rapidly, verification infrastructure must update continuously rather than through occasional upgrades.

Regulatory transparency is also playing a growing role. Verification decisions must remain auditable and explainable so that financial institutions can demonstrate compliance with anti-money-laundering and fraud prevention requirements.


A broader question of digital trust

DuckDuckGoose develops deepfake detection technology designed to determine whether a face or voice has been generated by artificial intelligence. Its systems analyse biometric authenticity and are intended to work alongside existing onboarding infrastructure.

But the broader issue raised by the report goes beyond any single technology provider.

If generative AI continues to improve at its current pace, digital platforms may face a more fundamental challenge: ensuring that the participants interacting with their systems are real people.

For banks, fintechs and regulators, that question touches the core of digital trust.

As Lotfi puts it: “Authenticity assurance will become a foundational part of digital infrastructure.”


Further reading on MoveTheNeedle.news:

AI-Powered Fraud Is Outpacing Enterprise Defenses, Trustpair Warns 
An exclusive interview with Trustpair CEO Baptiste Collot examining how AI-driven fraud is scaling faster than traditional manual controls and why enterprises must redesign financial verification systems.

