The woman looks into the camera and explains how the product “completely changed” her routine. She speaks naturally, gestures as she talks, and sounds convincing. There is just one problem – she isn’t real.
Across the internet, a growing number of tools now allow businesses to create video “customer testimonials” using entirely synthetic people, complete with human faces, voices and expressions. The result is something that looks almost identical to a genuine review, but without any real customer behind it.
A growing market for synthetic “customers”
Research conducted by Andrew Madden, consultant at eCommerce Smart, identified multiple platforms openly marketing tools that generate “UGC-style” video testimonials using artificial intelligence.
These platforms are not being positioned as experimental technology. They are being sold as practical marketing tools. Some promise videos that “look real” or “sound natural”, while others describe outputs as “indistinguishable from real people”.
In practice, creating a testimonial-style video can require little more than uploading a product link, selecting a digital presenter and inputting a script. Multiple variations can then be generated within minutes.
For a consumer, there is no obvious way to distinguish these from genuine customer experiences.
Why video testimonials are so persuasive
Video testimonials carry weight because they feel personal. A person speaking directly to camera, describing a product in the first person, signals lived experience.
Research shows people form impressions of trust from faces almost instantly, often without consciously analysing what they are seeing. In fast-moving environments such as social media feeds, those impressions can be enough to influence a decision.
This technology is designed to replicate that effect. It mirrors the exact format consumers associate with authenticity – a person, a story and a recommendation – even when no real experience exists behind it.
Where consumers are encountering them
These videos are designed to appear where decisions are made: social media adverts, product pages and landing pages.
Some tools guide businesses to place testimonial-style content at key points in the customer journey. Others allow multiple versions to be generated and tested to identify which is most persuasive.
From a consumer perspective, this means it is entirely possible to encounter several different “customers” recommending the same product – none of whom are real.
The rules are clear – enforcement is not
UK consumer protection rules prohibit misleading advertising, including fake reviews and endorsements. Advertising standards also require testimonials to reflect genuine customer experiences.
Where AI-generated videos are presented as real customers describing real experiences, they are likely to fall within these rules.
However, there is little public evidence of these rules being enforced against AI-generated testimonial content. Nor is there any clear requirement for businesses to disclose the use of artificial intelligence in advertising, leaving consumers with limited ability to distinguish genuine endorsements from synthetic ones.
Regulator asked, but no response
eCommerce Smart contacted the Competition and Markets Authority (CMA) on 24 February 2026 to request clarification on how these rules apply to AI-generated video testimonials.
The regulator was asked whether such content would fall within the scope of prohibited fake reviews, whether it is monitoring platforms offering these tools, and whether existing enforcement powers are sufficient.
The CMA did not respond to requests for comment.
Most people won’t realise what they’re watching
For consumers, the issue is not simply that artificial intelligence is being used in advertising. It is that the signals traditionally used to judge trust – a face, a voice and a personal recommendation – may no longer indicate a real person or a real experience.
Online shopping depends heavily on reviews and endorsements. If those signals become harder to interpret, the reliability of the wider system is weakened.
For now, most people are unlikely to question whether the person in a video testimonial is real. The format is familiar, and the experience feels authentic.
But as this technology becomes more widespread, that assumption may become harder to maintain.
Because once it is possible to create a convincing “customer” who never existed, it becomes significantly more difficult to know what – or who – you are actually seeing when you shop online.