Insights Into AI Avatars

AI avatars are quickly becoming part of everyday life - think of them as digital doubles of real or imagined people, brought to life with help from AI. They’re no longer just for fun. In business, their use is booming, with the market expected to jump from about $26.8 billion in 2025 to a massive $584 billion by 2032. There are now countless platforms racing to nail human-like realism and claim the top spot.

As part of a conference paper for BledCom 2025 by our sister company, Ada and Alan, we dived into the use of AI in employee communications and took the chance to build a custom avatar of one of our own team members using the latest tools. (Nothing like getting hands-on experience to see what's possible!)

Platform Selection: A World of Options

We started by exploring a range of platforms—Synthesia, HeyGen, D-ID—and honestly, you're spoiled for choice. Each one has its own twist, but overall, the output quality is impressive. Even better, everything moves fast: generating custom avatar videos can take just minutes, which makes experimenting (and iterating) surprisingly easy.

The Creation Process: Simpler Than Expected

So, what does it actually take to create an avatar? If you're used to AI tools, the workflow will feel pretty intuitive. Across the options we tried, the common thread is a straightforward user experience and plenty of transferable skills between platforms. Depending on your subscription, you'll get different levels of editing control, but at the higher tiers you can really personalize the avatar to feel uniquely yours.

The Devil's In The Details

That said, realism doesn't come from the software alone. When editing, it's worth being picky about how "human" your avatar looks. The small stuff matters: facial expressions, body movement, and lip-syncing are what tip something from realistic to hyper-realistic.

Most platforms will ask for video footage and audio to build your avatar's core look and behaviour. This makes the source material critical—the cleaner the input, the cleaner the output. Any jitters, odd movements, or stumbles will show up in the final result. From there, you can choose whether your avatar interacts live or delivers a polished pre-recorded message, depending on the job to be done.
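To make that workflow a little more concrete, here is a rough sketch of the pattern most providers follow: upload clean source footage and audio, wait for the avatar to be built, then request a pre-recorded video from a script. It is not tied to any specific platform; the base URL, endpoint paths, and field names below are hypothetical placeholders, so treat it as a shape of the process rather than working integration code.

```python
"""Illustrative sketch of a typical avatar-platform workflow.

The base URL, endpoints, and field names are hypothetical placeholders;
each real platform (Synthesia, HeyGen, D-ID, ...) has its own API, so
check the provider's documentation before adapting anything here.
"""
import time
import requests

BASE_URL = "https://api.example-avatar-platform.com/v1"  # placeholder, not a real service
HEADERS = {"Authorization": "Bearer YOUR_API_KEY"}


def upload_source_material(video_path: str, audio_path: str) -> str:
    """Upload footage and audio; returns an ID for the avatar being built.

    Clean input matters: jitters, odd movements, or stumbles in the source
    footage tend to reappear in the generated avatar.
    """
    with open(video_path, "rb") as video, open(audio_path, "rb") as audio:
        response = requests.post(
            f"{BASE_URL}/avatars",
            headers=HEADERS,
            files={"video": video, "audio": audio},
        )
    response.raise_for_status()
    return response.json()["avatar_id"]


def wait_until_ready(avatar_id: str, poll_seconds: int = 30) -> None:
    """Poll until the platform has finished building the avatar."""
    while True:
        status = requests.get(
            f"{BASE_URL}/avatars/{avatar_id}", headers=HEADERS
        ).json()["status"]
        if status == "ready":
            return
        time.sleep(poll_seconds)


def generate_video(avatar_id: str, script: str) -> str:
    """Request a pre-recorded message from the avatar; returns a video URL."""
    response = requests.post(
        f"{BASE_URL}/videos",
        headers=HEADERS,
        json={"avatar_id": avatar_id, "script": script},
    )
    response.raise_for_status()
    return response.json()["video_url"]


if __name__ == "__main__":
    avatar = upload_source_material("team_member.mp4", "team_member.wav")
    wait_until_ready(avatar)
    print(generate_video(avatar, "Welcome to this month's company update."))
```

A live, interactive avatar follows the same build step; the difference is that instead of submitting a script for rendering, you open a streaming session and feed it responses in real time.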

The Future Is Almost Here

The space is evolving fast. Sure, you might still spot the occasional Americanized British accent or a tiny facial twitch—but at the current pace, the "uncanny" is on track to blend seamlessly with the real, and soon.

This article is based on research conducted for BledCom 2025 by Ada and Alan.

By Freya Meek
