Our viewer-engagement study with Realeyes saw engagement scores rocket by 1200%. Read on to find out how, why, and what it means for your business.
After our video of David Beckham flawlessly speaking nine different languages for a campaign for Malaria Must Die went viral, we’ve been asked: how much does it actually matter how you translate a video?
This video clearly shows how AI, and the translation technology we create at Synthesia, is empowering humans to communicate in new, better ways.
From the time we’ve spent talking to our customer base, and from previous research we’ve undertaken, we know the impact our tech has on viewer engagement. But we wanted to go beyond qualitative feedback and prove it quantitatively - which is why we commissioned a study with 300 participants, in partnership with AI-powered video technology pioneer Realeyes.
Think about watching a movie with subtitles - you have to focus on reading them. And we’ve all seen TV ads with badly-synced dubbing - the experience is jarring, and it’s tough to truly lose yourself in the content, let alone be sold on the product.
Animated films - think Toy Story and Frozen - tend to perform far better than live-action in non-English-speaking countries. Why? The digital characters are re-animated to match each language, so the content is perceived as native.
It makes sense, right?
Of course, there are lots of factors that affect how you consume a video, from production values to storytelling to the music.
But, if you aren’t working out of Hollywood, there are some non-negotiables when creating video content.
Quality production, clear sound, humans on-screen, graphics to explain complex concepts, a solid introduction, and translation that lets the viewer consume your content without distraction.
To research this, we partnered with Realeyes on a study to find out just how much different translation methods affect video engagement.
The Realeyes platform uses computer vision and machine learning to measure viewers’ attention and emotions as they watch a video. Through a webcam, its emotion AI analyzes facial expressions to measure and quantify consumers’ attention levels and emotional responses.
Imagine you’re watching a video of something really upsetting.
Your face would show signs of distress: a furrowed brow, a down-turned mouth. If you’re viewing something exciting, your eyes might widen, your smile might brighten.
Realeyes combines all of these signs of emotion and attention to assess how well content performs and calculates two scores - ‘Engagement’ (did the content produce a strong emotional moment?), and ‘Impact’ (did the content leave a lasting positive impression?).
Using this technology, we tested two versions of a video with two matched groups of 150 participants each - one version using traditional dubbing, the other using Synthesia Translate.
To make sure we were isolating the translation method from the content, we chose a purposely dull video - an introduction to an e-learning course (as a side-note, e-learning doesn’t need to be dull...if you’re creating content for e-learning, get in touch!).
We knew the results were going to show an improvement in engagement and impact.
Emotional impact is why most people choose to use video in the first place – we’ve all read the research on the ‘video effect’ and how viewers retain 95% of a video’s message compared to 10% when reading text.
But up until now it’s been difficult to measure and quantify emotional impact and attention levels.
The results are impressive.
• 200% improvement in the overall Realeyes engagement score vs. traditional dubbing
• 175% improvement in the overall Realeyes impact score
• More than 1200% increase in the engagement score for the under-34 group
• 600% increase in the impact score for the under-34 group
People connect more with AI-dubbed content in their own language – the Realeyes engagement score doubled compared to traditional dubbing, and the impact score saw a 175% increase. Potentially the most interesting finding was the extreme difference between older and younger age groups.
From the results, we can see that traditional dubbing is a huge turn-off for younger audiences, who have grown up with personalized content - they’re used to being picky.
Older generations grew up with traditional, out-of-sync dubbing, and the results showed the impact of this.
For the over-34s, the impact score increase from using AI dubbing was 40% - impressive, but nothing close to the under-34 group.
These results were astounding.
• The engagement score in this younger group rose from 3 (traditional dubbing) to 37 (AI dubbing) - an over 1200% increase.
• The impact score rose from 8 (traditional dubbing) to 50 (AI dubbing) - an almost 600% increase.
What does this mean for you?
• When it comes to business goals, these results highlight huge opportunities.
• In advertising, there is a direct link between emotional impact, brand affinity, and buying decisions.
• For online training and corporate communication videos, the most important factors include content comprehension and memory recall, both directly linked to attention and emotional impact.
• In sales and customer communications, you absolutely must create a solid connection - your targets are constantly bombarded with information, and the messages they’ll remember are those they connect with emotionally.
Next up we’ll be testing out other aspects of video creation. What’s the impact of using a human in a video? And how much does personalization affect emotional impact?
Stay updated by signing up to our newsletter and following us.
Click on the link below and get started using Synthesia with full access to all 40+ avatars and 40+ languages.
"Synthesia is changing how my clients enter the global marketplace by delivering a hyper-local impact. In my world, where language is so crucial, it makes a huge difference. The process itself is so advanced you can’t tell your actor is not a native speaker!"