People first. Always.

While the full scope of Artificial Intelligence's impact on our society is still unfolding, our position is clear: People first. Always.

Our mission is to make video easy for everyone.

Our products aim to empower and amplify people’s capabilities, never to replace them.
We value the unique qualities and insights only humans can bring, and we ensure
our technology always complements people, never competes with them.


Ethical use of AI

Since 2017, we’ve been pioneering advancements in Generative AI. Ethics and AI safety have always been core to our mission, and to translate our commitments into action, we rely on our 3Cs framework: Consent, Control, and Collaboration. Here’s how it guides our decision-making:


Consent

We respect your right to your own image, and you should decide how it's used. We will never create an AI avatar without the subject's clear consent. That applies equally to politicians and celebrities, even for satirical purposes. No impersonations, no gray areas.

Here’s how we ensure that:

  • Real people, real consent: Our stock AI avatars are based on real human actors, with their explicit consent.
  • Fair play: Actors are given clear and transparent information about how their likeness will be used, and are fairly compensated for their participation.
  • Opting out: Our actors can choose to opt out. We respect that decision, ensuring a smooth transition.
  • Your image, your say: Your avatar can be created only with your explicit consent, following a thorough KYC-like procedure.
  • Complete control: Our platform ensures you can decide who uses your avatar, when, and how. No one can access it without your explicit consent.
  • Opting out: You can request an opt-out. We guarantee your data and likeness will be entirely deleted from our databases.


Control

Our platform is more than just a tool. It's a secure environment for businesses, ensuring your and your company's data stay safe and under your control.

Here's how we ensure this control:

  • Content Moderation Policy: We implement rigorous content moderation at the point of creation, using advanced tech filters and human oversight to prevent inappropriate or harmful content.
  • Trust and Safety Team: Our dedicated Trust and Safety team works around the clock to keep our platform safe and ensure it's used responsibly.
  • Tailored access: We've designed user roles for tailored access, ensuring our tool benefits everyone while keeping sensitive information restricted to authorized individuals.


Collaboration

Tackling AI's challenges is a team effort. We believe in joining forces with industry leaders and regulators.

Our collaborative efforts include:

  • Engaging with Regulatory Bodies: We actively work with regulatory bodies and champion the formulation of robust AI policies and regulations.
  • Partnership on AI (PAI): We are launch partners of the Partnership on AI's Responsible Practices for Synthetic Media, the first industry-wide framework for the ethical and responsible development, creation, and sharing of synthetic media.
  • Content Authenticity Initiative: Along with industry-leading companies such as Adobe, Nvidia, and Microsoft, we are active members of the Content Authenticity Initiative.


All your questions about ethics answered

Where can I find detailed information about Synthesia's security measures?

For an in-depth look at how we manage and protect your data, please visit our security portal.

Is Synthesia compliant with industry standards for data protection?

Absolutely! Our data handling processes are both SOC 2 and GDPR compliant, ensuring a high standard of security and privacy. Please visit our security portal for more details.

What happens to the rights and access to the content I create using Synthesia, both during and after using the service?

You retain the rights to the content you create with Synthesia and can download it. If you integrate Synthesia's content into your videos, we grant you a perpetual license. When you stop using our services, your content will be deleted from our databases.