detect·deepfakes by Resemble AI
Glossary

C2PA

Also: Content Credentials · Coalition for Content Provenance and Authenticity

An open standard, developed by Adobe, Microsoft, BBC, and others, for cryptographically signing media at creation to produce tamper-evident provenance metadata — letting downstream viewers verify how a file was created, edited, and by whom.

C2PA (Coalition for Content Provenance and Authenticity) is the industry standard for content provenance — cryptographically attested metadata that travels with a media file describing its origin and edit history.

How it works

A C2PA-enabled device or tool (a camera, an AI image generator, an editing app) adds a signed manifest to every file it produces:

  • Who/what generated it (device model, software, AI model).
  • When it was generated.
  • What edits have been applied (including by what tool).

Each manifest is cryptographically signed. Subsequent edits append new manifests, so the full chain of custody is preserved. Tampering breaks the signature.
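The append-only chain above can be sketched in miniature. This is a toy model, not the real C2PA format: actual manifests use X.509 certificates and COSE signatures rather than the shared-secret HMAC and JSON used here, and the field names (`content_hash`, `prev_signature`, the tool claims) are illustrative inventions. It does show the key property: each manifest binds claims to a content hash and to the previous signature, so altering the bytes breaks verification.

```python
import hashlib
import hmac
import json

SIGNING_KEY = b"toy-signing-key"  # stand-in for a real signer's private key


def sign_manifest(content: bytes, claims: dict, prev_sig: str = "") -> dict:
    """Bind claims to the content hash and to the previous manifest's signature."""
    payload = {
        "content_hash": hashlib.sha256(content).hexdigest(),
        "claims": claims,
        "prev_signature": prev_sig,
    }
    body = json.dumps(payload, sort_keys=True).encode()
    payload["signature"] = hmac.new(SIGNING_KEY, body, hashlib.sha256).hexdigest()
    return payload


def verify_chain(content: bytes, manifests: list) -> bool:
    """A chain verifies only if every signature holds and the last
    manifest's hash matches the current file bytes."""
    for m in manifests:
        body = {k: v for k, v in m.items() if k != "signature"}
        raw = json.dumps(body, sort_keys=True).encode()
        expected = hmac.new(SIGNING_KEY, raw, hashlib.sha256).hexdigest()
        if not hmac.compare_digest(m["signature"], expected):
            return False
    return manifests[-1]["content_hash"] == hashlib.sha256(content).hexdigest()


photo = b"raw sensor bytes"
m1 = sign_manifest(photo, {"tool": "Camera X", "action": "captured"})

edited = photo + b" + crop"  # an editing app appends a new manifest
m2 = sign_manifest(edited, {"tool": "Editor Y", "action": "cropped"}, m1["signature"])

print(verify_chain(edited, [m1, m2]))         # True: intact chain of custody
print(verify_chain(edited + b"!", [m1, m2]))  # False: tampering breaks the signature
```

The `prev_signature` link is what preserves the full chain of custody: a verifier can walk back from the latest manifest to the original capture, and no intermediate step can be silently removed or reordered.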

What C2PA does and doesn't give you

Does:

  • A strong positive signal that a file was produced by a specific signed device or tool.
  • A complete edit history (which edits, which tools, when).
  • Tamper evidence: any modification outside the signing chain invalidates the signature.

Doesn't:

  • Prove that an unsigned file is fake (most files won't be signed).
  • Prove that a signed file accurately depicts reality (someone could sign an AI-generated image with a valid creator credential).
  • Survive aggressive re-compression or platform processing that strips metadata.

Adoption status

As of 2026:

  • Cameras: Leica, Sony Alpha series, and Nikon Z9 models ship with C2PA signing.
  • AI generators: OpenAI, Google, Adobe Firefly embed C2PA.
  • Platforms: TikTok and LinkedIn have begun surfacing C2PA data; most platforms still strip or ignore it.
  • Browsers: Limited. Verification usually requires a third-party viewer or browser extension.

Relationship to deepfake detection

C2PA and deepfake detection are complementary, not redundant:

  • C2PA provides positive provenance: "this file was generated by X tool at Y time."
  • Deepfake detection provides negative provenance: "this file's fingerprint matches the distribution of AI-generated content."

Most files in the wild have neither. A robust verification workflow checks both when they're present and falls back to detection when they're not.

See also