
FAIR AI Attribution (FAIA)

The FAIA Project

The FAIA project (FAIR AI Attribution) develops an open, structured framework to disclose the role of artificial intelligence in content creation. FAIA is developed by Liccium in collaboration with Leiden University and the GO FAIR Foundation, addressing a growing need for transparency in digital publishing and research.

As AI tools become more common in writing, publishing, and media production, transparency is no longer optional; it is essential for trust, credibility, and compliance. With the increasing integration of generative AI tools, ranging from writing assistants to full content generators, creators, publishers, and academic researchers need a consistent and verifiable way to indicate whether and how AI has contributed to a work. FAIA provides that mechanism.

This short video introduces the FAIA initiative. It highlights the problem of AI opacity, emerging regulation like the EU AI Act, and a practical solution for certifying content with verifiable AI involvement.

Why is FAIA necessary?

As AI tools increasingly shape the production of text, images, audio, and video, it is becoming harder to distinguish between content made by humans and that produced or modified by machines. This lack of transparency creates serious challenges:

  • Loss of trust in digital media, journalism, and scientific publishing

  • Legal and regulatory gaps, particularly under frameworks like the EU AI Act

  • Misrepresentation risks for researchers, creators, and platforms

  • Inability to trace provenance or evaluate reproducibility of AI-influenced work

FAIA addresses these issues by making AI involvement visible, verifiable, and portable. It enables compliance with disclosure obligations, supports ethical publishing practices, and builds a foundation for content authenticity and accountability in the age of generative AI.

Goals and Scope

The FAIR AI Attribution (FAIA) framework provides a practical solution for creators and rightsholders to disclose the role of generative AI in the production or editing of digital content. Its core goals are to:

  • Enable transparent documentation of AI involvement in content creation.

  • Support compliance with emerging regulatory frameworks, such as Article 50 of the EU AI Act.

  • Strengthen provenance, reproducibility, and trust in digital media and publishing ecosystems.

FAIA is implemented as a plugin in the Liccium software, allowing users to flag AI involvement directly within the declaration and signing process. These flags are machine-readable, interoperable, and cryptographically linked to the content via its ISCC code. This ensures that AI attribution data travels with the content, is verifiable across systems, and can be accessed by platforms, publishers, or researchers at any stage of the content lifecycle.
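To make the idea of a machine-readable, content-bound attribution flag concrete, the sketch below shows one possible shape for such a record in Python. The field names, the example ISCC value, and the identifiers are illustrative assumptions only, not the official FAIA vocabulary or the Liccium API; in practice the declaration would be created and cryptographically signed within the Liccium software.

```python
import hashlib
import json

# Hypothetical sketch: all field names and values are illustrative
# assumptions, not the official FAIA vocabulary.
declaration = {
    # ISCC content identifier the declaration is bound to (example value)
    "iscc": "ISCC:KACYPXW445FTYNJ3",
    # Nature of AI involvement, e.g. "none", "ai-assisted", "ai-generated"
    "ai_involvement": "ai-assisted",
    # Free-text note on which parts of the work AI contributed to
    "statement": "Draft generated with an AI writing assistant, then human-edited.",
    # Hypothetical identifier for the declaring party
    "declared_by": "did:example:creator-123",
    "timestamp": "2025-01-01T00:00:00Z",
}

# Canonical serialisation so independent systems compute the same digest
canonical = json.dumps(declaration, sort_keys=True, separators=(",", ":"))

# A digest like this could then be signed by the declaring party, linking
# the attribution flags verifiably to the identified content
digest = hashlib.sha256(canonical.encode("utf-8")).hexdigest()
print(digest)
```

Because the record references the content's ISCC code rather than a file location, any platform that resolves the same ISCC can retrieve and verify the same attribution data, which is what allows the metadata to travel with the content across systems.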

Outcomes and Impact

The FAIA framework contributes to more responsible and transparent publishing practices by:

  • Enabling creators, researchers, and editors to avoid unintentional misrepresentation of AI-generated or AI-assisted work.

  • Supporting the disclosure policies of journals, research institutions, and funding bodies that increasingly require clarity around AI use.

  • Allowing platforms, repositories, and downstream users to filter, assess, or verify content based on persistent, transparent attribution metadata.

While FAIA is intended as a cross-sector solution applicable to all types of digital media, the first implementation focuses on academic publishing and trade book workflows. By embedding transparency into the metadata layer of content itself, FAIA enhances trust in human-AI collaboration and provides a foundation for long-term accountability in digital knowledge production.