The Future of UK Creativity: Safeguarding Creators in the Era of AI

The UK's creative sector is a global powerhouse, contributing over £124 billion in gross value added to the country's economy in 2024 and accounting for 2.4 million jobs. We are entering a golden age of creativity, with 16.6 million* people in the UK expected to contribute to the creator economy by 2027. As technology advances, generative artificial intelligence (AI) is revolutionising the creative industries, offering tools that enhance efficiency, unlock new artistic possibilities, and allow creators to push the boundaries of innovation.

At Adobe, we operate at the intersection of enterprise, technology, and the creative community. We firmly believe that AI serves as a tool that assists and amplifies human creativity. When used responsibly, AI can free creators from tedious tasks, enabling them to focus on their craft and unleash their creative potential.

Whilst many creators are already using AI as an ingredient in their creative process, some are also concerned about issues such as style imitation for commercial gain, copyright infringement, misattribution, and loss of control over their work. AI will only reach its full potential when the tech and creative industries harness its opportunities responsibly whilst ensuring that creators' rights and preferences are respected.

Because Adobe is committed to unlocking the potential of AI responsibly, this is a question we have considered at length. Our Adobe Firefly generative AI models are IP-friendly and commercially safe: they are trained on licensed imagery, Adobe Stock content, and public domain content where the copyright has expired. This means that customers can tap into the power of AI knowing they are using a model that respects their fellow creators, and enterprise customers can leverage AI with confidence, using a model that minimises legal and reputational risks, such as the risk of infringing on someone else's IP.

We also believe there are steps that industry and government can take together to both protect creators and foster AI innovation. This is an area the UK Government is actively exploring, specifically with regard to copyright protections in the age of AI. National copyright systems were created to provide economic incentives for creativity. Today, the government has a unique opportunity to ensure the UK's approach to copyright preserves these economic incentives for creators while maintaining the UK's appeal to companies looking to scale AI development and investment.

The Government's consultation on potential new AI and copyright laws closed at the end of February 2025. Whilst a lot of focus has understandably been placed on the content that goes into developing AI models (the input), there are also measures that should be considered to protect creators and consumers with respect to what comes out of AI models (the output).

To foster responsible innovation while protecting the UK's creative industries, here are some measures for policymakers to consider:

Establishing an Anti-Impersonation Right

A creator's unique artistic style forms a large part of their currency in the creative market. Bad actors can misuse AI models to intentionally replicate an artist's style without consent for commercial gain, competing directly with that artist in the marketplace using their own distinctive style. This threatens the livelihoods of creators.

Introducing specific protections against AI-enabled imitation would mark a significant step in UK legislation that would benefit creators. Similar protections are being explored in the U.S. through legislation like the Preventing Abuse of Digital Replicas Act (PADRA), which aims to safeguard creators from unauthorised digital reproductions that undermine creators' livelihoods.

Facilitating an Opt-Out Standard

While broad access to data is important for accurate and reliable AI development, creators must also have the ability to effectively opt out of having their work used to train AI models if they choose.

Since 2019, we've been working on developing and promoting widespread adoption of Content Credentials, a "nutrition label" for digital content. Built on an open technical standard maintained by the Coalition for Content Provenance and Authenticity (C2PA), a technical standards body, Content Credentials allow anyone to attach secure metadata to their digital content to share information about themselves and how their work was created. Content Credentials also allow creators to set preferences around generative AI training and usage. By attaching a "do not train" tag to their content, creators can signal to AI developers that specific pieces of their content should not be used to train AI models.
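For illustration, the "do not train" preference is expressed as a training-and-data-mining assertion carried inside the content's Content Credential. The sketch below shows roughly what such an assertion could look like; it is a simplified, hypothetical example rather than a verbatim excerpt, and the exact labels and permitted values are defined by the current C2PA specification.

    {
      "label": "c2pa.training-mining",
      "data": {
        "entries": {
          "c2pa.ai_generative_training": { "use": "notAllowed" },
          "c2pa.ai_training": { "use": "notAllowed" },
          "c2pa.data_mining": { "use": "notAllowed" }
        }
      }
    }

Because the Content Credential is cryptographically signed and bound to the asset itself, an AI developer's data-collection pipeline can check for this preference and skip the file wherever it is encountered online.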

Domain-level approaches to controlling data use also exist through instructions to bots and scrapers contained in robots.txt files. These can be effective when the creator or rightsholder also owns the domain where the content is hosted. However, in cases where a creator publishes content on domains they do not control, such as a social media platform, or when their content is available on multiple sites, Content Credentials can give creators greater control over their preferences because the "do not train" tag remains attached to their content wherever it exists online. Adobe also recently launched the public beta of Adobe Content Authenticity, a free app that enables creators to easily apply Content Credentials to their work and signal to generative AI models that they don't want their content used for training.
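To make the domain-level approach above concrete, here is a minimal, hypothetical robots.txt sketch that a site owner might place at the root of a domain they control. The user agents shown (GPTBot and CCBot) are examples of real crawlers associated with AI training data collection; each AI developer documents its own crawler name, and compliance with these directives is voluntary.

    # Ask known AI-training crawlers not to collect anything from this site
    User-agent: GPTBot
    Disallow: /

    User-agent: CCBot
    Disallow: /

    # Other crawlers, such as general search engines, remain unaffected
    User-agent: *
    Disallow:

As noted above, this only works where the creator controls the domain, which is exactly the gap that a content-bound "do not train" Content Credential is designed to close.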

Ultimately, creators benefit most when they have access to a range of approaches and tools to express their preferences around whether they want their work to be used to train AI. Government support is needed for approaches like Content Credentials to be adopted industry-wide, and to ensure that all AI developers respect the training preferences that creators set.

Bringing Transparency and Attribution to Digital Content

Transparency is more important than ever in the age of AI. Our research highlights this as a growing challenge for consumers, with 76% of UK respondents in a recent study by Adobe reporting increased difficulty in verifying the trustworthiness of online content.

In addition to helping creators signal their AI training preferences, Content Credentials are instrumental in helping consumers to discern the trustworthiness of content, especially amid concerns over harmful deepfakes and global elections. The UK Government can take two critical steps to drive widespread availability of Content Credentials:

  • First, they can help drive adoption of Content Credentials as an industry standard. This would remove the burden of content moderation from governments and companies and instead empower creators.
  • Second, the UK Government itself can lead by example and use Content Credentials on official government content, helping to build trust and transparency online by offering citizens a way to verify that government content really came from the agency, department, or policymaker it says it came from.

Adobe is committed to working with policymakers, creators, and industry partners to ensure AI benefits the UK's creative community and consumers alike. Strong creator protection measures and responsible AI development are crucial to maintaining the UK's leadership in innovation and its standing as a global cultural and creative force.

Protecting artists from commercial, AI-based exploitation is in all our best interests. It is critical that government, industry leaders, and businesses act in partnership to protect creators.

As AI continues to evolve, our priority remains clear: empowering creators with tools that enhance their capabilities while giving them greater control over their intellectual property. By fostering transparency, upholding ethical standards, and advocating for supportive policies, we can ensure that AI accelerates creativity without compromising the rights of creators.

*Data source: Adobe CC Total Potential Market (TPM17) survey and modeling; Projection year: 2027; Creator definition: Adults (18+), internet-connected, who have done at least one of the following creative activities in the past 12 months: Photo Editing, Video Editing, Graphic Design, Layout Design, Web/App Design, or 3D Design.

