AI copyright ownership is a question most businesses haven't answered.

The answer isn't "you." It isn't "the AI." And it might not be as simple as "whoever paid for the tool." The legal position on AI-generated IP is genuinely unsettled — contested in courts across multiple jurisdictions simultaneously, under active review by the UK Intellectual Property Office, and being litigated by some of the largest media organisations in the world. For UK businesses, the practical risk is real and growing.

The core uncertainty

Unlike almost every other asset your business creates, AI-generated content sits in legal grey territory across three separate dimensions: whether it can be protected, whether it infringes someone else's rights, and who would own it if it could be protected at all.

Three legal questions nobody has fully answered yet

1. Can AI-generated content be copyrighted?
Copyright traditionally requires human authorship. If an AI generates the content autonomously, there may be no copyright to claim, leaving the output in the public domain and unprotectable.

2. Does AI output infringe training data copyright?
AI models are trained on vast quantities of existing content. When the output closely resembles that training material, the question of whether it constitutes infringement is live and unresolved.

3. Who owns AI output: user, provider, or nobody?
Terms of service vary significantly between AI providers. Some claim rights to outputs. Some grant users rights but with restrictions. Some leave the question entirely to the user to navigate.

The UK legal position — more nuanced than you might expect

The UK is actually ahead of most jurisdictions on this question, but not in a way that makes things simpler. The Copyright, Designs and Patents Act 1988 (CDPA) already has a provision — section 9(3) — that addresses "computer-generated works." Under this provision, the author of a computer-generated work is taken to be the person who made the arrangements necessary for its creation.

That sounds useful. The problem is that the provision was drafted in 1988 to address software-generated reports and procedurally produced outputs. It wasn't designed for a world where an AI model trained on billions of human-created works generates content that's substantively indistinguishable from human creative output.

"The CDPA's 'computer-generated works' provision gives UK businesses more than most jurisdictions — but it was written 35 years before modern generative AI existed."

The UK Intellectual Property Office ran a consultation on AI and IP in 2023, acknowledging that the existing framework doesn't map cleanly onto current AI capabilities. The consultation examined three possible approaches: maintain the existing framework, extend copyright protection to AI-generated works, or exclude AI-generated works from protection entirely. As of 2026, no legislative change has followed. The uncertainty continues.

Four IP risk areas for UK businesses using AI

Where AI creates intellectual property exposure
  • Output copyright: Who owns AI-generated content? The answer affects whether you can protect it commercially or assert rights against a competitor who copies it.
  • Training data liability: If the AI's output is substantially similar to copyrighted training data, you may be the entity that has infringed by reproducing that material.
  • Ownership disputes: Who owns AI outputs created by employees using employer-licensed tools? The employee, the company, or the AI provider? Employment contracts rarely cover this.
  • Contractual clauses: IP terms are buried in vendor agreements. AI provider Terms of Service vary dramatically on ownership and licensing, and most businesses haven't read them.

The litigation landscape that will set your precedents

The legal uncertainty isn't abstract. It's being resolved through active litigation that will produce binding precedents affecting every business that uses AI-generated content commercially. The cases already underway include Getty Images' suit against Stability AI over the reproduction of watermarked training images, the New York Times' suit against OpenAI and Microsoft alleging copyright infringement through training data use, and multiple class actions brought by writers and visual artists.

These cases are working their way through courts in the US and UK simultaneously. The outcomes — which are not expected quickly — will determine whether AI providers are liable for training data infringement, whether their outputs infringe the rights of the original creators, and by extension, what liability flows downstream to the businesses that use those outputs commercially.

The uncomfortable position for UK businesses is this: you may be building commercial assets on a foundation that court decisions in the next 12 to 24 months could destabilise. That doesn't mean you should stop using AI tools. It means you should understand your exposure and put appropriate protections in place.


Legal teams at most SMEs haven't reviewed their AI provider contracts — let alone assessed the IP implications of using AI-generated content commercially.

There's also a practical dimension that most businesses overlook entirely. If you're building a brand or product around AI-generated content, and that content turns out not to be protectable — because it lacks human authorship in the legally required sense — a competitor can copy it without consequence. The investment you made in creating it doesn't give you the protection you assumed it did.

Detect, Assess, Defend

The consultant's approach to AI intellectual property risk:

Detect
  • IP clause review: in all AI vendor contracts
  • Output similarity checks: flag potentially infringing outputs
  • Staff IP usage audit: what AI tools are in use, and for what?

Assess
  • Commercial use of AI outputs? Scope of exposure by use case
  • Jurisdiction of provider? Which legal framework applies?
  • Employee IP assignment clauses? Do contracts cover AI-generated work?

Defend
  • IP policy for AI content: clear rules for all AI-generated assets
  • Human creative contribution: strengthen the copyright claim
  • Contract clauses reviewed: vendor terms understood and recorded
  • Legal counsel briefed: on AI use in commercial outputs

The practical steps businesses can take now

The legal uncertainty doesn't require paralysis. There are concrete steps that reduce your exposure materially while the courts and legislators work through the larger questions.

The most important is ensuring meaningful human creative contribution to any AI-assisted output you intend to protect commercially. A prompt-to-output workflow, where a person types a brief and publishes the AI's response unchanged, is at maximum risk of being found unprotectable. A workflow where a human substantially edits, refines, curates, and develops the AI output is on much stronger ground, both legally and creatively.

The second is understanding what your AI provider's terms actually say. Some providers grant you broad rights to AI outputs. Others retain rights, restrict commercial use, or include indemnification limitations that matter enormously if a copyright claim arises. Most businesses have agreed to these terms without reading them.

The third is updating your employment contracts and IP assignment clauses to explicitly address AI-generated content. The standard "IP created in the course of employment belongs to the employer" clause was written before employees were generating commercially significant content with AI tools. Whether it covers AI-assisted output is a live question that a properly drafted clause can resolve.

How BBS helps with this

  • AI Governance & Policy Drafting — We draft an IP usage policy covering AI-generated content, code and creative assets — defining ownership, approved use cases and the human contribution requirements that strengthen your copyright position.
  • EU AI Act Compliance Assessment — IP risk assessment built into the compliance review, so your AI governance and your regulatory obligations are addressed together rather than separately.
  • Vendor Contract Review — We identify IP and liability clauses in your AI provider agreements, surfacing the terms that matter for commercial use and flagging the gaps that need addressing.
  • AI Acceptable Use Policy — We define approved use cases for AI-generated content across your organisation, reducing commercial IP exposure while giving your teams clear, workable guidance on what they can and can't do.