
How Hip-Hop Taught Me Everything I Need To Know About AI Copyright

  • Writer: Adrian Munday
  • Aug 10
  • 8 min read

It’s the Bronx in the late '70s. DJs are throwing block parties, and they've discovered something magical: if you loop the drum break from James Brown's "Funky Drummer", people lose their minds on the dance floor. (It's one of the most widely sampled pieces of music in history - you'll recognise it if you look it up on YouTube.)


But this wasn't theft. It was alchemy.


By the mid-'80s, this practice had evolved into an art form. Public Enemy's "Fight the Power" alone contained over 20 samples, creating a collage of sound that felt entirely new while honouring its sources. One of my favourite albums growing up - De La Soul's 3 Feet High and Rising - was even more audacious, weaving hundreds of samples into that amazing piece of art.


I remember hearing "Me Myself and I" for the first time and having no idea that the infectious backing came from Funkadelic's "(Not Just) Knee Deep." These weren't cover versions; they were re-imaginings.


Now some of you may remember that the band's first six albums were (to my delight) finally released to streaming services just over two years ago.


What may surprise you is that De La Soul's classic albums were unavailable not because the band didn't want them there, but because clearing hundreds of samples retrospectively was a legal and financial nightmare.


The Turtles sued them over a 12-second sample. Twelve seconds. The settlement reportedly cost more than the band had made from the entire album. Where that left creators, once the dust had settled, was a more thoughtful re-use of the work of others and a more equitable sharing of the spoils.


My strong sense is that we're seeing history repeat with AI-generated content. Just as sampling forced us to reconsider ownership and creativity, AI is doing the same thing at warp speed. Innovation is outpacing regulation once again.


With that, let's dive in.


Why I Dug Into 300-Year-Old Copyright Law

In researching this blog I did what we all tend to do - I went to what I know.  Having studied law as an undergraduate, I figured I’d dig into the recent court cases to find some clarity.


Big mistake.  The recent case law is patchy, contradictory and varies wildly across jurisdictions. So I did what any reasonable person would do when faced with that kind of chaos:  I went backwards.


The original copyright law (the Statute of Anne, 1710) was created to deal with… wait for it… the printing press disrupting the publishing world. Sound familiar? The law granted authors time-limited rights to encourage learning and creativity, then handed everything over to the public.


The original creators-versus-technology battle.


Fast forward to today and AI companies are hanging their hats on something called “transformative use” - the idea that if you add new meaning or purpose to an existing work, it’s fair game.  The US courts are stretching this existing framework to its limit. When Public Enemy sampled James Brown, they weren't just copying - they created something that made you feel differently about both songs.


Is AI doing the same?  Honestly, I’m not convinced.


The legal wins so far have been messy on both sides.  No clean victories, just years of more expensive litigation ahead.


So if the law doesn’t have answers yet, where else can we look?  Ethics gets complicated fast - creators say “they should have asked”, while others argue humans don’t ask permission for inspiration either.  Article 27 of the Universal Declaration of Human Rights similarly just encapsulates the tension between public access and creators’ rights.


Really, it boils down to two questions: is AI training more akin to a person reading books (fine) or a business exploiting people’s content without permission (usually requires a licence)?  And how do we balance the private rights of creators versus the public good of promoting education, access to information and technological progress?


Whatever you take from the above, I can’t shake the feeling there’s still an unresolved question of natural justice. Because behind every dataset are real people who created something of value - and right now, the spoils of AI’s boom aren’t exactly being split evenly.


What's Clear Right Now And My Initial Take

But back in the real world, while I was wrestling with these philosophical questions, the courts were making actual decisions.


In late June 2025, both Anthropic and Meta won landmark copyright cases. Judge Alsup found that Anthropic's use of books was "transformative," while Judge Chhabria ruled Meta's training constituted fair use. These aren't technicalities - they’re substantive legal victories.


Also, the licensing deals that are being cut are genuinely impressive. News Corp's estimated $250M licensing agreement with OpenAI marked a turning point, with similar deals projected to generate over $500 million in licensing revenue for publishers in 2025. The New York Times recently signed a deal with Amazon reportedly worth $20-25 million annually.


And publishers do not have "clean hands" in this debate. According to the Reuters Institute Digital News Report 2024, 72% of leading publishers now employ AI tools in content creation and distribution, up from just 28% in early 2024.


Reading this data, away from the philosophical debate we considered earlier, I thought maybe this is how technological transitions actually work. Maybe the doomsday scenarios are overblown, and we're seeing the emergence of a functioning market where creators get paid and innovation thrives.


For several weeks, I leaned toward thinking this was creative destruction in action - messy at first, but ultimately beneficial. And history is littered with examples of this happening before.


We've Been Here Before, Right?

Back to De La Soul and Public Enemy.  History shows that while new technology often feels like a threat, it’s actually the catalyst for a stronger and more resilient creative economy. This predictable cycle of disruption and adaptation has played out time and again, ultimately benefiting creators and society.


Think of the printing press which led to modern copyright law that we mentioned earlier.  It initially seemed to devalue scribes but ultimately created the entire modern publishing industry and the very concept of authorship for millions. The VCR didn't kill cinema as predicted; instead, it created the massive home video market, a brand-new revenue stream for studios. Likewise, the chaos of the Napster and MP3 era gave way to the global streaming ecosystem, allowing artists to reach audiences on a scale previously unimaginable (although not without controversy).


Each of these innovations forced us to ask hard questions and build new systems - like the Statute of Anne, licensing deals, and subscription models - that are now the bedrock of our creative industries.


The unprecedented speed and scale of AI is not a sign that this cycle is broken; it’s just a sign that it's accelerating. This rapid advancement is compressing the timeline for innovation, pressuring us to develop smarter, more equitable solutions for compensation and attribution faster than ever before. As I expressed in my blog on the future of employment, human judgment, creativity and wisdom are more valuable than ever.  We are on the cusp of defining the next great creative economy, one that will likely unlock opportunities for artists and storytellers that we can't even yet envision.


What Governments Can Do Right Now

In researching this blog I have realised how enormous this topic is, how far-reaching its implications are, and that I can only skim the surface within my self-imposed word limit. But I did want to sketch out some immediate options that could better balance the books between creators and AI companies:


Emergency Relief for Displaced Creators: We showed what was possible during COVID. Stock photographers, voice-over artists and content writers are seeing contracts dry up due to AI alternatives. Emergency grants or extended benefits for those demonstrably affected could stem the immediate economic pain and give space for longer-term solutions.


Further Empower Copyright Offices: The US Copyright Office and others have issued initial guidelines, but this work needs funding and expansion. Creating databases of works available for licensing, building opt-out portals, and establishing international co-ordination between agencies should be prioritised.  The infrastructure exists - it just needs resources.


Educate everyone: Many creators still don't realise their work was likely used in training, or what options they have. A simple "Your rights and AI" campaign could cover how to add no-use metadata, how to track whether an AI model is using your work, and how to join collective efforts.
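To make that "no-use metadata" point concrete: the most widely supported mechanism today is a robots.txt file at your site's root that names the training crawlers you want to exclude. The user-agent tokens below are the ones published by OpenAI, Google and Common Crawl at the time of writing - they do change, so check each vendor's current documentation before relying on them:

```
# robots.txt - opt this site out of common AI training crawlers

User-agent: GPTBot            # OpenAI's training crawler
Disallow: /

User-agent: Google-Extended   # Google's AI training opt-out token
Disallow: /

User-agent: CCBot             # Common Crawl, whose corpus feeds many training sets
Disallow: /
```

Worth knowing: compliance is voluntary - this only works against crawlers that choose to honour it, which is exactly why the collective efforts and opt-out portals mentioned above matter too.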


These are short-term fixes. The longer-term challenge is developing "Fair Use 2.0" - updating a decades-old legal framework for the AI age. We need clearer definitions: AI enabling new types of analysis? Fair. AI simply substituting for reading the original? Likely not legitimately transformative.


But I’ll leave the deeper policy rabbit holes for another day - and encourage everyone to take an interest in these decisions being made about our digital future.


The Bottom Line

I started this research genuinely conflicted, leaning towards the “creative destruction” narrative. The licensing deals looked promising.  The court victories suggested maybe the market was working.


But the deeper I dug, the clearer it became.  We’re not just talking about legal precedent or economic theory.  We’re designing the blueprint for how human creativity (and maybe human work more generally) gets valued in an AI world.


The companies building AI systems have the resources to train models responsibly. They're paying billions for compute and talent - they can afford to compensate the creators whose life's work powers their products.


This matters beyond the immediate economics. What sort of society do we want? If we hollow out the incentives for human creativity now, what happens to the next generation of artists, writers and thinkers? Do we want an AI future built on a foundation of unpaid human labour, or one that demonstrates technology can lift everyone up?


I’ve landed on my position.  As someone who spends their day thinking about risk, this feels like the ultimate long-term bet on human potential.


Until next time, you'll find me checking whether this blog has been scraped into the next generation of models - and planning what to do about it when inevitably it has…


Resources and Further Reading

Cultural Context (Where it all started):

  • James Brown’s “Funky Drummer” - listen on YouTube to hear the most sampled drum break in history

  • Mark Ronson’s TED Talk: “How Sampling Transformed Music” - brilliant exploration of creative transformation (thanks to James C for putting me onto this one: great inspiration after sharing an early draft of the blog with him)

  • De La Soul’s “Me Myself and I” vs Funkadelic’s “(Not Just) Knee Deep” - compare the original and the reimagining

  • Public Enemy’s “Fight the Power” - listen for the 20+ samples woven into one track


Legal Decisions (Noting most of these are still ongoing):

  • Bartz v. Anthropic PBC - Judge Alsup's fair use ruling (June 2025)

  • Kadrey v. Meta - Judge Chhabria's decision (June 2025)

  • Thomson Reuters v. Ross Intelligence - Delaware court ruling against fair use (February 2025)

  • Grand Upright Music v. Warner Bros. Records (1991) - the case that changed sampling forever


Industry Analysis:

  • 2024 timeline of major AI licensing deals (Digiday) - well worth a look. The list of deals is impressive

  • Reuters Institute Digital News Report 2024 on publisher AI adoption

  • CISAC Global Economic Study on AI Impact (December 2024)

  • Generative AI deals revealed (Press Gazette) - the 'Who's Suing AI and who's Signing...' article is the latest news as at 31 July


Other Reading:

  • Platforms and Publishers: AI Partnership Tracker - gives the stats on the latest deals and who they're signed with

  • EU AI Act transparency obligations

  • Universal Declaration of Human Rights Article 27

  • Spawning’s Source.Plus - opt out tool for creators



© 2023 by therealityof.ai. All rights reserved
