AI Training Licenses: Are You Training Your Replacement?

“Ethical” AI Training Licenses

Production music libraries and composers are increasingly receiving licensing requests for a new purpose: training generative AI models. Many companies are asking libraries and composers, including PMA members, to “opt in” to licensing their catalogs for supposedly “ethical” AI training. While these offers present a new revenue source that is attractive at first glance, they may ultimately lead libraries and composers to provide the very data needed to create their own replacement.

The following outlines the risks to production music libraries and composers in licensing their music for generative AI training and provides guidance for PMA members facing these decisions.

AI Music Generation

When AI companies request your music as “training data” – even for supposedly “ethical” training – they are using it to teach their systems to understand and recreate musical patterns. By analyzing the patterns, structures and relationships within that music, these AI models learn to generate new music based on what they have learned.

Unlike licensing music for a film, which uses it once, AI training creates a permanent capability to generate unlimited similar works. And while the “ethical” data training touted by some in the music industry may provide certain short-term compensation for the licensed music, it carries tangible long-term risks that production music libraries, composers and creators should consider.

The Replacement Risk: Why This Matters

Direct Competition from Training Data

AI systems trained on production music libraries will learn to create music with characteristics similar to the libraries’ works, at massive scale. Indeed, AI music generators can already produce tracks in specific styles within seconds – far faster than human creators could ever match.

This massive influx of ready-made tracks will compete directly with traditional production music libraries and could create an unfair competitive landscape, making it more difficult for libraries to grow their catalogs and continue to employ songwriters, recording artists and other creators.

Permanent Transfer of Value

Once an AI system learns from licensed music, it cannot “unlearn” those patterns. The value extracted from a library’s catalog becomes permanently embedded in the AI model, which can then generate infinite variations of similar music. Because AI-generated music requires no human contribution, it can be offered at lower prices or for free, permanently undercutting the value of production music libraries. Moreover, generative AI models can continue to improve the quality of their output without any additional payments or license fees to rightsholders, creating the risk that they ultimately outperform and render unnecessary human-created production music.

The Law May Not Protect Against The Risks of AI Training

The law surrounding AI training and content generation remains unsettled. Several pending lawsuits will define the contours of, and limitations on, the use of data to train AI models, but even where training is licensed, existing copyright and unfair-competition law may not be adequate to protect against the value degradation that could come from an influx of AI-generated production music.

Moreover, standard AI training licenses – including “opt-in” licenses for purportedly “ethical” training models – typically include problematic terms such as:

  • Overly broad, perpetual rights to “learn from” a licensor’s entire catalog
  • Little to no restriction on how similar AI outputs may be to the works on which the model is trained
  • Minimal compensation that fails to account for the long-term market impact
  • A lack of revenue sharing from the resulting AI capabilities

Protecting Library Content

Key Contract Safeguards

Any production music library facing a decision to license its content for generative AI training should consider all of the risks of allowing third-party AI models to train on its content. The potential for near-total replacement of production music catalogs by AI-generated content is real, and the long-term effects should be weighed against limited short-term gains.

If the decision is ultimately made to license the content and allow such training, libraries should consider demanding certain protective measures, including:

  • Specific prohibitions against creating outputs that compete with production music
  • Substantial compensation reflecting the permanent transfer of the catalog’s value
  • Ongoing revenue sharing from the AI system’s commercial uses
  • Specific stylistic limitations on what the AI can generate
  • Retained rights to approve or reject specific implementations of the trained model

Conclusion

While AI development is inevitable, how and whether production music libraries contribute to it remains a choice. The central question is clear: Does the short-term licensing revenue justify training a system that may permanently reduce the value and demand for a library’s music?

Before signing any AI training license, consult with legal counsel experienced in both music rights and technology. Consider not only the immediate terms but the long-term impact on your catalog’s value in a market where AI-generated alternatives exist. The decisions production music creators and libraries make today will shape the industry’s future for years to come.

-----

The foregoing provides general information for educational purposes only and does not constitute legal advice. PMA members should consult with qualified legal counsel regarding their specific circumstances.