SoundCloud appears to have quietly changed its terms of use to allow the company to train AI on audio that users upload to the platform.
As spotted by tech ethicist Ed Newton-Rex, the latest version of SoundCloud’s terms includes a provision granting the platform permission to use uploaded content to “inform, train, [or] develop” AI.
“You explicitly agree that your Content may be used to inform, train, develop or serve as input to artificial intelligence or machine intelligence technologies or services as part of and for providing the services,” read the terms, which were last updated on February 7th.
The terms include a carve-out for content covered by “separate agreements” with third-party rights holders, such as record labels. SoundCloud has a number of licensing agreements with indie labels as well as major music publishers, including Universal Music and Warner Music Group.
TechCrunch was unable to find an explicit opt-out option in the platform’s settings menu on the web. SoundCloud did not immediately respond to requests for comment.
SoundCloud, like many large creator platforms, is increasingly embracing AI.
Last year, SoundCloud partnered with nearly a dozen vendors to bring AI-powered tools for remixing, generating vocals, and creating custom samples to the platform. In a blog post last fall, SoundCloud said these partners would receive access to content ID solutions to “ensure that rights holders [sic] receive appropriate credit and compensation,” and promised to support “ethical and transparent AI practices that respect the rights of creators.”
Many content hosting and social media platforms have changed their policies in recent months to enable first-party and third-party AI training. In October, Elon Musk’s X updated its privacy policy to allow outside companies to train AI on user posts. Last September, LinkedIn amended its terms to permit the scraping of user data for training. And in December, YouTube began letting third parties train AI on user clips.
Many of these moves have spurred a backlash from users who argue that AI training policies should be opt-in rather than opt-out, and who say they should be credited and compensated for their contributions to AI training datasets.