Tuesday, March 14, 2023

AI-Generated Content: Here to Stay

This Week's Topic: AI Art vs Artists / AI Composition vs Authors

Whoooooboy it's wild to be alive during another episode of technology aiding and infringing upon creative works. This season on Helpful and Harmful, we have machines being trained on copyrighted works and regurgitating bastardizations of those works without permission from Intellectual Property owners or remuneration paid to said owners. 

All was well and good in AI's nascent stages when developers used works in the public domain as source material. Then, sourcing tapped into lesser-known protected works under the education umbrella of the Fair Use Doctrine. Still hungry for data, sourcing leveled up to web crawling, blowing past any pretense of acknowledging Intellectual Property laws and protections. Now, AI is like Audrey II from Little Shop of Horrors, screaming "Feed Me, Seymour!"

Developers and project leaders pointed to Consumer Interest to continue to acquire funding. Creative AI hit the sweet spot of the 3 Cs of internet Consumerism: Cool, Cute, and Creepy. Image mashups went viral. Predictive text got baked into apps as a "sticky" feature. It was all entertaining and time-saving. Then the fourth C of Consumerism arrived, slightly behind schedule: Costly.

Through the theft of intellectual property, the AI projects didn't bear the cost of sourcing their data. The cost fell to the copyright holders through lost income. 

Suddenly, artists were discovering machine-generated collages containing significant portions of their original works, including modifications of commissioned works purchased by individuals and large Multinational Corporations. Reuse of purchased art without permission is a violation of the artist-client contract. So who was at fault? Neither of the parties to the contract. These modified works were being reused in commercial ventures without credit, permission, or remuneration. In legal terms, the businesses behind the AI machines infringed on the artists' copyright because the generated works met the Substantial Similarity standard.

Growing pains, the technologists scoffed. The AI "mind" is much like the human mind: the more information to which it is exposed, the more it is capable of expressing original concepts. Put another way, the larger the pool of source material, the less readily identifiable the Intellectual Property infringements. Fully aware that the enforcement of IP law lags significantly behind technology development, the AI teams push ahead. By the time the courts tell them to stop, it'll be far too late. Market integration and saturation will have peaked. The realized revenue will make whatever damages they are ordered to pay a pittance, in the unlikely event that damages are awarded at all.

Seeing artists being screwed, writers winced and wished them luck. Piracy has long been a problem for both groups, as have fan works that cross from appreciation into appropriation. Now there are machines programmed to do both, with consumer-facing frontends ranging from clunky to slick. Artists despaired, but their works remained cataloged.

Despite sniggering over nonsensical AI-generated scripts and genre snippets, writers felt the creep of inevitability. We may not be in the same boat as the artists, but we are navigating the same sea. 

Sure enough, before long, the composition AIs were fed enough source data that predictive text expanded from a sentence to a short reply, to an article summary, to a short article, to short stories, to novellas, to novels. Freelance writers are losing gigs to composition bots. Magazines are inundated by AI-composed articles. Publishers, already unable to efficiently manage slush piles, are buried by the AI additions.

But is AI bad? No. Just because a significant portion of its development came about through peak avaricious capitalism doesn't make the programs themselves bad. Within 3-5 years, AI's integration into our daily lives will be as seamless as emojis and voice assistants. Is it the death knell for creative arts? No, of course not. However, our marketplace is going to be inundated with AI-generated content. It is going to impact our revenue. It is going to demand we learn how to leverage the technology to help us succeed, or we will suffer the fate of the Luddites.

The arrival of this technology isn't too different from when ebooks went mainstream. Publishing went through massive change and expansion. Cottage industries popped up to support the development of the primary technology, which then spawned secondary and tertiary supporting technologies. Remember when the book market exploded with the deluge of self-published books? We're already seeing an influx of AI-generated books.

Can we look to the heavy hitters of industry to push for responsible use of AI? Pfft. If their approach to combating plagiarism and IP infringement is any indicator, it is highly unlikely that major retailers are going to stop AI-generated content from being listed in their stores. Sure, I'd love for the creatives' guilds and the parent companies of publishers to force retailers to use AI detection and employ deterrent programs and policies, but, let's be realistic. Anyone who read the US vs Simon's Random Penguin transcripts can see what little value parent companies place on talent. They'd have to lose billions to AI before they'd bully big retailers like Zon, Walmart, and Apple. It's way more likely that the parent companies will have stood up their own AI divisions before investing in protections for human talent. Remember, profits matter most.

Lest we think we are too holy to partake in the sins of AI, we can't forget that we too are business owners looking to make a profit. If we are presented with low-cost, legally licensed use of AI-generated images for our covers or marketing materials, will we turn away from it on principle? If we are presented with a reasonable cost for an AI voice-acting app to create audiobooks of our novels, are we going to decline for fear of putting voice actors out of work? We are the pot and we are the kettle.

What about protecting our IP from AI? It's an expensive Sisyphean effort, particularly once our works are indexed by machines in countries that don't participate in IP protection. Once the data is added, there's no removing it from every system that has accessed it. That's a battle to be waged at the level of national governments. Sure, we now have small claims courts for copyright infringement in the US, and, yes, the Authors Guild recommends adding a "not for AI training use" clause to all publishing contracts, but the burden falls on us--not the data farms--to prove that a specific farm was the one that ingested our protected text. Good luck proving it before you go broke.

Look, we--the authors--have never had a say in how many books of what quality are released in our genre. Sure, we worry about reader experiences and how "badly written" books turn away potential buyers, but we can't control any of it. All we can do is write our stories to the best of our abilities...and scream into the din of Buy Me in search of readers. As for welcoming AI into our creative and business processes, we shouldn't shy away, but we need to be more responsible when it comes to the IP of others. That means being more diligent about verifying the licensing of images and voice work.  

AI isn't going away. It's intended to make our lives easier. It's on us to figure out how, responsibly.
