INFO Blog 2: ARTificial: Why Copyright Is Not the Right Policy Tool to Deal with Generative AI
Case Study Summary
The case study explains why copyright is not the best tool for dealing with generative AI. It shows how these models learn from huge amounts of data, and it raises questions about fairness, authorship, and how the law should treat AI-generated work.
Answering Questions from the Article
How should authors be compensated, and under which circumstances?
In class we discussed the question "Should these companies be required to compensate copyright holders?" and we largely agreed that they should. If companies are using an author's copyrighted work to train their AI, then the author deserves something in return, whether that is royalties or another fair form of payment. These models become better and more valuable because they learn from real creative work, and it feels wrong for big companies to profit from that while the original creators receive nothing. Even if the AI is not directly copying anything, the value still comes from the author's labor. Because of that, I think authors should be compensated whenever their work is included in a training dataset, and whatever system is used should make sure the money actually reaches the creators.
Is training with unlicensed works a fair use, or an infringing one?
In class we talked about whether it is ethical for companies to use huge amounts of copyrighted material to train their AI models without paying, or even asking, the people who created it. We agreed that it felt unethical, and even though no law clearly makes it illegal right now, it still comes across as unfair in a real-world sense. The creators never gave consent, they lose control over their own work, and they are not compensated even though their work directly helps improve these models. Because of that, it feels much closer to infringement than fair use. A legal label cannot erase the fact that these companies are benefiting from creative work they did not pay for. So even if the law might protect them in some situations, using unlicensed copyrighted work without permission still crosses an ethical line for me.
Are the outputs generated by GAI original or derivative works?
I personally see most AI outputs as derivative works. Even if the final image or text is not an exact copy of anything in the dataset, it still depends completely on the creative work that real people made. The model only knows how to generate something because it was trained on thousands of examples that came from human authors and artists. So when I look at AI generated content, it does not feel truly original to me. It feels like the AI is pulling from patterns, styles, and ideas that already existed, just in a new arrangement.
Should the outputs be entitled to copyright protection, and how should we deal with AI authorship?
Reading the article made me think about what it actually means to create something. When I make a project or write something for class, I know it reflects my effort, my choices, and my own experiences. AI does not have that. It does not create with intention or emotion; it just predicts what comes next based on patterns. The article implies this too when it explains that copyright originality is tied only to human authorship and that the current framework has a human-centered bias. Because of that, I do not think AI-generated outputs should receive copyright protection on their own. If a human is clearly guiding the process and making the creative decisions, then the copyright should belong to that person. But when something is produced entirely by the model, I do not see a reason to treat it like human-made work. Keeping those outputs unprotected feels more fair and keeps copyright focused on real human creativity.
New Discussion Question:
If AI eventually becomes capable of creating stories, art, and ideas that feel more relatable or emotional than what most people can make, how will we decide what role humans should play in the creative world? And what will creativity even mean for us anymore?
I chose this question because it goes beyond the legal and technical issues and focuses on something personal. Creativity is a big part of how people express themselves and connect with each other. If AI becomes better than us at doing that, then we have to think about what being creative even means in a future where machines can do it too. It makes me wonder what space is left for human imagination and how we will define our place in a world where AI can generate anything we can think of.
Reflection:
Writing this blog made me think a lot more deeply about AI and creativity than I expected. Answering the questions helped me slow down and think about what fairness really looks like for creators, especially when big companies have so much power. It also made me notice how much I care about keeping space for human expression, because real creativity comes from lived experience and emotion. Overall, this reflection pushed me to think more about the future we are building and what kind of relationship we want between humans and the technology we create.
