This feels like a solution to a non-problem. When a person asks the AI to "give me X copyrighted text," no one should expect the result to be a new work. Why is asking ChatGPT for lyrics bad while asking a human is fine?
It's not legal for anyone (human or not) to put song lyrics online without permission or a license to do so. I was googling this to make sure I understood it correctly, and it seems that reproducing lyrics without permission is copyright infringement. Some lyrics websites work with music companies to get licensing to post lyrics, but most websites host them illegally and will take them down if they receive a DMCA request.
Wait wait wait. That is not a good description of what is happening. If you and I are in a chat room and you ask me for the lyrics, my verbalization of them isn't an issue. The fact that it happens online just means the method of communication is different, but that should be covered under fair use.
The AI is taking prompts and providing the output as a dialog. It's literally a language model, so there is a process of synthesizing your question and generating your output. I think that's something people either don't understand or completely ignore. It's not as if entire books are stored verbatim as contiguous blocks of data. The data is processed and synthesized into a language model that then generates an output that happens to match the requested text.
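To make that concrete, here's a minimal sketch of what "generating" means, using the Hugging Face `transformers` library and the small GPT-2 checkpoint purely as an illustration (not OpenAI's actual stack): the model repeatedly predicts a probability distribution over the next token and samples from it, so any verbatim match is a byproduct of those statistics, not a lookup of stored text.

```python
# Minimal sketch of autoregressive generation.
# Assumes: pip install torch transformers, and the "gpt2" checkpoint as a stand-in.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

prompt = "Give me the lyrics to"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids

with torch.no_grad():
    for _ in range(20):  # generate 20 tokens, one at a time
        logits = model(input_ids).logits[:, -1, :]   # scores for the next token only
        probs = torch.softmax(logits / 0.8, dim=-1)  # temperature 0.8 adds randomness
        next_id = torch.multinomial(probs, num_samples=1)  # sample, don't retrieve
        input_ids = torch.cat([input_ids, next_id], dim=-1)

print(tokenizer.decode(input_ids[0]))
```

Because each token is sampled from a distribution, two runs of the same prompt can diverge, which is exactly why the output is generated rather than copied from a stored document.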
This is why we can't look at the output the same way we look at static text. In theory, if you kept training it in a way that opposed the statistical nature of your book or lyrics, you could eventually get to the point where asking the AI to generate your text would give a non-verbatim answer.
I get that this feels like semantics but creating laws that don't understand the technology means we end up screwing ourselves over.
I get how LLMs work and I think they're really cool. I'm just trying to explain why OpenAI is currently limiting these abilities to continue operating within our legal system. Hopefully the court cases find that there is in fact a difference between publishing the information on a normal website versus discussing it with a chatbot so that they don't have to be limited like this.
Publishing lyrics publicly online is illegal while communicating them privately in a chatroom is probably fine. Communicating them in a public forum is a grey area, but you likely won't be caught or prosecuted. If a big company hosts an AI chatbot which can tell you the lyrics to any song on demand, then that seems like an illegal service unless they have the rights.
Feel free to look up the legality of publishing lyrics online; all I found was information saying that it's illegal, but that no one gets prosecuted except the larger companies.
I guess my question is: why does it seem like an illegal service? Not saying it isn't, but it feels like non-technical people will say "it knows the lyrics and can tell me them, so it must contain them."
To me the technology is moving closer to mimicking human memory than to plain storage and retrieval. ChatGPT often gets things wrong because the process of presenting data is not copying but generation. The output is the output, so presenting anything copyrighted falls under the appropriate laws, but until the material is actually presented, some of the arguments being made feel wrong. If I can read a book and then write anything, the fact that your story is in my head shouldn't be a problem. If you prompt the AI for a book... isn't that your fault for asking?