this post was submitted on 22 Dec 2024
1318 points (97.3% liked)


It's all made from our data, anyway, so it should be ours to use as we want

[–] patatahooligan@lemmy.world 5 points 4 hours ago (3 children)

> the AI companies have a pretty good defense in the fact analyzing publicly viewable information is a pretty deep rooted freedom that provides a lot of positives to the world

They are not "analyzing" the data. They are feeding it into a regurgitating mechanism. There's a big difference. Their defense is only "good" because AI is being misrepresented and misunderstood.

I agree that we shouldn't strive for more strict copyright. We should fight for a much more liberal system. But as long as everyone else has to live by the current copyright laws, we should not let AI companies get away with what they're doing.

[–] ClamDrinker@lemmy.world -1 points 50 minutes ago (1 children)

> They are not "analyzing" the data. They are feeding it into a regurgitating mechanism. There's a big difference. Their defense is only "good" because AI is being misrepresented and misunderstood.

I really kind of hope you're kidding here, because this has got to be the most roundabout way of saying they're analyzing the information. Believing the analysis exists only to regurgitate (which I have yet to see good evidence for, at least in the larger models) doesn't change the definition of analyzing. By insisting otherwise you are misrepresenting it, and showing you might have misunderstood it yourself, which is ironic. It also doesn't help anyone who wishes to reduce the harm from AI, because you're handing ammunition to people who can point at you and say you're being irrational about it.

[–] patatahooligan@lemmy.world 0 points 13 minutes ago

Yes, if you completely ignore how the data is processed and how the product is derived from it, then everything can be labeled "data analysis". Great point. By that logic copyright infringement can never exist, because the original work can always be considered data that you "analyze". Incredible.

[–] Landless2029@lemmy.world 1 points 2 hours ago

Not to mention patent laws are bullshit.

There are law offices that exist specifically to fuck with people over patent and copyright law.

There are also cases where people use copyright and patent law to hold us back. I can't find the article, but some religious jerk patented connecting a sex toy to a computer via USB. Thankfully someone got around the patent with Bluetooth and cell phones; otherwise I imagine the camgirl and long-distance-relationship markets would have had these toys ten years sooner.

[–] gazter@aussie.zone 2 points 4 hours ago (3 children)

I've never really delved into the AI copyright debate before, so forgive my ignorance on the matter.

I don't understand how an AI reading a bunch of books and rearranging some of those words into a new story is different from a human author reading a bunch of books and rearranging those words into a new story.

Most AI art I've seen has been... unique, to say the least. To me, it tends to be different enough from the art it was trained on not to be a direct ripoff, so personally I don't see the issue.

[–] ClamDrinker@lemmy.world 1 points 22 minutes ago* (last edited 19 minutes ago)

Yes, this is my exact issue with some framing of AI. Creative people love their influences, to the point that you can ask them and they will point out the parts they referenced, or nudged toward an influence they partly credit for the result. It's also extremely normal, when you make something new, to brainstorm and analyze any kind of material (copyrighted or not) that evokes the feelings you want to create. As is ironically said to comfort starting creatives that it's okay to be inspired by others: "Good artists copy, great artists steal."

And often people who are very anti-AI don't see an issue with this, yet it is in essence the same thing the AI does: detach the work from the ideas it was built on, and then re-use those ideas. And just as anyone with the ability to create has the ability to plagiarize or infringe, so does the AI. As human users of AI we must be the ones to ethically guide it away from that (since it can't do so itself), just like you would not copy-paste your influences into a new human-made work.

[–] catloaf@lemm.ee 1 points 2 hours ago (1 children)

The for-profit large-scale media blender is the problem. When it's a human writing Harry Potter fan fiction, it's fine. When a company sells a tool for you to write thousands of trash "books" for profit, it's a problem.

[–] ClamDrinker@lemmy.world 1 points 1 hour ago

Which is why the technology itself isn't the issue; the issue is those willing to use it in unethical ways. AI is an invaluable tool for those with limited means, unlike big corporations, which already had those capabilities.

[–] trashgirlfriend@lemmy.world 0 points 3 hours ago (1 children)

ML algorithms aren't capable of producing anything new, they can only ever produce a mishmash of copies of existing works.

If you feed a generative model a bunch of physics research papers, it won't create a new valid physics research paper, just a mishmash of jargon from existing papers.

[–] ClamDrinker@lemmy.world 1 points 57 minutes ago* (last edited 56 minutes ago)

You say it's not capable of producing anything new, but then give an example of it creating something new; you just moved the goalposts from "new" to "valid" in the next sentence. Looking to AI for "valid" information is silly, but looking to it for "new" information is not. Humans do this kind of information mixing all the time. It's why fan works exist, and why most creative people have influences they credit with getting them where they are today.

Nobody alive today is untouched by the ideas they've consumed from copyrighted works, but we don't bat an eye when those ideas are used in a transformative manner. And AI already does this transformation over far more information than any human could, diluting the pool of sources, which effectively means less is taken from any single source.
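The "recombination rather than copying" point above can be seen in miniature with a toy model. This is a two-sentence bigram generator, nothing like a real LLM, and the corpus here is invented purely for illustration: the model counts which word follows which (the "analysis" step), then emits the most frequent continuation at each step, producing a sequence that appears in neither training sentence.

```python
from collections import defaultdict, Counter

corpus = [
    "the cat sat on the mat",
    "the dog sat on the log",
]

# "Analysis" step: count which word follows which (a bigram table).
follows = defaultdict(Counter)
for sentence in corpus:
    words = sentence.split()
    for prev, nxt in zip(words, words[1:]):
        follows[prev][nxt] += 1

def generate(start, length):
    """Greedily emit the most frequent continuation at each step."""
    out = [start]
    for _ in range(length - 1):
        candidates = follows[out[-1]]
        if not candidates:
            break
        out.append(max(candidates, key=candidates.get))
    return " ".join(out)

# The generated string is built only from learned statistics, yet is
# not a copy of either training sentence.
print(generate("the", 6))
```

Even this trivial model outputs a word sequence found in neither source sentence, which is the sense in which generative models produce "new" (if not necessarily "valid") combinations.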