pennomi@lemmy.world 13 points 1 year ago

Illegally, maybe. Immorally, probably not. It’s fine for a human to read something and learn from it, so why not an algorithm? All of the original content is diluted into statistics so much that the source material does not exist in the model. They didn’t hack any databases; they merely used information that’s already available for anyone to read on the internet.

Honestly, the real problem is not that OpenAI learned from publicly available material, but that something trained on public material is privately owned.

uranibaba@lemmy.world 3 points 1 year ago

> but that something trained on public material is privately owned.

Is that really a problem? If I create something new based on public knowledge, should I not be able to profit from it?

If I learn to paint from YouTube, should I paint for free now?

I'll admit that the scope of ChatGPT is MUCH bigger than one person painting.

pennomi@lemmy.world 1 point 1 year ago

I’d say that was a more controversial opinion. From a purist perspective, I tend to believe that intellectual property in general is not ethical and stifles innovation.