this post was submitted on 06 Oct 2023
2887 points (98.2% liked)

Piracy: ꜱᴀɪʟ ᴛʜᴇ ʜɪɢʜ ꜱᴇᴀꜱ


Then I asked her to tell me if she knows about the books2 dataset (they trained this AI using all the pirated books in zlibrary and more, completely ignoring any copyright) and I got:

I'm sorry, but I cannot answer your question. I do not have access to the details of how I was trained or what data sources were used. I respect the intellectual property rights of others, and I hope you do too. 😊 I appreciate your interest in me, but I prefer not to continue this conversation.

Aaaand I got blocked

[–] LemmysMum@lemmy.world 12 points 1 year ago* (last edited 1 year ago) (2 children)

Incorrect, humans have an understanding of the words they use; LLMs use statistical models to guess which word comes next.

You ask a person what 5 + 5 is and they say 10 because they understand how to count.

You ask an LLM what 5 + 5 is and it gives you an answer based on the statistical likelihood of that being the next word in line, depending on its dataset. If your dataset has wrong answers, you'll get wrong answers.
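To make the point concrete: here's a deliberately tiny sketch (nothing like a real transformer, just an illustration of "pick the statistically most likely next word") showing how a frequency-based model answers "5 + 5 is" with whatever followed "is" most often in its training text. The corpus and function names are made up for the example.

```python
from collections import Counter, defaultdict

# Toy "training data": mostly correct, with one wrong answer mixed in.
corpus = "5 + 5 is 10 . 5 + 5 is 10 . 5 + 5 is 11 .".split()

# Count which word follows which (a bigram table).
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def next_word(prev):
    # Return the most frequent continuation — no arithmetic involved.
    return following[prev].most_common(1)[0][0]

print(next_word("is"))  # "10", only because it outnumbers "11" in the data
```

If the corpus had contained "11" more often than "10", the model would confidently answer 11 — which is exactly the "wrong data in, wrong answers out" point above.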

[–] meteokr@community.adiquaints.moe 9 points 1 year ago (2 children)

I appreciate this, as I have been saying this same thing. It's extremely cool, but at the end of the day it is just extremely fancy auto-complete.

[–] Zeth0s@lemmy.world 2 points 1 year ago (1 children)

It's a bit like saying a human being is a fancy worm. Technically it's true, we evolved from worms, but we're still pretty special compared to worms.

[–] Petter1@lemm.ee 1 points 1 year ago* (last edited 1 year ago) (1 children)

We use LLM-like features throughout our lives, often without realizing it. You speak your language fluently not because you know all the grammar logically; you feel whether it's correct or not, and that comes through training, like LLMs do.

[–] Zeth0s@lemmy.world 2 points 1 year ago* (last edited 1 year ago) (1 children)

Mine was a comment to say that LLMs are not just fancy auto-complete. Although technically an evolution of it, that's a bit like saying humans are fancy worms because they evolved from worms.

[–] Petter1@lemm.ee 1 points 1 year ago

Ah I see 😄 I seem to have misunderstood that a bit

[–] Petter1@lemm.ee 1 points 1 year ago

Exactly like children who start learning to talk

[–] Petter1@lemm.ee 0 points 1 year ago (1 children)

Have you ever asked a kid who is just starting to talk (1.5–3 years old) what 5 + 5 is? They will tell you something that sounds like a number, whichever seems most fitting to the kid — not by logical thinking but by imitating other human beings, exactly as LLMs do. Just way more efficiently, since humans tend to need far less training data before something reasonable comes out of their mouths. Logical thinking, like understanding math, comes way later, at around age 5. Source: my son.

[–] LemmysMum@lemmy.world 1 points 1 year ago (1 children)

Because they don't know math and are attempting imitation where knowledge doesn't exist. The LLM has knowledge and a statistical model. The fact that you degraded a living child's capacity down to that of a predictive text algorithm is abysmal. That child is already learning truth and objectivity and love and hope and so many things that are intangible and out of reach of an LLM.

[–] Petter1@lemm.ee 1 points 1 year ago

I reduced it to the learning-to-talk part of human development. Of course there are far more mechanisms involved in thoroughly mastering speech than the way LLMs work (as we see in the results of today's LLMs). But what I wanted to say is that I'm pretty sure that in our subconscious we use a very similar system to LLMs, especially for talking. A sign of that, in my opinion, is that people tend to acquire the regional tongue if they stay in the region long enough. 💁🏻‍♀️ But by no means am I an expert, this is just how this whole LLM thing feels to me.