Apparently, stealing other people's work to create a product for money is now "fair use", according to OpenAI, because they are "innovating" (stealing). Yeah. Move fast and break things, huh?

"Because copyright today covers virtually every sort of human expression—including blogposts, photographs, forum posts, scraps of software code, and government documents—it would be impossible to train today’s leading AI models without using copyrighted materials," wrote OpenAI in the House of Lords submission.

OpenAI claimed that the authors in that lawsuit "misconceive[d] the scope of copyright, failing to take into account the limitations and exceptions (including fair use) that properly leave room for innovations like the large language models now at the forefront of artificial intelligence."

[–] pupbiru@aussie.zone 1 points 10 months ago (1 children)

branding

okay

the marketing

yup

the plagiarism

woah there! that's where we disagree… your position rests on the belief that this is plagiarism - something inherently negative

perhaps it's best not to use loaded language. if we want to have a good-faith discussion, it's best to avoid emotive arguments and language designed to evoke negativity simply by its use, rather than by the argument being presented

I happen to sit at the intersection of working in the same field, being an avid fan of classic sci-fi, and being a writer

it's understandable that it's frustrating, but just because a machine is now able to do a similar job to a human doesn't make it inherently wrong. it might be useful to reframe these developments - it's not taking away from humans, it's enabling humans… the less skill a human needs to get what's in their head into an expressive medium for someone to consume, the better imo! art and creativity shouldn't be about having an ability - the closer we get to pure expression, the better

the less you have to worry about the technicalities of writing, the more you can focus on pure creativity

The point is that the way these models have been trained is unethical. They used material they had no license to use, and they've admitted that it couldn't work as well as it does without stealing other people's work.

i’d question why it’s unethical, and also suggest that “stolen” is another emotive term here not meant to further the discussion by rational argument

so, why is it unethical for a machine but not a human to absorb information and create something based on its “experiences”?

[–] Phanatik@kbin.social 1 points 10 months ago

First of all, we're not having a debate and this isn't a courtroom, so avoid the patronising language.

Second of all, my "belief" about the models' plagiarism is based on technical knowledge of how the models work, not on how I think they work.

a machine is now able to do a similar job to a human

This would be impressive if it were true. An LLM is not intelligent simply because it gives the appearance of intelligence.

It's enabling humans

It's a chat bot that's automated Google searches, let's be clear about what this can do. It's taken natural language processing and applied it through an optimisation algorithm to produce human-like responses.

No, I disagree at a fundamental level. Humans need to compete against each other and themselves to improve. Just because an LLM can write a book for you doesn't mean you've written a book. You're just lazy. You don't want to put in the work that every other writer in existence has done: mulling over their work, considering the emotions and the effect they want to have on the reader. To what extent can an LLM replicate the way George RR Martin describes his world without entirely ripping off his work?

i’d question why it’s unethical, and also suggest that “stolen” is another emotive term here not meant to further the discussion by rational argument

If I take a book you wrote from you without buying it or paying you for it, what would you call that?