Ask Lemmy
A Fediverse community for open-ended, thought-provoking questions
Please don't post about US Politics. If you need to do this, try !politicaldiscussion@lemmy.world
Rules:
1) Be nice and have fun
Doxxing, trolling, sealioning, racism, and toxicity are not welcome in AskLemmy. Remember what your mother said: if you can't say something nice, don't say anything at all. In addition, the site-wide Lemmy.world terms of service also apply here. Please familiarize yourself with them.
2) All posts must end with a '?'
This is sort of like Jeopardy. Please phrase all post titles in the form of a proper question ending with a '?'
3) No spam
Please do not flood the community with nonsense. Actual suspected spammers will be banned on sight. No astroturfing.
4) NSFW is okay, within reason
Just remember to tag posts with either a content warning or a [NSFW] tag. Overtly sexual posts are not allowed, please direct them to either !asklemmyafterdark@lemmy.world or !asklemmynsfw@lemmynsfw.com.
NSFW comments should be restricted to posts tagged [NSFW].
5) This is not a support community.
It is not a place for 'how do I?'-type questions.
If you have any questions regarding the site itself or would like to report a community, please direct them to Lemmy.world Support or email info@lemmy.world. For other questions check our partnered communities list, or use the search function.
Reminder: The terms of service apply here too.
Logo design credit goes to: tubbadu
AI needs data to train on. It can't create art without first consuming existing art and spitting out parts of the originals. There's a reasonable claim to be made that AI synthesis of prior art is itself original enough to count as having intrinsic worth, but if the only way to get it is stealing other people's work to train your model, the whole value proposition of AI art is probably net negative, entirely at the expense of the artists whose work was used to feed the model.
Yes, there's the argument that automation of new things is inevitable, but we do have choices about whether the automated violation of copyright to feed the model is tolerable or not. Sure, it's a cool, sexy technology, and who doesn't love getting on the bandwagon of the future, but the ethics of modern AI development are trash. Despite promises that automated AI labor will save the owner class money by doing for free what the plebes demand to be paid for, it's really as much a Ponzi scheme as all those cryptocurrencies that have no intrinsic value unless enough suckers can be convinced to feed the scheme.
And yet, it's a powerful technology that has the potential to be a legitimate boon to humanity. I'd like to see it used to do things (like picking crops that are hard to automate with dumb machines, cleaning trash off of beaches or out of the ocean, refactoring boilerplate code to not use deprecated packages, or reviewing boilerplate contract text for errors) that aren't just ways for owners to cut labor out of the economy and pocket the difference.
Perhaps, if we are going to allow AI to be a great big machine that steals inputs (like art, or writing, or code) from others and uses them to do for-profit work for their owners, the proceeds attributable to AI ought to be taxed at a 90%+ rate and used to fund a Universal Basic Income as payment for the original work that went into the AI model.
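To put rough numbers on that proposal, here's a back-of-the-envelope sketch. All the figures (total AI-attributable proceeds, population) are made-up placeholders for illustration, not real economic data:

```python
# Hypothetical illustration of the proposal above: tax AI-attributable
# proceeds at 90%+ and distribute the revenue as a per-person UBI.
# All inputs below are placeholder figures, not real economic data.

def ubi_per_person(ai_proceeds: float, tax_rate: float, population: int) -> float:
    """Annual UBI each person would receive from taxed AI proceeds."""
    revenue = ai_proceeds * tax_rate
    return revenue / population

# Placeholder example: $500B in AI proceeds, 90% tax, 250M recipients
payout = ubi_per_person(500e9, 0.90, 250_000_000)
print(f"${payout:,.2f} per person per year")  # roughly $1,800.00
```

The point of the sketch is just that the payout scales linearly with AI proceeds and the tax rate, and inversely with the number of recipients; whether the resulting figure is meaningful compensation depends entirely on how large AI-attributable proceeds actually turn out to be.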