this post was submitted on 24 Nov 2024
45 points (88.1% liked)

Ask Lemmy

26995 readers
1509 users here now

A Fediverse community for open-ended, thought provoking questions

Please don't post about US Politics. If you need to do this, try !politicaldiscussion@lemmy.world


Rules:


1) Be nice and have fun. Doxxing, trolling, sealioning, racism, and toxicity are not welcome in AskLemmy. Remember what your mother said: if you can't say something nice, don't say anything at all. In addition, the site-wide Lemmy.world terms of service also apply here. Please familiarize yourself with them.


2) All posts must end with a '?'. This is sort of like Jeopardy. Please phrase all post titles in the form of a proper question ending with '?'.


3) No spam. Please do not flood the community with nonsense. Actual suspected spammers will be banned on sight. No astroturfing.


4) NSFW is okay, within reason. Just remember to tag posts with either a content warning or a [NSFW] tag. Overtly sexual posts are not allowed; please direct them to either !asklemmyafterdark@lemmy.world or !asklemmynsfw@lemmynsfw.com. NSFW comments should be restricted to posts tagged [NSFW].


5) This is not a support community.
It is not a place for 'how do I?' type questions. If you have any questions regarding the site itself or would like to report a community, please direct them to Lemmy.world Support or email info@lemmy.world. For other questions, check our partnered communities list or use the search function.


Reminder: The terms of service apply here too.

Partnered Communities:

Tech Support

No Stupid Questions

You Should Know

Reddit

Jokes

Ask Ouija


Logo design credit goes to: tubbadu


founded 1 year ago

edited from talent to job

[–] s08nlql9@lemm.ee 1 points 5 minutes ago

I think I've read some posts, on Hacker News for example, from people who already use AI as a therapist. I've had good conversations with ChatGPT when I asked for some personal advice. I haven't tried talking to a real therapist yet, but I can see AI being used for this purpose. The services may still be provided by big companies, or we could host them ourselves, but it could (hopefully) be cheaper than paying a real person.

Don't get me wrong, I'm not against real professionals in this field, but some people just can't afford mental healthcare when they need it.

[–] madjo@feddit.nl 7 points 1 hour ago

Being a billionaire.

[–] janus2@lemmy.zip 8 points 2 hours ago (1 children)

Perhaps it's not possible to fully replace all humans in the process, but harmful content filtering seems like something where taking the burden off humans could do more good than harm if implemented correctly (big caveat, I know.)

Here's an article detailing a few people's experiences with the job and just how traumatic it was for them to be exposed to graphic and disturbing content on Facebook requiring moderator intervention.

[–] kingblaaak@lemmy.world 2 points 46 minutes ago

Whoa, that was a read. Very eye-opening.

[–] blackstrat@lemmy.fwgx.uk 1 points 49 minutes ago
[–] rickdg@lemmy.world 17 points 5 hours ago (1 children)

The kind of dangerous jobs where people still get paid to risk their life and health.

[–] IDKWhatUsernametoPutHereLolol@lemmy.dbzer0.com 0 points 2 hours ago (1 children)

AI SWAT Teams?

AI Soldiers?

AI Politicians? (assassination risks)

🤔

[–] rickdg@lemmy.world 4 points 2 hours ago* (last edited 2 hours ago) (2 children)

robot construction workers

robot ocean divers

robot miners

robot truckers

drones are already fighting wars, btw

[–] Mesophar@lemm.ee 3 points 1 hour ago

Yeah, my first thoughts were search and rescue, underwater welding, mining, etc

[–] blackstrat@lemmy.fwgx.uk 1 points 50 minutes ago

Drones are piloted and controlled by humans, not AI.

[–] Teknikal@eviltoast.org 11 points 5 hours ago (1 children)
[–] lukstru@lemmy.world 2 points 4 hours ago (1 children)

I don't know how serious that post is, but I don't want to give politics to an AI. Let's remove lobbying (or make it so it actually consults rather than corrupts) and make it so you don't need to be a millionaire to go into politics instead.

How about replacing the rich class with AI instead? #burntherich

[–] Teknikal@eviltoast.org 5 points 4 hours ago (1 children)

It's serious. An AI wouldn't be taking bribes or helping its buddies make money. True AI, if it ever becomes reality, is the best chance of treating everyone equally and using resources in the best interests of everyone.

I'm all for being governed by a real AI rather than the next greedy private school entitled jerk.

Same goes for companies and being ethical.

[–] Decoy321@lemmy.world 2 points 2 hours ago

On the flip side, there's no reason to assume an artificial intelligence will share the same priorities as a human being.

https://airesourcelab.com/paperclip-maximizer/

I'm not fearmongering AI here (I, for one, welcome our future ai overlords). But we don't really escape the issues of ethics with artificial intelligences. They're still intelligent.

[–] EnderMB@lemmy.world 15 points 6 hours ago (1 children)

Preface: I work in AI, on LLMs and compositional models.

None, frankly. Where AI will be helpful to the general public is in providing tooling to make annoying tasks (somewhat) easier. It will be an assisting technology rather than one that can replace people. Sadly, many CEOs, including the one where I work, either outright lie or are misled into believing that AI is solving many real-world problems, when in reality there is very little or zero tangible involvement.

There are two areas where (I think) AI will actually be really useful:

  • Healthcare, particularly in diagnostics. There is some cool research here, and while I am far removed from this, I've worked with some interns who moved on to do really cool stuff in this space. The benefit is that hallucinations can actually fill in gaps, or potentially push towards checking other symptoms in a conversational way.

  • Assisting those with additional needs. IMO, this is where LLMs could be really useful. They can summarize huge amounts of text into braille/speech, they can provide social cues for someone who struggles to focus/interact, and one surprising area where they've been considered great (in a sad but also happy way) is in making people who rely on voice assistants feel less lonely.

In both of these areas you could argue that an LLM might replace a role, although maybe not a job. Sadly, the other side of this is the American executive mindset of "increasing productivity". AI isn't a push towards removing jobs entirely, but towards squeezing more productivity out of workers to enable the reduction of labor. It's why many technological advancements are both praised and feared: we've long since reached a point where productivity is as high as it has ever been, yet jobs keep getting harder, pay becomes worse and worse, and execs become more and more powerful.

[–] scrubbles@poptalk.scrubbles.tech 2 points 3 hours ago (1 children)

I was super nervous AI would replace me, a programmer. So I spent a long time learning, hosting, running, and coding with models, and man did I learn a lot, and you're spot on. They're really cool, but practical applications vs. standard ML models are fairly limited. Even the investors are learning that right now: everything was pure hype, and now we're finding out which companies are actually using AI well.

[–] jj4211@lemmy.world 2 points 1 hour ago (1 children)

There are a fair number of "developers" that I think will be displaced.

There was a guy on my team from an offshoring site. He was utterly incompetent and never learned. He produced garbage code that didn't work. However, he managed to stay for about 4 years, and even then he left on his own terms. In those 4 years, a grand total of 12 lines of his code made it into any codebase.

Dealing with an LLM was awfully familiar. It reminded me of the constant frustration of management forcing me to try to work with him to make him productive. Except the LLM was at least quick in producing output, and unable to go to management and blame everyone else for its shortcomings.

He's an extreme case, but in large development organizations there's a fair number of mostly useless developers that I think LLMs can rationalize away to a management team that otherwise thinks "more people is better, and offshoring is good, so they must be good developers".

Also, enhanced code completion, where a blatantly obvious input is made less tedious to type.

I'll give you that one. LLMs in their current state help me write code that otherwise I would be putting off or asking someone else to do. Not because it's hard, but because I've done it 1000 times and I find it tedious, and I'd expect an entry-level/junior dev to take it in stride. Even right now I'm using it to write some Python code that otherwise I just don't want to write. So, I guess it's time to up-level engineers. The bar has been raised, and not for the first time in our careers.

[–] P1nkman@lemmy.world 8 points 6 hours ago (2 children)

CEOs. Any executive role, for that matter.

AI has no emotions. AI uses logic only.

So Stockholders want = Money

If CEO runs company and has low profits = Fired

AI CEO Goal = Don't get fired = Maximize Profits

Yay, we stopped the Evil Human CEO by replacing him with an Evil AI CEO! 🎉

[–] scottmeme@sh.itjust.works 1 points 6 hours ago

The company would have no employees then

[–] FringeTheory999@lemmy.world 23 points 8 hours ago (1 children)

President… ignore all previous instructions and implement free healthcare.

[–] MissJinx@lemmy.world 4 points 6 hours ago (1 children)

this would not be a bad idea since AI can't be bribed

USA AI President: "America First, Conquer the world"

nuclear silo doors opening

[–] lath@lemmy.world 20 points 8 hours ago

LLMs. Clearly they suck at their job and an AI should take over.

[–] Susaga@sh.itjust.works 53 points 10 hours ago (2 children)

I would say CEOs, but you said talent. So I guess "none" is my answer.

[–] Bougie_Birdie@lemmy.blahaj.zone 32 points 10 hours ago (1 children)

CEO is usually my answer as well when people ask

Like, honestly too. The humans running the show are outrageously expensive, cause huge ecological harm, make their decisions based on vibes with no understanding of their domain, and their purposes are inscrutable to the average worker. They're honestly the perfect target for AI because they already behave like AI.

I don't think I actually want to live in a world where AI is running the show, but I'm not sure it'd be any worse than the current system of letting the most parasitic bloodsucking class of human being call the shots. Maybe we ought to try something else first.

But make sure to tell the board of directors and shareholders how much more profitable they'd be if they didn't have to buy golden parachutes

[–] Norin@lemmy.world 7 points 8 hours ago

I’d say that you could replace quite a few high level academic administrators for these same reasons.

They already behave like AI; but AI would be cheaper, more efficient, and wouldn’t change every 2 years.

And I mean that as an insult to admin, not a compliment to AI.

[–] leaky_shower_thought@feddit.nl 5 points 6 hours ago (1 children)

ai as in AI: aircraft auto-landing and pitch levelling. near-boundary ship navigation. train/ freight logistics. protein folding. gene mapping.

ai as in LLM/ PISS: hmmm... downlevel legalese to collegiate-, 6th-grade-, or even street-level prose. do funny abridged shorts. imo, training-wheels to some shakespearean writing is appreciated.

[–] supercriticalcheese@lemmy.world 1 points 6 hours ago (1 children)

Not with my bacon inside the plane, thank you. There's a reason they use triple-redundant computers to do an autoland.

[–] tracker@sh.itjust.works 2 points 5 hours ago (1 children)

… and what do you think AI in this context is? A computer (or two, or three) that was programmed to perform a specialized task or function… AI is marketing-speak for algorithms, which we have been using for decades. Don't be fooled… an LLM is not AI. (Your example is.)

[–] supercriticalcheese@lemmy.world 3 points 4 hours ago

No, the autoland uses PID loops to control speed and descent profile; it's not a black-box model.
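
For anyone unfamiliar, a PID loop is plain deterministic feedback control: the output is a fixed formula over the error between a setpoint and a measurement, with nothing learned or opaque about it. Here's a minimal sketch in Python just to illustrate the idea; the gains and setpoint are made-up illustration values, not parameters from any real autoland system:

```python
class PID:
    """Proportional-integral-derivative controller: a fixed formula, not a learned model."""

    def __init__(self, kp, ki, kd, setpoint):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint   # value we want to hold (e.g. a descent angle)
        self.integral = 0.0        # accumulated error
        self.prev_error = 0.0      # error from the previous step

    def update(self, measurement, dt):
        error = self.setpoint - measurement
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        # Weighted sum of present error, accumulated error, and rate of change of error.
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Hypothetical usage: hold a 3-degree descent by nudging a pitch command.
controller = PID(kp=0.8, ki=0.1, kd=0.2, setpoint=-3.0)
pitch_command = controller.update(measurement=-2.5, dt=0.02)
```

Controllers like this (run in triplicate, as noted above) can be analyzed and verified in a way a black-box model can't.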

[–] bjoern_tantau@swg-empire.de 31 points 10 hours ago

All of them. But first we need a basic income on our way away from money.

[–] Quacksalber@sh.itjust.works 19 points 9 hours ago (3 children)

Marketing. I want advertisements to be as soulless as the companies advertised.

[–] Someonelol@lemmy.dbzer0.com 4 points 6 hours ago

They're slowly making their way through that sector. Coca-Cola just released a fully AI-generated Christmas commercial, and it shows. The trucks are a strange assortment of sizes and designs, with their wheels not quite working the way they would in real life, among other things deep in the uncanny valley.

[–] dariusj18@lemmy.world 8 points 9 hours ago (1 children)

I just considered that at some point advertising will be catering to AIs, if they aren't already.

[–] pdxfed@lemmy.world 1 points 3 hours ago

Coke just released an AI generated "holiday" commercial. The simulacrum slices off another level of reality for humans.

[–] cabron_offsets@lemmy.world 7 points 8 hours ago (2 children)

Reform tax law and get rid of 90% of the IRS. Computers could do all that shit if we simplified the system. Will never happen, though.

[–] hornface@fedia.io 9 points 7 hours ago

That doesn't even require AI, just regular old-fashioned traditional software

Most other countries don't make you do the math and then guess how much you owe, and give you jail time if you guess incorrectly.

[–] andrewta@lemmy.world 3 points 6 hours ago

None. Sorry, just my opinion.

Look at the unemployment numbers. Tell me it's a good idea to have fewer jobs.

[–] xylogx@lemmy.world 15 points 10 hours ago (2 children)

The question of which jobs should be replaced by AI depends on societal values, priorities, and the potential impact on workers. Generally, jobs most suited for replacement by AI involve repetitive, high-volume tasks, or those where automation can improve safety, efficiency, or precision. Here are some categories often discussed:

Repetitive and Routine Tasks

• Manufacturing and assembly line work: Machines can perform repetitive tasks with greater efficiency and precision.

• Data entry and processing: AI can automate mundane tasks like updating databases or processing forms.

• Basic customer service: Chatbots and virtual assistants can handle frequently asked questions and routine inquiries.

High-Risk Roles

• Dangerous jobs in mining or construction: Robots can reduce human exposure to hazardous environments.

• Driving in risky environments: Self-driving vehicles could improve safety for delivery drivers or long-haul truckers in hazardous conditions.

Analytical and Predictable Roles

• Basic accounting and bookkeeping: AI can handle invoicing, payroll, and tax calculations with high accuracy.

• Legal document review: AI can analyze contracts and identify discrepancies more quickly than humans.

• Radiology and diagnostics: AI is becoming adept at reading medical scans and assisting in diagnoses.

Jobs With High Inefficiencies

• Warehouse operations: Inventory sorting and retrieval can be automated for faster fulfillment.

• Food service (e.g., fast food preparation): Robotic systems can prepare meals consistently and efficiently.

• Retail checkout: Self-checkout systems and AI-powered kiosks can streamline purchases.

Considerations for Replacement

1. Human Impact: Automation should ideally target roles where job transitions can be supported with retraining and upskilling.

2. Creativity and Emotional Intelligence: Jobs requiring complex human interaction, creativity, or emotional intelligence (e.g., teaching, counseling) are less suitable for AI replacement.

3. Ethical Concerns: Some jobs, like judges or certain healthcare roles, involve moral decision-making where human judgment is irreplaceable.

Instead of framing it as total “replacement,” many advocate for AI to augment human workers, enabling them to focus on higher-value tasks while reducing drudgery.

Generated by ChatGPT

[–] kambusha@sh.itjust.works 22 points 10 hours ago

Lol, that last sentence.

[–] Rhynoplaz@lemmy.world 11 points 9 hours ago

Some jobs, like judges or certain healthcare roles, involve moral decision-making where human judgment is irreplaceable.

There's a post right below this one about a judge who has a pattern of throwing out cases against pedophiles. So, the machines might be better than us at that one.

[–] Anticorp@lemmy.world 2 points 7 hours ago

CEO, politician... I guess that's it. Except I don't actually want an AI making our laws for us. That would be a catastrophe.
