Literally anybody who thought about the idea for more than ten seconds already realized this a long time ago; apparently this blog post needed to be written for the people who didn't even do that...
You underestimate the dumbassery of Pencil-Pushers in tech companies (& also how genuinely sub-human they can be)
A reason I didn't see listed: they are just asking for competition. Yes by all means get rid of your most talented people who know how your business is run.
I'm fine with this. Let it all break, we've earned it.
I wonder if there will eventually be a real Butlerian Jihad
The jihad starts with the tech bros' butlers, that'd be very poetic
The irony of using an AI generated image for this post...
AI imagery makes any article look cheaper in my view; it makes me more inclined to "judge the book by its cover".
Why would you slap something so lazy on top of a piece of writing you (assuming it isn't also written by AI) put time and effort into?
This post is about programmers being replaced by AI. The writer seems OK with artists being replaced.
I thought it was intentional AI slop
Yeah, I'm sure they left the spelling mistake in the image on purpose to get increased engagement from pedants like me. I'm sorry, it works on me.
https://defragzone.substack.com/p/run-massive-models-on-crappy-machines
The author doesn't oppose AI, just programmers being replaced by it.
I know that it's a meme to hate on generated images, but people need to understand just how much that ship has sailed.
Getting upset at generative AI is about as absurd as getting upset at CGI special effects or digital images. Both of these things were the subject of derision when they started being widely used. CGI was seen as a second rate knockoff of "real" special effects and digital images were seen as the tool of amateur photographers with their Photoshop tools acting as a crutch in place of real photography talent.
No amount of argument from film purists, or nostalgia for the old days of puppets and models in movies, was going to stop computer graphics and digital image capture and manipulation. Today those arguments seem so quaint and ignorant that most people are not even aware there was a controversy.
Digital images and computer graphics have nearly completely displaced film photography and physical model-based special effects.
Much like those technologies, generative AI isn't going away and it's only going to improve and become more ubiquitous.
This isn't the hill to die on no matter how many upvotes you get.
But people still complain about CGI in film, likely for the same reason you mention it was criticised in the past: it looks like ass when done cheaply (today) or with early, underdeveloped tech (back then). Similarly, the vast majority of AI-generated images look lazy and generic (duh) and basically give me the "ick".
Yeah, maybe they'll get better in the future. But does that mean that we can't complain about their ugliness (or whatever other issue we have with them) now?
People don't like generated images because the models are trained on copyrighted data, but if you don't believe in copyright then it's a tool like any other.
There are thousands of different diffusion models, not all of them are trained on copyright protected work.
In addition, substantially transformative works are allowed to use content that is otherwise copy protected under the fair use doctrine.
It's hard to argue that a model, a file containing the trained weight matrices, is in any way substantially similar to any existing copyrighted work. TL;DR: There are no pictures of Mickey Mouse in a GGUF file.
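To make that concrete, here's a minimal sketch (assuming PyTorch; the toy model and file name are made up for illustration) showing that a saved model file contains nothing but named weight tensors:

```python
# Build a tiny toy model, save its weights, then inspect what the file holds.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 4))
torch.save(model.state_dict(), "tiny_model.pt")

# Load the file back: every entry is just a weight matrix or bias vector,
# identified by layer name, shape, and dtype -- no images, no source text,
# only the numbers the training process settled on.
state = torch.load("tiny_model.pt")
for name, tensor in state.items():
    print(name, tuple(tensor.shape), tensor.dtype)
```

The same basic picture holds for a diffusion model checkpoint or a GGUF file, just with far more (and far larger) tensors plus some metadata.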
Fair use has already been upheld in the courts concerning machine learning models trained using books.
For instance, under the precedent established in Authors Guild v. HathiTrust and upheld in Authors Guild v. Google, the US Court of Appeals for the Second Circuit held that mass digitization of a large volume of in-copyright books in order to distill and reveal new information about the books was a fair use.
And, perhaps more pragmatically, the genie is already out of the bottle. The software and weights are already available, and you can train and fine-tune your own models on consumer graphics cards. No court ruling or regulation will restrain every country on the globe, and countries everywhere are rapidly researching and producing generative models.
The battle is already over, the ship has sailed.
I work for a Fortune 500 company.
We just recently lost a principal engineer who built an entire platform over the last four years.
Just before they left I noticed they were using AI an awful lot. Like... a lot a lot. Like, "I don't know the answer on a screen share, so I'll ask ChatGPT how to solve the problem and copy/paste it directly into the environment until it works" a lot.
They got fired for doing unrelated shit.
It's taken us three months and hundreds of hours from at least five other principal engineers to try to unravel this bullshit, and we're still not close.
The contributions and the architecture scream AI all over them.
Point is, I'll happily let idiots destroy the world of software, because I'll make fat bank later as a consultant fixing their bullshit.
That's what I expect if I'm fired and rehired: at least +25% on my salary.
We hired a junior at work from a prestigious university. He uses ChatGPT all the time but denies it. I know that because all his comments in the code read like some new Tolkien book. Last time I checked his code, I told him it had something like 20 bugs and explained how to fix them, because I'm not a bad guy. The next day, he came back with a program that was very, very different. Not knowing how to apply my fixes, he had used another prompt, and the whole thing was different, with new bugs. I told my boss I was not wasting time on that shit again.
There's also the tribal knowledge of people who've worked somewhere for a few years. There's always a few people who just know where or how a particular thing works and why it works that way. AI simply cannot replace that.
I don't disagree with that, but there are so many "wtf is this shit" moments that defy all logic and known practices.
Like, for example: six different branches of the same repo that deploy to two different environments in a phased rollout. Branches 1-3 are prod, 4-6 are dev. The phases go 3, 1, 2 for prod and 6, 4, 5 for dev. They are numbered as well.
Also, the pipelines create a new bucket every build, so there are over 700 S3 buckets with varying versions of the frontend... which then get moved into... another S3 bucket with public access.
My personal favorite is the publicly accessible, non-access-controlled Lambdas with hard-coded invocation URLs in them. Lambda A has a public function URL configured instead of using API Gateway. Lambda B has that invocation URL hard-coded into the source that's deployed.
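For anyone who hasn't run into that particular anti-pattern, here's a rough, hypothetical sketch of what it tends to look like; the function URL, field names, and handler are made up for illustration, not taken from any actual codebase:

```python
# Hypothetical "Lambda B": calls "Lambda A" through a hard-coded, publicly
# accessible Lambda function URL instead of going through API Gateway (or
# invoking it via the AWS SDK with IAM auth). The URL below is invented.
import json
import urllib.request

LAMBDA_A_URL = "https://abc123example.lambda-url.us-east-1.on.aws/"  # no auth at all

def handler(event, context):
    payload = json.dumps({"order_id": event.get("order_id")}).encode("utf-8")
    req = urllib.request.Request(
        LAMBDA_A_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    # Anyone on the internet who finds this URL can invoke Lambda A the same way.
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```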
there's so much negligent work here I swear they did it on purpose.
there’s so much negligent work here I swear they did it on purpose.
Depending on the place, it's "job insurance": companies would usually think twice before firing the only person who can understand the spaghetti. Now they won't need said person to generate "working" code.
Institutional knowledge takes years to replace.
It's hard for people who haven't experienced the loss of experts to understand. I'm not a programmer, but I worked in aerospace engineering for 35 years. The drive to transfer value to execs and other stakeholders by reducing the cost of those who literally make that value always ends up costing more.
Those executives act like parasites. They bring no value and just leech the life from the companies.
executives act like parasites
WE MAED TEH PROFITZ!!!1!!1
Executives think they are the most important part of the company. They are high-level managers; that is all.
Well, yeah, but those costs are for tomorrow's executive to figure out, we need those profits NOW
It’s utterly bizarre. The customers lose out by receiving an inferior product at the same cost. The workers lose out by having their employment terminated. And even the company loses out by having its reputation squandered. The only people who gain are the executives and the ownership.
This is absolutely by design. The corporate raider playbook is well-read. See: Sears, Fluke, DeWalt, Boeing, HP, Intel, Anker, any company purchased by Vista (RIP Smartsheet, we barely knew ye), and so on. Find a brand with an excellent reputation, gut it, strip mine that goodwill, abandon the husk on a golden parachute, and make sure to not be the one holding the bag.
I'm just a dabbler at coding, and even I can see that getting rid of programmers and relying on AI instead will lead to disaster. AI is useful, but only for the smallest scraps of code, because anything bigger gets too muddled. For me, it liked to come up with its own stupid ideas and then insist on staying stuck on them, so I had to constantly reset the conversation. But I did manage to have it make a useful little function that I couldn't have thought up myself, as it used some complex mathematical things.
Also, relying on it is a quick way to kind of get things done without understanding at all how they work. Eventually this will lead to code so horrible and insecure that no one can fix or maintain it. Though maybe that's a good thing eventually, since it will bring those shitty companies to ruin. Any leadership in those companies should be noted down now, though, so they can't pretend later to have had nothing to do with it.
This is prophetic and yet as clear as day to anyone who has actually had to rely on their own code for anything.
I have lately focused all of my tech learning efforts and home lab experiments on cloud-less approaches. Sure, the cloud is a good idea for scalable, high-traffic websites, but it sure also seems to enable police-state surveillance and extreme vendor lock-in.
It’s really just a focus on fundamentals. But all those cool virtualization technologies that enable ‘cloud’ are super handy in a local system too. Rolling back container snapshots on specific services while leaving the general system unimpacted is useful anywhere.
But it is all on hardware I control. Apropos of the article, the pendulum will swing back toward more focus on local infrastructure. Cloud won’t go away, but more people are realizing that it also means someone else owns your data/your business.
I think they were also suckered in by the supposed lower cost of running services, which, as it happens, isn't lower at all and is in fact more expensive. But you laid off the datacenter staff, so... pay up, suckers.
Neat toolsets though.
Imagine a company that fires its software engineers, replaces them with AI-generated code, and then sits back, expecting everything to just work. This is like firing your entire fire department because you installed more smoke detectors. It’s fine until the first real fire happens.
This is a bad analogy.
It would be more akin to firing your fire department because you installed automatic hoses in front of everyone's homes. When a fire starts, the hoses will squirt water towards the fire, but sometimes they'll miss, sometimes they'll squirt backwards, sometimes they'll squirt the neighbour's house, and sometimes they'll actually squirt the fire.