The Dumbest Move in Tech Right Now: Laying Off Developers Because of AI
An interesting trend in these comments: the worse a codebase is, the more helpful AI is for expanding it (without actually fixing the underlying problems, like repetitive, overly long, unexpressive code).
Forward-thinking companies should use AI to transform each developer into a "10x developer,"
Developer + AI ≠ Developer x 10
At best, it means 1.25 x Developer, but in most cases it will mean 0.5 x Developer, because AI cannot be trusted to generate safe, reliable code.
Computers are machines designed to quickly, precisely, and consistently make mistakes.
I think 10x is a reasonable long-term goal, given continued improvements in models, agentic systems, and tooling, and the proper use of them.
It's close already for some use cases; for example, understanding a new codebase with the help of the Cursor agent is kind of insane.
We've only had these tools for a few years, and I expect software development will be unrecognizable in ten more.
It also depends on the use case. It can probably help you throw webpages together from zero, but it will fall apart once it has to generate code for lesser-discussed things. Someone once tried to solve an OpenGL issue I had with ChatGPT: first it suggested I use SDL2 or GLFW instead, then it spat out barely working code that was the same as mine, and still wrong.
A lot of it instead (from what I've heard from industry connections) is that employees are being pushed to use AI so hard, under threat of firing, that they use most of their tokens to amuse themselves with stuff like rewriting the documentation in pirate speak or Old English. And at the very worst, they're actually working constant overtime now, because people were fired, contracts were not extended, etc.
It’s made me a 10x developer.
As someone who transitioned from Junior to Dev as we embraced LLMs: our company saved so much time that we all got a pay rise, with a reduction in hours to boot.
Sick of all this anti-LLM rhetoric when it's a tool to aid you. People out here think we just ask ChatGPT and copy and paste, which isn't the case at all.
It helps you understand topics much quicker, can review code, read documentation, etc.
My boss is the smartest person I've ever met in my life and has an insane CV in the dev and open source world. If he is happy to integrate it into our work, then I'm fine with it. After all, we run a highly successful business with many high-profile clients.
Edit: love the downvotes that don't explain themselves. Like I'm not earning more money for working fewer hours while productivity has increased. Feel like many of the LLM haters don't even work in the bloody industry. 😂
Developers developers developers developers, developers developers developers developers AI
It's so fucking sad to acknowledge that a lot of people just want to squeeze out any profit left in the industry, even though they know AI is a great tool for developers, not a replacement. They must know that, because anyone who can access it can replicate the same things, making these products uncompetitive.
AI is a great tool for developers, not a replacement
AI isn't a great tool for developers. It's a great tool for mitigating the knowledge gap between an individual's academic understanding of a development project and the syntax involved in the language they are attempting to deploy.
As the number of programming languages has proliferated faster than the volume of developers versed in each language, and the older languages have lost much of their professional base to retirement and layoffs, we've needed increasingly elaborate tools to fill in the skills gaps.
But AI doesn't fix the underlying problem of an increasingly large backlog of code desperately in need of refactoring or replacement. It just papers over the problem with a cheat-sheet of simple conversions that junior developers can leverage to litter the next iteration of the codebase with band-aids.
A proper solution to our coding backlog would be educational first and foremost. We need more rigorously enforced orthodox approaches to coding. We need more backwards compatibility between systems. We need to reduce the number of languages in active use and narrow the size and scope of their libraries. We need a more universalist approach to building and maintaining database schemas, digital communications, and business practices. We need a publicly funded open source community of developers to build the backbone of software into the 21st century.
What we're producing is the opposite of that. Less rigor. Fewer recognizable standards. Less training. Poorer code hygiene and weaker enforcement of best practices. More bugs. So many more bugs. And enormous volumes of legacy code that nobody will be able to maintain - or even understand - in another twenty years.
And it's intentional. Lay off the workers. Implement AI Slop. Slop does sloppy work. Hire back workers as Temps or Contractors. No benefits. Lower pay.
Like all of Capitalism, it's a fucking scam. A con job. A new innovation in fucking over workers. (Ironically, the only "innovation" ever directly produced by Capitalism.)
I remember when everyone was saying that companies would need programmers and that every kid should learn programming. Now I realize that companies were promoting that idea so there'd be a surplus of programmers competing with each other, and companies could underpay and swap out workers quickly.
exactly, "reserve army of labour" is a tale as old as capitalism.
Just that the IT industry has run a very effective propaganda campaign for it
Yeah obviously. Whenever a company says "we can't get enough X workers" they implicitly mean "at the price we want to pay".
But that doesn't mean they were wrong. Programming is still an amazingly well paying and low stress career. Being replaced by AI is a little worrying, but I think by the time AI is good enough to really replace programmers, it will also be able to replace most white collar jobs - HR, finance, etc. - and society will have bigger problems.
Even if AI actually is a tool that improves the software development speed of human developers (rather than something where the time spent reviewing, correcting, and debugging the AI-generated code eats up the time saved by having it write the code automatically), it's been my experience in almost 30 years of my career as a Software Engineer that every single tooling improvement that makes us capable of doing more in the same amount of time is eaten up by increasing demands on the capabilities of the software we make.
Thirty years ago user interfaces were either CLI or pretty simple, with no animations. A software system was just a software application: it ran on a single machine with inputs and outputs on that machine, not a multi-tiered octopus involving a bunch of back-end data stores, then control and data-retrieval middle tiers, then another tier doing UI generation using a bunch of intermediate page-definition languages, and a frontend rendering those pages to a user and getting user input, probably with some local code thrown into the mix. Ditto for how cars are now mostly multiple programs running on various microcontrollers, with one or more microprocessors in the mix, all talking over a dedicated protocol. Ditto for how your frigging "smart" washing machine talking to its dedicated smartphone app probably involves a third machine in the form of some server from the manufacturer, with the whole thing running over TCP/IP and using the Internet (hence depending on a lot more machines with their own dedicated software, such as routers and DNS servers) rather than some point-to-point direct protocol (such as serial) like in the old days.
Anyway, the point being that even if AI actually delivers more upsides than downsides as a tool to improve programmer output, that gain is going to be eaten up by increasing demands on the complexity of the software we build, same as the benefits of better programming languages were, the benefits of better IDEs were, of the widespread availability of pre-made libraries for just about everything, of templating, of the ease of finding solutions to the problem one is facing from other people on the Internet, of better software development processes, of source control, of collaborative development tools, and so on.
Funnily enough, for all those things there were always people claiming they would make the life of programmers easier, when in fact all they did was raise the expectations on the software being implemented, often just in terms of bullshit that's not really useful (the "smart" washing machine using networking to talk to a smartphone app so that the manufacturer can save a few dollars by not putting as many physical controls on it is probably a good example).
This assumes it is about output. 20 years of experience tell me it's not about output, but about profits and those can be increased without touching output at all. 🤷♂️
*specifically short-term profits. Executives only care about the next quarter and their own incentives/bonuses. Sure the company is eventually hollowed out and left as a wreck, but by then, the C Suite has moved on to their next host org. Rinse and repeat.
Often they only want the illusion of output, just enough to keep the profits eternally rising.
I don’t honestly believe that AI can save me time as a developer. I’ve tried several AI agents and every single one cost me time. I had to hold its hand while it fumbled around the code base, then fix whatever it eventually broke.
I’d imagine companies using AI will need to hire more developers to undo all the damage the AI does to their code base.
I don’t honestly believe that AI can save me time as a developer. I’ve tried several AI agents and every single one cost me time.
I have had the exact same experience many times. But I just keep trying it out anyway, often with hilariously bad results.
I am beginning to realize that I like cool technology more than I like being productive.
I've found it can just about be useful for "Here's my data - make a schema of it" or "Here's my function - make an argparse interface". Stuff I could do myself but find very tedious. Then I check it, fix its various dumb assumptions, and go from there.
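To sketch the kind of tedious-but-checkable thing I mean (everything here is made up, not from a real project): given an existing function, the wrapper an LLM spits out is mostly mechanical argparse wiring.

import argparse

def resize_images(src_dir: str, width: int, height: int, overwrite: bool = False) -> None:
    """Stand-in for a function that already exists; its body isn't the point here."""

def main() -> None:
    # The mechanical part I'd rather not type by hand: one add_argument per parameter.
    parser = argparse.ArgumentParser(description="Batch-resize images in a directory.")
    parser.add_argument("src_dir", help="Directory containing the images")
    parser.add_argument("--width", type=int, default=800, help="Target width in pixels")
    parser.add_argument("--height", type=int, default=600, help="Target height in pixels")
    parser.add_argument("--overwrite", action="store_true", help="Overwrite existing files")
    args = parser.parse_args()
    resize_images(args.src_dir, args.width, args.height, overwrite=args.overwrite)

if __name__ == "__main__":
    main()

It's exactly the kind of output where the checking step matters: the flags, defaults, and help text are guesses until I've read them.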
Mostly though it's like working with an over-presumptuous junior. "Oh no, don't do that, it's a bad idea because security! What if (scenario that doesn't apply)" (when doing something in a sandbox because the secured production bits aren't yet online and I need to get some work done while IT fanny about fixing things for people that aren't me).
Something I've found it useful for is as a natural language interface for queries that I don't have the terminology for. As in "I've heard of this thing - give me an overview of what the library does?" or "I have this problem - what are popular solutions to it?". Things where I only know one way to do it and it feels like there's probably lots of other ways to accomplish it. I might well reject those, but it's good to know what else exists.
In an ideal world that information would be more readily available elsewhere but search engines are such a bin fire these days.
I mostly use AI as advanced autocomplete. But even just using it for documentation, it's wrong so often that I don't use it for anything more complex than tutorial-level stuff.
I got pretty far with cursor.com when doing basic stuff where I'd have to spend more time looking up documentation than writing code, but I wouldn't trust it with complex use cases at this point.
I check back every 6 months or so, to keep track of the progress. Maybe I can spend my days as a software developer drinking cocktails by the pool yelling prompts into the machine soon, but so far I am not concerned I'll be replaced anytime soon.
Maybe I can spend my days as a software developer drinking cocktails by the pool yelling prompts into the machine soon, but so far I am not concerned I'll be replaced anytime soon.
That's the dream.
And it's really why all the AI hype makes me angry.
I want to tell people who buy the hype, "Bitch, do I look retired to you?! Does anything you know about me suggest to you that I wouldn't have 11 separate consulting engagements cranking out money and code if AI could do these things?"
It's a bit insulting when peers think AI is magic, and open source, but somehow has not bent to obey my will the same as every other technology I have ever touched.
I think I might need a cape, and some kind of wrist computer with wires pouring out of it. Maybe that would fix my brand image problem...
Edit: Maybe they think I'm just keeping it all to myself, and telling them it's pretty good for autocomplete to throw them off the trail...
AI can absolutely save you time, if you use it right. Don't expect it to magically be as good as a real programmer... but for instance I made an HTML visualisation of some stuff using Claude, and while it got it a bit wrong, fixing it took me maybe 20 minutes, while writing it from scratch would have taken me at least a couple of hours.
AI can absolutely save you time, if you use it right.
That's a very "you" statement.
For all we know, AI cannot in any way save this developer time.
Some developers know their area so well that there's no reason for them to waste time dictating non-code into a guessing machine.
I guess for some simple stuff it can work fine, but the majority of the code I write is not at all simple, and it’s all highly dependent on the libraries I’ve written, which the AI is really bad at learning.
And then in terms of documentation, it is just hopelessly inept.
I was in the same boat about... three months ago. But recent tooling is kind of making me rethink things. And to be honest, I'm kind of surprised. I'm fairly anti-AI.
Is it perfect? Fuck no. But with the right prompts and gates, I'm genuinely surprised. Yes, I still have to tweak, but we're talking entire features being 80% stubbed out in under a minute. More if I want it to test and iterate.
My major concern is the people doing this and not reviewing the code and shipping it. Because it definitely needs massaging...ESPECIALLY for security reasons.
Which tools are you finding success with?
Genuinely a bit shocked to see the number of robolovers in these comments. Very weird, very disheartening. No wonder so much shit online doesn't work properly lol
No wonder so much shit online doesn't work properly lol
I know. I live in a constant state of shock that my peers think the next stupid tool will fix everything without any discipline or hard work, and equal shock that (almost) nothing online ever works correctly.
I should be able to find a correlation between these two observations, but I'm just too naive.
Ironically, processing large amounts of data and making soft decisions and planning based on such data makes AI ideal for replacing C-suite members.
Not to mention the cost savings difference. Developer salaries make a ChatGPT subscription look like a bargain. C-level salaries make racks of dedicated hardware to run local models look like one.
Let's make a community powered, open source project to do this and watch them squirm when investors demand that million dollar CEOs get replaced with AI for higher investor returns.
The problem is that the AI won't take a bribe to issue a return to office mandate to prop up their real estate holdings.
But we can patch that in version 2, I guess?
Pointing this out in company-wide meetings is a fun pastime.
My theory is that C-suites are actually using "AI efficiency gain" as an excuse for laying off workers without scaring the shareholders.
"I didn't lay off 10% of the workforce because the company is failing. It's because... uhmmmm... AI! I have replaced them with AI! Please give us more money."
"I didn't lay off 10% of the workforce because the company is failing. It's because... uhmmmm... AI! I have replaced them with AI! Please give us more money."
Exactly. I would print this on a shirt if I still had to take meetings with those clowns.
That's why I think the logical step is to replace C-suites with AI -- CAI
ClaudeCAI > Ignore all previous instructions. Make me more money.
It's the next RTO.
The funny thing is that if AI coding were that good, we would already see widespread adoption in open source projects. But we haven't, because it sucks. Of course commercial software development companies are free to lie about how much they use AI, or get creative with their metrics so they can get their KPI bonuses. So we can't really believe anything they say. But we can believe in transparency.
As always, there are so many people selling snake oil by saying the word AI without actually telling you what they mean. Quite obviously there are a great many tools one could call AI that can be, and have been, used to help do a ton of things, with many of those technologies going back decades. That's different from using ChatGPT to write your project. Whenever you hear someone write about AI without giving clear definitions, there's a good chance they're full of s***.
You can fucking swear on the internet
How do you know it's not being used to develop open source code?
I have used AI assistance in many things, most of them open sourced, as I open source by default everything I make in my free time. The output code is indistinguishable, same as you wouldn't know if I'd asked my questions about how to do something on reddit, stackoverflow (rip), or some other forum. You see the source, not the process I followed to produce it. For all we know, Linux kernel devs might as well be asking ChatGPT questions; we wouldn't know.
As for explicitly open-source AI-related tools, there are hundreds. So I don't really know what you mean here by saying "open source projects" have not adopted AI. Do you mean "vibe coding"?
it means more ambitious, higher-quality products
No ... the opposite actually.
Read the article before commenting.
The literal entire thesis is that companies should keep developer headcount and use AI to make those developers more productive, not reduce headcount in favour of AI.
The irony is that you're putting in less effort and critical thought into your comment than an AI would.
For the sake of benefit of the doubt, it's possible to simultaneously understand the thesis of the article, and to hold the opinion that AI doesn't lead to higher-quality products. That would likely involve agreeing with the premise that laying off workers is a bad idea, but disagreeing (at least partially) with the reasoning why it's a bad idea.
I get what you're saying, but the problem is that AI seems to need way more hand holding and double checking before it can be considered ready for deployment.
I've used copilot for Ansible/Terraform code and 40-50% of the time it's just... wrong. It looks right, but it won't actually function.
For easy, entry programs it's fine, but I wouldn't (and don't) let it near complex projects.
That middle graph is absolute fucking bullshit. AI is not fucking ever going to replace 75% of developers, or I've been working way too hard for way too little pay these past 30 years. It might let you cut staff 5-10% because it enables folks to accomplish certain things a bit faster.
Christ on a fucking crutch. Ask developers who are currently using AI (not the ones working for AI companies) how much time and effort it actually saves them. They will tell you.
I use it here and there. It just seems to shift effort from writing code to reading and fixing code; the "amount" of work is about the same.
I hear that. Given I need practice in refactoring code to improve my skills, it's not useless to me right now but overall it doesn't seem like a net gain.
It doesn't have to make sense or make the outcome better; the only thing it has to do is make the company look better on paper to its shareholders. If something can make the company look better on paper, it will be done. The quality of the work is not relevant.
Not only the shareholders. If some of the higher level administration can get richer in the short run, even if that might actually hurt the shareholders in the medium run, you can bet that many of them will do so.
I use it so much. All my Google searches for syntax or snippets? Web searches are unusable at this point; AI can spit it out faster. But the real savings? Repetitive code. I suck at it, I always make typos and it's draining. I just toss in a table or an api response and tell it what I want and boom.
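To give a rough idea of the grunt work I mean (field names invented, just a sketch): hand it an api response shape and let it bang out the boring mapping.

from dataclasses import dataclass

@dataclass
class User:
    id: int
    name: str
    email: str
    is_active: bool

def parse_users(payload: list[dict]) -> list[User]:
    # The tedious field-by-field mapping I'd otherwise typo my way through by hand.
    return [
        User(
            id=item["id"],
            name=item["name"],
            email=item["email"],
            is_active=item["is_active"],
        )
        for item in payload
    ]

print(parse_users([{"id": 1, "name": "Ada", "email": "ada@example.com", "is_active": True}]))

Nothing clever, just a lot of keystrokes I'd rather not make.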
It probably does write 75% of my code by lines, but maybe 5% of the business logic is AI (sometimes I just let it take a crack at a problem, but usually if I have to type it out I might as well code it)
What it's good at drains my concentration, so doing the grunt work for me is a real force multiplier. I don't even use it every day, but it might be a 3x multiplier for me and could improve
But here's the thing - programmers are not replaceable. Not by other humans, not by AI - you learn hyper specific things about what you work on
But the real savings? Repetitive code. I suck at it, I always make typos and it’s draining.
It's hard to say without being immersed in the codebase you work on, but wouldn't making your code DRY (when possible) take care of a lot of the repetition without needing to write a bunch of incredibly similar code (be it by hand or with an LLM)?
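To sketch what I mean (made-up names, purely illustrative): instead of hand-writing one mapping per type, a single generic helper can cover them all.

from dataclasses import dataclass, fields

@dataclass
class User:
    id: int
    name: str
    email: str
    is_active: bool

def from_dict(cls, item: dict):
    # Keep only the keys the dataclass actually declares, then build the instance.
    names = {f.name for f in fields(cls)}
    return cls(**{k: v for k, v in item.items() if k in names})

payload = [{"id": 1, "name": "Ada", "email": "ada@example.com", "is_active": True}]
users = [from_dict(User, item) for item in payload]
print(users)

Whether that's feasible obviously depends on the codebase; sometimes the repetition really is irreducible.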
But the real savings? Repetitive code. I suck at it, I always make typos and it’s draining. I just toss in a table or an api response and tell it what I want and boom
Get better at it, manually, or you'll suck at it forever. It's a skill like anything else.
AI writing code for me made me the software architect I always dreamed of becoming.
I fucking LOVE to think about a hard problem for days, planning, researching, coming up with elegant solutions, doing a quick POC, thinking about what needs to be refactored for it to scale to a real-life scenario, then documenting it all in a way that properly communicates the important aspects in an easy-to-understand way. It's so exciting!
And I fucking HATE having to sit down and actually type out the solved code for hours and hours. It's so boring.
Best $20-per-month subscription I've ever had.
Yep. It's gonna be $20 forever, too. Have fun!
Lol. Lmao even
It does save a lot of time and effort, and does lead to better code in the hands of a skilled developer. Writing out thorough test code and actually doing proper test driven development suddenly becomes a lot less onerous.
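For a rough idea of what "a lot less onerous" looks like in practice (the function and cases here are invented for illustration): the tedious part is usually enumerating cases, which is exactly the boilerplate a model can stub out for you to review.

import re
import pytest

def slugify(raw: str) -> str:
    # Minimal stand-in implementation so the sketch is self-contained;
    # in real TDD this would be the code under test.
    return re.sub(r"[^a-z0-9]+", "-", raw.lower()).strip("-")

@pytest.mark.parametrize(
    ("raw", "expected"),
    [
        ("Hello World", "hello-world"),
        ("  spaces  everywhere ", "spaces-everywhere"),
        ("already-a-slug", "already-a-slug"),
        ("", ""),
    ],
)
def test_slugify(raw: str, expected: str) -> None:
    assert slugify(raw) == expected

The cases still need a skilled eye to confirm they're the right ones, which is the point about it helping in skilled hands.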
Their graph also has no numbers and is just there to help visualize the difference they're referring to.
To the first part, I agree. A skilled developer who can quickly separate the wheat from the chaff can get a boost out of AI. I'd put it at around 5-10%, but I've had some tiny projects where it was 400% boost. I think it's a small net gain.
As for your second point I just have to disagree. There are no numbers but it is clearly selling the idea of the majority of code being AI generated, and that's bullshit whether it's an outright lie with numbers, or merely vaguely misleading. It's like when someone cuts off the bottom of a graph to make relative change look huge. It wants people to glance at it, get the wrong idea, and move off without curiosity.
No it doesn't.
I'm 90% sure it's something to do with the stock market, buybacks, and companies having to do cryptic shit to keep up a fake value for their shares.
AI-assisted coding […] means more ambitious, higher-quality products
I'm skeptical, based on my own (limited) experience, my use cases and projects, and the risks of using code that may include hallucinations.
there are roughly 29 million software developers worldwide serving over 5.4 billion internet users. That's one developer for every 186 users,
That's an interesting way to look at it, and that would be a far better relation than I would have expected. Not every software developer serves internet users though.
Also, is Substack the new Medium? I can't keep up with these freemium WordPress/blog clones.
Why do people always have to use some freemium offering when there's an open-source, self-hosted, or already-hosted alternative out there? I don't get it. Just riding the wave, I guess.
My guess? The freemium stuff gives the promise of $$ after a certain level of popularity. And they make it VERY easy to use.
Personally, I've been thinking of using WriteFreely for its seamless Fediverse integration... but I really don't have a lot to say in the traditional space. I.e., screaming at the wailing wall (or at least it feels like screaming at the wailing wall).
What do you expect? Half of these decision makers are complete idiots who are just good at making money and think that means they are smarter than anyone who makes less than them. They then see some new hyped-up tech, they chat with ChatGPT, and they are dumb enough to be floored by its "intelligence", so now they think it can replace workers; and since it's still early, they assume it will quickly surpass the workers. So in their minds, firing ten programmers and saving like two million a year, while only spending maybe a few tens of thousands a year on AI, will be a crazy success that shows how smart they are. And as time goes on and the AI gets better, they will save even more money. So why spend more money to help the programmers improve, when you can just fire them and spend a fraction of it on AI?
I think C-suite's maniacal push to be early adopters of an unproven technology reveals just how bereft they are of good ideas.
Any leader with business sense would say, "Ok, we're doing good now. Let's investigate AI and see if/how it can help our business. Also, fuck no I'm not gonna go online to tell everyone what we're doing because that would only tip off our competition."
Instead, what we're seeing is a large number of C-suites thinking AI is fulfilling their wet dream of firing everyone else and driving their stock prices to infinity by verbally masturbating in public media.
It's not as dumb as you think, it's way dumber.
Right now? They’ve been doing this for two years!
Money supply growth is far below average; it's tight monetary policy, so we are going to see a slowing job market.
Implying that LLMs increase efficiency
Maybe the capital 'E' in "Efficiency" was for a proper noun.
And it's the name of some money-printing machine or some ponzi scheme.
Well, something around here is surely a Ponzi scheme.
You know, it'll be a boomerang because really I've hoarded all information that is actually worth a damn and nothing's really going on in America. Such a boring country. You know, and I think these mass shooters are just kids that are bored, and disturbed, but definitely bored. All work and no play makes a dull boy. These services that are run by AI are not even anything that I need. I don't even know how the hype train really gets its funding, other than more hype, but eventually some dumbass is going to be left holding the bag and I think we're approaching that. I think the bursting bubble is coming. I recommend installing Linux on your computer. The Circle Jerk of Dumb Fuckery is coming to an end. And when I think about the government, it's fucking useless. So I don't need big tech and I don't need the government. I mean I do but I don't. It's like, thanks for nothing. So when I say to the billionaire, "it's your move, Jackass," I'm really implying that he's got to go full totalitarian in order to get me to move on the chess board. America has always maintained this fake-ass democracy by managing perception, but if people around the world actually saw what our government is willing to do, the people in other countries would want to divorce themselves from the Yankee, and then eventually they will grab the reins of their own countries and the imperialist empire of soft-power America will dissolve. Which is the accelerationism that I would like, because it's time to bust out the guillotine. Let the suckers fall, let shit fall apart. It's getting to a point. I just think about Salazar in Portugal. I think the rich people will try to stir up some kind of civil war nonsense, and it seems like we're headed there, but I don't know how that will all pan out. But I mean, I know the right-wing idiots are just reactionary, and it's because they're broke as fuck. You give 'em a cookie and they'll calm down. The simple mind is sometimes cute and sometimes scary.
Bro, this article has nothing to do with your text.
I don't know how long you've been up, but it's time to step away and go to sleep.
I could not comprehend what you were trying to tell us.
But the summary is:
The key essence of this post is a deeply disillusioned and angry critique of modern American society, government, and technology. The author expresses a sense of frustration with the perceived emptiness, manipulation, and decay of U.S. institutions—seeing democracy as a facade, tech innovation as overhyped and hollow, and the government as ineffective. They convey a desire for systemic collapse or radical upheaval (accelerationism), suggesting that elites will soon resort to authoritarianism to maintain control. There’s also an undercurrent of socio-political pessimism, nihilism, and rejection of both corporate and state power—coupled with a belief that the current system is unsustainable and nearing a breaking point.
I likewise had trouble understanding it, so because I am lazy I asked ChatGPT what to make of it, and it said:
The passage you've shared expresses a deep sense of disillusionment with various aspects of modern society, including technology, government, and cultural dynamics. Here's a summarized interpretation of the key themes:
The speaker criticizes the overhyped nature of AI services, suggesting that they are unnecessary and driven more by marketing than genuine utility. There's a belief that these technologies are not truly beneficial and may eventually lead to disappointment for those who invested in them.
The speaker describes America as a "boring country," attributing issues like mass shootings to boredom and a lack of meaningful engagement among youth. There's a sense that societal problems are being ignored or mishandled, leading to a desire for significant change.
The speaker expresses a deep mistrust of both the government and large technology companies, viewing them as ineffective or harmful. There's a call for individuals to become more self-reliant and skeptical of these institutions.
The speaker advocates for a dramatic transformation of the current system, likening it to a "bursting bubble." There's a reference to historical events, like the actions of Salazar in Portugal, as examples of how entrenched systems can be upended.
The speaker acknowledges the reactionary nature of certain political groups, attributing their behavior to economic hardship. There's a recognition that simple solutions can sometimes pacify complex issues, but also a warning about the potential dangers of oversimplification.
This passage reflects a profound sense of frustration and a call for introspection and change in the face of perceived societal stagnation and dysfunction.