The future of web development is AI. Get on or get left behind.
What a glorious site. I wish every webpage looked something like this
What the fuck is Silverlight
Microsoft Flash. Netflix used it for a while. I don't remember anything else using it.
A bunch of Disney movie sites did for a while, back in the day when every movie had its own website with trailers, promo, and a link to buy tickets and/or the DVD release.
The League of Legends launcher used it at one point. Not sure if it still does.
EA Tiburon in Orlando used Flash for a while to do the menus in Madden and other sports games.
Be glad you never had to interact with that 'technology'. I once did, at an internship in 2016.
I'm not defending AI here, but "people have been wrong about other things in the past" is a completely worthless argument in any circumstance. See: Heuristics that Almost Always Work.
Is it worthless to say "(the current iteration of) AI won't be a huge revolution"? Sure, it might be; the next decade will determine that.
Is it worthless to say that many companies are throwing massive amounts of money at it, and taking huge risks on it, when it clearly won't deliver for them? I would say no, that is useful.
And in the end, that's what this complaint seems like to me. The issue isn't "AI might be the next big thing", but "we need to do everything with AI right now", and then in a couple of years, when they see how bad the results are and how negatively it impacted them, no one will have seen it coming...
I can't help but read this while replacing "rock" with "large language model"
Heuristics that almost always work. Hmm.
Interesting article, but you have to be aware of the flipside: "people said flight was impossible", "people said the earth didn't revolve around the sun", "people said the internet was a fad, and now people think AI is a fad".
It's cherry-picking. They're taking the relatively rare examples of transformative technology and projecting that level of impact and prestige onto their new favoured fad.
And here's the thing, the "information superhighway" was a fad that also happened to be an important technology.
Also the rock argument vanishes the moment anyone arrives with actual reasoning that goes beyond the heuristic. So here's some actual reasoning:
GenAI is interesting, but it has zero fidelity. Information without fidelity is just noise, so a system that can't solve the fidelity problem can't do information work. Information work requires fidelity.
And "fidelity" is just a fancy way of saying "truth", or maybe "meaning". Even as conscious beings we haven't really cracked that issue, and I don't think you can make a machine that understands meaning without creating AGI.
Saying we can solve the fidelity problem is like Jules Verne in 1867 saying we could get to the moon with a cannon because of "what progress artillery science has made during the last few years". We're just not there yet, and until we are, the cannon might have some uses, but it's not space technology.
Interestingly, artillery science had its role in getting us to the moon, but that was because it gave us the rotating workpiece lathe for making smooth bore holes, which gave us efficient steam engines, which gave us the industrial revolution. Verne didn't know it, but that critical development had already happened nearly a century prior. Cannons weren't really a factor in space beyond that.
Edit: actually metallurgy and solid fuel propellants were crucial for space too, and cannons had a lot to do with that as well. This is all beside the point.
Saying we can solve the fidelity problem is like Jules Verne in 1867 saying we could get to the moon with a cannon because of "what progress artillery science has made during the last few years".
Do rockets count as artillery science? The first rockets basically served the same purpose as artillery, and were operated by the same army groups. The innovation was to attach the propellant to the explosive charge and have it explode gradually rather than suddenly. Even the shape of a rocket is a refinement of the shape of an artillery shell.
Verne wasn't able to imagine artillery without the cannon barrel, but I'd argue he was right. It was basically "artillery science" that got humankind to the moon. The first "rocket artillery" were the V1 and V2 bombs. You could probably argue that the V1 wasn't really artillery, and that's fair, but also it wasn't what the moon missions were based on. The moon missions were a refinement of the V2, which was a warhead delivered by launching something on a ballistic path.
As for generative AI, it doesn't have zero fidelity, it just has relatively low fidelity. What makes that worse is that it's trained to sound extremely confident, so people trust it when they shouldn't.
Personally, I think it will take a very long time, if ever, before we get to the stage where "vibe coding" actually works well. OTOH, a more reasonable goal is a GenAI tool that you basically treat as an intern. You don't trust it, you expect it to do bone-headed things frequently, but sometimes it can do grunt work for you. As long as you carefully check over its work, it might save you some time/effort. But, I'm not sure if that can be done at a price that makes sense. So far the GenAI companies are setting fire to money in the hope that there will eventually be a workable business model.
No one can predict the future. One way or the other.
The best way to not be left behind is to stay flexible about whatever may come.
Can't predict the future, but I can see the past. Specifically the part of the past that used standards-based implementations and boring technology. Love that I can pull up HTML with elements in ALL CAPS and table-aligned content. It looks like a hot mess but it still works, even on mobile. Plain text keeps trucking along. SQLite will outlive me. Exciting things are exciting, but the world is made of boring.
Once both major world militaries and hobbyists are using it, it's over. You can't close Pandora's box. Whatever you want to call the current versions of "AI", it's only going to get better. Short of major world catastrophes, I expect it to drive not only technological advances but also energy/efficiency advances. The big internet conglomerates are already integrating it into search, and I fully expect search to be transformed into an assistant-like chatbot (or something thereof) within the next 5 years.
I think it's shortsighted not to see the potential of accumulating society's knowledge and being able to present that to people in an understandable way.
I don't expect it to happen overnight. I'm not expecting iRobot or Android levels of consciousness any time soon, but the world is progressing toward the automation of many things - driven by Capital(ism) - which is powerful in itself.
(let me preach a little, I have to listen to my boss gushing about AI every meeting)
Compare AI tools: now vs 3 years ago. All those 2022 "Prompt engineer" courses are totally useless in 2025.
Extrapolate into the future and realize that you're not losing anything valuable by not learning AI tools today. The whole point of them is that they don't require any proficiency. It "just works".
Instead focus on what makes you a good developer: understanding how things work, which solution is good for what problem, centering your divs.
Remember when "The Cloud" was going to put everyone in IT out of a job?
Naming it "The Cloud" and not "Someone else's old computer running in their basement" was a smart move though.
It just sounds better.
I don't think it was supposed to replace everyone in IT, but every company had system administrators or IT administrators who would work with physical servers, and now they are gone. You can say that the new SREs are their replacement, but it's a different set of skills, more similar to SDE than to system administration.
I just think this is patently false. Or at least there are/were orgs where cloud costs so much more than running their own servers, which are tended by maybe 1 FTE spread across a bunch of admins mostly doing other tasks.
Let me just point out one recent comparison: we were considering cloud backup for a couple petabytes of data, with a few hundred GB changing, being added, or being restored every week or less. I think the best deal, holding the software costs equal, was $5/TB/month.
This is catastrophically more expensive over the 10-year lifespan of a server or two and a small/mid-sized LTO9 tape library and tapes. For one thing, we'd have paid more than the cost of the server etc. in about a year. After that, tape prices have always tended down over time, and the storage cost for tape is basically $0 once it's in archive storage. We put it in a cabinet in another building, and you can fit A LOT of data on these tapes in a small room. That'll cost basically $0 additional for 20 years, forget about 10. So let's add in electricity etc.; I still doubt that will come to over ~$100k over the lifetime of the project. Labor is about a wash, because you still need people to manage the backups to the cloud, and actually moving tapes might be ~0.05 FTE in our situation. Literally anyone can be taught how to do it once the backup admin puts the tapes in the hopper or tells them which serial # to put in the hopper.
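Rough back-of-the-envelope, using the figures above (a couple petabytes at $5/TB/month; everything else is just arithmetic):

    // cloud backup cost at the quoted rate, over a 10-year server lifespan
    const terabytes = 2000;       // "a couple petabytes" ~= 2,000 TB (assumed)
    const ratePerTbMonth = 5;     // best quoted deal: $5/TB/month
    const years = 10;
    const total = terabytes * ratePerTbMonth * 12 * years;
    console.log(`$${total.toLocaleString()}`); // $1,200,000, i.e. ~$120k/year

That ~$120k/year is why the tape hardware pays for itself in roughly the first year.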
I also think that many companies are finding something similar for straight servers; at least it was in the news quite a bit for a while. Now, if you can be entirely cloud native, maybe it washes out, but for large groups of people that's still not possible, due to controlling hardware (think factory, scientific, etc.) or existing desktop software for which the cloud isn't really a replacement and throughput isn't great (think Adobe products, video, scientific, financial, etc. data).
And some companies (like mine) just have their SDEs do the SRE job as well. Apparently it incentivizes us to write more stable code or something
Many of our customers store their backups in our "cloud storage solution".
I think they'd be rather less impressed to see the cloud is in fact a jumble of PCs scattered all around our office.
Yeah, AI is going to put some people out of work, but in turn it will open lots of more specialized positions. And the people in the positions that are lost could adapt to AI (for example, becoming part of the training instead of just being let go).
There is still a difference.
The cloud was FOR the IT people. Machine learning is for predicting patterns in data.
Maybe stock predictors will adapt or be replaced, but the average programmer didn't have to switch to Replit because it's a "cloud IDE".
I mean, isn't that what "get on or get left behind" means?
It does not necessarily mean you'll lose your job. Nor does "get on" mean you have to become a specialist on it.
The post picks specifically on things that didn't catch on (or that only caught on for a period of time and were eventually superseded), but doesn't apply the same lens to successful technologies.
I still think PWAs are a good idea instead of needing to download an app on your phone for every website. For example, PWAs could easily replace most banking apps, which are already just PWAs with added tracking.
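To be clear about how little a PWA actually requires: it's basically a web app manifest (name, icons) plus a service worker. A minimal sketch of the service worker side (file names illustrative):

    // in the page: register the service worker so the app can work offline
    if ('serviceWorker' in navigator) {
      navigator.serviceWorker.register('/sw.js');
    }

    // sw.js: cache-first handler for the app shell
    const CACHE = 'app-shell-v1';
    self.addEventListener('install', (e) => {
      e.waitUntil(caches.open(CACHE).then((c) => c.addAll(['/', '/index.html'])));
    });
    self.addEventListener('fetch', (e) => {
      e.respondWith(caches.match(e.request).then((r) => r || fetch(e.request)));
    });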
They're great for users, which is why Google and Apple are letting them die from lack of development so apps can make them money.
Had to click through to change my downvote to an upvote, lol.
None of those examples are relevant.
Those examples are specific tools or specific implementation patterns; AI in development is a general-purpose tool.
It doesn't dictate how you write software or what the written code will look like; it's a tool that speeds up your code writing. It catches typos and silly bugs that take hours to debug, it can generate useful unit tests, it can clean up and apply my code style way better than CodeMaid or ReSharper ever could, and it's taken care of so much tedious shit and made software development fun again.
Vibe coding is not the future of development. If you aren't learning to use AI as a tool in development, you are going to be left behind.
It's more apt to compare it to IDEs. Sure, you can still write your entire app in vim and compile it in the terminal, but you would have been very foolish to deny that the future of development was in IDEs.
You're describing exactly how all these web tools worked. "HTML, CSS, and JS are too hard to do manually. Here's a shiny new tool that abstracts all that away and lets you get right to making your site!" Except they all added additional headaches, security concerns, and failed to fill in edge cases, so you still need to know how to do all that HTML, CSS, and JS anyway. That's exactly how LLM generated code works now. It'll be useful and common for a while and then the technical debt will pile up and pile up and eventually everyone will look around and think "what the hell were we thinking" and tear it all down.
None of those examples are relevant.
They seem pretty relevant. Those things didn't go away, but they also didn't remove the need for programmers (the way their sales people said they would).
Pretty much everyone I work with uses vim, emacs, sublime, or vscode. I like IDEs and use them for... well, Java, but I wouldn't argue that they've made the other tools obsolete or that you're a fool for sticking with the old ones. If it ain't broke and all that. It actually seems like more people are moving back to pluggable text editors over IDEs.
I've used AI tools a bit. They've really helped drop in code that would previously just be a bunch of TODOs; they get you up and writing the core parts much faster to see if the idea even works. They've also really helped answer specific questions or lead me towards the answer. They've also straight up lied to me quite a bit. It's a weird tool.
I think the OP image is pretty wrong with the comparison it makes. LLMs/AI are a class of technology that are most definitely not going anywhere unless something dramatic happens. Some people, myself included, feel uneasy about the way they're created and the fact that people in powerful positions completely misunderstand them, and I think that leads to the hope that they're just a fad.
It is always hilarious and strange to see the buy-in on these things. We have a single coder in his late 60s that has bought in hard to spicy autocorrect. Meanwhile, the youngest on our team (like 22) won’t touch it with a 10 ft pole.
The other issue is just the morality of it. Do I know people that got rich on Bitcoin? Yes. Do I feel like they’re participating in a pyramid scheme still? Also yes. And with spicy autocorrect, where they got their training data for any and all of these models is so freaking morally bankrupt, and they’re desperate to paper over that and make it “ok” for businesses to use it.
As an old fart you can’t imagine how often I heard or read that.
You should click the link.
Hehe. Damn, absolutely fell for it. Nice 😂
Yeah but it's different this time!
I do wonder about inventions that actually changed the world or the way people do things, and if there is a noticeable pattern that distinguishes them from inventions that came and went and got lost to history, or that did get adopted but do not have mass adoption. Hindsight is 20/20, but we live in the present and have to make our guesses about what will succeed and what will fail, and it would be nice to have better guesses.
I'd love to read a list of those instances/claims/tech
I imagine one of them was low-code/no-code?
/edit: I see such a list is what the posted link is about.
I'm surprised there's not low-code/no-code in that list.
"We're gonna make a fully functioning e-commerce website with only this WYSIWYG site builder. See? No need to hire any devs!"
Several months later...
"Well that was a complete waste of time."
You're right. It belongs on the list.
I was told several times that my programming career was ending when the first low-code/no-code platforms were released.
This technology solves every development problem we have had. I can teach you how with my $5000 course.
Yes, I would like to book the $5000 Silverlight course, please.
I'm skeptical of the author's credibility and vision of the future if he hasn't even reached blink-tag technology in his progress.
<blink>
How dare they!</blink>
The future of web development is Angelfire.
It's funny, but also holy moly do I not trust a "sign in with GitHub" button.
Might I ask why? There are some pages where I see that as the least evil option, e.g. DuckDNS.
Basically because my Github account has an important job, and I don't want to increase its attack surface by using it as a pseudo-Facebook
Good thing I hate web development
glorified autocomplete
Which is honestly its best use case. That and occasionally asking it to generate a one-liner for a library call I don't feel like looking up. Any significant generation tends to go off the rails fast.
Getting it to format documentation for you seems to work a treat. Nothing too complex, just "move this bit here, split that into points".
If you use it basically like you'd use an intern or junior dev, it could be useful.
You wouldn't allow them to check anything in themselves. You wouldn't trust anything they did without carefully reading it over. You'd have to expect that they'd occasionally completely misunderstand the request. You'd treat them as someone completely lacking in common sense.
If, with all those caveats, you can get this assistance for free or nearly free, it might be worth it. But, right now, all the AI companies are basically setting money on fire to try to drive demand. If people had to pay enough that the AI companies were able to break even, it might be so expensive it was no longer worth it.
You sir haven't railed an entire ui out of your vibes up asshole
I've been using it to write unit tests; I still need to edit them to mock out some things and change a bit of logic here and there, but it saves me probably 50-75% of the time it used to take, just from not having to hand-write all that code.
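To give an idea, the generated output is usually something like this Jest-style skeleton; the mock line and the trickier assertions are the parts I still write by hand (module names are illustrative):

    // AI-drafted test skeleton
    jest.mock('./emailService'); // added by hand so no real email gets sent
    const { sendWelcome } = require('./emailService');
    const { registerUser } = require('./users');

    test('registering a user sends a welcome email', async () => {
      await registerUser('ada@example.com');
      expect(sendWelcome).toHaveBeenCalledWith('ada@example.com');
    });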
I love how it fucks up closing braces/parentheses, some advanced tech right there.
I use it to discuss the pros and cons of potential refactorings, then laugh as it botches the implementation.
I use it to find easy to miss errors.
Thanks for summing it up so succinctly. As an aging dev, I've seen quite a lot of tech come and go. I wish more people interested in technology would spend more time learning the basics and the history of things.
If you're not using Notepad, I don't even know what to tell you.
JEdit 4 life!
10/10. No notes.
It pains me so much when I see my colleagues pay OpenAI to do programming assignments... they find it faster to ask GPT than to learn it properly. Sadly, I can say nothing to them, or I would risk worsening relations with them.
I left 10 years ago, web development is shit.
I don't remember progressive web apps having anywhere near the level of fanfare as the other things on this list, and as someone who has built several PWAs, I feel their usefulness is undervalued.
More apps in the app store should be PWAs instead.
Otherwise this list is great and I love it.
I love this
I will use AI to prompt AI to code for me, free money 🤑
I can see this partly being true in that it'll be part of a dev's toolkit. The devs at my previous job loved using it to do busy work coding.
"busy work coding" is that what you do when you try to look like you're working (like a real dev)?
Real-world development isn't creating exciting apps all the time; it's writing the same exact boring, convention-based code, sticking to an established pattern.
It can be really boring and unchallenging to create your millionth repository, or you can prompt your IDE to create a new repo, and with one sentence it will stub out 10 minutes' worth of tedious prep work. It makes programming fun again.
In one prompt, it can look at my finished code and stub out half-decent documentation that otherwise wouldn't have been completed at all. It does hallucinate sometimes, or it completely misunderstands the code, so you have to correct a few sentences, but the brain drain of coming up with the sentence structure to write useful documentation is completely lifted, and the code is now well documented.
AI programming is more than just vibe coding, and it's way more useful than everyone here insists.
We're using it for closing security flaws identified by another tool. It's boring, unchallenging work that is nonetheless still important. It's also repetitive and uncreative enough that I'm comfortable having a machine do it.
There's still human review, but when it's stuff like "your error messages should escape variables" or "write a longer function name", having a tool that can do most of the grunt work is valuable.
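For a sense of how mechanical that class of fix is, here's a sketch of the before/after for "escape variables in error messages" (all names illustrative):

    // before: user-controlled value interpolated raw into an error message
    //   throw new Error(`user ${userName} not found`);

    // after: escape it first, so the message is safe if it ever gets rendered as HTML
    const escapeHtml = (s) =>
      String(s).replace(/[&<>"']/g, (c) =>
        ({ '&': '&amp;', '<': '&lt;', '>': '&gt;', '"': '&quot;', "'": '&#39;' }[c]));

    function findUser(db, userName) {
      const user = db.get(userName);
      if (!user) {
        throw new Error(`user ${escapeHtml(userName)} not found`);
      }
      return user;
    }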
I agree that it will continue to be a useful tool. I've gotten a similar productivity boost from AI auto-complete as I did from regular auto-complete. It's also pretty good at identifying potential issues with code; again, a similar productivity boost to a good linter. The chatbot does make a good sounding board, especially when you don't remember the name of the concept you're trying to implement, or when you need to weigh the pros and cons of two solutions and can't find articles about it.
But all these claims of 10x improvements in development speed are horse shit. Yeah, you might be able to shit out a 5-10,000 LOC tutorial app in an hour or two with prompt engineering, but try implementing a feature in a 100,000 LOC codebase and it promptly shits the bed: hallucinating internal frameworks and microservices, ignoring internal practices, writing straight-up non-functional code, etc. If you spend enough time prompting it, you can eventually massage the solution you need out of it; problem is, it took longer to do that than writing the damn thing yourself.
Oh god the hate in this sub. It is definitely another tool for a dev to use. Like autocomplete or a lot of other stuff a good IDE does to help you. If you don't want to use it, fine. Perhaps you're such a pro that you don't need anything but a text editor. If you're not, and you're ignoring it for whatever petty reasons, you'll probably fall behind all the devs who learned how to use it to get more productive (or, in developer terms, lazier)
Agreed. Like it or not, old-school autocomplete was the same thing, just not as advanced. That being said, comment OP probably didn't click the link.