Beyond enshittification, why does tech oftentimes suck?
Sometimes I’ll run into a baffling issue with a tech product — be it headphones, Google apps like Maps or Search, Apple products, Spotify, other apps, and so on — and when I look for solutions online I sometimes discover it has been an issue for years. Sometimes for many, many years.
These tech companies are sometimes ENORMOUS. How is it that these issues persist? Why do some things end up being so inefficient, unintuitive, or clunky? Why do I catch myself saying “oh my dear fucking lord” under my breath so often when I use tech?
Are there no employees who check forums? Does the architecture become so huge and messy that something seemingly simple is actually super hard to fix? Do these companies not have teams that test this stuff?
Why is it so pervasive? And why does some of it seem to be ignored for literal years? Sometimes even a decade!
Is it all due to enshittification? Do they lock us in as users and then stop giving a shit? Or is there more to it than that?
I worked at Google for over a decade. The issue isn't that the engineers are unaware or unable. Time and time again, some new product or feature would be released for internal testing, it would be a complete disaster, bugs would be filed with tens of thousands of votes begging not to release it, and Memegen would go nuts. And all the feedback would be ignored and it would ship anyway.
Upper management just doesn't care. Reputational damage isn't something they understand. The company is run by professional management consultants whose main expertise is gaslighting. And the layers and layers of people in the middle who don't actually contribute any value have to constantly generate something to go into the constant cycle of performance reviews and promotion attempts, so they mess with everything: re-org, cancel projects, move teams around, duplicate work, compete with each other, and generally make life hell for everyone under them. It's surprising anything gets done at all, but what does get done moves at a snail's pace compared to the outside world. Not for lack of effort; the whole system is designed so you have to work 100 times harder than necessary, and it feels like an accomplishment when you've spent a year adding a single checkbox to a UI.
I'd think that once companies get big enough, they can just buy the promising competition before it becomes a problem. I'd say it's a worthwhile cost to them.
I ran into a guy from high school and it turns out he worked for Microsoft back in the Windows Mobile days. He said that changing even a single button on a submenu would take six months of meetings, and if it involved other departments they would actively sabotage any progress due to the way MS internally made departments compete, so you could basically forget it. He said they literally backdoored software so they could sidestep other departments to get features in.
A common corporate strategy is to block your competition from gaining market share.
For example, a company I used to work for would open accounts in non-viable/non-profitable locations so that our competition would not have the chance to get more market share.
Big corps don't give a shit if it works or not. As long as they are the biggest, they can squeeze out anyone else, so they will launch whatever is trending (Meta/Threads) and bullshit their way into another piece of the pie.
This is what you get when humans try to work beyond our monkeysphere. It's not "capitalism" or "greed" or any other such childish ideas. Groups that large cannot be efficient.
But the trick is having layers of monkeyspheres! The CEO monkey has 20 directors below it, and each of those has 20 people leading people, so it all reports up and gets lost but is "good enough".
The difficulty of keeping something working scales exponentially as its complexity grows. Something of 1x complexity takes 1y effort, but 2x complexity is 10y effort, 3x is 100y, on and on.
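In code, that toy scaling looks something like this (the made-up figures from above, not a real model):

```python
def effort(complexity: int, base: float = 1.0) -> float:
    # Each extra unit of complexity costs ~10x more to keep working.
    return base * 10 ** (complexity - 1)

for x in range(1, 5):
    print(x, effort(x))  # 1 -> 1.0, 2 -> 10.0, 3 -> 100.0, 4 -> 1000.0
```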
Phones/computers/apps are at hilarious levels of complexity now, and even 100k people running flat out can barely maintain the illusion that they "just work." Add enshittification heaping its intentionally garbage experience onto the unintentional garbage experience that is modern computing, and it's just gotten stupid.
Seriously. Millions of things have to go right for your consumer electronics or software experience to work seemingly flawlessly. Think about the compounding probabilities of it. It’s a monument to human achievement that they work as well as they do.
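To put rough, made-up numbers on that compounding:

```python
# Even if each of a million parts/steps works 99.99999% of the time,
# the whole stack only comes out right about 90% of the time.
p_component = 0.9999999
n_components = 1_000_000
print(p_component ** n_components)  # ~0.905
```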
It doesn't help that every new generation adds a new blackbox abstraction layer with little to no end-user benefit, the possibility of duplicated functionality and poor implementation, security concerns, poor support, and requiring a flashy new CPU with system crashing speed tricks to maintain a responsive environment through 12 levels of interpreters.
People who weren't interested in tech found out they could make a lot of money in the field. The scene went from nerds who were passionate about the field to people who would be just as (un)interested in being doctors and lawyers. The vibrancy is gone.
Source: tech-excited nerd who got into the industry in the late aughts.
I definitely agree about the vibe being different from the mid '90s to the early '00s. Lots of passion and energy about the tech. I don't think it's all gone, but it's definitely nowhere near as intense.
Every single new "innovation" is literally locked behind a paywall, sometimes multiple, in tiers. You can't just "buy" anything anymore, you can only lease it, usually at exorbitant prices compared to not that long ago.
Why is it so pervasive? And why does some of it seem to be ignored for literal years?
Considering that you know these problems haven't been fixed, you must still be using the products anyway, and there's your answer: what would the motivation be to fix problems that aren't severe enough to make you stop using the product?
Speaking as a software engineer, it's usually a combination of things.
The root of all evil is that fixing that thing doesn't just take one hour, as it should, but rather a few days. This is mostly preventable by having sufficient automated tests, high code quality, and frequent releases, but it's a lot of work to keep up with. And you really need management not to pressure for early feature delivery, because then devs will skip the work necessary to sustain that high delivery velocity.
Well, and as soon as such a small fix has a chance of taking more than a day or so, you kind of need to talk to management about whether it should be done at all.
Which means probably another day or so of just talking about it, and a good chance of them saying we'll do it after we've delivered this extremely important feature, which usually means 'never', because there is always another extremely important feature.
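For the non-devs, this is roughly what "sufficient automated tests" buys you (a minimal sketch; the function and the bug are invented):

```python
def parse_price(text: str) -> float:
    """Toy function under test: accepts '1,299.99'-style input."""
    return float(text.replace(",", ""))

def test_parse_price_handles_thousands_separator():
    # This was "the bug": a comma used to crash the parser. With the
    # test in place, a later change can't quietly reintroduce it, so
    # the one-hour fix stays a one-hour fix.
    assert parse_price("1,299.99") == 1299.99

test_parse_price_handles_thousands_separator()
print("ok")
```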
This. Worked at a consulting firm doing e-commerce for a client. The client always pushed making changes on banners or promotional texts rather than fixing bugs.
There was an issue with the address validator in the checkout (why and how is irrelevant) and it was raised by the QAs, but we were told to fix it in the future. They didn’t see it as a priority; they preferred a checkout that worked most of the time and wanted to focus on adding a promo banner.
Now I work in a better place, on a product with stakeholders who don’t prioritise new things over fixing stuff, but we still need to fight to have time allocated for technical improvements whose benefits are not directly evident in the final product.
Are there no employees who check forums? Does the architecture become so huge and messy that something seemingly simple is actually super hard to fix?
👆I’m guessing this one is Microsoft. 👆
Apple I cannot explain. They were the gold standard of both brilliant UI and UX, as well as best in class customer support. Now I’m tearing my hair out over seemingly simple things (like their horrendous predictive text in iOS), and I don’t even have any hair.
Apple is a strange beast. I was at their spaceship HQ getting interviewed, and the guy kept pointing out random facts about it. Like, this particular wood was harvested in the winter so that made it better, or that entire segments can be siloed off, or that the full-height glass walls of the cafeteria can be opened on pivots. There was just so much effort in making sure things worked just right.
Meanwhile [this team] had to test software fixes for their product by provisioning ancient Mac minis in a closet lab because they wanted to test the "full experience", so every patch and update they had to do was painful and horribly tested. They all hated each other (which was obvious to me just from my time in their interviews, so I imagine it got really bad during the workday). Everyone seemed on edge all the time. Even the people in the hallways. But they were all super excited that they could order lattes from the iPads tethered to the break room countertops. And they had an apple orchard, I guess. The idea of changing how they do what they do was completely unentertainable.
The whole experience felt surreal, like I had stepped into the world according to The Onion.
Their UX and UI are their bread and butter, but as someone who has done extensive web app development for Safari, if I had a nickel for every time their browser just IGNORED a standard, broke one that previously worked, or added new "features" that broke a standard, passing the responsibility for building a workaround down to individual developers... I'd have a few dollars, anyway. I don't have much faith that their code under the hood is all that good compared to average, beyond the UI, and I think their reputation unjustly leads users to turn a blind eye or give them a pass when their stuff DOESN'T work or works BADLY. "They're Apple... everyone else seems happy. I must be doing something wrong."
Well, I for one experience Apple rage multiple times a week, but I’m so entrenched in their ecosystem, I may never escape. Also, there is no better alternative that would be quick and easy to set up and maintain.
Apple is a victim of always having to build the new thing, so there's never time or resources to fix the old things. They can sometimes do an end run around this by re-releasing the same thing over again and pretending it's new, but then the cycle just begins anew
Aside from the effort required others have mentioned, there's also an effect of capitalism.
For a lot of their tech, they have a near-monopoly or at least a very large market share. Take Windows from Microsoft. What motivation would they have to fix bugs that impact even 5-10% of their userbase? Their only competition is Linux, with its around 4(?)% market share, and macOS, which requires expensive hardware. Not fixing the bug just makes people annoyed, but 90% won't leave because they can't. As long as it doesn't impact enterprise contracts, it's not worth fixing, because the time spent doing that is a loss for shareholders, while new features that can collect data to be sold (like Copilot, for example) generate money.
I'm sure even the devs in most places want to make better products and fight management for more time to deliver features so they can be better quality, but it's an exhausting, sharp uphill battle that never ends. And at the end of the day, the person who made the broken feature with Data Collector 9000 built in will probably get the promotion, while the person who fixed 800 bugs that were 5+ years old gets a shout-out on a Zoom call.
I’m not sure Windows is a good example here since they’re historically well known for backwards compatibility and fixing obscure bugs for specific hardware.
Whereas Linux famously always had driver support issues.
Backwards compatibility - yes I agree, it's quite good at it.
Hardware-specific issues for any OS - disagree. For Windows, that's 80-90% handled by the hardware manufacturer's drivers; whether issues get fixed is not an effort from Microsoft. For Linux it's usually an effort of maintainers, and if anything, Linux is famous for supporting old hardware that Windows no longer works with.
But the point I was making is not that Linux or macOS is better than Windows or vice versa; it's that Windows holds by far the largest desktop market share and neither alternative is really a drop-in replacement. So in the end there's no pressure on them to improve the UX, since changing OS is infeasible for the majority of their users at the moment.
Arrogance. Their attitude is basically "we built it, so it's golden. If you can't understand why we did it this way, then put the device down and flip burgers".
I saw this starting around the year 2005. I spoke out about it and told people to stop buying/using products that aren't logical and easy to use. If it takes a Google search and a YouTube video to figure out how to use it, then it was built wrong. Return the product and get a better one. No one listened to me. We have what we have.
It sucks and it will only get worse. People will not change. People will keep buying shit products, then bitch that the products suck. Instead of returning the crap, they will keep it. Because they keep it the companies have zero reason to change.
There's the compounding issue that something that seems simple on the surface, say, pairing a set of Bluetooth headphones, is a convoluted mess of super-complicated shit on a technical level.
And even handling that is split up: the engineer writing the app that manages pairing doesn't know how to sync the L and R earpieces. And the person who knows that doesn't know how to establish contact via Bluetooth. Etc. It's layers upon layers upon layers of tricky technical stuff, each of which can propagate buggy behavior both up and down the stack. And each engineer usually can't fix the other layers (they're not theirs), so they work around the bugs. Over time this adds an insane amount of complexity to the code as hundreds of these tiny adjustments spread everywhere.
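A toy sketch of how those workarounds pile up (all names invented; no real Bluetooth stack looks like this):

```python
import random

def radio_connect(device: str) -> bool:
    """Stand-in for a lower layer owned by another team; flaky on purpose."""
    return random.random() > 0.3

def sync_left_right(device: str) -> bool:
    """Stand-in for the L/R sync layer; also unreliable."""
    return random.random() > 0.2

def pair(device: str) -> bool:
    # Workaround 1: the radio layer often fails on the first attempt,
    # and it's "not our code", so we retry instead of getting it fixed.
    if not any(radio_connect(device) for _ in range(3)):
        return False
    # Workaround 2: same story one layer up. Multiply this by hundreds
    # of tiny patches spread across every layer of the stack.
    return any(sync_left_right(device) for _ in range(3))

print(pair("headphones"))
```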
It's a young field and we're still entrenched in the consequences of the sort of mistakes that, in a few hundred years, will become "those silly things people used to do because they didn't know better".
Daily reminder that the web is a mess of corpo bullshit piled on top of 90s tech and most OSes currently in use are culturally from the early 80s.
Is that a thing that goes away, though? A lot of fields still do silly legacy things even a couple of centuries after the industrial revolution.
You still have tons of screw head sizes and types! Why such diversity!
The screw head variety is mainly to prevent people from tampering with stuff they aren't supposed to unscrew. Hard drives, for example, all use the same star-shaped (Torx) heads that most people don't have screwdrivers for.
I do think that people passionate about information technology – those who love it for the intrinsic awesomeness and not the money it brings – could break away with some of the legacy bullshit that holds back the quality of the software we use, if they were given the opportunity to defy software "tradition" and the profit motive. As of now, there is no systemic path forward, only occasional improvements incited by acute inadequacy of existing conventions for the growth of interested businesses.
Things like planned obsolescence and software blocks on things like farmers fixing tractors without John Deere’s software permission almost makes me think the bad guys won the Cold War.
Between me and a mechanic friend, we can fix my car but we can’t turn off the (wholly unnecessary) “inspection needed” noise without me spending $1000 on software. Apparently, the inspection needed warning isn’t even related to anything. It just comes on every x miles. The car doesn’t have a detected issue or anything. That beep is radicalizing me.
This is somewhat outside the box but as tech becomes easier, a lot of people tend to become weaker at certain tech skills. An example of this is directory management. A lot of folks don't organize their file structures nowadays, relying heavily on the search bar to find everything.
The management culture emphasizes a workflow that's heavy on low-skill junior devs and cheap foreign labor of highly variable quality. You caaaaan do that well with infinite planning, QA, project management, and test-driven development, but the reason you're trying to do it that way in the first place is that you're an under-qualified yes-man careerist dipshit trying to come in under budget and time, so you won't. And these are the wages of low wages.
Have you tried the Google keyboard (Gboard) lately? It made me want to break my phone and just not have one at all. It corrects proper words to other words so the sentences don't make sense. It corrects words that are already correct and ignores the misspelled ones. It wants to speak for me. They think they're making us type faster with their predictive text, but I ended up re-reading everything I put on the internet. I became slower. Thankfully I found a worse keyboard, but it doesn't autocorrect as much and I'm OK with that. Fuck Google.
I didn't know about this keyboard, actually. Just installed it and it's great. The only issue is that it only supports English. I guess I'll use it for English only. Thank you so much.
EDIT: never mind. It does support other languages. All set now.
Something I've noticed in places I've worked that aren't small: whoever has talent gets promoted into spending half their time in meetings at best, and at worst into managing teams and working by Outlook.
Agile has poisoned software development to the point where it's fine to ship shit products that can be fixed post-release, which of course gives stakeholders and execs the reasons to tie performance and bonuses to shipping, as opposed to routine stable operations.
I don't know if going back to Waterfall is the right fix, but something has to change. Shipping crap is the new normal. If programmers organize to fight for better wages and conditions, we absolutely must fight to hold management responsible for code quality. Get us additional hours for unit and behavioral testing, assessing and tackling technical debt, and so on.
I’ve submitted at least 8 bug reports to them since Oct 2023 (and also many suggestions) through their feedback app. No response to any of them so far. The only closed bugs are ones I closed myself because the problem went away in an update.
I’m pretty sure they don’t have any bug triager whatsoever.
I’ll keep doing it out of spite and because it’s what I do for open-source as well, but I’m really not sure if it has any effect at all.
Monopolization. If you have become the standard, there's no reason to improve.
Technological advancement. If the speed of new processors continues to double every year, why bother optimizing your program? This pisses me off so much; games don't look much better but are 4x harder to run compared to 8 years ago.
Cost. Having many programmers and bug testers on payroll to improve your product is expensive. Massive companies are penny-wise, pound-foolish, and will hack and slash at their staff lineup until catastrophe strikes (which usually only occurs long after the layoffs).
Everyone else has great points about complexity, but there is an additional issue which is the constant desire for change keeping products from being refined and perfected.
Any product will have small changes that improve it, like reinforcing points of failure specific to that design. Take a kitchen knife, the kind chefs use. Some manufacturers have produced the exact same model for decades, with ever so slight variations on angles, handles, and so on as they refined the design. Now those are high quality if they keep the production going, and that's something with no moving parts! These knives continue to sell because they are used constantly, can break or be damaged, and new restaurants open all the time, requiring a constant supply of knives.
The home knife market doesn't have the same pressure for reliability because people don't use their knives all day, every day like a chef does. Instead, companies constantly change designs to sell new versions to the same oversaturated market, one that prizes form over function. They change the handles slightly, make a change to the blade, and sometimes these changes make the knife worse, but they can slap a 'new and improved' sticker on the label as long as something changed.
The same thing happens with technology, except complex systems need even more refinement while companies are also trying to change things just to change them in pursuit of the 'new and improved' market. Moving menus around, changing the order of things, and making things look flashy are all side effects of tech being afraid of selling the same thing for an extended period, because people want something new and shiny to replace what they had. Time and effort are spent on changing things, and it is hard to do bug fixes while also creating something new that might make a bunch of old bugs obsolete. Oh, and they will also be spending their time patching critical vulnerabilities, because leaving those open might keep someone from buying their next thing.
So all the effort goes into changing things, often making them worse if they had already stumbled into a useful design, and with all the focus on that churn and on vulnerabilities, there's no time to fix usability issues or do the things that would make the product better. Why bother, as long as people are buying? Anything a knowledgeable user wants fixed is unlikely to be a priority, because the regular user probably hasn't even noticed, and regular users are the ones who are going to buy the next version. That is why things like Bluetooth continue to suck: it works well enough to sell more things, and doing it right would take more effort. The handy feature that you used to like got removed? They felt it needed to change just to change, and whoever provided input or feedback came up with this instead.
Oh, and all of this is just the time spent doing things. On top of that, they want to spend as little as possible, so they buy cheap parts from companies that also build their product just good enough to win more customers, at the lowest possible production cost.
TLDR: market pressures favor constant change, which introduces more design flaws; capitalist pressures focus spending on design revisions that sell more units and on security flaws, so as long as it sells, it doesn't matter if it has shitty usability, and minor flaws are never fixed.
There was an article by Google about the security of their code base, and one of their core findings was that old code is good, as it gets refined and becomes more free of bugs over time. And conversely, of course, new code is worse.
Sometimes it's a solution in search of a problem. Usually that'll be some startup that really wants Google (or somebody) to either buy them out or shovel millions of venture capital money at them. VC money that would be better used for anything: housing homeless people, feeding the hungry, or hell, just burning to stay warm.
Most tech sucks because it's closed source. Closed source products are typically made with "the least amount of work done to sell for the most amount of buck". So standards are only sloppily and partially implemented (or sometimes purposefully badly or differently to ensure incompatibility), and bugs after sale won't be fixed because why would they? They already have your money. Middle managers will work hard to ensure more money goes to advertising and marketing than to actual development.
Then there is the embrace, extend, extinguish mentality (hello Microsoft!) to force customers to stay around their shitty products. Microsoft 365 and the Teams shit are perfect examples. The company I work at currently uses it and it's beyond garbage that is expensive as hell. Not an hour goes by without me being confronted by bad design, bugs, bugs, bugs, so many bugs... And it's all designed to ensure you stay in their little walled garden. I can't change this today, but I'm planning to be rid of it in about a year from now, fingers crossed.
In my experience, open source software is fucking awesome because people built it to actually build something awesome. Standards are implemented to the letter, bugs are fixed, and it all works and looks awesome.
The problem is money. The incentive is to make as much money as they can for themselves, not for the company. Company loyalty has been completely blown up by companies, so now not even the CEO gives a fuck; he'll be running another company with a 10% raise this time next year.
This is a topic that could be a novel for how much there is to consider, but in the end it comes down to resources and companies trying to choose what is best for the company overall. For a company to do anything, it is giving up many other things it could be doing instead. Whether it is limited budgets, limited personnel, or company priorities, every decision made is a tradeoff that means you aren't doing something else.
Most companies prioritize releasing new products so they can start getting revenue as soon as possible. A new product has the largest potential market, and thus makes shareholders happy to see revenue coming in. The sales of a new product are the easiest ones in most products' lifecycles. Additionally, releasing new products helps keep you ahead of competitors. So ongoing maintenance work is de-prioritized in favor of working on new things.
The goal of testing is to simulate potential use cases of a product and ensure that it will work as expected when the customer has the product in their hands. It is impossible to fully test a product in a finite amount of time, so tests are created that expose flaws within a reasonable search space of the expected uses. If an issue is found then it needs to be evaluated about whether it is worth fixing and when. There are many factors that affect this, for example:
How much would it cost to fix?
How much time would it take to fix?
Does it need to be fixed for launch or can it be a running change?
How many customers are actually going to see the issue? Is it just a small annoyance for them or will it cause returns/RMAs?
Is it within the expected use case of the product?
Can we mitigate it in software/firmware instead of changing hardware?
Is it a compliance/regulatory issue?
Would this bring in new customers for the product?
Was this done a specific way for a reason?
Unfortunately, after considering all this, the result is often that it isn't worth the effort to fix something. But it is considered.
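As a toy illustration (my own made-up weighting, not any company's actual process), that triage often boils down to a cost/benefit check like this:

```python
def worth_fixing(fix_cost_days: float, users_hit_pct: float,
                 causes_returns: bool, is_compliance: bool) -> bool:
    if is_compliance:
        return True  # regulatory issues jump the queue
    benefit = users_hit_pct * (10.0 if causes_returns else 1.0)
    return benefit > fix_cost_days  # otherwise: "not worth the effort"

# A minor annoyance hitting 2% of users loses to 5 days of work:
print(worth_fixing(5, 2, causes_returns=False, is_compliance=False))  # False
```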
I think that manufacturers of tech products test their products with only a few standard configurations - but in reality there are far too many possible combinations of different configurations:
Take a Bluetooth mouse, for example. Generally, it connects to a computer and it works. Now imagine you have a different configuration - a logic board in your laptop that has not been tested by the manufacturer of the mouse, or an obscure model of Bluetooth receiver that also hasn't been tested to work with that mouse. Your mouse works well in the beginning, but disconnects at random times. You can't pinpoint the issue, and when you look for help online, nobody seems to have the same problems with that mouse.
In this case, said mouse sucks, because it doesn't function reliably. A different person with a different configuration (different logic board, different model of Bluetooth unit) might have no problems at all with the same mouse.
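A rough sense of the numbers (counts invented for illustration):

```python
# Why "test every configuration" is impossible: combinations multiply.
chipsets, receivers, os_builds, driver_versions = 40, 25, 12, 30
print(chipsets * receivers * os_builds * driver_versions)  # 360,000 combos
```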
enshittification is based on the ease of moving profits from users to creators, then from creators to shareholders, in a digital service economy, all the while degrading the service for the users and then the creators as the profit fulcrum shifts.
so enshittification might be a different thing from the reality of manufacturing items in an international environment, which requires design decisions that later need revising because not all materials are available from everyone in the way a design calls for. and finding people who can assemble things while receiving a wage they can live on, so that a company can make a profit, requires compromises. and those are just two tiny points, not even including shipping and workspaces and insurance et cetera.
it is hard, yo. not in a "one part is inconceivable" hard, but in an "it gets complicated pretty quickly" type of hard.
We tend to forget that all of that is to support people. Tech shouldn’t be an end goal, merely one of the ways to achieve it. And not always the best one at that.
Once they go massive and have shareholders, the CEO is beholden to them exclusively. They do what the shareholders want and say what they want to hear. The minute they don't, the board can vote to remove them and replace them with someone who will.