20 comments
  • 50 million users have an extra 3 seconds of unnecessary lag in a day because you wanted to hit tab rather than write code? That's nearly 5 years of cumulative wasted time.

    As if anyone cared if they had to wait a total of 3 seconds in a workday. (The arithmetic itself checks out, mind; see the quick sketch at the end of this comment.) If it’s a second per user action, we’re talking, but this is some bare-metal CPU wrangler’s take on how ‘efficient’ code should behave, completely disregarding that most users who touch a computer need 5 seconds to type ‘hi’ into MS Teams.

    Most engineers already write bloated, abstracted, glacial code that burns CPU cycles like a California wildfire. Clean code? Ha! You’re writing for other programmers’ academic circlejerk, not the hardware.

    It’s interesting that everybody else preaches ‘Write for the human first, for the machine second’.
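
    For what it's worth, the quoted figure is at least internally consistent. Here's a quick back-of-the-envelope check in Python, taking the article's own numbers (50 million users, 3 extra seconds per user per day) at face value:

    ```python
    # Back-of-the-envelope check of the quoted claim:
    # 50 million users, 3 extra seconds of lag per user per day.
    users = 50_000_000
    extra_seconds_per_user_per_day = 3

    total_seconds_per_day = users * extra_seconds_per_user_per_day  # 150,000,000 s
    total_days = total_seconds_per_day / 86_400                     # ~1,736 person-days
    total_years = total_days / 365.25                               # ~4.75 person-years

    print(f"~{total_years:.2f} cumulative person-years of waiting per calendar day")
    ```

    So "nearly 5 years" holds up as arithmetic; the real disagreement is whether 3 seconds spread across one user's day is perceptible at all.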

    • As if anyone cared if they had to wait a total of 3 seconds in a workday.

      That depends on when it appears. Some tasks kind of have to feel instantaneous, and there might be a pretty slim margin between okay and frustrating.

      But yeah, that's the kind of savings that mostly matter on the scale of regional or national grid planning.

      Most engineers already write bloated, abstracted, glacial code that burns CPU cycles like a California wildfire. Clean code? Ha! You’re writing for other programmers’ academic circlejerk, not the hardware.

      It’s interesting that everybody else preaches ‘Write for the human first, for the machine second’.

      Yeah, the author seems to lean too hard into the "programming is electronics" model, where the opposing end is "programming is math and formal logic"; most of us take some mixed view. And most of us have higher correctness requirements than what a reasonable effort in memory-unsafe languages like C and C++ gives us, so we trade away some machine efficiency. In the author's parlance, most of us aren't interested in the demoscene circlejerk; we need to make tradeoffs between maintainability and everything else. Write-once code isn't good enough.

      There have been attempts at establishing a third pole, "promptgramming is natural language" or whatever, ever since COBOL promised programming in plain English, but the ambiguity of natural language when used to encode a business-logic machine means that a "sufficiently advanced compiler" would have to be extremely advanced, on the order of including the manager and the entire engineering methodology.

  • these "ai bad" posts are getting so tiring. they just feel like upvote farms these days.

    • Only sunshine and roses allowed? For all the AI hype in the media and the lot of people blindly following it, it's good to see something that reminds us of the shortcomings. As long as it is done properly and honestly, I have nothing against a "Pro" and a "Contra" article.

      • As long as it is done properly and honestly, I have nothing against a “Pro” and a “Contra” article.

        Neither do I, personally. Though I am certainly less than inclined to enjoy an article where the author is oddly preachy/"holier-than-thou", saying things such as that you're not a "real" programmer unless you sacrifice your health debugging segfaults at 3 AM or have done the Handmade Hero challenge (certainly an interesting series to watch, but one that I have zero interest in replicating). Yet the author accuses Copilot of having a superiority complex. I cannot say for sure, but I would assume that if the article were in favor of AI rather than against it, there would definitely be comments about exactly this.

        The overarching tone of the article is such that, if it were written as a direct comment toward a user instead, it would run afoul of Beehaw's (and surely other instances') rules, or at the least come really close to skirting the line - and I don't mean the parts where the author is speaking of/to Copilot.

    • I agree. I think it's driven by fear. I get it. I'm slightly afraid I won't have a job in 10 years (or at least a much worse-paying one)...

      I'm still a much better programmer than AI today. But I don't cope with the fear by deluding myself into thinking that AI is useless and will stay useless.

      This feels a lot like portrait painters saying that photography will never amount to anything because it's blurry and black and white.
