Posts 38 · Comments 884 · Joined 2 yr. ago

  • please don’t anthropomorphize venture capitalists like this

    1. running pollution -> profit machine straight from captain planet episode

    that is part of their mission statement, yes

  • hang around more LLM fans, you’ll see bigger assholes

  • holy fuck. since I guess it’s the week when people wander in here and don’t understand fuck about shit:

    we don’t want your stupid shit on our lemmy instance. none of us like the tech you’re in here pushing, even the openwashed version. now fuck off.

  • What is this “no you” that you’re trying to do?

    the penny’s never gonna drop, is it? let me save you the embarrassment

  • time for you to fuck off of “this site”

  • do you?

  • i love spock hes my favorite part of the vulcan api

  • The man probably went insane after psychedelic use, and I have never noticed @BasedBeffJezos to advocate for fixing the system by shooting individual executives. It's a great shot at drawing a plausible-sounding connection; but I think it's not valid criticism.

    wait I’m confused, to be a more effective TESCREAL am I not supposed to be microdosing psychedelics every day? you’re sending mixed signals here, yud (also lol @ the pure Ronald Reagan energy of going “yep obviously drugs just make you murderously insane” based on nothing but vibes and the need to find a scapegoat that isn’t the consequences of your own ideology)

  • your regular reminder that the guy with de facto ownership over the entire Rust ecosystem outside of the standard library and core is very proud of being in Peter Thiel’s pocket (and that post is in reference to this article)

    e: on second thought I’m being unfair — he owns the conferences and the compiler spec process too

  • oh I’m absolutely keeping this article around for the next time some fuckhead tries to call me paranoid for correctly calling out some utterly obvious shit as 1024 bots coordinated by 3 guys on discord with shit to stir

  • Since 2010 we have been working with what I call augmented intelligence. So I am now working with the team behind the scenes to create truly the AI version. Not AI in the sense of making up stories — but imagine if you now take, whether it be news or opinion, and you have a bias meter so that whether the news or the opinion, more likely the opinion or the voices, you have a bias meter, so somebody could understand as a reader that the source of the article has some level of bias.

    they let this fucking guy near transplant patients?

  • also, tech billionaires terrified of a peasant uprising can turn their homes into fucking military bases with a home security system named (of course) Sauron

  • 404media:

    In the aftermath of an LGBT hate incident, the then-CEO of cloud computing giant Digital Ocean told upset staff his mentor was a member of the KKK as an attempt to explain why they must bend their values because "we love the company"

    presented without comment

  • (Obligatory, “oh thank God it’s not the game engine”)

    that was my exact reaction when the thread popped up — it took me a couple seconds to realize the article was authored by Ed and not some asshole in the gaming-to-fascism pipeline still upset because the Godot engine rightfully bans assholes from their collaborative spaces

  • The government worries Ireland may lose its position as Europe’s top data center location. Frankfurt in Germany is currently second but has similar power problems, exacerbated by recent AI demand.

    it’s so weird how this keeps happening even though the corporations behind it claim to be sustainable, and people keep believing them

  • This is obviously insane, the correct conclusion is that learning models cannot in fact be trained so hard that they will always get the next token correct. This is provable, and it’s not even hard to prove. It’s intuitively obvious, and a burly argument that backs the intuition is easy to build.

    You do, however, have to approach it through analogies, through toy models. When you insist on thinking about the whole thing at once, you wind up essentially just saying things that feel right, things that are appealing. You can’t actually reason about the damned thing at all.

    this goes a long way towards explaining why computer pseudoscience — like a fundamental ignorance of algorithmic efficiency and the implications of the halting problem — is so common and even celebrated among lesswrongers and other TESCREALs who should theoretically know better
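
    a toy sketch of the pigeonhole argument behind that claim (my illustration, not the quoted author’s): if a training corpus ever pairs the same context with two different next tokens, then no deterministic predictor, however hard it’s trained, can get every example right. the corpus below is invented for the example.

    ```python
    # invented toy corpus: one context appears with two different continuations
    from collections import Counter, defaultdict

    corpus = [
        ("the cat sat on the", "mat"),
        ("the cat sat on the", "rug"),    # conflicts with the line above
        ("the dog slept on the", "floor"),
    ]

    # the best any deterministic predictor can do is guess the most
    # frequent continuation seen for each context
    by_context = defaultdict(Counter)
    for context, nxt in corpus:
        by_context[context][nxt] += 1

    best = sum(c.most_common(1)[0][1] for c in by_context.values())
    print(f"best possible accuracy: {best}/{len(corpus)}")  # 2/3, never 3/3
    ```

    no training budget changes that 2/3 ceiling, which is exactly the point of reasoning through toy models before making claims about the whole system.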

  • my colleagues are kind, caring people & they were attacked (idc if I get attacked so long as it doesn't touch my company/colleagues) we've always seen love for our work, this incident shocked me

    we'll keep shipping 📦💗 can't satisfy all

    Don't take out your frustration from election results on them, LOSERS

    it’s really jarring seeing one of the biggest hosts for generative AI projects simultaneously do “we’re just an uwu smol bean open source passion project why are you attacking us” while boosting and officially supporting chan-coded fash shit from an e/acc account