
  • DDR5-6000/CL28 should be fine. Make sure to enable the XMP/EXPO profile in your BIOS after installing it.

    You can follow hardware reviewers like GamersNexus, LTT, and Hardware Unboxed if you want to stay up to date (which is what I do), or just look up their review of a specific product you're considering.

  • It's less of an issue now, but there were stability issues in the early days of DDR5. Memory instability can cause a number of problems, including the PC being unable to boot (failing to POST), sudden crashes during use, applications crashing or behaving strangely, and so on. Usually it's a sign of memory going bad, but since DDR5 is still relatively young, it can also be a sign that the memory is simply too fast.

    Always verify that the RAM manufacturer has validated the kit against your CPU.

  • Air cooling is sufficient to cool most consumer processors these days. Make sure to get a good cooler though. I remember Thermalright's Peerless Assassin being well reviewed, but there may be even better (reasonably priced) options these days.

    If you don't care about price, Noctua's air coolers are overkill (and expensive), or an AIO could be an option too.

    AIOs have the benefit of moving heat directly to your fans via fluid instead of heating up the case interior, but that usually doesn't matter that much, especially outside of intense gaming.

  • Very few things need 64GB memory to compile, but some do. If you think you'll be compiling web browsers or clang or something, then 64GB would be the right call.

    Also, higher DDR5 speeds can be unstable at higher capacities. If you're going with 64GB or more of DDR5, I'd stick to speeds around 6000 (or less) and not focus too much on overclocking it. If you get a 2x32GB kit (which you should, rather than buying the sticks separately), you'll be fine. You'll benefit more from capacity than from RAM speed anyway.

  • The latest round of "stuff I wasn't informed would be installed for me" included enough software to switch me to Linux. I'm still dual booting during the transition, but moving fully over when I can.

    I honestly used to love Windows too. Windows 10 was great, and 11 had problems but was still very usable on the happy path and came with some great improvements over time. These days, it's just so full of bloatware. I just want my damn computer to be mine, and I'd hope an OS license that retails for $200 would be enough to get them to stop advertising to me and shoving shit down my throat, but I guess not.

    Word and PowerPoint are good too, but there's some real competition there these days. I haven't needed them on my personal PC in years though, so that's never been a problem for me, and it'll continue to not be a problem as long as that software requires a subscription.

  • We have wild ducks around us and it's always fun to see them waddling with the ducklings across a walkway or through our neighborhood. They're super friendly too and will walk right by you, though being wild I haven't tried to approach any of them.

  • ...dangerous online material is perpetuating the growing epidemic of violence against women and girls...

    Serious question, because I actually don't know the answer: is there a proven link between this kind of content and real violence? I would hope a person would know that strangling people is not okay and that there are such things as safe words/actions, but I've expected too much of society in the past.

  • AI at Microsoft has not been optional for months.

    This is org-specific and role-specific, but it's being pushed onto people more and more (as this article makes evident).

    when they need to find someone's department

    This information is present both in most internal communication tools (the org chart) and in the internal directory. Hopefully your friend found it.

    Everything else sounds horrible, and I hope your friend is doing better now.

    (Sauce: "I know a guy")

  • Quoting the analysis in the ruling:

    Authors also complain that the print-to-digital format change was itself an infringement not abridged as a fair use (Opp. 15, 25).

    In other words, part of what is being ruled is whether digitizing the books was fair use. Reinforcing that:

    Recall that Anthropic purchased millions of print books for its central library... [further down past stuff about pirated copies] Anthropic purchased millions of print copies to "build a research library" (Opp. Exh. 22 at 145, 148). It destroyed each print copy while replacing it with a digital copy for use in its library (not for sharing nor sale outside the company). As to these copies, Authors do not complain that Anthropic failed to pay to acquire a library copy. Authors only complain that Anthropic changed each copy's format from print to digital (see Opp. 15, 25 & n.15).

    Bold text is my emphasis; italics are the ruling's text.

    Further down:

    Was scanning the print copies to create digital replacements transformative? [skipping each party's arguments]

    Here, for reasons narrower than Anthropic offers, the mere format change was fair use.

    The judge ruled that the digitization is fair use.

    Notably, the question about fair use is important because of what the work is being used for. These are being used in a commercial setting to make money, not in a private setting. Additionally, as the works were inputs into the LLM, it is related to the judge's decision on whether using them to train the LLM is fair use.

    Naturally the pirated works are another story, but this article is about the destruction of the physical copies, which only happened for works they purchased. Pirating for LLMs is unacceptable, but that isn't the question here.

    The ruling does go on to indicate that Anthropic might have been able to get away with not destroying the originals, but destroying them meant that the format change was "more clearly transformative" as a result, and questions around fair use are largely up to the judge's opinion on four factors (purpose of use, nature of the work, amount of work used, and effect of use on the market).

    The print original was destroyed. One replaced the other. And, there is no evidence that the new, digital copy was shown, shared, or sold outside the company. [The question about LLM use is earlier in the ruling] This use was even more clearly transformative than those in Texaco, Google, and Sony Betamax (where the number of copies went up by at least one), and, of course, more transformative than those uses rejected in Napster (where the number went up by "millions" of copies shared for free with others).

    ... Anthropic already had purchased permanent library copies (print ones). It did not create new copies to share or sell outside.

    TL;DR: Destroying the original had an effect on the judge's decision and increased the transformativeness of digitizing the books. They might have been fine without doing it, but the judge admitted that it was relevant to the question of fair use.

  • The books were purchased and destroyed in order to digitize them. There is nothing wrong with digitizing a work. The books were destroyed because duplicating a work without permission is illegal, and destroying the original means there is still only one copy in the end.

    The LLM training is the problem. This is not.

  • I never really got into Commander either, but the bracket system makes it more appealing to me at least. No need to worry that someone's just going to go off on turn 2 against a sand tribal deck, a madness deck, and a mono-green control deck.

    Either way, Standard always interested me the most, but they absolutely shafted Standard, so these days if I play at all it's basically just Brawl, because of the matchmaker.

    Draft is fun as well, just expensive to get into :(

  • Quoting OpenAI:

    Our goal is to make the software pieces as efficient as possible and there were a few areas we wanted to improve:

    • Zero-dependency Install — currently Node v22+ is required, which is frustrating or a blocker for some users
    • Native Security Bindings — surprise! we already ship Rust for Linux sandboxing since the bindings were available
    • Optimized Performance — no runtime garbage collection, resulting in lower memory consumption
    • Extensible Protocol — we've been working on a "wire protocol" for Codex CLI to allow developers to extend the agent in different languages (including TypeScript/JavaScript, Python, etc) and MCPs (already supported in Rust)

    Now to be fair, these dashes scream "LLM generated" for their entire post. Regardless, if these really are their reasons:

    • Zero-dependency Install - there are a ton of languages where this is true. Self-contained installs are possible in Python, C#, Rust, C++, etc. Go is one option here too, but doesn't really provide anything more than the rest of these languages.
    • Native Security Bindings - they supposedly already do this in Rust
    • Optimized Performance - this seems overblown on their part in my opinion, but "no runtime GC" narrows the choice to C, C++, Rust, and a few others like Zig; Go, C#, Python, etc. all use runtime garbage collection. Regardless, in my opinion runtime GC doesn't actually matter here, and all of these options (yes, even Python) are fast enough for what they need.
    • Extensible Protocol - this is doable in many languages. It seems to me here that they already have some work in Rust that they want to use.

    As for the difficulty of making a CLI, clap makes CLIs dead simple to build with its derive macro. To be clear, other languages can be just as easy (Python, for example, has a ton of libraries for this, including argparse and Typer).
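
    To illustrate how little a stdlib-only CLI takes, here's a minimal sketch using Python's argparse (the program name, flags, and defaults are made up for illustration, not anything from Codex CLI):

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    # Hypothetical "agent-style" CLI: one positional argument plus two options.
    parser = argparse.ArgumentParser(prog="mycli", description="toy CLI sketch")
    parser.add_argument("prompt", help="prompt text to send")
    parser.add_argument("--model", default="default-model", help="model name to use")
    parser.add_argument("-v", "--verbose", action="store_true", help="chatty output")
    return parser

# Parse a sample argv instead of sys.argv so the sketch is self-contained.
args = build_parser().parse_args(["hello", "--model", "gpt-x", "-v"])
print(args.prompt, args.model, args.verbose)  # prints: hello gpt-x True
```

    Subcommands, help text, and type validation are similarly a few lines each, which is the same niche clap's derive macro fills in Rust.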

  • Personally, if I were to choose a language for them, it'd be Python, not Go. It would have the most overlap with their users and, in my opinion, could attract a lot more contributors as a result. Go, on the other hand, may be a language their devs are less familiar with or use less than Rust and other languages.