70 comments
  • At least anecdotally, Andreas over at 82MHz.net tried running an AI model locally on his laptop and it took over 10 minutes for just one prompt.

    OK, just the 4th sentence clearly shows this person has no clue what they're talking about.

    • Yep, clueless. I stopped reading at that point. For the audience: large language models come in all sizes, and you can run some small but useful ones fairly quickly even without a GPU. They keep getting more capable for their size as well. Remember the uproar about Deepseek R1? Well, progress hasn’t stopped.

      • It's not even that. It's like trying to run an AAA game on a 10-year-old laptop and complaining the game is garbage because your frame rates are too low.