TIL about Computer Science
- Rust and Neovim - A Thorough Guide and Walkthrough (rsdlt.github.io)
Edit: Some readers mentioned an issue with the example Lua code used to configure the simrat39/rust-tools.nvim plugin; that configuration has been updated to match the example configuration recommended on the plugin page as of the date of this edit. Thanks to Nazar Toakarak for letting me know.
- Mozilla-Ocho/llamafile (github.com): Distribute and run LLMs with a single file. Contribute to Mozilla-Ocho/llamafile development by creating an account on GitHub.
- varunvasudeva1/ollama-server-docs (github.com): Documentation on setting up an LLM server on Debian from scratch, using Ollama, Open WebUI, and OpenedAI Speech.
- Running Local LLMs, CPU vs. GPU - a Quick Speed Test (dev.to)
May 12 update: a table collecting all the results from the comments has been added at the top...
(Note: behind a loginwall at Dev.to)