LM Studio has released lms, a command-line interface (CLI) for loading and unloading models, starting and stopping the API server, and inspecting raw LLM input. It is developed on GitHub and MIT licensed.
A Ruby script calculates VRAM requirements for large language models (LLMs) based on the model, bits per weight (bpw), and context length. It can determine the required VRAM, the maximum context length, or the best bpw for the available VRAM.
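The underlying arithmetic is roughly weight memory plus KV cache plus overhead. Here is a minimal Python sketch of that kind of estimate; the KV-cache formula, the overhead factor, and the example model shape are assumptions for illustration, not necessarily the script's exact method:

```python
def estimate_vram_gb(
    n_params_b: float,      # model size in billions of parameters
    bpw: float,             # bits per weight after quantization
    context: int,           # context length in tokens
    n_layers: int,          # number of transformer layers
    n_kv_heads: int,        # KV heads (grouped-query attention)
    head_dim: int = 128,    # dimension per attention head
    kv_bytes: int = 2,      # bytes per KV-cache element (fp16)
    overhead: float = 1.1,  # rough allowance for activations and buffers
) -> float:
    weights = n_params_b * 1e9 * bpw / 8                                  # weight bytes
    kv_cache = 2 * n_layers * context * n_kv_heads * head_dim * kv_bytes  # K and V bytes
    return (weights + kv_cache) * overhead / 1024**3

# Example: a 70B model at 4.5 bpw with an 8k context (Llama-2-70B-like shape)
print(f"{estimate_vram_gb(70, 4.5, 8192, n_layers=80, n_kv_heads=8):.1f} GB")
```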
Pinboard is a command-line utility for managing file references during raw language model development, streamlining codebase workflows with efficient, context-aware file updates.
Noema Research introduces Pinboard as a productivity tool for developers: a command-line tool that manages file and terminal references to smooth development workflows. Key features include flexible pinning, contextual updates, clipboard integration, an interactive shell, and undo functionality.
The article argues that instead of building numerous specialized tools for an LLM, giving it direct access to a terminal is more efficient and future-proof. Citing Rich Sutton's "The Bitter Lesson," it points out that the terminal's existing command-line tools already cover a wide range of tasks, underscoring the value of general methods over specialized ones.
Mixtral 8x7B:
Install the llm-llama-cpp plugin.
Download a GGUF file for Mixtral 8x7B Instruct v0.1.
Run the model with llm -m gguf, pointing it at the downloaded file (see the Python sketch below).
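For scripting the same steps, here is a minimal sketch using llm's Python API; it assumes the llm-llama-cpp plugin exposes the model under the id gguf and accepts the file location as a path option, and the filename below is illustrative:

```python
import llm  # pip install llm; then: llm install llm-llama-cpp

# Assumes a Mixtral 8x7B Instruct v0.1 GGUF file has already been downloaded.
model = llm.get_model("gguf")
response = model.prompt(
    "Explain mixture-of-experts in two sentences.",
    path="mixtral-8x7b-instruct-v0.1.Q6_K.gguf",  # illustrative filename/path
)
print(response.text())
```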
Large language models embedded in single-file executables can be used from the command line to perform tasks like renaming images based on their visual content.
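A minimal Python sketch of that workflow, assuming a llamafile-style vision-model executable on disk; the binary name, the --image/-p flags, and the prompt are illustrative assumptions, so adjust them to the actual build:

```python
import re
import subprocess
import sys
from pathlib import Path

def rename_from_content(image_path: str, model_bin: str = "./llava.llamafile") -> Path:
    """Ask a local vision model to describe an image, then rename the file to match."""
    # Flag names follow llama.cpp-style conventions; they may differ per build.
    result = subprocess.run(
        [model_bin, "--image", image_path, "-p",
         "Describe this image in five words or fewer."],
        capture_output=True, text=True, check=True,
    )
    # Turn the model's description into a filesystem-friendly slug.
    slug = re.sub(r"[^a-z0-9]+", "-", result.stdout.strip().lower()).strip("-") or "image"
    src = Path(image_path)
    dst = src.with_name(f"{slug}{src.suffix}")
    src.rename(dst)
    return dst

if __name__ == "__main__":
    print(rename_from_content(sys.argv[1]))
```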