Tags: localllama* + llm* + self-hosted* + foss*


  1. llm-tool provides a command-line utility for running large language models locally. It includes scripts for pulling models from the internet, starting them, and managing them with subcommands such as 'run', 'ps', 'kill', 'rm', and 'pull'. It also offers a Python script, 'querylocal.py', for querying the running models. The repository also comes…


SemanticScuttle - klotz.me: tagged with "localllama+llm+self-hosted+foss"

About - Propulsed by SemanticScuttle