The "LLM" toolkit offers a versatile command-line utility and Python library that allows users to work efficiently with large language models. Users can execute prompts directly from their terminals, store the outcomes in SQLite databases, generate embeddings, and perform various other tasks. In this extensive tutorial, topics covered include setup, usage, OpenAI models, alternative models, embeddings, plugins, model aliases, Python APIs, prompt templates, logging, related tools, CLI references, contributing, and change logs.
Mixtral 8x7B: Use the llm-llama-cpp plugin. Download a GGUF file for Mixtral 8x7B Instruct v0.1, then run the model with llm -m gguf, passing the path to the downloaded file as an option.
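A rough sketch of what that looks like, assuming the llm-llama-cpp plugin's gguf model and its path option work as described in the plugin's documentation (the GGUF filename and prompt are illustrative):

    # Install the plugin that adds GGUF / llama.cpp support to llm
    llm install llm-llama-cpp llama-cpp-python

    # Run the downloaded Mixtral Instruct GGUF file
    llm -m gguf \
      -o path ./mixtral-8x7b-instruct-v0.1.Q4_K_M.gguf \
      '[INST] Write a short note explaining what a GGUF file is [/INST]'

The [INST] ... [/INST] wrapper follows the Mixtral Instruct prompt format.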
Use large language models embedded in single-file executables from the command line to perform tasks like renaming images based on their visual content
Llamafile lets you distribute and run LLMs with a single file
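Distribution really is a single file: you download it, mark it executable, and run it. A minimal sketch, assuming a LLaVA llamafile has already been downloaded (the filename is illustrative):

    # A llamafile bundles the model weights and the runtime into one executable
    chmod +x llava-v1.5-7b-q4.llamafile

    # Launching it starts a local server with a chat web UI and an HTTP API
    ./llava-v1.5-7b-q4.llamafile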
The api_base key can be used to point the OpenAI client library at a different API endpoint.
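In llm this is done by listing extra OpenAI-compatible models in an extra-openai-models.yaml file in the tool's configuration directory. A minimal sketch, with an illustrative model name and endpoint URL (file location and key names as described in the llm documentation):

    # Locate llm's configuration directory (extra-openai-models.yaml lives there)
    cd "$(dirname "$(llm logs path)")"

    # Register a hypothetical local OpenAI-compatible endpoint
    printf '%s\n' \
      '- model_id: my-local-model' \
      '  model_name: my-local-model' \
      '  api_base: "http://localhost:8000/v1"' \
      >> extra-openai-models.yaml

    # The new model can then be used like any other
    llm -m my-local-model "Hello from a local endpoint"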
The "LLM" toolkit provides a command-line utility and Python library for interacting with large language models. It enables users to run prompts from the terminal, store responses in SQLite databases, generate embeddings, and more. This comprehensive guide includes topics such as setup, usage, OpenAI models, other models, embeddings, plugins, model aliases, Python API, prompt templates, logging, related tools, CLI reference, contributing, and changelog.
openai-to-sqlite: the tool now allows enriching data in a SQLite database using OpenAI's GPT-3.5 model, for example populating a sentiment column in a messages table using the chatgpt() SQL function.
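Roughly, that enrichment looks like the following, assuming the query command and the chatgpt() SQL function behave as described in the tool's documentation (the database name, table, column, and prompt text are illustrative):

    # Populate a sentiment column by calling GPT-3.5 once per row via chatgpt()
    openai-to-sqlite query database.db "
      update messages set sentiment = chatgpt(
        'Sentiment of this message, one word (positive, neutral or negative): ' || message
      ) where sentiment is null
    "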