llm-tool provides a command-line utility for running large language models locally. It includes scripts for pulling models from the internet, starting them, and managing them with commands such as 'run', 'ps', 'kill', 'rm', and 'pull'. It also offers a Python script named 'querylocal.py' for querying these models. The repository also comes with a Dockerfile for creating a custom base image for a Cloud Workstation environment. Uses quantized models from …
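The description above mentions a 'querylocal.py' script for querying locally running models. As a rough sketch of what such a script might look like, here is a minimal Python client for a local HTTP inference server; the endpoint URL, payload fields, and default model name are assumptions for illustration, not llm-tool's actual API:

```python
import json
import urllib.request

def build_query(prompt, model="llama-2-7b-q4", max_tokens=128):
    # Build the JSON payload a local inference server might expect.
    # Field names here are hypothetical, not llm-tool's documented schema.
    return {"model": model, "prompt": prompt, "max_tokens": max_tokens}

def query_local(prompt, url="http://localhost:8080/completion"):
    # POST the prompt to a locally running model server and return its reply.
    data = json.dumps(build_query(prompt)).encode("utf-8")
    req = urllib.request.Request(
        url, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["content"]
```

The payload-building step is separated from the network call so the request shape can be inspected or tested without a server running.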
The llama-cpp-agent framework is a tool designed for easy interaction with Large Language Models (LLMs). Built on llama-cpp-python, it provides a simple yet robust interface that lets users chat with LLM models, execute structured function calls, and get structured output.
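To illustrate the structured function-calling idea, here is a toy sketch: the model (stubbed out below, since running a real model needs weights) returns a JSON object naming a function and its arguments, and a dispatcher validates and executes it. This is the general pattern, not llama-cpp-agent's actual API; the function names and JSON schema are invented for the example:

```python
import json

def fake_llm(prompt):
    # Stand-in for a real LLM constrained to emit structured JSON:
    # it always "decides" to call get_weather for Berlin.
    return json.dumps({"function": "get_weather", "arguments": {"city": "Berlin"}})

def get_weather(city):
    # Example tool the model is allowed to call.
    return f"Sunny in {city}"

# Registry of callable tools, keyed by the name the model may emit.
TOOLS = {"get_weather": get_weather}

def run_tool_call(prompt):
    # Parse the model's structured output and dispatch to the named tool.
    call = json.loads(fake_llm(prompt))
    fn = TOOLS[call["function"]]
    return fn(**call["arguments"])

print(run_tool_call("What's the weather in Berlin?"))  # Sunny in Berlin
```

Constraining the model to emit JSON that maps onto a known tool registry is what makes the calls "structured": the host program, not the model, performs the actual execution.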
A chatbot that uses Wikipedia data to improve its factual accuracy. Key aspects include the use of large language models, retrieval of accurate information from a reliable source, and addressing the challenge of maintaining factual consistency in conversational AI systems.
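The retrieval step behind such a chatbot can be sketched very simply: score candidate passages by overlap with the question and ground the answer in the best match. Here is a minimal, naive word-overlap version with a two-entry toy corpus standing in for Wikipedia (real systems use embeddings or a search index, not this):

```python
# Toy "Wikipedia" corpus; in practice this would be a search index
# or vector store over real article text.
CORPUS = {
    "Eiffel Tower": "The Eiffel Tower is a wrought-iron tower in Paris, completed in 1889.",
    "Great Wall": "The Great Wall of China is a series of fortifications built across northern China.",
}

def retrieve(question):
    # Naive retrieval: pick the passage sharing the most words with the question.
    q = set(question.lower().split())
    def score(text):
        return len(q & set(text.lower().split()))
    return max(CORPUS.values(), key=score)

print(retrieve("When was the Eiffel Tower completed?"))
```

Whatever the scoring method, the retrieved passage is then injected into the LLM prompt so the answer is anchored to source text rather than to the model's parametric memory, which is how factual consistency is addressed.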
Ask questions in natural language and get answers backed by private sources. Connects to tools such as Slack, GitHub, Confluence, etc.