Use callbacks to send output data to Posthog, Sentry, etc. LiteLLM exposes input, success, and failure callback hooks (`litellm.input_callback`, `litellm.success_callback`, `litellm.failure_callback`), making it easy to send data to a particular provider depending on the status of your responses.
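As a rough illustration, here is a minimal sketch of wiring up a success and a failure callback. The function name `log_success` is hypothetical, and the callback signature (kwargs, completion_response, start_time, end_time) follows LiteLLM's documented custom-callback pattern; treat the details as assumptions to verify against the current docs.

```python
import litellm
from litellm import completion

# Hypothetical custom callback: LiteLLM passes the request kwargs, the
# completion response, and start/end timestamps to each success callback.
def log_success(kwargs, completion_response, start_time, end_time):
    print("model:", kwargs.get("model"))
    print("latency (s):", (end_time - start_time).total_seconds())

# Route successful responses to the custom function; built-in integration
# names such as "posthog" or "sentry" can also be listed as strings.
litellm.success_callback = [log_success]
litellm.failure_callback = ["sentry"]  # assumes SENTRY_DSN is set in the environment

response = completion(
    model="gpt-4o-mini",  # illustrative model name
    messages=[{"role": "user", "content": "Hello!"}],
)
```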
LiteLLM is a library for calling LLM (Large Language Model) APIs from many providers through a single, standardized (OpenAI-style) interface. It also includes a proxy server with load balancing and cost tracking, and offers various integrations for logging and observability.
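To make the standardized interface concrete, here is a brief sketch of calling two different providers through the same `completion` function. The model names are illustrative and the example assumes the corresponding provider API keys are set in the environment.

```python
from litellm import completion

# Assumes OPENAI_API_KEY and ANTHROPIC_API_KEY are set in the environment.
messages = [{"role": "user", "content": "Summarize LiteLLM in one sentence."}]

# Same call shape for different providers: only the model string changes.
openai_response = completion(model="gpt-4o-mini", messages=messages)
anthropic_response = completion(model="claude-3-haiku-20240307", messages=messages)

# Responses follow the OpenAI-style shape.
print(openai_response.choices[0].message.content)
print(anthropic_response.choices[0].message.content)
```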
This is a GitHub repository for a Discord bot named discord-llm-chatbot. The bot lets you chat with Large Language Models (LLMs) directly in your Discord server. It supports various LLMs, including those behind the OpenAI, Mistral, and Anthropic APIs, as well as local backends such as ollama, oobabooga, Jan, and LM Studio. The bot offers a reply-based chat system, a customizable system prompt, and seamless threading of conversations. It also supports image and text-file attachments and streamed responses.