klotz: functions*


  1. Extract structured data from remote or local LLM models. Predictable output is essential for any serious use of LLMs.

    Extract data into Pydantic objects, dataclasses, or simple types.
    Same API for local file models and remote OpenAI, Mistral AI, and other models.
    Model management: download models, manage configuration, quickly switch between models.
    Tools for evaluating output across local/remote models, for chat-like interaction, and more.

    No matter how well you craft a prompt begging a model for the output you need, it can always respond with something else. Extracting structured data is a big step toward getting predictable behavior from your models. (See the first sketch after this list.)

    2024-02-23, by klotz
  2. Getting LLMs to analyze and plot data for you, right in your web browser.
    2024-01-25, by klotz
  3. The llama-cpp-agent framework is a tool designed for easy interaction with Large Language Models (LLMs). It provides a simple yet robust interface using llama-cpp-python, allowing users to chat with LLM models, execute structured function calls, and get structured output. (See the second sketch after this list.)
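First sketch: the structured-extraction bookmark (Pydantic targets, one API for local and remote models) boils down to a common pattern: ask the model for JSON and validate the reply against a typed schema. The code below is a minimal illustration of that pattern using the OpenAI Python SDK and Pydantic, not the bookmarked library's own API; the model name, schema, and prompt are placeholder assumptions.

```python
# Sketch only: extract structured data by requesting JSON and validating with Pydantic.
# The bookmarked library's actual API may differ; model name and schema are placeholders.
from openai import OpenAI
from pydantic import BaseModel


class Person(BaseModel):  # target schema for the extraction
    name: str
    age: int
    occupation: str


client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

resp = client.chat.completions.create(
    model="gpt-4o-mini",
    response_format={"type": "json_object"},  # constrain the reply to JSON
    messages=[
        {
            "role": "system",
            "content": "Extract the person described by the user as JSON with "
                       "keys name, age, occupation.",
        },
        {
            "role": "user",
            "content": "Maria is a 34-year-old marine biologist from Lisbon.",
        },
    ],
)

# Validation turns free-form model output into a typed object (or raises on mismatch).
person = Person.model_validate_json(resp.choices[0].message.content)
print(person)
```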

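Second sketch: the llama-cpp-agent entry mentions structured output on top of llama-cpp-python. The code below shows the underlying idea with llama-cpp-python directly, not the llama-cpp-agent API itself: constrain a local model's reply to JSON matching a schema. The model path, context size, and schema are placeholder assumptions.

```python
# Sketch only: structured JSON output from a local GGUF model via llama-cpp-python
# (the library llama-cpp-agent builds on). Model path and schema are placeholders.
from llama_cpp import Llama

llm = Llama(model_path="models/mistral-7b-instruct.Q4_K_M.gguf", n_ctx=4096)

result = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "Answer only with JSON."},
        {"role": "user", "content": "List three Unix tools with a one-line description each."},
    ],
    # Constrain decoding to JSON that matches this schema.
    response_format={
        "type": "json_object",
        "schema": {
            "type": "object",
            "properties": {
                "tools": {
                    "type": "array",
                    "items": {
                        "type": "object",
                        "properties": {
                            "name": {"type": "string"},
                            "description": {"type": "string"},
                        },
                        "required": ["name", "description"],
                    },
                }
            },
            "required": ["tools"],
        },
    },
)

print(result["choices"][0]["message"]["content"])
```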