Tags: foss + github + llama.cpp

3 bookmark(s)

  1. llm-tool provides a command-line utility for running large language models locally. It includes scripts for pulling models from the internet, starting them, and managing them with commands such as 'run', 'ps', 'kill', 'rm', and 'pull'. It also offers a Python script named 'querylocal.py' for querying these models (a rough query sketch follows this list). The repository also comes with …
  2. Create a custom base image for a Cloud Workstation environment using a Dockerfile. Uses quantized models from …
  3. llama-cpp-python offers a web server that aims to act as a drop-in replacement for the OpenAI API. This allows you to use llama.cpp-compatible models with any OpenAI-compatible client (language libraries, services, etc.); see the client sketch after this list.
    2023-06-09, by klotz
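
The exact interface of 'querylocal.py' isn't documented in this bookmark. As a minimal sketch of the idea, assuming the model is already being served by llama.cpp's built-in HTTP server (llama-server) listening on localhost:8080, a local query could look like:

```python
# Hypothetical sketch of a querylocal.py-style helper (not the repo's actual script).
# Assumes a llama.cpp server is already running, e.g.:
#   llama-server -m ./models/model.gguf --port 8080
import requests

def query_local(prompt: str, n_predict: int = 128) -> str:
    # llama.cpp's server exposes a /completion endpoint that takes a JSON
    # body with the prompt and the number of tokens to generate.
    resp = requests.post(
        "http://localhost:8080/completion",
        json={"prompt": prompt, "n_predict": n_predict},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["content"]

if __name__ == "__main__":
    print(query_local("Explain what a quantized model is in one sentence."))
```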
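
Because the llama-cpp-python server speaks the OpenAI API, the stock openai Python client can be pointed at it directly. A minimal sketch, assuming the server is running locally on its default port 8000 (the model path and model name below are placeholders):

```python
# Hedged sketch: using the standard `openai` client against a local
# llama-cpp-python server started with, for example:
#   python -m llama_cpp.server --model ./models/model.gguf
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # llama-cpp-python's default server address
    api_key="not-needed-locally",         # any string; the local server does not verify it
)

response = client.chat.completions.create(
    model="local-model",  # placeholder; the server answers with whatever model it loaded
    messages=[{"role": "user", "content": "Summarize what llama.cpp does."}],
)
print(response.choices[0].message.content)
```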
