Tags: llm* + localllama*

  1. This article walks through building a local RAG (Retrieval-Augmented Generation) system with Llama 3, using Ollama for model management and LlamaIndex as the RAG framework, and shows how to get a basic setup running with just a few lines of code (a hedged sketch follows this entry).
    2024-06-21 by klotz
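    The article's own code isn't reproduced in this bookmark, but a minimal local RAG setup along these lines is possible with the LlamaIndex and Ollama integration packages. The "data" directory, the "llama3" model tag, and the "nomic-embed-text" embedding model below are assumptions, not details taken from the article.

      # Minimal local RAG sketch: Ollama serves the LLM and embeddings, LlamaIndex handles retrieval.
      # Assumes `ollama pull llama3` and `ollama pull nomic-embed-text` have already been run,
      # and that ./data contains the documents to index.
      from llama_index.core import VectorStoreIndex, SimpleDirectoryReader, Settings
      from llama_index.llms.ollama import Ollama
      from llama_index.embeddings.ollama import OllamaEmbedding

      Settings.llm = Ollama(model="llama3", request_timeout=120.0)
      Settings.embed_model = OllamaEmbedding(model_name="nomic-embed-text")

      documents = SimpleDirectoryReader("data").load_data()   # load local files
      index = VectorStoreIndex.from_documents(documents)      # embed and index them
      query_engine = index.as_query_engine()                  # retrieval + generation pipeline
      print(query_engine.query("Summarize the key points of these documents."))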
  2. - The Open Interpreter repository provides a natural language interface for computers.
    - It enables users to interact with their computer systems through a chat-like interface in the terminal.
    - Open Interpreter supports various programming languages, including Python, JavaScript, Shell, and more.
    - The repository offers installation instructions, usage examples, and an interactive demo; a usage sketch follows this entry.
    2024-02-16 by klotz
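    A hedged sketch of Open Interpreter's Python entry point is shown below; the `offline` and `llm.model` settings are assumptions about how a locally served model would be configured and may differ between versions (the CLI equivalent is simply running `interpreter` in a terminal after `pip install open-interpreter`).

      # Illustrative Open Interpreter usage; attribute names are assumptions and may
      # vary by version -- consult the repository's README for the exact interface.
      from interpreter import interpreter

      # Assumption: route requests to a locally served model (e.g. via Ollama) rather
      # than a hosted API.
      interpreter.offline = True
      interpreter.llm.model = "ollama/llama3"

      # Chat-style request; Open Interpreter proposes code and asks for confirmation
      # before executing it on the local machine.
      interpreter.chat("List the five largest files in the current directory.")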
  3. llm-tool provides a command-line utility for running large language models locally. It includes scripts for pulling models from the internet, starting them, and managing them with commands such as 'run', 'ps', 'kill', 'rm', and 'pull', as well as a Python script named 'querylocal.py' for querying these models (a hedged usage sketch follows this entry).
    2024-01-20 by klotz
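    The repository's exact invocation isn't shown in this bookmark, so the sketch below only illustrates the command names it mentions; the executable name "llm-tool", the argument order, and the "llama3" model identifier are assumptions, not the repository's documented interface.

      # Hypothetical driver for the llm-tool commands described above ('pull', 'run',
      # 'ps', 'kill', 'rm'); adjust names and arguments to the repository's real interface.
      import subprocess

      MODEL = "llama3"  # assumed model identifier

      def llm_tool(*args: str) -> str:
          """Run one llm-tool subcommand and return its stdout."""
          result = subprocess.run(["llm-tool", *args], capture_output=True, text=True, check=True)
          return result.stdout

      llm_tool("pull", MODEL)   # fetch the model from the internet
      llm_tool("run", MODEL)    # start it locally
      print(llm_tool("ps"))     # list running models
      llm_tool("kill", MODEL)   # stop the running model
      llm_tool("rm", MODEL)     # remove the local copy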
  4. 2023-12-31 by klotz
  5. 2023-11-27 by klotz
