Tags: llm + localllama + ollama


  1. Ollama now supports Hugging Face GGUF models, making it easier to run AI models locally, with no internet connection needed once the weights are downloaded. The GGUF format makes it practical to run quantized models on modest consumer hardware (a minimal usage sketch follows after this list).

    2024-10-24 by klotz
  2. This article walks through building a local RAG (Retrieval-Augmented Generation) system using Llama 3 as the model, Ollama for model management, and LlamaIndex as the RAG framework, showing how to get a basic local RAG pipeline running in just a few lines of code (see the second sketch below).

    2024-06-21 by klotz
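
For the first bookmark, running a Hugging Face GGUF model through Ollama can look roughly like the sketch below. This is a minimal illustration using the ollama Python client; the bartowski/Llama-3.2-1B-Instruct-GGUF repository name is only an example, and it assumes an Ollama server is already running locally with the `ollama` package installed (`pip install ollama`).

```python
# Minimal sketch: running a Hugging Face GGUF model through Ollama's Python client.
# The repository name is illustrative; any public GGUF repo on Hugging Face should work.
import ollama

model = "hf.co/bartowski/Llama-3.2-1B-Instruct-GGUF"

# Download the GGUF weights via the local Ollama server (network needed only for this step).
ollama.pull(model)

# After the pull, inference runs entirely on local hardware.
response = ollama.chat(
    model=model,
    messages=[{"role": "user", "content": "Summarize the GGUF format in one sentence."}],
)
print(response["message"]["content"])
```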
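
For the second bookmark, a Llama 3 + Ollama + LlamaIndex setup can be sketched roughly as below. The model names ("llama3", "nomic-embed-text") and the ./data directory are illustrative assumptions rather than details taken from the article; it also assumes the llama-index, llama-index-llms-ollama, and llama-index-embeddings-ollama packages are installed and an Ollama server is running locally.

```python
# Minimal sketch of a local RAG pipeline with LlamaIndex + Ollama.
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex, Settings
from llama_index.llms.ollama import Ollama
from llama_index.embeddings.ollama import OllamaEmbedding

# Use local models served by Ollama for both generation and embeddings.
Settings.llm = Ollama(model="llama3", request_timeout=120.0)
Settings.embed_model = OllamaEmbedding(model_name="nomic-embed-text")

# Load documents from ./data, build an in-memory vector index, and query it.
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()

print(query_engine.query("What do these documents say about local LLM inference?"))
```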


