klotz: llm inference engine* + llm*


  1. picoLLM is a cross-platform, on-device inference engine optimized for running compressed large language models (LLMs) on various devices. It is compatible with Linux, macOS, Windows, Raspberry Pi OS, Android, iOS, and web browsers.
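   As an illustration of what an on-device call to picoLLM looks like, here is a minimal sketch using Picovoice's Python SDK. It assumes the `picollm` package and a compressed `.pllm` model file obtained from Picovoice; the access key and model path are placeholders, not values from this bookmark.

   ```python
   # Minimal sketch: on-device text generation with the picoLLM Python SDK.
   # Assumes `pip install picollm` and a compressed model file from the
   # Picovoice Console; access key and model path below are placeholders.
   import picollm

   pllm = picollm.create(
       access_key='YOUR_PICOVOICE_ACCESS_KEY',  # placeholder credential
       model_path='./model.pllm',               # placeholder compressed model
   )

   try:
       # Run the prompt entirely on-device and print the completion text.
       result = pllm.generate('What is an on-device inference engine?')
       print(result.completion)
   finally:
       # Release native resources held by the engine.
       pllm.release()
   ```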
