Tags: llm* + cli*


  1. SuperCoder is a coding agent that runs in your terminal, offering features like code search, project structure exploration, code editing, bug fixing, and integration with OpenAI or local models.

    2025-03-31 by klotz
  2. Claude Code is an agentic coding tool by Anthropic that operates in your terminal, understanding and modifying your codebase through natural language commands. It streamlines development workflows by executing commands, fixing bugs, and managing Git operations without requiring additional servers or complex setup.

  3. Yoyak is a CLI tool powered by LLMs (Large Language Models) for summarizing and translating web pages. It supports various models and languages and can be installed via Deno or by downloading executables for Linux, macOS, and Windows.

    2025-03-03 by klotz
  4. Yoyak is a CLI tool that uses LLMs to summarize and translate web pages. It supports various models and provides shell completion scripts.

    2025-03-03 by klotz
  5. Harbor is a containerized LLM toolkit that lets you run LLMs and additional services with ease, featuring a CLI and a companion app for managing AI services.

    2025-02-14 by klotz
  6. The LLM 0.17 release enables multi-modal input, allowing users to send images, audio, and video files to Large Language Models such as GPT-4o, Llama, and Gemini, with a Python API and cost-effective pricing.

    2024-10-29 by klotz
  7. LM Studio has released lms, a command-line interface (CLI) for loading and unloading models, starting and stopping the API server, and inspecting raw LLM input. It is developed on GitHub and is MIT-licensed.

    2024-10-22 by klotz
  8. Simon Willison explains how to use the mistral.rs library in Rust to run the Llama Vision model on a Mac M2 laptop. He provides a detailed example and discusses the memory usage and GPU utilization.

  9. code2prompt is a command-line (CLI) tool that converts your codebase into a single LLM prompt, with a source tree, prompt templating, and token counting.

    2024-09-28 by klotz
  10. A Ruby script calculates VRAM requirements for large language models (LLMs) based on the model, bits per weight (bpw), and context length. It can determine the required VRAM, the maximum context length, or the best bpw for a given amount of available VRAM.
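The arithmetic behind such a calculator can be sketched as follows (a minimal illustration, not the script's actual code; the function name, parameter choices, and the simplified KV-cache term are all assumptions):

```python
def estimate_vram_gb(params_billion, bpw, context_len,
                     n_layers, n_kv_heads, head_dim, kv_bits=16):
    """Rough VRAM estimate for an LLM: weights plus KV cache, in GiB."""
    # Weights: parameter count × bits per weight, converted to bytes.
    weight_bytes = params_billion * 1e9 * bpw / 8
    # KV cache: 2 tensors (K and V) × layers × KV heads × head dim
    # × context length × bytes per cached element.
    kv_bytes = 2 * n_layers * n_kv_heads * head_dim * context_len * kv_bits / 8
    return (weight_bytes + kv_bytes) / 1024**3

# Hypothetical 8B-parameter model at 4.5 bpw with an 8192-token context:
print(round(estimate_vram_gb(8, 4.5, 8192, 32, 8, 128), 2))  # roughly 5.2 GiB
```

Real calculators also account for activation memory and runtime overhead, so actual requirements run somewhat higher than this lower bound.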


SemanticScuttle - klotz.me: tagged with "llm+cli"
