Tags: python* + llm*


  1. This document details how to run Qwen models locally using the Text Generation Web UI (oobabooga), covering installation, setup, and launching the web interface.
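
    A minimal, hypothetical sketch of how a locally served Qwen model can be queried once the web UI is running. It assumes the UI was launched with its OpenAI-compatible API enabled (the --api flag), that the default port 5000 is used, and that the model name matches whatever is loaded; none of these specifics come from the bookmarked guide.

```python
# Hedged sketch: query a Qwen model served locally by text-generation-webui through
# its OpenAI-compatible API. The port, base URL, and model name are assumptions.
from openai import OpenAI

client = OpenAI(
    base_url="http://127.0.0.1:5000/v1",  # assumed default for the web UI's --api mode
    api_key="not-needed",                 # the local server does not validate the key
)

response = client.chat.completions.create(
    model="Qwen2.5-7B-Instruct",          # placeholder; use the model actually loaded in the UI
    messages=[{"role": "user", "content": "Explain what a context window is in one paragraph."}],
    max_tokens=200,
)
print(response.choices[0].message.content)
```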

  2. A Model Context Protocol (MCP) server that runs Python code in a sandbox using Pyodide in Deno, isolated from the operating system.

    2025-04-06 by klotz
  3. This article details a method for training large language models (LLMs) for code generation using a secure, local WebAssembly-based code interpreter and reinforcement learning with Group Relative Policy Optimization (GRPO). It covers the setup, training process, evaluation, and potential next steps.
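
    As a rough illustration of the GRPO idea the article relies on (a sketch, not the article's code): each prompt gets several sampled completions, each completion is scored by a reward function, for example 1.0 if the generated code runs and passes its tests in the sandboxed WebAssembly interpreter and 0.0 otherwise, and rewards are normalized within the group to produce advantages.

```python
# Minimal sketch of GRPO's group-relative advantage computation (illustrative only).
# The reward values stand in for a sandboxed code-interpreter check: 1.0 if the
# generated code passed its tests, 0.0 if it failed.
from statistics import mean, stdev

def group_relative_advantages(rewards: list[float], eps: float = 1e-6) -> list[float]:
    """Advantage of each completion relative to the others sampled for the same prompt."""
    mu = mean(rewards)
    sigma = stdev(rewards) if len(rewards) > 1 else 0.0
    return [(r - mu) / (sigma + eps) for r in rewards]

# Four sampled completions for one prompt; two passed the sandboxed tests.
print(group_relative_advantages([1.0, 0.0, 1.0, 0.0]))
# Passing completions get positive advantages, failing ones negative.
```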

  4. A popular and actively maintained open-source web crawling library for LLMs and data extraction, offering advanced features like structured data extraction, browser control, and markdown generation.

  5. This repository organizes public content to train an LLM to answer questions and generate summaries in an author's voice, focusing on the content of 'virtual_adrianco' but designed to be extensible to other authors.

  6. This Splunk Lantern article outlines how to monitor GenAI applications with Splunk Observability Cloud, covering setup with OpenTelemetry, NVIDIA GPU metrics, Python instrumentation, and OpenLIT integration. It targets applications built with technologies like Python, LLMs (OpenAI's GPT-4o, Anthropic's Claude 3.5 Haiku, Meta's Llama), NVIDIA GPUs, Langchain, and vector databases (Pinecone, Chroma), and describes a six-step process:

    1. Access Splunk Observability Cloud: Sign up for a free trial if needed.
    2. Deploy Splunk Distribution of OpenTelemetry Collector: Use a Helm chart to install the collector in Kubernetes.
    3. Capture NVIDIA GPU Metrics: Utilize the NVIDIA GPU Operator and Prometheus receiver in the OpenTelemetry Collector.
    4. Instrument Python Applications: Use the Splunk Distribution of OpenTelemetry Python agent for automatic instrumentation and enable Always On Profiling.
    5. Enhance with OpenLIT: Install and initialize OpenLIT to capture detailed trace data, including LLM calls and interactions with vector databases (with options to disable PII capture); see the sketch after this entry.
    6. Start Using the Data: Leverage the collected metrics and traces, including features like Tag Spotlight, to identify and resolve performance issues (example given: OpenAI rate limits).

    The article emphasizes OpenTelemetry's role in GenAI observability and highlights how Splunk Observability Cloud facilitates monitoring these complex applications, providing insights into performance, cost, and potential bottlenecks. It also points to resources for help and further information on specific aspects of the process.
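
    A hedged sketch of the OpenLIT step (step 5): the endpoint, application name, and exact init parameters are assumptions, so check the OpenLIT documentation for the options your version supports, including how to disable content/PII capture.

```python
# Hedged sketch: initialize OpenLIT so LLM and vector-database calls are exported as
# traces to the locally running OpenTelemetry Collector. Endpoint and application
# name are assumptions; consult the OpenLIT docs for the exact init options.
import openlit
from openai import OpenAI

openlit.init(
    otlp_endpoint="http://localhost:4318",  # assumed OTLP/HTTP port of the Splunk OTel Collector
    application_name="genai-demo",          # assumed name, used to group traces in Observability Cloud
)

# After initialization, LLM calls like this one are captured as spans with token
# counts and latency, alongside the auto-instrumented Python application traces.
client = OpenAI()
client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello"}],
)
```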

  7. This document details how to use function calling with Mistral AI models to connect to external tools and build more complex applications, outlining a four-step process: User query & tool specification, Model argument generation, User function execution, and Model final answer generation.
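
    The same four steps in a compact sketch, assuming the mistralai Python SDK's v1-style client (method and field names may differ between SDK versions); the tool, its schema, and the model name are illustrative assumptions.

```python
# Hedged sketch of the four-step function-calling flow: tool specification, model
# argument generation, user-side function execution, and final answer generation.
import json
from mistralai import Mistral

client = Mistral(api_key="YOUR_API_KEY")

# Step 1: user query plus a tool specification (JSON Schema for the arguments).
tools = [{
    "type": "function",
    "function": {
        "name": "get_payment_status",          # hypothetical tool
        "description": "Look up the status of a payment by transaction id.",
        "parameters": {
            "type": "object",
            "properties": {"transaction_id": {"type": "string"}},
            "required": ["transaction_id"],
        },
    },
}]
messages = [{"role": "user", "content": "What's the status of transaction T1001?"}]

# Step 2: the model generates the function name and arguments.
response = client.chat.complete(model="mistral-large-latest", messages=messages, tools=tools)
tool_call = response.choices[0].message.tool_calls[0]
args = json.loads(tool_call.function.arguments)

# Step 3: the user-side code executes the function with those arguments.
result = {"transaction_id": args["transaction_id"], "status": "Paid"}  # stand-in for a real lookup

# Step 4: the result is sent back so the model can produce the final answer.
messages.append(response.choices[0].message)
messages.append({"role": "tool", "name": tool_call.function.name,
                 "content": json.dumps(result), "tool_call_id": tool_call.id})
final = client.chat.complete(model="mistral-large-latest", messages=messages)
print(final.choices[0].message.content)
```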

  8. Browser Use is a library that enables AI agents to interact with web browsers, making websites accessible for automated tasks. It includes features for browser automation, agent memory, and various demos showcasing its capabilities.

  9. A terminal-based platform to experiment with the AI Software Engineer. It allows users to specify software in natural language, watch as an AI writes and executes the code, and implement improvements. Supports various models and customization options.

  10. ClickUi is a powerful, open-source, cross-platform AI-assistant application built in Python. It integrates various AI models, speech recognition, and web scraping capabilities, providing both voice and text interaction interfaces. The tool is designed to be a comprehensive AI-computer assistant, supporting features such as voice mode, chat mode, file attachments, property lookups, and web searches. It aims to be user-friendly and adaptable, encouraging community collaboration for future development and improvements.

    2025-03-02 by klotz
