Run any GUI app in the terminal: `term.everything` is a Linux CLI program that runs GUI windows inside your terminal. Specifically, `term.everything` is a built-from-scratch Wayland compositor that renders to the terminal rather than to your monitor.
This GitHub repository contains a collection of example files demonstrating various use cases and configurations for the llamafiles tools, including:
* **System Administration:** Scripts and configurations for Ubuntu, Raspberry Pi 5, and macOS.
* **LLM Interaction:** Examples of prompts and interactions with LLMs like Mixtral and Dolphin.
* **Text Processing:** Scripts for summarizing text, extracting information, and formatting output.
* **Development Tools:** Examples related to Git, Emacs, and other development tools.
* **Hardware Monitoring:** Scripts for monitoring GPU and NVMe drive status.
A Ruby script calculates VRAM requirements for large language models (LLMs) based on the model, bits per weight (bpw), and context length. It can determine the required VRAM, the maximum context length, or the best bpw for a given amount of available VRAM.
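As a rough illustration of that calculation, the Python sketch below estimates VRAM as quantized weights (parameters × bpw / 8 bytes) plus an fp16 KV cache (2 × layers × context × hidden size × 2 bytes); the formula and the example model dimensions are assumptions, not the Ruby script's exact method.

```python
def estimate_vram_gib(params_b: float, bpw: float, context: int,
                      n_layers: int, hidden_size: int) -> float:
    """Rough VRAM estimate in GiB: quantized weights plus an fp16 KV cache."""
    weights_bytes = params_b * 1e9 * bpw / 8                    # quantized model weights
    kv_cache_bytes = 2 * n_layers * context * hidden_size * 2   # K and V, 2 bytes each
    return (weights_bytes + kv_cache_bytes) / 1024**3

# Hypothetical 7B model at 4.5 bpw with an 8192-token context;
# the layer count and hidden size are illustrative values.
print(f"{estimate_vram_gib(7.0, 4.5, 8192, n_layers=32, hidden_size=4096):.1f} GiB")
```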
Noema Research introduces Pinboard, a command-line developer tool that manages files and terminal references to streamline development workflows. Key features include flexible pinning, contextual updates, clipboard integration, an interactive shell, and undo functionality.
MarCLIdown is a minimalist Bash program that prints a markdown file as formatted text in a Linux terminal. It handles a wide range of elements: headers, emphasis and strong emphasis, strikethrough, line breaks, unordered and ordered lists, hyperlinks, images, URLs/emails, checkboxes (including as list items), tables, horizontal rules, blockquotes (including nested blockquotes and formatted elements inside blockquotes/notes), notes, footnotes (including multi-paragraph footnotes), inline code, code blocks, formatting escapes, collapsible sections, subscript/superscript, and other HTML tags, including HTML collapsible sections. The project is unlicensed and can be installed using the provided install script.
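To give a sense of the general approach (translating markdown syntax into ANSI escape sequences), here is a toy Python sketch; it is not MarCLIdown's Bash implementation and covers only a few of the elements listed above.

```python
import re

# ANSI escapes for a few of the elements a terminal markdown renderer handles.
BOLD, ITALIC, STRIKE, RESET = "\033[1m", "\033[3m", "\033[9m", "\033[0m"

def render_line(line: str) -> str:
    if line.startswith("#"):                                      # headers -> bold
        return BOLD + line.lstrip("#").strip() + RESET
    line = re.sub(r"\*\*(.+?)\*\*", BOLD + r"\1" + RESET, line)   # strong emphasis
    line = re.sub(r"\*(.+?)\*", ITALIC + r"\1" + RESET, line)     # emphasis
    line = re.sub(r"~~(.+?)~~", STRIKE + r"\1" + RESET, line)     # strikethrough
    return line

sample = "# Title\nSome **bold**, *italic*, and ~~struck~~ text."
for line in sample.splitlines():
    print(render_line(line))
```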
The "LLM" toolkit provides a command-line utility and Python library for interacting with large language models. It enables users to run prompts from the terminal, store prompts and responses in SQLite databases, generate embeddings, and more. Its documentation covers setup, usage, OpenAI models, other models, embeddings, plugins, model aliases, the Python API, prompt templates, logging, related tools, the CLI reference, contributing, and the changelog.
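A short sketch of the Python API side, based on the documented `llm.get_model()` / `prompt()` pattern; the model name and key handling here are illustrative and depend on which plugins and API keys are installed.

```python
import llm

# Pick a model by name or alias; "gpt-3.5-turbo" assumes the OpenAI plugin
# is available and an API key is configured.
model = llm.get_model("gpt-3.5-turbo")
model.key = "sk-..."  # or rely on a key set via `llm keys set openai`

response = model.prompt("Summarize why storing prompts in SQLite is useful.")
print(response.text())
```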
* **New Feature:** The `openai-to-sqlite` tool now allows enriching data in a SQLite database using OpenAI's GPT-3.5 model.
* **Sentiment Analysis Example:**
+ Update a `sentiment` column in a `messages` table using the `chatgpt()` SQL function, as sketched below.
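The sketch below reproduces that enrichment pattern using only Python's built-in sqlite3 module, with a placeholder function standing in for the model call; it illustrates the registered-SQL-function idea rather than the actual `openai-to-sqlite` implementation.

```python
import sqlite3

def fake_sentiment(text: str) -> str:
    # Placeholder for a chat-model call; openai-to-sqlite exposes a real
    # chatgpt() SQL function, while this toy version just keyword-matches.
    return "positive" if "thanks" in text.lower() else "neutral"

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE messages (id INTEGER PRIMARY KEY, message TEXT, sentiment TEXT)")
db.execute("INSERT INTO messages (message) VALUES ('Thanks, this works great!')")

db.create_function("chatgpt", 1, fake_sentiment)  # register a scalar SQL function
db.execute("UPDATE messages SET sentiment = chatgpt('Sentiment of this message: ' || message)")

print(db.execute("SELECT message, sentiment FROM messages").fetchall())
```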
The author has also automated their weeknotes using an Observable notebook, which generates the "releases this week" and "TILs this week" sections.
The notebook fetches TILs from the author's Datasette instance, grabs releases from GitHub, and assembles a markdown string for the new post.
* `llm` CLI tool for running prompts against large language models
* Automation of weeknotes using an Observable notebook
* Notebook generates "releases this week" and "TILs this week" sections
* Tool stores prompts and responses in a SQLite database
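As a minimal illustration of that assembly step, the Python sketch below builds the two sections from placeholder data; the real workflow is an Observable notebook, not Python, and in it the lists come from a Datasette JSON API and the GitHub API, so the fields and URLs here are assumptions.

```python
# Placeholder rows standing in for data fetched from Datasette and GitHub.
tils = [{"title": "Using sqlite-utils in bash", "url": "https://example.com/til/1"}]
releases = [{"name": "example-tool 1.2.3", "html_url": "https://example.com/releases/1.2.3"}]

post = ["## Releases this week"]
post += [f"* [{r['name']}]({r['html_url']})" for r in releases]
post += ["", "## TILs this week"]
post += [f"* [{t['title']}]({t['url']})" for t in tils]

print("\n".join(post))  # markdown string ready for the new weeknotes post
```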