Llama Stack v0.1.0 introduces a stable API release enabling developers to build RAG applications and agents, integrate with various tools, and use telemetry for monitoring and evaluation. This release provides a comprehensive interface, rich provider ecosystem, and multiple developer interfaces, along with sample applications for Python, iOS, and Android.
Meta has launched Llama Stack 0.1.0, a development platform designed to simplify building AI applications with Llama models. The platform offers standardized building blocks and flexible deployment options, including remote and local hosting. It features a plugin system for various API providers and supports multiple programming environments through its CLI tools and SDKs. Meta aims to address common challenges faced by AI developers, such as integrating tools and managing data sources.
A software engineer's journey transitioning from REST APIs to GraphQL, highlighting the benefits and challenges, and providing a guide to building a first GraphQL server.
The author provides a step-by-step guide to building a basic GraphQL server using Node.js and Apollo Server, encouraging a gradual approach to integrating GraphQL into existing REST-based projects.
An article detailing the top API documentation tools of 2025, featuring comprehensive reviews with examples, protocol support, pricing, and strengths and weaknesses.
Sparse autoencoders (SAEs) trained on Llama 3.3 70B yield an interpreted model, accessible via API, that enables research and product development through feature-space exploration and steering.
A powerful collection of 100+ APIs across categories to help you build apps, tools, and projects.
MCP (Model Context Protocol) is an open-source standard that enhances interaction between AI systems and various data sources, improving usability, response quality, and security.
GitHub Models now allows developers to retrieve structured JSON responses from models directly in the UI, improving integration with applications and workflows. Supported models include OpenAI models (except o1-mini and o1-preview) and Mistral models.
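Since GitHub Models exposes an OpenAI-compatible chat API, a structured-JSON request can be sketched as a payload carrying `response_format={"type": "json_object"}` whose reply is then parsed with `json.loads`. This is a minimal offline sketch: the field names follow the OpenAI chat-completions convention, the model name and simulated reply are illustrative, and no network call is made.

```python
import json

def build_structured_request(model: str, prompt: str) -> dict:
    """Build a chat-completion payload that asks the model for JSON output."""
    return {
        "model": model,
        "messages": [
            # Mentioning JSON in the prompt is typically required when
            # requesting json_object output.
            {"role": "system", "content": "Reply only with a JSON object."},
            {"role": "user", "content": prompt},
        ],
        "response_format": {"type": "json_object"},
    }

def parse_structured_reply(raw_reply: str) -> dict:
    """Parse the model's reply, which should be a single JSON object."""
    data = json.loads(raw_reply)
    if not isinstance(data, dict):
        raise ValueError("expected a JSON object")
    return data

# Offline demonstration with a simulated model reply:
payload = build_structured_request("gpt-4o-mini", "List two HTTP verbs.")
simulated_reply = '{"verbs": ["GET", "POST"]}'
print(parse_structured_reply(simulated_reply)["verbs"])
```

The same payload shape works whether the request is sent with an OpenAI-style SDK or plain HTTP; consult the GitHub Models documentation for the exact endpoint and supported `response_format` values.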
Simple, unified interface to multiple Generative AI providers, supporting various providers including OpenAI, Anthropic, Azure, Google, AWS, Groq, Mistral, HuggingFace, and Ollama. It aims to facilitate the use of multiple LLMs with a standardized interface similar to OpenAI’s.
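The "unified interface" idea above can be sketched as a thin router that maps a `"provider:model"` string to a per-provider backend behind one `chat()` signature. The class and method names here are hypothetical illustrations of the pattern, not the library's actual API, and the stub backend stands in for a real provider SDK.

```python
from typing import Callable, Dict, List

class UnifiedClient:
    """Route chat calls to pluggable provider backends (illustrative sketch)."""

    def __init__(self) -> None:
        # provider name -> callable(model_name, messages) -> reply text
        self._providers: Dict[str, Callable[[str, List[dict]], str]] = {}

    def register(self, name: str, backend: Callable[[str, List[dict]], str]) -> None:
        self._providers[name] = backend

    def chat(self, model: str, messages: List[dict]) -> str:
        # Model strings take the form "provider:model", e.g. "openai:gpt-4o".
        provider, _, model_name = model.partition(":")
        if provider not in self._providers:
            raise KeyError(f"unknown provider: {provider}")
        return self._providers[provider](model_name, messages)

# Usage with a stub backend standing in for a real provider SDK:
client = UnifiedClient()
client.register("echo", lambda m, msgs: f"[{m}] {msgs[-1]['content']}")
print(client.chat("echo:demo", [{"role": "user", "content": "hello"}]))
# prints "[demo] hello"
```

Keeping provider-specific details behind the registry is what lets application code swap OpenAI, Anthropic, or Ollama backends without changing call sites.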
Daniel Mangum describes how to host a website on Bluesky by leveraging the AT Protocol and Personal Data Server (PDS) APIs, detailing the process and underlying mechanics.