This article details how to build a 100% local MCP (Model Context Protocol) client using LlamaIndex, Ollama, and Lightning AI. It provides a code walkthrough and explanation of the process, including setting up an SQLite MCP server and a locally served LLM.
This article is a year-end recap from Towards Data Science (TDS) highlighting the most popular articles published in 2025. The year was heavily focused on AI agents and their development, with significant interest in related frameworks like MCP and in context engineering. Beyond agents, Python remained a crucial skill for data professionals, and there was a strong emphasis on career development within the field. The recap also touches on the evolution of RAG (Retrieval-Augmented Generation) into more sophisticated context-aware systems and the importance of optimizing LLM (Large Language Model) costs. TDS also celebrated its growth as an independent publication and its Author Payment Program.
"Talk to your data. Instantly analyze, visualize, and transform."
Analyzia is a data analysis tool that lets users talk to their data, analyzing, visualizing, and transforming CSV files with AI-powered insights and no coding. It features natural language queries, Google Gemini integration, professional visualizations, and interactive dashboards, with a conversational interface that remembers previous questions. The tool requires Python 3.11+ and a Google API key, and uses Streamlit, LangChain, and various data visualization libraries.
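Under the hood, a tool like Analyzia routes a natural-language question through a Gemini-backed LangChain agent to code that runs against the loaded data. As a dependency-free illustration of the kind of concrete operation a question like "what is the average price?" ultimately resolves to, here is a minimal stdlib sketch (the function name and sample data are hypothetical, not from Analyzia's codebase):

```python
import csv
import io

def column_mean(csv_text: str, column: str) -> float:
    """Compute the mean of a numeric CSV column -- the kind of
    operation a natural-language query resolves to once the
    agent has translated it into data-frame code."""
    reader = csv.DictReader(io.StringIO(csv_text))
    values = [float(row[column]) for row in reader]
    return sum(values) / len(values)

sample = "product,price\nwidget,4.0\ngadget,6.0\n"
column_mean(sample, "price")  # → 5.0
```

The conversational layer's job is to pick the right such operation and narrate the result; the data work itself stays ordinary tabular code.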
This article explores how prompt engineering can be used to improve time-series analysis with Large Language Models (LLMs), covering core strategies, preprocessing, anomaly detection, and feature engineering. It provides practical prompts and examples for various tasks.
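One recurring strategy the article covers is preprocessing a series into compact statistics before prompting, so the LLM reasons over a structured summary rather than raw numbers. A minimal sketch of that pattern for anomaly detection (the prompt wording and z-score threshold are illustrative choices, not taken from the article):

```python
from statistics import mean, stdev

def anomaly_prompt(series: list[float], threshold: float = 2.0) -> str:
    """Summarize a series and pre-flag outliers by z-score,
    then embed both in a prompt for the LLM to interpret."""
    mu, sigma = mean(series), stdev(series)
    flagged = [(i, x) for i, x in enumerate(series)
               if sigma and abs(x - mu) / sigma > threshold]
    return (
        f"You are a time-series analyst.\n"
        f"Series (n={len(series)}): {series}\n"
        f"Mean={mu:.2f}, Std={sigma:.2f}.\n"
        f"Points beyond {threshold} standard deviations: {flagged}\n"
        f"Explain whether each flagged point is a genuine anomaly "
        f"or expected variation, and why."
    )

print(anomaly_prompt([10, 11, 9, 10, 42, 10]))
```

Doing the arithmetic in code and leaving only the interpretation to the model plays to each side's strengths: LLMs are unreliable at computing statistics but good at explaining them.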
How to extract structured information effectively and accurately from long unstructured text with LangExtract and LLMs. This article explores Google's LangExtract framework paired with Gemma 3, Google's open-weight LLM, demonstrating how to parse an insurance policy to surface details like exclusions.
This article explores alternatives to NotebookLM, Google's assistant for synthesizing information from documents. It details NousWise, ElevenLabs, NoteGPT, Notion, Evernote, and Obsidian, outlining their key features, limitations, and considerations for choosing the right tool.
Leveraging MCP to automate your daily routine. This article explores the Model Context Protocol (MCP) and demonstrates how to use it to build a toolkit for analysts, creating a local MCP server with useful tools and integrating it with AI assistants like Claude Desktop.
Local Large Language Models can convert massive DataFrames to presentable Markdown reports — here's how.
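Whatever local model does the summarizing, the first step is serializing the DataFrame into Markdown the model can read (pandas offers this directly via `DataFrame.to_markdown()`, backed by the `tabulate` package). A dependency-free sketch of that serialization step, with hypothetical sample records:

```python
def to_markdown(rows: list[dict]) -> str:
    """Serialize records into a GitHub-style Markdown table --
    the textual representation a local LLM is then asked to
    turn into a narrative report."""
    headers = list(rows[0])
    lines = [
        "| " + " | ".join(headers) + " |",
        "| " + " | ".join("---" for _ in headers) + " |",
    ]
    for row in rows:
        lines.append("| " + " | ".join(str(row[h]) for h in headers) + " |")
    return "\n".join(lines)

records = [{"region": "EU", "sales": 120}, {"region": "US", "sales": 95}]
print(to_markdown(records))
```

For a "massive" DataFrame, the practical trick is aggregating or sampling before serialization so the table fits the model's context window.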
This article details how to accelerate deep learning and LLM inference using Apache Spark, focusing on distributed inference strategies. It covers basic deployment with `predict_batch_udf`, advanced deployment with inference servers like NVIDIA Triton and vLLM, and deployment on cloud platforms like Databricks and Dataproc. It also provides guidance on resource management and configuration for optimal performance.
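The `predict_batch_udf` API (`pyspark.ml.functions`, Spark 3.4+) expects a `make_predict_fn` that loads the model once per executor and returns a closure Spark then applies to batches of rows. A stdlib mimic of that load-once, apply-many contract (the dict "model" is a stand-in for real weight loading, not Spark code itself):

```python
def make_predict_fn():
    """Expensive setup runs once per worker, not once per row --
    the core contract behind Spark's predict_batch_udf."""
    model = {"weight": 2.0}  # stand-in for loading real model weights

    def predict(batch: list[float]) -> list[float]:
        # Called repeatedly on batches; the loaded model is reused.
        return [model["weight"] * x for x in batch]

    return predict

predict = make_predict_fn()
predict([1.0, 2.0, 3.0])  # → [2.0, 4.0, 6.0]
```

Batching amortizes model-load cost and lets the inner function exploit vectorized hardware, which is where most of the speedup over row-at-a-time UDFs comes from.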
NVIDIA DGX Spark is a desktop-friendly AI supercomputer powered by the NVIDIA GB10 Grace Blackwell Superchip, delivering 1000 AI TOPS of performance with 128GB of memory. It is designed for prototyping, fine-tuning, and inference of large AI models.