An exploration of transitioning from Docker to Podman in a home lab environment. The author details a weekend project setting up an Ubuntu-based Podman server on Proxmox, highlighting key architectural differences, such as the absence of a central daemon, and the security benefits of rootless container execution.
Key topics include:
- Comparison between Docker's daemon-based model and Podman's daemonless architecture.
- The advantages of rootless containers for improved host security.
- Utilizing systemd integration through Quadlets for more native Linux service management.
- Practical steps for installation on Ubuntu and Rocky Linux.
- Maintaining compatibility with tools like Portainer via the Podman API socket.
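The Quadlet workflow mentioned above can be sketched with a minimal unit file; the image and port below are illustrative placeholders, not taken from the article:

```ini
# ~/.config/containers/systemd/whoami.container — a hypothetical rootless Quadlet unit
[Unit]
Description=Example container managed natively by systemd

[Container]
Image=docker.io/traefik/whoami:latest
PublishPort=8080:80

[Install]
# default.target starts the service with the user's session
WantedBy=default.target
```

After `systemctl --user daemon-reload`, the generated service starts with `systemctl --user start whoami.service`. For the Portainer compatibility point, the Docker-compatible API socket is enabled with `systemctl --user enable --now podman.socket`.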
The article explores how well Canonical's MicroCloud runs on the Raspberry Pi, particularly the Raspberry Pi 5, for home lab setups. It discusses the compatibility, performance, and features of running MicroCloud on ARM-based SBCs, highlighting its ability to run both VMs and containers.
Learn to deploy your own local LLM service using Docker containers for maximum security and control, whether you're running on a CPU, an NVIDIA GPU, or an AMD GPU.
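A local LLM deployment of this kind can be sketched as a Compose file; Ollama is used here purely as an illustrative image (the article's specific stack is not stated), and the GPU section assumes the NVIDIA Container Toolkit:

```yaml
# docker-compose.yml — minimal local-LLM sketch (Ollama as an example image)
services:
  llm:
    image: ollama/ollama:latest
    ports:
      - "11434:11434"          # Ollama's default API port
    volumes:
      - ollama:/root/.ollama   # persist downloaded model weights
    # For NVIDIA GPUs, uncomment (requires the NVIDIA Container Toolkit):
    # deploy:
    #   resources:
    #     reservations:
    #       devices:
    #         - driver: nvidia
    #           count: all
    #           capabilities: [gpu]
volumes:
  ollama:
```

Keeping the model volume named rather than bind-mounted lets weights survive container rebuilds without touching the host filesystem layout.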
Dozzle is a lightweight, self-hosted solution that provides a real-time look into your container logs, offering an intuitive UI, real-time logging, intelligent search, and support for multiple use cases like home labs and local development.
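A typical self-hosted Dozzle deployment is a single Compose service; this is a sketch based on Dozzle's standard setup (image `amir20/dozzle`), not a configuration given in the article:

```yaml
services:
  dozzle:
    image: amir20/dozzle:latest
    ports:
      - "8080:8080"   # web UI
    volumes:
      # read-only access to the Docker socket so Dozzle can stream logs
      - /var/run/docker.sock:/var/run/docker.sock:ro
```

Mounting the socket read-only limits the blast radius if the UI is ever exposed beyond the home lab.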
This Docker-based network visualizer deploys in under two minutes and automatically maps all devices on your network with an interactive web dashboard.
The article discusses the increasing complexity of Kubernetes and suggests that Silicon Valley is exploring alternative technologies for container orchestration, citing a benchmark showing a stripped-down stack outperforming Kubernetes.
A curated guide to code sandboxing solutions, covering technologies like MicroVMs, application kernels, language runtimes, and containerization. It provides a feature matrix, in-depth platform profiles (e2b, Daytona, microsandbox, WebContainers, Replit, Cloudflare Workers, Fly.io, Kata Containers), and a decision framework for choosing the right sandboxing solution based on security, performance, workload type, and hosting preferences.
Docker is making it easier for developers to run and test large language models (LLMs) on their PCs with the launch of Docker Model Runner, a new beta feature in Docker Desktop 4.40 for Apple silicon-powered Macs. It also integrates the Model Context Protocol (MCP) for streamlined connections between AI agents and data sources.
A Microsoft engineer demonstrates how WebAssembly modules can run alongside containers in Kubernetes environments, offering benefits like reduced size and faster cold start times for certain workloads.
Docker offers various logging drivers that determine where log messages are stored and how they are formatted. These include json-file, syslog, journald, fluentd, awslogs, gelf, logentries, and splunk.
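The daemon-wide default driver is set in `/etc/docker/daemon.json`; a minimal sketch using the json-file driver with rotation options looks like this (the size and file-count values are illustrative):

```json
{
  "log-driver": "json-file",
  "log-opts": {
    "max-size": "10m",
    "max-file": "3"
  }
}
```

Individual containers can override the default at run time with `docker run --log-driver=syslog …`.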