AI
Practical guides for developers working with AI tools. Prompt engineering, custom instructions, and integration patterns for ChatGPT, Claude, and other LLMs.
RAG Document Assistant: Answer Questions from Your Own Docs with Ollama, ChromaDB and Docker
Build a local RAG document assistant that reads .txt files, indexes them with vector embeddings, and answers questions using a local LLM — all without a cloud API. Includes a FastAPI backend, a minimal browser UI, and a full Docker Compose setup.
Last Thursday
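The retrieval step described above can be sketched with toy vectors: documents are stored as embeddings and the question is matched against them by cosine similarity. This is a minimal stdlib-only illustration of the idea; in the actual post the index is ChromaDB and the embeddings come from a local model, and the file names and vectors here are made up for demonstration.

```python
from math import sqrt

def cosine(a, b):
    # Cosine similarity between two equal-length vectors
    dot = sum(x * y for x, y in zip(a, b))
    na = sqrt(sum(x * x for x in a))
    nb = sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Toy "index": in the article this role is played by ChromaDB,
# and the vectors by real embeddings of the .txt files.
docs = {
    "intro.txt": [1.0, 0.0, 0.5],
    "faq.txt": [0.2, 0.9, 0.1],
}

def retrieve(query_vec, k=1):
    # Rank documents by similarity to the query embedding
    ranked = sorted(docs, key=lambda d: cosine(query_vec, docs[d]), reverse=True)
    return ranked[:k]

print(retrieve([0.9, 0.1, 0.4]))
```

The top-k documents returned here would then be pasted into the LLM prompt as context, which is the core of the RAG pattern.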
Free Local LLM in Docker: Build a Customer Feedback Analyser with Ollama and Pydantic
How to run Ollama in Docker Compose, pull a model on first start, and build a Python CLI that reads customer reviews from CSV, clusters them by theme, and generates a structured report — using Pydantic schemas and system/user message separation. No API keys, no monthly bills.
Last Wednesday
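Running Ollama under Docker Compose, as the post describes, typically looks something like the sketch below: the official `ollama/ollama` image exposed on its default port 11434, with a named volume so pulled models survive restarts. The service and volume names are illustrative, and the model-pull-on-first-start step is usually wired in separately (for example via an init command running `ollama pull`), so treat this as a starting point rather than the post's exact file.

```yaml
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"          # Ollama's default API port
    volumes:
      - ollama-data:/root/.ollama  # persist pulled models across restarts

volumes:
  ollama-data:
```

The Python CLI then talks to `http://localhost:11434` with no API key, which is what makes the whole pipeline free to run locally.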