Build and deploy MCP servers
Create Model Context Protocol servers that give AI models access to your tools, data, and services.
Set up local-first AI inference on your own hardware — no cloud, no API keys, full privacy.
Pick the format that matches the level of support you want.
Start immediately and work through the training on your own schedule.
Join a guided cohort or workshop format when live delivery is available.
Guided by an instructor
Practice with an AI-guided trainer experience tailored to the course topic.
Personalized guidance
Local-first edge AI inference lets you run powerful models on your own machine or edge devices. Tools like llama.cpp, Ollama, and vLLM, with 633K+ GitHub stars between them, have made this accessible to everyone.
Privacy-sensitive work, offline environments, and cost control all call for local inference skills, making this a must-know for anyone serious about AI.
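To make "no cloud, no API keys" concrete: both Ollama and llama.cpp's server expose an OpenAI-compatible chat endpoint on localhost. The sketch below assumes a local Ollama instance on its default port (11434) with a model such as `llama3.2` pulled; adjust the URL and model name for your setup.

```python
import json
import urllib.request

def build_chat_request(model: str, prompt: str) -> dict:
    """Payload for an OpenAI-compatible /v1/chat/completions endpoint."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }

def chat(prompt: str, model: str = "llama3.2",
         url: str = "http://localhost:11434/v1/chat/completions") -> str:
    # Send the request to the local server: no cloud round-trip, no API key.
    body = json.dumps(build_chat_request(model, prompt)).encode()
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["choices"][0]["message"]["content"]

if __name__ == "__main__":
    # Requires a local model server (e.g. `ollama serve`) to be running.
    print(chat("Why run models locally?"))
```

Because the endpoint shape matches the OpenAI API, existing client code can often be pointed at a local server by changing only the base URL.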
Extract, transform, and structure content from PDFs, DOCX, HTML, and more into clean Markdown for AI consumption.
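As a taste of the transformation step, here is a toy HTML-to-Markdown converter built on Python's standard-library `html.parser` (a deliberately minimal sketch handling only headings, paragraphs, and bold text; real pipelines cover far more of HTML and the other formats):

```python
from html.parser import HTMLParser

class MarkdownConverter(HTMLParser):
    """Toy HTML-to-Markdown converter: headings, paragraphs, bold only."""
    def __init__(self):
        super().__init__()
        self.out = []

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3"):
            self.out.append("#" * int(tag[1]) + " ")  # <h2> -> "## "
        elif tag == "strong":
            self.out.append("**")

    def handle_endtag(self, tag):
        if tag == "strong":
            self.out.append("**")
        elif tag in ("h1", "h2", "h3", "p"):
            self.out.append("\n\n")  # block elements end a Markdown block

    def handle_data(self, data):
        self.out.append(data)

def html_to_markdown(html: str) -> str:
    conv = MarkdownConverter()
    conv.feed(html)
    return "".join(conv.out).strip()

print(html_to_markdown("<h1>Title</h1><p>Some <strong>bold</strong> text.</p>"))
# prints:
# # Title
#
# Some **bold** text.
```

The same shape (parse the source format, emit Markdown block by block) generalizes to PDF and DOCX extraction with the appropriate parser in front.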
Extend AI coding assistants with custom wrappers, hooks, and workflows — the fastest-growing segment in dev tools.