This MCP server is the fastest way to integrate Taiga UI components into your AI workflow. The Model Context Protocol (MCP) server gives AI assistants full access to Taiga UI component documentation and code, so you can easily pull component implementations into your AI-driven development process.
Key Features
Documentation + code snippets: Full Taiga UI documentation in Markdown format plus ready-made Angular examples — all in one place.
Two MCP tools:
get_list_components searches for components. It lists component and section identifiers (with fuzzy substring filtering) along with basic metadata (category, package, type).
get_component_example retrieves code examples. It returns the full Markdown content for each resolved section (the entire component documentation). A client-side usage sketch follows this list.
Flexible configuration: You can switch the documentation source URL (stable/next) without installing Angular locally.
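As a rough illustration, the sketch below shows how an MCP client could call the two tools through the official TypeScript SDK. The server launch command, the package name taiga-ui-mcp, and the tool argument names (query, name) are assumptions for the example, not part of the documented API; check the installation instructions for the real values.

```ts
// Minimal sketch of calling the two Taiga UI MCP tools from a client,
// using the official @modelcontextprotocol/sdk package.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  const transport = new StdioClientTransport({
    command: "npx",
    args: ["-y", "taiga-ui-mcp"], // hypothetical package name
  });
  const client = new Client({ name: "taiga-ui-example", version: "1.0.0" });
  await client.connect(transport);

  // Search for components whose identifiers match a substring.
  const list = await client.callTool({
    name: "get_list_components",
    arguments: { query: "button" }, // "query" is an assumed parameter name
  });
  console.log(list.content);

  // Fetch the full Markdown documentation for a resolved section.
  const example = await client.callTool({
    name: "get_component_example",
    arguments: { name: "tui-button" }, // assumed parameter name and identifier
  });
  console.log(example.content);

  await client.close();
}

main().catch(console.error);
```

In practice an AI assistant issues these calls for you; the snippet only makes the request/response shape of the two tools concrete.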
Installation
Providing Context with llms.txt
llms.txt is a proposed standard for websites designed to help LLMs better understand and process their content. The Taiga UI team provides two versions of this file so that LLMs, and tools that rely on them for code generation, can produce better code (see the sketch after this list):
llms.txt - a table of contents file providing links to key files and resources.
llms-full.txt - a more detailed, compiled set of resources describing how to start developing with Taiga UI, along with examples of using components.
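One way to use either file is to fetch it and prepend it to a code-generation prompt, as in the sketch below. The URL is an assumption about where the file is hosted; verify the actual location on the Taiga UI site before relying on it.

```ts
// Hedged sketch: loading llms.txt as prompt context for code generation.
const LLMS_TXT_URL = "https://taiga-ui.dev/llms.txt"; // assumed location

async function loadTaigaContext(): Promise<string> {
  const response = await fetch(LLMS_TXT_URL);
  if (!response.ok) {
    throw new Error(`Failed to fetch llms.txt: ${response.status}`);
  }
  return response.text();
}

// Example: prepend the table of contents to a prompt for an LLM.
loadTaigaContext().then((context) => {
  const prompt = `${context}\n\nGenerate an Angular component using Taiga UI buttons.`;
  console.log(prompt.slice(0, 200));
});
```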