Sant Icons
69,460 open-source SVG icons across 18 libraries. One search bar, one URL. A web app, a Model Context Protocol server, and a CLI, all reading from the same manifest. Built in Christchurch, Aotearoa New Zealand. Served free, no signup, no paywall.
One corpus. Three surfaces.
A static web app for browsing and customising. A Model Context Protocol server so coding assistants like Claude Code can pull icons straight into a codebase. A CLI for the terminal. All reading from one manifest, all working offline. The web app deploys to Vercel free tier; the MCP and CLI ship to npm.
Sourcing icons is still fragmented.
Every project sends a developer hopping between Lucide, Phosphor, Tabler, Material, Heroicons. Different sites, different search, different formats, copy-paste each time. AI assistants can't help. They don't have the icons, and they hallucinate names that don't exist.
We built a single addressable corpus. One search bar across 18 libraries. One protocol so an agent can fetch an icon by name. One CLI so you never leave the terminal.
Same data. Different context.
Web
Search 69,460 icons in one box. Live colour, size, and stroke customiser. Per-library SVG chunks lazy-load as you browse. The slim metadata index is ~16 MB; the SVG body only loads when you need it. Static export, hosted on Vercel free tier.
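The metadata/SVG split can be sketched as a cached per-library loader: the slim index is loaded once, and a library's SVG bodies are fetched only the first time one of its icons is opened. The chunk paths and field names here are illustrative assumptions, not the app's actual layout.

```typescript
type IconMeta = { id: string; library: string; name: string };
type FetchJson = (path: string) => Promise<unknown>;

// Returns a loader that fetches each library's SVG chunk at most once.
function makeSvgLoader(fetchJson: FetchJson) {
  const cache = new Map<string, Record<string, string>>();
  return async function getSvg(icon: IconMeta): Promise<string> {
    let chunk = cache.get(icon.library);
    if (!chunk) {
      // Hypothetical chunk path: one JSON file of SVG bodies per library.
      chunk = (await fetchJson(`chunks/${icon.library}.json`)) as Record<string, string>;
      cache.set(icon.library, chunk);
    }
    return chunk[icon.id];
  };
}
```

Browsing within one library costs a single chunk fetch; the 16 MB index never has to carry SVG markup.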
MCP
A Model Context Protocol server bundling the full manifest. Drop it into a Claude Code or Claude Desktop config and your assistant gets three tools: search_icons, get_icon, list_libraries. Ask it for a stroke arrow-right and it writes the SVG straight into your file. Works offline.
CLI
npx sant-icons get lucide/heart. Output as raw SVG, React, Vue, or data URI. Customise size, colour, stroke width inline. Pipe to a file or stdout. Same manifest as the MCP, bundled in the package.
Eighteen libraries. Every license preserved.
Each upstream library keeps its own license. Every detail page on the site links to the icon's specific upstream license. The MCP and CLI both expose license and version fields so consumers can attribute correctly.
Customise once. Export anywhere.
Pick a colour, set a stroke, set a size. The whole grid updates live via CSS variables. Select an icon and the preview snapshots your global values into a panel-local set, so the detail view and your final export stay coherent.
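A minimal sketch of that live-update-plus-snapshot behaviour, assuming hypothetical custom property names (the site's actual variables may differ):

```typescript
type IconStyle = { color: string; size: number; strokeWidth: number };

// The grid reads these custom properties, so one update restyles every icon.
function toCssVars(s: IconStyle): Record<string, string> {
  return {
    "--icon-color": s.color,
    "--icon-size": `${s.size}px`,
    "--icon-stroke": String(s.strokeWidth),
  };
}

// Selecting an icon copies the globals into a panel-local set, so later
// global tweaks can't shift the detail view out from under the export.
function snapshot(global: IconStyle): IconStyle {
  return { ...global };
}
```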
- Export as SVG, JSX (React), Vue SFC, or PNG
- Strip Devicon brand colours at small sizes, preserve them on download
- Keyboard-first: / to search, Esc to clear
- Deep-link with ?focus=icon-id to open a specific icon
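The ?focus=icon-id deep link can be read and written with the standard URL API; a sketch:

```typescript
// Read the focused icon id (if any) from a page URL.
function focusedIcon(url: string): string | null {
  return new URL(url).searchParams.get("focus");
}

// Produce a shareable URL that opens a specific icon.
function withFocus(url: string, iconId: string): string {
  const u = new URL(url);
  u.searchParams.set("focus", iconId);
  return u.toString();
}
```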
Static. Updated weekly. Free forever.
An ingestion workspace pulls every library from npm or GitHub, normalises into a single record shape, and writes a slim metadata index plus per-library SVG JSON chunks. Roughly five seconds for the full 70k corpus.
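The "single record shape" might look something like this; the field names are illustrative assumptions, not the real manifest schema:

```typescript
// Hypothetical normalised record: one shape regardless of upstream library.
type IconRecord = {
  id: string;        // e.g. "lucide/heart"
  library: string;
  name: string;
  style: "stroke" | "fill";
  license: string;   // preserved per upstream library
  version: string;   // tracks the upstream release
  svg: string;
};

function normalise(library: string, version: string, license: string,
                   name: string, svg: string): IconRecord {
  // Crude style detection for the sketch: stroke icons ship with fill="none".
  const style = svg.includes('fill="none"') ? "stroke" : "fill";
  return { id: `${library}/${name}`, library, name, style, license, version, svg };
}
```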
A GitHub Action re-runs the ingestion weekly. New icons land, deprecated ones drop, version numbers track upstream. The site rebuilds and redeploys on its own schedule.
No accounts, no API keys, no rate limits, no analytics on the public assets. The whole site is HTML, CSS, and a slim React bundle reading static JSON.
One detail page per icon. The sitemap is split across multiple files because the sitemap protocol caps each file at 50,000 URLs, fewer than the corpus. Open Graph card per library, generated at build.
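Splitting the sitemap under the 50,000-URL-per-file cap is a simple chunking step; a sketch:

```typescript
// The sitemap protocol allows at most 50,000 URLs per file.
const SITEMAP_LIMIT = 50_000;

// Split the full URL list into sitemap-sized files, referenced by an index.
function splitSitemap(urls: string[], limit = SITEMAP_LIMIT): string[][] {
  const files: string[][] = [];
  for (let i = 0; i < urls.length; i += limit) {
    files.push(urls.slice(i, i + limit));
  }
  return files;
}
```

At 69,460 icon pages this yields two sitemap files, one full and one partial.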
Coding assistants can finally see icons.
Claude Code, Claude Desktop, Cursor. Anything that speaks the Model Context Protocol gets a search and a fetch tool over the full corpus. Add it to your config once, then just ask.
{
  "mcpServers": {
    "sant-icons": {
      "command": "npx",
      "args": ["-y", "@santicons/mcp"]
    }
  }
}
Then ask: "find me a stroke arrow-right icon from lucide and put it in src/icons/". The assistant searches, fetches the SVG, and writes it where you asked. Manifest is bundled in the package, so it works offline.
A free tool can still be state of the art.
Static beats database
70k records, ~16 MB metadata, lazy-loaded SVG chunks. Sub-second search across the full corpus, no backend, no hosting bill, no DDoS surface.
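For scale: a plain linear scan over 70k small in-memory records is already comfortably sub-second in a browser, which is why no backend is needed. A sketch with assumed field names:

```typescript
type IconMeta = { id: string; name: string; tags: string[] };

// Case-insensitive substring match over names and tags across the index.
function searchIcons(index: IconMeta[], query: string): IconMeta[] {
  const q = query.toLowerCase();
  return index.filter(
    (m) => m.name.includes(q) || m.tags.some((t) => t.includes(q))
  );
}
```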
MCP is the DX layer
Sant has shipped two MCP servers in 2026 already. The protocol is the new way to give an AI assistant context that won't fit in a prompt.
Same data, three contexts
Web for browsing, MCP for agents, CLI for terminals. Build the ingestion once, format the output for the surface. One source of truth.
69,460 icons.
One search bar.
Free, no signup, no paywall. Ship something with it.