# Overview: How the AI Ecosystem Fits Together
The Ledgerline AI ecosystem is the set of tools, conventions, and automations we use so that both humans and AI assistants can work effectively on the same codebase. The goal isn’t to replace developers—it’s to make the “project brain” explicit and reusable.
## The project brain
Every project has an implicit “brain”: how we name things, where we put code, how we run tests, how we deploy, and what we consider “done.” When that brain lives only in people’s heads, AI assistants (and new contributors) keep guessing. Our approach is to make as much of it as possible explicit:
- Cursor rules (`.cursor/rules/*.mdc`) encode constraints: ports vs adapters, no magic strings, service contracts, doc contracts, Linear workflow, GitHub token usage, etc. The assistant is instructed to follow these before writing code.
- Cursor skills (`.cursor/skills/*.md`) encode procedures: “when you need to do X, do steps 1–2–3.” They’re reusable playbooks for documentation, release notes, and validation.
- Cursor commands (`.cursor/commands/*.md`) give one-shot workflows (e.g. workon, open PR, create issue) so common tasks don’t have to be re-explained.
- MCP connects the editor to external systems (e.g. Render) so the assistant can reason about deployments and logs without leaving the IDE.
- Husky + pre-commit hooks run typecheck, tests, contract coverage, E2E, and Semgrep, so that neither human nor AI can commit broken or unsafe code without deliberately bypassing the hooks.
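As an illustration, a rule file is just a Markdown file with frontmatter. The rule name, description, and globs below are hypothetical, not taken from this repo:

```markdown
---
description: Enforce ports-and-adapters boundaries and ban magic strings
globs: ["src/**/*.ts"]
alwaysApply: false
---

- Domain code must not import directly from adapter or infrastructure directories.
- No magic strings: reference named constants or typed config values instead.
```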
Together, rules + skills + commands + MCP + quality gates form a single, documented workflow. When we add a new skill or rule, we document it here and (via the generator) keep the Reference in sync so the ecosystem stays discoverable.
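The quality gates above are typically wired in through a pre-commit hook. A minimal sketch (the script names are assumptions; the real hook may chain different `package.json` scripts):

```shell
#!/bin/sh
# .husky/pre-commit (illustrative sketch; actual script names may differ)
pnpm typecheck \
  && pnpm test \
  && pnpm test:contract-coverage \
  && pnpm test:e2e \
  && pnpm semgrep
```

Because the commands are chained with `&&`, the commit is rejected as soon as any gate fails.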
## Who this is for
- Developers working in this repo: you get one place to understand how Cursor is configured and how to extend it.
- AI enthusiasts: you get a real-world example of “AI-native” setup—not theory, but a working stack you can adapt.
- Future us: when we forget why we chose a rule or a skill, this section and the reference are the source of truth.
## How to extend it
When you add a new Cursor skill, rule, command, or MCP server:
- Add the file in the right place (`.cursor/skills/`, `.cursor/rules/`, `.cursor/commands/`, or the MCP config).
- Run `pnpm docs:ai-ecosystem:generate` to refresh the Reference index.
- If the new addition is significant, add a short narrative in the relevant doc (e.g. Cursor setup, MCP) so the “why” is clear.
The generator ensures the inventory is always up to date; the authored pages explain the rationale and the story.
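To make the generator’s job concrete, here is a minimal sketch of the kind of indexing it performs. The function name, path layout, and output shape are assumptions for illustration, not the actual implementation (which also reads titles and descriptions from each file):

```typescript
// Sketch: map .cursor/* file paths to entries for a generated Reference index.
type Kind = "skill" | "rule" | "command";
interface Entry { kind: Kind; name: string; }

const KIND_BY_DIR: Record<string, Kind> = {
  skills: "skill",
  rules: "rule",
  commands: "command",
};

// Keep only paths matching .cursor/{skills,rules,commands}/<name>.{md,mdc}
// and turn each into a typed index entry.
export function indexEcosystemFiles(paths: string[]): Entry[] {
  return paths.flatMap((p) => {
    const m = p.match(/^\.cursor\/(skills|rules|commands)\/([^/]+)\.(?:md|mdc)$/);
    return m ? [{ kind: KIND_BY_DIR[m[1]], name: m[2] }] : [];
  });
}
```

For example, `indexEcosystemFiles([".cursor/skills/release-notes.md"])` yields `[{ kind: "skill", name: "release-notes" }]`, while unrelated paths are silently skipped.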