Jupiter provides structured documentation files purpose-built for LLMs and AI agents, following the llmstxt.org standard.

llms.txt

llms.txt is a plain Markdown file containing a structured index of all Jupiter documentation, with a title and description for each page:
  • Site title as an H1 heading
  • Sections for each major area (Docs, API References, Guides, Tool Kits, etc.)
  • Links with descriptions for every page

Open the llms.txt for Jupiter docs

```
# Jupiter

## Docs

- [Build with Jupiter](https://dev.jup.ag/index.md)
- [Get Started](https://dev.jup.ag/get-started/index.md): Welcome to Jupiter's Developer Docs...
- [Overview](https://dev.jup.ag/docs/ultra/index.md): Overview of Ultra Swap and its features.
- [API Reference](https://dev.jup.ag/api-reference/index.md): Overview of Jupiter API Reference
```

This is an ideal entrypoint for LLMs and AI agents: it lets them locate relevant content at a high level without processing the entire documentation set.
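
Because the index is plain Markdown with a predictable link format, an agent can turn it into structured entries with a few lines of code. A minimal sketch (the regex and the sample text are illustrative, not an official parser):

```python
import re

# Match llms.txt index lines of the form:
#   "- [Title](URL)" or "- [Title](URL): description"
LINK_RE = re.compile(r"^- \[(?P<title>[^\]]+)\]\((?P<url>[^)]+)\)(?:: (?P<desc>.*))?$")

def parse_llms_index(text: str) -> list[dict]:
    """Extract (title, url, description) entries from an llms.txt index."""
    entries = []
    for line in text.splitlines():
        m = LINK_RE.match(line.strip())
        if m:
            entries.append({
                "title": m.group("title"),
                "url": m.group("url"),
                "description": m.group("desc") or "",
            })
    return entries

# Sample lines taken from the index shown above.
sample = """# Jupiter

## Docs

- [Build with Jupiter](https://dev.jup.ag/index.md)
- [Overview](https://dev.jup.ag/docs/ultra/index.md): Overview of Ultra Swap and its features.
"""

entries = parse_llms_index(sample)
```

An agent could then rank the extracted descriptions against a user query and fetch only the matching pages.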

llms-full.txt

Open the llms-full.txt for Jupiter docs
While llms.txt provides a concise index, llms-full.txt contains the entire documentation site as context — including every description, code example, and parameter detail. Use llms-full.txt when:
  • Your AI tool needs complete, granular context for deep indexing
  • You’re building custom RAG pipelines over Jupiter docs
  • You want every code example available for reference
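
For a custom RAG pipeline, a common first step is splitting llms-full.txt into heading-delimited chunks before embedding. A minimal sketch, assuming the dump uses standard Markdown headings (the splitting rule is illustrative and does not handle heading-like lines inside fenced code blocks):

```python
def chunk_by_heading(markdown: str, level: int = 2) -> list[tuple[str, str]]:
    """Split a Markdown document into (heading, body) chunks at the
    given heading level (e.g. level=2 splits on '## ' lines)."""
    marker = "#" * level + " "
    chunks = []
    heading, body = "(preamble)", []
    for line in markdown.splitlines():
        if line.startswith(marker):
            # Close out the previous chunk before starting a new one.
            if body or heading != "(preamble)":
                chunks.append((heading, "\n".join(body).strip()))
            heading, body = line[len(marker):].strip(), []
        else:
            body.append(line)
    chunks.append((heading, "\n".join(body).strip()))
    return chunks

# Tiny stand-in for a full-docs dump.
doc = "# Jupiter\n\n## Docs\n\nIndex of docs.\n\n## API Reference\n\nEndpoints."
chunks = chunk_by_heading(doc, level=2)
```

Each chunk can then be embedded and indexed separately, so retrieval returns the page section rather than the whole file.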

Markdown Export

Any documentation page can be accessed as raw Markdown, making it easy for AI agents to consume individual pages programmatically.

Method 1: Append .md to the URL

```
curl https://dev.jup.ag/docs/ultra.md
curl https://dev.jup.ag/docs/ultra/get-started.md
```

Method 2: Use the Accept header

```
curl -H "Accept: text/markdown" https://dev.jup.ag/docs/ultra
```

Both methods return the page content as text/markdown.
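
Both curl methods map directly onto any HTTP client. A sketch using Python's standard-library urllib that only constructs the requests (actually fetching them requires network access; the URL is the one from the examples above):

```python
from urllib.request import Request

def md_url(page_url: str) -> str:
    # Method 1: append ".md" to the page URL.
    return page_url.rstrip("/") + ".md"

def md_request(page_url: str) -> Request:
    # Method 2: request the original URL with an Accept header.
    return Request(page_url, headers={"Accept": "text/markdown"})

url = "https://dev.jup.ag/docs/ultra"
req = md_request(url)
```

Passing `req` to `urllib.request.urlopen` would then perform the content negotiation and return the page as Markdown.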

How It’s Generated

Jupiter’s llms.txt is auto-generated from the documentation structure. Every page includes an llmsDescription field in its frontmatter — a description specifically optimized for LLM consumption, separate from the human-readable description. This ensures AI tools get the most relevant context for each page.
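
A page's frontmatter might then look like the following; only the llmsDescription key is confirmed above, and the surrounding fields and values are hypothetical:

```yaml
---
# Hypothetical page frontmatter; only the llmsDescription
# field is named in the docs above.
title: Ultra Swap
description: Human-readable page description.
llmsDescription: LLM-optimized summary of this page's content.
---
```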