Robots.txt Generation
The robots.txt file tells search engine crawlers which pages they can access. When enabled, Reverb 2.0 copies a template robots.txt file to your output directory.

Default robots.txt Content
The default template allows all crawlers to access all content:
User-agent: *
Allow: /
Sitemap: https://docs.example.com/help/sitemap.xml

Customizing robots.txt
To customize the robots.txt file, create a Format Override:
- In ePublisher Designer, expand your Reverb 2.0 target
- Navigate to Format > Pages > robots.txt
- Right-click and select Create Override
- Edit the file with your custom rules (see the example below)
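For example, if your output contains a directory of draft topics that you do not want indexed, your override might look like the sketch below. The /drafts/ path is a placeholder for illustration; substitute the paths and sitemap URL that apply to your own deployment:

User-agent: *
Disallow: /drafts/
Allow: /

Sitemap: https://docs.example.com/help/sitemap.xml

After you regenerate the target, the robots.txt copied to your output directory contains your custom rules in place of the default template.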