Automate sitemap, robots, and AI-ready llms.txt files with XperienceCommunity.SEO
If you’ve ever set up robots.txt, sitemap.xml, or any other SEO configuration file in an Xperience by Kentico project, you know how quickly things get repetitive.
Every project needs them, but each implementation ends up slightly different: hard-coded, manual, and often forgotten until a crawler issue pops up.
To make life easier (and cleaner), our team built XperienceCommunity.SEO: an open-source NuGet package that handles these essentials automatically.
It centralizes SEO infrastructure and introduces something new for 2025: llms.txt, a file designed for AI and LLM crawlers.
Problem Statement
Developers spend too much time managing boilerplate SEO setup:
- Static robots.txt files that don’t reflect environment changes
- Sitemaps that don’t update when content is published or localized
- No built-in way to prepare for AI-based discovery (think ChatGPT, Gemini, or Copilot)
As search shifts toward generative engines and large language models, we need a way to make Kentico sites machine-readable and AI-ready, without reinventing the wheel each time.
Solution Overview
The XperienceCommunity.SEO package automates all of this. It dynamically generates robots.txt, sitemap.xml, and llms.txt endpoints using Kentico’s content and caching system.
Once installed and configured, your site automatically exposes:
- /robots.txt: defines crawler access rules
- /sitemap.xml: lists and updates site pages
- /llms.txt: helps AI models understand and index your site’s structure
Everything stays synced with your content, cached efficiently, and fully customizable for your project setup.
Feature Highlights
| Feature | What it does |
|---|---|
| Dynamic Content Discovery | Automatically finds and includes pages, articles, and landing pages from Kentico. |
| Kentico Cache Dependency | Keeps generated files fresh without sacrificing performance. |
| Custom Sitemap Config | Easily define what to include, how often to refresh, and what URLs to expose. |
| AI-Ready llms.txt | Introduces discoverability for AI agents and large language models. |
| Flexible API Options | Works with controllers, minimal APIs, or route attributes; your choice. |
| Open Source | Free to use, extend, or adapt for any Xperience by Kentico project. |
These features are especially helpful when working with content hub assets or custom structured data: instead of hand-rolling defensive boilerplate, you get safe, maintainable endpoints with minimal code.
Technical Implementation
Installation
Install the NuGet package:
```shell
dotnet add package XperienceCommunity.SEO
```
Configuration
Register the SEO services in your Program.cs:
```csharp
using XperienceCommunity.SEO;

// Register the SEO services with configuration
builder.Services.AddXperienceCommunitySEO(options =>
{
    options.ReusableSchemaName = "PageMetadata"; // Your reusable schema name
    options.DefaultLanguage = "en";
    options.DescriptionFieldName = "MetaDescription";
    options.TitleFieldName = "MetaTitle";
    options.SitemapShowFieldName = "ShowInSitemap"; // Optional field
    options.ContentTypeDependencies = new[] { "BlogPost", "Article", "LandingPage" };
});
```
Usage Examples
1. Basic Controller Example
```csharp
using Microsoft.AspNetCore.Mvc;
using XperienceCommunity.SEO.Services;

[ApiController]
public class SEOController : ControllerBase
{
    private readonly IWebsiteDiscoveryProvider _websiteDiscoveryProvider;

    public SEOController(IWebsiteDiscoveryProvider websiteDiscoveryProvider)
    {
        _websiteDiscoveryProvider = websiteDiscoveryProvider;
    }

    // Generates sitemap.xml at /sitemap.xml
    [HttpGet("/sitemap.xml")]
    [ResponseCache(Duration = 3600)] // Cache for 1 hour
    public async Task<ActionResult> GetSitemap()
    {
        return await _websiteDiscoveryProvider.GenerateSitemap();
    }

    // Generates llms.txt at /llms.txt
    [HttpGet("/llms.txt")]
    [ResponseCache(Duration = 3600)] // Cache for 1 hour
    public async Task<ActionResult> GetLlmsTxt()
    {
        return await _websiteDiscoveryProvider.GenerateLlmsTxt();
    }

    // Generates robots.txt at /robots.txt
    [HttpGet("/robots.txt")]
    [ResponseCache(Duration = 86400)] // Cache for 24 hours
    public ActionResult GetRobotsTxt()
    {
        return _websiteDiscoveryProvider.GenerateRobotsTxt();
    }
}
```
2. Using Minimal APIs
```csharp
app.MapGet("/sitemap.xml", async (IWebsiteDiscoveryProvider provider, HttpContext context) =>
{
    var actionResult = await provider.GenerateSitemap();
    await actionResult.ExecuteResultAsync(new ActionContext { HttpContext = context });
});

app.MapGet("/llms.txt", async (IWebsiteDiscoveryProvider provider, HttpContext context) =>
{
    var actionResult = await provider.GenerateLlmsTxt();
    await actionResult.ExecuteResultAsync(new ActionContext { HttpContext = context });
});

app.MapGet("/robots.txt", (IWebsiteDiscoveryProvider provider, HttpContext context) =>
{
    var robotsContent = provider.GenerateRobotsTxt();
    return Results.Content(robotsContent, "text/plain; charset=utf-8");
});
```
3. Using Route Attributes
```csharp
[Route("seo")]
public class SEOController : ControllerBase
{
    private readonly IWebsiteDiscoveryProvider _provider;

    public SEOController(IWebsiteDiscoveryProvider provider)
    {
        _provider = provider;
    }

    // A leading "/" makes the route root-relative, overriding the controller's "seo" prefix
    [HttpGet("/sitemap.xml")]
    public async Task<ActionResult> Sitemap() => await _provider.GenerateSitemap();

    [HttpGet("/llms.txt")]
    public async Task<ActionResult> LlmsTxt() => await _provider.GenerateLlmsTxt();

    [HttpGet("/robots.txt")]
    public ActionResult RobotsTxt() => _provider.GenerateRobotsTxt();
}
```
Configuration for robots.txt
Add this to your appsettings.json:
```json
{
  "XperienceCommunitySEO": {
    "RobotsContent": "User-agent: Twitterbot\nDisallow:\n\nUser-agent: SiteAuditBot\nAllow: /\n\nUser-agent: *\nDisallow: /"
  }
}
```
For production environments, you can simplify it as:
```json
{
  "XperienceCommunitySEO": {
    "RobotsContent": "User-agent: *\nAllow: /"
  }
}
```
Customize your robots.txt directives as needed to align with your site’s indexing strategy.
Expected Output
robots.txt (Non-production)
```
User-agent: Twitterbot
Disallow:

User-agent: SiteAuditBot
Allow: /

User-agent: *
Disallow: /
```
robots.txt (Production)
```
User-agent: *
Allow: /
```
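As a quick sanity check, you can confirm these rules behave as intended with Python’s standard-library robots.txt parser. This is just a verification sketch against the non-production rules shown above, not part of the package itself:

```python
from urllib.robotparser import RobotFileParser

# Non-production rules from the example above
non_prod_rules = """User-agent: Twitterbot
Disallow:

User-agent: SiteAuditBot
Allow: /

User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(non_prod_rules.splitlines())

# Twitterbot and SiteAuditBot may crawl; every other agent is blocked
print(parser.can_fetch("Twitterbot", "/about"))    # True (an empty Disallow allows everything)
print(parser.can_fetch("SiteAuditBot", "/about"))  # True
print(parser.can_fetch("Googlebot", "/about"))     # False (falls through to "User-agent: *")
```

This is an easy thing to get wrong, so a check like this is worth running whenever you change the `RobotsContent` string.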
sitemap.xml
```xml
<?xml version="1.0" encoding="utf-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yoursite.com/about</loc>
    <lastmod>2025-10-03</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>https://yoursite.com/blog/article</loc>
    <lastmod>2025-10-03</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```
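Downstream tooling (link checkers, audits) can consume this output directly. Here is a small sketch using Python’s standard library, with the sample XML above inlined, that extracts the page URLs; note that sitemap elements live in the sitemaps.org namespace:

```python
import xml.etree.ElementTree as ET

sitemap_xml = """<?xml version="1.0" encoding="utf-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yoursite.com/about</loc>
    <lastmod>2025-10-03</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>https://yoursite.com/blog/article</loc>
    <lastmod>2025-10-03</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>"""

# All sitemap elements are qualified with the sitemaps.org namespace
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(sitemap_xml)
urls = [url.findtext("sm:loc", namespaces=ns) for url in root.findall("sm:url", ns)]
print(urls)  # ['https://yoursite.com/about', 'https://yoursite.com/blog/article']
```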
llms.txt
```
# YourWebsiteName

## Pages

- [About Us](https://yoursite.com/about): Learn about our company and mission
- [Blog Article](https://yoursite.com/blog/article): Comprehensive guide to SEO
- [Contact](https://yoursite.com/contact): Get in touch with our team
```
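Because llms.txt is plain Markdown, it is also easy to process programmatically. A short sketch (Python, parsing the sample file above) that pulls each page’s title, URL, and description out of the link list:

```python
import re

llms_txt = """# YourWebsiteName

## Pages

- [About Us](https://yoursite.com/about): Learn about our company and mission
- [Blog Article](https://yoursite.com/blog/article): Comprehensive guide to SEO
- [Contact](https://yoursite.com/contact): Get in touch with our team
"""

# Each entry follows the pattern "- [title](url): description"
pattern = re.compile(
    r"^- \[(?P<title>[^\]]+)\]\((?P<url>[^)]+)\): (?P<description>.+)$",
    re.MULTILINE,
)
pages = [m.groupdict() for m in pattern.finditer(llms_txt)]
for page in pages:
    print(page["title"], "->", page["url"])
```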
Advanced Usage - Custom Sitemap Generation
The IWebsiteDiscoveryProvider service exposes public methods that allow you to retrieve sitemap data and create custom implementations:
Available Methods
- GetSitemapPages() - Returns a list of SitemapNode objects for generating XML sitemaps
- GetSitemapPagesWithDetails() - Returns a list of SitemapPage objects with additional metadata like titles and descriptions
```csharp
[ApiController]
public class CustomSEOController : ControllerBase
{
    private readonly IWebsiteDiscoveryProvider _provider;

    public CustomSEOController(IWebsiteDiscoveryProvider provider)
    {
        _provider = provider;
    }

    [HttpGet("/custom-sitemap.xml")]
    public async Task<ActionResult> GetCustomSitemap()
    {
        // Get the basic sitemap nodes
        var sitemapNodes = await _provider.GetSitemapPages();

        // Customize the nodes (e.g., add custom change frequency, priority, etc.)
        foreach (var node in sitemapNodes)
        {
            if (node.Url.Contains("/blog/"))
            {
                node.ChangeFrequency = ChangeFrequency.Daily;
                node.Priority = 0.8;
            }
            else if (node.Url.Contains("/news/"))
            {
                node.ChangeFrequency = ChangeFrequency.Hourly;
                node.Priority = 0.9;
            }
        }

        // Generate custom sitemap XML
        return new SitemapProvider().CreateSitemap(new SitemapModel(sitemapNodes));
    }

    [HttpGet("/pages-with-metadata.json")]
    public async Task<ActionResult> GetPagesWithMetadata()
    {
        // Get detailed page information including titles and descriptions
        var pagesWithDetails = await _provider.GetSitemapPagesWithDetails();

        // Transform or filter the data as needed
        var customData = pagesWithDetails.Select(page => new
        {
            Url = page.SystemFields.WebPageUrlPath,
            Title = page.Title,
            Description = page.Description,
            LastModified = DateTime.Now
        });

        return Ok(customData);
    }
}
```
Data Models
SitemapNode contains:
- Url - The page URL path
- LastModificationDate - When the page was last modified
- ChangeFrequency - How often the page changes
- Priority - Page priority (0.0 to 1.0)
SitemapPage contains:
- SystemFields - System information about the web page
- Title - The page title from your configured title field
- Description - The page description from your configured description field
- IsInSitemap - Whether the page should be included in sitemaps
The XperienceCommunity.SEO package reflects what sets [A] apart: our developers don’t just build within Kentico; they extend it.
This plug-in simplifies one of the most overlooked yet essential aspects of SEO and future-proofs Xperience sites for AI-driven discovery.
At [A], our team of certified Kentico developers and content engineers helps clients go beyond implementation, designing scalable, intelligent content systems that grow with your business.
💡 Want to make your site AI-ready?
Let’s talk about how [A] can help you optimize your Kentico or CMS implementation.
Thank you for reading!
Follow me on social:
- GitHub @vhugogarcia
- LinkedIn victorhugogarcia