A Robots.txt middleware for ASP.NET Core. Why is this needed, you ask? Because if you need to add dynamic values (such as a URL configured in your CMS), you'll need some code to generate the file, and this middleware makes that easy.
PM> Install-Package RobotsTxtCore
> dotnet add package RobotsTxtCore
https://www.nuget.org/packages/RobotsTxtCore/
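Once the package is installed, the middleware is registered on the request pipeline with UseRobotsTxt. A minimal wiring sketch, assuming ASP.NET Core's minimal hosting model (the rule configuration shown in the next example goes inside the callback):

var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

// Intercepts requests to /robots.txt and serves the generated file.
app.UseRobotsTxt(robots =>
    robots.AddSitemap("https://example.com/sitemap.xml")
);

app.Run();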
The fluent interface makes it really easy to specify multiple rules:
app.UseRobotsTxt(builder =>
    builder
        .AddSection(section =>
            section
                .AddComment("Allow Googlebot")
                .AddUserAgent("Googlebot")
                .Allow("/")
        )
        .AddSection(section =>
            section
                .AddComment("Disallow the rest")
                .AddUserAgent("*")
                .AddCrawlDelay(TimeSpan.FromSeconds(10))
                .Disallow("/")
        )
        .AddSitemap("https://example.com/sitemap.xml")
);
Output
# Allow Googlebot
User-agent: Googlebot
Allow: /
# Disallow the rest
User-agent: *
Disallow: /
Crawl-delay: 10
Sitemap: https://example.com/sitemap.xml
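Because the rules are built in code, nothing has to be hard-coded. A sketch of the dynamic-values scenario from the introduction, assuming the minimal hosting model's app.Configuration and a hypothetical Seo:SitemapUrl configuration key (the key name is an illustration, not part of RobotsTxtCore):

// "Seo:SitemapUrl" is a made-up configuration key for illustration,
// e.g. populated from your CMS settings or appsettings.json.
var sitemapUrl = app.Configuration["Seo:SitemapUrl"]
    ?? "https://example.com/sitemap.xml";

app.UseRobotsTxt(robots =>
    robots
        .AddSection(section =>
            section
                .AddUserAgent("*")
                .Allow("/")
        )
        .AddSitemap(sitemapUrl)
);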
Or if you just want to deny everyone:
app.UseRobotsTxt(builder =>
    builder
        .DenyAll()
);
Output
User-agent: *
Disallow: /
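DenyAll is also a convenient guard for keeping crawlers out of non-production deployments. A sketch, assuming ASP.NET Core's standard app.Environment.IsProduction() check (the environment logic is not something RobotsTxtCore provides):

if (app.Environment.IsProduction())
{
    // Serve the real rules in production.
    app.UseRobotsTxt(robots =>
        robots.AddSection(section =>
            section
                .AddUserAgent("*")
                .Allow("/")
        )
    );
}
else
{
    // Keep crawlers out of staging and test environments.
    app.UseRobotsTxt(robots => robots.DenyAll());
}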