mirror of
https://github.com/Aroy-Art/Rinkusu.git
synced 2025-02-25 15:16:54 +01:00
Feature Request: Add Configurable robots.txt Support #9
Reference: Aroy/Rinkusu#9
I would like to request the addition of a robots.txt file with configurable settings that allow users to specify which bots to block directly from Hugo’s main configuration file. This feature will help users control search engine indexing behavior and prevent unwanted bots from crawling their site.
Rationale:
Adding a robots.txt file with configurable rules improves SEO control and security. Users can define which search engines can index their content and block unwanted bots, scrapers, or specific sections of their site. This feature is especially useful for sites that contain private content, staging environments, or areas that should not be indexed.
Use Cases:

- Blocking unwanted bots or scrapers from crawling the site.
- Disabling search engine indexing on staging or preview deployments.
- Excluding private or unfinished sections of a site from search results.
Implementation Ideas:

1. Default robots.txt Template:
   - Provide a default template (`layouts/_default/robots.txt`).
   - Allow customization based on site parameters in `config.yaml` or `config.toml`.
2. Configuration Options in `config.yaml`:
   - Example configuration:
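One possible shape for these options is sketched below. `enableRobotsTXT` is Hugo's existing root-level switch for building `robots.txt` from a template; the `params.robots` keys are illustrative suggestions, not an existing API:

```yaml
# Existing Hugo option: generate robots.txt from a template.
enableRobotsTXT: true

# Hypothetical theme parameters (naming is a suggestion only).
params:
  robots:
    allowAll: false          # when true, emit a fully permissive robots.txt
    disallowedBots:          # user agents to block entirely
      - GPTBot
      - CCBot
    disallowedPaths:         # paths hidden from all crawlers
      - /private/
      - /drafts/
```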
3. Dynamic robots.txt Generation:
   - Example template (`layouts/_default/robots.txt`):
4. Environment Variable Support (for Netlify, Vercel, etc.):
   - This is useful for staging environments where indexing should be disabled.
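For idea 3, a minimal template sketch, assuming the hypothetical site parameters `robots.allowAll`, `robots.disallowedBots`, and `robots.disallowedPaths` (adapt the names to whatever the theme settles on):

```go-html-template
User-agent: *
{{- if site.Params.robots.allowAll }}
Disallow:
{{- else }}
{{- range site.Params.robots.disallowedPaths }}
Disallow: {{ . }}
{{- end }}
{{- end }}

{{- range site.Params.robots.disallowedBots }}

User-agent: {{ . }}
Disallow: /
{{- end }}

Sitemap: {{ "sitemap.xml" | absURL }}
```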
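For idea 4, a sketch of environment detection in the same template. Hugo's `getenv` function can read build-time variables; `HUGO_ENV` is a common community convention (not a Hugo built-in), and Netlify sets `CONTEXT` to `production` on production deploys:

```go-html-template
{{/* Sketch: disallow all crawling unless this is a production build.
     Falls back to Netlify's CONTEXT when HUGO_ENV is unset. */}}
{{- $env := getenv "HUGO_ENV" | default (getenv "CONTEXT") -}}
{{- if ne $env "production" }}
User-agent: *
Disallow: /
{{- else }}
User-agent: *
Disallow:
{{- end }}
```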
Documentation:

Document the new configuration options and the default robots.txt behavior in the theme's documentation.
Additional Context:
Many Hugo users deploy their sites on platforms like Netlify, Vercel, or GitHub Pages, where controlling search engine indexing is crucial. Having a built-in, configurable robots.txt ensures better SEO and security while keeping configuration simple and manageable.
Need to clean up the formatting of this request

Fixed markdown formatting for the feature request