Make default robots.txt #2886

Closed
opened 2025-11-02 04:52:38 -06:00 by GiteaMirror · 6 comments
Owner

Originally created by @theel0ja on GitHub (Feb 10, 2019).

My Gitea instance is seeing a ton of requests from Googlebot to `/user/login`, for example.

GiteaMirror added the issue/duplicate label 2025-11-02 04:52:38 -06:00

@lunny commented on GitHub (Feb 11, 2019):

see https://docs.gitea.io/en-us/customizing-gitea/
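For reference, the linked docs describe serving a custom robots.txt from Gitea's custom directory. A minimal sketch, assuming a custom directory at `./custom` (the real location depends on your install, e.g. `/var/lib/gitea/custom` or the `GITEA_CUSTOM` environment variable, and newer Gitea versions may expect `custom/public/robots.txt` instead):

```shell
# Sketch: give a Gitea instance a custom robots.txt via the custom directory.
# GITEA_CUSTOM below is an assumption; point it at your instance's custom dir.
GITEA_CUSTOM="${GITEA_CUSTOM:-./custom}"
mkdir -p "$GITEA_CUSTOM"

# Older Gitea serves custom/robots.txt at /robots.txt directly.
cat > "$GITEA_CUSTOM/robots.txt" <<'EOF'
User-agent: *
Disallow: /user/login
EOF
```

Gitea serves the file at `https://your.instance/robots.txt` after a restart.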


@lafriks commented on GitHub (Feb 11, 2019):

@lunny request was to provide one by default with recommended rules


@stale[bot] commented on GitHub (Apr 12, 2019):

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs during the next 2 weeks. Thank you for your contributions.


@karthanistyr commented on GitHub (May 29, 2019):

I naively deployed a vanilla Gitea Docker image on a subdomain without a robots.txt.
Bots then crawled a large number of `/archive` URLs, generating archive artifacts and consuming a lot of disk space.

I think a default robots.txt forbidding access to this sort of URL (and other sane defaults) would be really useful for ease of deployment.
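A sketch of what such a default could look like, assuming the problem paths reported in this thread (`/user/login` and repository `/archive` downloads); the rules are illustrative, not anything Gitea actually ships, and the `*` wildcard in paths is a crawler extension rather than part of the original robots.txt standard:

```
# Hypothetical default robots.txt for a Gitea instance
User-agent: *
# login page hammered by Googlebot (see opening comment)
Disallow: /user/login
# on-the-fly archive downloads, e.g. /{owner}/{repo}/archive/{ref}.zip
Disallow: /*/*/archive/
```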


@Mikaela commented on GitHub (Oct 14, 2019):

Is this a duplicate of https://github.com/go-gitea/gitea/issues/705 or the other way?


@lunny commented on GitHub (Oct 14, 2019):

@Mikaela you are right. Let's move to #705 to discuss further.


Reference: github-starred/gitea#2886