#[WHITELIST-ROBOTS.TXT VERSION 1.0]
#[MAINTAINED AT: https://git.qwik.space/Left4Code/robots.txt] GET A COPY OR REPORT ISSUES THERE.

#_________________________________________________________________________
# Lots of AI companies and scrapers seem to use alternative, unpublished user-agents, and only some get caught and shamed.
# If possible, consider some form of AI blocking such as go-away or Anubis.
# For static sites hosted on Neocities, Nekoweb, etc., this file is another option.

#___________________________________________________
User-agent: WibyBot
User-agent: search.marginalia.nu
Disallow:

User-agent: *
Disallow: /
Disallow: *
DisallowAITraining: /
DisallowAITraining: *
Content-Usage: ai=n

sitemap: !REPLACE WITH SITEMAP LINK!

#___________________________________________________
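# Optional sanity check: the snippet below (kept commented so this file stays a
# plain robots.txt) is a minimal sketch of how a standard parser reads the
# whitelist-then-deny-all pattern above, assuming Python's urllib.robotparser.
# That parser ignores the non-standard DisallowAITraining and Content-Usage
# lines, and the example.com URL and GPTBot user-agent are placeholders.
#
#   from urllib.robotparser import RobotFileParser
#
#   rp = RobotFileParser()
#   with open("robots.txt") as f:
#       rp.parse(f.read().splitlines())  # feed this file's lines to the parser
#   # Whitelisted crawler: the empty "Disallow:" in its group allows everything.
#   print(rp.can_fetch("WibyBot", "https://example.com/page"))   # True
#   # Any other crawler falls under "User-agent: *" with "Disallow: /".
#   print(rp.can_fetch("GPTBot", "https://example.com/page"))    # False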