#[WHITELIST-ROBOTS.TXT VERSION 1.0]
#[MAINTAINED AT: https://git.qwik.space/Left4Code/robots.txt] GET A COPY OR REPORT ISSUES THERE.
#_________________________________________________________________________
#Lots of AI companies and scrapers seem to use alternative, non-published user-agents; only some get caught and shamed. Consider using some form of AI blocking if possible, like Go-away or Anubis. For static sites hosted with Neocities, Nekoweb, etc., this file is another option.
#___________________________________________________

#Whitelisted crawlers: an empty Disallow value lets these agents crawl everything.
User-agent: WibyBot
User-agent: search.marginalia.nu
Disallow:

#Every other agent is blocked from the entire site ("Disallow: /" is the standard prefix form; "Disallow: *" is a wildcard variant some crawlers match).
User-agent: *
Disallow: /
Disallow: *
#Non-standard AI-training opt-out directives; only honored by crawlers that implement them.
DisallowAITraining: /
DisallowAITraining: *
Content-Usage: ai=n

Sitemap: !REPLACE WITH SITEMAP LINK!
#___________________________________________________
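
#EXAMPLE: a minimal sketch of extending the whitelist, assuming you also want to allow DuckDuckGo's crawler (user-agent token DuckDuckBot); swap in any crawler you trust. Uncomment the two lines below and move them into the whitelist group at the top of this file.
#User-agent: DuckDuckBot
#Disallow: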