Google’s John Mueller downplayed the usefulness of LLMs.txt, comparing it to the keywords meta tag: AI bots aren’t currently checking for the file, and it creates the potential for cloaking, since the file can describe content that differs from what the site actually serves.
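For context, here is a minimal sketch of what an llms.txt file looks like under the llmstxt.org proposal; the site name, URLs, and descriptions are hypothetical:

```
# Example Site

> A short summary of what the site offers, written for LLM consumption.

## Docs

- [Getting started](https://example.com/docs/start.md): Setup and installation guide
- [API reference](https://example.com/docs/api.md): Endpoint details and parameters
```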
The crawl-delay directive is an unofficial robots.txt directive that asks crawlers to slow down so they don’t overload the web server. However, support varies among search engines; Googlebot, for example, ignores it.
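For illustration, this is how the directive typically appears in a robots.txt file; the 10-second value and the catch-all user-agent are placeholder choices:

```
# Ask compliant crawlers to wait 10 seconds between requests
User-agent: *
Crawl-delay: 10
```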