Robots.txt Codes
Allow User Agents
Search engines crawl the web with programs known as robots, crawlers, or spiders. These robots follow a protocol while doing their work: before crawling, they check each website for instructions or commands left by webmasters.
One important set of instructions is saved in a text file named robots.txt, which tells a robot (or user-agent) whether it has permission to crawl and how deep it may go into the website's files and folders.
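For example, a minimal robots.txt might look like the following sketch (the folder name and bot name here are hypothetical, chosen only to illustrate the directives):

```text
# Allow all user-agents to crawl everything except the /private/ folder
User-agent: *
Disallow: /private/

# Block one specific crawler entirely (hypothetical bot name)
User-agent: ExampleBot
Disallow: /
```

Each `User-agent` line starts a group of rules, and the `Disallow` lines beneath it list the paths that group of robots should not crawl. An empty `Disallow:` value, or omitting the line, permits full access.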
A line of code can also be added to your robots.txt file to tell search engine spiders that you have a sitemap and where it is located.
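The sitemap reference is a single directive. Assuming the sitemap lives at the site root (the domain here is a placeholder), it would look like:

```text
Sitemap: https://www.example.com/sitemap.xml
```

The `Sitemap` directive takes a full absolute URL, can appear anywhere in the file, and may be repeated if the site has more than one sitemap.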