The robots.txt file is the face of your website for web crawlers, and it lives in the root directory of your site. It tells web bots and search engines (like Google) what to crawl and what not to crawl on your website.
You should have a robots.txt file if you want to block certain folders or files from web crawlers, or if you want search engines to crawl your sitemap.
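As a quick illustration, here is a minimal robots.txt sketch that blocks a hypothetical /private/ folder for all crawlers and points them to an assumed sitemap location (the folder name and sitemap URL are placeholders, not values from your site):

    User-agent: *
    Disallow: /private/

    Sitemap: https://www.example.com/sitemap.xml

The User-agent line says which crawlers the rules apply to (* means all of them), Disallow lists what they should skip, and the Sitemap line tells search engines where to find your sitemap.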
What happens if you don’t have a robots.txt file on your domain?
Google says that if your domain or subdomain is missing a robots.txt file, web bots assume that they can crawl all the content on your website (see the message below).
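You can see this behaviour from the crawler's side with Python's urllib.robotparser: when a site's /robots.txt returns a 404, the parser treats every URL as allowed. The sketch below assumes example.com has no robots.txt; swap in any domain you want to test.

    from urllib.robotparser import RobotFileParser

    # Point the parser at the (assumed missing) robots.txt of example.com.
    parser = RobotFileParser()
    parser.set_url("https://www.example.com/robots.txt")
    parser.read()  # a 404 here makes the parser treat everything as allowed

    # With no robots.txt found, any user agent may fetch any path.
    print(parser.can_fetch("*", "https://www.example.com/any-page"))  # True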