
#GetFoundGiveHope: Give Search Engines Permission to Read Your Website

Jill Van Nostran | Digital Strategy

#GetFoundGiveHope Tip: It may seem obvious, but you’d be surprised at how many organizations don’t realize their websites are blocking search engines.

Search engine spiders must be able to crawl your site in order for your content to appear in search results and for your site to rank. A /robots.txt file tells search engine spiders (like Googlebot) which pages they are allowed to crawl and which they should ignore.
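For reference, here is a minimal sketch of a typical /robots.txt (the /admin/ path and the sitemap URL are placeholders, not part of any real site):

User-agent: *
Disallow: /admin/
Sitemap: https://www.example.org/sitemap.xml

The User-agent: * line means the rules apply to every spider, and the single Disallow line keeps only the admin area out of search results while everything else stays crawlable.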

Check your /robots.txt file by visiting yourdomain.com/robots.txt in a browser. If you see:
Disallow: /
your /robots.txt file is blocking search engine spiders from your entire site. Because of this, not only is your site not ranking, but you are likely missing out on a lot of website traffic.
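By contrast, a /robots.txt that gives every spider permission to crawl the whole site needs only two lines (an empty Disallow rule blocks nothing):

User-agent: *
Disallow: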

Call your web host and ask them to help you correct this critical error. Once you've created (or fixed) your /robots.txt file, submit it to Google via Search Console (you'll need a Search Console account first). Open Search Console and go to Crawl → robots.txt Tester → Submit, which opens a dialog box with instructions for submitting the updated file.
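If you'd like to double-check the fix yourself, Python's standard library can read a /robots.txt file the same way a spider does. The sketch below is just an illustration, with www.example.org standing in for your own domain:

from urllib.robotparser import RobotFileParser

# Point the parser at your live robots.txt (placeholder domain).
parser = RobotFileParser()
parser.set_url("https://www.example.org/robots.txt")
parser.read()

# True means Googlebot may crawl your homepage; False means it is blocked.
print(parser.can_fetch("Googlebot", "https://www.example.org/"))

If this still prints False after your host says the file is fixed, the old file may be cached somewhere along the way, so load the live /robots.txt URL in your browser again to confirm what spiders actually see.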