The robots.txt file plays an important role in the success of your website. It lets you allow or deny search engine bots access to certain files and folders on your site, blocking them from crawling pages you do not want indexed. Because of its importance, I will show you how to properly optimize your robots.txt file to improve your SEO. Please note that this post is for those running the Yoast SEO plugin on their website.
Optimizing your website's robots.txt file involves tweaking it to control what search engines can and cannot index. Done properly, it can improve your site's traffic; done poorly, it can block important pages and reduce traffic, which is bad for your website. To optimize your WordPress site's robots.txt file to improve SEO, follow the steps below.
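As a quick illustration of the syntax, a robots.txt rule names a user agent and then lists the paths that agent may or may not crawl. The folder and file names below are only placeholders, not recommendations for your site:

```
# Apply these rules to all crawlers
User-agent: *
# Block crawling of an example private folder
Disallow: /example-private-folder/
# But still allow one example page inside it
Allow: /example-private-folder/public-page.html
```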
How To Optimize Your WordPress Robots.txt File To Improve Your Website's SEO
One of the easiest ways to optimize your WordPress site's robots.txt file is with the Yoast SEO plugin. The plugin includes a Tools section that lets you create and edit your robots.txt file. To do this, go to the Yoast SEO plugin >> Tools >> File editor. A new page will open, and you will see the default WordPress robots.txt file there.
To optimize, simply copy the code below as it is and paste it into the robots.txt box. After doing that, change the www.yourwebsite.com URL to your own.
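The original snippet is not reproduced in this post, but a commonly used starting point for a WordPress robots.txt file looks like the following. This is an example, not the author's exact code; the sitemap filename assumes Yoast's default sitemap, and you should adjust the rules to suit your site:

```
# Apply these rules to all crawlers
User-agent: *
# Keep bots out of the WordPress admin area
Disallow: /wp-admin/
# Allow the AJAX endpoint, which some themes and plugins rely on
Allow: /wp-admin/admin-ajax.php

# Point crawlers to your sitemap (replace the domain with yours)
Sitemap: https://www.yourwebsite.com/sitemap_index.xml
```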
The next step is to submit it to Google using Google Search Console. Go to Google Search Console, click on Crawl, and from the drop-down menu click on robots.txt Tester. Copy and paste the same code into the box, then click Submit to save it.
I hope this was helpful. Kindly share it with your friends.