Adding a custom robots.txt file to Blogger is easy if you want to enable this feature in your Blogger account. A robots.txt file guides Google's crawlers and indexing, but it must be configured correctly for your site, so follow the steps below carefully to enable a custom robots.txt file.
You can enable both a custom robots.txt file and custom robots header tags.
Step 1
After logging in to your Blogger website, open the "Crawler and indexing" section in the Dashboard, under Settings > Search Preferences.
Step 2
In the "Crawler and indexing" tab you will see three options, and you need to enable the two that are disabled. Choose "Yes", then click the "Edit" button and enter the code below. Replace the domain in "Sitemap: https://yourdomain/sitemap.xml" with your own site's domain, then copy the file contents and paste them into the box.

robots.txt file code:
User-agent: Mediapartners-Google
Disallow:
User-agent: *
Disallow: /search
Allow: /
Sitemap: https://utechi.blogspot.com/sitemap.xml
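If you want to sanity-check these rules before saving, one option is Python's built-in urllib.robotparser module, which applies robots.txt rules the same way a well-behaved crawler would. This is just an illustrative sketch; the post and label URLs below are made-up examples, not real pages on the blog.

```python
from urllib.robotparser import RobotFileParser

# The same robots.txt content shown above.
ROBOTS_TXT = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://utechi.blogspot.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Ordinary crawlers may fetch posts, but not /search (label/search pages).
print(parser.can_fetch("*", "https://utechi.blogspot.com/2020/01/post.html"))  # True
print(parser.can_fetch("*", "https://utechi.blogspot.com/search/label/tech"))  # False

# Mediapartners-Google (the AdSense crawler) is allowed everywhere,
# because its Disallow line is empty.
print(parser.can_fetch("Mediapartners-Google", "https://utechi.blogspot.com/search/label/tech"))  # True
```

Blocking /search for general crawlers keeps duplicate label and search-result pages out of the index while leaving your actual posts crawlable.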
Step 3
After replacing your domain, activate "Custom robots header tags": click "Edit", select the appropriate options, and click "Save changes". Set this configuration carefully; once saved, the custom robots.txt file is enabled for your Blogger website.
Thanks for reading. If you have any queries, please comment below.