What is robots.txt and how to create a robots.txt file

Today we will learn what a robots.txt file is in WordPress and how to create a robots.txt file on our server. When we go somewhere new, there is usually a guide book that tells us where we can go and where we cannot. A robots.txt file works like that guide: it tells Google's bots where they can crawl and where they can't.

Meaning of Robots :

Robots are any type of “bot” that visits websites on the Internet. The most common example is search engine crawlers. These bots “crawl” around the web to help search engines like Google index and rank the billions of pages on the Internet. The robots.txt file is the practical implementation of the robots exclusion standard – it controls how bots interact with your site. We can block bots entirely or restrict their access to certain areas of the site.

What is a robots.txt file?

The robots.txt file is a very powerful tool when you are working on a website’s SEO – but it should be handled with care. It allows us to allow or deny search engines access to different files and folders, so we can keep bots out of unimportant areas and help them crawl and index the important parts of our blog. However, a wrongly configured robots.txt file can make our site disappear from search engines completely. So it is very important that when you make changes to your robots.txt file, it is well optimized and does not block access to important parts of the blog.

How to make a robots.txt file :

Robots.txt is a plain text file that lives in your website’s root folder, for example: webtechsource.com/robots.txt. If you want to edit it, you can use any text editor. If this file does not exist on your website, open a text editor such as Notepad and create a robots.txt file made up of one or more records. Every record carries an instruction for the search engine.

Syntax for the robots.txt file :

User-agent: name of the robot the rules below apply to (use * for all robots)

Disallow: pages, paths or folders you want to block for that robot

Allow: pages, paths or folders you want to allow

If you want to allow everything:

User-agent: *

Disallow:

If you want to block or disallow everything:

User-agent: *

Disallow: /

In the robots.txt file you can also add your sitemap manually, like this:

Sitemap: http://www.webtechsource.com/post-sitemap.xml
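Putting these pieces together, a typical robots.txt for a WordPress blog might look like the example below. This is only a sketch: the /wp-admin/ rules follow the usual WordPress defaults, and the sitemap URL should point to your own sitemap.

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: http://www.webtechsource.com/post-sitemap.xml

Here every bot is kept out of the /wp-admin/ folder, except for admin-ajax.php, which many WordPress themes and plugins need in order to work on the front end.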

Testing Your Robots.txt File :

We can test our robots.txt file in Google Search Console to make sure it is set up correctly.
We need to follow these steps:

1. Click into your site, and under “Crawl” click on “robots.txt Tester.”
2. Submit any URL, including your homepage. You will see a green “Allowed” if everything is crawlable. You can also test URLs you have blocked to confirm they really come back as “Disallowed.” If you want to check URLs outside Search Console, see the small script after these steps.
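If you prefer to test locally, here is a minimal sketch using Python’s built-in urllib.robotparser module. The domain and paths below are only examples taken from this post; replace them with your own URLs.

from urllib.robotparser import RobotFileParser

# Point the parser at the robots.txt of the site you want to check
# (webtechsource.com is used here only as an example domain).
parser = RobotFileParser()
parser.set_url("http://www.webtechsource.com/robots.txt")
parser.read()

# can_fetch() returns True if the given user-agent may crawl the URL
print(parser.can_fetch("*", "http://www.webtechsource.com/"))           # homepage
print(parser.can_fetch("*", "http://www.webtechsource.com/wp-admin/"))  # a blocked path

Running the script prints True for URLs your robots.txt allows and False for the ones it blocks, which is a quick way to double-check your rules before relying on them.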


I hope you liked this post. Please feel free to comment below with your suggestions and any problems you face – we are here to help solve them.

