
What is robots.txt used for?

PostPosted: Wed Mar 09, 2016 6:02 am
by danielnash
I learnt here about the sitemap.xml file. Now I'm a little confused: what is robots.txt used for?

Please clear this up for me, guys.

Re: What is robots.txt used for?

PostPosted: Tue Jan 10, 2017 6:53 am
by RHCalvin
Robots.txt is a text file that contains instructions for search engine robots. The file specifies which pages of a site are allowed and which are disallowed for search engine crawling.

Re: What is robots.txt used for?

PostPosted: Fri Jan 20, 2017 8:29 am
by ankur111
The robots.txt file is a text file that contains instructions for search engine robots (crawlers). Through the robots file, we can disallow crawlers from reading any particular page or directory of a website, or even block the whole website from being crawled by search engine robots, as in the sketch below.
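
For illustration, a minimal robots.txt sketch that blocks one directory and one page for all crawlers (the paths /private/ and /temp.html are hypothetical placeholders):

    User-agent: *
    Disallow: /private/
    Disallow: /temp.html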

Re: What is robots.txt used for?

PostPosted: Fri Jul 07, 2017 7:34 am
by mah764
Robots.txt is a text file where you can write instructions for search engines about how they should behave on your website. For example, if you don't want some pages to be crawled, you can list them there and Google will not crawl those pages (note that a disallowed page can still end up in the index if other sites link to it).

Re: What is robots.txt used for?

PostPosted: Tue Jan 29, 2019 7:28 am
by riyajindal
The robots.txt file, also known as the robots exclusion protocol or standard, is a text file that tells web robots (most often search engines) which pages on your site to crawl and which pages not to crawl. For example, a lone slash after "Disallow" tells the robot not to visit any pages on the site, as shown below.
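
A minimal sketch of that rule, blocking the entire site for all crawlers:

    User-agent: *
    Disallow: /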

Re: What is robots.txt used for?

PostPosted: Tue Jan 29, 2019 10:08 am
by Arpita
The robots exclusion standard, also known as the robots exclusion protocol or simply robots.txt, is a standard used by websites to communicate with web crawlers and other web robots. The standard specifies how to inform the web robot about which areas of the website should not be processed or scanned.

Re: What is robots.txt used for?

PostPosted: Fri Mar 01, 2019 5:53 am
by andy123
It is a standard file used to communicate with web crawlers and other robots. It informs crawlers which pages to crawl and which to skip.
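
As a quick illustration of how a polite crawler consumes this file, here is a minimal Python sketch using the standard library's urllib.robotparser (https://example.com is a placeholder domain):

    from urllib.robotparser import RobotFileParser

    # Point the parser at the site's robots.txt (placeholder domain).
    rp = RobotFileParser()
    rp.set_url("https://example.com/robots.txt")
    rp.read()  # download and parse the robots.txt file

    # Ask whether a given user agent may fetch a given URL.
    print(rp.can_fetch("*", "https://example.com/private/page.html"))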