What is robots.txt used for?

Discussion about Google search engine optimization methods.

Moderators: sendtogeo, hellow0rld, eugene

What is robots.txt used for?

Postby danielnash » Wed Mar 09, 2016 6:02 am

I learned here about the sitemap.xml file. Now I'm a little confused: what is robots.txt used for?

Please clarify, guys.
danielnash
 
Posts: 60
Joined: Tue Sep 02, 2014 11:38 am
Location: USA

Re: What is robots.txt used for?

Postby RHCalvin » Tue Jan 10, 2017 6:53 am

Robots.txt is a text file that contains instructions for search engine robots. It lists which pages and paths on the site crawlers are allowed or disallowed to crawl.
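As an illustration, a minimal robots.txt placed at the site root (the paths and sitemap URL below are hypothetical examples) might look like this:

```
# Rules for all crawlers
User-agent: *
Disallow: /admin/
Allow: /admin/public/

# Point crawlers at the sitemap mentioned in the question
Sitemap: https://example.com/sitemap.xml
```

The `Allow` line carves an exception out of the broader `Disallow` rule, which is why it is listed alongside it.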
RHCalvin
 
Posts: 328
Joined: Fri Jun 03, 2016 6:31 am
Location: Forum

Re: What is robots.txt used for?

Postby ankur111 » Fri Jan 20, 2017 8:29 am

The robots.txt file is a text file that contains instructions for search engine robots (crawlers). Through the robots file, we can disallow crawlers from reading any particular page or directory of a website, or even block the whole website from being crawled by search engine robots.
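For example, using the standard robots.txt directives (the directory and page names here are hypothetical), blocking a directory, a single page, or the whole site looks like this:

```
# Block one directory and one page for all crawlers
User-agent: *
Disallow: /drafts/
Disallow: /internal-page.html
```

Replacing those rules with a single `Disallow: /` would block the entire site instead, while an empty `Disallow:` value allows everything.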
ankur111
 
Posts: 7
Joined: Fri Jan 20, 2017 8:07 am
Location: California, USA

Re: What is robots.txt used for?

Postby mah764 » Fri Jul 07, 2017 7:34 am

Robots.txt is a text file where you can write instructions telling search engines how to behave on your website. For example, if you don't want some pages crawled, you can list them there and Google will not crawl those pages. Note, though, that blocking crawling is not the same as blocking indexing: a disallowed page can still appear in search results if other sites link to it.
mah764
 
Posts: 67
Joined: Mon May 15, 2017 9:20 am

Re: What is robots.txt used for?

Postby riyajindal » Tue Jan 29, 2019 7:28 am

The robots.txt file, also known as the robots exclusion protocol or standard, is a text file that tells web robots (most often search engine crawlers) which pages on your site to crawl and which pages not to crawl. A slash on its own after "Disallow" (that is, `Disallow: /`) tells the robot not to visit any pages on the site.
riyajindal
 
Posts: 176
Joined: Mon Jan 28, 2019 8:26 am

Re: What is robots.txt used for?

Postby Arpita » Tue Jan 29, 2019 10:08 am

The robots exclusion standard, also known as the robots exclusion protocol or simply robots.txt, is a standard used by websites to communicate with web crawlers and other web robots. The standard specifies how to inform the web robot about which areas of the website should not be processed or scanned.
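To see these rules in action from the crawler's side, here is a short sketch using Python's standard-library robots.txt parser; the rules and URLs are hypothetical examples:

```python
# Check whether a crawler may fetch given URLs, using Python's
# built-in robots.txt parser (urllib.robotparser).
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content: block /private/ for all crawlers.
rules = """User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Allowed: not under a disallowed path.
print(parser.can_fetch("*", "https://example.com/public/page.html"))   # True
# Blocked: matches the Disallow: /private/ rule.
print(parser.can_fetch("*", "https://example.com/private/page.html"))  # False
```

A well-behaved crawler performs exactly this check before requesting a page, which is how the standard "communicates" without any server-side enforcement.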
Arpita
 
Posts: 179
Joined: Mon Jan 28, 2019 12:15 pm

Re: What is robots.txt used for?

Postby andy123 » Fri Mar 01, 2019 5:53 am

It is a standard file for communicating with web crawlers and other robots. It informs crawlers which pages to crawl and which to skip.
andy123
 
Posts: 354
Joined: Thu Dec 20, 2018 5:57 am
Location: India

