What is robots.txt used for?

Discussion about google search engine optimization methods.

Moderators: sendtogeo, hellow0rld, eugene

What is robots.txt used for?

Postby danielnash » Wed Mar 09, 2016 6:02 am

I learnt about the sitemap.xml file here. Now I'm a little bit confused: what is robots.txt used for?

Please clear this up, guys.
Posts: 59
Joined: Tue Sep 02, 2014 11:38 am
Location: USA

Re: What is robots.txt used for?

Postby RHCalvin » Tue Jan 10, 2017 6:53 am

Robots.txt is a plain text file, placed in the root directory of a website, that contains instructions for search engine robots. It lists which pages and directories crawlers are allowed or disallowed to visit.
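To make that concrete, here is a minimal example of what such a file might look like (the paths are hypothetical, chosen just for illustration):

```
# Applies to all crawlers
User-agent: *
# Keep crawlers out of these directories
Disallow: /admin/
Disallow: /private/
# Everything else may be crawled
Allow: /

# Optional: point crawlers at the sitemap
Sitemap: https://www.example.com/sitemap.xml
```

The file must be reachable at the site root, e.g. https://www.example.com/robots.txt, or crawlers will not find it.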
Posts: 263
Joined: Fri Jun 03, 2016 6:31 am
Location: Forum

Re: What is robots.txt used for?

Postby ankur111 » Fri Jan 20, 2017 8:29 am

The robots.txt file is a text file that contains instructions for search engine robots (crawlers). Through the robots file we can disallow crawlers from reading or indexing a particular page or directory of a website, or even block the whole website from being crawled by search engine robots.
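You can check how those allow/disallow rules are interpreted using Python's standard-library robots.txt parser. This is a small sketch with made-up rules, not any particular site's real file:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules, parsed from a list of lines
# (normally you'd call rp.set_url(...) and rp.read() against a live site)
rules = """\
User-agent: *
Disallow: /private/
Allow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# A URL under /private/ is disallowed for all crawlers
print(rp.can_fetch("*", "https://example.com/private/data.html"))  # False
# Anything else is allowed
print(rp.can_fetch("*", "https://example.com/index.html"))  # True
```

`can_fetch()` is the same check a well-behaved crawler performs before requesting a page.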
Posts: 7
Joined: Fri Jan 20, 2017 8:07 am
Location: California, USA

Re: What is robots.txt used for?

Postby mah764 » Fri Jul 07, 2017 7:34 am

Robots.txt is a text file where you write instructions telling search engines how to behave on your website. If you don't want some pages crawled, you can list them there and Google will not crawl them. Note that a disallowed URL can still show up in search results if other sites link to it; to keep a page out of the index entirely, use a noindex meta tag instead.
Posts: 67
Joined: Mon May 15, 2017 9:20 am

Return to Google

Who is online

Users browsing this forum: No registered users and 5 guests