What is robots.txt used for?

Discussion about Google search engine optimization methods.


What is robots.txt used for?

Postby danielnash » Wed Mar 09, 2016 6:02 am

I learnt here about the sitemap.xml file. Now I'm a little confused about what robots.txt is used for.

Please clear this up for me, guys.
danielnash
 
Posts: 49
Joined: Tue Sep 02, 2014 11:38 am
Location: USA

Re: What is robots.txt used for?

Postby RHCalvin » Tue Jan 10, 2017 6:53 am

Robots.txt is a plain text file placed in the root directory of a website that contains instructions for search engine robots. It lists which pages and directories crawlers are allowed or disallowed to access.
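
For example, a simple robots.txt (the domain and paths here are just placeholders) could look like this:

User-agent: *
Disallow: /admin/
Disallow: /tmp/
Allow: /

Sitemap: https://www.example.com/sitemap.xml

The "User-agent: *" line means the rules apply to all crawlers, and the Sitemap line points robots to the sitemap.xml file mentioned above.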
RHCalvin
 
Posts: 140
Joined: Fri Jun 03, 2016 6:31 am
Location: Forum

Re: What is robots.txt used for?

Postby ankur111 » Fri Jan 20, 2017 8:29 am

The robots.txt file is a text file that contains instructions for search engine robots (crawlers). Through the robots.txt file we can disallow or stop a crawler from reading a particular page or directory of a website, or we can even block the whole website from being crawled by search engine robots. A couple of example rules are shown below.
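
As an illustration (the bot name and paths are only examples), you can block one directory for every crawler and block the entire site for one specific bot:

# Block all crawlers from a single directory
User-agent: *
Disallow: /private/

# Block one specific bot from the whole site
User-agent: BadBot
Disallow: /

A single "Disallow: /" rule under "User-agent: *" would block the entire site for all compliant robots.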
ankur111
 
Posts: 4
Joined: Fri Jan 20, 2017 8:07 am
Location: California, USA

