How to use robots.txt on Wapka and what it is for

Hi all. Some webmasters want their sites to show up in search engines while others do not. All of this can be controlled with the help of robots.txt.

What is robots.txt?

This is a text file written by the webmaster and interpreted by spiders/crawlers, which follow the rules given in it. A robots.txt file refers to a particular spider/crawler by its UA (User Agent) and tells it where it may crawl on the site and where it may not. The file must be saved as robots.txt in the root directory of the site, and any well-behaved spider visiting your site should look for it before accessing the site.
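For example, a minimal robots.txt might look like the sketch below (the path names here are made up for illustration): it blocks one named spider from a single folder while allowing every other spider to crawl the whole site.

```
User-agent: Googlebot
Disallow: /private/

User-agent: *
Disallow:
```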

What is a Spider/Crawler? 

Spiders, also called crawlers, are programs sent out by search engines to index pages and return what they find to the search engine. Spiders are also used by h*ckers to harvest email addresses for spamming. Every browser and spider/crawler has a unique User Agent that it uses when surfing the net.

What is a User Agent?

A User Agent is like an identity card used by browsers and spiders when surfing the net, so that they can be recognised.
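As a sketch of how a spider matches its own User Agent against the rules in robots.txt, Python's standard-library parser can be used. The rules and bot names below are hypothetical, chosen only to show how a named User Agent gets its own rule group while everyone else falls back to the `*` group:

```python
from urllib.robotparser import RobotFileParser

# A sample robots.txt (hypothetical rules), fed to Python's parser
# the same way a well-behaved spider would interpret it.
rules = """
User-agent: Googlebot
Disallow: /private/

User-agent: *
Disallow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# Googlebot may crawl anything except /private/
print(rp.can_fetch("Googlebot", "/index.html"))     # True
print(rp.can_fetch("Googlebot", "/private/x"))      # False

# Any other User Agent falls under "*" and is disallowed entirely
print(rp.can_fetch("SomeOtherBot", "/index.html"))  # False
```

This is also why the User Agent matters: the same file gives different answers depending on who is asking.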