Robots.txt is a text file which helps robots (search engine crawlers) crawl your site better. It is not only for Googlebot but for any search engine bot, and even for robots whose purpose is something other than searching.
You can simply write this in it:
User-agent: *
Allow: /
What does it mean? "User-agent: *" stands for any robot. "Allow: /" allows that robot to crawl your whole website.
How to create a robots.txt for Blogger
On the dashboard, go to Settings, then Search preferences, and find Custom robots.txt. By default it is disabled, but you can change that. Click Edit, and for "Enable custom robots.txt content?" choose Yes. Then, in the text area underneath, write what I wrote at the top.
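As a concrete sketch, this is roughly what you might paste into that text area. The Sitemap line and the blogspot address are assumptions, so substitute your own blog's URL (or leave the line out):
User-agent: *
Allow: /
Sitemap: https://yourblog.blogspot.com/sitemap.xml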
Above it there is a message warning you: "Warning! Use with caution. Incorrect use of these features can result in your blog being ignored by search engines." It is true: it can cause search engines to ignore your site, for example if you write "Disallow: /" there instead of "Allow: /".
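To make the danger concrete, this is what that mistake looks like; these two lines tell every crawler to stay away from the entire site:
User-agent: *
Disallow: /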
Let's say you want to allow only Google. That is possible, and it can make sense, because some malware and malicious robots (email collectors and the like) can hit your site if you allow any robot.
User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /
But this results in only Googlebot being able to crawl your site, and no other search engine.
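You can also do the reverse and block only a specific bot while leaving everyone else allowed. The bot name below is only an illustration, not a real directive you must use, and keep in mind that truly malicious bots usually ignore robots.txt anyway:
User-agent: EmailCollector
Disallow: /

User-agent: *
Allow: /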
What about using this on a regular website rather than a blog? You can also control robots with HTML meta tags placed between the head tags. For example:
<meta name="robots" content="all" />
<meta name="robots" content="index,follow" />
And if you want crawlers not to follow a single link, put rel="nofollow" inside the anchor (<a>) tag itself, not inside the href attribute.
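For example (the URL here is just a placeholder):
<a href="https://example.com/some-page" rel="nofollow">Example link</a>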