SEO Tips about Robots.txt | What the Robots.txt Statements Mean

This is an SEO tip. Preventing search result pages from being indexed is a basic requirement for SEO. Google Webmaster Tools can handle this automatically; if it does not, use robots.txt to block the custom search pages on your blog or website, as shown in the sketch below. You can then test in Google Webmaster Tools whether Googlebot is still indexing those pages.
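As a rough example (the /search path is an assumption; Blogger-style blogs serve search results under it, so substitute whatever path your own site's search results actually use), the robots.txt rule could look like this…

User-agent: *
Disallow: /search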

To see a website's robots.txt file, enter its address in your browser's address bar, e.g. www.yoursite.com/robots.txt

Read these instructions carefully to understand the robots.txt statements. (Sources: Google and other well-known websites.)

What do the robot instructions mean?

Here is an explanation of what the different words in a robots.txt file mean.
User-agent:

The "User-agent" part is there to specify directions to a specific robot if needed.

There are two ways to use this in your file.

If you want to tell all robots the same thing, you put a "*" after "User-agent:". It would look like this…

User-agent: * (This line is saying "these directions apply to all robots")

If you want to tell a specific robot something (in this example Googlebot), it would look like this…

User-agent: Googlebot (this line is saying "these directions apply to just Googlebot")
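You can also combine both forms in one file, with each block applying to the robot named in its User-agent line. As a minimal sketch (the folder names here are only placeholders, not paths from any real site), it might look like this…

User-agent: Googlebot
Disallow: /no-google

User-agent: *
Disallow: /private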

Disallow:

The "Disallow" part is there to tell the robots what folders they should not look at.

This means that if, for example, you do not want search engines to index the photos on your site, you can place those photos into one folder and exclude it.

Let's say that you have put all these photos into a folder called "photos".
Now you want to tell search engines not to index that folder.


Here is what your robots.txt file should look like:

User-agent: *
Disallow: /photos

The above two lines of text in your robots.txt file would keep robots from visiting your photos folder. The "User-agent: *" part is saying "these directions apply to all robots". The "Disallow: /photos" part is saying "don't visit or index my photos folder".
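You can also list several Disallow lines under the same User-agent block if there is more than one folder to keep robots out of. As a sketch (the "private" folder here is just an illustrative placeholder), it would look like this…

User-agent: *
Disallow: /photos
Disallow: /private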

Googlebot-specific instructions

The robot that Google uses to index its search engine is called Googlebot. It understands a few more instructions than other robots. The instructions it follows are well documented in the Google help pages (see resources below).

In addition to "User-agent" and "Disallow", Googlebot also uses the…
Allow:

The "Allow:" instructions lets you tell a robot that it is okay to see a file in a folder that has been "Disallowed" by other instructions.

To illustrate this, let's take the above example of telling the robot not to visit or index your photos. We put all the photos into one folder called "photos" and we made a robots.txt file that looked like this…

User-agent: *
Disallow: /photos

Now let's say there was a photo called mycar.jpg in that folder that you want Googlebot to index. With the Allow: instruction, we can tell Googlebot to do so; it would look like this…

User-agent: *
Disallow: /photos
Allow: /photos/mycar.jpg

This would tell Googlebot that it can visit "mycar.jpg" in the photos folder, even though the rest of that folder is disallowed.
