Robots.txt Generator

Default - All Robots are:

Crawl-Delay:

Sitemap: (leave blank if you don't have one)

Search Robots: Google, Google Image, Google Mobile, MSN Search, Yahoo, Yahoo MM, Yahoo Blogs, Ask/Teoma, GigaBlast, DMOZ Checker, Nutch, Alexa/Wayback, Baidu, Naver, MSN PicSearch

Restricted Directories: (the path is relative to root and must contain a trailing slash "/")



Now create a 'robots.txt' file in your root directory, copy the text above, and paste it into that file.


About Robots.txt Generator

The robots.txt generator is a simple tool with which you can create the robots.txt file for your website in a few seconds. Follow the steps below to shield your site from the robots you choose, in the way you want.

What is the robots.txt file?

This is a text file that gives certain recommendations to the crawlers and robots that inspect your website. Recall that a crawler is a robot belonging to a system or program, such as a search engine, that accesses web pages looking for information. Depending on the case, crawlers are also known as spiders, bots or even indexers.

In the case of Google, the name given to its robot or crawler is Googlebot. There are also others, such as those of Moz and Majestic, or Yahoo's, known as Slurp.
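For reference, a minimal robots.txt addresses every crawler with a single rule; the path shown here is only illustrative:

```text
# Applies to all crawlers
User-agent: *
# Block one directory, allow everything else
Disallow: /private/
```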


What is the purpose of creating a robots.txt file?

Thanks to our robots.txt, we obtain several benefits depending on the instructions we set:

  • Prevent access by certain robots: sometimes we do not want the robots of certain programs or companies to enter our website. This can be useful, for example, to avoid revealing our link-building tactics to the competition.
  • Block certain areas: even on an open public website, we can keep certain folders or directories private.
  • Avoid duplicate content: this is a very important factor, and search engines can penalize a site that serves content copied from other websites.
  • Reduce server load: we can control the behavior of robots that might otherwise make too many requests to our server.
  • Point to sitemaps: we can indicate where our sitemap is, giving the different crawlers more clues.
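The directives behind these benefits can be combined in a single file. The robot name, hostnames and paths below are placeholders chosen for illustration, not real recommendations:

```text
# Keep one specific robot out entirely
User-agent: BadBot
Disallow: /

# Rules for every other robot
User-agent: *
Disallow: /internal/        # keep a private area unindexed
Disallow: /duplicated/      # keep duplicate content out of the index
Crawl-delay: 10             # ask robots to wait 10 seconds between requests

# Tell crawlers where the sitemap lives
Sitemap: https://www.example.com/sitemap.xml
```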

How to create the robots.txt?

With Vegcorn Seo Tools, it is very easy to create your own robots.txt file.

For starters, indicate whether all robots are allowed or refused by default. Do not worry: later, you can specify access for particular robots.

Then indicate whether you want a delay in the crawl, the process by which a robot inspects your entire site. Select the time you want: from no delay up to 120 seconds.

Next, you can attach the sitemap of your website so that robots can find it easily. In the example we have left the address that sitemaps usually have on websites.

Our next step is to allow or restrict access for specific robots. As you can see, the main crawlers are on the list. Don't want Google to enter your website? Perhaps you prefer that Yahoo be the one barred? Or would you rather block a directory checker like DMOZ? We leave the choice to you!

Finally, you can restrict certain directories or folders on your website for all robots. Simply indicate the path to each directory, relative to the root and with a trailing slash.

Now just click "Create a Robots.txt" and our system will get to work and create a personalized robots.txt file for your website according to your needs. Copy the output you receive and paste it into the robots.txt file on your site. If you have not created one yet, simply open a plain-text editor such as Notepad and create it with the text we provide.
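If you want to check that the finished file behaves as expected, Python's standard `urllib.robotparser` module can evaluate it. The rules and URLs below are a hypothetical sketch, similar to what the generator might output:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules, like those the generator might produce
rules = """\
User-agent: *
Disallow: /private/
Crawl-delay: 10
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# A blocked path is refused; everything else is allowed
print(parser.can_fetch("*", "https://www.example.com/private/page.html"))  # False
print(parser.can_fetch("*", "https://www.example.com/index.html"))         # True
print(parser.crawl_delay("*"))                                             # 10
```

This is a quick sanity check only: it tells you how compliant crawlers should interpret the file, not how every robot will actually behave.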