Robots.txt Generator

Create a robots.txt file from scratch or choose one of the suggested options.
General suggestions
Allow everything
User-agent: *
Allow: /
Disallow crawling of the entire website
User-agent: *
Disallow: /
Allow everything for Google only
User-agent: *
Disallow: /
User-agent: Googlebot
Allow: /
Disallow everything for the most commonly blocked bots
User-agent: AhrefsBot
Disallow: /
User-agent: SemrushBot
Disallow: /
User-agent: YandexBot
Disallow: /
User-agent: baiduspider
Disallow: /
Disallow everything for all Google bots
User-agent: Googlebot
Disallow: /
User-agent: APIs-Google
Disallow: /
User-agent: Mediapartners-Google
Disallow: /
User-agent: AdsBot-Google-Mobile
Disallow: /
User-agent: AdsBot-Google
Disallow: /
User-agent: Googlebot-Image
Disallow: /
User-agent: Googlebot-News
Disallow: /
User-agent: Googlebot-Video
Disallow: /
User-agent: Storebot-Google
Disallow: /
Allow everything for all Google bots
User-agent: Googlebot
Allow: /
User-agent: APIs-Google
Allow: /
User-agent: Mediapartners-Google
Allow: /
User-agent: AdsBot-Google-Mobile
Allow: /
User-agent: AdsBot-Google
Allow: /
User-agent: Googlebot-Image
Allow: /
User-agent: Googlebot-News
Allow: /
User-agent: Googlebot-Video
Allow: /
User-agent: Storebot-Google
Allow: /
Ready-made robots.txt files for CMSs
Robots.txt for WordPress
User-agent: *
Disallow: /wp-admin
Disallow: /wp-includes
Disallow: /wp-login.php
Disallow: /wp-content/plugins
Disallow: /wp-content/cache
Disallow: /wp-content/themes
Disallow: /trackback
Disallow: */trackback
Disallow: */*/trackback
Disallow: */*/feed/*/
Disallow: */feed
Disallow: /*?*
Disallow: /cgi-bin
Disallow: /*.php$
Disallow: /*.inc$
Disallow: /*.gz$
Allow: */uploads
Allow: /*.js
Allow: /*.css
Allow: /*.png
Allow: /*.jpg
Allow: /*.jpeg
Allow: /*.gif
Allow: /*.svg
Allow: /*.webp
Allow: /*.pdf
Robots.txt for Joomla
User-agent: *
Disallow: /administrator/
Disallow: /bin/
Disallow: /cache/
Disallow: /cli/
Disallow: /components/
Disallow: /includes/
Disallow: /installation/
Disallow: /language/
Disallow: /layouts/
Disallow: /libraries/
Disallow: /logs/
Disallow: /modules/
Disallow: /plugins/
Disallow: /tmp/
Robots.txt for MODX
User-agent: *
Disallow: /*?id=
Disallow: /assets
Disallow: /assets/cache
Disallow: /assets/components
Disallow: /assets/docs
Disallow: /assets/export
Disallow: /assets/import
Disallow: /assets/modules
Disallow: /assets/plugins
Disallow: /assets/snippets
Disallow: /connectors
Disallow: /core
Disallow: /index.php
Disallow: /install
Disallow: /manager
Disallow: /profile
Disallow: /search
Robots.txt for Drupal
User-agent: *
Allow: /core/*.css$
Allow: /core/*.css?
Allow: /core/*.js$
Allow: /core/*.js?
Allow: /core/*.gif
Allow: /core/*.jpg
Allow: /core/*.jpeg
Allow: /core/*.png
Allow: /core/*.svg
Allow: /profiles/*.css$
Allow: /profiles/*.css?
Allow: /profiles/*.js$
Allow: /profiles/*.js?
Allow: /profiles/*.gif
Allow: /profiles/*.jpg
Allow: /profiles/*.jpeg
Allow: /profiles/*.png
Allow: /profiles/*.svg
Disallow: /core/
Disallow: /profiles/
Disallow: /README.txt
Disallow: /web.config
Disallow: /admin/
Disallow: /comment/reply/
Disallow: /filter/tips/
Disallow: /node/add/
Disallow: /search/
Disallow: /user/register/
Disallow: /user/password/
Disallow: /user/login/
Disallow: /user/logout/
Disallow: /index.php/admin/
Disallow: /index.php/comment/reply/
Disallow: /index.php/filter/tips/
Disallow: /index.php/node/add/
Disallow: /index.php/search/
Disallow: /index.php/user/password/
Disallow: /index.php/user/register/
Disallow: /index.php/user/login/
Disallow: /index.php/user/logout/
Robots.txt for Magento
User-agent: *
Disallow: /index.php/
Disallow: /*?
Disallow: /checkout/
Disallow: /app/
Disallow: /lib/
Disallow: /*.php$
Disallow: /pkginfo/
Disallow: /report/
Disallow: /var/
Disallow: /catalog/
Disallow: /customer/
Disallow: /sendfriend/
Disallow: /review/
Disallow: /*SID=
Robots.txt for OpenCart
User-agent: *
Disallow: /*route=account/
Disallow: /*route=affiliate/
Disallow: /*route=checkout/
Disallow: /*route=product/search
Disallow: /index.php?route=product/product*&manufacturer_id=
Disallow: /admin
Disallow: /catalog
Disallow: /system
Disallow: /*?sort=
Disallow: /*&sort=
Disallow: /*?order=
Disallow: /*&order=
Disallow: /*?limit=
Disallow: /*&limit=
Disallow: /*?filter_name=
Disallow: /*&filter_name=
Disallow: /*?filter_sub_category=
Disallow: /*&filter_sub_category=
Disallow: /*?filter_description=
Disallow: /*&filter_description=
Disallow: /*?tracking=
Disallow: /*&tracking=
Disallow: /*compare-products
Disallow: /*search
Disallow: /*cart
Disallow: /*checkout
Disallow: /*login
Disallow: /*logout
Disallow: /*vouchers
Disallow: /*wishlist
Disallow: /*my-account
Disallow: /*order-history
Disallow: /*newsletter
Disallow: /*return-add
Disallow: /*forgot-password
Disallow: /*downloads
Disallow: /*returns
Disallow: /*transactions
Disallow: /*create-account
Disallow: /*recurring
Disallow: /*address-book
Disallow: /*reward-points
Disallow: /*affiliate-forgot-password
Disallow: /*create-affiliate-account
Disallow: /*affiliate-login
Disallow: /*affiliates
Disallow: /*?filter_tag=
Disallow: /*brands
Disallow: /*specials
Disallow: /*simpleregister
Disallow: /*simplecheckout
Disallow: /*utm=
Allow: /catalog/view/javascript/
Allow: /catalog/view/theme/*/
Robots.txt for WooCommerce
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-json/
Disallow: /*add-to-cart=*
Disallow: /cart/
Disallow: /checkout/
Disallow: /my-account/
How to use our Robots.txt Generator?

We developed this free robots.txt generator to help webmasters, SEO experts, and marketers quickly and easily create robots.txt files.

You can generate a robots.txt file from scratch or use ready-made suggestions. In the former case, you customize the file by choosing the directive (allow or disallow crawling), the path (specific pages and files), and the bots that should follow the directives. Alternatively, you can choose a ready-made robots.txt template containing a set of the most common general and CMS directives. You can also add a sitemap to the file.
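
For instance, a file generated from a couple of custom rules for all bots plus a sitemap entry might look like this (the folder name, file name, and sitemap URL are placeholders):

User-agent: *
Disallow: /private/
Allow: /private/public-report.pdf
Sitemap: https://your-site.com/sitemap.xml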

As a result, our robots.txt generator will give you a ready-made robots.txt file that you can edit and then copy or download.

Robots.txt syntax

The robots.txt syntax consists of directives, parameters, and special characters. For the file to work properly, follow these requirements when creating it:

1. Each directive must begin on a new line. There can only be one parameter per line.

Incorrect:
User-agent: * Disallow: /folder1/ Disallow: /folder2/

Correct:
User-agent: *
Disallow: /folder1/
Disallow: /folder2/

2. Robots.txt is case-sensitive. For example, if a website folder name is capitalized, but it’s lowercase in the robots.txt file, it can disorient crawlers.

Incorrect (the folder on the site is named /Folder/):
Disallow: /folder/

Correct:
Disallow: /Folder/

3. You cannot use quotation marks, spaces at the beginning of lines, or semicolons at the end of lines.

Incorrect:
Disallow: /folder1/;
Disallow: /“folder2”/

Correct:
Disallow: /folder1/
Disallow: /folder2/

How to use the Disallow directive properly?

Once you have filled in the User-agent directive, specify the behavior of certain (or all) bots by adding crawl instructions. Here are some essential tips:

1. Don't leave the Disallow directive without a value. An empty value means nothing is blocked, so bots will crawl all of the site's content.

Disallow: - allows crawling of the entire website

2. Do not list every file that you want to block from crawling. Just disallow access to a folder, and all files in it will be blocked from crawling and indexing.

Disallow: /folder/

3. Don't block access to the entire website with this directive:

Disallow: / - blocks access to the whole website

If you do, the site can be completely removed from the search results.

Besides that, make sure that essential website pages are not blocked from crawling: the home page, landing pages, product cards, etc. With this directive, you should only specify files and pages that should not appear on the SERPs.
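
For example, a minimal setup that blocks only service pages while leaving the rest of the site crawlable might look like this (the paths are illustrative):

User-agent: *
Disallow: /cart/
Disallow: /internal-search/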

Adding your Sitemap to the robots.txt file

If necessary, you can add your Sitemap to the robots.txt file. This makes it easier for bots to crawl website content. The Sitemap file is typically located at https://your-site.com/sitemap.xml. Add a directive with the URL of your Sitemap as shown below:

For example:

User-agent: *
Disallow: /folder1/
Allow: /image1/
Sitemap: https://your-site.com/sitemap.xml

How to submit a robots.txt file to search engines?

You don't need to submit a robots.txt file to search engines. Whenever crawlers come to a site, they look for a robots.txt file before crawling it. And if they find one, they will read that file first before scanning your site.

At the same time, if you've made any changes to the robots.txt file and want to notify Google, you can submit your robots.txt file to Google Search Console. Use the Robots.txt Tester to paste the text file and click Submit.

How to define the User-agent?

When creating robots.txt and configuring crawling rules, you should specify the name of the bot to which you're giving crawl instructions. You can do this with the help of the User-agent directive.

If you want to apply a rule to all crawlers for some of your content, indicate * (asterisk) as the User-agent:

User-agent: *

Or you might want all your pages to appear in a specific search engine, for example, Google. In this case, use the Googlebot User-agent like this:

User-agent: Googlebot

Keep in mind that each search engine has its own bots, which may differ in name from the search engine (e.g., Yahoo's Slurp). Moreover, some search engines have many crawlers depending on the crawl targets. For example, in addition to its main crawler Googlebot, Google has other bots (see the example after this list for how to target one of them):

  • Googlebot News—crawls news;
  • Google Mobile—crawls mobile pages;
  • Googlebot Video—crawls videos;
  • Googlebot Images—crawls images;
  • Google AdSense—crawls websites to determine content and provide relevant ads.
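
For instance, to keep images out of Google Images while letting the main crawler access everything, you could combine two groups like this (the /photos/ path is illustrative):

User-agent: Googlebot-Image
Disallow: /photos/

User-agent: Googlebot
Allow: /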

How to use the Allow directive properly?

The Allow directive is used to counteract the Disallow directive. Using the Allow and Disallow directives together, you can tell search engines that they can access a specific folder, file, or page within an otherwise disallowed directory.

Disallow: /album/ - search engines are not allowed to access the /album/ directory

Allow: /album/picture1.jpg - but they are allowed to access the file picture1.jpg in the /album/ directory

With this directive, you should also specify essential website files: scripts, styles, and images. For example:

Allow: */uploads
Allow: /wp-/*.js
Allow: /wp-/*.css
Allow: /wp-/*.png
Allow: /wp-/*.jpg
Allow: /wp-/*.jpeg
Allow: /wp-/*.gif
Allow: /wp-/*.svg
Allow: /wp-/*.webp
Allow: /wp-/*.pdf

How to add the generated robots.txt file to your website?

Search engines and other crawling bots look for a robots.txt file whenever they come to a website. But they'll only look for that file in one specific place: the root directory. So, after generating the robots.txt file, add it to the root folder of your website. Once uploaded, it will be available at https://your-site.com/robots.txt.

The method of adding a robots.txt file depends on the server and CMS you are using. If you can't access the root directory, contact your web hosting provider.

How important is a robots.txt file?

The robots.txt file tells search engines which pages to crawl and which bots are allowed to access the website’s content. With robots.txt, you can solve two issues:

  1. Reduce the likelihood of certain pages being crawled, indexed, and shown in the search results.
  2. Save crawl budget, as shown in the example below.
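
A short sketch of the second point, assuming the wasted crawl budget comes from sorting and session parameters that produce duplicate URLs (the parameter names are illustrative):

User-agent: *
Disallow: /*?sort=
Disallow: /*?sessionid=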
Under what conditions will the generated robots.txt file work properly?

The robots.txt file will work properly under three conditions:

  1. The User-agent and directives are specified correctly: each group begins with a User-agent line, with one directive per line.
  2. The file must be in the .txt format only.
  3. The robots.txt file must be located in the root of the website host to which it applies.
