Create Robots.txt File

The robots.txt file is a set of instructions for search engine crawlers, telling them which parts of your site they should and shouldn't access. Proper configuration helps preserve crawl budget by keeping crawlers from wasting time on unimportant pages, while ensuring crucial content gets crawled and indexed. However, it's important to configure the file carefully, as an incorrect rule can accidentally block important content from search engines.
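
For example, a couple of Disallow rules can keep crawlers out of low-value areas such as internal search results, while leaving the rest of the site open to crawling. The paths below are placeholders; substitute whatever is genuinely low-value on your site:

  User-agent: *
  Disallow: /search/
  Disallow: /cart/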

Instructions:

Optimize Robots.txt File

  1. Understand the Purpose of Robots.txt: The robots.txt file tells search engine crawlers which pages or sections of your site should or should not be crawled.
  2. Access Your Robots.txt File: Locate or create the robots.txt file in the root directory of your website (e.g., https://yourdomain.com/robots.txt).
  3. Add Basic Rules: Specify which user agents are allowed or disallowed to crawl specific parts of your site.

     Example (Allow All):

      User-agent: *
      Disallow:

     Example (Block Specific Sections):

      User-agent: *
      Disallow: /private/
      Disallow: /temp/

  4. Add Sitemap URL: Include the location of your sitemap for better crawl efficiency.

     Example:

      Sitemap: https://yourdomain.com/sitemap.xml

  5. Avoid Blocking Important Resources: Ensure that essential resources such as CSS, JavaScript, and images are not blocked, as they are needed for proper rendering and indexing (see the resource-allow sketch after this list).
  6. Test Your Robots.txt File: Use Google Search Console's robots.txt report (which replaced the retired Robots.txt Tester) to confirm the file is fetched and parsed without errors, and use the URL Inspection tool to check whether specific URLs are blocked as expected.
  7. Monitor and Update: Regularly review your robots.txt file and make adjustments as your site evolves to ensure proper crawling.
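
To keep step 5 concrete: if a disallowed directory also holds CSS, JavaScript, or image files, longer (more specific) Allow rules take precedence and can carve those paths back out. The /assets/ layout below is hypothetical; adjust the paths to match your site:

  # Hypothetical layout: block /assets/ in general, but keep renderable resources crawlable
  User-agent: *
  Disallow: /assets/
  Allow: /assets/css/
  Allow: /assets/js/
  Allow: /assets/images/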
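
Putting the steps together, a complete robots.txt might look like the sketch below. The blocked paths and sitemap URL reuse the placeholders from the examples above; replace them with your own:

  # Placeholder paths and domain; replace with your site's own values
  User-agent: *
  Disallow: /private/
  Disallow: /temp/

  Sitemap: https://yourdomain.com/sitemap.xml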