How to Configure Magento 2 robots.txt for SEO

10-01-2024

If your website is like a house, the robots.txt file is the set of house rules posted at the entrance. The first thing a visitor does on arrival is read those rules to learn whether the host allows them in.

Robots.txt is a text file that webmasters create to tell robots (search engine spiders) how to crawl and index pages on their site. Proper configuration of the robots.txt file is therefore very important. If your website contains information you do not want made public, this is where you keep crawlers away from it. A sensible configuration also helps your SEO considerably.

Importance of Robots.txt in SEO

Robots.txt plays an important role in SEO. It guides search engines to the pages you want crawled and indexed. At the same time, most websites contain directories or files that search engine robots do not need to visit, and keeping those out of the crawl through the robots file is a real help for SEO.

Configuration of the Robots.txt for SEO

In essence, robots.txt is a very simple text file placed in the host's root directory. You can create it with any text editor, such as Notepad. Below is the structure of a simple WordPress robots.txt, with a complete sample file after the list:

  • User-agent: * means the rules apply to all bots.
  • Allow: / lets bots discover and index all pages and directories.
  • Disallow: /wp-admin/ and Disallow: /wp-includes/ block the wp-admin and wp-includes folders.
  • Sitemap: points to the sitemap, a diagram of the website.
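
Putting those directives together, a minimal WordPress-style robots.txt could look like the sketch below. The sitemap URL is a placeholder, not a value from the article, and would be replaced with your own domain.

    # Rules apply to every crawler
    User-agent: *
    # Allow everything by default
    Allow: /
    # Keep crawlers out of the WordPress system folders
    Disallow: /wp-admin/
    Disallow: /wp-includes/
    # Point crawlers to the sitemap (placeholder URL)
    Sitemap: https://www.example.com/sitemap.xml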

How to use the robots.txt file

Common uses include the following; sample directives for each appear after the list:

  • Keep bots out of any directory you do not want crawled
  • Block a single page
  • Block a certain bot
  • Remove an image from Google Images
  • Use "Allow" and "Disallow" together
  • Lock the whole site so no bot can index it
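
The snippets below sketch one way to write each of those directives. The directory, page, bot name, and image path are illustrative placeholders rather than values taken from the article.

    # Keep bots out of a directory you do not want crawled (placeholder path)
    User-agent: *
    Disallow: /private/

    # Block a single page (placeholder path)
    User-agent: *
    Disallow: /checkout/cart/

    # Block a certain bot from the whole site (placeholder bot name)
    User-agent: BadBot
    Disallow: /

    # Remove one image from Google Images (placeholder file)
    User-agent: Googlebot-Image
    Disallow: /images/example-photo.jpg

    # Use "Allow" and "Disallow" together: block /media/ except one subfolder
    User-agent: *
    Disallow: /media/
    Allow: /media/catalog/

    # Lock the whole site so no bot indexes it
    User-agent: *
    Disallow: /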

Some mistakes to avoid when using the robots.txt file

Whether you reuse someone else's robots.txt or create your own, a few mistakes are easy to make. Keep the following rules in mind; a short example follows the list.

  • Uppercase and lowercase letters are treated as different, so match the case of your paths exactly.
  • Write each statement on its own line.
  • Do not add extra white space or leave out required white space.
  • Do not insert any characters beyond the command syntax.
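
To illustrate the first rule, a short sketch assuming a lowercase /admin/ folder (a placeholder path): the two blocks below target different paths because robots.txt paths are case-sensitive.

    # Blocks only /Admin/ with a capital A; the lowercase /admin/ stays crawlable
    User-agent: *
    Disallow: /Admin/

    # Blocks the intended lowercase path, one statement per line, no extra spaces
    User-agent: *
    Disallow: /admin/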

Conclusion

Without a robots.txt, search engines have free run to crawl and index anything they find on the website. Creating a robots.txt lets you steer how search engines crawl your site, so your website can be evaluated more favorably.

The Magento 2 SEO plugin from Mageplaza includes many useful features that are activated automatically when you install it, with no code modifications. It is also user-friendly and helps improve your SEO.
