What is the robots.txt file and how to edit it?
In order to be seen on search engines, robots visit your site and index your new and updated content. These search engine robots are the connection between your site and the search engines: they are programs that detect new or updated content and, through an algorithm, index your pages in the search engine. The robots.txt file lets you communicate directly with these robots.
What is the robots.txt file?
When you build a site, emyspot automatically creates the robots.txt file to help you get started with advanced SEO and improve your site's visibility. This unique file has one real purpose: it communicates with the search engine robots, signalling what should appear in search engines (the SEO process called indexing) and what should not be found on search engines. By convention, search engine robots consult the robots.txt file first, before indexing a website.
Where can I find the robots.txt file?
This file is located at a fixed address on your site, just like the sitemap. You do not need to do anything for this file to be written! Whether you are creating a free site or building an e-commerce site, the file is created automatically. If you need to view or change it, you can do so easily from your manager. Go to
Marketing > SEO > Robots.txt
What is in the robots.txt file?
The robots.txt file is three simple lines. To see them, go to the homepage of your site and add /robots.txt to the address in the address bar. The page will display the following three things:
- User-agent: this line indicates which search engine robots you want to communicate with. The star character (*) means all robots may look at your site and interact with your content.
- Allow: this line grants "authorization". The slash (/) means that the robots should index every page (URL) of your site.
- Sitemap: this is the URL address where your sitemap can be found. The sitemap is the file where all the URL addresses of your site are listed. Each time a page is created, it is automatically added to your sitemap.
Why edit the robots.txt file?
If you want all of your content to be indexed, there is no need to edit the robots.txt file. If you leave the file as is, you are telling the search engines that you want every page of your site indexed. The file is created automatically, so you do not need to do anything to it. If, as the webmaster of your site, you decide you need to edit this file, be aware that the edits you make can severely impact your visibility in search engines.
Do not edit this file without mastering how it works. It is important to note that even a spelling error will negatively impact your visibility.
How to edit the robots.txt file?
To edit the robots.txt file, go to your manager in the
Marketing > SEO > Robots.txt menu. Uncheck the box titled Use automatic robots.txt file so that you can edit the file.
As we explained earlier, the file contains three lines, of which the first two indicate that all search engines are allowed to index your site. This file, dedicated to improving your site's SEO, is created for you automatically from the moment you make your site.
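On a typical site, the default file looks like the following. The sitemap URL here is a placeholder; yours will use your own domain:

```
User-agent: *
Allow: /
Sitemap: https://www.example.com/sitemap.xml
```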
If you want to tell search engines specifically what to index, or conversely what not to index, you will use the set protocols to edit this document. For example, you may want Google not to index the forum of your site; in that case you would add a rule to the robots.txt field.
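Assuming the forum lives under a /forum/ path (a placeholder; use your forum's actual URL path), such a rule would look like this:

```
User-agent: Googlebot
Disallow: /forum/
```

The User-agent line targets only Google's robot (Googlebot), and the Disallow line tells it not to index any URL beginning with /forum/. Other robots are unaffected by this block.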
How to restore your file to the default base?
If you wish to restore the original file, it is quick and easy. All you need to do is go to the menu
Marketing > SEO > Robots.txt and tick the box titled Use automatic robots.txt file. Save the change and your file will return to its original settings.
Once again, we want to warn you that this file should be edited with care. In most cases you should not need to modify a single detail of it. A written error in the protocol can have a catastrophic effect on your search engine optimization.