PDA

View Full Version : How to block folders and files through the robots.txt file?



ferriesthai
09-19-2019, 11:14 PM
How to block folders and files through the robots.txt file?

rscomponentseo
09-20-2019, 05:22 AM
The way this is done is through a file called robots.txt. Robots.txt is a simple text file that sits in the root directory of your site. It tells "robots" (such as search engine spiders) which pages to crawl on your site and which pages to ignore.
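For example, a minimal robots.txt that blocks one folder and one file for all crawlers could look like this (the folder and file names are placeholders, not from the original posts):

```
User-agent: *
Disallow: /private/
Disallow: /secret.html
```

The Disallow paths are relative to the site root, so `Disallow: /private/` blocks everything under that folder, while a completely empty Disallow line would allow the whole site.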

WoodsPainting
09-23-2019, 12:01 AM
A robots text file, or robots.txt, is a must-have file for every website. It is placed in the root directory of the website, and adding it is a very simple process.

Having this file on your website is also considered a sign of good quality by the search engines. Here are the simple steps to add a robots.txt file to your website:

1. Identify which directories and files on your web server you want to block from being crawled.
2. Identify whether or not you need to specify additional instructions for a particular search engine bot beyond a generic set of crawling directives.
3. Use a text editor to create the robots.txt file and the directives that block content.
4. Optional: add a reference to your sitemap file (if you have one).
5. Check for errors by validating your robots.txt file.
6. Upload the robots.txt file to the root directory of your site.
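For the validation step, one way to sanity-check your rules before uploading is Python's standard-library robots.txt parser. This is a sketch using placeholder folder and file names, not paths from the original posts:

```python
from urllib import robotparser

# Hypothetical robots.txt content blocking one folder and one file.
robots_txt = """\
User-agent: *
Disallow: /private/
Disallow: /secret.html
"""

# Parse the rules from a string instead of fetching them from a live site.
parser = robotparser.RobotFileParser()
parser.parse(robots_txt.splitlines())

# Check which paths a generic crawler ("*") may fetch.
print(parser.can_fetch("*", "/private/data.html"))  # inside the blocked folder
print(parser.can_fetch("*", "/secret.html"))        # the blocked file
print(parser.can_fetch("*", "/index.html"))         # not blocked
```

If a path you intended to block comes back as fetchable, the directive is likely misspelled or missing its leading slash.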