Preventing search engines from spidering sections of your site
Hello, all,
I just recently learned that DISA (the Defense Information Systems Agency) considers a robots.txt file a Category II vulnerability finding.
https://vaulted.io/library/disa-stigs-srgs/apache_site_22_for_windows/V-2260
So I will be forced to remove the robots.txt file from our public site. Does anyone know of another way to prevent search engines from spidering certain sections of a website? I just want to keep spiders out of our components folders.
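For reference, what I have now is essentially this (the path is illustrative, not our real layout):

```
User-agent: *
Disallow: /components/
```

One alternative I've read about but haven't tried is having Apache send an X-Robots-Tag response header for those directories instead, something like the sketch below (requires mod_headers; directory path is again just an example):

```apache
# Tell crawlers not to index or follow anything served from this directory.
# Unlike robots.txt, this doesn't advertise the path in a public file.
<Directory "/var/www/html/components">
    Header set X-Robots-Tag "noindex, nofollow"
</Directory>
```

I don't know whether that satisfies the STIG, though, so I'd appreciate hearing what others have done.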
V/r,
^ _ ^
