Make safety a habit to prevent data theft that can cost your company millions. There are various methods to protect your company from the prying eyes of web crawlers. The first and most important step in protecting your organization from the perils of Google hacking is to have a strong security policy in place. Here are some simple steps that will help safeguard your organization:
A. Keep sensitive data off the web, if possible
Prevention is always better than cure, so remember that a web server is the place to store data that is meant to be accessed by the public. As an added safety measure, never publish any sensitive information on a public server. Sensitive information should only be published on an intranet or on a special dedicated server that follows all safety procedures and is maintained by a responsible administrator. Also, never split a public server to store different levels of information, even if you have stringent access control management in place; it is extremely simple for users to cut and paste files into different folders, rendering role-based access control useless, and a hacker who gets onto such a server simply searches more directories and gains access to more sensitive data. You can also securely share sensitive information using encrypted email or SSH/SCP.
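For example, a sensitive file can be copied to a dedicated internal server over an encrypted SSH channel with scp; this is only a sketch, and the file name, user account and host below are placeholders:
scp quarterly-report.pdf admin@intranet.example.com:/srv/restricted/
Because the transfer is encrypted end to end, the document never needs to touch the public web server at all.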
B. Directory listing should be disabled for all folders on a web server.
As discussed earlier, directory listings give hackers the option to browse all the sensitive files and subdirectories located in a particular folder. It is essential to disable directory listings for all folders so that hackers do not get access to sensitive files such as .htaccess, which should never be available for public viewing. Directory listings can be suppressed by ensuring the presence of an index.htm, index.html or default.asp file in each directory. On the Apache web server, directory listings can be disabled by placing a minus sign in front of the word ‘Indexes’ in httpd.conf.
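For example, the relevant portion of httpd.conf (or of a per-directory .htaccess file, if overrides are permitted) might look like the following sketch, where the directory path is just a placeholder:
<Directory "/var/www/html">
    Options -Indexes
</Directory>
After reloading Apache, a request for a folder that has no index page returns a 403 Forbidden error instead of listing the folder's contents.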
C. Avoid error messages that contain too much information.
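On Apache, for instance, the detail revealed by default error pages can be reduced with directives along these lines; this is only a sketch, and the custom error page is a hypothetical file you would create yourself:
ServerTokens Prod
ServerSignature Off
ErrorDocument 404 /friendly-error.html
These settings hide the server version string from error pages and response headers and show visitors a generic page instead of the default error output.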
D. Set up a gatekeeper in the form of an instruction file, such as robots.txt, for the search engine’s crawler.
The robots.txt file can easily be called your security guard, as this is the file in which website owners can provide detailed information about the files and directories that web crawlers must not access. Each line in robots.txt begins with one of the following:
# - marks a comment
User-agent: - specifies which web crawler the rule applies to (in our case Googlebot)
Disallow: - specifies which files or directories that crawler may not access. Here is an example of an entry in robots.txt:
#disallowing Googlebot Web crawler from accessing web pages
User-Agent: Googlebot
Disallow: /
Well-behaved search engine crawlers respect the directives in this file and keep away from the specified web pages; keep in mind, though, that robots.txt is advisory only and does nothing to stop a crawler that chooses to ignore it.
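A slightly broader sketch (the directory names below are placeholders) keeps every crawler out of a few sensitive areas while leaving the rest of the site crawlable:
#keeping all crawlers out of administrative and backup areas
User-agent: *
Disallow: /admin/
Disallow: /backups/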
E. Getting rid of snippets.
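One common way to do this, assuming the goal is to stop search engines from displaying snippets and cached copies of a page, is a robots meta tag in the page’s HTML head:
<meta name="robots" content="nosnippet, noarchive">
The nosnippet value asks search engines not to show a text excerpt for the page in their results, and noarchive asks them not to offer a cached copy.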
F. Utilize a web vulnerability scanner that automatically checks for pages identified by Google hacking queries, e.g. Acunetix Web Vulnerability Scanner and the Google Hack Honey Pot.
G. Password security policies.
1. Use of password protection for sensitive documents (see the sketch after this list).
2. Prevent information leakage from sensitive files stored in non-secure locations.
3. Prevent inappropriate use of password protection.
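As an illustration of point 1, a folder of sensitive documents on an Apache server can be placed behind HTTP Basic authentication; the paths and user name below are placeholders, and the sketch assumes .htaccess overrides are enabled for that folder. First create a password file on the server:
htpasswd -c /etc/apache2/.htpasswd reportsuser
Then place an .htaccess file in the protected folder:
AuthType Basic
AuthName "Restricted documents"
AuthUserFile /etc/apache2/.htpasswd
Require valid-user
Remember that Basic authentication sends credentials in an easily decoded form, so it should always be combined with HTTPS.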
H. Increased alertness to security risks such as:
1. Leaving files in web-accessible folders.
2. Allowing the collection of data about web server software versions.