Friday, April 30, 2010

Safeguarding the Organization on Web

Make security a habit to prevent data thefts that can cost your company millions. There are several ways to protect your organization from the voyeur known as the web crawler. The first important step in protecting your organization from the perils of Google hacking is to have a strong security policy in place. Here are some simple steps that will help safeguard your organization:
A. Keep sensitive data off the web, if possible
Prevention is always better than cure, so remember that a web server is meant to store data intended for public access. As a basic safety measure, never publish sensitive information on a public server. Sensitive information should be published only on an intranet or a dedicated server that follows all safety procedures and is managed by a responsible administrator. Also, never partition a single public server to store different levels of information, even with stringent access control management in place: it is extremely easy for users to cut and paste files into different folders, rendering role-based access control useless, and a restricted area only prompts a hacker to search more directories for sensitive data. You can also share sensitive information securely using encrypted email or SSH/SCP.
B. Directory listing should be disabled for all folders on a web server.
As discussed earlier, directory listings give hackers the option to browse all the sensitive files and subdirectories located in a particular folder. It is essential to disable directory listings for all folders so that hackers cannot reach files such as .htaccess that should never be available for public viewing. A directory listing can be suppressed by ensuring the presence of a default document such as index.htm, index.html or default.asp in each directory. On the Apache web server, directory listings can be disabled by placing a minus sign in front of the word 'Indexes' in the Options directive of httpd.conf.
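As a rough illustration, a small Python helper can flag pages that look like auto-generated directory indexes when you crawl your own site. The marker strings are assumptions based on the default index pages Apache and IIS typically generate; adjust them for other server software.

```python
# Heuristic check for an exposed directory listing. The marker strings below
# are assumptions based on common Apache/IIS defaults, not an exhaustive list.
LISTING_MARKERS = (
    "<title>Index of /",        # Apache mod_autoindex default title
    ">Parent Directory<",       # Apache listing link text
    "[To Parent Directory]",    # IIS listing link text
)

def looks_like_directory_listing(html: str) -> bool:
    """Return True if the page body resembles an auto-generated index."""
    return any(marker in html for marker in LISTING_MARKERS)
```

Run this over the HTML of each URL on your own server and investigate any page where it returns True.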
C. Avoid error messages that contain too much information.

D. Set up a gatekeeper in a form of an instruction page like robots.txt for the search engine’s crawler.
The robots.txt file can be thought of as your security guard: it is the file in which website owners provide detailed information about the files and directories that web crawlers should not access. Each line in robots.txt begins with one of the following:
# marks a comment
User-agent: specifies which web crawler the rules apply to (in our case Googlebot)
Disallow: specifies what that crawler may not access. Here is an example of an entry in robots.txt:
#disallowing Googlebot Web crawler from accessing web pages
User-Agent: Googlebot
Disallow: /
Well-behaved search engines respect the listings in this file and keep away from the specified web pages. Note, however, that robots.txt is itself publicly readable, so it should never be the only protection for sensitive paths.
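You can verify how a compliant crawler would interpret such rules with Python's standard-library robots.txt parser. This sketch feeds it the example entry above; the sample URL path is made up for illustration.

```python
from urllib.robotparser import RobotFileParser

# Rules mirroring the robots.txt example above: block Googlebot everywhere.
rules = [
    "#disallowing Googlebot Web crawler from accessing web pages",
    "User-Agent: Googlebot",
    "Disallow: /",
]

rp = RobotFileParser()
rp.parse(rules)

# Googlebot is denied; a crawler with no matching rule is allowed.
print(rp.can_fetch("Googlebot", "/private/report.html"))  # False
print(rp.can_fetch("OtherBot", "/private/report.html"))   # True
```

In practice you would point `rp.set_url()` at a live site's /robots.txt and call `rp.read()`, but parsing an in-memory list of lines keeps the example self-contained.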
E. Getting rid of snippets.
F. Utilize a web vulnerability scanner that automatically checks for pages identified by Google hacking queries, e.g. Acunetix Web Vulnerability Scanner or the Google Hack Honeypot.
G. Password security policies.
1. Use of password protection for sensitive documents.
2. Prevent information leakage from sensitive files in non secure locations.
3. Prevent inappropriate use of password protection.
H. Increased alertness to security risk
1. Leaving files in web accessible folders.
2. Collection of data on web server software versions.

Friday, April 23, 2010

Basics of Google search.

The Google search engine offers a simple web interface that even a novice can use to search the web, images, videos, groups and much more. A user simply types what they are looking for in the search field and presses Enter to view the results.
Google, like other search engines, uses specialized algorithms to produce results displayed in order of relevance. The Google search engine has three parts:
• Web crawler also known as Googlebot that searches and brings web pages
• Indexers whose main function is to scan every page for keywords and then index the relevant keywords in a huge database.
• Lastly, a query processor that compares the searched keywords against the index and then displays the pages it considers most relevant.

1. Web crawling and other data gathering techniques.
The Google search engine uses data-gathering techniques such as web crawlers, also called spiders. Google maintains a large index of frequently searched keywords along with the links where details about those keywords can be found. Software robots popularly known as spiders build this extensive list of words by browsing through millions of web pages, which speeds up later searches. This methodical process of creating the keyword index is known as web crawling. The spiders usually start from lists of heavily used servers and extremely popular pages. For faster results, Google runs multiple spiders in parallel, each indexing the important keywords found in page titles, subtitles and meta tags. To speed up its searches further, Google uses its own DNS servers.
Google then applies its PageRank algorithm, which orders results by relevance as determined by factors such as:
• The frequency of keywords in the content and their location: the higher the frequency, the higher the ranking
• How many other sites link to that particular page
• Lastly the history of how long the page has been in existence also affects its search engine ranking.
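The second factor, counting links between pages, can be sketched with a minimal power-iteration version of PageRank. The tiny link graph and the damping factor of 0.85 are illustrative assumptions, not Google's actual data or implementation (which also handles dangling pages, personalization and much more).

```python
# Minimal PageRank sketch via power iteration. Assumes every page in the
# graph has at least one outgoing link (no dangling-node handling).
def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outgoing in links.items():
            share = rank[page] / len(outgoing)  # split rank among out-links
            for target in outgoing:
                new_rank[target] += damping * share
        rank = new_rank
    return rank

# Hypothetical graph: A links to B and C; B links to C; C links back to A.
graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(graph)
```

Here C ends up ranked above B because two pages link to it while only one links to B, matching the intuition in the bullet list above.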

2. Specialized search functions and operators
You can simplify your search using some specialized operators and functions such as:
• Use the (+) operator when searching for a common word, and the (-) operator if you want to omit a particular term from a Google search. When using the (+) and (-) operators, do not leave any space after the sign
• If you are searching for an exact phrase, surround it with double quotes (“”)
• (.) is treated by the Google search engine as a single-character wildcard.
• Remember that (*) is not used to complete a word, as is normally the case with other search engines.

To refine Google searches further, you can use advanced Google operators with the syntax operator:search_term. Leave no space between the advanced operator, the colon and the search term; if a space is left, the operator is treated like any other search term. If your search term is a phrase, there should still be no space between the colon and the first word of the phrase, and you can use quotes around the phrase.

Some of the useful advanced search operators include:
• site: this operator should be used when the search is to be restricted to a particular domain or website
• filetype: operator is used to search for specific file type
• link: operator is used to search for a keyword within hyperlinks
• cache: supply the URL of a website after this operator to see the version of that page as last crawled by Google
• define: operator will return the definition for the searched term
• intitle: operator will make Google search for a particular term only in the titles
• inurl: operator will restrict the Google search within the URL.
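A tiny helper makes the no-space rule concrete. The helper name and the example site and terms are made up for illustration; the operators are the ones listed above.

```python
# Hypothetical helper that assembles an advanced-operator query string.
def advanced_query(operator: str, term: str, extra: str = "") -> str:
    # No space between the operator, the colon and the search term,
    # otherwise Google treats the operator as an ordinary keyword.
    query = f"{operator}:{term}"
    return f"{query} {extra}".strip()

print(advanced_query("site", "example.com", "annual report"))  # site:example.com annual report
print(advanced_query("filetype", "pdf", "budget"))             # filetype:pdf budget
```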

Tuesday, April 20, 2010

Change Management

ITIL defines change management as a process that uses standardized procedures and methodologies to ensure quick and efficient handling of all change requests, resulting in improved service quality and strengthened daily operations for the organization.
Design Patterns for Optimizing Change Management
1. Process Re-engineering
• Approach:
Fundamental rethinking and redesign of business processes to achieve dramatic improvements in critical, contemporary measures of performance, such as cost, quality, service and speed
• Key Advantages:
– Dramatic Improvements in Business Processes
– Writing 'to-be' processes on a clean slate (no bias from the 'as-is')
• Risks Involved:
– Very High Magnitude of Change
– Requires very strong top management commitment
– Painful Journey
– Poor Track Record

2. Simplification
• Approach:
A set of activities designed to bring gradual, but continual improvement to a process through constant review
• Key Advantages:
– Higher Buy in by the line managers
– Aims at taking small steps at a time
– Gradual Improvement Process
• Risks Involved:
– Requires higher time frame
– Improvements are small in quantum
– Lack of senior management leadership

3. Value Added Analysis (Lean and Six Sigma)
• Approach:
Eliminate waste from the process
• Key Advantages:
– Scientific Approach
– Very Successful in Improving Process Efficiency
– Eliminate Non Value Adding Steps from a Process
• Risks Involved:
– Improvement in Effectiveness Parameters is only a by-product
– Effectiveness Parameters may be neglected
– Wrong measures for performance impacts the quality

Wednesday, April 14, 2010

Google Hacking

Google Hacking is the latest buzzword in the e-commerce world: Google applications, including the extremely popular search engine, are being used by hackers to gain access to users' personal information by finding loopholes in the code and configuration used by a website. Although insecure websites are the easiest targets for Google hacking, hackers now use carefully deduced query combinations to reach confidential files stored on network hardware.
Google hacking is also used to attack network hardware, view cached printer pages and even snoop on security cameras. Google Hacking can thus be defined as a data-mining technique used by hackers to discover sensitive data that has been mistakenly posted on a public website. Hackers examine all the hidden recesses of an unsecured website that are not revealed during everyday Google searches, and take advantage of sensitive information that has been accidentally published or of sensitive areas of a website that have not been properly configured.

How Google has become a powerful hacker’s tool in today’s information age
The term “Google Hacking” first emerged in the early 2000s but has become a major source of mischief in recent times, as more and more people find it next to impossible to locate information without Google. Google has become a powerful hacker tool thanks to the many automated scripts available online that can be used to unearth confidential information. These automated programs exploit flaws present in websites to access private details such as credit card numbers, or public files containing bank passwords and network details. Private information is now easily available on the World Wide Web, which further increases the chances of identity theft via Google Hacking. Through Google searches, attackers can map websites, browse directory listings, perform CGI scanning and gather information about web server software, all of which helps them reach otherwise secure information. Google Hacking is also being used by virus and Trojan creators to develop programs that automatically find insecure, vulnerable systems.
Google search often adds vulnerable files to its indexes during crawling. These files can then easily be located by hackers, either manually or by using automated scripts.

Monday, April 12, 2010

Test Plan

Having a well-designed test plan is key to achieving the best quality for an application. The components of a good test plan are:-
1. Introduction and Purpose
This describes the purpose of the document, including a brief summary of the application to be tested and the types of testing that will be performed on it. Depending on the complexity of the application, separate test plans may be needed for functionality, performance, security and/or localization testing.

2. Goals of Testing
This describes what the testing hopes to achieve, including any certification criteria that need to be met prior to testing sign-off.

3. Test Environment.
The different configurations (operating systems, browsers, languages, databases) on which the product/application needs to be tested.

4. Staffing Requirements
The number of Quality Testing professionals needed to complete the testing task.

5. Dependencies, Assumptions and Limitations
Any dependencies, such as integration tools or the use of third-party software, should be clearly stated. Any limitations of the testing results should also be documented in this section.

6. Risks
This section describes any risks to testing along with their impact and any responses in place to mitigate the risk.

7. Test Deliverables
This lists the test deliverables, which are especially important in external testing. They usually include test cases, results of test execution, test logs, screenshots, etc.

8. Testing/Defect Tracking Tools
A description of tools that would be used. These include test management tools, version control tools, test automation tools as well as tools used for defect tracking.

Sunday, April 11, 2010

Safeguarding Information Security for Businesses

In today's information-driven world, the single most serious threat businesses face is to their customers' information. The customer base is the most critical asset a business has, and this is true for almost any business. With risks like identity theft increasing every day, customers are more concerned now than ever about the information they provide to businesses. Whether it's credit card information a buyer passes on to a retailer, or medical records a patient provides to his or her doctor, they expect that the business will handle that information with the utmost security and prevent any situation that could possibly compromise it.


The threat to customers' information has multiple aspects, ranging from secure transmission over a network to safe storage in a database. Businesses today are spending more money than ever to prevent hackers from getting hold of their customers' private information and misusing it. This threat is the most serious of all because it has the most serious implications: a single report of customer data compromise at a company can result in that company losing its established customer base.


The strategies and policies that businesses can adopt to mitigate the risks from the above-mentioned threat may vary based on the type of information that the business deals with but the following set of practices can help any business deal with and control the threat:

Avoiding non-electronic forms of collecting and storing customer data as much as possible - Although it's not feasible in all scenarios to completely eliminate paperwork, reducing it to the minimum possible level helps prevent information from falling into malicious hands.

Employing appropriate security measures for data exchange over the Internet - Intercepting information passed over the Internet is one of the most common forms of data compromise. This applies not only to online retailers but also to any business that exchanges data over the Internet. Standards can be put in place for such information exchange that prevent the information from being used by unauthorized personnel; these include data encryption using public and private keys, and digital signatures.

Setting up strict policies on information access levels within the enterprise - In some cases customer data is compromised at the hands of the business's own employees, intentionally or unintentionally; a common example is the handling of information by customer service representatives. To prevent the risk of unauthorized access, companies should adopt strict access control policies that define appropriate data access levels based on “roles”. Information should only be allowed to flow from a lower-level security role to a higher-level security role, and never in the reverse order.
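A role-based access check like the one described above can be sketched in a few lines. The role names and clearance levels here are illustrative assumptions, not a standard scheme.

```python
# Minimal role-based access-control sketch. Role names and numeric
# clearance levels are hypothetical examples.
ROLE_LEVELS = {"csr": 1, "manager": 2, "security_officer": 3}

def can_read(role: str, data_level: int) -> bool:
    """A role may read data only at or below its own clearance level.

    Unknown roles get level 0, i.e. no access to classified data.
    """
    return ROLE_LEVELS.get(role, 0) >= data_level
```

A customer service representative (level 1) would thus be denied level-3 data, while a security officer could read everything.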

Adopting appropriate data storage and backup measures - Apart from the threat of hackers trying to access private information, events like disk crashes can also result in loss of data. To prevent such risks, companies should adopt appropriate data storage and backup measures, including taking regular backups and maintaining an offsite backup location. Creating and maintaining a BCP (Business Continuity Plan) is also part of ensuring data security.


The following references and incidents support my belief that the above described threat is the most serious of all, to today's businesses:


“Recent Security Breaches - Class Action Lawsuit Alleges Palm Pre/Pixi Users Suffered from Data Loss”, www.databreaches.net

“Recommended Practices on Notification of Security Breach Involving Personal Information”, California Office of Privacy Protection, www.privacy.ca.gov/recommendations/secbreach.pdf

“T-Mobile confirms biggest phone customer data breach”, reported by the UK Guardian, http://www.guardian.co.uk/uk/2009/nov/17/t-mobile-phone-data-privacy

“Customer Data Breach Costs”, report by Ecommerce Times, http://www.ecommercetimes.com/story/66055.html

“2008 Annual Study: Cost of a Data Breach”, http://www.encryptionreports.com/2008cdb.html

Security breach legislation, available at http://www.ncsl.org/Default.aspx?TabId=13481

Thursday, April 8, 2010

V Testing Model



The V testing model is one of the most widely used and accepted testing models in the industry. It is preferred because it emphasizes that testing and development activities proceed hand in hand.

Each stage in the waterfall development model coincides with a corresponding stage of testing. While the waterfall model works from top to bottom, the V model folds back up the other side: coding pairs with unit testing, low-level design with integration testing, high-level design with system testing, and requirements gathering with acceptance testing.

The advantage of the V model is the early detection of bugs, since testing commences hand in hand with development. Bugs discovered at the lowest level are easier to fix and do not propagate into bigger issues in the end product.

Tuesday, April 6, 2010

Black Box vs White Box Testing

Black Box Testing
Black box testing refers to testing of the application without taking into account the internal structure of the program. As long as the program returns correct results based on the input parameters, the testing is considered to be valid.

Advantages:-
1. It mimics end-user behavior.
2. It can uncover missing specifications.
3. Test cases can be designed from the specifications alone, even for portions of the application not yet implemented.
4. As the tester is independent of the developer, it allows for uncompromised testing.
5. The tester need not have knowledge of the internal structure of the program.
6. For large systems, it simplifies testing by ensuring all inputs and outputs are exercised.

Disadvantages:-
1. All possible scenarios may not be covered.
2. It does not allow for structural testing of the system.
3. Test cases need to be redesigned whenever input methods or the user interface change.
Common Black Box testing methodologies:-
1. Decision table testing
2. Pair wise testing
3. State transition tables
4. Use case testing
5. Cross-functional testing
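Decision table testing, the first methodology above, pairs every combination of input conditions with an expected outcome. A minimal sketch, using a hypothetical discount rule invented for illustration:

```python
# Decision-table test sketch for a hypothetical rule: members get 10% off,
# and any order of 100 or more gets a further 5% off.
def discount(is_member: bool, order_total: float) -> float:
    rate = 0.0
    if is_member:
        rate += 0.10
    if order_total >= 100:
        rate += 0.05
    return rate

# Each row of the decision table: (is_member, order_total, expected rate).
decision_table = [
    (False, 50.0, 0.00),
    (False, 150.0, 0.05),
    (True, 50.0, 0.10),
    (True, 150.0, 0.15),
]

for is_member, total, expected in decision_table:
    assert abs(discount(is_member, total) - expected) < 1e-9
```

Note that the test only exercises inputs and expected outputs; nothing about the function's internal structure is assumed, which is exactly the black box discipline.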


White Box testing
White box testing refers to testing of the application taking into account the internal structure of the program. It is also referred to as structural, glass box or transparent box testing. It requires understanding of the coding language and underlying architecture of the program.

Advantages:-
1. It ensures that all possible execution paths are traversed at least once.
2. It offers greater stability and reusability if the basic components of the application do not change.
3. It ensures complete testing of all possible input and output parameters

Disadvantages:-
1. It does not mimic end-user behavior.
2. The tester needs to know the workings of the code and the language used in the application's design.
3. Test cases are more complex and more difficult to execute than in black box testing.

Common white box testing methodologies:-
1. Control flow testing
2. Data flow testing
3. Branch testing
4. Path testing
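Branch testing, for example, means choosing inputs so that every branch of the code under test executes at least once. A minimal sketch with a hypothetical function:

```python
# White-box branch-testing sketch. The function under test is hypothetical;
# the tester reads its code and picks one input per branch.
def classify(n: int) -> str:
    if n < 0:
        return "negative"
    elif n == 0:
        return "zero"
    else:
        return "positive"

# Three test cases, one per branch, give full branch coverage here.
assert classify(-3) == "negative"
assert classify(0) == "zero"
assert classify(7) == "positive"
```

Unlike the black box approach, selecting these inputs requires reading the code, since the branches themselves are what the tester is covering.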

Sunday, April 4, 2010

Risk Control Strategies

The four basic risk strategies for risk control resulting from vulnerabilities are:-

1. Avoidance includes putting controls in place to prevent or reduce the occurrence of risk. This is the preferred approach, as it deals with avoiding the risk rather than coping with its consequences. It is accomplished through technology safeguards and controls that minimize risk to an acceptable level, sound policies that remove vulnerabilities in assets, and educating, training and creating awareness among employees on all aspects of information security.
Avoidance is adopted to reduce risks to an acceptable level within the organization, especially for vulnerabilities that, if exploited, threaten the business continuity and day-to-day operations of the organization. It is particularly important to avoid vulnerabilities that strike at the culture and foundation of the organization, such as the risk to personal and credit card data held by online retailers like Amazon.

2. Mitigation involves measures to reduce the impact of risk. This involves creating policies and procedures for responding to incidents, and plan for restoring operations of the company in case of disasters and the action the company would take should an attack or breach occurs. The three main mitigation plans are:
Incident response plan: This includes procedures for responding to any security incident. Includes reporting structure and escalation procedures for critical incidents
Business continuity plan: This includes plans for restoring normal modes of business operation with minimum cost and disruption to business activities following a disaster event.
Disaster recovery plan: This includes plans and procedures for locating lost data and restoring lost services due to attack or disruption
These controls are adopted once an incident has already taken place. Mitigation involves controls that aim to reduce losses to a minimum and steps to restore business operations after an interruption or disaster.

3. Transference involves shifting risk to another entity, process or organization. The most common transference strategies involve outsourcing and purchasing insurance. It may also include alternate deployment of controls, using different applications, etc.
Transference is used where the cost of implementing or developing risk controls within the organization exceeds the cost of procuring the same benefits through outsourcing or insurance. It is also used when the organization does not have enough in-house resources proficient in risk management, and is accomplished by hiring firms or individuals proficient in risk management implementation and control as third-party contractors and transferring the management of complex systems to them.

4. Acceptance refers to making no attempt to protect an asset and accepting the loss if it occurs. It is the absence of any control to safeguard the business and the organization from the exploitation of vulnerabilities.
Acceptance should be resorted to only after a thorough feasibility analysis of the risk level, probability of occurrence and potential impact on the assets shows that the cost of implementing a control far exceeds the benefit it would provide.

Thursday, April 1, 2010

Information Security Controls

The security of information has become the most prevalent problem on the Web today. The NIST publication “NIST SP 800-26, Security Self-Assessment Guide for Information Technology Systems” lists and defines these controls.

Management Controls deal with security project management and with the design and implementation of policies, procedures and standards throughout the organization. They include provisions for risk management: assessing and identifying risk, evaluating risk controls, summarizing findings, selecting a cost-effective control, and installing and implementing it within the organization. They also include periodic, systematic review and evaluation of the security policy, conducted internally or with independent reviewers, along with policies for the revision and approval of any changes resulting from those reviews. The major Management Controls are:
1. Risk Management
2. Review of Security Controls
3. Life Cycle Maintenance
4. Authorization of Processing
5. System Security Plan


Operational Controls cover planning for incident response, disaster recovery and business continuity. These include policies on reporting and escalating security incidents, preparing a proper line of response, incident classification, and evidence collection and reporting for knowledge sharing. They also include procedures to ensure continuity of operations and restoration of company operations in the event of interruption or failure; recovery plans need to be constantly evaluated, updated and tested to keep up with the company's current business operations. Provisions for physical security are included as well: access cards and gates, securing server and office rooms and facilities, security of media in transit, equipment protection and maintenance, cable security, disposal of equipment and information, removal of equipment from the premises, and public access to the company's information and assets. Another important area is personnel security and the protection of production and input/output controls. Operational controls also ensure that all employees are trained and educated on information security and are aware of their responsibility for complying with policy and for maintaining security and reporting any breaches or incidents.
Major Operational Controls are
1. Personnel Security
2. Physical Security
3. Production, Input/Output Controls
4. Contingency Planning
5. Hardware and Systems Software
6. Data Integrity
7. Documentation
8. Security Awareness, Training, and Education
9. Incident Response Capability

Technical Controls involve researching and selecting the technology necessary to develop and implement security controls in an organization. These include technology for physical access (cards, passwords or a combination of both), technology for remote access, policies for third-party software, and email and Internet policies. They also include policies for remote monitoring, audit trails and automated audits of information security incidents.
Major Technical Controls are
1. Identification and Authentication
2. Logical Access Controls
3. Audit Trails