Make security a habit to prevent data thefts that can cost your company millions. There are various methods to protect your company from prying web crawlers. The first important step in protecting your organization from the perils of Google hacking is to have a strong security policy in place. Here are some simple steps that will help safeguard your organization:
A. Keep sensitive data off the web, if possible
Prevention is always better than cure, so remember that a public web server should only hold data that is meant to be accessed by the public. As an added safety measure, never publish any sensitive information on a public server. Sensitive information should only be published on an intranet or on a special dedicated server that follows all required safety procedures and is managed by a responsible administrator. Also, never partition a public server to store different levels of information, even if you have stringent access control management in place: it is extremely easy for users to cut and paste files into different folders, rendering role-based access control useless, and an intruder who breaches the server can then reach the more sensitive data as well. If you need to share sensitive information securely, use encrypted email or SSH/SCP.
B. Directory listing should be disabled for all folders on a web server.
As discussed earlier, directory listings give hackers the option to browse all the sensitive files and subdirectories located in a particular folder. It is essential to disable directory listings for all folders so that hackers cannot reach sensitive files such as .htaccess, which should never be available for public viewing. One way to suppress listings is to ensure that every directory contains a default document such as index.htm, index.html or default.asp. On an Apache web server, directory listings can be disabled by placing a minus sign in front of the word 'Indexes' in the Options directive of httpd.conf.
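For example, on a typical Apache installation the relevant entry in httpd.conf (or in a per-directory .htaccess file) might look like the sketch below; the directory path is only illustrative:

<Directory "/var/www/html">
    # The leading minus sign turns directory indexes off for this tree
    Options -Indexes
</Directory>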
C. Avoid error messages that contain too much information.
D. Set up a gatekeeper in the form of an instruction file, such as robots.txt, for the search engines' crawlers.
The robots.txt file can be thought of as your security guard: it is where website owners provide detailed information about the files and directories that web crawlers should not access. Each line in robots.txt begins with one of the following:
# - which marks the line as a comment
User-agent: - which specifies which web crawler the rule applies to (in our case Googlebot)
Disallow: - which specifies the paths the crawler may not access. Here is an example of an entry in robots.txt:
#disallowing Googlebot Web crawler from accessing web pages
User-Agent: Googlebot
Disallow: /
Well-behaved search engines respect the directives in this file and keep away from the specified web pages, although robots.txt is only advisory and malicious crawlers are free to ignore it.
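A slightly broader sketch, blocking two hypothetical directories for every crawler, would look like this:

#keep all crawlers out of these (illustrative) private areas
User-agent: *
Disallow: /private/
Disallow: /backups/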
E. Getting rid of snippets.
F. Utilize a web vulnerability scanner that automatically checks for pages identified by Google hacking queries, e.g. Acunetix Web Vulnerability Scanner and the Google Hack Honeypot.
G. Password security policies.
1. Use of password protection for sensitive documents.
2. Prevent information leakage from sensitive files in non-secure locations.
3. Prevent inappropriate use of password protection.
H. Increased alertness to security risks
1. Leaving files in web accessible folders.
2. Collection of data on web server software versions.
Friday, April 30, 2010
Friday, April 23, 2010
Basics of Google search.
The Google search engine boasts a simplified web interface that even a novice can use to search the web, images, videos, groups and much more. A user simply types what they are looking for into the search field and presses Enter to view the results.
Google, like other search engines, uses specialized algorithms to produce results that are displayed in order of relevance. The Google search engine has three parts:
• A web crawler, known as Googlebot, that finds and fetches web pages
• An indexer whose main function is to sort the keywords on every page and store them in a huge index database
• Lastly, a query processor that compares the search query against the index and then displays the pages it considers most relevant
1. Web crawling and other data gathering techniques.
The Google search engine uses data gathering techniques such as web crawlers, also called spiders. Google maintains a large index of the most searched keywords together with the links where details about those keywords can be found. Software robots, popularly known as spiders, build this extensive list of words by browsing through millions of web pages, so that later searches can be answered quickly. This methodical process of building the keyword index is known as web crawling. The spiders usually start with lists of the most heavily used servers and the most popular pages. To return results faster, Google uses at least four spiders, which index the important keywords found in a page's titles, subtitles and meta tags. To further speed up its searches, Google uses its own DNS servers.
Google then applies its PageRank algorithm, which orders the results by relevance based on factors such as:
• The frequency and location of the keywords in the content: the higher the frequency, the higher the ranking
• How many other sites link to that particular page
• Lastly, how long the page has been in existence, which also affects its search engine ranking
2. Specialized search functions and operators
You can simplify your search using some specialized operators and functions such as:
• Use the (+) operator to force inclusion of a common word, and the (-) operator to omit a particular term from a Google search. Do not leave a space between these signs and the term that follows them
• If you are searching for an exact phrase, surround it with double quotes ("")
• (.) is treated by the Google search engine as a single-character wildcard
• Remember that (*) is not used for completing a word, as is normally the case with other search engines
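For example, a query along the following lines (the terms themselves are only illustrative) returns pages that contain the exact phrase "annual report", must include 2009 and must not mention draft:

"annual report" +2009 -draft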
To further refine Google searches, you can use advanced Google operators, which follow the syntax operator:search_term. There should be no space between the advanced operator, the colon and the search term; if a space is left, the operator is treated like any other search term. If your search term is a phrase, there should likewise be no space between the colon and the first word of the phrase, and you can use quotes around the phrase.
Some of the useful advanced search operators include:
• site: restricts the search to a particular domain or website
• filetype: searches for a specific file type
• link: searches for a keyword within hyperlinks
• cache: supply the URL of a website after this operator to see the version of that page that Google has stored in its cache
• define: returns definitions for the searched term
• intitle: makes Google search for the term only in page titles
• inurl: restricts the Google search to the URL
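These operators can be combined. For instance, a security tester auditing their own site might run something like the query below (the domain is hypothetical) to find stray PDF documents whose titles mark them as confidential:

site:example.com filetype:pdf intitle:"confidential"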
Tuesday, April 20, 2010
Change Management
ITIL defines change management as the process of using standardized procedures and methods to ensure quick and efficient handling of all change requests, which improves service quality and strengthens the daily operations of an organization.
Design Patterns for Optimizing Change Management
1. Process Re-engineering
• Approach:
Fundamental rethinking and redesign of business processes to achieve dramatic improvements in critical, contemporary measures of performance, such as cost, quality, service and speed
• Key Advantages:
– Dramatic Improvements in Business Processes
– Writing to-be processes on a clean slate (no bias from the as-is state)
• Risks Involved:
– Very High Magnitude of Change
– Requires very strong top management commitment
– Painful Journey
– Poor Track Record
2. Simplification
• Approach:
A set of activities designed to bring gradual, but continual improvement to a process through constant review
• Key Advantages:
– Higher buy-in by the line managers
– Aims at taking small steps at a time
– Gradual Improvement Process
• Risks Involved:
– Requires a longer time frame
– Improvements are small in quantum
– Lack of senior management leadership
3. Value Added Analysis (Lean and Six Sigma)
• Approach:
Eliminate Waste from the process
• Key Advantages:
– Scientific Approach
– Very Successful in Improving Process Efficiency
– Eliminate Non Value Adding Steps from a Process
• Risks Involved:
– Improvement in Effectiveness Parameters is only a by-product
– Effectiveness Parameters may be neglected
– Wrong performance measures impact quality
Wednesday, April 14, 2010
Google Hacking
Google Hacking is the latest buzzword in the e-commerce world: Google applications, including the extremely popular search engine, are being used by hackers to gain access to users' personal information by finding loopholes in the code and configuration used by websites. Although insecure websites are the easiest targets for Google hacking, hackers are now using carefully crafted query combinations to reach confidential files stored on network hardware.
Google hacking is also being used to probe network hardware, retrieve cached printer pages and even snoop on security cameras. Google Hacking can thus be defined as a data mining technique used by hackers to discover sensitive data that has been mistakenly posted on a public website. Hackers examine all the hidden recesses of an unsecured website that are not revealed by everyday Google searches, and they take advantage of sensitive information that has been accidentally put on a website or of sensitive areas that have not been properly configured.
How Google has become a powerful hacker’s tool in today’s information age
The term "Google Hacking" first emerged in the early 2000s but has become a major source of mischief in recent times, when more and more people find it next to impossible to locate information without Google. Google has become a powerful hacker tool thanks to the many automated scripts available online that can be used to unearth confidential information. These automated programs exploit flaws in websites to gain access to private details such as credit card numbers, or to public files that contain bank passwords and network details. So much private information is now easily available on the World Wide Web that the chances of identity theft through Google Hacking keep increasing: with ordinary Google searches an attacker can map a website, browse directory listings, perform CGI scanning and gather information about the web server, all of which helps them reach supposedly secure information. Google Hacking is also being used in this information age by virus and Trojan creators to develop programs that can automatically find insecure, vulnerable systems.
Google often adds vulnerable files to its indexes during normal crawling. These files can then easily be located by hackers, either manually or with automated scripts.
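A classic illustration (the pattern is generic rather than aimed at any particular site) is a query such as the following, which surfaces servers that still have directory listings enabled:

intitle:"index of" "parent directory"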
Monday, April 12, 2010
Test Plan
Having a well-designed test plan is the key to achieving the best quality for an application. The components of a good test plan are:
1. Introduction and Purpose
This describes the purpose of the document, including a brief summary of the application to be tested and the types of testing that will be performed on it. Depending on the complexity of the application, separate test plans may be needed for functionality, performance, security and/or localization testing.
2. Goals of Testing
This describes what the testing hopes to achieve, including any certification criteria that must be met prior to testing sign-off.
3. Test Environment
The different configurations (operating systems, browsers, languages, databases) on which the product/application needs to be tested.
4. Staffing Requirements
The number of Quality Testing professionals needed to complete the testing task.
5. Dependencies, Assumptions and Limitations
Any dependencies, such as integration tools or the use of third-party software, should be clearly stated. Any limitations of the testing results should also be documented in this section.
6. Risks
This section describes any risks to testing along with their impact and any responses in place to mitigate the risk.
7. Test Deliverables
This lists the test deliverables, which are especially important in external testing. They usually include test cases, results of test execution, test logs, screenshots, etc.
8. Testing/Defect Tracking Tools
A description of tools that would be used. These include test management tools, version control tools, test automation tools as well as tools used for defect tracking.
Sunday, April 11, 2010
Safeguarding Information Security for Businesses
In today's information-driven world, the single most serious threat that businesses face is to their customers' information. The customer base is the most critical asset a business has, and this is true for almost any business. With risks like identity theft increasing every day, customers are more concerned now than ever about the information they provide to businesses. Whether it is credit card information that a buyer passes on to a retailer or medical records that a patient provides to a doctor, customers expect the business to handle that information with the utmost security and to prevent any situation that could compromise it.
The threat to customers' information has multiple aspects, ranging from secure transmission over a network to safe storage in a database. Businesses today are spending more money than ever to prevent hackers from getting hold of their customers' private information and misusing it. The reason this threat is the most serious of all is that it has the most serious implications: a single report of customer data compromise can result in a company losing its established customer base.
The strategies and policies that businesses can adopt to mitigate this threat may vary based on the type of information the business deals with, but the following set of practices can help any business deal with and control it:
Avoiding non-electronic forms of collecting and storing customer data as much as possible - Although it is not feasible in all scenarios to completely eliminate paperwork, reducing it to the minimum possible level helps prevent information from getting into malicious hands.
Employing appropriate security measures for data exchange over the Internet - Intercepting information passed over the Internet is one of the most common forms of data compromise. This applies not only to online retailers but also to businesses that exchange data over the Internet for various purposes. Standards can be put in place for such exchanges that prevent the information from being used by unauthorized personnel. These include data encryption using public and private keys, and digital signatures.
Setting up strict policies on information access levels within the enterprise - In some cases customer data is compromised at the hands of the business's own employees, intentionally or unintentionally. A common example is the handling of information by customer service representatives. To prevent the risk of unauthorized access, companies should adopt strict access control policies that define appropriate data access levels based on "roles". Information should only be allowed to flow from a lower-level security role to a higher-level security role, and never in the reverse direction.
Adopting appropriate data storage and backup measures - Apart from the threat of hackers trying to access private information, events like disk crashes can also result in loss of data. To prevent such risks, companies should adopt appropriate data storage and backup measures, including taking regular backups of data and maintaining an offsite backup location. Creating and maintaining a BCP (Business Continuity Plan) is also part of ensuring data security.
The following references and incidents support my belief that the threat described above is the most serious of all to today's businesses:
"Recent Security breaches - Class Action Lawsuit Alleges Palm Pre/Pixi Users Suffered from Data Loss", www.databreaches.net
“Recommended Practices on Notification of Security Breach Involving Personal Information,” from the California Office of Privacy Protection, www.privacy.ca.gov/recommendations/secbreach.pdf
“T-Mobile confirms biggest phone customer data breach” reported by the UK Guardian, http://www.guardian.co.uk/uk/2009/nov/17/t-mobile-phone-data-privacy
Report on “Customer Data Breach Costs” by Ecommerce Times, http://www.ecommercetimes.com/story/66055.html
“2008 Annual Study: Cost of a Data Breach”, http://www.encryptionreports.com/2008cdb.html
Security Breach Legislation available at http://www.ncsl.org/Default.aspx?TabId=13481
Thursday, April 8, 2010
V Testing Model
The V testing model is one of the most widely used and accepted testing models in the industry. It is preferred because it emphasizes that testing and development activities go hand in hand.
Each stage in the waterfall development model coincides with a corresponding stage of testing. While the waterfall development model works from top to bottom, the V model folds the sequence back up the other side: coding pairs with unit testing, low-level design with integration testing, high-level design with system testing, and requirements gathering with acceptance testing.
The advantage of the V model is early detection of bugs in the product, as testing commences hand in hand with development. Bugs discovered at the lowest levels are easier to fix and do not grow into bigger issues in the end product.
Tuesday, April 6, 2010
Black Box vs White Box Testing
Black Box Testing
Black box testing refers to testing an application without taking into account the internal structure of the program. As long as the program returns correct results for the given input parameters, the testing is considered valid.
Advantages:-
1. It mimics end user behavior.
2. It can uncover missing specifications.
3. It allows test cases to be designed for portions of the application that are not yet implemented
4. As the tester is independent of the developer, it allows for unbiased testing
5. The tester does not need knowledge of the internal structure of the program
6. For large systems, it simplifies the testing by ensuring all the inputs and outputs are tested
Disadvantages:-
1. All possible scenarios may not be covered
2. It does not allow for structural testing of the system
3. Test cases need to be redesigned whenever input methods or the user interface change
Common Black Box testing methodologies:-
1. Decision table testing
2. Pair wise testing
3. State transition tables
4. Use case testing
5. Cross-functional testing
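As a minimal sketch of the black box approach (the function and the expected values are hypothetical, not from any real product), the test cases below exercise only inputs and observable outputs, without ever looking at the implementation:

import unittest

def shipping_cost(weight_kg):
    # Hypothetical function under test: flat rate below 5 kg, per-kg rate at or above.
    if weight_kg <= 0:
        raise ValueError("weight must be positive")
    return 5.0 if weight_kg < 5 else weight_kg * 2.0

class ShippingCostBlackBoxTest(unittest.TestCase):
    # Only inputs and expected outputs appear here; the internals are irrelevant to the tests.
    def test_flat_rate_below_threshold(self):
        self.assertEqual(shipping_cost(2), 5.0)

    def test_per_kg_rate_at_threshold(self):
        self.assertEqual(shipping_cost(5), 10.0)

    def test_rejects_invalid_weight(self):
        with self.assertRaises(ValueError):
            shipping_cost(0)

if __name__ == "__main__":
    unittest.main()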
White Box testing
White box testing refers to testing of the application taking into account the internal structure of the program. It is also referred to as structural, glass box or transparent box testing. It requires understanding of the coding language and underlying architecture of the program.
Advantages:-
1. It ensures that all possible execution paths are traversed at least once.
2. It offers greater stability and reusability if the basic components of the application do not change
3. It ensures complete testing of all possible input and output parameters
Disadvantages:-
1. It does not mimic end-user behavior
2. The tester needs to know the workings of the code and the language used in the application's design
3. Test cases are more complex and more difficult to execute than black box tests
Common white box testing methodologies:-
1. Control flow testing
2. Data flow testing
3. Branch testing
4. Path testing
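By contrast, a white-box sketch (again with a hypothetical function) derives its test cases from the code's structure, aiming to execute every branch at least once:

import unittest

def classify_temperature(celsius):
    # Three branches; white-box test cases are chosen so that each one executes.
    if celsius < 0:
        return "freezing"
    elif celsius < 30:
        return "moderate"
    else:
        return "hot"

class TemperatureBranchCoverageTest(unittest.TestCase):
    def test_freezing_branch(self):
        self.assertEqual(classify_temperature(-5), "freezing")

    def test_moderate_branch(self):
        self.assertEqual(classify_temperature(20), "moderate")

    def test_hot_branch(self):
        self.assertEqual(classify_temperature(35), "hot")

if __name__ == "__main__":
    unittest.main()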
Sunday, April 4, 2010
Risk Control Strategies
The four basic strategies for controlling the risks that result from vulnerabilities are:
1. Avoidance involves putting controls in place to prevent or reduce the occurrence of risk. This is the preferred approach, as it deals with avoiding the risk rather than with methods of coping with it afterwards. It is accomplished through technology safeguards and controls that minimize risk to an acceptable level, sound policies that remove vulnerabilities in assets, and education, training and awareness programs for employees on all aspects of information security.
Avoidance is adopted to reduce risks to an acceptable level within the organization and is applied to vulnerabilities that, if exploited, threaten business continuity and the day-to-day operations of the organization. It is especially important to avoid vulnerabilities that would strike at the culture and foundation of the organization, such as the risk to personal and credit card data held by online retailers like Amazon.
2. Mitigation involves measures that reduce the impact of a risk. This means creating policies and procedures for responding to incidents, plans for restoring company operations in case of disaster, and the actions the company will take should an attack or breach occur. The three main mitigation plans are:
Incident response plan: procedures for responding to any security incident, including the reporting structure and escalation procedures for critical incidents.
Business continuity plan: a plan for restoring normal modes of business operation with minimum cost and disruption to business activities following a disaster.
Disaster recovery plan: plans and procedures for recovering lost data and restoring lost services after an attack or disruption.
These controls apply once an incident has already taken place: mitigation aims to keep losses to a minimum and provides the steps needed to restore business operations after an interruption or disaster.
3. Transference involves transferring or shifting risk to another entity, process or organization. The most common transference strategies involve outsourcing and purchasing insurance. It may also include alternative deployment of controls, the use of different applications, etc.
Transference is used where the cost of implementing or developing risk controls within the organization exceeds the cost of obtaining the same benefit through outsourcing or insurance. It is also used when the organization does not have enough in-house staff proficient in risk management; in that case, firms or individuals proficient in risk management implementation and control are hired as third-party contractors, and the management of complex systems is transferred to them.
4. Acceptance refers to making no attempt to protect an asset and accepting the loss if it occurs. It is the absence of any control to safeguard the business and the organization from the exploitation of a vulnerability.
Acceptance should be resorted to only after a thorough feasibility analysis of the risk level, the probability of occurrence and the potential impact on the asset shows that the cost of implementing a control would far exceed the benefit it provides.
Thursday, April 1, 2010
Information Security Controls
The security of information has become one of the most prevalent problems on the Web today. The NIST publication "NIST SP 800-26: Security Self-Assessment Guide for Information Technology Systems" lists and defines three families of controls: management, operational and technical.
Management controls deal with security project management and with the design and implementation of policies, procedures and standards throughout the organization. These include provisions for risk management: assessing and identifying risk, evaluating risk controls, summarizing findings, and then selecting a cost-effective control and installing and implementing it within the organization. They also include periodic, systematic review and evaluation of the security policy, either within the organization or by independent reviewers, and policies for the revision and approval of any changes that result from those reviews. The major management controls are:
1. Risk Management
2. Review of Security Controls
3. Life Cycle Maintenance
4. Authorization of Processing
5. System Security Plan
Operational controls cover planning for incident response, disaster recovery and business continuity. They include policies on reporting and escalating security incidents, preparing a proper line of response, incident classification, and evidence collection and reporting for knowledge sharing. They also include procedures to ensure continuity of operations and the restoration of company operations in the event of interruption or failure; the recovery plans need to be constantly evaluated, updated and tested to keep up with the company's business operations. Provisions for physical security are also part of operational controls: access cards and gates, securing server rooms, offices and facilities, protecting media in transit, equipment protection and maintenance, cable security, disposal of equipment and information, removal of equipment from the premises, and public access to the company's information and assets. Another important area is personnel security and the protection of production and input/output controls. Finally, operational controls ensure that all employees are trained and educated on information security and are aware of their responsibility for complying with policy and for reporting any security breaches or incidents.
The major operational controls are:
1. Personnel Security
2. Physical Security
3. Production, Input/Output Controls
4. Contingency Planning
5. Hardware and Systems Software
6. Data Integrity
7. Documentation
8. Security Awareness, Training, and Education
9. Incident Response Capability
Technical controls involve researching and selecting the technology necessary to develop and implement security controls in an organization. These include technology for physical access (cards, passwords or a combination of both), technology for remote access, policies for third-party software, and email and Internet policies. They also include policies for remote monitoring, audit trails and automated audits of information security incidents.
The major technical controls are:
1. Identification and Authentication
2. Logical Access Controls
3. Audit Trails
Saturday, March 27, 2010
Information Security Certification Programs
Three major information security certification programs are:-
1. (ISC)2 (International Information Systems Security Certification Consortium, Inc.) certifications, which include:
a. Certified Information Systems Security Professional (CISSP),
b. Systems Security Certified Practitioner (SSCP) and
c. Certification and Accreditation Professional (CAP)
2. Global Information Assurance Certification (GIAC), a series of technical security certifications offered by SANS. These certifications have three levels: silver, gold and platinum; platinum certifications combine several certifications and require an additional exam.
3. Information Systems Audit and Control Association certifications: Certified Information Systems Auditor (CISA) and Certified Information Security Manager (CISM)
Similarities between SSCP, GIAC and CISA
1. All three are for auditing, networking and security professionals dealing with auditing and security planning and implementations.
2. All three are certifications that combine technical knowledge with understanding of vulnerabilities, risks and business best practices.
3. All are widely accepted certifications in the IS industry; they command respect and are recognized widely within organizations and businesses.
4. They all require successful completion of an exam to be awarded and adherence to code of ethics and security standards.
5. They all require recertification or Continuing Professional Education (CPE) to maintain the certification
Difference between SSCP, GIAC and CISA
1. Experience Level
a. SSCP: requires at least 1 year of cumulative work experience in one or more of the seven test domains of the information systems security Common Body of Knowledge (CBK).
b. GIAC requires no verifiable work experience
c. CISA: requires five years of verifiable experience in IS auditing, control or security, obtained in the 10 years preceding the exam.
2. Recertification period and process
a. SSCP: recertification is required every 3 years by earning 60 CPE credits and paying an annual maintenance fee.
b. GIAC: requires recertification every 2 to 4 years, at an interval determined by the certification.
c. CISA: no exam is required, but to maintain the certification one must pay an annual maintenance fee and complete 20 CPE hours annually.
3. Pattern of Examination
a. SSCP: 125 multiple-choice questions in 3 hours, covering the seven Common Body of Knowledge test domains listed below:
i. Access Controls
ii. Administration
iii. Audit and Monitoring
iv. Risk, Response and Recovery
v. Cryptography
vi. Data Communications
vii. Malicious Code/Malware
b. GIAC: To obtain GIAC certification candidates must complete a practical, hands-on exam in addition to one or more technical exams.
c. CISA: the exam is offered only twice a year and requires completion of 200 multiple-choice questions in 4 hours.
References and More Information:
CISA - Certified Information Systems Auditor. Retrieved December 10, 2009 from the World Wide Web: http://certification.about.com/od/certifications/p/CISA.htm
Systems Security Certified Practitioner (SSCP). Retrieved December 10, 2009 from the World Wide Web: http://certification.about.com/od/certifications/p/sscp.htm
GIAC Certifications. Retrieved December 10, 2009 from the World Wide Web: http://certification.about.com/cs/profiles/p/sansgiac.htm
Thursday, March 25, 2010
Component based development:
This development methodology improves the development process by reducing risk and shortening time to market, which in turn reduces cost. However, the use of pre-designed components may lead to compromises in requirements. Another disadvantage is reliance on an external party for support, which can create issues when immediate escalation is needed for time-sensitive business requirements.
Advantages
1. Reduction in development time
2. Increase productivity
3. Reduced risk as pre tested components are used
4. Confirmation with Standards
5. Improved product quality
6. Shorter time to market.
7. They can be used when no in-house expertise is available on a particular technology
Disadvantages
1. Compromise in requirements
2. Reliability of components and sensitivity to change
3. Problems customizing components for the product's use
4. Components may become obsolete; if the third party decides to discontinue the product, no support remains should a problem occur during development
Wednesday, March 24, 2010
Skill sets for an IT Executive
While leadership styles may vary, the following is a time-honored skill set for an IT executive:
1. S/he should be knowledgeable about all aspects of the business, including logistics and operations, finance, cash flow and budgeting, so as to be able to take time-sensitive decisions that impact all sides of the business.
2. S/he should be a quick decision maker, but at the same time flexible enough to adjust decisions to changing requirements and circumstances.
3. S/he should be firm with deadlines, but at the same time compassionate enough to account for personal emergencies.
4. S/he should be a good negotiator and should show strength and decisiveness in dealings.
5. S/he should have excellent oral and written communication skills.
6. S/he should have a long-range vision and set goals, which requires forecasting, anticipation and strategic decision making.
7. S/he should be a good leader, with the ability to create faith in both themselves and the organization.
Monday, March 22, 2010
UML Class Diagrams
UML class design techniques make class diagrams an effective tool when developing new release features. UML class diagrams are static diagrams that represent the entire structure of a new feature by showing the classes used to develop it, their attributes and the relationships that exist between them. Representing a new feature with class diagrams therefore gives a clear picture of its structure and functionality.
UML class diagrams prove to be an extremely effective methodology while developing new features as it provides following information:
1.You get detailed information about all the class members, their visibility whether they are public private or protected and also details about their attributes and methods
2. Help in finding logical relationship between objects and classes which makes coding easier for the developer. Following relationships can be reflected by class diagrams:
a. Instance level relationships like external links, aggregation, association and composition
b. Class level relationships like generalization and realization
c. General relationships of dependency and multiplicity
3. During the technical analysis phase of new feature development these class diagrams can be used for creating conceptualized model of the expected system
Hence class diagrams prove useful for both software developers as they get a clear view of the system that needs to be developed and also to the business analyst that can use class diagrams to create system models from business view point.
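As a hedged illustration only, the following minimal Java sketch shows how the relationship types named in point 2 commonly map to code. The class names (Person, Customer, Address, Order, OrderLine, Invoice, Printable) are hypothetical and not taken from any particular feature.

import java.util.ArrayList;
import java.util.List;

interface Printable {                    // realization: Invoice realizes Printable
    void print();
}

class Person {                           // generalization: Customer extends Person
    protected String name;
}

class Address {
    String city;
}

class OrderLine {
    int quantity;
}

class Order {
    // Composition: an Order owns its OrderLines; they do not outlive the Order.
    private final List<OrderLine> lines = new ArrayList<>();

    void addLine(OrderLine line) {       // dependency on OrderLine via a parameter type
        lines.add(line);
    }
}

class Customer extends Person {          // generalization (inheritance)
    private Address shippingAddress;     // association/aggregation: Address exists independently
    private final List<Order> orders = new ArrayList<>();  // multiplicity: one Customer, many Orders
}

class Invoice implements Printable {     // realization of an interface
    public void print() {
        System.out.println("Printing invoice");
    }
}

A class diagram for these classes would show Customer associated with many Orders, Order composed of OrderLines, Customer generalizing Person, and Invoice realizing Printable.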
Sunday, March 21, 2010
Important Considerations for Designing Robust Software
The following approach should be taken to come up with a robust software system:
1. Selecting an appropriate software development lifecycle process should be the starting point for the group responsible for developing the software, as it helps them mitigate risk.
2. Requirement gathering should be the next important step. It gives the developers a clear idea of expected system usage, memory allocation and the software sizing needed for the number of users who would be logged into the system simultaneously. It also helps them select a proper database and design a robust backup and restore mechanism with minimum downtime. Requirement gathering should take inputs from stakeholders and actual users, which also helps in determining the exact hardware needed to build a robust system.
3. Selecting a proper software architecture and a correct software design is another important step that goes a long way in preventing single points of failure. Implementing appropriate network algorithms also helps save considerable time, effort and cost.
4. Single points of failure can be avoided by building a software system that:
a. Is less complex, making it easier to understand and to rectify problems in case of failure
b. Makes all critical components redundant, with a robust backup and recovery mechanism that transfers control to a properly functioning unit in case of any failure (a minimal failover sketch follows this list)
c. Uses diversification, which is similar to redundancy but implements the same functionality in two different ways, so that if one implementation fails the other is still available
d. Keeps the code transparent, with proper comments and informative user documentation that help in the speedy rectification of any problem
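As a rough sketch of point 4(b) only, the Java code below wraps a list of redundant units behind one interface and transfers control to the next unit when one fails. The PaymentService name and the simple try-in-order policy are assumptions for illustration, not a prescribed design.

import java.util.List;

interface PaymentService {                        // hypothetical critical component
    String process(String orderId) throws Exception;
}

class FailoverPaymentService implements PaymentService {
    private final List<PaymentService> replicas;  // primary first, then backups

    FailoverPaymentService(List<PaymentService> replicas) {
        this.replicas = replicas;
    }

    public String process(String orderId) throws Exception {
        Exception lastFailure = null;
        // Try each redundant unit in turn; control moves to the next one on failure.
        for (PaymentService replica : replicas) {
            try {
                return replica.process(orderId);
            } catch (Exception e) {
                lastFailure = e;                  // remember why this unit failed
            }
        }
        throw new Exception("All redundant units failed", lastFailure);
    }
}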
Friday, March 19, 2010
Scalability 101: A Primer on Scalability
Scalability refers to the ease with which an existing site can be modified or enhanced to accommodate changes in design, business requirements and user needs.
Why is Scalability Important
Business requirements change and evolve over time. User feedback gives rise to new ways in which business can be done. New products and services emerge while certain others become obsolete. Scalability enables the website to incorporate these change requests with minimal time and effort, thereby reducing cost.
A scalable website ensures minimal maintenance cost to keep the site current. A good web design should take into consideration the expansion of business and services in the years to come and should be able to accommodate those requests with minimal redesign.
Features of Scalability
1. Flexible and scalable site architecture
2. Can accommodate changes in requirements with minimal site redesign
3. Easy maintenance for normal everyday changes
4. The initial website design should not appear incomplete
Thursday, March 18, 2010
Google Trends
With the world experiencing the worst economic slowdown, the world wide web has become the hottest destination for promoting your products and services. Google Trends helps you find the most popular keywords searched by users, which can be used to make your websites search engine optimized.
Advantages of using Google Trends:
The search results are broken down by region, which helps in finding the searches carried out by people in a particular geography. This is beneficial as it helps you target customers in a particular location, and it also shows which resources people are looking for in a particular city.
The second important piece of information you get from Google Trends is the "Also visited" section, which tells you about the other sites visited by users and helps you gather information about your competitors.
Being backed by the world's top search engine, Google Trends also helps you find the exact keywords and queries people use when searching. These results help make your websites search engine optimized, earning higher search engine rankings, increasing your website traffic and resulting in more business.
Google Trends provides a user with more data than other services such as Alexa, which helps the user make a better analysis of the current trends in the world of online marketing.
Disadvantages of Google Trends:
1. It is also being used by many cyber criminals for spreading malware
2. Google Trends does not always contain up-to-date information, as reported by many bloggers, unlike similar products in the market such as Hot Trends
3. Most of the keyword trends shown by Google are seasonal; for example, searches on a Thanksgiving-related keyword show an anomaly around the holiday season every year
4. Last but not least, like all such tools, Google Trends gives us a pattern of past searches but cannot help us predict the most searched keyword of the future
Wednesday, March 17, 2010
Defect Severity Vs Defect Priority
Severity refers to the impact of the bug on the system, whereas priority refers to the urgency with which the defect needs to be fixed. Severity is one of several considerations in deciding the priority of a bug.
High Severity Low Priority Defect
A web or stand-alone application that crashes after n negative steps, where n > 7, is a good example of a high severity, low priority defect. The defect is high severity because it results in a system crash and loss of data. However, it is low priority because the probability of users performing such a long sequence of negative steps is quite low.
Low Severity High Priority Defect
A spelling error on the index page of a web application is a low severity but high priority defect. The defect is low severity since it is visual and does not impact the functionality of the system. On the other hand, a spelling error on the launch page is highly damaging to the image of the company and hence should be fixed with the utmost priority.
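To make the distinction concrete, here is a minimal Java sketch, using hypothetical names, that records severity and priority as independent fields on a defect and classifies the two examples above accordingly.

enum Severity { LOW, MEDIUM, HIGH }
enum Priority { LOW, MEDIUM, HIGH }

class Defect {
    final String title;
    final Severity severity;   // impact on the system
    final Priority priority;   // urgency of the fix

    Defect(String title, Severity severity, Priority priority) {
        this.title = title;
        this.severity = severity;
        this.priority = priority;
    }
}

class SeverityVsPriorityDemo {
    public static void main(String[] args) {
        // Crash after more than seven negative steps: big impact, unlikely path.
        Defect crash = new Defect("Crash after 8 negative steps", Severity.HIGH, Priority.LOW);

        // Spelling error on the index page: no functional impact, but very visible.
        Defect typo = new Defect("Spelling error on index page", Severity.LOW, Priority.HIGH);

        System.out.println(crash.title + " -> " + crash.severity + " / " + crash.priority);
        System.out.println(typo.title + " -> " + typo.severity + " / " + typo.priority);
    }
}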
Monday, March 15, 2010
Components of a Defect Report
Excelon prides itself on producing some of the best documented bug reports in the industry. Our talented team of quality testers has experience working with a wide array of reporting tools, including Quality Center, PVCS, PeopleSoft and Bugzilla.
A good bug report consists of the following elements:
1. Bug Title: Concise, but clear enough to briefly describe the encountered problem
2. Steps to Reproduce: Provide clear and concise steps to reproduce the problem. Augment with as much information as possible to help the development team reproduce the problem
3. Actual and Expected Result: What was the actual and expected outcome of the test case
4. Severity: This describes the degree to which the product or service is affected by the encountered scenario
5. Priority: This describes how quickly the reported problem needs to be fixed
6. Environment: This is critical if the product is being tested on multiple servers, databases, browsers or operating systems. Clearly specify the environment being used for testing in this field
7. Files \ Attachments \ Screenshots: Augment the bug report by attaching all relevant files used in testing or obtained as output, server or console logs, and screenshots of any encountered error
Several more fields can be added to bug reports depending on the requirements of the client, including the sub-component of the project, the version, and whether the defect was reported externally or internally. Excelon will work with you on an individual basis to customize the reporting solution best suited to your needs.
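For illustration only, the Java sketch below captures the seven elements above as a simple data structure (a Java 16+ record); the field names are hypothetical, and any real tracking tool (Quality Center, Bugzilla and so on) defines its own schema.

import java.util.List;

enum ReportSeverity { LOW, MEDIUM, HIGH, CRITICAL }
enum ReportPriority { LOW, MEDIUM, HIGH, URGENT }

record BugReport(
        String title,                  // 1. concise description of the problem
        List<String> stepsToReproduce, // 2. clear, numbered reproduction steps
        String actualResult,           // 3. what actually happened
        String expectedResult,         // 3. what should have happened
        ReportSeverity severity,       // 4. degree of impact on the product
        ReportPriority priority,       // 5. how quickly it must be fixed
        String environment,            // 6. server, database, browser, operating system
        List<String> attachments       // 7. files, logs and screenshots
) { }

Extra fields such as sub-component or version can be added to the record in the same way before the report is exported to whichever tracking tool the client uses.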
Saturday, March 13, 2010
Usability 101: A Primer on Usability
Usability is the key focus of our web development projects. It refers to the ease with which a user can browse a web page to find the content and services it offers.
Why is Usability Important
A website that is not usable will not attract people for a prolonged period of time. If the website is unappealing and difficult to navigate, people leave. If the company website fails to explain its purpose and demonstrate its projects, people leave. If the website takes too long to load, people leave. If the website behaves differently across browsers and platforms, people leave.
For a company website, it is essential that all the features that set the company apart, and the services it offers, be easily reachable from the home page. For a product company, usability is important because the longer a customer stays on the site, the greater the chance of a purchase. A usable website encourages return visits and adds to the profit margin of the company.
Features of Usability
The following are the key features that define usability:
1. The site should be easy to use for a novice user
2. All relevant information should be available on the home page with all pages being interlinked to allow for quick and easy navigation
3. The site should not have ambiguous information
4. Short loading time.
Thursday, March 11, 2010
Defect Tracking and Bug Reporting
Our bug reports are detailed, with a stepwise procedure to recreate the issue. They are assigned priority and severity according to a set protocol mutually agreed with the development team. The bug reports are augmented with screenshots, server logs and database validations whenever required.
Defect Life Cycle
The life cycle of a defect (or bug, as it is normally called) starts with its creation. The following statuses are a generalization; they might differ slightly from organization to organization, but the crux remains the same.
1. New \ Open
The Quality Engineer encounters a deviation from the requirements and opens a defect. The defect is then in the Open status.
2. Assigned
The project manager\dev lead\QA lead assigns the defect to a developer for fixing, and the defect moves to the Assigned status. In this status it is being worked on by the developer. Once the developer has fixed the bug and verified the fix in the development environment, he marks the bug as Fixed\Ready for QA and assigns it back to the Quality Engineer.
3. Fixed \ Ready For QA
In this state, the bug is owned by the Quality Analyst, who verifies that the fix made by the developer has indeed resolved the issue. The QA also performs a regression of any other feature that he\she thinks may have been impacted by the change.
4. QA Accepted \ Closed
If the Quality Analyst is satisfied with the fix, he closes the bug and marks it as Fixed\QA Accepted. If the bug is not fixed, it is rejected and goes back to the developer in the Assigned status.
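Assuming only the simplified four-status flow described above, the life cycle can be sketched in Java as an enum plus a transition check; real tracking tools usually add more states and rules.

import java.util.EnumSet;
import java.util.Map;
import java.util.Set;

enum DefectStatus { NEW_OPEN, ASSIGNED, FIXED_READY_FOR_QA, QA_ACCEPTED_CLOSED }

class DefectLifecycle {
    // Allowed transitions; note that a rejected fix goes back to ASSIGNED.
    private static final Map<DefectStatus, Set<DefectStatus>> TRANSITIONS = Map.of(
            DefectStatus.NEW_OPEN, EnumSet.of(DefectStatus.ASSIGNED),
            DefectStatus.ASSIGNED, EnumSet.of(DefectStatus.FIXED_READY_FOR_QA),
            DefectStatus.FIXED_READY_FOR_QA,
                    EnumSet.of(DefectStatus.QA_ACCEPTED_CLOSED, DefectStatus.ASSIGNED),
            DefectStatus.QA_ACCEPTED_CLOSED, EnumSet.noneOf(DefectStatus.class));

    static boolean canMove(DefectStatus from, DefectStatus to) {
        return TRANSITIONS.getOrDefault(from, Set.of()).contains(to);
    }

    public static void main(String[] args) {
        System.out.println(canMove(DefectStatus.NEW_OPEN, DefectStatus.ASSIGNED));           // true
        System.out.println(canMove(DefectStatus.FIXED_READY_FOR_QA, DefectStatus.ASSIGNED)); // true (fix rejected)
        System.out.println(canMove(DefectStatus.QA_ACCEPTED_CLOSED, DefectStatus.ASSIGNED)); // false
    }
}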
Our Software Testing Philosophy
Software testing is an integral part of the project life cycle. Excelon will work with you from the requirement gathering phase to ensure that quality is built into the product as it is developed, rather than added at the end.
Test Case Creation
We will analyze requirements and create use cases to develop test execution scenarios. We work with the development team closely to ensure that all testing scenarios are well covered. Our test cases are easy to understand, are concise and clearly written and are part of the final deliverable.
Test Case Execution
Test cases are executed on different environments depending on the needs and requirements of the client. Our team is capable of installing and executing test cases on a variety of platforms and browsers, and is well versed in backend testing and performing database validations using SQL. A test execution report with the number of pass\fail scenarios is created and forms part of the final deliverable.
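As a hedged sketch of what such a database validation might look like, the Java snippet below runs a SQL query through JDBC and compares the result with the expected value; the connection URL, credentials, table and column names are all assumptions for illustration.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class OrderPersistenceCheck {
    public static void main(String[] args) throws SQLException {
        String url = "jdbc:postgresql://test-db:5432/shop";    // assumed test environment
        String expectedStatus = "CONFIRMED";

        try (Connection conn = DriverManager.getConnection(url, "qa_user", "qa_password");
             PreparedStatement stmt = conn.prepareStatement(
                     "SELECT status FROM orders WHERE order_id = ?")) {
            stmt.setString(1, "ORD-1001");                      // id created by the earlier UI test step
            try (ResultSet rs = stmt.executeQuery()) {
                if (rs.next() && expectedStatus.equals(rs.getString("status"))) {
                    System.out.println("PASS: order persisted with expected status");
                } else {
                    System.out.println("FAIL: order missing or status mismatch");
                }
            }
        }
    }
}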
Defect Tracking and Bug Reporting
Our bug reports are detailed with stepwise procedure to recreate the issue. They are assigned priority and severity according to a set protocol mutually agreed on by the development team. The bug reports are augmented with screenshots, server logs and database validations whenever required.
Wednesday, March 10, 2010
Our Web Development Philosophy
Your web page is your name on the web and a critical element of the branding process. We use the latest technologies to develop an original and scalable solution, creating a usable and glowing presence for your company on the web.
Technology
Excelon continuously invests in the latest web development technologies to create solutions that are compatible with W3C (World Wide Web Consortium) standards.
Browser Compatibility
Our websites are tested for compatibility with all major browsers on Windows, Macintosh and Linux platforms.
Usability
Usability is the key component and focus of our development efforts. We ensure that our sites are quick to load and easy to navigate creating a unique experience for the end user.
Originality
Our dedicated team of engineers will work with you on a one to one basis to create a unique solution customized and tailored to suit your individual needs.
Scalability
The only thing constant is change. We ensure that the site we create is easy to customize to meet your ever changing and growing needs.
About Us
Excelon Consulting provides website development, e-commerce, software testing, user documentation and content writing services. We provide design, development and testing services using the latest technologies to help our customers create their name and brand on the web. The service is run by a group of talented individuals with experience working in Fortune 500 companies in India and the US.
This blog describes our company, the development and testing methodologies we adhere to, the latest in technology and much more. If you would like to hear more on a particular topic, drop us an email or request a quote for our products and services on our website.