How to write the report of the testing

{{Template:OWASP Testing Guide v4}}
Performing the technical side of the assessment is only half of the overall assessment process; the final product is a well-written, informative report.

A report should be easy to understand, should highlight all the risks found during the assessment phase, and should appeal to both executive management and technical staff.
  
The report needs to have three major sections and be created in a manner that allows each section to be split off and printed and given to the appropriate teams, such as the developers or system managers.
  
The sections generally recommended are:

'''1.0 Executive Summary'''
  
The executive summary sums up the overall findings of the assessment and gives managers or system owners an idea of the overall risk they face.
  
The language used should be suited to readers who are not technically aware, and the section should include graphs or other charts that show the risk level. It is also recommended to include a summary of when the testing commenced and when it was completed.
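Chart data for such graphs can be produced by simply tallying findings per risk level. A minimal sketch in Python (the risk labels and sample ratings are illustrative assumptions, not part of any prescribed format):

```python
from collections import Counter

# Tally findings by risk level to feed an executive-summary chart.
# The labels and the sample ratings below are illustrative only.
ratings = ["High", "Medium", "High", "Low", "Medium", "High"]
counts = Counter(ratings)

for level in ("High", "Medium", "Low"):
    # A crude text bar chart; a real report would use a charting tool.
    print(f"{level:<6} {'#' * counts[level]}")
```

The same tally feeds equally well into a pie or bar chart in whatever reporting tool is used.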
  
Another section, which is often overlooked, is a paragraph on implications and actions. This allows the system owners to understand what needs to be done to ensure the system remains secure.
  
1.1 Project Objective:

In this section you will need to outline the project objectives and what is expected as an outcome of the assessment.

1.2 Project Scope:
In this section you will need to outline the agreed scope and, in some cases, any limitations of that scope.
  
1.3 Targets:

In this section you will need to list the number of applications and/or targeted systems.
  
  
'''2.0 Technical Management Overview'''

The technical management overview often appeals to technical managers who require more detail than is found in the executive summary. This section should include details about the scope of the assessment, the targets included, and any caveats, such as system availability.

This section also needs to include an introduction to the risk rating used throughout the report and, finally, a technical summary of the findings.
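As an illustration, an overall risk rating is commonly derived from separate likelihood and impact scores, in the spirit of the OWASP Risk Rating Methodology. The sketch below is only illustrative: the 0-9 scale, the thresholds, and the severity matrix are one common convention, not a mandated standard.

```python
# Illustrative sketch: derive an overall risk rating from likelihood and
# impact scores on a 0-9 scale. Thresholds and matrix are assumptions
# chosen for illustration, not a prescribed OWASP standard.

def level(score: float) -> str:
    """Map a 0-9 score to a qualitative level."""
    if score < 3:
        return "LOW"
    if score < 6:
        return "MEDIUM"
    return "HIGH"

# Overall severity matrix: (likelihood level, impact level) -> severity.
SEVERITY = {
    ("LOW", "LOW"): "Note",     ("LOW", "MEDIUM"): "Low",     ("LOW", "HIGH"): "Medium",
    ("MEDIUM", "LOW"): "Low",   ("MEDIUM", "MEDIUM"): "Medium", ("MEDIUM", "HIGH"): "High",
    ("HIGH", "LOW"): "Medium",  ("HIGH", "MEDIUM"): "High",   ("HIGH", "HIGH"): "Critical",
}

def overall_risk(likelihood: float, impact: float) -> str:
    """Combine likelihood and impact into an overall severity label."""
    return SEVERITY[(level(likelihood), level(impact))]

print(overall_risk(7.2, 5.0))  # a likely, moderately damaging issue -> High
```

Whatever scheme is chosen, the report should state it explicitly so that readers can interpret each finding's rating consistently.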
 
'''3.0 Assessment Findings'''

The last section of the report includes detailed technical information about the vulnerabilities found and the approaches needed to resolve them. This section is aimed at a technical audience and should include all the information necessary for the technical teams to understand each issue and be able to solve it.
 
The findings section should include:

* A reference number for easy reference, together with screenshots
* The affected item
* A technical description of the issue
* A section on resolving the issue
* The risk rating and impact value
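To keep findings consistent from report to report, each one can be captured as a structured record before it is written up. A minimal sketch in Python (the field names mirror the list above and are illustrative, not a prescribed schema):

```python
from dataclasses import dataclass, field

@dataclass
class Finding:
    """One assessment finding; field names are illustrative."""
    ref: str                # reference number, e.g. "WEB-001"
    affected_item: str      # URL, host, or component affected
    description: str        # technical description of the issue
    resolution: str         # how to resolve the issue
    risk_rating: str        # e.g. "High"
    screenshots: list = field(default_factory=list)  # paths to evidence

f = Finding(
    ref="WEB-001",
    affected_item="https://app.example.com/login",
    description="SQL injection in the username parameter.",
    resolution="Use parameterized queries for all database access.",
    risk_rating="High",
)
print(f.ref, f.risk_rating)
```

Keeping the raw findings in a structured form also makes it easy to generate the summary tables and charts in the earlier sections from the same data.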
 
Each finding should be clear and concise and give the reader of the report a full understanding of the issue at hand.

Here is an example of the report table (see https://www.owasp.org/index.php/Testing_Checklist for the complete list of tests):
 
<center>[[Image:tablerep.PNG]]</center>
<center>[[Image:tablerep2.PNG]]</center>
<center>[[Image:tablerep3.PNG]]</center>
 
'''Appendix'''

This section is often used to describe the commercial and open-source tools that were used in conducting the assessment. When custom scripts or code are utilized during the assessment, they should be disclosed in this section or noted as an attachment.

It is often appreciated by the customer when the methodology used by the consultants is included. It gives them an idea of the thoroughness of the assessment and of the areas that were covered.

Revision as of 12:59, 21 February 2013

This article is part of the new OWASP Testing Guide v4. 
At the moment the project is in the REVIEW phase.

Back to the OWASP Testing Guide v4 ToC: https://www.owasp.org/index.php/OWASP_Testing_Guide_v4_Table_of_Contents

Back to the OWASP Testing Project: http://www.owasp.org/index.php/OWASP_Testing_Project

