How to write the report of the testing

From OWASP
 
{{Template:OWASP Testing Guide v2}}
 
THIS SECTION IN PROGRESS AS OF 11/29/2006 - Tom Brennan  
 
Writing the report is the final phase, but we like to think backwards, as reverse engineers do. What this means is that the report must follow a standard, repeatable methodology. Is this a baseline report, or the 12th report for a single application, or for all applications in the current year? Either way, the report must be consistent in its delivery to customers, management, and other stakeholders.
Every application security assessment report should include the following sections as deliverables:
'''I. Executive Summary''' - If this report covers many systems, include a summary paragraph that states the total number of devices tested, the dates of the testing, and a summary of the results. We all know that management likes visual graphics to illustrate problems, so provide charts or graphs covering all of the systems tested.
 
'''II. Systems Summary''' – If more than one application or system is tested, we recommend that the systems summary define each system, its boundaries, and its purpose. Each “System” may contain many sub-systems, so it is important to summarize both at the executive level and within the systems summary itself. Appendix A, below, is ideal for illustrating the status of each system.
 
'''III. System Detail''' – This section gives the details of each system tested, down to the individual hosts or applications. Appendix A again serves as the summary at the individual-system level, followed by detailed information and recommended corrective actions for the issues found.
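The per-system roll-up that feeds the Executive Summary can be sketched in a few lines. This is a minimal illustration only; the findings list, host names, and risk labels below are hypothetical examples, not part of any OWASP tool or template:

```python
from collections import Counter

# Hypothetical findings from an assessment: (system, category, risk level).
findings = [
    ("app1.example.com", "Data Validation Testing", "High"),
    ("app1.example.com", "Session Management Testing", "Medium"),
    ("app2.example.com", "Information Gathering", "Low"),
    ("app2.example.com", "Data Validation Testing", "High"),
]

def summarize(findings):
    """Count findings per system and per risk level for the Executive Summary."""
    per_system = Counter(system for system, _, _ in findings)
    per_risk = Counter(risk for _, _, risk in findings)
    return per_system, per_risk

per_system, per_risk = summarize(findings)
print("Systems tested:", len(per_system))
for risk, count in per_risk.most_common():
    print(f"{risk}: {count}")
```

Counts like these are exactly what a management-facing bar or pie chart would be built from.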
  
 
'''APPENDIX A'''
 


{| class="wikitable"
! Category !! Ref Number !! Name !! Finding !! Affected Item !! Comment/Solution !! Risk Value
|-
| Information Gathering || || Application Discovery || || || ||
|-
| || || Spidering and googling || || || ||
|-
| || || Analysis of error codes || || || ||
|-
| || || SSL/TLS Testing || || || ||
|-
| || || DB Listener Testing || || || ||
|-
| || || File extensions handling || || || ||
|-
| || || Old, backup and unreferenced files || || || ||
|-
| Business logic testing || || || || || ||
|-
| Authentication Testing || || Default or guessable account || || || ||
|-
| || || Brute Force || || || ||
|-
| || || Bypassing authentication schema || || || ||
|-
| || || Directory traversal/file include || || || ||
|-
| || || Vulnerable remember password and pwd reset || || || ||
|-
| || || Logout and Browser Cache Management Testing || || || ||
|-
| Session Management Testing || || Session Management Schema || || || ||
|-
| || || Session Token Manipulation || || || ||
|-
| || || Exposed Session Variables || || || ||
|-
| || || Session Riding || || || ||
|-
| || || HTTP Exploit || || || ||
|-
| Data Validation Testing || || Cross site scripting || || || ||
|-
| || || HTTP Methods and XST || || || ||
|-
| || || SQL Injection || || || ||
|-
| || || Stored procedure injection || || || ||
|-
| || || ORM Injection || || || ||
|-
| || || LDAP Injection || || || ||
|-
| || || XML Injection || || || ||
|-
| || || SSI Injection || || || ||
|-
| || || XPath Injection || || || ||
|-
| || || IMAP/SMTP Injection || || || ||
|-
| || || Code Injection || || || ||
|-
| || || OS Commanding || || || ||
|-
| || || Buffer overflow || || || ||
|-
| || || Incubated vulnerability || || || ||
|-
| Denial of Service Testing || || Locking Customer Accounts || || || ||
|-
| || || User Specified Object Allocation || || || ||
|-
| || || User Input as a Loop Counter || || || ||
|-
| || || Writing User Provided Data to Disk || || || ||
|-
| || || Failure to Release Resources || || || ||
|-
| || || Storing too Much Data in Session || || || ||
|-
| Web Services Testing || || XML Structural Testing || || || ||
|-
| || || XML content-level Testing || || || ||
|-
| || || HTTP GET parameters/REST Testing || || || ||
|-
| || || Naughty SOAP attachments || || || ||
|-
| || || Replay Testing || || || ||
|-
| AJAX Testing || || AJAX Vulnerabilities || || || ||
|}
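Teams that track Appendix A outside the wiki can emit the checklist as an empty worksheet for testers to fill in during the engagement. The excerpt of tests below is taken from the table above; the function name and the CSV format choice are illustrative, not prescribed by the guide:

```python
import csv
import io

# Columns of the Appendix A checklist, as defined in the report template above.
COLUMNS = ["Category", "Ref Number", "Name", "Finding",
           "Affected Item", "Comment/Solution", "Risk Value"]

# A small excerpt of the checklist; the full table lists every test in the guide.
CHECKLIST = [
    ("Information Gathering", "Application Discovery"),
    ("Information Gathering", "SSL/TLS Testing"),
    ("Data Validation Testing", "SQL Injection"),
]

def checklist_csv(rows):
    """Write a header row plus one blank-finding row per test."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(COLUMNS)
    for category, name in rows:
        # Finding, Affected Item, Comment/Solution, and Risk Value start empty.
        writer.writerow([category, "", name, "", "", "", ""])
    return buf.getvalue()

print(checklist_csv(CHECKLIST))
```

Each row then gets its Finding, Affected Item, Comment/Solution, and Risk Value filled in as testing proceeds, keeping the deliverable consistent from report to report.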



