Writing the report is the final phase, but we like to think backwards (we are reverse engineers, after all). What I mean is that the report must follow a standard, repeatable methodology. Is this a baseline report or the twelfth report on the application this year? Either way, the report needs to be consistent in its delivery to customers, management, and other stakeholders.
Every application security assessment report should include the following sections as deliverables:
I. Executive Summary – If this report covers many systems, it should open with a summary paragraph that outlines the total number of devices tested, the dates of the testing, and a summary of the results.
We all know that management likes visual graphics to illustrate problems, so why not provide some that summarize the results across all of the systems tested?
II. Systems Summary – If more than one application or system is tested, we recommend that the systems summary define each system, its boundaries, and its purpose. Each “system” may contain many sub-systems, so it is important to provide both an executive-level overview of the systems and the detail within the systems summary itself.
III. System Detail – This section provides the details of each system tested, down to the individual hosts or applications, typically captured in a findings table like the one below.
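The three-part structure above can be sketched as a simple report skeleton. This is only an illustration; the class and field names (`Finding`, `SystemDetail`, `risk_counts`, etc.) are assumptions for this sketch, not part of any standard report format:

```python
from dataclasses import dataclass, field

@dataclass
class Finding:
    # Fields mirror the findings-table columns; names are illustrative.
    category: str
    name: str
    affected_item: str = ""
    risk_value: str = ""
    comment: str = ""

@dataclass
class SystemDetail:
    # One entry per system/sub-system in the Systems Summary.
    name: str
    purpose: str
    findings: list = field(default_factory=list)

@dataclass
class Report:
    executive_summary: str
    systems: list = field(default_factory=list)

    def risk_counts(self):
        """Tally findings by risk value -- the raw numbers behind the
        visual graphics management likes to see."""
        counts = {}
        for system in self.systems:
            for f in system.findings:
                counts[f.risk_value] = counts.get(f.risk_value, 0) + 1
        return counts

# Usage: one system, one finding (sample data only).
report = Report(executive_summary="Three systems tested in Q2.")
portal = SystemDetail("Customer Portal", "Public-facing web application")
portal.findings.append(Finding("Data Validation Testing", "Cross site scripting",
                               affected_item="/search", risk_value="High"))
report.systems.append(portal)
print(report.risk_counts())  # {'High': 1}
```

Keeping the report as structured data like this makes the deliverable repeatable: the same skeleton produces the baseline report and every follow-up report in the same format.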
| Category | Ref Number | Name | Finding | Affected Item | Comment/Solution | Risk Value |
|---|---|---|---|---|---|---|
| Information Gathering | | Application Discovery | | | | |
| | | Spidering and googling | | | | |
| | | Analysis of error code | | | | |
| | | DB Listener Testing | | | | |
| | | File extensions handling | | | | |
| | | Old, backup and unreferenced files | | | | |
| Business Logic Testing | | | | | | |
| Authentication Testing | | Default or guessable account | | | | |
| | | Bypassing authentication schema | | | | |
| | | Directory traversal/file include | | | | |
| | | Vulnerable remember password and pwd reset | | | | |
| | | Logout and Browser Cache Management Testing | | | | |
| Session Management Testing | | Session Management Schema | | | | |
| | | Session Token Manipulation | | | | |
| | | Exposed Session Variables | | | | |
| Data Validation Testing | | Cross site scripting | | | | |
| | | HTTP Methods and XST | | | | |
| | | Stored procedure injection | | | | |
| Denial of Service Testing | | Locking Customer Accounts | | | | |
| | | User Specified Object Allocation | | | | |
| | | User Input as a Loop Counter | | | | |
| | | Writing User Provided Data to Disk | | | | |
| | | Failure to Release Resources | | | | |
| | | Storing too Much Data in Session | | | | |
| Web Services Testing | | XML Structural Testing | | | | |
| | | XML content-level Testing | | | | |
| | | HTTP GET parameters/REST Testing | | | | |
| | | Naughty SOAP attachments | | | | |
| AJAX Testing | | AJAX Vulnerabilities | | | | |
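A findings table like the one above is easy to generate from structured data, which keeps every report's layout identical. A minimal sketch, assuming findings are tracked as dictionaries keyed by the column names (the `render_table` helper and the sample row are illustrative, not from any standard tool):

```python
# Column names follow the findings-table template above.
COLUMNS = ["Category", "Ref Number", "Name", "Finding",
           "Affected Item", "Comment/Solution", "Risk Value"]

def render_table(rows):
    """rows: list of dicts keyed by column name; missing keys render blank.
    Returns a Markdown pipe table as a single string."""
    lines = ["| " + " | ".join(COLUMNS) + " |",
             "|" + "|".join("---" for _ in COLUMNS) + "|"]
    for row in rows:
        lines.append("| " + " | ".join(str(row.get(c, "")) for c in COLUMNS) + " |")
    return "\n".join(lines)

# Sample row (illustrative data only).
sample = [{"Category": "Data Validation Testing",
           "Name": "Cross site scripting",
           "Risk Value": "High"}]
print(render_table(sample))
```

Generating the table this way means the twelfth report of the year lines up column-for-column with the baseline report.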
OWASP Testing Guide v2
Here is the OWASP Testing Guide v2 table of contents: