Category:Tools Rating Criteria

From OWASP

Latest revision as of 16:22, 29 September 2009


Overall Approach

The goal of this project is to provide the following information (at a minimum) in each tool category:

  • Description of this tool category
  • General strengths and weaknesses of tools in this category
  • Important selection criteria for comparing tools within the category (e.g. ease of use, performance, cost, likelihood of false positives)
  • List of all OWASP tools in this category
  • List of other well-known open source tools in this category
  • List of commercial tools in this category provided by OWASP Members
  • List of other well-known commercial tools in this category


We are still working on defining the rating criteria and the different possible ways to rate these tools:

  • We may rate the tools ourselves.
  • We may seek other users' feedback based on their experiences.
  • We may also seek other independent reviews.
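Whichever of the approaches above is used, the per-criterion ratings would need to be combined into an overall score. The sketch below shows one simple way to do that, as a weighted average; the criteria names come from the selection criteria listed earlier, but the weights and the 0–5 scale are illustrative assumptions, not an official OWASP rubric.

```python
# Hypothetical illustration of combining per-criterion tool ratings into one
# overall score. Weights and the 0-5 rating scale are assumptions, not an
# official OWASP rating scheme.

def weighted_score(ratings, weights):
    """Return the weighted average of per-criterion ratings."""
    total_weight = sum(weights.values())
    return sum(ratings[c] * w for c, w in weights.items()) / total_weight

# Criteria taken from the selection criteria named in this article.
weights = {"ease_of_use": 2, "performance": 1, "cost": 1, "false_positives": 2}
ratings = {"ease_of_use": 4, "performance": 3, "cost": 5, "false_positives": 2}

print(round(weighted_score(ratings, weights), 2))  # (8 + 3 + 5 + 4) / 6 = 3.33
```

Independent reviews or user feedback could feed the same formula, with each reviewer's ratings averaged per criterion before weighting.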


Resources

You can read Larry's first paper, Version 1.0 (http://ha.ckers.org/blog/20071014/web-application-scanning-depth-statistics/). A version 2.0 project is now underway with revised criteria, produced with the assistance of OWASP and WASC (http://www.webappsec.org) members and of the ORB Project (http://www.reversebenchmarking.org).

Product testing will be performed based on defined criteria that measure usability, performance, and positive validation against "red herring" web applications built with OWASP SiteGenerator; this model covers both common systems and unique systems found in the wild.

SAMATE
The NIST SAMATE (Software Assurance Metrics And Tool Evaluation) project has the ambitious goal of creating a large set of application security test data and then assessing the capabilities of a wide variety of application security tools.
