Top 10 2013/ProjectMethodology

=About=
The purpose of this page is to provide greater clarity on the methodology of the OWASP Top 10 project. It covers the data and individuals involved in the Top 10, the current processes, suggestions to improve involvement and participation, and an FAQ addressing common questions & concerns.

This is a wiki and editable by anyone with an OWASP account. Please contribute constructively to the conversation. Additional discussion should also take place on the OWASP Top 10 mailing list.

=Current Methodology=
# Data sources accepted from a variety of companies (see sources)
# Data & professional opinion used to create initial Top 10 rankings and items
#* List involved individuals here
# Public comment period of RC1 from February through end of March
# All comments evaluated and Top 10 updated appropriately by:
#* List involved individuals here
# All comments and responses posted publicly
# RC2 issued?
# Final version published

=Current Data Sources=
 * Aspect Security
 * HP (Results for both Fortify and WebInspect)
 * Minded Security
 * Softtek
 * Trustwave
 * Veracode – Statistics
 * WhiteHat Security Inc. – Statistics

=Suggested Enhancements=
 * Use a public wiki or Google Issues to capture feedback – mailing list threads are hard to follow and items get lost
 * Establish a Top 10 panel to evaluate and make final decisions on inclusion & ranking
 ** It is not feasible for everyone to vote on every item
 ** A diverse panel should represent various verticals (vendor, enterprise, offense/defense, etc.)
 * Additional data sources could be considered (please add links):
 ** WASC Web Hacking Incident Database
 ** Akamai State of the Internet reports
 ** FireHost web application attack reports
 ** Imperva web application attack reports
 * Additional reports could be considered:
 ** Annual Symantec Internet Security Threat Reports
 ** DataLossDB
 ** IBM X-Force threat reports

=FAQ=