Category:OWASP Security Spending Benchmarks
Security Spending Benchmarks Project Report March 2009
The first report of the OWASP Security Spending Benchmarks Project is now available. It can be found at the following link:
There are a number of key findings in the study:
- Organizations that have suffered a public data breach spend more on security in the development process than those that have not.
- Web application security spending is expected to either stay flat or increase in nearly two thirds of companies.
- Half of respondents consider security experience important when hiring developers, and a majority provide their developers with security training. 38% have a third party firm conduct a security review of outsourced code.
- At least 61% of respondents perform an independent third party security review before deploying a Web application while 17% do not (the remainder do not know or do so when requested by customers).
- Just under half of the surveyed organizations have Web application firewalls deployed for at least some of their Web applications.
Transparency is a key principle of the OWASP SSB Project. For this reason all raw survey results are made available to the community. We welcome additional commentary and interpretations on the survey data. The raw survey data can be found here.
Please contact the project leader Boaz Gelbord (bgelbord at wgen dot net) if you have questions about the project or you would like to inquire about contributing to the project.
About the Security Spending Benchmarks Project
The Security Spending Benchmarks Project seeks to produce guidance and an industry accepted benchmark for justifying overall Web application security spending. We want to quantify how many dollars and human resources should be allocated towards the software development life-cycle, security training, security software/tools, independent third-party reviews, Web application firewalls, etc. This project is motivated by the fact that:
- There are few, if any, industry-standard benchmarks for executive management to consult when deciding what constitutes a reasonable amount of resources to spend on Web application security, whether inside or outside the software development process.
- Spending on security helps mitigate risks whose potential costs are often difficult to quantify, which makes security budgets difficult to justify and obtain.
- Many business initiatives require organizations to take “reasonable measures” and “adhere to best practices” for developing, delivering, and/or hosting secure Web applications, but there is no industry consensus or data repository on how this translates into monetary terms.
- Smaller organizations outside of highly regulated industries purchase and deploy Web applications with no realistic ability to evaluate their security program.
- Producing a less secure Web application may be less expensive than producing a more secure version of the same software. Organizations that have invested development resources into software security may not be able to charge a premium for this investment because there is no reference point for it.
The survey was formulated with the help of our project partners to address the following questions and many others:
- What percentage of a Web application development group's headcount is dedicated to security?
- How much budget is allocated towards Web application security as a percentage of software development and overall operational IT security costs?
- Where do Web application security budgets come from?
- How much budget is allocated towards security education?
Data Collection & Distribution
We use the SurveyMonkey system to host surveys conducted for the OWASP SSB Project. We do not collect any personally identifiable information from respondents, including names, addresses, employers, or email addresses. While we expect that a limited number of respondents may try to intentionally skew the results, we take precautions to limit this risk without creating unnecessary overhead. We control survey access via username/password, as well as through a trusted network of contacts. All information collected is made available through SurveyMonkey.
Planned Q2 Timeline:
1. April 1-15: Discuss thematic priorities with partners. Expand partner network.
2. April 15-30: Formulate survey questions based on identified thematic priorities
3. May 1st-May 15th: Collect survey responses through partner network.
4. May 15th-May 31st: Analyze results and produce draft report.
5. June 1st - June 15th: Get partner feedback on draft and make edits.
6. June 15th: Publish final report
Q1 Timeline:
1. Completing the project description text and finalizing the proposed survey questions. (DONE)
2. January 12th - Open up survey to respondents (DONE)
3. February 6 (extended from Jan 26) - Close survey (DONE)
4. February 6th - Survey Analysis Begins (DONE)
5. February 6th-20th - Boaz Gelbord and Jeremiah Grossman to edit draft report (DONE)
6. February 20th- Decision point whether to include late submissions (DONE)
7. February 20th - Circulate draft report to partners with raw data, request to keep data confidential prior to publication. (DONE)
8. March 19th (was March 15th) - Publish report after integrating partner feedback. Generate community interest and discussion around results. (CURRENT)
9. After March 19th - Coordinate formal acceptance of deliverable by OWASP and plan further steps for the project.
The Security Spending Benchmarks Project Leader is Boaz Gelbord (Executive Director of Information Security, Wireless Generation). Boaz can be reached directly at bgelbord AT wgen.net with any questions or feedback. Jeremiah Grossman (Founder & CTO, WhiteHat Security) is also closely assisting in the effort.