The Strengths of Combining Code Review with Application Penetration Testing

[[Image:468x60-banner-2010.gif|link=http://www.owasp.org/index.php?title=OWASP_AppSec_DC_2010]]
[https://guest.cvent.com/EVENTS/Register/IdentityConfirmation.aspx?e=d52c6f5f-d568-4e16-b8e0-b5e2bf87ab3a Registration] | [https://resweb.passkey.com/Resweb.do?mode=welcome_gi_new&groupID=2766908 Hotel] | [http://www.dcconvention.com/ Walter E. Washington Convention Center]
<br>
 
== The presentation  ==
 





[[Image:Owasp logo normal.jpg|right]]
* The strengths of manual code review in finding vulns (using the Top 10 as the categories)
* The strengths of manual pen testing in finding vulns (against the Top 10)
* How each technique can leverage the other
* How proving vulns can be important, but less so in a mature organization
* The massive benefit of finding where the vulns are in the code, not just finding the flaws in the running application (see the sketch after this list)
* How tracking down a penetration testing finding to where the flaw is in the actual code can be extremely hard
* Potentially some discussion of the role of automated analysis tools (both code and external scanning) and their strengths
* How automated analysis tools can support a more efficient application security assessment process when combined with manual analysis
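
To make the point about locating flaws in the code concrete, here is a minimal, hypothetical Java sketch (the UserLookup class, its method names, and the users table are illustrative, not taken from the talk). A penetration test might only observe that a request misbehaves when a quote character is submitted; a code review points at the exact string concatenation that causes the injection, and at the parameterized fix.

<pre>
// Hypothetical example: the kind of flaw a code review pinpoints directly,
// while a pen test may only observe anomalous responses from the outside.
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class UserLookup {

    // Injectable: user input is concatenated straight into the SQL text,
    // so input like "' OR '1'='1" changes the query (OWASP Top 10: Injection).
    public ResultSet findUserUnsafe(Connection conn, String username) throws SQLException {
        Statement stmt = conn.createStatement();
        return stmt.executeQuery("SELECT * FROM users WHERE name = '" + username + "'");
    }

    // Remediated: a parameterized query keeps the input as data, not SQL.
    public ResultSet findUserSafe(Connection conn, String username) throws SQLException {
        PreparedStatement ps = conn.prepareStatement("SELECT * FROM users WHERE name = ?");
        ps.setString(1, username);
        return ps.executeQuery();
    }
}
</pre>

In the review, the unsafe and safe versions differ by only a couple of lines, which is exactly the precision a pen-test report alone rarely provides.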

== The speaker ==

Dave Wichers. Speaker bio will be posted shortly.