Difference between revisions of "ESAPI Assurance"

From OWASP

Revision as of 12:50, 11 December 2008

Building an Assurance Case for ESAPI

  • summary: make Claims, provide supporting Evidence, and make Arguments for how the evidence supports the claims
  • highest-level claim is "The system is Acceptably Secure", but how should this be broken down into sub-claims that map to the available evidence, e.g. the absence of specific vulnerabilities (as established by manual testing or tool scans)?
  • "Software Facts Label"
 http://swaconsortium.org/projects/softwareFacts/softwareFacts.html
  • each language (Java, ASP, etc.) may need separate claims
  • list the third-party software
  • discuss coding practices that were followed, skill levels of developers, amount of independent review
  • publish scanning tool results
  • links to DHS web sites and documents
  • "Arguing Security - Creating Security Assurance Cases"
https://buildsecurityin.us-cert.gov/daisy/bsi/articles/knowledge/assurance/643-BSI.html
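The claim/evidence/argument decomposition above can be sketched as a simple tree: a top-level claim is broken into sub-claims, and each leaf claim is supported by evidence plus an argument for how that evidence supports it. The class and method names below are purely illustrative and are not part of ESAPI or any assurance-case standard.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of an assurance-case node: a claim backed
// either by sub-claims or directly by evidence plus an argument.
public class AssuranceClaim {
    final String claim;
    final List<AssuranceClaim> subClaims = new ArrayList<>();
    final List<String> evidence = new ArrayList<>();
    String argument; // how the evidence supports the claim

    AssuranceClaim(String claim) { this.claim = claim; }

    AssuranceClaim addSubClaim(String text) {
        AssuranceClaim child = new AssuranceClaim(text);
        subClaims.add(child);
        return child;
    }

    // A leaf claim is supported when it has evidence and an argument;
    // an interior claim is supported when all of its sub-claims are.
    boolean isSupported() {
        if (subClaims.isEmpty()) {
            return !evidence.isEmpty() && argument != null;
        }
        return subClaims.stream().allMatch(AssuranceClaim::isSupported);
    }

    public static void main(String[] args) {
        AssuranceClaim top = new AssuranceClaim("The system is acceptably secure");
        AssuranceClaim xss = top.addSubClaim("No XSS vulnerabilities in encoder APIs");
        xss.evidence.add("Static-analysis scan results");
        xss.evidence.add("Manual penetration-test report");
        xss.argument = "Neither manual testing nor tool scans found XSS";
        System.out.println(top.isSupported()); // true
    }
}
```

A real assurance case would also record per-language sub-claims (Java, ASP, etc.) and third-party components, as the list above suggests; this sketch only captures the basic claim/evidence/argument shape.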

Coding Practices

  • was OWASP Top Ten followed?
  • how were performance and security balanced?
  • what is the developers' level of training and amount of experience in web development?
  • were security tools used throughout the development process or only run at the end?
  • how was the code repository protected from unauthorized alterations?
  • practices for code check-in and independent review - how is introduction of Trojans avoided?
  • what threat level is being accounted for (e.g. will this only work against script kiddies)? was threat modeling used?