User talk:Adrian Goodhead

Welcome to OWASP! We hope you will contribute much and well. You will probably want to read the help pages. Again, welcome and have fun! KateHartmann (talk) 08:47, 10 September 2013 (CDT)

== Security testing from the trenches Part 1: The road to secure applications ==

As I reach my 20th year in Information Technology, I feel the need to share those lessons that cannot be picked up from books or training. It is our responsibility to guide newcomers into the terrible and exciting world of IT Security, like Morpheus freeing Neo’s mind from the grip of the machines, or Alice’s most helpful Rabbit. Unfortunately for those companies looking to mass-manufacture IT Security experts for fun and profit (mostly profit), IT Security is a state of mind, and is hard to quantify with pie charts and KPIs.

I was tasked with reviewing a Web Service used widely in educational institutions across the world. A new version of the application had been developed using a well-known Java framework and Apache Tomcat, and the move to a Cloud-based SaaS offering was being marketed. According to the Vendor, the application had been fully tested by a security company and no issues had been identified. The client had also employed a leading application scanning tool to check for vulnerabilities, and according to the Vendor no risks had been found. When I requested access to the scan report, they refused. Happily, the good folks at PortSwigger had just released a new build of Burp, a staple in my toolkit; new versions of this package regularly bring with them discoveries of input validation flaws in many high-profile sites.

During my site enumeration and spidering, my Anti-Virus started going a bit bonkers, complaining about infected content embedded in the web pages being downloaded from the SaaS site in question. And so the Rabbit-Hole extended. Burp detected a number of simple XSS vulnerabilities, which were reported to the Vendor, who then refused to admit that these flaws existed. This was the first time I had come across this much Malware during a test. At this point the sceptic in me started to emerge, and when I requested a copy of the audit report for the software I was quickly informed that this would not be possible. I was told that the security company who had performed the testing was reputable.

So I requested that the security company provide a letter confirming in writing that they had reviewed the application and found it to be secure. At first the Vendor refused, but after lengthy commercial debates with the Project Manager they agreed to provide this proof. Enter a full room of people and hurl a generic insult targeted at no one in particular: those who react the most are likely to be the insecure ones. This is human nature 101, and Software Vendors are no different. We spend large parts of our lives working out the technical dynamics of the Interweb but often fail to use our instincts to guide us, unless we specialize in Social Engineering! The Vendor eventually gave me the details of their testing company, and some brief research on Google determined that they were an “unknown” Security Company in Israel with no industry footprint, which could explain how they had missed fairly Vanilla XSS vectors.

During a conference call, the Development team attempted to discredit the results of the testing process, stating that their scanner had not found these issues and that this “Open-Source” tool was not trustworthy. This type of thinking is one reason why software is still so insecure. After a month of emails and pressure they accepted that these were real issues and agreed to fix them in a patch to their input validation routines. When I performed a retest, the original issues had indeed been fixed, but had been replaced by the same number of XSS weaknesses of a similar variant in each affected form field. The Vendor had used a Blacklist to prevent the attack, so a slightly modified payload would produce a similar result. After another month the Developers agreed to implement a Whitelist for all input validation. On the third retest the application was clear of input validation flaws and the World was saved.
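The blacklist-versus-whitelist point above is worth a concrete illustration. The sketch below is not the Vendor's actual code, just a minimal Python example of why stripping known-bad strings fails while a whitelist of allowed characters holds up; the field rules are hypothetical:

```python
import re

def blacklist_filter(value: str) -> str:
    """Naive blacklist: strip the literal <script> tags and nothing else.
    Illustrative only -- this is the kind of filter that invites bypasses."""
    return value.replace("<script>", "").replace("</script>", "")

def whitelist_valid(value: str) -> bool:
    """Whitelist: accept only characters explicitly allowed for this
    (hypothetical) name field, and reject everything else outright."""
    return re.fullmatch(r"[A-Za-z0-9 .,'-]{1,64}", value) is not None

# A slightly modified payload walks straight past the blacklist,
# because it contains no <script> tag at all...
payload = '<img src=x onerror="alert(1)">'
print(blacklist_filter(payload))   # printed unchanged

# ...but the whitelist rejects anything outside the allowed set,
# while still accepting legitimate input.
print(whitelist_valid(payload))        # False
print(whitelist_valid("Jane O'Brien"))  # True
```

This is exactly the pattern described above: each blacklist patch just shifted the vulnerable payload to a nearby variant, whereas the whitelist closed the whole class of inputs at once.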

Lessons learnt

Always use good Anti-Virus protection when doing site discovery on Windows, and preferably sandboxed Virtualization; this is one very good reason to use a minimized Linux install for App testing. You can be owned simply by spidering a Malware-infested site.

Always perform a Due Diligence exercise on any Security Testing Company that you intend to engage with and ensure that they are an established company, known in the Industry and have qualified testing staff.

When using automated scanners, make sure that you are scanning all forms and interfaces. Scanning a bunch of login pages and declaring yourself secure is just plain silly, but not unusual.
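To make "scan all forms" checkable rather than aspirational, it helps to enumerate every form in a page and compare that list against what the scanner actually covered. This is not how Burp's crawler works internally, just a standard-library sketch of the idea; the sample page is made up:

```python
from html.parser import HTMLParser

class FormCollector(HTMLParser):
    """Collect every <form> action and its input names from an HTML page,
    so scan coverage can be checked against the full list of forms
    rather than just the login page."""

    def __init__(self):
        super().__init__()
        self.forms = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "form":
            self.forms.append({"action": a.get("action", ""), "inputs": []})
        elif tag in ("input", "textarea", "select") and self.forms:
            if a.get("name"):
                self.forms[-1]["inputs"].append(a["name"])

# Hypothetical page with two forms; a scan report covering only /login
# would immediately show /search as an untested input surface.
page = """
<form action="/login"><input name="user"><input name="pass"></form>
<form action="/search"><input name="q"></form>
"""
collector = FormCollector()
collector.feed(page)
for form in collector.forms:
    print(form["action"], form["inputs"])
```

Diffing this inventory against the scanner's request log is a cheap way to catch the "we only scanned the login pages" failure mode described above.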

Developers may be personally attached to their code, because it is their deliverable; if you are stating that it is bad, be prepared to deal with unhappy campers. Use more than one tool, as each tool may find slightly different risks, which means better coverage. This should include a POC if allowed by the client; as a colleague of mine would say, you have been Stall0wn3d!

Always be sensitive to how a Vendor reacts when questioned about the Security of their application, resistance to providing information is a sure-fire sign that they have something to hide.

When a Vendor becomes evasive about Security-related questions, there is a strong possibility that their application has serious security issues.

== Security testing from the trenches Part 2: The blame game ==

Often companies engage a security tester so that they have someone to blame when they get hacked through disastrous security; PCI DSS is a classic example, and here is another. We performed a test for a large gambling company with a huge, sprawling site using all the latest Web technologies. The site had been thoroughly tested before, and we did not identify any serious flaws, which is quite a nice novelty once in a while.

A few months later the client contacted us to say that their site had been fully compromised through a critical vulnerability in an upload feature, which had allowed malicious files to be uploaded to gain unauthorized access. Looking through our logs, we were unable to find any reference to the upload feature URLs that had been provided. I arranged a teleconference to try to manage our client’s expectations and to ensure that this was not an error by either the tools or our engineer. I asked them when the feature had been added, because our test did not cover this part of the application, and therefore we had not tested it. They would not give a clear answer, which immediately put me on guard. So I asked whether any major changes had been made to the application after our tests; the reply was that no changes had been made, except that they had moved to a new Operating System, Web Server, Application Server, Database and hosting company, but absolutely no application changes had been undertaken… At this stage I actually put my hand over my mouth to stop my laughter.

The client still would not admit to having updated the application with new functionality, and it was becoming apparent that this was a “Witch-Hunt” to protect the jobs of the people responsible for the security of the site by blaming the consultants. Sadly this strategy plays itself out regularly; just look at the Target breach for a great example, and be ready to defend yourself against this type of incident. Often, in badly managed organizations, the most mature business process is the blame process.

Lessons learnt

Always keep detailed logs of all scans and site structures, including dates, for at least 12 months. You never know when you may be facing an angry client looking for your scalp.
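Those logs are what saved us in the story above: dated records of exactly which URLs were in scope. A minimal sketch of keeping such a record, assuming a hypothetical append-only JSON Lines file rather than any particular tool's native log format:

```python
import json
import datetime
from pathlib import Path

def log_scan(logfile: Path, target: str, urls_tested: list, tool: str) -> dict:
    """Append one timestamped scan record to an append-only JSONL file.
    Retained records are the proof of exactly what was (and was not)
    in scope on a given date."""
    record = {
        # UTC timestamp avoids ambiguity if a dispute surfaces months later.
        "date": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "target": target,
        "tool": tool,
        # Sorted so two records over the same scope compare equal line-to-line.
        "urls_tested": sorted(urls_tested),
    }
    with logfile.open("a", encoding="utf-8") as fh:
        fh.write(json.dumps(record) + "\n")
    return record

# Hypothetical usage: one record per scan run.
rec = log_scan(Path("scan_log.jsonl"), "https://example.com",
               ["https://example.com/login", "https://example.com/search"],
               "burp")
print(rec["target"])
```

When a client later claims "you tested the upload feature", a grep of this file for the upload URLs settles the question in seconds.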

Always perform full retests of the environment when making major changes to the Infrastructure or Applications.

As mentioned, always use more than one tool to validate results; one tool may be wrong, and a second tool plus manual checks can save a lot of pain.

Always remember when testing a critical site that the client is relying on you to find serious flaws; this is a huge responsibility and can never be taken lightly. Some clients are just not worth working with, which is important to remember if you are scoping assignments or are a freelance consultant. Always meet the client face to face, as email, teleconferencing and Skype can lead to a world of hurt.