AppSec Blog


Exchanging and sharing of assessment results

[Cross posted from SANS ISC]

Penetration tests and vulnerability assessments are becoming more common across the industry as organizations find it necessary to prove a certain level of security for their infrastructure and applications. The need to exchange test result information is also increasing substantially. External parties ranging from business partners and clients to regulators may ask for proof that tests were done, as well as the results of those tests (a.k.a. a clean bill of health).

The sharing of pentest information can create a huge debate: just how much do you want to share? There are at least a couple of ways to get this done. The seemingly easiest way is to share the whole report, including the summary and the detailed findings. While this seems easy, the party sharing the report may be exposing too much information. A pentest report can be a treasure map for attacking an infrastructure or application: the detailed findings usually include ways to reproduce each attack, effectively documenting a potential attack path step by step. It is true that vulnerabilities should be fixed as soon as possible after the pentest is done, but consider this scenario: the day after the pentest is completed, a regulator shows up and asks for the most recent test results. Unless you are above the law, you will have to hand over the latest report, which is full of as-yet-unfixed flaws.

Another way to share pentest results is to share only the executive summary. This portion of the report usually gives a good overall view of what was done in the test and the overall security posture of the test subject. While this protects the party sharing the results, it may not give the reviewer the right kind of information. Some executive summaries do not contain sufficient information, especially those written by less competent testers. Aside from that, one trend I am noticing is that the less experienced the receiver of the test results, the more he or she wants to see the whole report; they simply do not know how to determine the security posture from the executive summary alone.

There is no current industry standard for this kind of communication; all of the exchange and sharing currently done seems to be on an ad-hoc basis. Some like it one way and others like it another. My baseline for this kind of communication is a well-written executive summary that covers what was actually tested, the methodologies used, and a high-level view of the vulnerabilities that were found; that is sufficient to give a decent view of the overall security posture. It can obviously escalate into sharing the full report if the quality of the executive summary just isn't there.
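
For illustration only, here is a minimal Python sketch of reducing a full report to that kind of summary-level view. The field names and structure are hypothetical, not a proposed standard; the point is simply that the shared artifact keeps scope, methodology, and high-level counts while dropping anything that would help reproduce an attack.

    from collections import Counter

    def build_shareable_summary(report):
        """Keep scope, methodology and high-level counts; drop anything that
        would help reproduce an attack (URLs, parameters, payloads, steps)."""
        findings = report["findings"]
        return {
            "scope": report["scope"],
            "methodology": report["methodology"],
            "test_date": report["test_date"],
            "findings_by_severity": dict(Counter(f["severity"] for f in findings)),
            "findings_by_category": dict(Counter(f["category"] for f in findings)),
        }

    # Hypothetical full report; "reproduction" is exactly the kind of detail
    # that never makes it into the shared summary.
    full_report = {
        "scope": "example.com web application",
        "methodology": "OWASP Testing Guide",
        "test_date": "2010-11-15",
        "findings": [
            {"category": "SQL Injection", "severity": "High",
             "reproduction": "POST /login with payload ..."},
            {"category": "Cross-Site Scripting", "severity": "Medium",
             "reproduction": "GET /search?q=..."},
        ],
    }

    print(build_shareable_summary(full_report))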

If you have any opinions or tips on how to communicate this kind of information, let us know.

2 Comments

Posted November 19, 2010 at 3:25 PM

Ryan Barnett

The WASC Web Application Security Statistics Project had to deal with similar issues - http://projects.webappsec.org/w/page/13246989/Web-Application-Security-Statistics

There were 8 different assessment companies (both SAST and DAST) that had to figure out how to share assessment data and make the results public.

The key for public release is to not include any identifying data about specific organizations or attack vectors. They only focused on the attack/vulnerability categories.
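
A minimal Python sketch of that kind of category-only aggregation (the data layout here is hypothetical, not the actual WASC submission format) might look like this:

    from collections import Counter

    def aggregate_for_publication(contributions):
        """Pool findings from several contributors and publish only counts per
        vulnerability category; contributor names and targets are discarded."""
        totals = Counter()
        for _company, findings in contributions:
            totals.update(f["category"] for f in findings)
        return dict(totals)

    # Hypothetical contributions from two assessment vendors.
    contributions = [
        ("vendor-a", [{"category": "SQL Injection", "target": "acme.example"},
                      {"category": "Cross-Site Scripting", "target": "acme.example"}]),
        ("vendor-b", [{"category": "Cross-Site Scripting", "target": "globex.example"}]),
    ]

    print(aggregate_for_publication(contributions))
    # {'SQL Injection': 1, 'Cross-Site Scripting': 2}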

Posted November 19, 2010 at 7:50 PM

Andre Gironda

Anurag Argawal, Matteo Meucci, and Eoin Keary are working on an OWASP common vulnerability list. There was a recent discussion on owasp-leaders from Daniel Cuthbert on a "Common web application vulnerability naming standard" where Eoin subtly announced this project.

You are correct that application security and penetration-testing consulting companies have a lot of different standards when it comes to documentation, but many follow the Big 4 consulting or government subcontractor styles, as many of the original founders of, say, Foundstone came from E&Y, and many of the @Stake founders came from BBN.

There have been attempts at creating standardized report templates, vulnerability-finding templates, and report-creation guidance from a few places, notably OWASP; however, I have not seen these utilized, as clients often demand their own styles/standards and consulting teams often have their own unique requirements depending on how they share information using their own internal processes and perhaps penetration-test workflow tools (e.g. the Dradis Framework). I'm only mentioning one here because it appears to be under active development and heavily utilized compared to any others I've seen or heard about.

A few companies have a very well-known methodology and branding around this process. Gotham Digital Science, in particular, makes casual mention of their Application Security Directives (ASD) on their website. Aspect Security leverages their appsec control library, with categories based on the type of control. They are different, but all tend to play off the MITRE CWE/CAPEC work, or potentially the OWASP T10, WASC TC, Microsoft STRIDE, or other popular threat-modeling, appsec enterprise architecture, or similar open-standard language documentation. ISC2, which has historically been successful with their CBK, hasn't built a strategically significant CBK for appsec through their CSSLP program yet, and neither has SANS/GIAC (they are using the MITRE CWE in a lighter, more consumable format).

Speaking only to penetration testing, I think that MITRE CAPEC is worth a strong look as a standard, but the language around other aspects is certainly better understood from the ISECOM OSSTMMv3 perspective, which is on the ISO track. If reports could be done in the OSSTMMv3 STAR/RAV format, this would certainly lead to report and results consistency; however, OSSTMMv3 is primarily focused on LAN, WAN, Internet/DMZ, wireless communication, and physical penetration-testing activity, and is largely ineffectual when dealing with appsec, cloudsec, virtualization security, or data security activities. Dreamlab has come closest to integrating the work of ISECOM and OWASP in their Certified Secure Web project, but I haven't seen any public documentation and it's been a few years since it was announced.
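
As a rough illustration in Python, tagging findings against a common taxonomy such as CAPEC could be as simple as the sketch below; the mapping and field names are placeholders of my own and should be checked against the current CAPEC catalog rather than treated as any published format:

    # Illustrative mapping only; verify identifiers against the CAPEC catalog.
    CAPEC_MAP = {
        "SQL Injection": "CAPEC-66",
        "Cross-Site Scripting": "CAPEC-63",
    }

    def tag_with_capec(finding):
        """Attach a CAPEC identifier so two vendors name the same issue the same way."""
        finding["capec_id"] = CAPEC_MAP.get(finding["category"], "unmapped")
        return finding

    print(tag_with_capec({"category": "SQL Injection", "severity": "High"}))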

If you are looking for a very successful, albeit higher level, initiative, check out the Financial Institution Shared Assessments Program (FISAP). Um, here: www.sharedassessments.org/media/AUP%20v5%20Assessment%20Report%20Template%202010.doc
