The Cybersecurity Report: A Case for a Dynamic Deliverable

Death to the Document

By Anna Lee

Since 2019, PlexTrac has held a unique vantage point in the penetration testing and vulnerability assessment subsectors of the information security industry. Our clients use our platform to deliver the results of their testing and assessment activities to their end users, and they generally want to do that in a very bespoke manner. We make it easy for them to do this, and thus we've seen A LOT of cybersecurity report templates.

One of our services is the automated production of a .docx report in a near pixel-perfect representation of the clients' existing report template. In fact, the PlexTrac team has torn apart and recreated over 600 different report templates from many of the top information security consultancies and Fortune 500 internal testing teams. We spent on average over 25 hours with each template, and today we have a team of eight who continue this work.

This has by no means been a labor of love. We do this work not because we believe it is the best method of report delivery, but because we continue to serve an industry that mistakenly believes a document is the best way to deliver security results. And since a document is the recognized standard for the final deliverable, consultancies try to differentiate themselves more in the sizzle of formatting than in the content of their reports.

But is the static report template really the thing differentiating one service provider from another? Are the formatting, branding, or even the different content included in the report really what separates a skilled, elite testing team from a lesser one? Are we delivering the value in the static document-based report that we think we are?

Learn how PlexTrac can not only transform the deliverable for consumers but also improve your effectiveness and efficiency and cut reporting time in half.

Perhaps the Cybersecurity Report Sauce Isn't so Secret

As we collected and converted hundreds of report templates so our clients could streamline their report building processes and still output their perfectly honed document, we began to notice that the differences between the majority of reports were primarily cosmetic. Each client has meticulous and granular formatting and branding requirements that are considered key components of their "secret sauce." There are the obvious and easy attributes such as fonts, sizes, styles, margins, colors, and spacing. But there are also fanciful tables, branding logos, custom numbering schemas, and non-standard finding attributes.

To determine whether the actual content of the different templates varied much, we conducted some research. By selecting 50 technical testing reports from industry-leading consultancies and mature internal testing teams and then coding each report (statistically speaking) for the data types presented, we hoped to determine how unique each client's "secret sauce" really was. Turns out, not very secret.

What Is the Recipe for a Successful Cybersecurity Report?

Our analysis clearly indicated that there are core components of technical testing reports with a high degree of adoption across the sample set.
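As a rough illustration of that coding exercise (this is a minimal sketch with toy data, not our actual analysis tooling or real report contents), tallying which data types appear in each coded report and computing adoption rates might look something like this:

    # Hypothetical sketch: record which data types each coded report contains,
    # then compute each data type's adoption rate across the sample set.
    from collections import Counter

    # Toy stand-in for the 50 coded reports; each set lists the data types observed.
    coded_reports = [
        {"executive summary", "scope narrative", "methodology", "severity table", "attack path"},
        {"executive summary", "scope narrative", "severity table", "cvss scores"},
        {"executive summary", "methodology", "severity table"},
    ]

    counts = Counter(data_type for report in coded_reports for data_type in report)
    adoption = {data_type: n / len(coded_reports) for data_type, n in counts.items()}

    # Data types present in more than half of the sample form the core framework.
    core_framework = sorted(dt for dt, rate in adoption.items() if rate > 0.5)
    print(core_framework)

With the toy data above, only the broadly shared items clear the 50 percent bar, which is exactly the kind of cut we applied to the real sample.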
If one were to create a content framework from those data types that have > 50 percent adoption, it would look like this:

Executive Summary Contents
- Introduction Narrative
- Scope Narrative
- Summary of Findings Narrative
- Summary of Findings Table
- Findings Count by Severity Table
- Methodology Narrative
- Severity Explanations Table

Detailed Findings Content
- Title
- Description
- Recommendations (Verbose)
- Affected Assets
- Evidence / Technical Details

We assert that most testers would recognize this list and agree that it comprises the core expectations of the consumers of these reports. However, is the inclusion of finding data such as CVSS scores, references, impact and likelihood scores, and CVE mappings truly just padding? Note also the absence of an entire section delivered by many penetration testing and red teams: the Attack Path. There are strong arguments that these lesser-adopted data types provide additional information and context that can assist consumers with prioritization and remediation efforts.

So if we accept that there is value in lesser-adopted data types, why do they not have greater adoption? Maybe some testing organizations just have a greater degree of rigor around their testing and reporting. One may also point to differences in the consumer profiles these organizations serve. However, we think the most likely root cause is that testing organizations are making trade-offs to address the fundamental limitation of document-based report delivery: signal-to-noise ratio.

Maximizing Signal and Minimizing Noise in Your Cybersecurity Report

Document-based reporting suffers from the law of diminishing returns. The more information that is included, the more difficult it becomes to find the information needed for the task at hand or to prioritize analysis of that data. Information that is useful "signal" to one consumer may not be germane to another, and thus becomes "noise." Leaders of consulting and internal teams are forced to make subjective assessments as to whether any given data field provides enough value to enough consumers to offset the distraction it will inevitably create for others. More often than not, testers make the conservative decision to include the additional data for fear of angering the minority of consumers who do believe the data is valuable. And this is why we have a communal archetype of the 300-page PDF report.

For as long as a static document is the primary deliverable for an engagement, we can never optimize delivery of the right information to each and every consumer.

But Wait, There's More … Problems with the Current Cybersecurity Report Paradigm

While the signal-to-noise problem may be the cardinal sin of document-based reporting, it is by no means the only one. In our interactions with report writers, we routinely hear a plethora of additional gripes with traditional document-based reporting:

- Inefficient editing workflows
- Version control issues
- Inactionable, static data
- One-size-doesn't-fit-all templates

Despite these problems, the vast majority of pentesters practically worship their document. And why wouldn't they, considering the value it has historically represented:

- It is the concrete deliverable that gets them paid.
- It represents weeks of work and an opportunity to stand out from the competition.
- It is time consuming to create.
- It meets the expectations we've trained into customers.

But, for the most part, these value propositions only exist because they represent the status quo. They exist because we have collectively accepted them, not because they are the best way. The primary purpose of testing is to provide information to an organization that enables it to actually improve its security posture. If the static document deliverable is no longer the best way to communicate actionable results, it's time to move on.
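To make the signal-to-noise trade-off concrete, consider findings kept as structured records rather than prose frozen into a document. The sketch below is purely illustrative (the field names and sample findings are hypothetical, not a PlexTrac schema): each consumer pulls only the slice that is signal to them instead of paging through everything.

    # Hypothetical sketch: findings as structured records that different
    # consumers can filter, rather than a single fixed document.
    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class Finding:
        title: str
        severity: str                      # e.g. "Critical", "High", "Low"
        cvss: Optional[float] = None
        cves: List[str] = field(default_factory=list)
        affected_assets: List[str] = field(default_factory=list)

    findings = [
        Finding("Log4j remote code execution", "Critical", 10.0,
                ["CVE-2021-44228"], ["app01.example.com"]),
        Finding("Verbose server version banner", "Low", None,
                [], ["web02.example.com"]),
    ]

    # An asset owner may want only critical findings on systems they manage;
    # a compliance reviewer may want everything that maps to a published CVE.
    app01_criticals = [f for f in findings
                       if f.severity == "Critical" and "app01.example.com" in f.affected_assets]
    cve_mapped = [f for f in findings if f.cves]

The same underlying data serves both audiences; what changes is the view each one requests.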
A Cybersecurity Report Deliverable for the 21st Century

We propose a new primary deliverable for offensive testing: a dynamic report delivered electronically via a web-based platform. An electronic report, not bound by a static document format, offers some obvious advantages. The information delivered can be easily tailored to the needs of every client, customers can access and interact with data throughout the testing process rather than only at the end, and, above all, the results are much more actionable. While static documentation is unlikely to be eliminated entirely, since it is good proof of work completed and a helpful point-in-time marker (especially for auditors!), it doesn't need to be the primary deliverable teams rely on to drive remediation forward.

We think of it as a banquet versus a buffet. At a banquet, you eat what you are served. It's nice, but it may or may not be what you really wanted, and you may leave hungry or stuffed. At a buffet, each person chooses what and how much they want, and nobody is waiting to be served. Likewise, with dynamic reporting through a client-accessible platform, customers can have all the testing data they want without being overwhelmed by the data they don't. They can easily parse the information into actionable tasks and begin to see real improvement to processes and posture. A dynamic, electronic deliverable also keeps the pentester relevant as the industry moves toward shorter, more frequent testing cycles and automated pentesting solutions.

That web-based, dynamic reporting solution is PlexTrac, the premier pentest reporting and collaboration platform. Request your demo of PlexTrac today!

Anna Lee