We recently completed SBIR research funded by DHS on cybersecurity and software assurance. In that research, the necessity of a knowledge base for software assurance results became crystal clear. We looked at relating software assurance methodologies – static analysis (evaluating the system design and source code) as well as dynamic analysis (tools that attempt to penetrate a system at runtime). We also looked at the integration and correlation of static and dynamic analysis. The really exciting results came from the correlated results. A video demonstrating these results can be seen here:
There are many tools that analyze systems for vulnerabilities, and even some that can integrate the results of multiple tools and provide pretty pictures or reports. However, this is a naïve approach and one that cannot provide the full hybrid analysis capabilities needed to overcome the sophisticated threats to our critical systems. These more naïve approaches tend to provide a “black box” into which some information goes and out of which answers pop. What this fails to provide is the ecosystem foundation whereby there can be any number of inputs from a wide variety of static and dynamic sources, any number of correlation and analysis algorithms (which must evolve independently, as attackers’ sophistication does), and any number of reporting and visualization capabilities. A closed system (or a set of non-interoperable closed systems) can only ever be a simple tool – not a technical foundation or the basis of an ecosystem.
The problem with any single-point or closed solution is that the job of software assurance is, by necessity, never complete. There will always be new patterns of attack and new patterns of vulnerabilities. In addition, any single tool invariably has many false positives (things marked as weaknesses that are not) and false negatives (vulnerabilities that have been missed). The false positives can run into the thousands, making true validation impractical – and, of course, a SINGLE missed vulnerability can be disastrous. Only when the capabilities of multiple techniques, algorithms, visualizations and tools are combined can a realistic picture of a system’s risk be understood and its vulnerabilities mitigated. We need an environment where these different tools, algorithms and visualization capabilities “plug in” to a common repository of facts about a system. Our tests have indicated that a federated approach to assurance reduces false positives by an order of magnitude (or more) and improves coverage, producing fewer false negatives – i.e. a more trusted system.
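To make the federation idea concrete, here is a minimal sketch of one way static and dynamic findings can be cross-checked. The `Finding` structure, tool names and locations are invented for illustration – this is not the schema or API of any specific product or of KDM – but it shows the mechanism by which correlation demotes unconfirmed static warnings and promotes those confirmed at runtime.

```python
# Hypothetical sketch: correlating static warnings with dynamic
# (runtime/penetration) evidence. All names here are illustrative.
from dataclasses import dataclass

@dataclass(frozen=True)
class Finding:
    tool: str        # which analyzer reported it
    location: str    # e.g. "auth.c:77"
    weakness: str    # e.g. a CWE identifier

def correlate(static_findings, dynamic_findings):
    """Split static warnings into high- and low-confidence sets.

    A static warning confirmed by a dynamic result at the same
    location is promoted to high confidence; unconfirmed warnings
    are kept but demoted - this is how federation cuts false positives
    without silently dropping potential true findings.
    """
    confirmed = {f.location for f in dynamic_findings}
    high, low = [], []
    for f in static_findings:
        (high if f.location in confirmed else low).append(f)
    return high, low

static = [Finding("static-scan", "parser.c:10", "CWE-121"),
          Finding("static-scan", "auth.c:77", "CWE-89")]
dynamic = [Finding("pen-test", "auth.c:77", "CWE-89")]
high, low = correlate(static, dynamic)
```

In a real federated repository the join would of course be richer than an exact location match, but the principle – independent evidence streams confirming one another – is the same.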
The open knowledge base approach is fundamental to providing an agile, open and essentially unlimited capability to understand, analyze, secure and visualize our systems. With one query a new correlation can be tested, a new path explored or a new report produced. It is the essential move of cybersecurity from a tool orientation to a knowledge and information orientation.
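The “one query” claim can be illustrated with a toy fact store. Here an in-memory SQLite table stands in for the knowledge base; the schema and the sample rows are assumptions made for this sketch and are not the KDM schema. The point is that once facts from all sources live in one place, testing a new correlation really is a single query:

```python
# Toy stand-in for a shared fact store; the schema is invented
# for illustration and is not the KDM representation.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE facts (source TEXT, kind TEXT, location TEXT)")
db.executemany("INSERT INTO facts VALUES (?, ?, ?)", [
    ("static-scan", "weakness", "auth.c:77"),
    ("static-scan", "weakness", "parser.c:10"),
    ("pen-test",    "exploit",  "auth.c:77"),
])

# A new correlation, tested with one query: static weaknesses
# that have a matching runtime exploit at the same location.
rows = db.execute("""
    SELECT s.location
    FROM facts s JOIN facts d ON s.location = d.location
    WHERE s.kind = 'weakness' AND d.kind = 'exploit'
""").fetchall()
```

Swapping in a different correlation means writing a different query, not building a different tool – which is the tool-orientation-to-information-orientation shift described above.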
With a knowledge base as a foundation we have the makings of an ecosystem, not just a product. Anyone can add an adapter to gather new system facts; anyone can add new correlation or analysis algorithms based on the information from any of the inputs, including inputs contributed by others; and anyone can add visualizations or reports over the information produced by any input or any correlation or analysis algorithm. It should be noted that while the ecosystem is open due to the knowledge base, any component may be internally open or closed, based on the commercial or security interests of its authors. This openness and flexibility are crucial for a successful software assurance ecosystem.
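The three plug-in roles described above – adapters, analyses, reporters – can be sketched as a tiny registry over a shared fact list. The decorator names and fact tuples are assumptions for this sketch, not a published API; the shape is what matters: every component reads from and writes to the common repository, and none needs to know about the others.

```python
# Illustrative plug-in shape for the ecosystem: adapters contribute
# facts, analyses derive new facts, reporters render them. Names are
# invented for this sketch.
FACTS, ANALYSES, REPORTERS = [], [], []

def adapter(fn):      # gathers new system facts into the shared store
    FACTS.extend(fn()); return fn

def analysis(fn):     # registers an algorithm over existing facts
    ANALYSES.append(fn); return fn

def reporter(fn):     # registers a report/visualization (open or closed)
    REPORTERS.append(fn); return fn

@adapter
def static_scan():
    return [("weakness", "auth.c:77")]

@adapter
def pen_test():
    return [("exploit", "auth.c:77")]

@analysis
def confirm(facts):
    # Correlation: a weakness with a runtime exploit is confirmed.
    exploited = {loc for kind, loc in facts if kind == "exploit"}
    return [("confirmed", loc) for kind, loc in facts
            if kind == "weakness" and loc in exploited]

@reporter
def summary(facts):
    return sorted(set(facts))

for run in ANALYSES:
    FACTS.extend(run(FACTS))
report = REPORTERS[0](FACTS)
```

Note that `confirm` depends only on the facts, not on which adapter produced them – a new fuzzer or scanner plugs in without any change to the analyses or reports downstream.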
Since it is our desire and expectation to have this degree of openness and flexibility, the capabilities of and interfaces to the knowledge base are crucial. It has to be robust enough, scalable enough and capable enough to manage such varied requirements. The approach to the knowledge base we evaluated integrates the experience of dozens of experts in the field who have contributed to an open standard – ISO/IEC 19506 (OMG-KDM). Home-grown or ad-hoc approaches to the problem are unlikely to match the deep experience embodied in the standard, and reproducing that capability in a non-standard way would be expensive and counter-productive.
Of course any standard must be “field proven,” and this project has been a test of the standard’s ability to encompass the new capabilities of hybrid analysis. The good news is that no changes to the standard’s foundation were required. All that was required was the addition of definitions, in terms of KDM primitives, of the new capabilities: execution traces and penetration results. Defining new capabilities in terms of primitives is the intended way for the knowledge base to grow in scope and flexibility. The KDM-based repository performed admirably in this project and exceeded expectations.
It is therefore our conclusion from this research that ISO/IEC 19506 (OMG-KDM) and the knowledge-based approach have been validated for hybrid analysis and cybersecurity. There is no reason for proprietary, black-box or closed solutions. Further, it is apparent that it is irresponsible to deploy any system in today’s environment without rigorous software assurance.