Another major breach of cybersecurity will soon be in the news. The only question is how dramatic and costly that breach will be, and whether the full extent of the damage will ever be made public. Worse still, should hackers gain access to the financial records of a major national bank or credit card issuer or an important defense contractor, the recent thefts of consumer information, such as those that occurred at retailers Target and Home Depot, may seem comparatively insignificant.
What accounts for the increase in cybercrime? Three broad new security challenges have emerged. First, there has been a previously unimaginable explosion in the amount of data, connections, transactions and communications that has overloaded traditional data systems. Think of the issues that arise when you have a cyberconversation with 1,000 people at the same time, all of whom can respond at warp speed to what you're saying and are themselves connected with thousands of others. How do you detect potential risks among an ever-expanding constellation of connections and data?
Second, institutions have lost the ability to identify problems. Faster innovation cycles and a dizzying array of new products mean that most businesses find themselves unable to quickly recognize security breaches. Social networking systems, big data, cloud computing, mobile Internet and Internet of Things technologies are generating personal data streams that have made authorization and message filtration extraordinarily difficult.
Third, there is a lack of formal control mechanisms. In an environment where cybersecurity disruptions are becoming more pervasive and sophisticated, there are still no recognized standards for detection, response, remediation and enterprisewide communication. The management of these critical functions is often left to the IT department, which is usually directed to pursue outdated hardened-shell strategies designed only to discourage penetration (see also Jack Lew Calls for Action on Cybersecurity).
The old approach to cybersecurity required users, processes and code to conform to its specific requirements. But if we want to exploit and protect our IT capabilities, our systems must be open to the outside world and, crucially, be able to learn from it. We need an information security model that continually assesses the validity, reliability and value of the information it gathers.
The human immune system offers a helpful analogy. When a germ breaches the body's natural barriers, the immune system mounts a three-step defense: Sound the alarm, solve the problem and recover and remember. The first defenders on the scene are the white blood cells, which constantly circulate throughout the body, much like police on patrol. Next, specialized white blood cells called lymphocytes engage in a two-pronged attack, one directed at infected cells and the other at hostile microbes roaming through the blood. Finally, once the invaders and the compromised cells have been destroyed, the immune system's soldiers return to their bases, leaving a smaller number of seasoned veterans to attack should the invader return.
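The immune-system cycle above translates naturally into a detect, respond and remember loop. The sketch below is purely illustrative, assuming invented event fields, thresholds and a placeholder quarantine action rather than any real security API; the point is how "seasoned veterans" (remembered signatures) make a repeat attack cheap to catch.

```python
# Illustrative sketch of the immune-system analogy:
# sound the alarm (detect), solve the problem (respond),
# recover and remember (store the signature for next time).
# All names, fields and thresholds here are hypothetical.

KNOWN_SIGNATURES = set()  # "seasoned veterans": remembered past attacks

def sound_alarm(event):
    """Detect: flag anything previously seen, or well outside normal behavior."""
    return event["signature"] in KNOWN_SIGNATURES or event["anomaly_score"] > 0.8

def respond(event):
    """Respond: isolate the affected host (a placeholder action)."""
    return f"quarantined {event['host']}"

def remember(event):
    """Recover and remember: record the signature for faster future detection."""
    KNOWN_SIGNATURES.add(event["signature"])

def handle(event):
    if sound_alarm(event):
        action = respond(event)
        remember(event)
        return action
    return "no action"

# A repeat of a remembered attack is caught even with a low anomaly score.
first = {"host": "db-01", "signature": "mal-xyz", "anomaly_score": 0.9}
repeat = {"host": "web-02", "signature": "mal-xyz", "anomaly_score": 0.1}
print(handle(first))   # quarantined db-01
print(handle(repeat))  # quarantined web-02
```

The memory set is what distinguishes this from the hardened-shell model criticized earlier: the system learns from each breach instead of merely trying to repel it.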
The effectiveness of a cybersecurity defense, like that of the immune system, depends largely on each component efficiently fulfilling its role. Corporations clearly need to manage cybersecurity at the enterprise level and must improve the ability of each element (line management, operations, internal audit, risk and compliance) to satisfy its individual and organizational functions (see also How Exchanges Should Tackle Cybersecurity).
Constant surveillance, early warning indicators, multiple layers of defense and learning from events are all critical pieces of cybersecurity. When things do go wrong, and sooner or later they will, the ability to quickly identify the problem will lead to a more effective recovery. Security cannot be guaranteed, but timely reaction can.
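One minimal sketch of an early warning indicator is a rolling baseline with an alert threshold: flag any observation that drifts far above recent normal behavior. The metric, window and threshold below are invented for the example; a real deployment would tune them empirically against its own traffic.

```python
# Hypothetical early-warning indicator: alert when a count exceeds the
# rolling baseline mean by more than `sigmas` standard deviations.
from statistics import mean, stdev

def early_warnings(counts, window=5, sigmas=3.0):
    """Return indices where a count breaches the rolling baseline threshold."""
    alerts = []
    for i in range(window, len(counts)):
        baseline = counts[i - window:i]
        threshold = mean(baseline) + sigmas * stdev(baseline)
        if counts[i] > threshold:
            alerts.append(i)
    return alerts

# Hourly failed-login counts; the spike at index 7 stands out.
hourly_failures = [4, 5, 3, 6, 4, 5, 4, 60, 5]
print(early_warnings(hourly_failures))  # [7]
```

The design choice matters: an absolute threshold must be reset as volumes grow, while a rolling baseline adapts to the expanding data streams described earlier.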
Cybersecurity is not a problem to be solved; it is an ongoing risk to be managed. Just as you subject yourself to an annual medical examination, senior management must institute independent cybersecurity review processes. Threats should be viewed in the context of tolerance levels, high and low, and treated accordingly. A comprehensive management framework for cybersecurity should encompass governance, the setting of objectives, rapid identification of events, risk assessment, and response and control activities.
Finally, it is important to leverage emerging open source intelligence services and use the data gathered to guide ongoing cybersecurity investment. Over time that data will be pooled, allowing new tools to be developed to analyze, prevent and mitigate the cyberthreat of the day. This new intelligence will be particularly valuable when proactively quantifying risks and evaluating the investment levels required to protect specific assets.
David X. Martin is a senior adviser to consulting firm Oliver Wyman and author of The Nature of Risk.