Cybersecurity Paralysis as a Dunning-Kruger Problem

By Chris Hamlin

Legend has it that 19th-century laws in the Midwest stipulated that if two trains arrived simultaneously at a crossing of intersecting tracks, neither could proceed until the other had gone ahead.

Some aspects of the current predicament in cybersecurity resemble this paradoxical situation, in that the actors in the drama hold conflicting and contradictory interests and positions which lock the situation in a kind of stasis.

Further aggravating matters is the reliance of most of the digital world today on archaic technological foundations which are inadequate, both algorithmically and physically, to the demands of future security applications which are bearing down on us with accelerating urgency. 

Finding a way forward which satisfies the requirements of the various players in cybersecurity is no easy task.

To facilitate such an exploration, it may be helpful to think of it in terms of the famous Dunning-Kruger experiments, which we'll summarize. But first it's important to look at just who the actors are, and how their interests diverge.

First (and probably foremost) we have the users: the individuals and institutions whose digital assets are exposed to risk because their activities depend on computers and networks which adversaries seek to penetrate. The economic incentive to compromise systems and data stems largely from the value of these assets, which runs to many trillions of dollars, and also from strategic considerations.

Second, we have vendors of computing and networking hardware and software. These organizations attempt to make money by offering two opposed propositions: speedily and reliably moving, processing, and storing vast quantities of data, while inexpensively protecting that data from usurpation by malicious interlopers. Cost and competitive pressures mean that vendors face continuous demands for more value at lower prices; security is usually sacrificed in the process.

Third come the government bodies charged with overseeing the immense infrastructure of the modern digital communications ecosystem. Here again we find conflicting motivations and incentives: on the one hand to defend society's network nervous system against attack, and on the other to inspect it and its content at will. 

Opening up the capacity to enter the traffic flow at will means potentially opening that capacity to the fourth group of actors: opponents with malign intent who seek to profit from unauthorized access to the digital data and metadata coursing through the channels of the networked world, the strategic and economic lifeblood of every country on earth.

Added to this collection of institutions and groups is the fact that the technologies the networked world has grown up with and depends upon have their theoretical and conceptual foundations in work whose origins stretch back a century or more. The semiconductors, software, algorithms, protocols, and von Neumann computing architectures upon which we rely are nothing short of archaic when compared with the task society has asked them to undertake: safely carrying all of the economic, personal, and strategic communications of the world.

The cross-cutting and conflicting interests and needs which these four sets of actors bring to the picture have had the effect of forestalling effective change, in the manner of the 19th century trains cited earlier.

One way of understanding this paralytic immobility, even as millions of illicit appropriations of digital data and other assets accumulate by the day, is to see it as a consequence of behaviors first demonstrated experimentally by the researchers Dunning and Kruger in 1999.

What they found (and demonstrated through an ingenious series of tests) was 1) that unskilled people often overrate their competence, because they lack the very competence necessary to evaluate it objectively, and 2) that very skilled people often underrate their own competence.

If many of the players in the cybersecurity game are thought of as participants in a Dunning-Kruger experiment, the possibility (indeed the likelihood) is that necessary improvements will not be made because enough of the players don't understand the problem well enough to mobilize around correct solutions.

This shortcoming is exacerbated by the ambivalence of governing bodies and agencies with conflicting objectives, since they benefit from the very weaknesses which cybersecurity vendors are supposed to know how to repair.

Thus the standoff: vendors are happy to provide uncomprehending customers with solutions which even they (the vendors) do not realize fail to address the security risks and challenges their customers face. Add to this the inadequacy of the architectural underpinnings of the technology, and you have a situation inviting attack and exploitation by ill-intentioned adversaries.

One antidote to this logjam is new technology with the ability to answer the needs of all the players while honoring the tensions which are inherent in the differing roles and motivations present, and defeating the designs of adversaries.

BlackRidge TAC (Transport Access Control) is an example of just such a new technology: while offering potent defenses against the most determined attacks, it may be deployed across a very broad range of risk scenarios and mitigation strategies, so that customers and vendors are assured of sufficient yet cost-effective protection against the fullest scope of threats to networks and systems.

Moreover, the unique, innovative principles upon which BlackRidge TAC is based allow for continuous, adaptive responses to evolving threat types, so that customers' risk management strategies stay ahead of the adversary instead of playing “catch-up”, without excessive expenditures for capacity that is largely unneeded.
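The general idea behind transport-layer access control can be illustrated with a small sketch. This is not the actual BlackRidge TAC protocol or wire format, whose details are proprietary; it is a conceptual, hypothetical example of the broader first-packet authorization pattern, in which a gateway admits a connection only if its very first packet carries a valid, time-bounded identity token, and silently drops everything else so that unauthenticated probes learn nothing about the protected systems. The token-derivation scheme, function names, and field sizes below are illustrative assumptions.

```python
import hmac
import hashlib
import os

# Shared secret between the token generator (client side) and the
# gateway (server side). In a real deployment this would come from a
# key-management system, not os.urandom at startup.
SECRET = os.urandom(32)

def make_token(identity: str, time_slot: int) -> bytes:
    """Derive a short, time-bounded identity token (hypothetical scheme).

    Binding the token to a coarse time slot limits replay: a captured
    token stops working once the slot rolls over.
    """
    msg = f"{identity}:{time_slot}".encode()
    return hmac.new(SECRET, msg, hashlib.sha256).digest()[:8]

def gateway_check(token: bytes, time_slot: int, allowed: set) -> bool:
    """Decide, from the first packet alone, whether to admit a connection.

    If the token matches no known identity, the gateway sends no
    response at all: to an unauthenticated scanner, the host is dark.
    """
    for identity in allowed:
        if hmac.compare_digest(token, make_token(identity, time_slot)):
            return True
    return False  # silently drop; do not reveal the host's existence

# Usage: a valid token for "alice" is admitted; random bytes are not.
slot = 1700
token = make_token("alice", slot)
print(gateway_check(token, slot, {"alice", "bob"}))          # True
print(gateway_check(os.urandom(8), slot, {"alice", "bob"}))  # False
```

The design point this sketch captures is asymmetry of information: the defender can verify identity before committing any resources or revealing any services, while the attacker sees nothing to attack.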

In this sense BlackRidge TAC offers a remedy to the unhealthy paralysis which fosters rising cybercrime and compromises the privacy and security of individuals, agencies, and corporations.