Everyone agrees on the need for strong cybersecurity policy. Each month, we see headlines telling of high-profile hacks and expansive bugs that threaten our nation’s commerce, privacy and even our safety. But there is much disagreement on how best to proceed.
Some commentators suggest that a top-down, government-directed solution is the only path forward. Yet history and experience suggest that a bottom-up, decentralized solution to our cybersecurity problems may prove more robust in the long run.
Consider the question of vulnerable devices. Observers like computer security expert Bruce Schneier have argued that there is simply no market incentive for device manufacturers to ensure that their products are reasonably secure before shipping them off to consumers who aren’t exactly security-savvy. Businesses, the argument goes, would rather make a short-term buck and risk calamity than take the extra time to probe devices for vulnerabilities.
Indeed, if the marketplace consisted of nothing more than security-ignorant buyers and sellers who care only for short-term profits, we’d have a big problem. Devices would be chronically insecure without some kind of top-down regulation.
But first remember that computing devices are hardly “unregulated.” Federal agencies like the Federal Trade Commission (FTC) already have the authority to oversee and investigate when product sellers are suspected of foul play.
For many, this existing regulation is not enough. They propose the creation of some new federal body — specific ideas have included a “Department of Robotics” or “Federal Software Commission” — that would be granted new and expansive authority over vast domains of our economy. The policy details of such proposals are generally thin: Create an agency, and expect it to solve our security woes.
There are many problems with the top-down approach. First, it fails to recognize the many real market incentives that contribute to good security. Bad actors already risk being blacklisted by responsible companies, losing customers to a damaged reputation, or facing lawsuits in the courts for negligent behavior. This process may not be immediate, and it may not be perfect, but it does an adequate job of weeding out problems.
Next, the top-down approach imposes real costs on manufacturers and consumers. Oftentimes, regulators are just as much in the dark about market risks and opportunities as neophyte producers, if not more so. Their policies may therefore stymie certain areas of production without any real benefit, which ultimately raises the costs to consumers.
What’s worse, this misdirection could miss true security threats, ultimately leaving us no safer than before. Consider the federal government’s own bad track record of forcing us to comply with checklist-style measures while missing simple blunders like poor employee password management.
But perhaps most importantly, the top-down approach distracts us from the kinds of bottom-up solutions that could truly provide a robust security environment. As my former colleague Eli Dourado explains in his paper, “Internet Security Without Law,” Internet Service Providers (ISPs) have developed a system of voluntary notice and blacklisting that promotes proactive security outcomes. This kind of arrangement should be a model for how best to proceed.
Meanwhile, my Mercatus Center colleague Anne Hobson uses the metaphor of disaster recovery to explain the ideal path for cybersecurity policy. The idea that planners can accurately predict all potential security risks and direct producers to prepare for them is folly, and it will always lead to failure. We will never be 100 percent protected from all risks. We can, however, promote an environment that allows us to adapt to the inevitable hiccups that do arise.
What does this mean in terms of concrete policy?
Thankfully, cybersecurity experts already recognize the importance of resiliency for security. Like the Obama administration, the Trump administration designated resilience-building as a key cybersecurity policy goal. This means that public and private bodies alike are working together to develop and strengthen information-sharing bodies, education and certification programs, and even a cyber-insurance industry.
Policymakers should heed and supplement these existing efforts before attempting to erect an awkward command-and-control style regulatory body to direct security policy.
In general, we should all maintain a posture of humility and open-mindedness when seeking security solutions. The threats are often vast, and the answers to these thorny issues will almost never be as straightforward as simply creating a federal agency to “deal with things.” Only decentralized resilience can provide the recovery responses needed to weather the coming digital storms.
• Andrea O’Sullivan is a program manager with the Mercatus Center at George Mason University’s Technology Policy Program.
Copyright © 2019 The Washington Times, LLC.