I think this incident also demonstrates why this whole problem is still so hard to evaluate, and why we really need more information and assessment before we'll know whether we are over- or under-reacting.
Indeed, private computer security experts are reportedly miffed that the U.S. government isn’t providing them with everything it may know about the Stuxnet problem. So it’s hard for us laypersons to judge just how broad or serious such a threat might be, or how easy it would be for others to do something like this to us.
The reports I’ve seen also suggest that the worm was almost certainly the product of a sophisticated programming team, and most analysts seem to think that a wealthy and/or advanced country had to be behind it. If so, then one might be justified in concluding that cyber-war in the future will be a lot like conventional war in the past: the richest and most advanced countries will be better at it, simply because they can devote more resources to the problem. Even if Stuxnet suggests that cyber-war has more potential than people like me had previously believed, it doesn’t herald some sort of revolutionary shift in the global balance of power, in which a handful of clever computer-wielding Davids suddenly strike down various lumbering, computer-dependent Goliaths.
In any case, the one thing I haven’t changed is my desire to see this problem analyzed in a more systematic and public fashion, and by a panel of experts with no particular professional or economic stake in the outcome. Ironically, in the aftermath of the Stuxnet attack, I’d like to see that even more.
David Ignatius penned an op-ed column warning that Pentagon planning for cyber war had "a Cold War chill" and that "a new (and expensive) obsession with cybersecurity is not what this traumatized country needs."
The column was widely reprinted: It played into themes running through the boomer pundit complex, a view of the military and intelligence services as hawks circling a checkbook, of public debates dominated by vulpine armchair warriors and gullible pols desperate to hang tough with voters; it even played to the civil libertarian angle that has some on the right making common cause with the ACLU over "Big Sister" assaults on privacy.
They cringed because Krebs knew the tech and Ignatius clearly didn't.
Framing through analogies, metaphors, and the like is crucial to understanding and responding to problems effectively. This is not in doubt. Focusing on "the technology" is not a substitute.
Work by historians, sociologists, and anthropologists of science and technology has shown that language, metaphors, and analogies help scientists and engineers to frame the work that they do and to understand "the technology" (or "the science") they produce, but also shape decisions about which technology to develop and which science to do in the first place.
Stuxnet penetrated the media consciousness because it was an unconventional weapon with a conventional target – Iran's Bushehr nuclear reactor – and therefore had conventional geopolitical significance.
Stuxnet represented an evolutionary leap in malware; it was a code map for the future of hacking. So what did that mean in practical and organizational terms? What needs to be done?
Those are indeed key questions. But they cannot be answered by technical experts alone. So Stuxnet captured media attention because it “had conventional geopolitical significance.” But unless geopolitics is going away as a concern, then we will need people with expertise in geopolitics to help us answer these questions and make the right decisions.
Given that key analysts, such as Forrester's John Kindervag, were now talking about the end of trust – that one should assume "zero trust" in any networked environment – and that NeuStar's Joffe was arguing that we needed to open a second front on cybersecurity by rewriting the protocols governing the Internet, the answers in Geekville signaled tectonic shifts, radical responses.
But because there are so few mainstream journalists covering this beat armed with sufficient technological insight, the response to Stuxnet has largely been one of alarm and speculation.
In some (tech-free) quarters, skepticism is still the default position.
There are definitely those with tech credentials who are skeptical of the Stuxnet hype (e.g., Jeffrey Carr), and those who are skeptical of cyberwar more generally (e.g., Bruce Schneier, Marcus Ranum, George Smith, Rob Rosenberger, and others).
some attacks are tangible – they are, like Stuxnet, easily explained to the public – while others are intangible, and are very difficult to convey to a non-technical audience. "Cyberwar has been going on for years, it just hasn't been noticed or it has been kept on the DL. There are a lot of things we try not to make public and a lot more going on than people realize."
And whose fault is that? In a society dependent upon complex infrastructure networks, the argument that "it's technical" and therefore beyond the Unwashed Masses, that the public should just trust the technocrats, is unacceptable. Part of the problem here is clearly a communication problem on the part of the technical experts. True, it's difficult to explain these issues to a "non-technical audience." But it is the technical experts' duty to figure it out, and to do it in a way that is ethical. Next, if there is a "lot more going on than people realize," whose fault is that? The people who are making the decision not to inform the public. There are trade-offs between secrecy and openness, and advantages and disadvantages to each. A decision is being made here in favor of secrecy. That's fine. But then don't complain when people don't have all the information and are therefore skeptical.
All the analysts I've spoken to say the media needs to raise the level of its game: reporters need to be tech savvy in order to take the kind of reporting done by Priest and Arkin to the next level rather than let it flounder in conventional Washington wisdom – a horse race story of turf battles and lobbying and waste. The only way to know if government is screwing up is to know enough tech to know whether the bureaucrats know enough tech.
1) Surprise! Technical experts say that journalists, politicians, and the public all need to know more about tech.
2) Unless “turf battles and lobbying and waste” are a thing of the past–and there’s no evidence that they are–then continuing to report on those issues is not “floundering in conventional Washington wisdom,” it’s essential.
3) Knowing tech is necessary but not "the only way to know if government is screwing up." Again, unless law, politics, lobbying, organizational cultures, power relationships, and personal, organizational, and institutional interests, among much else, are suddenly irrelevant, then technical knowledge alone is not sufficient to allow us to "know if government is screwing up."
4) No one type of knowledge or way of knowing will provide the silver bullet. No one person or group of people will have all the knowledge necessary to “know if government is screwing up.” Rather, multiple people with multiple skill sets and areas of expertise, all looking at the same issues and problems from their various perspectives, will give us an idea about the wisdom of government decision making on cybersecurity (or any policy, for that matter). Policy decisions, even ones about highly technical matters, cannot be left to the technicians alone.
says Kindervag, "with all of the benefits of the Internet, blogging, and social networking, it seems that too many people get a voice before they've earned the right to have a voice. Brian [Krebs] earned his voice."