The following post was written by Andy Bochman, Grid Strategist-Infrastructure Defender for Idaho National Laboratory’s National & Homeland Security Directorate. Andy will serve as a provocateur at the March 19 Moving From Cyber Security to Cyber Resilience Summit.
Whatever primary hat you wear (engineer, attorney, cyber guru, standards writer, regulator, or end user), imagine for a moment the challenge of cybersecurity from the perspective of each of those other folks. You are familiar, I take it, with the parable of the elephant in the dark room, in which each person's impression of what animal it is depends on which part of it they encounter first as they feel their way around.
Speaking of elephants: as framed by the engineering standard of care, if one were designing a bridge capable of safely and reliably supporting the passage of up to 100 elephants at a time, standard best practice would be to design and build it with a safety factor, say, a structural design and materials selected to support 140 standard elephants. Of course, we would need to define whether we are talking about African or Asian elephants, as there is a not-insignificant weight difference: Africans often reach seven tons, while the Asian species tops out at a bit over five.
In other words, details matter. We must pay attention to how initial assumptions about users can be proven wrong by future shifts: in technology, regulation, user behavior, or, for that matter, weather patterns.
Turning the lights back on and putting pachyderms aside (though I am not sure they are moved so easily), allow me to put the cyber hat on for a moment and summon recent words from a brilliant cyber and engineering colleague in Germany. Sarah Fluchs is one of our most articulate champions of security by design, otherwise known as cyber-informed engineering, or in culinary parlance, baking security in. Here’s Sarah:
Security-by-design is highly dependent on what the design workflow looks like. I know that sounds totally obvious but let me elaborate a bit. Security-by-design means that we consider security during the design of a given system. This means that if you’re thinking about how to practically do security-by-design, it is very relevant how that system is designed. And when you start looking at how systems are designed, you quickly find out that while very many roads lead to Rome, people tend to assume how they design systems is how everyone designs systems.
While Sarah’s words resonate with me, a cyber guy who has spent a lot of time around engineers and scientists, do they convey useful meaning to other disciplines? How does this paragraph sound, do you think, to lawyers, regulators, end users, and the rest? A quarter of a century after the mainstream arrival of the internet and the near-total embrace of digital technology by all manner of companies, including those in the engineering and design business, and with daily reports of disruptive ransomware attacks, how can it be that we still design and build things with nary a thought to security? How do end users pay good money for, and accept, wholly insecure systems? How do judges not see failure to adhere to secure-by-design principles as negligence? How do federal and state regulators, fully aware of the downsides of this situation, look the other way or pretend it away?
And not to put too fine a point on it, but how can engineers like you stand this? To tolerate this much uncertainty? I can only imagine you must think there is no choice. Well, there is a choice, and we are going to talk about it soon.
Join us at Engineering Change Lab – USA’s Moving from Cyber Security to Cyber Resilience summit on March 19 to continue the discussion. Learn more and register at this link.