I read with interest an Op-Ed piece in the New York Times the other day by Marc Maiffret (founder and CTO of BeyondTrust) entitled “Closing the Door on Hackers.” [By the way, as I’ve mentioned before, it’s interesting to see cybersecurity in the mainstream news, which seems to be happening more and more these days.] The thesis of his piece is that we should also be pressuring software makers to make significant investments in their products’ security.
Mr. Maiffret makes several good points:
- Large software companies do not generally seem motivated to make software secure.
- Microsoft, on the other hand, has fundamentally changed its software development process to make security a core part of the program.
- And while not all developers of popular / widely-used programs are following Microsoft’s lead (e.g., Oracle’s Java), some are getting on the bandwagon (e.g., Adobe Reader and Flash).
- A new mind-set – which no longer sees security as an add-on feature – is needed, not an act of Congress.
I agree with his points, and would like to expand on them. Specifically, let’s look earlier in the lifecycle and ask: what role does the education of today’s developers (and management) play in creating more secure products? As Mr. Maiffret points out, securing software is not a trivial task. But as with so many things in life, proper prior planning prevents piss poor performance (my former boss’s variant on the “5 Ps of Success”).
Now, I know that there are companies out there that help with building a secure software development lifecycle (SDLC), and plenty of companies that will test your code for security. And there are plenty of resources out there for developers interested in learning how to design security into their process – and for organizations wanting to treat security as a core business process and a business enabler. These are good and useful resources – both as an added quality step and as a way to change organizational culture (and one’s individual skills, like these folks). But are these steps akin to “bolting security on” after the product is done? How are we educating future developers to “bake security in” so that it pervades their work?
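To make the “bolted on” vs. “baked in” distinction concrete, here’s one hypothetical illustration (my example, not Mr. Maiffret’s): the classic SQL injection mistake. The unsafe version reflects the mindset of writing the feature first and worrying about security later; the safe version reflects a habit – parameter binding – that a developer with security “baked in” would reach for by default.

```python
import sqlite3

# In-memory demo database with one user record.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

def find_user_unsafe(name):
    # "Bolt-on" thinking: build the query by string concatenation.
    # A crafted input like "' OR '1'='1" turns into SQL and
    # returns every row in the table (SQL injection).
    return conn.execute(
        "SELECT name FROM users WHERE name = '" + name + "'"
    ).fetchall()

def find_user_safe(name):
    # "Baked-in" thinking: let the driver bind the parameter, so
    # the input is always treated as data, never as SQL.
    return conn.execute(
        "SELECT name FROM users WHERE name = ?", (name,)
    ).fetchall()

print(find_user_unsafe("' OR '1'='1"))  # leaks the whole table
print(find_user_safe("' OR '1'='1"))    # returns nothing
```

The point of the analogy: nobody needs an act of Congress to write the second version – they need to have been taught, repeatedly and in every course, that the first version is as unacceptable as loose clothing near a table saw.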
I want to be clear – I’m asking this out of ignorance, not as an accusation. I know that universities around the US have security courses in their Computer Science or Engineering programs. But is it part of every course? Here are some analogies that come to mind …
- High School Shop – when I took shop (do they even do that anymore?), we not only got a full lecture or three on safety, but every time we stepped out into the shop itself, it was reinforced – Protective glasses? Check. No loose clothing / ties? Check. Everything clamped down? Check. Using hand guards / push blocks? Check. So, it was not just some initial training, some set-and-forget thing – it was embedded into every single thing we did … so much so that I still remember those lessons even though I’ve not set foot in a wood / machine shop in years.
- Global Mindset – at my alma mater Thunderbird, the notion of “international” was likewise embedded in every class we took. It was not just one or three “international” classes that we could take to augment our educational CV. Beyond the requirement to develop proficiency in a 2nd (or, for many of my friends, a 3rd or 4th) language and complete a full complement of global studies courses, the international mindset permeated every business class too; so, for instance, accounting courses were not just focused on US GAAP, but also dealt with F/X rates, international tax implications, and more.
So, I guess what I’m getting at is that, IMHO, there’s a chain here: from customers pressuring software makers to make significant investments in their products’ security, to software makers creating a culture of security by seeking out and hiring developers (and management) with a security mindset, to our schools ensuring that secure coding practices are “baked into” their graduates. What do you think?