Universities aren’t training software developers in security. Does that matter?

Taylor Armerding
6 min read · Aug 24, 2020
Photo by Nathan Dumlao on Unsplash

If you want a good job but you aren’t born into the family business, it usually takes at least two things: opportunity and training. And it’s clear that the opportunity part is there in software development and security.

Software is everywhere. It was almost a decade ago, after all, that Marc Andreessen, Netscape cofounder and now venture capitalist, famously said “software is eating the world.”

Today, software has eaten most of the world — if anything, its appetite has increased. It is behind just about everything we use and do, from our vehicles to our retail purchases, appliances, home security, communication, education, entertainment, healthcare — the list is almost endless.

That’s one of the reasons unemployment in the software industry is low to virtually nonexistent. For developers, unemployment is an estimated 1.6%, and the median salary of those jobs is more than $100,000.

For software security, the prospects are even better. The unemployment rate has been effectively zero for several years, and estimates are that by next year there could be 3.5 million unfilled cybersecurity positions. Many of these jobs, by the way, pay better than $200,000 a year.

So you would think computer science programs at colleges and universities would be focused on giving students the qualifications necessary to grab one of those waiting, lucrative opportunities.

But you would be only partially correct. There are plenty of programs that focus on software development skills. But when it comes to software security, not so much. Forrester Research reported last year that of 40 university computer science programs it surveyed across the U.S., not one required students to take courses in secure coding or secure application design.

Is that a problem? The views from experts are mixed.

Skills gap? No wonder

Jonathan Knudsen, senior security strategist at Synopsys, says it is at least a disconnect. “Think about how people learn to build airplanes,” he said. “Safety is part of every aspect of aviation — aerospace engineers don’t just learn how to make something fly, but how to make something that flies safely.”

“Software education is almost exactly opposite: Students learn how to make things work, how to make them work faster, how to make them work more efficiently, but security is often neglected or ignored.”

Which is likely a major reason for the ongoing, widening “skills gap” in cybersecurity. It also could be a major reason why the move to DevSecOps — merging development, security and operations teams in order to crank out better and more secure software products much faster — is still a bumpy road for many organizations.

As has been preached from podiums and in panel discussions at just about every security conference for the past decade, if teams that face different pressures and incentives are going to work together effectively, they need to understand one another.

But the reality persists that security people don’t understand development and developers don’t understand security. So, given that colleges and universities aren’t delivering software developers to the job market with security skills, or even much awareness, it’s up to the industry to do security training for developers in-house.

In some ways, that may not be such a bad thing, says Sammy Migues, principal scientist at Synopsys and coauthor since 2008 of the BSIMM [Building Security In Maturity Model], an annual report that has tracked software security initiatives in hundreds of organizations.

He isn’t arguing that security training at the college level isn’t important. Knowing something about common cyberattacks and how to deliver code that isn’t littered with vulnerabilities is obviously a good thing, he said.

Contextual training

But things change so fast in cybersecurity that “when you hit the field, it’s not college,” he said. “Once you’re in an organization, you’re working with their particular (coding) languages, you’re doing it in their frameworks and internal libraries, following their coding standards, and doing it all in their development toolchains.”

Training is also different from education, he adds. “Education, life lessons, and teachable moments can and do happen to us all the time. But training has to be contextual; it has to happen near to when the person is actually going to do the thing and it should replicate as closely as possible how they’re going to do the thing,” he said.

“Showing me a slide or sending me a PDF report with some Java code and a big red arrow labeled ‘SQL Injection’ is like showing me a picture of a tire with a nail in it. I’ve learned nothing about avoiding flats or changing a tire.”

Migues also said a secure development structure can help developers write more secure code without special training. He said a major factor in building applications that are more secure is to require developers to use “frameworks” that are secure by design.

An example, he said, is to build secure libraries or APIs [application programming interfaces] to be used for things like input validation (making sure that user input is not malicious) and then requiring developers to use them — and only them. “Of course, ‘requiring’ may not be the right approach in all cases. Sometimes you just have to make the right way the easiest way,” he said.
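
A minimal sketch, in Java, of what such a required library might look like. The class and method names here (SafeInput, requireIdentifier) are hypothetical; the point is that developers call one vetted, centrally maintained API instead of writing ad hoc validation in every feature.

```java
import java.util.regex.Pattern;

// Hypothetical "secure by design" input validation library that developers
// are required (or strongly nudged) to use for untrusted input.
public final class SafeInput {
    // Allow only short identifiers made of letters, digits, underscore, or hyphen.
    private static final Pattern IDENTIFIER = Pattern.compile("^[A-Za-z0-9_-]{1,64}$");

    private SafeInput() {}

    // Central validation point: accept known-good input, reject everything else.
    public static String requireIdentifier(String raw) {
        if (raw == null || !IDENTIFIER.matcher(raw).matches()) {
            throw new IllegalArgumentException("rejected untrusted input");
        }
        return raw;
    }

    public static void main(String[] args) {
        System.out.println(requireIdentifier("user_42"));           // passes
        System.out.println(requireIdentifier("Robert'); DROP --")); // throws
    }
}
```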

It’s a bit like a stove in a home that provides a controlled, “secure-by-design” way to use what could be a threat — a deadly, flammable substance like natural gas or propane. “You don’t let the kids build a campfire in every room of the house if they want a hot dog,” he said. “They have to use the stove. Governance, including our insurance person, says it’s the only way to have cooking fires in the home.”

Requirements to use secure frameworks in software development, he said, could “eliminate the vast majority of the OWASP [Open Web Application Security Project] Top 10 (vulnerabilities) in a web application.”
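
As a concrete illustration, here is a minimal Java sketch of the kind of framework-enforced pattern that takes SQL injection, one of the OWASP Top 10, off the table. The table and column names are hypothetical; the key is that user input is bound as a parameter through JDBC's PreparedStatement rather than concatenated into the query string.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public final class UserLookup {
    // Unsafe (injectable): "SELECT id, name FROM users WHERE name = '" + name + "'"
    // Safe: the '?' placeholder makes the driver treat the value strictly as data.
    // (The "users" table and its columns are a hypothetical schema.)
    public static ResultSet findByName(Connection conn, String name) throws SQLException {
        PreparedStatement stmt =
                conn.prepareStatement("SELECT id, name FROM users WHERE name = ?");
        stmt.setString(1, name);
        return stmt.executeQuery();
    }
}
```

A framework that exposes only query builders like this one, and never raw string concatenation, is one way of doing what Migues describes: making the right way the easiest way.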

But then, in a DevSecOps world, today’s software developers have an expanded role in spotting and fixing vulnerabilities in code. And better tools and training can help them do that.

As daily headlines tell us, web applications are a prime target for cyber criminals because malicious hackers know the software that powers those apps remains, in many cases, riddled with vulnerabilities.

The industry ramps it up

And while higher education isn’t doing much on the security front, security training within the industry for developers has been ramping up for some time. There is plenty of material out there telling organizations how to do it.

A webinar last year with Utsav Sanghani, product manager at Synopsys, and Amy DeMartine, vice president and research director at Forrester, focused on “tools and techniques that can transform your developers into AppSec rock stars.”

Forrester also released a whitepaper more than a year ago titled “Show, don’t tell, your developers how to write secure code.”

That whitepaper has a number of recommendations. Among them:

  • Use application security testing tools that will both assist and train developers on the job. Good tools that perform automated static, dynamic, and interactive testing can flag problems with code as it’s being written, when they are much easier and less expensive to fix than at the end of the software development life cycle (SDLC).
  • Enforce hard quality gates that stop the SDLC until a vulnerability is fixed (a minimal sketch of that idea follows this list).
  • Create developer “security champions.” Developers can’t be expected to be security experts — if they were, there would be no need for a security team. But organizations can find developers with a demonstrated interest in security to be security advocates in each product team.
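
On the quality-gate recommendation, here is a minimal sketch of the idea in Java, assuming a hypothetical scanner that writes one finding per line to scan-findings.txt with the severity first. A real gate would parse whatever format the team's actual tools produce; the essential behavior is simply a nonzero exit code that halts the pipeline until high-severity findings are fixed.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public final class SecurityGate {
    public static void main(String[] args) throws IOException {
        // Hypothetical report format: one finding per line,
        // e.g. "HIGH SQL injection in OrderDao.java:42"
        Path report = Path.of(args.length > 0 ? args[0] : "scan-findings.txt");
        long highSeverity = Files.readAllLines(report).stream()
                .filter(line -> line.startsWith("HIGH"))
                .count();
        if (highSeverity > 0) {
            System.err.println(highSeverity + " high-severity findings; failing the build");
            System.exit(1); // a nonzero exit code stops most CI pipelines at this step
        }
        System.out.println("Security gate passed");
    }
}
```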

Another passionate advocate for better understanding, and therefore more effective cooperation, between developer and security teams is Tanya Janca, CEO and founder of WeHackPurple and an OWASP chapter leader in Canada.

She agrees that developers shouldn’t be expected to become security experts, but said they do “need to learn the security guidelines, standards and policies of their workplace.”

That, of course, implies some responsibility on the part of the employer. “I also believe each workplace should have a policy mandating the security activities that are part of their SDLC — sometimes called an application security program,” she said.

Janca even spent the last three years writing a book, “Alice and Bob Learn Application Security,” which she hopes will make its way into universities as a textbook.

“I had AppSec experts weigh in, and I wrote it in easy-to-understand language,” she said. “It’s my dream to help make a generation of more-secure coders.”

Jonathan Knudsen agrees. He said his hope for the future is that “software development and security will become inseparable, that no one will ever discuss creating software without also considering security implications at every phase of development.”


Taylor Armerding

I’m a security advocate at the Synopsys Software Integrity Group. I write mainly about software security, data security and privacy.