How to Cyber Security: Software testing is cool

Software security testing involves knocking down walls, scaling fences, and otherwise breaking applications to generate helpful feedback for development.

Software security testing is cool

That is a title I never expected to write.

My father was a developer, and I was sure I would follow in his footsteps. For 34 years, he wrote C and C++ code for Bell Labs. At home we had first a TRS-80, then a Commodore 64 upon which I learned the fundamentals of programming.

I love the open-ended creativity of programming, the idea that you start with an empty editor window and breathe life into an application, line by line, feature by feature.

That said, I had the dimmest possible view of software testing. I knew it was important, in the same way that dental hygiene is important, or eating your vegetables, or getting the oil changed in your car.

Testing seemed boring. Testing seemed like something that other people should have to do. Many developers share this belief: they see themselves as Batman, with the rest of the product team as Robin and Alfred. In truth, it is much more of a Justice League situation.

Enter security testing

In 2011 I joined a small Finnish company, Codenomicon, and had my mind thoroughly blown. I learned about fuzz testing, delivering intentionally malformed inputs to software to see if something bad happens. Fuzzing is a great way to locate unknown vulnerabilities in an application. If you find them and fix them before bad people find them and exploit them, you substantially reduce your risk.
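
The core feedback loop of fuzzing is simple enough to sketch in a few lines of Python. Everything below is invented for illustration: `parse_record`, its record format, and the planted bug are hypothetical, and production fuzzers add far more sophistication (coverage guidance, protocol models). But the idea is the same: mutate valid input, deliver it, and watch for anything worse than a clean rejection.

```python
import random

random.seed(0)  # fixed seed so runs are reproducible

def parse_record(data: bytes) -> bytes:
    """Hypothetical target with a planted bug: expects [length][payload][checksum]."""
    length = data[0]
    payload = data[1:1 + length]
    if len(payload) != length:
        raise ValueError("truncated payload")
    checksum = data[1 + length]   # bug: IndexError when the checksum byte is missing
    if sum(payload) % 256 != checksum:
        raise ValueError("bad checksum")
    return payload

def mutate(seed: bytes, flips: int = 3) -> bytes:
    """Create a malformed input by randomly overwriting a few bytes of a valid one."""
    buf = bytearray(seed)
    for _ in range(flips):
        buf[random.randrange(len(buf))] = random.randrange(256)
    return bytes(buf)

def fuzz(target, seed: bytes, iterations: int = 2000) -> list:
    """Deliver mutated inputs; record anything that is not a clean, handled rejection."""
    findings = []
    for _ in range(iterations):
        case = mutate(seed)
        try:
            target(case)
        except ValueError:
            pass                          # expected, handled parse error: fine
        except Exception as exc:          # unexpected exception type: a potential vulnerability
            findings.append((case, exc))
    return findings

# A valid record: length 5, payload "hello", correct checksum.
valid = bytes([5]) + b"hello" + bytes([sum(b"hello") % 256])
findings = fuzz(parse_record, valid)
```

The key design point is the distinction inside `fuzz`: an input the parser rejects cleanly is expected behavior, while an unhandled crash is exactly the kind of unknown vulnerability fuzzing exists to surface.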

Once I understood the value of fuzz testing, I was sure that I was onto something big. “Everyone’s going to do fuzzing!” I thought to myself. “We’re going to be rich!”

While it’s true that all application teams should be doing fuzzing, I was naïve about how fast fuzzing, and security testing in general, would permeate application development. It takes time to change people’s attitudes and evolve the processes of software development. The current movement toward DevSecOps reflects the dawning realization that security must be an integral part of the application development process.

Explaining fuzzing to customers and prospective customers gave me a much better understanding of software testing. On the one hand, organizations that create products need testing to make sure the products work correctly and don’t break easily. On the other hand, organizations that purchase products need testing to make sure that the products work correctly in their environment. The bottom line is that both types of organizations are trying to manage and minimize their business risk.

Fuzzing is only one type of security testing. Other types include source analysis (static analysis), software composition analysis, interactive web application testing, and more. Each type has its strengths. Making use of all security testing that fits with your application type and technology stack gives the strongest defense against vulnerabilities and the most effective reduction in risk.

It’s not a checkmark

No matter what kind of testing is being performed, a common misconception is that testing is a box to be checked, where a pass verdict is “good” and a fail verdict is “bad.” If the development team has been working away for weeks or months, and the test team holds up a product release, it’s easy to see the holdup as a nuisance.

The test team does not exist to give lip service to testing. The test team is there to make sure the application doesn’t fall on its face later. Testers need to be good at breaking an application. You want them to find those nasty corner cases, those pesky out-of-memory errors, those crazy control paths that you think nobody will ever attempt.

It’s so much more than functionality

One of the reasons I’m so excited about DevSecOps is that it integrates security testing directly into the development cycle, alongside functional testing. Really, it should have been this way from the very beginning.

But many organizations initially implemented a security group separate from their product teams. This centralized security group was responsible for all security testing, late in the product cycles. Consequently, security testing was both slow (because the one security team was a bottleneck) and inconvenient (because security testing happened so late in the product cycle).

Nowadays, organizations are moving to a DevSecOps model, where security is fully integrated into every phase of the application development cycle. This means that “normal” functional testing and security testing both happen in-band as part of development and build pipelines.

Find the whole iceberg

Traditionally, software testing has focused on functionality. Does the application work as it should? In functional testing, valid inputs are supplied to the application, and the tests check to see if the corresponding outputs occur. Usually, hundreds or thousands of functional tests are automated so they can be run consistently and reliably.
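
That pattern, valid input in, expected output checked, can be sketched with Python’s built-in `unittest` framework. The `apply_discount` function here is a hypothetical example, not from any real codebase; it simply stands in for the application under test.

```python
import unittest

def apply_discount(price: float, percent: float) -> float:
    """Hypothetical function under test: applies a percentage discount to a price."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class TestApplyDiscount(unittest.TestCase):
    # Functional tests: supply valid inputs, check for the corresponding outputs.
    def test_basic_discount(self):
        self.assertEqual(apply_discount(100.0, 25), 75.0)

    def test_zero_discount(self):
        self.assertEqual(apply_discount(19.99, 0), 19.99)

    def test_invalid_percent_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150)

if __name__ == "__main__":
    unittest.main(exit=False)
```

Because tests like these are automated, they can run on every build, which is what makes “hundreds or thousands” of them practical to execute consistently and reliably.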

Functional testing is only the tip of the iceberg. If the only type of testing you do is functional testing, then your software might work correctly under ideal circumstances, with networks that always work, with users that do things that make sense, in a fantasy unicorn world devoid of chaos and malice. The real world is messy and complicated; adding security testing as part of your application development cycle helps ensure that your application can perform well in unpredictable and hostile circumstances.

To minimize risk, find and fix as many vulnerabilities as possible before releasing the application. After release, continue monitoring so you can respond quickly to software supply chain vulnerabilities. For best results, make sure that all testing is automated and integrated with the development cycle. That way, any vulnerabilities located go into the issue tracker everyone is already using, and developers face minimal friction in responding to testing results.

It’s a game

The relationship between developers and testers works best as a game. The developers do their best to provide functionality without introducing vulnerabilities; the testers do their best to find vulnerabilities anyhow. Developers use creativity to solve difficult problems; testers use creativity to break the application.

Ideally, the two teams strengthen each other over time. If the test team keeps finding a particular type of vulnerability, the developers will eventually learn not to introduce that type of vulnerability. As developers get better at writing more solid code, the test team will expand their horizons by learning new techniques and doing more types of security testing.

The thrill of the chase

It is great fun to break stuff. If you’ve ever stacked blocks for a toddler, you’ve heard that gleeful laugh when they are knocked down. We find catharsis in wiping the slate clean and starting over, like the cycle of the seasons.

If you are a software tester, it is your job to break stuff. You’re not a rubber stamp and you’re not supposed to say that everything is fine. If the development team builds a wall, it is your job to knock it down. If the development team builds a fence, it is your job to climb over it or go around it. Just like an attacker, you need not be constrained by convention.

Every time you break the application, your feedback to the development team helps make the application stronger. Keep on breaking things!

*** This is a Security Bloggers Network syndicated blog from Software Integrity Blog authored by Jonathan Knudsen. Read the original post at:
