My pentester, my enemy? Developers reveal what they really think about pentesting and static analysis results
A developer in their natural habitat is often spotted in a state of deep concentration, coding awesome features to tight deadlines. Feature-building is often our favorite part of the job, and really, it's the fundamental outcome of the software development life cycle (SDLC).
However, as we have discussed before, many of us are still prioritizing features over security best practices. After all, in most organizations, security is set up to be someone else's job, and adequate security training for us is limited. Penetration testing and static analysis scanning tools (better known as SAST) are just part of the overall process to mitigate security risks, operating rather independently from what we do... until the code bounces back to us for hotfixes, of course.
And it's at that moment that many developers think: "Do the pentesters hate me?"
These interactions often define a team and its culture. The concerning thing is that a lack of communication, understanding and overall collaboration has created tension, at least on the developer's side. Think about it: imagine you've spent a few hundred hours sculpting a marvelous statue, and then someone comes along, tells you its foundations are not up to scratch, and starts smashing bits off it with a hammer. That's the perceived dynamic between tester and developer - the latter has their software darlings slaughtered by an outsider who hasn't labored through the process with them; instead, that outsider has extended the workload and delayed the satisfaction of shipping code.
Having moved into the security space long ago, I can see both sides of the story. And no, pentesters don't hate developers. The pentester is, in all likelihood, overworked and under a lot of pressure. As such, a constant stream of common security bugs that could quite easily be fixed at the code level takes time, resources and headspace away from the really serious issues.
I always saw pentesters as kind of like parents. They want you to do well, and when you don't... they're not mad, just disappointed.
Now that I've put that (perhaps slightly unfair) image in your mind, let's explore this a little deeper. What has caused this worldview among developers?
"Of course I'm getting defensive; they're telling me how to do my job!"
Nobody likes feeling as though they've done a bad job, or that someone doesn't like their work. Sadly for developers, when static analysis and pentest results come back to them, it can feel like a report card. They've been given low grades, but at the end of the day, their bosses assess them on the features they've built and how quickly they've delivered them, not on whether there were vulnerable elements in the software.
For the poor pentester, this is a case of "don't shoot the messenger". It's nothing personal - they are tasked with finding bugs, and they found them. Granted, at a person-to-person level, maybe some pentesters are grumpier than others, but they're not (or shouldn't be) out to crucify development teams. It would be far easier for both teams if they were on the same page with what constitutes security best practice. And developers are not expected to be perfect; realistically, the testing team is there to protect them from shipping vulnerable code.
"They've told me to fix all these minor issues, don't they know there are higher priorities? And why don't they help me fix them if they care so much?"
It's true - a developer's highest priority will always be the building of features, and in this crazy world of rapid digitization, it will have to be done at speed. While some coders have a personal interest in security and secure coding, the general sentiment is that security is "someone else's problem", which inevitably includes pentesters.
Most common vulnerabilities are indeed minor issues to remediate - once known, the fixes for things like cross-site scripting (XSS) and SQL injection are simple to execute. The problem is, many developers don't realize they're introducing them in the first place, and these seemingly minor issues are exactly the window of opportunity an attacker needs to cause devastating problems for a company. According to Akamai, between November 2017 and March 2019, SQL injection vulnerabilities accounted for 65% of all web-based attack vectors. For a vulnerability that has had a known fix for more than twenty years, that is a sobering statistic.
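To make that concrete, here is a minimal sketch in Python (the find_user helpers and the in-memory SQLite database are invented for illustration, not taken from the article or the Akamai research) showing just how little separates an injectable query from a safe, parameterized one:

```python
import sqlite3

def find_user_unsafe(conn, username):
    # VULNERABLE: user input is spliced straight into the SQL string.
    # A username like "' OR '1'='1" changes the meaning of the query.
    query = "SELECT id, username FROM users WHERE username = '" + username + "'"
    return conn.execute(query).fetchone()

def find_user_safe(conn, username):
    # SAFE: the ? placeholder sends the value separately from the SQL,
    # so the driver treats it as data, never as query structure.
    query = "SELECT id, username FROM users WHERE username = ?"
    return conn.execute(query, (username,)).fetchone()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, username TEXT)")
    conn.execute("INSERT INTO users (username) VALUES ('alice')")

    payload = "' OR '1'='1"
    print(find_user_unsafe(conn, payload))  # returns alice's row: injection worked
    print(find_user_safe(conn, payload))    # returns None: payload treated as data
```

The same logic applies to XSS: once you know where the sink is, output encoding (for example, Python's built-in html.escape) is usually a one-line fix.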
Some pentest teams do assist in the remediation of security bugs, but others will deliver a report of the bad news and expect developers to work through hotfixes, even if those developers have moved on to a different project by the time it lands. And in some cases, the development team may be faced with a report that includes bugs they can't (or shouldn't be expected to) fix - those still have to be part of the findings and, again, shouldn't be taken personally.
The "happy medium" for this would be pentesters, security personnel and development managers acting in more of a mentor role to ensure the team has what they need in terms of effective training and tools, giving individual coders the best chance to succeed and code securely from the very beginning of the SDLC. Both teams really should be meeting half-way to ensure security is considered from the start, as part of a healthy DevSecOps practice.
"I've got far better security knowledge than I get credit for; these reports are mostly false positives, or not important".
Static analysis is an element of the security process in the SDLC, and static analysis scanning tools (SAST) play a fundamental role. And yes, false positives are an issue with these and other types (DAST/IAST/RASP) of scanners. They're an annoyance in what is already a slow process, requiring manual code review and putting pressure on developers and pentesters alike. Pentesting personnel may have taken the time to meticulously set up custom rules to avoid inaccurate readings and to provide company-specific guidance, yet some false readings still slip through and end up in front of a head-scratching developer.
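As a hedged illustration of how a false positive can arise (the rule behavior, the select_columns function and the ALLOWED_COLUMNS allowlist below are all hypothetical, not drawn from any specific SAST product), consider a naive taint rule that flags any string concatenation feeding a SQL call, even when no user input can reach the string:

```python
import sqlite3

# Hypothetical example: column names can't be bound as query parameters,
# so they are validated against a hard-coded allowlist instead.
ALLOWED_COLUMNS = {"id", "username", "created_at"}

def select_columns(conn: sqlite3.Connection, columns: list[str]):
    # Every name that survives this filter comes from ALLOWED_COLUMNS,
    # so no attacker-controlled data reaches the SQL text.
    safe_cols = [c for c in columns if c in ALLOWED_COLUMNS]
    if not safe_cols:
        raise ValueError("no valid columns requested")
    # A naive "string concatenation feeds execute()" rule still flags this
    # line as possible SQL injection, because it matches the syntax of the
    # vulnerability without tracing where the data actually comes from.
    query = "SELECT " + ", ".join(safe_cols) + " FROM users"
    return conn.execute(query).fetchall()
```

Findings like this still deserve a human look - the allowlist, not the tool, is the real control here - but they show why a report full of red flags isn't automatically a verdict on the developer's competence.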
This process isn't perfect, but the other problem is that many developers lack enough knowledge to mitigate a lot of common vulnerabilities on a consistent basis. With security training rare in tertiary education, and on-the-job training varying in its effectiveness, it stands to reason that there may be some overconfidence at play as well (and it's not their fault - we as an industry need to get better at equipping them with what they need).
"I didn't know this application was going to be tested, but now I'm stuck with the remediation tasks".
Sometimes, there is an assumption by overworked engineers that pentesters are just waiting in the wings for the moment to strike, testing an application and raining on the development team's parade. They're overtesting, they're nitpicking, they're creating extra work.
The only problem with that theory is that pentesters, too, are overworked (more so, in fact - the cybersecurity skills shortage is at dire levels and getting worse) and simply don't have the time to test without reason. Nor are they the sole decision-makers in prioritizing testing: it could have been requested by senior leadership or a customer, mandated as part of a security audit, or even triggered by a bug bounty finding.
For a developer, being pulled off current feature-building sprints to work on security fixes is annoying - especially if it's not their work; perhaps a previous team, or another vendor, did the last update. However, security is everyone's problem. That doesn't mean every developer has to take ownership of security bugs as though they made them all themselves, but they do need to come to the party and treat security as a shared responsibility.
Where to from here?
Sometimes, a mindset shift can be all it takes to make significant headway in solving a problem. We've talked about the rather frosty reaction a developer has to less-than-favorable pentest results, but what if they could turn it into a challenge? Perhaps they could think of the pentester as a friendly competitor; someone they can beat at their own game. After all, a security-aware developer who can eliminate common bugs as they write code is going to make the pentester's job much more difficult. By contrast, a developer with no focus on security is going to be comprehensively bested by their pentester counterparts, who will easily break their software.
Pentesters and developers may not be joined in harmony 100% of the time, but their relationship can be vastly improved when an organization addresses security as a key priority, and empowers teams with the right knowledge and tools to succeed - especially developers. It comes down to whether a company-wide, positive security culture is a priority, and if we are to fight the (currently) losing battle against common vulnerabilities, it absolutely should be.
Matias Madou, Ph.D. is a security expert, researcher, and CTO and co-founder of Secure Code Warrior. Matias obtained his Ph.D. in Application Security from Ghent University, focusing on static analysis solutions. He later joined Fortify in the US, where he realized that it was insufficient to solely detect code problems without aiding developers in writing secure code. This inspired him to develop products that assist developers, alleviate the burden of security, and exceed customers' expectations. When he is not at his desk as part of Team Awesome, he enjoys being on stage presenting at conferences including RSA Conference, Black Hat and DefCon.
Matias is a researcher and developer with more than 15 years of hands-on software security experience. He has developed solutions for companies such as Fortify Software and his own company Sensei Security. Over his career, Matias has led multiple application security research projects which have led to commercial products and boasts over 10 patents under his belt. When he is away from his desk, Matias has served as an instructor for advanced application security training courses and regularly speaks at global conferences including RSA Conference, Black Hat, DefCon, BSIMM, OWASP AppSec and BruCon.
Matias holds a Ph.D. in Computer Engineering from Ghent University, where he studied application security through program obfuscation to hide the inner workings of an application.