
There are many static analysis tools that can be used to check an application for quality and security issues. Code Dx currently integrates with 24 of them, a mix of commercial and freely available tools. Many of the freely available tools are bundled directly within Code Dx and run automatically based on the source code supplied.

We know commercial tool vendors have invested a great deal of research and effort into providing the best static analysis capabilities, but we were curious how a given commercial tool would compare to a suite of open source tools. For this experiment, we took a leading commercial static analysis tool and ran it against the OWASP WebGoat Java and WebGoat.NET projects. These projects contain known flaws, such as SQL Injection and Cross-site scripting (XSS), and make a great way to learn about application security. For our purposes, we used them as ground truth to compare tool results against.
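To make the kind of flaw concrete: WebGoat's injection lessons revolve around the pattern static analyzers report as SQL Injection (CWE-89), where user input is concatenated directly into a query string. The sketch below is hypothetical (the class, table, and column names are ours, not WebGoat's), but it shows the shape of the flaw these tools look for.

```java
// Minimal, hypothetical sketch of the string-concatenation pattern that
// static analyzers flag as SQL Injection (CWE-89). Not actual WebGoat code.
public class InjectionSketch {

    // Vulnerable: attacker-controlled input is spliced straight into the query.
    static String vulnerableQuery(String userName) {
        return "SELECT * FROM user_data WHERE last_name = '" + userName + "'";
    }

    public static void main(String[] args) {
        // A classic payload widens the WHERE clause to match every row.
        String query = vulnerableQuery("Smith' OR '1'='1");
        System.out.println(query);
        // The standard fix is a parameterized query, e.g. via JDBC:
        //   conn.prepareStatement("SELECT * FROM user_data WHERE last_name = ?");
    }
}
```

Analyzers differ in how reliably they trace tainted input into such a sink, which is exactly what the coverage comparison below probes.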

Java Coverage

Using the OWASP WebGoat Java project, we took a look at a leading commercial static analysis tool as well as the freely available FindBugs, the FindBugs Find Security Bugs plugin, PMD, OWASP Dependency-Check, and Retire.js.

Below is a table that gives a general idea of the kinds of weaknesses each tool supports in the OWASP Top 10. The value in each cell is the number of rules the given tool supports that detect that particular OWASP Top 10 CWE or a descendant of that CWE. You can see some other CWE analyses we did here.
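The "or a descendant" matching works over the CWE hierarchy: each OWASP Top 10 (2013) category has its own CWE (e.g., CWE-929 for A1 – Injection) whose members include weaknesses like CWE-89 (SQL injection). A rough sketch of how one might count a tool's rules against a category, using a tiny hand-made slice of the hierarchy (the method names and rule lists are hypothetical, not from any real tool):

```java
import java.util.List;
import java.util.Map;
import java.util.Set;

public class Cwe {
    // Tiny hypothetical slice of the CWE hierarchy: category -> member CWEs.
    // CWE-929 (OWASP Top Ten 2013, A1 - Injection) includes members such as
    // CWE-89 (SQL injection), CWE-78 (OS command injection), CWE-77
    // (command injection), and CWE-564 (Hibernate SQL injection).
    static final Map<Integer, Set<Integer>> MEMBERS =
            Map.of(929, Set.of(89, 78, 77, 564));

    // Count the rules whose reported CWE falls under a Top 10 category.
    static long rulesCovering(int categoryCwe, List<Integer> ruleCwes) {
        Set<Integer> family = MEMBERS.getOrDefault(categoryCwe, Set.of());
        return ruleCwes.stream()
                .filter(cwe -> cwe == categoryCwe || family.contains(cwe))
                .count();
    }

    public static void main(String[] args) {
        // Hypothetical rule set: three injection rules plus one unrelated
        // rule (CWE-476, NULL pointer dereference).
        List<Integer> toolRules = List.of(89, 78, 564, 476);
        System.out.println(rulesCovering(929, toolRules)); // prints 3
    }
}
```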

| CWE | Description | Commercial Tool | FindBugs | Find Security Bugs Plugin | PMD | Dependency-Check / Retire.js |
| --- | --- | --- | --- | --- | --- | --- |
| 929 | A1 – Injection | 5 | 3 | 4 | 0 | N/A |
| 930 | A2 – Broken Authentication and Session Management | 13 | 3 | 4 | 0 | N/A |
| 931 | A3 – Cross-Site Scripting (XSS) | 5 | 3 | 3 | 0 | N/A |
| 932 | A4 – Insecure Direct Object References | 0 | 2 | 3 | 0 | N/A |
| 933 | A5 – Security Misconfiguration | 8 | 0 | 1 | 0 | N/A |
| 934 | A6 – Sensitive Data Exposure | 5 | 0 | 10 | 0 | N/A |
| 935 | A7 – Missing Function Level Access Control | 9 | 3 | 1 | 0 | N/A |
| 936 | A8 – Cross-Site Request Forgery (CSRF) | 1 | 0 | 1 | 0 | N/A |
| 937 | A9 – Using Components with Known Vulnerabilities | 0 | 0 | 0 | 0 | MANY |
| 938 | A10 – Unvalidated Redirects and Forwards | 1 | 0 | 1 | 0 | N/A |
|  | Total | 47 | 14 | 28 | 0 | MANY |

 

It is evident here that not all static analysis tools are security focused. PMD, for example, covers none of the OWASP Top 10. FindBugs, on the other hand, covers 9 out of 10 of these categories if the Find Security Bugs plugin is used. And if we add Dependency-Check and Retire.js to check for the use of known vulnerable components, we're up to 10 out of 10.

The commercial tool we used covered 8 out of 10 of these categories. This illustrates the benefit of multiple tools: freely available tools can cover as much as, or more than, a commercial tool. Of course, just because a tool covers a category doesn't mean it covers it well – it may still miss real weaknesses (false negatives) or flag weaknesses that are not a problem at all (false positives). Still, the fact that one can get good coverage using just open source tools is useful. Those who can't afford commercial tools can use freely available alternatives to achieve some level of assurance, and those using a single commercial tool can add freely available tools to broaden coverage and reduce their exposure to security vulnerabilities.

OWASP WebGoat Java

For the OWASP WebGoat Java project we came up with a list of 64 known weaknesses that are intentionally part of the application. We did not focus on any weaknesses related to the use of known vulnerable components, since we know the commercial tool we selected does not check for this.

We then compared the freely available tools with the commercial tool. The commercial tool detected 26 out of 64 of the weaknesses, which is about 41%, and FindBugs with the Find Security Bugs plugin detected 18 out of 64 weaknesses, which is about 28%. Looking at the results found by both the commercial tool and the freely available tools, we found 23% overlap.

PMD did not identify any findings, which is not surprising since it does not focus on security-related issues. It is interesting to note that 3 findings discovered by FindBugs were not found by the commercial tool, which indicates that there are situations where open source tools identify Java issues that commercial tools miss. Additionally, many of the weaknesses – 35 out of 64, roughly 55% – were not discovered by any tool.

Although the commercial tool discovered the most findings, it still missed more than 50% of the weaknesses. Combining the commercial tool with freely available tools yields the greatest coverage, but still leaves more than half of the weaknesses undetected.
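The coverage and overlap figures above reduce to simple set arithmetic over each tool's detected-weakness set. The sketch below uses made-up finding IDs chosen so the set sizes match our results (26 commercial, 18 Find Security Bugs, 15 in common); only the arithmetic, not the IDs, is meaningful.

```java
import java.util.HashSet;
import java.util.Set;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

public class CoverageMath {
    // Coverage = detected weaknesses / known weaknesses, as a percentage.
    static double coverage(Set<Integer> detected, int knownTotal) {
        return 100.0 * detected.size() / knownTotal;
    }

    public static void main(String[] args) {
        int known = 64; // ground-truth weaknesses in WebGoat Java
        // Hypothetical finding IDs; only the set sizes mirror the article.
        Set<Integer> commercial = IntStream.rangeClosed(1, 26)
                .boxed().collect(Collectors.toSet()); // 26 findings
        Set<Integer> findSecBugs = IntStream.rangeClosed(12, 29)
                .boxed().collect(Collectors.toSet()); // 18 findings

        System.out.printf("commercial:  %.0f%%%n", coverage(commercial, known));  // ~41%
        System.out.printf("findsecbugs: %.0f%%%n", coverage(findSecBugs, known)); // ~28%

        // Overlap: weaknesses found by both tools (set intersection).
        Set<Integer> both = new HashSet<>(commercial);
        both.retainAll(findSecBugs);
        System.out.printf("overlap:     %.0f%%%n", coverage(both, known)); // ~23%

        // Combined coverage: the union of all tools' findings.
        Set<Integer> union = new HashSet<>(commercial);
        union.addAll(findSecBugs);
        System.out.printf("combined:    %.0f%%%n", coverage(union, known)); // ~45%
    }
}
```

The union here reaches 29 of 64 weaknesses, which is the flip side of the 35 that no tool found.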


.NET Coverage

For OWASP WebGoat.NET we took a look at a leading commercial static analysis tool as well as the freely available FxCop (including the ASP.NET Security Rules), CAT.NET, Gendarme, OWASP Dependency-Check, and Retire.js.

Below is a table that gives a general idea of the kinds of weaknesses each tool supports for the OWASP Top 10. For each tool, the value shown in each cell indicates the number of rules that detect that specific OWASP Top 10 CWE or a descendant of that CWE.

| CWE | Description | Commercial Tool | FxCop | Gendarme | CAT.NET | Dependency-Check / Retire.js |
| --- | --- | --- | --- | --- | --- | --- |
| 929 | A1 – Injection | 5 | 2 | 0 | 4 | N/A |
| 930 | A2 – Broken Authentication and Session Management | 13 | 8 | 0 | 0 | N/A |
| 931 | A3 – Cross-Site Scripting (XSS) | 5 | 2 | 0 | 1 | N/A |
| 932 | A4 – Insecure Direct Object References | 0 | 0 | 0 | 0 | N/A |
| 933 | A5 – Security Misconfiguration | 8 | 10 | 0 | 1 | N/A |
| 934 | A6 – Sensitive Data Exposure | 5 | 5 | 0 | 0 | N/A |
| 935 | A7 – Missing Function Level Access Control | 9 | 1 | 0 | 0 | N/A |
| 936 | A8 – Cross-Site Request Forgery (CSRF) | 1 | 1 | 0 | 0 | N/A |
| 937 | A9 – Using Components with Known Vulnerabilities | 0 | 0 | 0 | 0 | MANY |
| 938 | A10 – Unvalidated Redirects and Forwards | 1 | 1 | 0 | 1 | N/A |
|  | Total | 47 | 30 | 0 | 7 | MANY |

 

Similar to Java, not all .NET tools are security focused. Gendarme covers none of the OWASP Top 10, while FxCop covers 8 of the categories. CAT.NET covers 4 of them, which is interesting because CAT.NET is designed to be security focused, but its scope seems limited to certain types of software security vulnerabilities. Combined, the freely available tools cover 9 of the OWASP Top 10 categories, while the commercial tool covers 8. In this case it isn't clear that the open source tools cover anything the commercial tool does not, aside from A9, but the data does indicate that combining multiple open source tools results in greater coverage than using just one.

OWASP WebGoat.NET

For OWASP WebGoat.NET we selected 14 known weaknesses in the codebase and determined which of the tools were able to successfully find them. Again, we did not focus on any weaknesses related to the use of known vulnerable components, since we know the commercial tool does not check for this.

The commercial tool did find the majority of the weaknesses in the application: 11 out of 14, or roughly 79% coverage. CAT.NET does fairly well on its own, finding 6 out of 14 of the weaknesses, yielding about 43% coverage. Looking at the results found by both the commercial tool and the freely available tools, we found 36% overlap.

Although FxCop does not succeed in finding most of the weaknesses, the only two that it does find are ones that were not found by any other tool – including the commercial tool. Furthermore, if you combine the total coverage of the freely available tools, 8 out of 14 weaknesses are detected. This is a total coverage of 57%.

These results are telling us two positive things about freely available tools. The first is that open source tools can find issues that commercial tools miss. The second is that, when combined, open source tools are capable of finding a sizable number of weaknesses. The best coverage possible is achieved by combining the commercial and open source tools.


Conclusion

In conclusion, the results we found were positive for the use of freely available tools. On their own, they will identify many weaknesses that would otherwise be missed, and our analysis demonstrates that when they complement a commercial tool, coverage improves, since some weaknesses were found only by the freely available tools. An analysis of the WebGoat applications is admittedly limited and small scale; what really matters is how the tools perform on your own custom applications. So give them a try. Also give Code Dx a try, since it will help you consolidate and compare the results of multiple tools. Download a trial here.