Meet the Metrics: The Code Dx Dashboard

If you haven’t heard, the recently released Code Dx 3.5 includes our most significant UI update: the new Code Dx Dashboard.

This central hub is the result of a long and intense research process, for which we partnered with cybersecurity visualization experts. Together, we studied real-life AppSec professionals to figure out what metrics they actually need to see, and the most effective way to display them. After carefully designing the visualizations, we put them all in one place, so that all of your teams can quickly see the most critical information about your application’s security.

This dashboard was designed to help improve communication and transparency between teams, so they can coordinate ongoing remediation efforts—and hopefully uncover trends that they can curb before they snowball into big problems.

It’s also worth mentioning that these visualizations are interactive, not static images; you can manipulate them in a variety of ways. We’ve linked relevant areas of our User Guide here for more information on each metric.
 

So what metrics do we display? Check them out below:

Code Dx Risk Score

The Code Dx Risk Score provides a letter grade that indicates the overall “quality” of the project. The letter grade is derived from a percentage score, which is calculated from the number of vulnerability findings in custom code and third-party components. Those scores are displayed beside the letter grade, both as percentages and as fill bars below them. Note that only critical, high, and medium severity findings are counted against the Code Dx Risk Score.

Next to the letter grade, the specific percentage score is displayed alongside a sparkline that shows the general trend of the project’s Code Dx Risk Score over the past week.
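
If you’re curious about the mechanics, here’s a minimal sketch of how a percentage score might map to a letter grade. The severity weights and grade cutoffs below are illustrative assumptions only; our actual formula is linked below.

```python
# Hypothetical sketch of mapping findings to a risk score and letter grade.
# The weights and cutoffs are illustrative assumptions, not Code Dx's formula.

SEVERITY_WEIGHTS = {"critical": 10, "high": 5, "medium": 2}  # low/info ignored

def risk_score(findings):
    """Return a 0-100 score, where 100 means no counted findings."""
    penalty = sum(SEVERITY_WEIGHTS.get(f["severity"], 0) for f in findings)
    return max(0.0, 100.0 - penalty)

def letter_grade(score):
    for cutoff, grade in [(90, "A"), (80, "B"), (70, "C"), (60, "D")]:
        if score >= cutoff:
            return grade
    return "F"

findings = [{"severity": "critical"}, {"severity": "medium"}, {"severity": "low"}]
score = risk_score(findings)       # 88.0 -- the low-severity finding is ignored
print(letter_grade(score), score)  # B 88.0
```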

You can learn more about how we calculate the Code Dx Risk Score here.

Open Findings

The Open Findings section shows the overall “triage status” of the project.

A waffle chart provides a severity-age breakdown of the untriaged findings in the project. Different colors indicate different severities, as shown in the legend. The number of dots of each color indicates the rounded percentage of findings in the project at that severity. Transparency indicates the relative age of the findings: a more transparent dot represents relatively new findings of that severity, while a more opaque dot represents relatively old findings of that severity.
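
As a rough illustration, here’s how the dot counts and opacity might be derived. The data model, rounding, and age scaling in this sketch are assumptions for the sake of the example:

```python
# Illustrative sketch of allocating waffle-chart dots by severity and age.
from collections import Counter

untriaged = [
    {"severity": "critical", "age_days": 2},
    {"severity": "high", "age_days": 30},
    {"severity": "high", "age_days": 5},
    {"severity": "medium", "age_days": 60},
]

counts = Counter(f["severity"] for f in untriaged)
total = sum(counts.values())

# One dot per rounded percentage point of the total.
dots = {sev: round(100 * n / total) for sev, n in counts.items()}
print(dots)  # {'critical': 25, 'high': 50, 'medium': 25}

# Opacity encodes relative age: newer findings render more transparent.
def opacity(age_days, max_age=90):
    return min(1.0, 0.3 + 0.7 * age_days / max_age)
```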

You can click on the severity labels in the waffle chart’s legend to focus on that severity, fading the other severities from view. Clicking again on the same label will reset that focus, returning the visualization to its normal state. Hovering will temporarily focus on that severity as well.

Below the waffle chart is a fill bar indicating the percentage of triaged findings (i.e., set to Fixed, False Positive, etc.) out of the total number of findings in the project, excluding findings that are marked “Gone.”
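
Conceptually, that calculation looks something like the sketch below; the exact status names and which statuses count as “untriaged” are assumptions based on the description above:

```python
# Sketch of the triage-progress calculation: triaged findings as a share of
# all findings, with "Gone" findings excluded from the denominator.
# Status names here are assumptions for illustration.

def triage_progress(findings):
    active = [f for f in findings if f["status"] != "Gone"]
    triaged = [f for f in active if f["status"] not in ("New", "Unresolved")]
    return 100.0 * len(triaged) / len(active) if active else 0.0

sample = [{"status": "Fixed"}, {"status": "False Positive"},
          {"status": "New"}, {"status": "Gone"}]
print(f"{triage_progress(sample):.0f}% triaged")  # 67% triaged
```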

You can learn more about using this waffle chart here.

Findings Count Trend

The Findings Count Trend shows a breakdown of findings by “detection method” over time.

The Findings Count Trend visualization uses a stacked area chart, with “date” as the X axis, and total finding count as the Y axis. By default, an area for each detection method is shown, so that the stacked areas’ total height indicates the total number of findings at a given date. You can click and hover on the area chart to focus on different dates or detection methods.
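
Under the hood, a chart like this just needs per-date counts for each detection method. Here’s a minimal sketch with a hypothetical data model:

```python
# Sketch of shaping findings into stacked-area series: one series per
# detection method, with a count per date (hypothetical data model).
from collections import defaultdict

findings = [
    {"date": "2018-06-01", "method": "Static"},
    {"date": "2018-06-01", "method": "Dynamic"},
    {"date": "2018-06-02", "method": "Static"},
]

series = defaultdict(lambda: defaultdict(int))
for f in findings:
    series[f["method"]][f["date"]] += 1

# The stacked areas' total height at a date is the sum across methods.
dates = sorted({f["date"] for f in findings})
for d in dates:
    total = sum(series[m][d] for m in list(series))
    print(d, {m: series[m][d] for m in list(series)}, "total:", total)
```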

You can learn more about the Findings Count Trend chart here.

Average Days to Resolution

The Average Days to Resolution shows the average number of days it takes for a new finding in the project to be resolved, meaning it has been marked either “Gone” or “Resolved.”

For each severity, the average number of days it takes to resolve a finding of that severity is displayed in a badge. Until at least one finding of a given severity has been resolved, its badge displays “N/A.” A colored bar below the badges acts as a legend, and hovering the mouse cursor over a badge highlights that severity’s respective color.
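
The math behind each badge is a simple average. This sketch assumes a hypothetical finding model with creation and resolution dates:

```python
# Sketch of the per-severity average: mean days from creation to resolution,
# with "N/A" when nothing of that severity has been resolved yet.
from datetime import date

findings = [
    {"severity": "high", "created": date(2018, 6, 1), "resolved": date(2018, 6, 4)},
    {"severity": "high", "created": date(2018, 6, 2), "resolved": date(2018, 6, 9)},
    {"severity": "low", "created": date(2018, 6, 3), "resolved": None},
]

def avg_days_to_resolution(findings, severity):
    days = [(f["resolved"] - f["created"]).days
            for f in findings
            if f["severity"] == severity and f["resolved"] is not None]
    return sum(days) / len(days) if days else "N/A"

print(avg_days_to_resolution(findings, "high"))  # 5.0
print(avg_days_to_resolution(findings, "low"))   # N/A
```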

As a rule of thumb, teams may wish to prioritize higher-severity findings, so team leads will want to see lower days-to-resolution numbers for those severities.

You can read more about Average Days to Resolution here.

Code Metrics

The Code Metrics section displays a set of metrics for the project’s codebase, broken down by language.

On the left of the section, a legend shows:

  • An “Overall” group, which represents the entire codebase. This is the sum of the metrics for each language.
  • The top 5 languages, ranked by each language’s share of the total lines of code.
  • An “Other” group, which aggregates the metrics for any remaining languages beyond the top 5.

The colors assigned to each language are purely aesthetic, and are chosen using the same color scheme that GitHub uses.

By default, the “Overall” group is selected, so the metric areas to the right show stats for the whole codebase. Clicking one of the languages or the “Other” group in the legend causes the metric areas to display language-specific stats. Clicking the “Overall” group returns the display to its default state.

When focused on a particular language, each metric will show an “X / Y” value instead of the usual “Y.” The “Y” indicates the metric’s value for the entire codebase, and the “X” indicates the metric’s value for the subset of the codebase which is written in the focused language.
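
To make the grouping and the “X / Y” display concrete, here’s a sketch using made-up line-of-code counts; the exact grouping rules may differ in detail:

```python
# Sketch of the legend grouping and the "X / Y" display logic
# (hypothetical line-of-code data for illustration).
loc_by_language = {"Java": 52000, "JavaScript": 21000, "Python": 9000,
                   "Go": 4000, "Ruby": 2500, "Shell": 900, "Perl": 400}

ranked = sorted(loc_by_language.items(), key=lambda kv: kv[1], reverse=True)
top5 = dict(ranked[:5])
other = sum(loc for _, loc in ranked[5:])
overall = sum(loc_by_language.values())

print("Overall:", overall)  # 89800 -- the "Y" value for the whole codebase
print("Top 5:", top5)
print("Other:", other)      # 1300 -- everything beyond the top 5, summed

# Focusing a language shows "X / Y": its value next to the overall value.
focused = "Java"
print(f"{focused}: {top5[focused]} / {overall}")  # Java: 52000 / 89800
```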

Each metric area will also show a sparkline indicating that metric’s trend over the past week. The sparklines will be colored blue for “good” changes, and red for “bad” changes.

You can read about the specific metrics this section displays here.

Analysis Frequency

The Analysis Frequency section offers a summary of the project’s most recent analyses.

At the top of the section, a text blurb describes when the latest analysis occurred, and how long it took. The rest of the section is broken down into three tabbed sections:

Analyses shows how many analyses were run on the project over the past week, 4 weeks, and 3 months.

Tools shows how many unique tools were run in analyses on the project, over the same time periods.

Coverage indicates the percentage of your codebase’s “custom code” that has been covered by instrumentation (only available on projects that have hybrid analysis enabled in their Analysis Configuration).
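
The Analyses and Tools counts boil down to rolling time windows. Here’s a minimal sketch with hypothetical analysis dates:

```python
# Sketch of windowed counts behind the Analyses tab (assumed data model:
# one date per analysis run; window lengths are approximations).
from datetime import date, timedelta

analysis_dates = [date(2018, 6, 25), date(2018, 6, 20), date(2018, 5, 1)]
today = date(2018, 6, 26)

windows = {"past week": 7, "past 4 weeks": 28, "past 3 months": 90}
for label, days in windows.items():
    count = sum(1 for d in analysis_dates if today - d <= timedelta(days=days))
    print(f"{label}: {count}")
# past week: 2, past 4 weeks: 2, past 3 months: 3
```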

You can learn more about the Analysis Frequency metric here.

Activity Monitor

The Activity Monitor shows a calendar heatmap that represents analysis activity on the project over the past year, from oldest (left side) to most recent (right side).

Each bubble represents a single day; rows correspond to days of the week, with Sunday at the top and Saturday at the bottom. Hovering the mouse cursor over any bubble in the chart displays a tooltip with that bubble’s date and the number of analyses that were run that day.

The analysis activity is broken down by different types of analyses, such as Static and Dynamic. The legend items represent these different analysis types. Clicking the legend items will focus the visualization on that category.

The visualization uses brightness to show how much analysis activity occurred on each day, as indicated by the legend above it: a darker shade indicates more analyses, and a lighter shade indicates fewer analyses.
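
For the curious, here’s a rough sketch of how a calendar heatmap can map each day to a grid cell and a shade. The layout math and shading scale are illustrative assumptions, not our exact implementation:

```python
# Sketch of a calendar-heatmap layout: each day maps to a (week, weekday)
# cell, and the analysis count for that day determines how dark it renders.
from datetime import date

start = date(2017, 6, 26)  # roughly one year back (assumed origin)
counts = {date(2017, 7, 3): 4, date(2017, 7, 4): 1}  # analyses per day

def cell(day):
    week_col = (day - start).days // 7     # x: weeks since the start date
    weekday_row = (day.weekday() + 1) % 7  # y: Sunday=0 ... Saturday=6
    return week_col, weekday_row

def shade(n, max_n=10):
    return min(1.0, n / max_n)  # closer to 1.0 renders darker

for d, n in counts.items():
    print(d, "cell:", cell(d), "shade:", round(shade(n), 2))
```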

You can read more about the Activity Monitor here.

Created vs. Resolved

The Created vs. Resolved section shows the dueling trends of new findings that are added to the project, findings that are resolved by the team, and the difference between the two.

This section is broken into two pieces: the graph and the table. Both represent the same data.

The graph is broken into two pieces: the duel and the trend.

The duel section shows the number of created findings (in red) versus the number of resolved findings (in green). The icon in the upper-right corner of the Created vs. Resolved section opens a menu that lets you toggle the duel section between “accumulated” and “daily” counts; daily counts show the exact number of created and resolved findings for any given day. The colored area between the lines in the duel section of the graph indicates which line is higher: a green fill means more findings were resolved as of that day (with “accumulated” counts) or on that day (with “daily” counts), while a red fill means created findings outnumber resolved ones.

The trend section of the graph shows the difference between the red and green lines of the duel (in blue). The duel and the trend graphs have their own separate Y axes, representing finding counts and the count difference, respectively. The two graphs share the same X axis, which represents the date.
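
The underlying arithmetic is straightforward. This sketch shows daily counts, their accumulated totals, and the resulting trend line, using made-up numbers:

```python
# Sketch of the duel/trend math: daily created vs. resolved counts,
# their running ("accumulated") totals, and the difference line.
from itertools import accumulate

created  = [3, 1, 4, 0]   # findings created per day (hypothetical)
resolved = [1, 2, 2, 3]   # findings resolved per day (hypothetical)

acc_created  = list(accumulate(created))   # [3, 4, 8, 8]
acc_resolved = list(accumulate(resolved))  # [1, 3, 5, 8]

# Trend: difference between the two accumulated lines on each day.
trend = [c - r for c, r in zip(acc_created, acc_resolved)]
print(trend)  # [2, 1, 3, 0] -> positive means created is outpacing resolved
```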

You can learn more about the Created vs. Resolved graph and table here.

Top Finding Types

The Top Finding Types section shows the top ten types of findings in the project by number of open findings.

The visualization uses a stream graph to represent the relative volume of the top finding types (Y axis) over time (X axis). Each stacked area of a given color represents a specific type of finding, such as “SQL Injection.” The height of each area represents the number of findings of that type on a given day.

The table to the left of the visualization acts as a legend: each finding type is labeled and has a colored fill bar indicating that type’s percentage share of the project’s findings.
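
Those shares are simple proportions of the open-finding counts. Here’s a sketch with hypothetical data; the dashboard itself shows the top ten types:

```python
# Sketch of the table's percentage shares: finding types ranked by
# open-finding count (counts below are made up for illustration).
open_by_type = {"SQL Injection": 40, "XSS": 30, "Path Traversal": 20, "Other": 10}

total = sum(open_by_type.values())
top = sorted(open_by_type.items(), key=lambda kv: kv[1], reverse=True)[:10]
for name, count in top:
    print(f"{name}: {100 * count / total:.0f}%")
# SQL Injection: 40%, XSS: 30%, Path Traversal: 20%, Other: 10%
```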

Hovering the mouse cursor over an item in the table to the left of the visualization will highlight the corresponding area in the visualization. Similarly, hovering the mouse cursor over an area in the visualization will highlight the corresponding item in the table. Clicking an item will focus on that item.

By default, the graph uses stream layout. Switch to the “stack” layout to rearrange the items into a stack, such that the bottom of the stack aligns with the “0” on the Y axis. Note that with the stream layout, the Y axis’s meaning differs from date to date, so no axis numbers will be displayed.

You can learn more about the Top Finding Types graph here.
