Security Defect Testing

Hi, this is Darius Wiles.

Software vendors aim to release defect-free products. Earlier posts have discussed the Oracle Software Security Assurance (OSSA) program and the processes that aim to get us as close to this goal as possible. Automated testing is an important part of OSSA: it helps catch problems missed in earlier stages of the development lifecycle and gives us the opportunity to improve our processes to prevent similar problems in the future.

When I joined Oracle a number of years ago, we relied almost exclusively on internally developed testing tools and common freely available penetration testing tools. We had tried many of the commercial tools available at the time but concluded they were not useful to us because they could not cope with the volume and complexity of the code we needed to test. In addition, these commercial tools did not always work correctly with our products (for example, web application scanners could not test web sites protected with Oracle Single Sign-On), and other tools were not able to statically analyze all the programming languages we use. However, the leading commercial scanners have improved significantly over the past few years, and we are now increasingly building them into our standard development and testing processes.

The increasing use of automated tools by Oracle is changing the proportion of security defects that are discovered internally versus those reported by external sources. For reporting and tracking purposes, we categorize security defects into three groups based on who found them: internal, customer, and external. Andy Webber's recent blog post talked about an unexpected internal source of defect reports, but generally, internally discovered security defects are found as a result of security testing performed by the development, quality assurance, and ethical hacking teams.

This shift is allowing us to find more security defects internally. Of all security defects found in 2009 (so far), 87% have been found internally, 10% have been reported by customers, and 3% were found externally. Note that the external group consists of defects reported to us by security researchers, Oracle-specific defects posted directly to the Internet, and problems in third-party products and code that we include with our products. One should be wary of drawing comparisons based on this data, especially comparisons of how effective these different groups are at finding security bugs. The raw statistics provide a somewhat unbalanced comparison because many of the internal defects are unlikely to be exploitable in practice: static source code analysis tools tend to err on the side of caution when reporting potential problems, and we likewise prefer to add validation checks unless we are absolutely sure a potential problem is a false positive. Conversely, most of the customer and external defects have been analyzed in enough detail to weed out the false positives. Additionally, an increasing number of potential defects are being found and fixed during product development, reducing the opportunity for defects to find their way into released products and for people outside Oracle to find them. However, we are using these figures as a guide in tracking our progress toward finding an increasing percentage of problems internally.

[Chart: Security defects found in 2009, by source — internal, customer, and external (SecurityDefects2009.png)]
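
As an illustration of the defensive approach described above, the sketch below shows the kind of fix one might apply when a static analyzer flags a possible SQL injection. It is a minimal, hypothetical Java example (the class, table, and column names are invented for illustration and are not taken from any Oracle product): rather than spending effort proving that a flagged string concatenation can never carry attacker-controlled data, the query is rewritten with a bind variable, which resolves the finding whether or not the original report was a false positive.

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;

    public class OrderLookup {

        // Original code flagged by a static analyzer: the username is
        // concatenated into the SQL text, which is reported as a potential
        // SQL injection even if every current caller passes a trusted value.
        //
        //   String sql = "SELECT order_id FROM orders WHERE username = '" + username + "'";
        //   ResultSet rs = conn.createStatement().executeQuery(sql);

        // Defensive rewrite: use a bind variable so the finding is resolved
        // without having to prove the report is a false positive.
        public static ResultSet findOrders(Connection conn, String username)
                throws SQLException {
            PreparedStatement stmt =
                conn.prepareStatement("SELECT order_id FROM orders WHERE username = ?");
            stmt.setString(1, username);  // value is bound, never parsed as SQL
            return stmt.executeQuery();
        }
    }

The same reasoning applies to other classes of findings, for example adding an explicit bounds check where an analyzer reports a possible buffer overrun.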

In a blog post earlier this year, Reshma Banerjee discussed bringing BEA into the OSSA program. A lot of good ideas are exchanged between security teams at Oracle and at newly acquired companies. Although many companies have solved the same security challenges in surprisingly similar ways, the variations are always worth exploring because more effective techniques can then be shared across the whole company. This includes security testing techniques and commercial testing tools. I expect continuous improvement in OSSA processes to keep increasing the percentage of internally found defects relative to the other two categories, getting us closer to the goal of defect-free products.

For more information, a number of Oracle OpenWorld sessions will be dedicated to Oracle Software Security Assurance:


  • On Sunday October 11 at 3:30PM: Oracle Software Security Assurance Town Hall with IOUG (panel discussion with Bruce Lowenthal, John Heimann, Mark Fallon, Robert Armstrong, and Hiran Patel)

  • On Tuesday October 13 at 11:30AM: Governance and Security Assurance with John Heimann

  • On Thursday October 15 at 9:00AM: Software Vulnerabilities: Preventing & Protection with Bruce Lowenthal

  • Also on Thursday, at 3:00PM: Critical Patch Update: A Year in Review with Bruce Lowenthal & Eric Maurice
