Software Security and Computer Forensics

Project Description

The current focus of our work in this project is to identify methods that help developers write more secure software. Along these lines, we are addressing multiple research topics including:
  • The use of peer code review to identify vulnerable code.
  • Conducting security research in a scientifically defensible manner.
Previous work in this project built and evaluated a visualization to enable computer forensics practitioners to gather and document electronic evidence more effectively, such as evidence found on a computer hard drive.

Recent Activities

We are working with a large, international team of researchers to analyze the current security literature and assess the scientific rigor with which the research is reported. To perform this analysis, we have developed a series of rubrics, based on published literature, that enumerate the types of information that should appear in various kinds of research reports (e.g., case studies, controlled experiments). Using these rubrics, at least two members of our team analyze each paper to determine the degree to which the paper reports that information.

We are also working with the North Carolina State University Science of Security Lablet to build a community of security researchers who use scientifically defensible methodologies. As part of this work, we are beginning to build IRN-SoS (International Research Network for Science of Security). We have held workshops at the annual HotSoS conference and plan to propose workshops at other security-related conferences.

Key Results

From our work on identifying which security vulnerabilities can be identified via peer code review, we have the following high-level findings:
  • Code review can identify common types of vulnerabilities
  • While more experienced contributors authored the majority of vulnerable code changes, changes from less experienced contributors were 1.8 to 24 times more likely to be vulnerable
  • The likelihood of a vulnerability increases with the number of lines changed
  • Modified files are more likely to contain vulnerabilities than new files

Funding Support

  • Carver, J. (PI). "Growing the Science of Security Through Analytics." National Security Agency (sub to North Carolina State University). $174,847. 2/6/14-3/27/16.
  • Carver, J. (PI). "Empirically Evaluating & Quantifying the Effects of Inspections & Testing on Security Vulnerabilities." Army Research Office (sub to North Carolina State University). $82,951. 8/16/13-5/15/14.
  • Carver, J. (PI). "Empirically Evaluating and Quantifying the Effects of Inspections on Security Vulnerabilities (Iteration 1)." Army Research Office (sub to North Carolina State University). $56,798. 1/1/13-5/15/14.
  • Jankun-Kelly, T. (PI), Carver, J. (Co-PI), Swan, J. (Co-PI), and Dampier, D. (Co-PI). "CT-ISG: Empirically-based Visualization for Computer Security and Forensics." National Science Foundation Award 0627407. $300,000. 10/06-9/09.

Related Publications

Refereed Conferences

  • Carver, J., Burcham, M., Kocak, S., Bener, A., Felderer, M., Gander, M., King, J., Markkula, J., Oivo, M., Sauerwein, C., and Williams, L. "Establishing a Baseline for Measuring Advancement in the Science of Security - an Analysis of the 2015 IEEE Security & Privacy Proceedings." Proceedings of the 2016 Symposium and Bootcamp on the Science of Security (HotSoS), Pittsburgh, PA. April 19-21, 2016.
  • Bosu, A., Carver, J., Hafiz, M., Hilley, P. and Janni, D. "Identifying the Characteristics of Vulnerable Code Changes: An Empirical Study." Proceedings of the 22nd ACM SIGSOFT International Symposium on Foundations of Software Engineering (FSE), Hong Kong. Nov. 16-21, 2014. pp. 257-268.

Last Updated on February 26, 2015 by Jeffrey Carver