A Framework for Intrusion Detection Systems Evaluation

Bandara L.R.J.

This dissertation was submitted to the Department of Computer Science and Engineering of the University of Moratuwa in partial fulfillment of the requirements for the Degree of MSc in Computer Science

Department of Computer Science and Engineering
University of Moratuwa
October 2007

Declaration

I, Bandara L.R.J., confirm that this work submitted for assessment is my own and is expressed in my own words, and that the work included in this dissertation, in part or in whole, has not been submitted for any other academic qualification at any institution.

Signature: …………………………………………………………
Date: …………………………………………………………

Certified by (Supervisor):

Name: …………………………………………………………
Signature: …………………………………………………………
Date: …………………………………………………………

Abstract

Information security plays a major role in today's IT-enabled organizations. In an organization's security stance, Intrusion Detection Systems (IDSes) are among the most important elements, if not the most important. It is therefore essential to select the most suitable product to deploy in the organization concerned. Selecting a suitable IDS requires either evaluating at least a short-listed set of products, or relying on a third-party organization that evaluates such products. However, only a very few organizations are involved in evaluating IDSes, so the cost of hiring such an organization is very high; only a few organizations can bear it, whereas small organizations have to depend on their own methods.
It is therefore essential for the research community to help in evaluating these products. But the research community cannot rely on the methods used by the organizations that carry out such evaluations, since those methods are proprietary and not publicly available. This dissertation describes a method of generating a data set, or a criterion/check list, using existing freely available tools, and a framework that uses the generated data set to evaluate intrusion detection systems for a specific facility. Finally, we discuss the lessons learned from using this kind of framework to evaluate intrusion detection systems, and the opportunities for further improvement of the framework and of this area. The tool uses a check list (attack script list) and a parser that passes parameters to an open-source/free vulnerability scan engine according to the check list, attacks the targets, and then searches the intrusion detection systems' logs/databases for any detection of those attacks. This evaluates the quality of the signatures of the specific intrusion detection system. We then use the Snort IDS as the baseline against which other candidate IDSes are benchmarked (owing to time limitations, at least one more IDS will be benchmarked as a proof of concept).

Acknowledgments

First of all, I would like to thank my project supervisor, Mr. Shantha Fernando. He was very supportive, was always there to help me, and guided me whenever I was in trouble. I would also like to thank the academic staff, especially Dr. Sanath Jayasena, who reviewed our progress regularly throughout the year, encouraged us to meet the deadlines, and guided us on the right times to start documentation, and Dr. Gihan Dias and Mrs. Vishaka Nanayakkara, who taught us how to do a literature review. I would also like to thank the non-academic staff of the department, who helped us in many ways during this study. I would also like to thank Mr. Aruna B.
Herath from SLT iDC for the help and guidance given from the start until the completion of this research. Finally, I would like to thank my wife Saja for her patience, support, care and motivation throughout this postgraduate study. She also helped me convert Snort rules to Shoki rules manually, which required a huge amount of time and care.

Table of Contents

1. Introduction .................................................................................. 1
  1.1. Background ............................................................................. 1
    1.1.1. Types of Intrusion Detection Systems .............................. 2
    1.1.2. Major components of an IDS ............................................ 3
    1.1.3. Introduction to Filters and Signatures ............................... 5
    1.1.4. Intrusion detection methods/algorithms ............................ 5
    1.1.5. An Introduction to vulnerability databases ........................ 5
  1.2. Motivation and Problem Identified ......................................... 7
  1.3. Objective ................................................................................. 8
2. Related work ................................................................................. 9
3. Methodology of Study ................................................................. 11
  3.1. Design ................................................................................... 11
    3.1.1. Attack Script Database ................................................... 11
    3.1.2. Attacker/Evaluator .......................................................... 19
    3.1.3. Analyzer ......................................................................... 19
    3.1.4. Reporter ......................................................................... 19
  3.2. Real Lab Setup of the Framework ........................................ 19
    3.2.1. Hardware ....................................................................... 20
    3.2.2. Software ........................................................................ 22
  3.3. Procedure carried out ............................................................ 22
    3.3.1. Registering IDSes in the system .................................... 23
    3.3.2. Launching Attacks ......................................................... 23
  3.4. Limitations and Problems Encountered ................................ 25
4. Experimental Results and Analysis ............................................ 27
  4.1. Categorizing the results ........................................................ 27
    4.1.1. True Positives (TPs) ...................................................... 28
    4.1.2. False Positives (FPs) ..................................................... 30
    4.1.3. False Negatives (FNs) ................................................... 31
5. Discussion of Results and Conclusions ..................................... 34
6. Future work ................................................................................ 36
References ...................................................................................... 37
Definitions ...................................................................................... 40

List of Figures

Figure 1. General Layout of an IDS ................................................. 4
Figure 2. High-level design of the Framework ............................... 11
Figure 3. Database for the Framework ........................................... 12
Figure 4. FRIDSE Check List ......................................................... 18
Figure 5. OSEC evaluation lab setup .............................................. 20
Figure 6. Real Lab setup ................................................................. 21
Figure 7. FRIDSE Interface for Registering IDSes ........................ 23
Figure 8. FRIDSE Interface for launching attacks ......................... 24
Figure 9. Background process seen while attacks are going on ..... 25
Figure 10. FRIDSE interface for generating reports ...................... 27
Figure 11. True Positives from Snort ............................................. 28
Figure 12. True Positives from Shoki ............................................ 29
Figure 13. False Positives from Snort ............................................ 30
Figure 14. False Positives from Shoki ........................................... 31
Figure 15. False Negatives from Snort .......................................... 32
Figure 16. False Negatives from Shoki ......................................... 33

List of Tables

Table 1. OSEC NIDS v1 Vulnerability List ................................... 14
Table 2. List of Backdoor attacks in the check list ........................ 15
Table 3. List of CGI Abuse attacks in the check list ..................... 16
Table 4. List of database attacks in the check list ......................... 16
Table 5. List of default Unix account attacks in the check list ...... 16
Table 6. List of DoS attacks in the check list ................................ 16
Table 7. List of FTP attacks in the check list ................................ 16
Table 8. List of attacks for gaining root access remotely in the check list ... 17
Table 9. List of P2P file sharing attacks in the check list ............. 17
Table 10. List of useless services attacks in the check list ............ 17
Table 11. List of Windows attacks in the check list ...................... 17
Table 12. List of Windows: Microsoft Bulletin attacks in the check list ... 17
Table 13. Available hardware list .................................................. 21