# Cyberprints: Identifying Cyber Attackers by Feature Analysis


Iowa State University Graduate Theses and Dissertations, 2012.

Recommended Citation: Blakely, Benjamin A., "Cyberprints: Identifying Cyber Attackers by Feature Analysis" (2012). Graduate Theses and Dissertations. 12280.

Cyberprints: Identifying cyber attackers by feature analysis

by Benjamin A. Blakely

A dissertation submitted to the graduate faculty of Iowa State University in partial fulfillment of the requirements for the degree of DOCTOR OF PHILOSOPHY.

Major: Computer Engineering

Program of Study Committee: Doug W. Jacobson, Major Professor; Thomas E. Daniels; Mani Mina; James M. McCormick; Steffen W. Schmidt

Iowa State University, Ames, Iowa, 2012

Copyright © Benjamin A. Blakely, 2012. All rights reserved.

ACKNOWLEDGMENTS

Reaching this point in my education is hardly something for which I can claim the credit. For the past twenty-six years I have been encouraged, cared for, counseled, and educated by many truly amazing people. To list them all would require much more space than can be allotted here. In that way, this is both the easiest and hardest part of this document to write. However, I wish to highlight the following people, who have stood out as enduring influences and to whom I credit any past or future successes otherwise attributed to me.
First and foremost, I owe an eternal debt of gratitude to my parents, Bill and Denise, and all the rest of my family and extended family, who have been role models for everything I hope to become. Their generous support, whether emotional, financial, or simply a willingness to encourage me in all the many hobbies and activities I went through as a child, has instilled in me the eagerness to learn and the willingness to do what it takes to reach for success. I have now been fortunate to add to this the wonderful extended family of my wife. My grandmother, Velma; grandfather, Paul; and uncle, Fred — all educators — are unfortunately not here to share this milestone with me. However, their example of the sanctity of education lingers with me. Additionally, my grandmother, Rose, is not here to see the completion of this work, but her encouragement and interest in everything I've done, all the way to the end, will always stay with me. Many educators have inspired me, and yes, put up with me, over the twenty-one years I've spent in school. From my kindergarten teacher, Janet Leonard, who gave me and so many others the foundation for a successful career, to teachers like Diane Silvers, Marian Houseman, Bruce Bennett, and Brenda Yoakum, who allowed me the latitude to be myself, tolerated my non-conformity in the classroom, and pushed me to succeed in whatever ways I chose. Had I not had such caring and passionate people as a part of my educational career, I believe I would not have the insatiable curiosity that drives me today. At Iowa State, I have encountered a number of instructors and friends who have kept me on the path that has led me to finish this work. In particular, I thank Doug Jacobson, for all of the opportunities he has allowed me to participate in and the many years of support; and Mani Mina, for his contagious passion for learning and all the moral support and guidance he has offered me through my most difficult semesters.
Without them, I don't know that I would have finished even a Bachelor's in Computer Engineering, not to mention anything further. Additionally, I thank the other members of my program of study committee — Tom Daniels (with whom I have shared many hours in cyber defense competitions), James McCormick, and Steffen Schmidt — for all of their input and guidance during the completion of this work.

Many others have served as mentors and friends along the way. My friend and supervisor at the Krell Institute, Nazanin Imani, has always been willing to listen to me and lend her wise advice. My friend and colleague, Nate Evans, took me under his wing as a freshman, and we've formed not only what I believe to be an unbeatable pair in the workplace, but a close and enduring friendship. And of course, without the unending love, patience, and encouragement of my wife, Afton, I could not have even dreamed of dedicating to my schoolwork the many hours that would otherwise have been spent with her.

People sometimes ask why I sign my name with a lowercase "b" in my first name. I have signed it this way since I realized in high school that all of those things which I undertake and "leave my mark" on are only in small part due to any ability I might have. By signing my name 'benjamin A Blakely', I emphasize the significance of my family and all of those who have been like family to me in making me who I am. Alor ergo sum.

TABLE OF CONTENTS

ACKNOWLEDGMENTS
CHAPTER 1. INTRODUCTION
CHAPTER 2. CYBER DETERRENCE
  2.1 History of Cyber Warfare
  2.2 Difficulty of Defense
  2.3 Why Deterrence?
  2.4 Classical Deterrence
  2.5 Cyber Deterrence
  2.6 Criticality of Attribution
CHAPTER 3. ATTRIBUTION
  3.1 Network-based Attribution
  3.2 Traffic-based Attribution
  3.3 Host-based Attribution
  3.4 Non-technical Attribution
  3.5 Need for New Attribution Strategies
CHAPTER 4. TOWARDS A CYBERPRINT
  4.1 Stylometry
  4.2 Software Forensic Analysis
  4.3 Stability and Sensitivity
  4.4 Feature Selection
CHAPTER 5. METHOD
  5.1 Datasets
  5.2 Analysis Method
CHAPTER 6. RESULTS
  6.1 Multivariate Analysis of Variance (MANOVA)
  6.2 Principal Component Analysis and Clustering
  6.3 Kolmogorov-Smirnov Analysis
CHAPTER 7. CONCLUSIONS
  7.1 Contributions
  7.2 Future Work
APPENDIX A. FEATURE DISTRIBUTION HISTOGRAMS
APPENDIX B. KOLMOGOROV-SMIRNOV HEATMAPS
GLOSSARY
BIBLIOGRAPHY

LIST OF FIGURES

Figure 2.1 Regional Internet Registries
Figure 4.1 Theoretical Explanatory Variable Contributions
Figure 4.2 Sources of Cyberprint Features
Figure 6.1 Significance of Discriminating Features (MANOVA)
Figure 6.2 Output of Principal Component Analysis, by application
Figure 6.3 Output of Principal Component Analysis, by operating system
Figure 6.4 Analysis of 13-dimensional feature space of significant features, by application
Figure 6.5 Analysis of 13-dimensional feature space of significant features, by operating system
Figure 6.6 K-Means analysis of 13-dimensional feature space of significant features (k=6, 10 iterations)
Figure 6.7 Feature Distribution Histogram: Debian Linux
Figure 6.8 Kolmogorov-Smirnov Comparisons: Overall Average (Best Features)
Figure A.1 Feature Distribution Histogram: Debian Linux
Figure A.2 Feature Distribution Histogram: FreeBSD 5
Figure A.3 Feature Distribution Histogram: FreeBSD 9
Figure A.4 Feature Distribution Histogram: Red Flag Linux
Figure A.5 Feature Distribution Histogram: Windows 7
Figure A.6 Feature Distribution Histogram: Windows XP
Figure A.7 Feature Distribution Histogram: Hydra HTTP Brute-Force
Figure A.8 Feature Distribution Histogram: Hydra IMAP Brute-Force
Figure A.9 Feature Distribution Histogram: Nmap 4 Scan
Figure A.10 Feature Distribution Histogram: Nmap 5 Scan
Figure B.1 Kolmogorov-Smirnov Comparisons: Overall Average, before filtering
Figure B.2 Kolmogorov-Smirnov Heatmap: Packet Length
Figure B.3 Kolmogorov-Smirnov Heatmap: Minimum IPv4 Identifier
Figure B.4 Kolmogorov-Smirnov Heatmap: Average IPv4 Identifier
Figure B.5 Kolmogorov-Smirnov Heatmap: Maximum IPv4 Identifier
Figure B.6 Kolmogorov-Smirnov Heatmap: Minimum IPv4 Time-to-Live
Figure B.7 Kolmogorov-Smirnov Heatmap: Maximum IPv4 Time-to-Live
Figure B.8 Kolmogorov-Smirnov Heatmap: IPv4 Don't Fragment Flags
Figure B.9 Kolmogorov-Smirnov Heatmap: TCP Source Port
Figure B.10 Kolmogorov-Smirnov Heatmap: TCP Acknowledgment Flags
Figure B.11 Kolmogorov-Smirnov Heatmap: TCP FIN Flags
Figure B.12 Kolmogorov-Smirnov Heatmap: TCP Push Flags
Figure B.13 Kolmogorov-Smirnov Heatmap: TCP Window Size

LIST OF TABLES

Table 3.1 Attribution Assumptions, Old and New
Table 4.1 Open Systems Interconnection (OSI) Model
Table 4.2 IPv4 Features
Table 4.3 IPv6 Features
Table 4.4 ICMP Features
Table 4.5 TCP Features
Table 4.6 UDP Features
Table 6.1 Initial Set of Features for Consideration and their Relevance

CHAPTER 1. INTRODUCTION

Deterrence, whether it is meant to counter cyber attacks, nuclear strikes, or playground bullies, relies upon a solid and demonstrable understanding of where an attack originated. If an attacker knows his identity is unlikely to be determined, any threats made against him will fall on deaf ears. The current global state of cyber security routinely makes the news as acts of cyber espionage, cyber crime, and cyber warfare close the gap in commonality with their conventional counterparts. In any of these domains, attribution is without a doubt a critical factor. This dissertation will lay the groundwork for this topic by analyzing the history of cyber warfare and demonstrating why deterrence is the superior, and perhaps only, way to counter it.

Of course, a great deal of effort has been put into discovering new ways to determine the origin of attacks in cyberspace. Particularly within the decade from 1995 to 2005, there was something of a genesis of this topic in the literature. While many of these methods proved useful, many failed to find applications beyond the laboratory. The same set of assumptions underlies many of these efforts: that a forensic analyst will be able to obtain, store, and analyze whatever information is necessary for attribution (omniscience); that she will also be able to place sensors wherever necessary to conduct attribution, perhaps even by covert means (omnipresence); and that the choice of location for such systems will be correct for attributing any attack (a priori positioning).
A survey of the literature in this area is performed to show how these assumptions are built into existing methods and why a new strategy is needed.

The problem of determining authorship is not unique to cyberspace, nor is it a new problem. It has been known for at least a century that forensic analysis of documents can reveal behavioral and linguistic signatures that the author expressed without realizing it. Whether it be handwriting, vocabulary, or even the choice of paper, many features can be used to say whether two paper documents share authorship. Such methods have been used extensively in literary analysis under the name stylometry. Similar methods have been applied, with much success, to electronic mail, forum postings, software source code (including viruses), and other forms of electronic communication (Morse Code operators were even known to be recognizable by their cadence). The central question of this dissertation is whether similar methods can be applied to network-level feature analysis. To this end, a number of network-level features are derived from the IPv4 and TCP headers of a dataset generated for this purpose. These features are evaluated for discriminatory power: first with a Multivariate Analysis of Variance (MANOVA), then with a Principal Component Analysis (PCA) and visual inspection of feature distribution histograms, and finally with a Kolmogorov-Smirnov comparison of the individual feature distributions.

The major contribution of this work is a foundation for a new branch of network forensics: Cyberprints. Just as a fingerprint can identify a burglar, the materials in a bomb identify a bomb maker, and the features of a fire can identify an arsonist, network-level features in traffic streams show promise for attributing actions in cyberspace. This is accomplished without making the assumptions of existing attribution methods, using relatively simple computations, and considering a small final list of useful features.
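The final analysis step described here, the per-feature Kolmogorov-Smirnov comparison, can be sketched in a few lines. The following is an illustrative reconstruction, not the dissertation's actual code: the feature names, the synthetic packet records, the OS-typical default values, and the hand-rolled KS function are all assumptions made for demonstration.

```python
# Sketch: compare the empirical distributions of two simple header features
# (IPv4 TTL and TCP window size) between two traffic sources using the
# two-sample Kolmogorov-Smirnov statistic. All data here is synthetic.
import random

def ks_statistic(a, b):
    """Two-sample KS statistic: the largest vertical gap between the
    empirical CDFs of samples a and b (0.0 = identical, 1.0 = disjoint)."""
    d = 0.0
    for x in set(a) | set(b):
        fa = sum(v <= x for v in a) / len(a)
        fb = sum(v <= x for v in b) / len(b)
        d = max(d, abs(fa - fb))
    return d

random.seed(0)
# Hypothetical per-packet feature records for two sources, using
# OS-typical defaults as stand-ins (e.g. initial TTL 64 vs. 128).
source_a = [{"ttl": 64, "win": random.choice([5840, 14600])} for _ in range(200)]
source_b = [{"ttl": 128, "win": random.choice([8192, 65535])} for _ in range(200)]

for name in ("ttl", "win"):
    d = ks_statistic([p[name] for p in source_a], [p[name] for p in source_b])
    print(f"{name}: KS={d:.3f}")  # ttl is fully discriminating here: KS=1.000
```

A feature whose KS statistic stays large across many pairs of sources is a candidate Cyberprint feature; one near zero carries little discriminatory power.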
It is hoped, however, that this work will be extended to develop new methods of generating and analyzing Cyberprints, and to find new features to use in such analyses. For the unfamiliar reader, a glossary of technical terms used in this dissertation is included before the bibliography.

CHAPTER 2. CYBER DETERRENCE

2.1 History of Cyber Warfare

Exploitation of vulnerabilities in cyberspace is hardly a twenty-first-century concept. As long as nodes on the Internet have been able to communicate with each other, there have been those willing to take advantage of them. What was perhaps the oldest known widespread cyber attack was not even intended as an attack. In 1988, Robert Morris, a student at Cornell University, wrote a program to count the number of nodes on the Internet. To do so, it used known system vulnerabilities to replicate itself from machine to machine. In this regard it was largely successful, having reached an estimated 10% of the Internet (some 60,000 nodes). However, an unintended side effect of its exploitation method caused it to also crash the systems it penetrated, leaving a trail of destruction in its path. Morris's case resulted in the first conviction in the U.S. under the 1986 Computer Fraud and Abuse Act, despite the program's benign intent (Moore, 2008). The concept of self-replicating software has since developed into the malicious codes now referred to as worms.

In 1994, one of the first international cyber incidents to gain notoriety occurred at the Air Force Rome Lab. At least 150 intrusions were detected there by system administrators and eventually traced to an Israeli. Unlike the Morris incident, no damage occurred, and no Israeli law at the time made the action a crime, so the perpetrator escaped punishment (Beidleman, 2009). This highlighted the need for international coordination of criminal law and agreements for cooperation to apprehend and prosecute offenders, a topic that will receive a more thorough treatment later in this dissertation.
To keep from being caught completely by surprise, good system administrators must view their systems from an adversarial perspective. It was just such an exercise by the U.S. National Security Agency (NSA) Red Team in 1997 that led to an embarrassingly effective lesson in overconfidence. During operation Eligible Receiver, simulated attackers were able to take control of computers in the command center of the United States Pacific Command (PACOM), and in the power grids and 911 systems of nine major U.S. cities (Beidleman, 2009). In a 1998 event later named "Solar Sunrise", over 500 U.S. Department of Defense (DOD) computers were found to have been compromised by an unknown attacker (Beidleman, 2009). A year later, an event named "Moonlight Maze" resulted in the breach of hundreds of computers at the U.S. National Aeronautics and Space Administration (NASA), the Pentagon, the Department of Energy (DOE), and various universities and laboratories. Information stolen included technical research, contracts, encryption techniques, and information on war-planning systems (Adams, 2001; Moore, 2008). In 2003, what were believed to be Chinese attackers began infiltrating classified U.S. networks at an alarming rate in an operation named "Titan Rain" that is still under a veil of secrecy (Moore, 2008; Thornburgh, 2005). In 2007, a penetration so great it has been called "our electronic Pearl Harbor" compromised computers in the DOD, DOE, Department of State (DOS), and Department of Commerce (DOC). The amount of information exfiltrated was sufficient to fill the Library of Congress (Habiger, 2010). That same year, a trove of classified data was found on peer-to-peer (P2P) networks such as Limewire. This included a diagram of the U.S.
Secret Internet Protocol Router Network (SIPRNet), password change scripts for the Pentagon’s SECRET network, encryption certificates allowing access to contractor systems, the Pentagon’s information technology (IT) threat response plan, and threat assessments for multiple U.S. cities (among other data) (Habiger, 2010). In 2008, an attack suspected to originate from Russia crossed into a classified network, creating enough alarm that the President of the United States was personally briefed (Habiger, 2010). The worst-case scenario in cy...