The Center for Education and Research in Information Assurance and Security (CERIAS)

Reports and Papers Archive


Essays on information security from an economic perspective

CERIAS TR 2009-24
Ta-Wei Wang
Download: PDF
Added 2009-09-14

ACHIEVING HIGH SURVIVABILITY IN DISTRIBUTED SYSTEMS THROUGH AUTOMATED RESPONSE

CERIAS TR 2009-22
Yu-Sung Wu
Download: PDF

We propose a new model for automated response in distributed systems. We formalize the process of providing automated responses and the criterion for asserting global optimality of the selection of responses. We show that reaching the globally optimal solution is an NP-hard problem; therefore, we design a genetic algorithm framework for searching for good selections of responses at runtime. Our system constantly adapts itself to the changing environment based on short-term history and also tracks the patterns of attacks in a long-term history.

Unknown security attacks, or zero-day attacks, exploit unknown or undisclosed vulnerabilities and can cause devastating damage. The escalation pattern, commonly represented as an attack graph, is not known a priori for a zero-day attack; hence, a typical response system provides ineffective or drastic responses. Our system "conceptualizes" nodes in an attack graph, whereby they are generalized based on the object-oriented hierarchy for components and alerts. This is based on our insight that high-level manifestations of unknown attacks may bear similarity to those of previously seen attacks, which allows history, such as the effectiveness of each response against past attacks, to assist responses to the unknown attack.

This thesis lays down three distinct claims and validates them empirically. The claims are: (i) For automated response, consider a baseline mechanism with a static mapping from a local detector symptom to a local response; this corresponds to the state of the art in deployed response systems. Now consider our proposed model, which accounts for the global optimality of choosing a set of responses and dynamically computes the response combination from the set of detectors and other system parameters (inferences about the accuracy of the attack diagnosis, response effectiveness, etc.). The survivability of the application system under our proposed model is an upper bound on the survivability achievable through the baseline model. (ii) In some practical situations, the proposed model gives higher survivability than the baseline model. (iii) The survivability with our proposed model improves when the system takes into account history from prior similar attacks. This kind of history is particularly important when the system deals with zero-day attacks.
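
To make the search concrete, here is a minimal sketch of a genetic-algorithm search over response combinations. The response list, fitness function, and parameters are illustrative assumptions, not the thesis's actual model, which draws these quantities from detector inferences and attack history.

    import random

    # Hypothetical per-response scores: (effectiveness, disruption cost).
    RESPONSES = [(0.9, 0.7), (0.6, 0.2), (0.4, 0.1), (0.8, 0.5), (0.3, 0.05)]

    def fitness(selection):
        """Toy objective: reward blocking the attack, penalize disruption."""
        residual, cost = 1.0, 0.0
        for chosen, (eff, c) in zip(selection, RESPONSES):
            if chosen:
                residual *= (1.0 - eff)  # remaining chance the attack succeeds
                cost += c
        return (1.0 - residual) - cost

    def mutate(sel, rate=0.1):
        return [bit ^ (random.random() < rate) for bit in sel]

    def crossover(a, b):
        cut = random.randrange(1, len(a))
        return a[:cut] + b[cut:]

    def ga_search(generations=50, pop_size=20):
        pop = [[random.randint(0, 1) for _ in RESPONSES] for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=fitness, reverse=True)
            elite = pop[: pop_size // 2]
            children = [mutate(crossover(random.choice(elite), random.choice(elite)))
                        for _ in range(pop_size - len(elite))]
            pop = elite + children
        return max(pop, key=fitness)

    print(ga_search())  # e.g. [1, 1, 0, 0, 1]: an effective, low-cost combination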

Added 2009-09-11

The period of the Bell numbers modulo a prime

CERIAS TR 2010-01
Peter Montgomery, Sangil Nahm, Samuel Wagstaff Jr
Download: PDF

We discuss the number in the title, especially whether the minimum period of the Bell numbers modulo p can be a proper divisor of N_p = (p^p - 1)/(p - 1). The investigation leads to interesting new theorems about possible prime factors of N_p. For example, we show that if p is odd, m is a positive integer, and q = 4m^2 p + 1 is prime, then q divides p^{m^2 p} - 1. Then we explain how this fact influences the probability that q divides N_p.
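
The divisibility claim is easy to spot-check numerically. The short script below (ours, not the paper's) verifies it for small odd primes p and small m using modular exponentiation.

    def is_prime(n):
        if n < 2:
            return False
        d = 2
        while d * d <= n:
            if n % d == 0:
                return False
            d += 1
        return True

    # Claim: if p is odd, m >= 1, and q = 4*m^2*p + 1 is prime,
    # then q divides p^(m^2*p) - 1, i.e. p^(m^2*p) = 1 (mod q).
    for p in (3, 5, 7, 11, 13):
        for m in range(1, 20):
            q = 4 * m * m * p + 1
            if is_prime(q):
                assert pow(p, m * m * p, q) == 1, (p, m, q)
    print("divisibility claim holds for all tested (p, m)")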

Added 2009-09-07

Effects of Anonymity, pre-employment integrity and antisocial behavior on self-reported cyber crime engagement: An exploratory study

CERIAS TR 2009-31
Ibrahim Moussa Baggili
Download: PDF

A key issue facing today’s society is the increase in cyber crimes. Cyber crimes pose threats to nations, organizations and individuals across the globe. Much of the research in cyber crime has arisen from computer science-centric programs, and little experimental research has been performed on the psychology of cyber crime. This has caused a knowledge gap in the study of cyber crime. To this end, this dissertation focuses on understanding psychological concepts related to cyber crime. Through an experimental design, participants were randomly assigned to three groups with varying degrees of anonymity. After each treatment, participants were asked to self-report their cyber crime engagement, antisocial behavior and pre-employment integrity. Results indicated that the anonymity manipulation had a main effect on self-reported cyber crime engagement. The results also showed a statistically significant positive relationship between self-reported antisocial behaviors and cyber crime engagement, and a statistically significant negative relationship between self-reported cyber crime engagement and pre-employment integrity. Suggestions for future research are also discussed.

Added 2009-09-03

Federal Plan for Advanced Networking Research and Development

National Science and Technology Council
Added 2009-08-27

The Networking and Information Technology Research and Development Program

Executive Office of the President
Added 2009-08-27

Integration of COBIT, Balanced Scorecard and SSE-CMM as a strategic Information Security Management (ISM) framework

CERIAS TR 2009-21
Suchit Ahuja
Download: PDF

The purpose of this study is to explore the integrated use of the Control Objectives for Information Technology (COBIT) and Balanced Scorecard (BSC) frameworks for strategic information security management. The goal is to investigate the strengths, weaknesses, implementation techniques, and potential benefits of such an integrated framework. The integration is achieved by “bridging” the gaps or mitigating the weaknesses recognized within one framework using the methodology prescribed by the other. Thus, the integration of COBIT and BSC can provide a more comprehensive mechanism for strategic information security management – one that is fully aligned with business, IT and information security strategies. The use of the Systems Security Engineering Capability Maturity Model (SSE-CMM) as a tool for performance measurement and evaluation can ensure the adoption of a continuous improvement approach for the successful sustainability of this comprehensive framework. There are some instances of similar studies conducted previously:

• metrics-based security assessment using ISO 27001 and SSE-CMM (Goldman & Christie, 2004)
• mapping of processes for effective integration of COBIT and SEI-CMM (IT Governance Institute, 2007a)
• mapping of COBIT with ITIL and ISO 27002 for effective management and alignment of IT with business (IT Governance Institute, 2008)

What differentiates this research study from the previous ones is that none of them integrated BSC, COBIT and SSE-CMM to formulate a comprehensive framework for strategic information security management (ISM) aligned with business, IT and information security strategies. Therefore, a valid opportunity to conduct this research study exists.

Added 2009-08-18

Integrating Model Checking and Test Generation for Reliable and Secure Concurrent Programs

CERIAS TR 2009-25
Tang, Mathur

A method for testing concurrent programs is introduced. The proposed method combines the power of dynamic model checking with test generation via program mutation. Dynamic model checking is not reliable without an adequate test set, while naive test generation for concurrent programs is insufficient due to the possibility of many interleavings. Combining the two processes can reduce the weaknesses of each.
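
The interleaving problem is easy to demonstrate. In the sketch below (an illustration added here, not an example from the report), a test that happens to observe one benign schedule passes, even though other schedules lose updates; systematically exploring schedules, as a dynamic model checker does, exposes the bug.

    import threading

    counter = 0

    def increment(times):
        global counter
        for _ in range(times):
            # Read-modify-write is not atomic: another thread can run
            # between the read and the write, losing an update.
            tmp = counter
            counter = tmp + 1

    threads = [threading.Thread(target=increment, args=(100_000,)) for _ in range(2)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

    print(counter)  # frequently less than the expected 200000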

Added 2009-07-21

Access Control for Healthcare using Policy Machine

CERIAS TR 2009-20
Zahid Pervaiz, Arjmand Samuel, David Ferraiolo, Serban Gavrila, Arif Ghafoor
Download: PDF

Access control policies in the healthcare domain define permissions for users to access different medical records. Role Based Access Control (RBAC) helps restrict medical records to users in a certain role, but sensitive information in medical records can still be compromised by authorized insiders. The threat is from users who are not treating the patient but have access to medical records. We propose a selective combination of policies where sensitive records are only available to the primary doctor under Discretionary Access Control (DAC). This not only improves compliance with the principle of least privilege but also helps mitigate the threat of authorized insiders disclosing sensitive patient information. We use the Policy Machine (PM) proposed by NIST to combine policies and develop a flexible healthcare access control policy which has the benefits of context awareness and discretionary access. Temporal constraints have been added to RBAC in PM, and after combining Generalized Temporal RBAC and DAC, an example healthcare scenario has been set up.
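
As a rough illustration of the combined policy (a toy sketch; the names and structure do not reflect the Policy Machine's actual interfaces), an access check can layer a discretionary rule for sensitive records on top of an RBAC permission check:

    # RBAC layer: which actions each role may perform at all.
    ROLE_PERMS = {
        "doctor": {"read_record", "write_record"},
        "nurse": {"read_record"},
    }

    def can_access(user, action, record):
        if action not in ROLE_PERMS.get(user["role"], set()):
            return False  # role does not grant the action
        # DAC layer: sensitive records are restricted to the primary doctor.
        if record["sensitive"] and user["id"] != record["primary_doctor"]:
            return False
        return True

    alice = {"id": "alice", "role": "doctor"}
    bob = {"id": "bob", "role": "doctor"}
    rec = {"patient": "p1", "sensitive": True, "primary_doctor": "alice"}

    print(can_access(alice, "read_record", rec))  # True
    print(can_access(bob, "read_record", rec))    # False: insider threat blocked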

Added 2009-07-04

Physically Restricted Authentication with Trusted Hardware

CERIAS TR 2009-18
Michael Kirkpatrick, Elisa Bertino
Download: PDF

Modern computer systems permit mobile users to access protected information from remote locations. In certain secure environments, it would be desirable to restrict this access to a particular computer or set of computers. Existing solutions for machine-level authentication are undesirable for two reasons. First, they do not allow fine-grained application-layer access decisions. Second, they are vulnerable to insider attacks in which a trusted administrator acts maliciously. In this work, we describe a novel approach using secure hardware that solves these problems. In our design, multiple administrators are required for the installation of a system. After installation, the authentication privileges are physically linked to that machine, and no administrator can bypass these controls. We define an administrative model and detail the requirements for an authentication protocol to be compatible with our methodology. Our design presents some challenges for large-scale systems, in addition to the benefit of reduced maintenance.
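
A generic challenge-response sketch conveys the flavor of machine-bound authentication. This is a simplification, not the paper's protocol; in the actual design the key is sealed in trusted hardware and cannot be extracted or replaced by any single administrator.

    import hmac, hashlib, os, secrets

    # Stand-in for a key sealed inside the machine's trusted hardware.
    MACHINE_KEY = secrets.token_bytes(32)

    def machine_sign(challenge: bytes) -> bytes:
        """Pretend this computation happens inside the secure chip."""
        return hmac.new(MACHINE_KEY, challenge, hashlib.sha256).digest()

    def server_verify(challenge: bytes, response: bytes, enrolled_key: bytes) -> bool:
        expected = hmac.new(enrolled_key, challenge, hashlib.sha256).digest()
        return hmac.compare_digest(expected, response)

    challenge = os.urandom(16)
    response = machine_sign(challenge)
    # Application-layer decision: grant access only if the request comes
    # from the enrolled machine and the user is otherwise authorized.
    print(server_verify(challenge, response, MACHINE_KEY))  # True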

Added 2009-07-03

Privacy-Preserving Filtering and Covering in Content-Based Publish Subscribe Systems

CERIAS TR 2009-15
Mohamed Nabeel, Ning Shang, Elisa Bertino
Download: PDF

Content-Based Publish-Subscribe (CBPS) is an asynchronous messaging paradigm that supports a highly dynamic and many-to-many communication pattern based on the content of the messages themselves. In general, a CBPS system has three distinct parties - Content Publishers, Content Brokers, and Subscribers - working in a highly decoupled fashion. The ability to seamlessly scale on demand has made CBPS systems the choice for distributing messages/documents produced by Content Publishers to many Subscribers through Content Brokers. Most current systems assume that Content Brokers are trusted for the confidentiality of the data published by Content Publishers and the privacy of the subscriptions, which specify their interests, made by Subscribers. However, with the increased use of technologies such as service-oriented architectures and cloud computing, which essentially outsource the broker functionality to third-party providers, one can no longer assume this trust relationship to hold. The problem of providing privacy/confidentiality in CBPS systems is challenging, since the solution should allow Content Brokers to make routing decisions based on the content without revealing the content to them. The problem may appear unsolvable since it involves conflicting goals, but in this paper we propose a novel approach that preserves the privacy of the subscriptions made by Subscribers and the confidentiality of the data published by Content Publishers using cryptographic techniques when third-party Content Brokers are utilized to make routing decisions based on the content. We analyze the security of our approach to show that it is indeed sound, and provide experimental results to show that it is practical.
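
A much simpler construction than the paper's hints at how a broker can match without seeing content: publishers and subscribers blind keywords under a shared key the broker never holds, and the broker routes on token equality alone. This sketch is an added illustration supporting only equality matching, far weaker than the scheme proposed in the paper.

    import hmac, hashlib

    SHARED_KEY = b"publisher-subscriber secret"  # never given to the broker

    def blind(keyword: str) -> bytes:
        return hmac.new(SHARED_KEY, keyword.encode(), hashlib.sha256).digest()

    # Subscriber registers a blinded subscription with the broker.
    subscription = {blind("cardiology")}

    # Publisher tags an (encrypted) message with blinded keywords.
    message_tags = {blind("cardiology"), blind("billing")}

    # Broker forwards on token equality without learning any keyword.
    print(bool(subscription & message_tags))  # True -> deliver to subscriber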

Added 2009-06-18

On the Tradeoff Between Privacy and Utility in Data Publishing

CERIAS TR 2009-17
Tiancheng Li; Ninghui Li
Download: PDF

In data publishing, anonymization techniques such as generalization and bucketization have been designed to provide privacy protection. At the same time, they reduce the utility of the data, so it is important to consider the tradeoff between privacy and utility. In a paper that appeared in KDD 2008, Brickell and Shmatikov proposed an evaluation methodology that compares the privacy gain with the utility loss resulting from anonymizing the data, and concluded that “even modest privacy gains require almost complete destruction of the data-mining utility”. This conclusion seems to undermine existing work on data anonymization. In this paper, we analyze the fundamental characteristics of privacy and utility, and show that it is inappropriate to directly compare privacy with utility. We then observe that the privacy-utility tradeoff in data publishing is similar to the risk-return tradeoff in financial investment, and propose an integrated framework for considering the privacy-utility tradeoff, borrowing concepts from Modern Portfolio Theory. Finally, we evaluate our methodology on the Adult dataset from the UCI machine learning repository. Our results clarify several common misconceptions about data utility and provide data publishers with useful guidelines on choosing the right tradeoff between privacy and utility.
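
The portfolio analogy can be sketched in a few lines: treat privacy gain as expected return and utility loss as risk, and let the publisher's preference pick a point on the frontier. The candidate scores below are invented for illustration; the paper derives such quantities from the data itself.

    # Candidate anonymizations with illustrative (privacy gain, utility loss)
    # scores in [0, 1].
    candidates = {
        "no generalization":     (0.05, 0.00),
        "mild generalization":   (0.40, 0.15),
        "strong generalization": (0.75, 0.45),
        "full suppression":      (1.00, 1.00),
    }

    def score(privacy_gain, utility_loss, preference=0.5):
        # Portfolio-style objective: weight privacy (return) against
        # utility loss (risk) by the publisher's preference.
        return preference * privacy_gain - (1 - preference) * utility_loss

    for preference in (0.3, 0.5, 0.8):
        best = max(candidates, key=lambda k: score(*candidates[k], preference))
        print(f"preference={preference}: choose {best}")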

Added 2009-06-12

Collaring the Cybercrook: An Investigator's View

David J. Icove
Added 2009-06-12

Modeling and Integrating Background Knowledge in Data Anonymization

CERIAS TR 2009-16
Tiancheng Li; Ninghui Li; Jian Zhang
Download: PDF

Recent work has shown the importance of considering the adversary’s background knowledge when reasoning about privacy in data publishing. However, it is very difficult for the data publisher to know exactly the adversary’s background knowledge. Existing work cannot satisfactorily model background knowledge and reason about privacy in the presence of such knowledge. This paper presents a general framework for modeling the adversary’s background knowledge using kernel estimation methods. This framework subsumes different types of knowledge (e.g., negative association rules) that can be mined from the data. Under this framework, we reason about privacy using Bayesian inference techniques and propose the skyline (B, t)-privacy model, which allows the data publisher to enforce privacy requirements to protect the data against adversaries with different levels of background knowledge. Through an extensive set of experiments, we show the effects of probabilistic background knowledge in data anonymization and the effectiveness of our approach in both privacy protection and utility preservation.
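
As a flavor of the kernel-estimation step (a generic Gaussian kernel density estimate added for illustration; the paper's estimator and privacy model are more involved), an adversary's belief about a numeric sensitive value can be smoothed from values observable in the published table:

    import math

    def gaussian_kernel(u):
        return math.exp(-0.5 * u * u) / math.sqrt(2 * math.pi)

    def kde(samples, x, bandwidth=1.0):
        """Estimate the density at x from the observed samples."""
        n = len(samples)
        return sum(gaussian_kernel((x - s) / bandwidth) for s in samples) / (n * bandwidth)

    # Illustrative: belief over a sensitive numeric attribute, estimated
    # from values mined from the published data itself.
    observed = [20, 22, 25, 40, 41, 43, 60]
    for x in (23, 42, 55):
        print(x, round(kde(observed, x, bandwidth=3.0), 4))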

Added 2009-06-12

Efficient and Selective Publishing of Hierarchically Structured Data

CERIAS TR 2009-13
Mohamed Nabeel, Elisa Bertino
Download: PDF

Hierarchical data models (e.g. XML, Oslo) are an ideal data exchange format to facilitate the ever-increasing data sharing needs among enterprises, organizations and general users. However, building efficient and scalable Event Driven Systems (EDS) for selectively disseminating such data remains largely an unsolved problem to date. In general, an EDS has three distinct parties - Content Publishers (pubs), Content Brokers (bs), and Subscribers (subs) - working in a highly decoupled Publish-Subscribe (PS) model. With a large subscriber base having different interests and many documents (docs), the deficiency in existing systems lies in the techniques used to distribute (match/filter and forward) content from pubs to subs through bs. Thus, we propose an efficient and scalable approach to selectively distribute different subtrees of possibly large documents, which have access control restrictions, to different subscribers U_i in subs by exploiting the hierarchical structure of those documents. A novelty of our approach is that we map subscription routing tables in bs to efficient tree data structures in order to perform matching and other commonly used operations efficiently. The brokers form a DAG consisting of multiple trees from pubs to subs. Along with our simple but adequate subscription language, our proposed approach combines policy-driven covering- and merging-based routing to dramatically reduce the load towards the root of the distribution trees, leading to a scalable system. The experimental results clearly reinforce our claims.
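
The covering idea mentioned above is simple to state: a broker need not forward a subscription upstream if an already-forwarded one matches everything it matches. The sketch below uses attribute-interval subscriptions, a deliberately simpler language than the one in the paper.

    def covers(s1, s2):
        """s1 covers s2 if every document matching s2 also matches s1."""
        # s1 must not constrain an attribute that s2 leaves free.
        if any(attr not in s2 for attr in s1):
            return False
        for attr, (lo2, hi2) in s2.items():
            lo1, hi1 = s1.get(attr, (float("-inf"), float("inf")))
            if not (lo1 <= lo2 and hi2 <= hi1):
                return False
        return True

    s1 = {"price": (0, 100)}
    s2 = {"price": (10, 50), "qty": (1, 5)}

    print(covers(s1, s2))  # True: s2 is strictly narrower, no need to forward it
    print(covers(s2, s1))  # False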

Added 2009-05-31