We will be posting more materials from the symposium as they become available.
Presentation Slides
Posters
Congratulations to the winners of the 6th Annual Information Security Symposium Poster Competition! We are proud to be associated with the following students whose posters best reflect the intellect and talent CERIAS fosters within its research and education programs.
First Place:
Abhilasha Bhargav-Spantzel and Anna C. Squicciarini: Integrating Identity Management and Trust Negotiation
http://www.cerias.purdue.edu/news_and_events/events/symposium/2005/materials/#F949EF
Second Place:
Michael Yang: ReAssure: Virtual Imaging Instrument for Logically Destructive Experiments
http://www.cerias.purdue.edu/news_and_events/events/symposium/2005/materials/#00B46C
Third Place:
Mohamed Shehab: SERAT : SEcure Role mApping Technique for Decentralized Secure Interoperability
http://www.cerias.purdue.edu/news_and_events/events/symposium/2005/materials/#12FE0B
Identification, Authentication and Privacy
Incident Detection, Response, and Investigation
Assurable Software and Architectures
Enclave and Network Security
Security Awareness, Education and Training
Cryptology and Rights Management
Risk Management, Policies and Laws
Trusted Social and Human Interactions
Adaptive Intrusion Response Using Attack Graphs in an E-Commerce Environment
Yu-Sung Wu;
Area: Assurable Software and Architectures
Distributed systems with multiple interacting services,
such as distributed e-commerce systems, are suitable targets for malicious
attacks because of the potential financial impact. Intrusion detection
in such systems has been an active area of research, while the problem
of automated response has received relatively less attention. The common assumption is that a system administrator will be brought into the loop for troubleshooting once an alert about a possible intrusion has been raised.
In this paper, we present the design of automated response mechanisms
in an intrusion tolerant system called ADEPTS. The particular type of
response we focus on enforces containment in the system, localizing the effect of the intrusion and thus allowing the system to continue providing service, albeit degraded. Containment can be very important in a large
class of distributed systems in which a single compromised service can
affect other services through their mutual interactions. ADEPTS uses a graph of intrusion goals, called the I-GRAPH, as its underlying representation. In response to alerts from an intrusion detection framework,
ADEPTS executes an algorithm to determine the possible path of spread
of the intrusion and the appropriate response to deploy. A feedback mechanism
evaluates the success of a deployed response and uses that in guiding
future choices. ADEPTS is demonstrated on a distributed e-commerce system
and evaluated using a survivability metric whose value depends on the services that remain operational in the face of an intrusion.
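To make the containment idea concrete, here is a minimal sketch of how a graph of intrusion goals might drive response selection with feedback. The class name, node labels, scores, and the update rule are illustrative assumptions, not ADEPTS's actual design.

```python
# Minimal sketch of graph-driven response selection (illustrative only;
# node names, scores, and the feedback rule are assumptions, not ADEPTS's API).
from collections import defaultdict

class IGraph:
    def __init__(self):
        self.edges = defaultdict(list)                 # goal -> downstream goals
        self.responses = defaultdict(list)             # goal -> candidate responses
        self.effectiveness = defaultdict(lambda: 0.5)  # response -> learned score

    def add_edge(self, src, dst):
        self.edges[src].append(dst)

    def spread(self, alert_goal):
        """Predict the possible path of spread from the goal flagged by an alert."""
        seen, stack = set(), [alert_goal]
        while stack:
            g = stack.pop()
            if g not in seen:
                seen.add(g)
                stack.extend(self.edges[g])
        return seen

    def choose_responses(self, alert_goal):
        """Deploy the historically most effective response on each threatened goal."""
        chosen = {}
        for g in self.spread(alert_goal):
            if self.responses[g]:
                chosen[g] = max(self.responses[g], key=lambda r: self.effectiveness[r])
        return chosen

    def feedback(self, response, succeeded):
        """Nudge the effectiveness index up or down after observing the outcome."""
        delta = 0.1 if succeeded else -0.1
        self.effectiveness[response] = min(1.0, max(0.0, self.effectiveness[response] + delta))

g = IGraph()
g.add_edge("web_server_compromised", "db_credentials_stolen")
g.add_edge("db_credentials_stolen", "customer_data_exfiltrated")
g.responses["db_credentials_stolen"] = ["rotate_db_credentials", "block_db_port"]
print(g.choose_responses("web_server_compromised"))
g.feedback("rotate_db_credentials", succeeded=True)
```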
Convenience and Security in the Safely Contained Experimentation Facility, ReAssure
Pascal Meunier (PI); Michael Yang;
Area: Assurable Software and Architectures
ReAssure is a work in progress to create a virtual
machine-based reconfigurable facility for efficient, reproducible, controlled
and safely contained experimentation. Convenience and security in complex systems are traditionally seen as being at odds; what follows is a description of the challenges we faced and the solutions we adopted. The ReAssure human interface
is a web-based application, with a separate utility for configuring networks,
as well as access to experimental PCs. This poster describes how and why
the data provided by the convenient but untrusted graphical network configuration
tool can be trusted.
ReAssure: Virtual Imaging Instrument for Logically
Destructive Experiments
Pascal Meunier (PI); Michael Yang;
Area: Assurable Software and Architectures
Role-Based Access Control for Group Communication
Systems
Jacques Thomas;
Area: Assurable Software and Architectures
In addition to basic security services such as confidentiality,
integrity and data source authentication, a secure group communication
system should also provide authentication of participants and access control
to group resources. While considerable research has been conducted on
providing confidentiality and integrity for group communication, less work has focused on group access control services. In the context of group
communication, specifying and enforcing access control becomes more challenging
because of the dynamic and distributed nature of groups and fault tolerance requirements (i.e., withstanding process faults and network partitions).
In this work we analyze the requirements access
control mechanisms must fulfill in the context of group communication
and define a framework for supporting fine-grained access control in client-server
group communication systems. Our framework combines role-based access
control mechanisms with environment parameters (time, IP address, etc.)
to provide policy support for a wide range of applications with very different
requirements. While policy is defined by the application, its efficient
enforcement is provided by the group communication system.
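As a rough illustration of combining roles with environment parameters, the sketch below checks a (role, action) pair against a time window and client network. The policy structure and all names are assumptions for illustration, not the framework's actual syntax.

```python
# Illustrative role check combined with environment parameters (time window,
# client IP); the policy format is an assumption, not the framework's syntax.
from datetime import time
import ipaddress

POLICY = {
    # (role, action) -> environment constraints
    ("member", "send_message"): {
        "hours": (time(8, 0), time(18, 0)),
        "network": ipaddress.ip_network("192.168.0.0/16"),
    },
}

def allowed(role, action, now, client_ip):
    rule = POLICY.get((role, action))
    if rule is None:
        return False                           # no matching policy: deny
    start, end = rule["hours"]
    in_hours = start <= now <= end
    in_network = ipaddress.ip_address(client_ip) in rule["network"]
    return in_hours and in_network

print(allowed("member", "send_message", time(9, 30), "192.168.1.7"))   # True
print(allowed("member", "send_message", time(22, 0), "192.168.1.7"))   # False
```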
Security Based Testing of Access Control Systems
Ammar Masood;
Area: Assurable Software and Architectures
Software testing is essential to verify implementation
conformance to the desired functional objectives. The high cost and extremely time-consuming nature of software testing is a major hurdle to increasing its effectiveness; it is therefore important to establish a structured process for generating tests. One important aspect of software testing
is verifying that the software meets its specified security objectives,
which is classified as security testing. The discovery of dangerous flaws
in many security products signifies the importance of security testing.
Using software specifications for test generation has several advantages
compared to code- or structure-based test generation. We present a model-based testing strategy for the security testing of access control systems. Our model-based approach is formally guided by the system constraints in generating structural and behavioral models of the system under test.
The generated models are then used to derive test suites for performing
the security testing.
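A hedged sketch of the general model-based idea: derive a test suite from a behavioral model by covering every transition of a small access-control state machine. The states and inputs below are invented for illustration; the paper's actual models and constraint guidance are richer.

```python
# Sketch of model-based test generation: enumerate input sequences that
# cover every transition of a small access-control state machine.
from collections import deque

TRANSITIONS = {
    ("logged_out", "login_ok"): "logged_in",
    ("logged_out", "login_bad"): "logged_out",
    ("logged_in", "request_admin"): "denied",   # plain users must be denied
    ("logged_in", "logout"): "logged_out",
}

def transition_cover(start):
    """Breadth-first search for one input sequence reaching each transition."""
    tests, frontier, seen = [], deque([(start, [])]), set()
    while frontier:
        state, path = frontier.popleft()
        for (src, inp), dst in TRANSITIONS.items():
            if src == state and (src, inp) not in seen:
                seen.add((src, inp))
                tests.append(path + [inp])
                frontier.append((dst, path + [inp]))
    return tests

for t in transition_cover("logged_out"):
    print(t)
```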
The Poly^2 Project
David Ehrmann; Gregory Ose; Michael Armbrust; Jay Gengelbach;
Keith Watson (PI);
Area: Assurable Software and Architectures
A significant challenge faced by many organizations
today is the enforcement and evaluation of trust in complex networks built
using a variety of commercial off-the-shelf and freeware components, and
running on one or more general purpose operating systems. We will address
this problem by simplifying, modularizing, and separating functionality
to the extent that Poly^2 components have the minimum functionality needed
for a particular network task, and interactions between components are
confined to known, defendable contexts. The Poly^2 research project advances
the understanding of building secure and reliable system architectures
to support critical services in hostile network environments. A secure
and reliable system architecture must only provide the required services
to authorized users in time to be effective. The Poly^2 architecture we
propose will be based on sound, widely acknowledged security design principles.
It will form the basis for providing present and future network services
while, at the same time, being highly robust and resistant to attack.
A prototype of the new architecture will be developed that will provide
traditional network services (e.g., web, FTP, email, DNS) using commodity
hardware and an open source operating system. All along, security analyses
will be conducted to see if there are flaws in the design or implementation
and whether those flaws are the result of conflicting requirements or
design objectives. The resulting implementation will serve as a secure
and highly available platform from which organizations can deploy their
own critical network services. Poly^2 research will provide the community
with a better understanding of the application of sound security design
principles, the decomposition of COTS software components to increase
trust, separation of data based on design and policy, and the isolation
of services to decrease commonality and contain failures and attacks.
Further, it can provide a springboard for additional research in areas
such as intrusion detection, security evaluation, and performance metrics.
Uncheatable Master-Worker Grid Computing and Its
Application in Drug Discovery
Mummoorthy Murugesan;
Area: Assurable Software and Architectures
The drug discovery process has long been acknowledged as an expensive and computationally intensive procedure. The process starts with identifying a protein that causes a disease. Subsequently, millions of molecules are tested to find lead molecules that will bind with the protein to modulate the disease. Instead of laborious laboratory experiments, which normally take years to analyze all the lead molecules, molecular modeling is used to simulate the same computations in a few hours. Researchers have investigated methods to accelerate this matching process (also known as docking), the most recent of which involves PC grid computing.
However, in grid computing, since the supervisor
does not have any control over the participants' machines and cannot prevent
them from manipulating the programs provided, participants may claim to have performed the necessary computations without actually having done so. When participants
are paid for their contributions, they have strong incentives to cheat
to maximize their gains. This cheating behavior, if undetected, may render
the results useless. Because drug discovery is a highly expensive procedure, cheating may have severe consequences for the whole process.
This presentation discusses how to detect
cheating in grid computing for drug discovery projects. We use the Commitment-Based Sampling (CBS) scheme to detect whether a participant is cheating. We
show that our scheme is efficient in terms of communication cost and extra
computational effort required. We also present an experimental study on
the prototype implementation of our scheme on two molecular docking programs:
AutoDock and FTDock. We show that the CBS scheme adds less than 7% computational overhead to the actual task.
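The following sketch illustrates the general shape of commitment-based sampling: the worker commits to all of its results with a Merkle root, and the supervisor spot-checks a few randomly sampled inputs against that commitment. The hash layout and the stand-in computation are assumptions for illustration; the paper's exact construction may differ.

```python
# Hedged sketch of commitment-based sampling: commit to all results with a
# Merkle root, then spot-check random samples. Layout is illustrative only.
import hashlib, random

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_tree(leaves):
    levels = [[h(x) for x in leaves]]
    while len(levels[-1]) > 1:
        lvl = levels[-1]
        if len(lvl) % 2:
            lvl = lvl + [lvl[-1]]          # duplicate last node on odd levels
        levels.append([h(lvl[i] + lvl[i + 1]) for i in range(0, len(lvl), 2)])
    return levels

def auth_path(levels, idx):
    path = []
    for lvl in levels[:-1]:
        if len(lvl) % 2:
            lvl = lvl + [lvl[-1]]
        path.append(lvl[idx ^ 1])          # sibling at each level
        idx //= 2
    return path

def verify(leaf, idx, path, root):
    node = h(leaf)
    for sib in path:
        node = h(node + sib) if idx % 2 == 0 else h(sib + node)
        idx //= 2
    return node == root

# Worker: compute f(x) for every input and commit to the Merkle root.
f = lambda x: (x * x) % 97                 # stand-in for a docking score
inputs = list(range(8))
results = [f"{x}:{f(x)}".encode() for x in inputs]
levels = merkle_tree(results)
root = levels[-1][0]

# Supervisor: sample a few indices and demand result + authentication path.
for i in random.sample(range(len(inputs)), 3):
    assert verify(results[i], i, auth_path(levels, i), root)
print("samples verified against committed root")
```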
Updating XML Documents in Distributed and Cooperative
Systems
Yunhua Koglin;
Area: Assurable Software and Architectures
Securing data is becoming a crucial need for most
internet-based applications. Whereas the problem of data confidentiality
has been widely investigated, the problem of how to ensure that data,
when moving among different parties, are modified only according to the stated policies has so far not been deeply investigated. In this paper,
we propose an approach supporting parallel and distributed secure updates
to XML documents. The approach, based on the use of a security region-object
parallel flow (S-RPF) graph protocol, is particularly suited for all environments
requiring cooperative updates to XML documents. It allows different users
to simultaneously update different portions of the same document, according
to the specified access control policies. Additionally, it supports a
decentralized management of update operations in that a subject can exercise
its privileges and verify the correctness of the operations performed
so far on the document without interacting, in most of the cases, with
the document server.
Ensuring Correctness over Untrusted Private Databases
Sarvjeet Singh (PI);
Area: Cryptology and Rights Management
Protecting the privacy of data is becoming increasingly
important as is evidenced by recent interest in this area, especially
for databases. We address the problem of ensuring the correctness of query
results returned by an untrusted private database. The database owns the
data and may modify it at any time. The querier is allowed to execute
queries over this database; however it may not learn anything more than
the result of these legal queries about the contents of the database.
The querier does not necessarily trust the database and would like the
owner to furnish proof that the data has not been modified in response
to recent events such as the submission of the query. There are many practical
scenarios that lead to this problem including the need for private databases
to participate in collaboration with semi-trusted partners or to comply
with legal requirements. This general problem has not been addressed earlier; only special cases that trust the data owner and do not protect the privacy of the data have been. Our work is directly applicable to these special cases too.
We have developed two metrics that capture
the correctness of query answers and proposed a number of solutions to
this problem that provide a trade-off between the degree of exposure of
private data in order to prove correctness, and the cost of generating
these proofs and executing the verification. Our final solution is able
to provide minimal exposure of private data while ensuring a low overhead.
Our proposed solutions have been tested through an implementation using PostgreSQL and real data. The results show that our techniques are easy to implement and the overheads are acceptable.
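One classic way to prove the completeness of a range query over an untrusted store, sketched below, is for the owner to sign adjacent pairs of sorted values so that any omitted tuple breaks the chain. This illustrates the general idea only, not necessarily the authors' scheme, and note that it exposes the boundary values, exactly the kind of leakage the paper's solutions aim to minimize; HMAC stands in for a real public-key signature.

```python
# Illustrative completeness check for range queries over an untrusted store.
import hmac, hashlib

OWNER_KEY = b"demo-owner-key"    # stand-in for the owner's signing key

def sign_pair(lo, hi):
    # A real scheme would use a public-key signature here.
    return hmac.new(OWNER_KEY, f"{lo}|{hi}".encode(), hashlib.sha256).digest()

# Owner: sort the column and sign every adjacent pair, with sentinels at the ends.
data = sorted([5, 12, 19, 23, 40])
chain = [(-float("inf"), data[0])] + list(zip(data, data[1:])) + [(data[-1], float("inf"))]
signatures = {pair: sign_pair(*pair) for pair in chain}

def answer_range(a, b):
    """Server: tuples in [a, b] plus the signed chain links spanning them."""
    hits = [x for x in data if a <= x <= b]
    links = [p for p in chain
             if a <= p[0] <= b or a <= p[1] <= b or (p[0] < a and p[1] > b)]
    return hits, [(p, signatures[p]) for p in links]

def verify_range(a, b, hits, proof):
    """Querier: signatures valid, chain contiguous and covering, no omissions."""
    if any(not hmac.compare_digest(sig, sign_pair(*p)) for p, sig in proof):
        return False
    pairs = sorted((p for p, _ in proof), key=lambda p: p[0])
    contiguous = all(pairs[i][1] == pairs[i + 1][0] for i in range(len(pairs) - 1))
    covers = pairs[0][0] < a and pairs[-1][1] > b
    inside = sorted({x for p in pairs for x in p if a <= x <= b})
    return contiguous and covers and inside == sorted(hits)

hits, proof = answer_range(10, 25)
print(hits, verify_range(10, 25, hits, proof))    # [12, 19, 23] True
```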
Evaluation Methodologies for Internet Security
Technology (EMIST)
Roman Chertov; Sonia Fahmy (PI);
Area: Enclave and Network Security
Evaluation Methodologies for Internet Security Technology (EMIST) is a multi-institution project to develop rigorous testing methodologies, tools, and benchmarks for Internet security attacks and defenses. EMIST is a companion project to the Defense Technology Experimental Research Network (DETER): a remotely accessible experimental testbed that allows researchers to evaluate Internet cyber-security technologies in a realistic, but quarantined, environment. The team from Purdue has developed innovative models to characterize the impact of attacks on congestion control (TCP) and routing (BGP/OSPF) protocols on Internet infrastructure and hosts, both spatially and temporally. We have also identified a number of key limitations of analytical, simulation, and emulation models.
Global Purification of Internet Streams
Bhagyalaxmi Bethala;
Area: Enclave and Network Security
Scalable Infrastructure Protection and Performance
Evaluation in Power-law Networks
Hyojeong Kim;
Area: Enclave and Network Security
WormHole: A Gigabit Worm Filter Using Network
Processor Technology
Ikkyun Kim;
Area: Enclave and Network Security
We have made initial steps to address the fast worm
filtering problem, designing a fast filtering architecture and implementing
a prototype system on the IXP1200 network processor platform that handles 30 recent worms including Blaster, CodeRedI & II, Li0n, Nimda, Ramen, Sadmind, Sasser, Slammer, Slapper, and Welchia. The worm filter is able to filter these worms and their mutations (known and artificially generated) at near gigabit line speed (>965 Mbps). We summarize what we have been able to accomplish and what challenges remain to achieve 10 Gbps and faster worm filtering. The filter attains gigabit line speed because of its worm-specificity. By this we mean: because worms implementing buffer overflow exploits do not generate the vast space of polymorphic variations that viruses can (i.e., variations that cannot be dealt with by length invariance), we are able to utilize design techniques that perform filtering in O(1) time.
General purpose packet classifiers cannot make full use of this specialization,
which limits their worm filtering performance. Our worm filter architecture
realizes three principal features: (f1) workload sensitivity that puts
frequent traffic first, (f2) Bloom filter hashing that rapidly identifies
non-worm traffic for immediate pass-through, and (f3) multi-level caching
that appropriates differentiated processing speed while allowing run-time
programmability. Features (f1) and (f2) embody Amdahl's law: Bloom filtering provides an effective means of identifying the "important" traffic that dominates overall filtering performance. A beneficial consequence is that normal traffic is passed through with minimal footprint.
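Feature (f2) can be illustrated with a tiny Bloom filter: a miss proves a packet matches no known worm signature and can be passed through immediately, while a hit routes it to slower exact matching. The parameters below are arbitrary choices for the sketch, not WormHole's tuning.

```python
# Illustrative Bloom filter for feature (f2): rapidly recognizing non-worm
# traffic for pass-through. Parameters are arbitrary for this sketch.
import hashlib

M, K = 1024, 3                                    # bit-array size, hash count

def _hashes(item: bytes):
    for i in range(K):
        yield int.from_bytes(hashlib.sha256(bytes([i]) + item).digest()[:4], "big") % M

class BloomFilter:
    def __init__(self):
        self.bits = 0

    def add(self, item: bytes):
        for pos in _hashes(item):
            self.bits |= 1 << pos

    def might_contain(self, item: bytes) -> bool:
        return all(self.bits >> pos & 1 for pos in _hashes(item))

known_worms = BloomFilter()
for sig in [b"slammer-payload", b"blaster-payload"]:
    known_worms.add(sig)

packet = b"ordinary web request"
if not known_worms.might_contain(packet):
    pass_through = True    # definitely not a known worm: forward in O(1)
else:
    pass_through = False   # possible match: send to the slower exact filter
print(pass_through)
```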
A GTRBAC Based System for Workflow Composition
and Management
Basit Shafiq;
Area: Identification, Authentication
and Privacy
We present an architecture for an adaptive real-time workflow-based collaborative system. Such a system is needed to support communication and sharing of information among predefined or ad hoc teams of users collaborating with each other to execute their respective tasks in the workflow. A key requirement for a real-time workflow system is to provide the right data to the right person at the right time. In
addition, the workflow needs to be reconfigured if a subtask of a workflow
cannot be executed within the due time. We use the generalized temporal
role-based access control (GTRBAC) model to capture the real-time dependencies
of such workflow applications. In addition, support for triggers in GTRBAC
allows dynamic adaptation of workflow based on the occurrence of certain
events. Such adaptations may include rescheduling of workflow tasks, reassignment
of users to scheduled tasks based on their availability and skill level,
and abortion of incomplete tasks.
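A small sketch of the two GTRBAC ideas used here: roles enabled only within time intervals, and a trigger that reassigns a task when its deadline is missed. The role names, intervals, and trigger rule are invented for illustration.

```python
# Sketch of temporal role enabling plus a deadline trigger; names invented.
from datetime import datetime

ROLE_INTERVALS = {
    "triage_nurse": [(datetime(2005, 3, 22, 8), datetime(2005, 3, 22, 20))],
}

def role_enabled(role, now):
    return any(start <= now <= end for start, end in ROLE_INTERVALS.get(role, []))

def on_deadline_missed(task, workflow):
    """Trigger: abort the late task and reschedule it for any enabled role."""
    task["status"] = "aborted"
    for role in workflow["fallback_roles"]:
        if role_enabled(role, task["deadline"]):
            workflow["queue"].append({"name": task["name"], "assigned": role})
            break

workflow = {"fallback_roles": ["triage_nurse"], "queue": []}
task = {"name": "review_chart", "deadline": datetime(2005, 3, 22, 9), "status": "late"}
on_deadline_missed(task, workflow)
print(task["status"], workflow["queue"])
```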
An Analysis of Privacy and Security Information
Provided and Elicited by Different Types of Web Sites
M Athar Ali; Robert Proctor (PI);
Area: Identification, Authentication
and Privacy
Issues involving consumer privacy and the privacy
policies of organizations are of concern for Web transactions that require
users to provide personal information. Although several studies have examined
users' reported concerns and preferences with respect to privacy issues,
no systematic investigation has been conducted regarding the personal
information requested by Web sites and the ways in which the sites incorporate
privacy and security features to alleviate users' concerns. We report
the results of an investigation of 42 Web sites, six each from seven different
categories. The results show large differences in the amount and types
of personal information requested, both between and within the different
site categories. This variability may cause users to hesitate to provide
information to a site that deviates from their prior experience with similar
sites. Furthermore, although privacy certification logos are viewed favorably
by users, many sites do not display them.
Biometric Feasibility Study: Hand Geometry at
the Recreational Sports Center
Eric Kukula;
Area: Identification, Authentication
and Privacy
Technology is impacting most components of our lives, and will continue to do so with the continuing development and passage of information security and privacy protection laws. An emerging field known as biometrics ties privacy, security, and audit control functionality to an individual's physical and behavioral characteristics for use in applications such as time and attendance, access control, e-commerce, and identification systems.
This study explores the feasibility and performance
related to implementing a hand geometry system for access and audit control
purposes at the Purdue University Recreational Sports Center. This system
will replace the existing system of magnetic stripe identification cards
to enter the facility. The current system requires a user to swipe a student
identification card through a reader to access the recreational center.
Since the access decision is based solely on a token, which can be stolen, lost, handed to others, or copied, accurate auditing is not possible, causing a twofold problem: (1) an insurance risk, as accurate records of who is in the facility are not available, and (2) a loss of revenue for the recreational center, as multiple people could use the same identification card. Hand geometry removes the token and replaces it with something you are, your hand, eliminating the possibility of lost or stolen tokens being used to access the recreational center.
This evaluation discusses issues in integrating the hand geometry system into the current infrastructure; user interactions, including the habituation curve; user perceptions of the device; and the performance results, including the failure-to-enroll rate, failure-to-acquire rate, false match rate, and false non-match rate, as well as an analysis of interclass relationships using collected demographic information including gender, ethnicity, and handedness.
Digital Identity Management domain in Ontological
Semantics
Evguenia Malaia (PI);
Area: Identification, Authentication
and Privacy
This paper focuses on ontological efforts to support
information security applications - more specifically, applied natural
language processing technology - in the domain of Digital Identity Management
(DIM).
Because digital identities have such varied
uses and meanings, and because the social implications of digital identity management policies reach far into the domains of free speech, privacy, and accountability online, it is necessary to develop a universal vocabulary
for digital identity framework development and policy language. We propose
the framework of ontological semantic processing and text meaning representation
through the ontology as a possible solution for this problem. The existing
ontology (about 49,000 concepts, including 400 pertaining to information
security) would need to be supplemented by ~500-700 more concepts and
~1,500 lexical items to allow adequate support for the domain of digital
identity management. The ontological semantic approach to natural language
processing as such consists of using both language-independent static
knowledge sources (the ontology, fact database) and static language-dependent
sources (lexicons, onomasticons, morphological processor, syntactic processor,
etc.), as well as dynamic processing algorithms. In summary, the dynamic
algorithms (which include the tokenizer, syntactic, ecological and morphological
analyzers) process the text and match the lexical items in the text with
appropriate ontological concepts through the lexicon-ontology connections,
the output being text meaning representation, or TMR. As a language-independent
semantic "concept web", it can then be used (with appropriate follow-up
processing) for machine translation, data mining, information extraction,
question answering, text summarization, etc. The present paper deals with
the following methodological questions in domain acquisition for two of
the static knowledge sources, the ontology and the lexicon:
- Delimitation of the expanding DIM textual corpus with volatile vocabulary;
- Extraction of lexical items pertaining to the domain;
- Building ontological support for lexical items; introduction of necessary attributes and relations;
- Semantic representation of lexical items in the domain.
Integrating Identity Management and Trust Negotiation
Abhilasha Bhargav-Spantzel (PI);
Area: Identification, Authentication
and Privacy
Ontological Semantics Support for Handling Privacy
Policies
Olya Krachina;
Area: Identification, Authentication
and Privacy
The CyberTrust Project aims to provide a more robust system for handling privacy policies than currently available solutions.
One of the tasks is to convert user-defined
natural language into machine-readable language necessary for further
implementation. NLP presents itself as a powerful processing tool, given
the fact that privacy policies are formulated in natural language initially.
Furthermore, Ontological Semantics successfully accommodates the aforementioned need. This poster demonstrates fundamental features of the framework through
topic-specific examples.
Privacy-Preserving Distributed k-Anonymity
Wei Jiang (PI);
Area: Identification, Authentication
and Privacy
k-anonymity provides a measure of privacy protection by preventing the re-identification of data to groups of fewer than k data items. While algorithms exist for producing k-anonymous data, the model
has been that of a single source wanting to publish data. This paper presents
a k-anonymity protocol when the data is vertically partitioned between
sites. A key contribution is a proof that the protocol preserves k-anonymity
between the sites: While one site may have individually identifiable data,
it learns nothing that violates k-anonymity with respect to the data at
the other site. This is a fundamentally different distributed privacy
definition than that of Secure Multiparty Computation, and it provides
a better match with both ethical and legal views of privacy.
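For concreteness, the check below verifies that a table is k-anonymous with respect to a set of quasi-identifiers: every combination of quasi-identifier values must occur at least k times. The toy records are invented.

```python
# Minimal k-anonymity check over quasi-identifier columns.
from collections import Counter

def is_k_anonymous(rows, quasi_identifiers, k):
    counts = Counter(tuple(row[q] for q in quasi_identifiers) for row in rows)
    return all(c >= k for c in counts.values())

rows = [
    {"zip": "479**", "age": "20-29", "diagnosis": "flu"},
    {"zip": "479**", "age": "20-29", "diagnosis": "asthma"},
    {"zip": "479**", "age": "30-39", "diagnosis": "flu"},
]
print(is_k_anonymous(rows, ["zip", "age"], 2))    # False: the 30-39 group has 1 row
```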
Protecting Consumer Privacy in Reusable Digital
Devices
Brad Moseng (PI);
Area: Identification, Authentication
and Privacy
The increasing use of disposable digital devices
is leading to expanding vulnerabilities in personal privacy. In an effort
to bring digital devices to the mass populace, manufacturers have started
to market single-use, recyclable digital devices. One such device, the
Dakota single-use digital camera, is designed to compete in the disposable
photography market. However, with reusable digital devices, there is often
sensitive residual data left behind from previous users. The purpose of
this research is to determine if Pure Digital, the makers of the Dakota
camera, are providing enough data security in their device recycling process.
Purpose Based Access Control for Privacy Protection
Ji-Won Byun;
Area: Identification, Authentication
and Privacy
In this project, we investigate a comprehensive approach
for privacy preserving access control based on the notion of purpose.
Purpose information associated with a given data element specifies the
intended use of the data element, and our model allows multiple purposes
to be associated with each data element. A key feature of our model is
that it also supports explicit prohibitions, thus allowing privacy officers
to specify that some data should not be used for certain purposes. Another
important issue is the granularity of data labeling, that is, the units
of data with which purposes can be associated. We address this issue in
the context of relational databases and propose four different labeling
schemes, each providing a different granularity. We also explore an approach
to representing purpose information, which results in very low storage
overhead, and we exploit query modification techniques to support data
access control based on purpose information.
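The query modification idea can be sketched as an extra predicate over purpose labels: a query tagged with an access purpose sees only tuples whose intended purposes include it and whose prohibitions do not. Labels are per-tuple here for simplicity; the record layout and labeling granularity are assumptions (the project also studies finer granularities).

```python
# Sketch of purpose-based query modification; labels per tuple for simplicity.
records = [
    {"email": "a@example.com", "allowed": {"billing", "shipping"}, "prohibited": set()},
    {"email": "b@example.com", "allowed": {"billing"}, "prohibited": {"marketing"}},
]

def query_with_purpose(records, purpose):
    """The 'modified query': an extra predicate over the purpose labels."""
    return [r["email"] for r in records
            if purpose in r["allowed"] and purpose not in r["prohibited"]]

print(query_with_purpose(records, "billing"))     # both tuples
print(query_with_purpose(records, "marketing"))   # neither tuple
```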
Querying Private Data in Moving-Object Environments
Reynold Cheng;
Area: Identification, Authentication
and Privacy
Location-based services, such as finding the nearest
gas station, require users to supply their location information. However,
a user's location can be tracked without her consent or knowledge. Lowering
the spatial and temporal resolution of location data sent to the server
has been proposed as a solution. Although this technique is
effective in protecting privacy, it may be
overkill and the quality of desired services can be severely affected.
In this paper, we investigate the relationship between uncertainty, privacy,
and quality of services. We propose using imprecise queries to hide the
location of the query issuer and evaluate uncertain information. We also
suggest a framework where uncertainty can be controlled to provide high
quality and privacy-preserving services. We study how the idea can be
applied to a moving range query over moving objects. We further investigate how the proposed solution can be protected against trajectory-tracing, which could link a user's successive locations.
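A minimal sketch of the resolution-lowering idea: replace an exact position with its grid cell and answer a range query with definite and possible results, making the privacy/quality trade-off visible. The cell size and data are invented.

```python
# Sketch of spatial cloaking and an imprecise range query over cloaked boxes.
CELL = 100.0    # metres; coarser cells mean more privacy, less precision

def cloak(x, y):
    """Replace an exact location with the bounding box of its grid cell."""
    cx, cy = int(x // CELL), int(y // CELL)
    return (cx * CELL, cy * CELL, (cx + 1) * CELL, (cy + 1) * CELL)

def range_query(boxes, qx1, qy1, qx2, qy2):
    definite, possible = [], []
    for name, (x1, y1, x2, y2) in boxes.items():
        overlaps = x1 < qx2 and qx1 < x2 and y1 < qy2 and qy1 < y2
        inside = qx1 <= x1 and x2 <= qx2 and qy1 <= y1 and y2 <= qy2
        if inside:
            definite.append(name)      # whole cell is within the range
        elif overlaps:
            possible.append(name)      # uncertain: cell straddles the range
    return definite, possible

boxes = {"alice": cloak(130.0, 40.0), "bob": cloak(520.0, 380.0)}
print(range_query(boxes, 0, 0, 300, 300))         # (['alice'], [])
```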
Securing the Manufacturing Environment using Biometrics
Stephen Elliott (PI); Shimon Modi;
Area: Identification, Authentication
and Privacy
Computer integrated manufacturing systems have changed the interaction of industrial manufacturing equipment with different systems
within and outside the manufacturing environment. The increase in the
sophistication of the manufacturing equipment, along with increased connectivity
with internal and external systems has changed the way that manufacturing
security is designed. As manufacturers move towards a more connected collaborative
environment in order to compete in global businesses and geographically
disparate facilities, concerns that their proprietary manufacturing processes
and intellectual property could be exposed to damaging compromise on a
worldwide scale are increasing. The US government has also passed several
regulations so that companies take into account general concerns like
physical and logical security. The Sarbanes-Oxley Act of 2002 and FDA's
21 CFR Part 11 are two such regulations which require that companies have
specific controls to ensure authenticity, integrity and auditability of
electronic records. As part of the compliance guidelines, biometrics is
indicated as a preferred means of security. The general problem that the
manufacturing environment is facing is that operation of most industrial
manufacturing equipment does not require any strong form of authentication
or identification when some transaction related to product manufacturing
takes place. Most manufacturing systems require a password to log onto
the system, after which the system is open to anyone on the manufacturing
floor to operate. The manufacturing systems are sophisticated enough to
provide remote operation capability, but the only form of authentication
is a password. There are no means to ascertain who was operating the machine
and whether they were authorized to do so. In the event of a malfunction or
accident, the audit trail does not provide adequate information. Biometrics
can solidify the authority checks and operator entry checks since the
authentication is no longer based only on passwords or security cards/tokens.
The main aim of this project is to demonstrate the integration of several
biometric technologies into the manufacturing environment. This project
proposes a unique application of biometrics and computer integrated technology
as part of providing an applied solution for the problems of security
and auditability in the manufacturing environment.
Security and Privacy in Healthcare Environments
Bharat Bhargava; Leszek Lilien; Yuhui Zhong
Area: Identification, Authentication
and Privacy
The objectives of our work on security and
privacy in healthcare environments include assuring security, privacy,
and safety for patients and staff, as well as processes and facilities
in healthcare institutions. Our work goes in three major directions. First,
we are concerned with vulnerabilities due to malicious behavior, hostile
settings, terrorist attacks, natural disasters, and tampering with data.
Second, we study how reliability, security, and privacy issues affect
timeliness and precision of patient information. Third, we investigate
methods and techniques to assure secure, private, trustworthy, reliable,
consistent, correct and pseudonymous collaboration over networks among
healthcare professionals.
SERAT : SEcure Role mApping Technique for Decentralized Secure Interoperability
Mohamed Shehab (PI);
Area: Identification, Authentication
and Privacy
Multi-domain application environments where distributed domains interoperate with each other are becoming a reality in internet-based and web-services-based enterprise applications. Secure interoperation in a multidomain environment is a challenging problem. In this paper, we propose a distributed secure interoperability protocol that ensures secure interoperation of multiple collaborating domains without compromising the security of any collaborating domain. We introduce the idea of access paths and access path constraints. Furthermore, we devise a path discovery algorithm that is capable of querying interoperating domains for the set of secure access paths between different domains.
Translation-based Steganography
Christian Grothoff; Krista Bennett;
Area: Identification, Authentication
and Privacy
We are investigating the possibilities of steganographically
embedding information in the "noise" created by automatic translation
of natural language documents. An automated natural language translation
system is ideal for steganographic applications, since natural language
translation leaves plenty of room for variation. Also, because there are
frequent errors in legitimate automatic text translations, additional
errors inserted by an information hiding mechanism are plausibly undetectable
and would appear to be part of the normal noise associated
with translation. Significantly, it should be extremely difficult for an adversary to determine whether inaccuracies in the translation are caused by the use of steganography or by imperfections and deficiencies of the translation software.
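A toy version of the embedding idea: when several translations of a sentence are plausible, the choice among them carries message bits. The variant lists here are invented; a real system would generate them with multiple translation engines.

```python
# Toy translation-based embedding: the choice among plausible translations
# of each sentence encodes message bits. Variant lists are invented.
VARIANTS = [
    ["The meeting starts at noon.", "The meeting begins at noon."],         # 1 bit
    ["He walked to the old bridge.", "He went to the old bridge.",
     "He walked toward the old bridge.", "He went toward the old bridge."], # 2 bits
]

def embed(bits):
    text, i = [], 0
    for variants in VARIANTS:
        width = len(variants).bit_length() - 1      # bits this sentence can hold
        idx = int(bits[i:i + width] or "0", 2)
        text.append(variants[idx])
        i += width
    return " ".join(text)

def extract(text):
    bits, sentences = "", text.split(". ")
    for variants, sentence in zip(VARIANTS, [s.rstrip(".") + "." for s in sentences]):
        width = len(variants).bit_length() - 1
        bits += format(variants.index(sentence), f"0{width}b")
    return bits

stego = embed("101")
print(stego, "->", extract(stego))                  # ... -> 101
```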
VNCs, Web Portals, Biometrics and the Verification
of Distance Education Students
Nathan Sickler (PI);
Area: Identification, Authentication
and Privacy
Technologies exist which allow Higher Education Institutions
to extend their classrooms beyond the traditional lecture hall to virtual
learning environments. Technologies such as Virtual Network Connections
(VNCs), and Web Portals (WebCT Vista & CE) are useful tools in the transfer
of knowledge to distance students. However, these technologies offer only a weak method of student verification: passwords. To improve the confidence of universities, instructors, and other students that the proper distance students are completing exams, quizzes, assignments, and lab practicals,
biometric technologies (i.e. face, fingerprint, iris, keystroke dynamics)
can be employed.
Wireless Encryption Practices: Social Capital
Factors and Diffusion of Innovation
Sorin Matei; John Hooker (PI);
Area: Identification, Authentication
and Privacy
The poster is based on a study that explores how social capital affects the diffusion over time of the social practice of encrypting residential wireless computer networks. It is important to note
that we are examining the influences on and practice of using encryption,
not the technology itself. As a practice, choosing to use encryption is
similar to choosing whether to use other computer technologies like the
internet and e-mail. Our results show that formal social capital
(membership in community organizations) influences people to employ encryption
in their home wireless computer networks. Informal social capital (level
of neighborhood belonging) was found to not significantly affect encryption
practices.
Causality-based Intrusion Analysis
Sundararaman Jeyaraman;
Area: Incident Detection, Response,
and Investigation
Intrusion Analysis has traditionally been a very
time-consuming and manual task. We take the view that the current failings
of intrusion analysis tools can be attributed to the lack of data that
identify causal relationships between system events. Causal relationships,
once identified, are useful not only for intrusion analysis, but also in a variety of other areas such as intrusion detection, forensics, information-flow tracking, and intrusion-alert correlation. In our work, we design and study a host of techniques for capturing causal relationships.
Discussion of Defense against Phishing
Mercan Topkara (PI);
Area: Incident Detection, Response,
and Investigation
Foundations of Digital Forensic Investigations
Brian Carrier;
Area: Incident Detection, Response,
and Investigation
Digital investigations frequently occur, but there
is little theory behind them. The lack of theory becomes apparent when
one tries to define tool requirements for development and testing. There
is currently an umbrella term of "computer forensic tools," but there
are no formally defined categories of tools. This is similar to including
IDS, firewalls, and anti-virus in the category of "computer security tools." In
this work, we formally examine how data and evidence are created and define
a framework accordingly. This framework can be used to more rigorously
define procedures and tools.
iPod Forensics
Christopher Marsico (PI);
Area: Incident Detection, Response,
and Investigation
The iPod is the most popular digital music device.
The newest versions of the iPod have become more PDA-like than ever before.
With this new functionality the iPod has recently found its way into the
criminal world. With the continued growth of the digital music device
market, the iPod's use in criminal activity will only continue to increase.
This research discusses some of the features of the iPod and how a criminal
could use them. A literature review found little or no documentation or discussion of the forensic analysis of the iPod or similar devices. Therefore,
this research outlines what should be considered when an iPod is found
at the crime scene, and offers a critical analysis of some common forensic
tools and their ability to collect and analyze data from an iPod.
Psychological Profiling and Computer Forensics:
Locard's Principle in the Digital World
Marc Rogers (PI);
Area: Incident Detection, Response,
and Investigation
The presentation discusses the need to extend psychological
crime scene analysis from its current supportive role in physical crime
scene analysis, to an identical role in digital and cyber crime scenes.
The fundamentals of crime scene analysis are discussed and a focus on
the ability of psychological cyber crime scene analysis to answer the
FBI's critical elements is presented. A model illustrating the analogous
physical and cyber crime scene elements is provided. The importance of
cyber victimology in profiling and target hardening is also briefly examined,
as is the importance of not being fearful of the seeming uniqueness of
computer crime scenes. Finally, suggestions for future study are offered.
The Trojan Horse Defense In Cybercrime Cases
Brian Carrier;
Area: Incident Detection, Response,
and Investigation
In a United Kingdom prosecution in 2003, Aaron Caffrey,
an admitted hacker, was acquitted of hacking into the computer system
of the Port of Houston and shutting it down. Caffrey was acquitted despite
the fact that no one disputed it was his laptop which committed the crime.
His defense was that his computer was taken over, courtesy of a Trojan
horse, and used to commit the crime without his knowledge. The jury acquitted
even though no evidence of a Trojan horse was found on Caffrey's computer.
Caffrey's is only one of several cases in which defendants who invoked
the "Trojan horse defense" have been acquitted of charges ranging from
hacking to tax fraud. This article examines the defense from both legal
and technical perspectives; while its primary focus is on how the prosecution
can respond to the invocation of a Trojan horse defense, it also explains
that such a defense can, in fact, be well-grounded. It would be quite
possible for someone to use a Trojan horse program to seize control of
another's computer and frame them.
Using process labels to obtain forensic and traceback
information
Florian Buchholz;
Area: Incident Detection, Response,
and Investigation
Many questions that are of interest in digital forensics, intrusion detection, network traceback, and access control cannot be sufficiently answered by today's computing systems. In particular, the question of who or what caused certain actions on a system, or from where they originated, cannot be answered with the information currently available. In this paper we present a model that makes it possible to propagate arbitrary meta-information bound to subjects (active principals and passive objects) based on the exchange of information among them. The goal is to bind a meaningful label, for instance user or location information, to an active principal on a system at a certain event and then be able to propagate and track that label as information is being exchanged.
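The model can be sketched as label sets attached to subjects that are unioned on every information exchange; the API below is invented for illustration.

```python
# Sketch of label propagation: any information flow unions the sender's
# labels into the receiver's. The Subject API is invented for illustration.
class Subject:
    def __init__(self, name, labels):
        self.name, self.labels = name, set(labels)

    def send(self, other, data):
        """Any exchange propagates the sender's labels to the receiver."""
        other.labels |= self.labels
        return data

sshd = Subject("sshd", {"user:alice", "host:198.51.100.7"})
shell = Subject("bash", set())
editor = Subject("vi", set())

sshd.send(shell, "spawn")
shell.send(editor, "open /etc/passwd")
print(editor.labels)   # traceback info: alice from 198.51.100.7 caused this
```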
Virtual Playgrounds For Worm Behavior Investigation
Xuxian Jiang (PI);
Area: Incident Detection, Response,
and Investigation
To better understand worms' dynamic and possibly camouflaged behavior, researchers have long hoped for a safe and convenient environment in which to unleash and run real-world worms. There are, however, major challenges in realizing such "worm playgrounds", including the playgrounds' fidelity, confinement, and scalability, as well as convenience in infrastructure setup and worm experiments. In particular, worm playgrounds that use physical hosts as playground nodes may not be effective in addressing these challenges. In this work, we present a virtualization-based approach to creating virtual worm playgrounds, called vGrounds, on top of a general-purpose infrastructure. A vGround is an all-software virtual environment dynamically created on top of a physical infrastructure. The infrastructure can be a single physical machine, a local cluster, or a multi-domain overlay infrastructure such as PlanetLab. A vGround contains realistic end-hosts and network entities, all realized as virtual machines (VMs) and confined in a virtual network (VN). The salient features of vGround include: (1) high fidelity, supporting real worm code exploiting real vulnerable services; (2) strict confinement, making the real Internet totally invisible and unreachable from inside a vGround; (3) high resource efficiency, providing worm experiments with a scale magnitudes larger than the number of physical machines in the infrastructure; and (4) flexible and efficient worm experiment control, enabling fast (tens of seconds) and automatic generation, re-installation, and final tear-down of vGrounds. Our experiments with real-world worms have successfully reproduced their probing and propagation patterns, exploitation steps, and malicious payloads, demonstrating both the research and educational value of vGrounds.
Open Source Vs Proprietary Software: Vulnerabilities
and Patch Response
Sanjay Sridhar (PI);
Area: Risk Management, Policies and
Laws
Software selection is an important consideration
in risk management for information security. Additionally, the underlying
robustness and security of a technology under consideration has become
increasingly important in total cost of ownership and other calculations
of business value. Open source software is often touted as being robust
to many of the problems that seem to plague proprietary software. This
study seeks to empirically investigate, from an information security perspective, specific security characteristics of open source software compared to
those of proprietary software. Software vulnerability data spanning several
years are collected and analyzed to determine if significant differences
exist in terms of inter-arrival times of published vulnerabilities, median
time to release 'fixes' (commonly referred to as patches), type of vulnerability
reported, and the respective severity of the vulnerabilities. It appears that open source and proprietary software are each likely to report similar vulnerabilities, and that open source software is only marginally quicker in releasing patches for problems identified in its software.
The arguments favoring the inherent security of open source software do
not appear to hold up to scrutiny. These findings provide evidence to
security managers to focus more on holistic software security management,
irrespective of the proprietary-nature of the underlying software.
Securing the National Plant Diagnostic Network
Keith Watson (PI);
Area: Risk Management, Policies and
Laws
The National Plant Diagnostic Network (NPDN) is a
USDA-funded plant biosecurity program. The mission of the Network is to
enhance national agricultural security by quickly detecting introduced
pests and pathogens. This is achieved through a nationwide network of public agricultural institutions with a cohesive, distributed system to quickly detect high-consequence biological pests and pathogens deliberately introduced into agricultural and natural ecosystems, by providing means for quick identification and establishing protocols for immediate reporting to appropriate responders and decision makers. The Network will allow
land grant university diagnosticians and faculty, state regulatory personnel,
and first detectors to efficiently communicate information, images, and
methods of detection throughout the system in a timely manner.
CERIAS is assisting the NPDN in developing
an effective information security program. Security assessments were conducted
at each regional center and the national database facility to identify
security issues. Security policy review is underway. Technical training
is being developed for the NPDN regional IT personnel, as well as security
awareness materials for all personnel associated with the NPDN.
An Object-Oriented Multimedia System for Delivering
K-12 Information Security Curriculum
Nathan Bingham (PI);
Area: Security Awareness, Education
and Training
Educational research has shown that the use of multiple
sensory stimuli facilitates and extends the information processing
ability of learners with a variety of learning styles; in addition,
dual-encoding theory posits that multiple representations are more effective
for purposes of long-term retention (Smith and Ragan, 1999). Specifically,
the use of video is effective for addressing attitudinal change
and allows humanization of seemingly abstract and impersonal topics
such as information security. These factors, along with practical
concerns such as sustainability and efficiency, highlight the utility
of interactive multimedia as an ideal tool for end-user training
and awareness initiatives. For the K-12 audience who may have little
or no experience with the topic, engaging and exciting the prospective
learner is the key to changing behavior.
Unfortunately, despite the great benefit to
the user, the burden of creating interactive multimedia materials
can be both time consuming and costly. Using an object-oriented design
approach to both the system and the educational materials, these costs
can be reduced over time. Object-oriented systems allow for reusability
of multimedia components and coding such as exercises, screen layouts,
and graphics. Ideally, once a learning object is created it can
be reused or repurposed to a particular audience. The system created
for CERIAS's K-12 initiative takes advantage of benefits of both interactive
multimedia and object-oriented design. Using Macromedia Flash and XML
in an object-oriented design, the K-12 Information Security Modules
deliver multimedia lessons to the user using predefined XML instructions
created by an instructor. The system, graphics, and exercises used
in the module can be reused or repurposed allowing the instructor
to extend an existing lesson plan when new topics arise or even tailor
the system to a particular audience easily and quickly.
CERIAS K-12 Outreach
Matt Rose (PI);
Area: Security Awareness, Education
and Training
The K-12 Outreach program has developed a portfolio
of educational materials and resources targeted at the needs of K-12 teachers,
technology coordinators, administrators, parents, and students. The K-12
Outreach program provides support, staff development programs, student
workshops, professional collaborations, and this online reference. To
date, more than 5,200 educators have participated in K-12 Outreach program
activities. This poster highlights educational and research initiatives
in K-12 outreach.
Creating Awareness of Computer Security Concepts
Using Multimedia Teaching Aids
Jeff Hennis (PI);
Area: Security Awareness, Education
and Training
As cyber crime and identity theft steadily increase, there has been a growing need for awareness and training in the area of computer security. The purpose of this project was to develop multimedia elements that fulfill this need for parents and guardians of minors 6 to 17 years of age. The target audience was tested not only on their awareness and knowledge before viewing the developed product, but also on how well the product created awareness, its effectiveness, and the motivation it generated to look further into computer security. The product was tested as a PDF file available on the internet. The results showed a possible increase in awareness and motivation based on the effectiveness of the product. With knowledge of this outcome, Group 12 will show how their form of reintroduced technologies can be used effectively as a teaching aid.
Internship "Living Lab" Projects
Ed Finkler;
Area: Security Awareness, Education
and Training
To provide real-world experience in networking and security to students by supporting Computer and Information Technology Department students, classes, and faculty.
National Capacity Building
Melissa Dark (PI);
Area: Security Awareness, Education
and Training
This poster reports the impact of the Center for
Faculty Development in Information Assurance Education which is housed
at CERIAS.
Network Security Certificate Program
Connie Justice (PI); Ed Finkler;
Area: Security Awareness, Education
and Training
To provide Information Assurance education and training
to students and professionals.
Open Seminar
Annie Anton;
Area: Security Awareness, Education
and Training
Refactoring Secure Programming Classes
Pascal Meunier (PI);
Area: Security Awareness, Education
and Training
Undergraduate curricula most often focus on achieving
functional objectives, and assignments are graded accordingly. Security
objectives are typically not emphasized, so most software vulnerabilities
are repeated, well-understood mistakes. Secure programming classes have been taught at Purdue since fall 2002 to address this problem.
However, the scope of the classes was limited to that of an associated "sister" class
(e.g., operating systems), and biased towards UNIX. We expanded the scope
of the previously developed material to include Windows issues and created
exercises and labs that are independent of any "sister" class. We also
added coverage of trust, threat, and risk issues, as well as software engineering considerations. In the process we uncovered Windows threats that have been neglected in the literature. The material is freely available at
http://www.cerias.purdue.edu/secprog/ as sets of slides and is divided
into three modules, which can be taught to different audiences.
Social Engineering Defense Architecture
Michael Hoeschele (PI); Michael Armbrust (PI);
Area: Trusted Social and Human Interactions
This paper proposes a theoretical solution to the
problem of Social Engineering (SE) attacks perpetrated over the phone
lines. As a byproduct real time attack signatures are generated, which
can be used in a cyber forensic analysis of such attacks. Current methods of SE attack detection and prevention rely on policy and personnel training, which fail because the root of the problem, people, is still involved.
The proposed solution relies on computer systems to analyze phone conversations
in real time and determine if the caller is deceiving the receiver. This
Social Engineering Defense Architecture (SEDA) is completely theoretical
as the technologies employed are only in the proof-of-concept phase, but the underlying problems are all proven to be tractable.