The Center for Education and Research in Information Assurance and Security (CERIAS)


Reports and Papers Archive



Software Generation of Practically Strong Random Numbers

Peter Gutmann

Although much thought usually goes into the design of encryption algorithms and protocols, less consideration is often given to equally important issues such as the selection of cryptographically strong random numbers, so that an attacker may find it easier to break the random number generator than the security system it is used with. This paper provides a comprehensive guide to designing and implementing a practically strong random data accumulator and generator which requires no specialised hardware or access to privileged system services. The performance of the generator on a variety of systems is analysed, and measures which can make recovery of the accumulator/generator state information more difficult for an attacker are presented. The result is an easy-to-use random number generator which should be suitable even for demanding cryptographic applications.
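
As a rough illustration of the accumulator/generator split the abstract describes, here is a minimal Python sketch: raw samples from ad-hoc sources are folded into a hash-based pool, and output is derived by re-hashing the evolving pool state. The EntropyPool class and its noise sources are illustrative assumptions, not Gutmann's design; a real generator needs far more entropy sources and analysis.

```python
import hashlib
import os
import time

class EntropyPool:
    """Illustrative entropy accumulator: mixes timing/system noise into a
    hash-based pool, then derives output by hashing the pool state.
    A sketch of the accumulator/generator pattern only, not a real CSPRNG."""

    def __init__(self):
        self._pool = hashlib.sha256()

    def add_sample(self, data: bytes) -> None:
        # Accumulate: fold a raw, possibly low-entropy sample into the pool.
        self._pool.update(data)

    def add_system_noise(self) -> None:
        # Example sources: high-resolution timer jitter and process info.
        self.add_sample(time.perf_counter_ns().to_bytes(8, "little"))
        self.add_sample(os.getpid().to_bytes(4, "little"))

    def read(self, n: int) -> bytes:
        # Generate: derive output blocks by hashing the pool state with a
        # counter, then feed each block back so the state keeps evolving.
        out = b""
        counter = 0
        while len(out) < n:
            block = hashlib.sha256(
                self._pool.digest() + counter.to_bytes(4, "little")
            ).digest()
            out += block
            self._pool.update(block)  # stir the pool forward
            counter += 1
        return out[:n]

pool = EntropyPool()
for _ in range(100):
    pool.add_system_noise()
print(pool.read(16).hex())
```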

Added 2002-07-26

Compiler-Integrated Support for Structural and Fault-Based Software Testing

E. W. Krauser

Despite the application of rigorous development methodologies, program errors are generally regarded as a fundamental phenomenon in sufficiently complex software systems. Yet, computer software is employed to control larger, more complex, and more potentially dangerous systems than ever before. Ensuring that these software systems function correctly is a vital concern. A large class of well-known software testing techniques requires information attained by executing the program under test. We are interested in those techniques that monitor and/or modify the internal behavior of the executing program. This work proposes new methods to efficiently and effectively support these techniques while executing programs in their target run-time environment. Consequently, we are concerned with machine-executable programs, and our approach is compiler-based. This approach, as opposed to more traditional interpreter-based approaches, will afford a significant increase in execution speed during software testing, increased confidence in the accuracy of a software test, and a new set of capabilities with which to test software. It is hoped that, as a result, current state-of-the-art structural and fault-based testing techniques will become significantly more cost-effective and attractive as a means to test real, production software systems.
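
A source-level analogue of the compile-time probe insertion described here can be sketched in Python: a small AST pass rewrites every `if` test to route through a probe that logs the branch outcome, after which the instrumented module is compiled and run. The `BranchInstrumenter` and `probe` names are hypothetical; the thesis itself targets machine code via the compiler, not Python ASTs.

```python
import ast

BRANCH_LOG = []  # records (line, outcome) pairs during execution

def probe(line: int, taken: bool) -> bool:
    # Log the branch outcome, then pass the value through unchanged so the
    # instrumented program's control flow is preserved.
    BRANCH_LOG.append((line, bool(taken)))
    return taken

class BranchInstrumenter(ast.NodeTransformer):
    """Rewrites every `if` test into `probe(lineno, <test>)` so branch
    outcomes are recorded as the instrumented program runs."""
    def visit_If(self, node: ast.If) -> ast.If:
        self.generic_visit(node)
        node.test = ast.Call(
            func=ast.Name(id="probe", ctx=ast.Load()),
            args=[ast.Constant(node.lineno), node.test],
            keywords=[],
        )
        return node

source = """
def classify(x):
    if x > 0:
        return "pos"
    if x < 0:
        return "neg"
    return "zero"
"""

tree = BranchInstrumenter().visit(ast.parse(source))
ast.fix_missing_locations(tree)
namespace = {"probe": probe}
exec(compile(tree, "<instrumented>", "exec"), namespace)
namespace["classify"](-3)
print(BRANCH_LOG)  # [(3, False), (5, True)]
```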

Added 2002-07-26

A New Design for Distributed Systems: The Remote Memory Model

Douglas Comer, James Griffioen

This paper describes a new model for constructing distributed systems called the Remote Memory Model. The remote memory model consists of several client machines, one or more dedicated machines called remote memory servers, and a communication channel interconnecting them. In the remote memory model, client machines share the memory resources located on the remote memory server. Client machines that exhaust their local memory move portions of their address space to the remote memory server and retrieve pieces as needed. Because the remote memory server uses a machine-independent protocol to communicate with client machines, the remote memory server can support multiple heterogeneous client machines simultaneously. This paper describes the remote memory model and discusses the advantages and issues of systems that use this model. It examines the design of a highly efficient, reliable, machine-independent protocol used by the remote memory server to communicate with the client machines. It also outlines the algorithms and data structures employed by the remote memory server to efficiently locate the data stored on the server. Finally, it presents measurements of a prototype implementation that clearly demonstrate the viability and competitive performance of the remote memory model.
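
To make the client/server interaction concrete, here is a toy in-process sketch of the model, assuming an invented tagged-tuple message format (PAGE_OUT, PAGE_IN, ACK, DATA); the paper's actual protocol is machine-independent and far more elaborate.

```python
class RemoteMemoryServer:
    """Holds pages for many heterogeneous clients, keyed by
    (client_id, page_number) so client address spaces never collide."""
    def __init__(self):
        self._pages = {}

    def handle(self, message):
        op, client_id, page_no, payload = message
        if op == "PAGE_OUT":                   # client evicts a page
            self._pages[(client_id, page_no)] = payload
            return ("ACK", client_id, page_no, None)
        if op == "PAGE_IN":                    # client faults a page back in
            data = self._pages[(client_id, page_no)]
            return ("DATA", client_id, page_no, data)
        raise ValueError(f"unknown op {op!r}")

class Client:
    """Keeps at most `capacity` pages locally; overflow goes to the server."""
    def __init__(self, client_id, server, capacity=2):
        self.client_id, self.server, self.capacity = client_id, server, capacity
        self.local = {}  # page_no -> bytes, a stand-in for local memory

    def write(self, page_no, data):
        if page_no not in self.local and len(self.local) >= self.capacity:
            victim, payload = next(iter(self.local.items()))  # naive FIFO eviction
            del self.local[victim]
            self.server.handle(("PAGE_OUT", self.client_id, victim, payload))
        self.local[page_no] = data

    def read(self, page_no):
        if page_no not in self.local:
            _, _, _, data = self.server.handle(
                ("PAGE_IN", self.client_id, page_no, None))
            self.write(page_no, data)  # may evict another page
        return self.local[page_no]

server = RemoteMemoryServer()
a = Client("A", server)
for i in range(4):
    a.write(i, f"page-{i}".encode())
print(a.read(0))  # b'page-0', transparently fetched back from the server
```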

Added 2002-07-26

Computer Security

Eugene H. Spafford
Added 2002-07-26

An Overview Of The Virtual Memory Xinu Project

Douglas Comer, James Griffioen

The Virtual Memory Xinu Project investigates a new model of virtual memory in which dedicated, large-memory machines serve as a backing store (page servers) for virtual memory systems operating on a set of (heterogeneous) clients. The dedicated page server allows sharing of the large physical memory resource and provides fast access to data. This paper gives a brief overview of the Virtual Memory Xinu research project. It outlines the new virtual memory model used, the project’s goals, and prototype design and implementation.

Added 2002-07-26

Parallel Algorithms for Longest Increasing Chains in the Plane and Related Problems

COAST TR 97-22
Mikhail J. Atallah, Danny Z. Chen, Kevin S. Klenk
Download: PDF

Given a set S of n points in the plane such that each point in S is associated with a nonnegative weight, we consider the problem of computing the single-source longest increasing chains among the points in S. This problem is a generalization of the planar maximal layers problem. In this paper, we present a parallel algorithm that computes the single-source longest increasing chains in the plane in O(log^2 n) time using O(n^2 / log^3 n) processors in the CREW PRAM computational model. We have also solved a related problem of computing the all-pairs longest paths in an n-node weighted planar st-graph in O(log^2 n) time using O(n^2 / log n) CREW PRAM processors. Both of our parallel algorithms improve on the previously best known results.
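
For contrast with the parallel algorithm, a straightforward O(n^2) sequential baseline for weighted increasing chains might look like the sketch below. It computes the longest chain overall; the single-source variant simply pins the starting point. The function name and input format are illustrative.

```python
def longest_increasing_chain(points):
    """points: list of (x, y, weight) with weight >= 0.
    A chain is a sequence of points with strictly increasing x and y;
    returns the maximum total weight of such a chain."""
    pts = sorted(points)                  # sort by x, then y
    best = [w for _, _, w in pts]         # best[i]: max chain weight ending at i
    for i, (xi, yi, wi) in enumerate(pts):
        for j in range(i):
            xj, yj, wj = pts[j]
            if xj < xi and yj < yi:       # pts[j] may precede pts[i]
                best[i] = max(best[i], best[j] + wi)
    return max(best) if best else 0

print(longest_increasing_chain(
    [(1, 1, 2), (2, 3, 1), (3, 2, 5), (4, 4, 1)]))  # 2 + 5 + 1 = 8
```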

Added 2002-07-26

Sample Statements of Work for Federal Computer Security Services: For use In-House or Contracting Out

Dennis Gilbert, Nickilyn Lynch, ed.

Each federal organization is fully responsible for its computer security program, whether the security program is performed by in-house staff or contracted out.  Time constraints, budget constraints, availability or expertise of staff, and the potential knowledge to be gained by the organization from an experienced contractor are among the reasons a federal organization may wish to get external assistance for some of these complex, labor-intensive activities.

Added 2002-07-26

An Improved Hypercube Bound for Multisearching and its Applications

COAST TR 97-23
Mikhail Atallah
Download: PDF

We give a result that implies an improvement by a factor of log log n in the hypercube bounds for the geometric problems of batched planar point location, trapezoidal decomposition, and polygon triangulation. The improvements are achieved through a better solution to the multisearch problem on a hypercube, a parallel search problem where the elements in the data structure S to be searched are totally ordered, but where it is not possible to compare any two given queries q and q' in constant time. Whereas the previous best solution to this problem took O(log n (log log n)^3) time on an n-processor hypercube, the solution given here takes O(log n (log log n)^2) time on an n-processor hypercube. The hypercube model for which we claim our bounds is the standard one: SIMD, with O(1) memory registers per processor, and with one-port communication. Each register can store O(log n) bits, so that a processor knows its ID.
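
The setting can be sketched sequentially: each query is located in the ordered structure purely by comparisons against data elements, never against another query. The comparator-as-function encoding below is an illustrative assumption; the paper's contribution is running n such searches simultaneously on an n-processor hypercube.

```python
def locate(items, query_left_of):
    """Binary search for the first index i with query_left_of(items[i]) True.
    The query is given only as a comparator against items, mirroring the
    multisearch assumption that queries compare to data, not to each other."""
    lo, hi = 0, len(items)
    while lo < hi:
        mid = (lo + hi) // 2
        if query_left_of(items[mid]):
            hi = mid
        else:
            lo = mid + 1
    return lo  # insertion position of the query among the items

def multisearch(items, queries):
    # Sequentially this is trivial; the hard part in the paper is routing
    # all of these searches through the hypercube at once.
    return [locate(items, q) for q in queries]

items = [10, 20, 30, 40]
queries = [lambda item, q=q: q < item for q in (5, 25, 40)]
print(multisearch(items, queries))  # [0, 2, 4]
```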

Added 2002-07-26

Firewalls Fend Off Invasions From the Net

COAST TR 98-04
Christopher Schuba
Download: PDF
Added 2002-07-26

Pattern Matching Image Compression

COAST TR 97-21
Mikhail Atallah, Yann Genin, Wojciech Szpankowski
Download: PDF

We propose a non-transform image compression technique based on approximate pattern matching, which we name Pattern Matching Image Compression (PMIC). The main idea behind it is a lossy extension of the Lempel-Ziv data compression scheme in which one searches for the longest prefix of an uncompressed image that occurs approximately (e.g., with mismatches up to a distortion level D allowed) in the already processed image. This main algorithm is enhanced with several new features, such as searching for reverse approximate matches, recognizing substrings in images that are additively shifted versions of each other, introducing a variable and adaptive maximum distortion level D, and so forth. These enhancements are crucial to the overall quality of our scheme. In this paper we present algorithmic as well as experimental results for Pattern Matching Image Compression. Our scheme turns out to be competitive with JPEG and wavelet compression for graphical and photographic images. A unique feature of the proposed algorithm is that the asymptotic performance of the scheme can be theoretically established. More precisely, under a stationary mixing probabilistic model of an image and a fixed maximum distortion level D, the compression ratio is asymptotically equal to the so-called generalized Rényi entropy r0(D). This entropy is in general smaller than the optimal rate distortion function R(D), but there is numerical evidence that these two quantities do not differ much for small values of D.
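
The core lossy Lempel-Ziv step can be sketched in a few lines of Python: scan the processed history for the longest prefix of the remaining data that matches with at most D mismatches, and emit a (position, length) pointer. This toy version omits the paper's reverse matches, additive shifts, and adaptive D, and the function name is illustrative.

```python
def longest_approx_prefix(history, rest, max_mismatches):
    """Return (start, length): the longest prefix of `rest` matching
    history[start:start+length] with at most max_mismatches mismatches."""
    best_start, best_len = 0, 0
    for start in range(len(history)):
        mismatches = 0
        length = 0
        while length < len(rest) and start + length < len(history):
            if history[start + length] != rest[length]:
                mismatches += 1
                if mismatches > max_mismatches:
                    break
            length += 1
        if length > best_len:
            best_start, best_len = start, length
    return best_start, best_len

history = b"abcabcabd"   # already-processed data
rest = b"abdabx"         # data still to be compressed
print(longest_approx_prefix(history, rest, 1))  # (0, 5): one mismatch allowed
```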

Added 2002-07-26

Monte Carlo Simulation on Software Mutation Test-Case Adequacy

Mehmet Sahinoglu, Ibrahim Baltaci, E. H. Spafford

This paper compares the conventional stopping criterion of an ad-hoc mutation score with the proposed Barnard's score notation technique, based on the Bayesian SPRT, for testing software mutation test-case adequacy. A Monte Carlo simulation is used to mimic mutation analysis as in a software laboratory on the well-known Trityp test program for two calculated experimental designs. The results indicate that the Bayesian SPRT via Barnard's score notation proves at least as accurate and more cost-effective in reaching a given test-case adequacy goal.
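
For a sense of the sequential stopping rule involved, here is a classical Wald SPRT sketch for a Bernoulli kill rate. Note that the paper evaluates a Bayesian variant via Barnard's score notation, not this plain form, and the thresholds below (p0, p1, alpha, beta) are illustrative assumptions.

```python
import math

def sprt(outcomes, p0=0.90, p1=0.99, alpha=0.05, beta=0.05):
    """outcomes: iterable of booleans, True if a mutant was killed.
    Declare the test set adequate (kill rate >= p1) or inadequate
    (kill rate <= p0) as soon as the log-likelihood ratio crosses
    Wald's bounds; otherwise keep sampling mutants."""
    lower = math.log(beta / (1 - alpha))   # accept H0 (inadequate) below this
    upper = math.log((1 - beta) / alpha)   # accept H1 (adequate) above this
    llr = 0.0
    n = 0
    for n, killed in enumerate(outcomes, start=1):
        if killed:
            llr += math.log(p1 / p0)
        else:
            llr += math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return ("adequate", n)
        if llr <= lower:
            return ("inadequate", n)
    return ("undecided", n)

print(sprt([True] * 40))  # ('adequate', 31): stops after 31 straight kills
```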

Added 2002-07-26

Electronic Commerce in the NII

XIWT Members
Added 2002-07-26

Nomadicity in the NII

XIWT Members
Added 2002-07-26