This article presents a legal and policy analysis of issue preclusion in product liability litigation. The analysis shows that application of offensive collateral estoppel to preclude liability constitutes an abridgement of a fundamental right and should therefore be subject to strict scrutiny. The major public policy interests of collateral estoppel to be weighed in a strict scrutiny calculus are decisional consistency and judicial economy. In fact, offensive collateral estoppel has an ambiguous causal connection with decisional consistency and may actually undermine it. Furthermore, analysis of constitutional jurisprudence shows that the fundamental right at issue may not be rationed or compromised to promote a purely economic interest. Based on these considerations, the offensive use of collateral estoppel to preclude liability does not pass the strict scrutiny test of constitutionality. Policy implications of this analysis include limitations on full faith and credit recognition and enforcement of product liability judgments across state lines.
Federal Rule of Civil Procedure 42(b) allows the separation of a civil trial into two or more components when such a separation would promote convenience, fairness, and economy. The separate components are then tried successively. When a case is separated into two components, such as the separate adjudication of liability and damages in a tort action, the procedure is referred to as "bifurcation." This article presents an analysis of the legal and policy implications of trial bifurcation in product liability litigation. An analysis of the information integration and group compromise inherent in the jury decision, within the framework of applicable procedural rules, predicts that bifurcation will give an a priori lower likelihood of a liability verdict and higher expected damages conditional on a liability verdict, compared to a unitary trial of the same case. Furthermore, the (unconditional) expected damage award is greater for bifurcated trials than for unitary trials when evidence of liability is strong, while the converse is true for marginal cases. We show that these predictions are consistent with empirical evidence based on reported case decisions, as well as experimental studies with simulated juries. A final section discusses public welfare implications of policies that would encourage increased use of trial bifurcation.
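The abstract's two predictions can be made concrete with a toy expected-value calculation. All figures below are invented for illustration; only the qualitative pattern (bifurcation lowers the liability probability and raises conditional damages) comes from the abstract.

```python
# Toy numeric illustration (hypothetical figures, not from the article) of the
# abstract's claim: bifurcation lowers P(liability) and raises damages
# conditional on liability, so the unconditional expected award is higher
# under bifurcation for strong-evidence cases and lower for marginal ones.

def expected_award(p_liability, conditional_damages):
    """Unconditional expected award = P(liability) * E[damages | liability]."""
    return p_liability * conditional_damages

cases = {
    # case: (unitary P(L), unitary D, bifurcated P(L), bifurcated D)
    "strong evidence":   (0.90, 100_000, 0.85, 120_000),
    "marginal evidence": (0.50, 100_000, 0.35, 120_000),
}

for name, (pu, du, pb, db) in cases.items():
    unitary = expected_award(pu, du)
    bifurcated = expected_award(pb, db)
    print(f"{name}: unitary={unitary:,.0f}  bifurcated={bifurcated:,.0f}")

# strong evidence:   bifurcated (102,000) exceeds unitary (90,000)
# marginal evidence: bifurcated (42,000) falls below unitary (50,000)
```

With these hypothetical numbers the strong case pays more when bifurcated and the marginal case pays less, matching the abstract's stated pattern.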
The accelerating trends of interconnectedness, complexity, and extensibility are aggravating the already-serious threat posed by malicious code. To combat malicious code, these authors argue for creating sound policy about software behavior and enforcing that policy through technological means.
This paper describes an implementation of UNIX on top of an object-oriented operating system. UNIX is implemented without modifying the underlying mechanisms provided by the base system. The resulting system runs dynamically-linked UNIX binaries and utilizes the services provided by the object-oriented system.
Because of the urgent security requirements in many existing general-purpose operating systems, the large investment committed to such systems, and the large number of protection errors embedded in them, the problem of finding such errors is one of major importance. This report presents an approach to this task, based on the premise that the effectiveness of error searches can be greatly increased by techniques that utilize "patterns," i.e., formalized descriptions of error types. It gives a conceptual overview of the pattern-directed evaluation process and reports the authors' initial experience in formulating patterns from the analysis of protection errors previously detected in various systems, as well as in applying the pattern-directed technique. This study is part of a larger effort to provide securable operating systems in DoD environments.
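The core idea of a pattern-directed search can be sketched in a few lines: each "pattern" formalizes one error type, and the search flags every source location matching any pattern. The patterns and target code below are invented for illustration and are far simpler than the report's formalized descriptions.

```python
# Minimal sketch of a pattern-directed error search: a "pattern" is a
# formalized description of an error type (here, a regex plus a short
# description), and the search reports source lines matching any pattern.
import re

ERROR_PATTERNS = [
    (re.compile(r"\bstrcpy\s*\("), "unbounded copy: possible buffer overflow"),
    (re.compile(r"\bgets\s*\("),   "gets() performs no length check"),
]

def search(source_lines):
    findings = []
    for lineno, line in enumerate(source_lines, start=1):
        for pattern, description in ERROR_PATTERNS:
            if pattern.search(line):
                findings.append((lineno, description, line.strip()))
    return findings

code = [
    'char buf[16];',
    'strcpy(buf, user_input);   /* no bounds check */',
    'snprintf(buf, sizeof buf, "%s", user_input);',
]
for lineno, desc, text in search(code):
    print(f"line {lineno}: {desc}: {text}")
```

Real pattern-directed evaluation operates on richer program representations than raw text, but the flag-everything-matching-a-formalized-pattern loop is the same shape.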
This report describes a class of operating system protection errors known as "insufficient validation of critical conditions," or simply "validation errors," and outlines a scheme for finding them. This class of errors is recognized as a very broad one, lying outside the scope of the basic protection mechanisms of existing systems; the extent of the problem is illustrated by a set of validation errors taken from current systems. Considerations for validity conditions and their attachment to variables and to various types of control points in procedures are explored, and categories of validation methods are noted. The notion of criticality itself is analyzed, and criteria are suggested for determining which variables and control points are most critical in the protection sense. Because a search for validation errors can involve substantial information processing, the report references existing or developing tools and techniques applicable to this task.
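A validation error and its repair can be shown side by side: a critical variable (one that selects a privilege) is used at a control point with no validity condition attached, versus the same routine with the condition checked first. The names and table below are invented for illustration.

```python
# Illustrative sketch of "insufficient validation of a critical condition":
# an index supplied by an untrusted caller selects a privilege, so it is a
# critical variable; the first routine attaches no validity condition to the
# control point where it is used, the second does.

TABLE = ["read", "write", "execute"]

def lookup_unvalidated(index):
    # Error: 'index' is critical but never checked; in Python a negative
    # index silently wraps around instead of faulting.
    return TABLE[index]

def lookup_validated(index):
    # Fix: the validity condition is checked before the access.
    if not (0 <= index < len(TABLE)):
        raise ValueError(f"index {index} outside table bounds")
    return TABLE[index]

print(lookup_unvalidated(-1))      # returns "execute": wrong, and no error raised
try:
    lookup_validated(-1)
except ValueError as e:
    print("rejected:", e)
```

The unvalidated routine fails silently, which is exactly why the report argues such errors lie outside the reach of the basic protection mechanisms: nothing faults at the point of misuse.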
We show how mutation testing can be used to detect simple and complex errors that are often found in production software. We present a classification of the errors of TEX reported by Knuth. Using this classification, we show that the simple errors modeled by mutation do indeed form a significant percentage of the errors found in production software. We introduce the notion of an "error-revealing" mutant and show how such mutants, created by simple alterations of the program under test, can expose complex errors. We use the data provided by Knuth to obtain the types of complex errors used in our examples.
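The mechanics behind "simple alterations of the program under test" can be shown in miniature. The function and mutant below are invented for illustration; the point is that killing the mutant forces a boundary test that a weak suite lacks.

```python
# Minimal sketch of mutation testing: a mutant is a one-token alteration of
# the program under test; it is "killed" when some test input distinguishes
# it from the original. A surviving mutant points at a gap in the test suite.

def is_adult(age):          # program under test
    return age >= 18

def is_adult_mutant(age):   # mutant: '>=' altered to '>'
    return age > 18

def kills(original, mutant, test_inputs):
    """True if any test input yields different results for the two versions."""
    return any(original(t) != mutant(t) for t in test_inputs)

weak_tests = [17, 25]                # never exercise the age == 18 boundary
boundary_tests = weak_tests + [18]   # added specifically to kill the mutant

print(kills(is_adult, is_adult_mutant, weak_tests))      # False: mutant survives
print(kills(is_adult, is_adult_mutant, boundary_tests))  # True: mutant killed
```

The test added to kill the mutant exercises exactly the off-by-one boundary where real faults of this kind hide, which is the sense in which a simple mutant can be error-revealing for a more complex latent error.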
A number of people have published studies of faults found in software systems. This report summarizes the results of many of those studies.
A common security problem is the residual: data or an access capability left after the completion of a process and not intended for use outside the context of that process. If the residual becomes accessible to another process, a security error may result. A major source of such residuals is improper or incomplete allocation/deallocation processing. The various types of allocation/deallocation residuals are discussed in terms of their characteristics and the manner in which they occur, and a semiautomatable search strategy for detecting sources of these residuals is presented.
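A data residual from incomplete deallocation processing can be simulated directly: an allocator that does not scrub a freed block hands one process's data to the next. The allocator below is a deliberately tiny invention for illustration.

```python
# Toy simulation (invented for illustration) of an allocation/deallocation
# residual: a memory manager that fails to clear a block on deallocation
# leaves the previous process's data readable by the next process that is
# handed the same block.

class ToyAllocator:
    def __init__(self, size, scrub_on_free):
        self.memory = [0] * size
        self.scrub_on_free = scrub_on_free

    def allocate(self):
        return self.memory             # hand out the (single) block

    def free(self):
        if self.scrub_on_free:
            self.memory[:] = [0] * len(self.memory)  # erase the residual

def run(scrub_on_free):
    alloc = ToyAllocator(4, scrub_on_free)
    block = alloc.allocate()
    block[0] = 0x5EC2E7                # process A writes a secret
    alloc.free()                       # A's block is deallocated
    return alloc.allocate()[0]         # process B reads its "fresh" block

print(hex(run(scrub_on_free=False)))   # 0x5ec2e7 - the residual leaks to B
print(hex(run(scrub_on_free=True)))    # 0x0 - erased at deallocation
```

Scrubbing at deallocation (rather than at allocation) is one of the two obvious places to break the leak; either closes this particular residual channel.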
This document describes a class of protection errors found in current computer operating systems. It is intended (1) for persons responsible for improving the security aspects of existing operating system software and (2) for designers and students of operating systems. The purpose is to help protection evaluators find such errors in current systems and to help designers and implementers avoid them in future systems, through analysis and a methodical approach.
This report deals with a class of errors, initially identified empirically, that formed itself around a group of protection errors (within a larger collection) having the common characteristic of involving operations or accesses occurring in the wrong order or at the wrong times; hence the name "serialization." In its broadest sense, it includes a large proportion of all programming errors involving improper order or scheduling and, in a narrower sense, includes only those errors resulting from improper ordering of accesses to objects accessible by potentially concurrent operations.
This study is neither a full analysis of the subject of the ordering of operations nor only a discussion of process synchronization, but rather an attempt to give perspective to several closely-related subclasses of problems in this area.
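The narrower sense of the class, improper ordering of accesses by potentially concurrent operations, is the classic lost-update error. The sketch below is invented for illustration and makes the interleaving explicit as a schedule rather than relying on nondeterministic threads.

```python
# Deterministic sketch of a serialization error: two logically concurrent
# operations each perform a read-modify-write on a shared counter, and an
# unfortunate interleaving makes one update vanish. The explicit "schedule"
# stands in for the scheduler's choice of ordering.

def run(schedule):
    counter = 0
    private = {}                        # each operation's private copy
    for op, step in schedule:
        if step == "read":
            private[op] = counter       # read shared state
        else:                           # "write"
            counter = private[op] + 1   # write back incremented private copy
    return counter

serial      = [("A", "read"), ("A", "write"), ("B", "read"), ("B", "write")]
interleaved = [("A", "read"), ("B", "read"), ("A", "write"), ("B", "write")]

print(run(serial))       # 2 - both increments take effect
print(run(interleaved))  # 1 - B read before A wrote, so A's update is lost
```

The error is invisible under the serial schedule and only manifests under the interleaved one, which is why serialization errors in real systems survive testing so easily.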
In an open network computing environment, a workstation cannot be trusted to identify its users correctly to network services. Kerberos provides an alternative approach whereby a trusted third-party authentication service is used to verify users' identities. This paper gives an overview of the Kerberos authentication model as implemented for MIT's Project Athena. It describes the protocols used by clients, servers, and Kerberos to achieve authentication. It also describes the management and replication of the database required. The views of Kerberos as seen by the user, the programmer, and the administrator are described. We describe the addition of Kerberos authentication to the Sun Network File System as a case study for integrating Kerberos with an existing application.
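The essential third-party flow can be sketched in a toy model. Real Kerberos uses symmetric encryption (originally DES), timestamps, realms, and a ticket-granting service; none of that is modeled here, and the HMAC sealing below is only a stand-in to show who holds which key.

```python
# Toy model of the third-party flow: the authentication service shares a key
# with each principal, and a "ticket" is the session key plus the client's
# identity sealed under the target server's key, so the server learns both
# without ever trusting the workstation or seeing the client's secret.
# (HMAC here authenticates the payload; real Kerberos encrypts it.)
import hmac, hashlib, os, json

def seal(key, payload):      # stand-in for symmetric sealing: payload + MAC
    blob = json.dumps(payload).encode()
    return blob, hmac.new(key, blob, hashlib.sha256).digest()

def unseal(key, sealed):
    blob, tag = sealed
    if not hmac.compare_digest(tag, hmac.new(key, blob, hashlib.sha256).digest()):
        raise ValueError("bad ticket: wrong key or tampered")
    return json.loads(blob)

# Keys known only to the authentication service and the respective principal.
k_client, k_server = os.urandom(32), os.urandom(32)

# 1. The service mints a session key, a ticket for the server, and a reply
#    for the client.
session_key = os.urandom(32).hex()
ticket = seal(k_server, {"client": "alice", "session_key": session_key})
reply  = seal(k_client, {"session_key": session_key})

# 2. The client recovers the session key using its own long-term key ...
client_session = unseal(k_client, reply)["session_key"]

# 3. ... and the server, unsealing the ticket, learns the same session key
#    and the authenticated client identity.
server_view = unseal(k_server, ticket)
print(server_view["client"], client_session == server_view["session_key"])
```

Because the ticket is opaque to the client (it is sealed under the server's key), the untrusted workstation can carry it but cannot forge or alter it, which is the property that makes the third-party design work on an open network.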
The successful penetration testing of a major time-sharing operating system is described. The educational value of such a project is stressed, and principles of methodology and team organization are discussed, as well as the technical conclusions from the study.