Securing the Internet presents great challenges and research opportunities. Potential applications such as Internet voting, universally available medical records, and ubiquitous e-commerce are hindered by serious security and privacy concerns. The epidemic of hacker attacks on personal computers and web sites only highlights the inherent vulnerability of the current computer and network infrastructure.
Adequately addressing security and privacy concerns requires a combination of technical, social, and legal approaches. Topics currently under active investigation in the department include mathematical modeling of security properties, implementation and application of cryptographic protocols, secure and privacy-preserving distributed algorithms, trust management, verification of security properties, and proof-carrying code. There is also interest in the legal aspects of security, privacy, and intellectual property, both within the department and in the world-famous Yale Law School, with which we cooperate. Some of these topics are described in greater detail below.
James Aspnes is interested in problems involved in securing large distributed algorithms against disruption by untrustworthy participants. Using cryptographic techniques, it may be possible to allow intermediate results in a distributed algorithm to be certified independently of who provides them, reducing the problem of choosing which machines to trust. These issues become especially important in systems, such as peer-to-peer networks, where association with the system is voluntary and cannot be limited only to machines under the control of the algorithm designer.
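The idea of certifying a result independently of its provider can be illustrated with a toy example (a hypothetical sketch, not drawn from the research itself): in a distributed search for factors of a large number, a reported factor is self-certifying, because any node can check it by a single division, making the reporter's identity irrelevant.

```python
def verify_factor(n, claimed_factor):
    """Accept a reported nontrivial factor of n without trusting the reporter.

    The claim certifies itself: correctness is checked by arithmetic,
    not by the identity or reputation of the machine that sent it.
    """
    return 1 < claimed_factor < n and n % claimed_factor == 0


# An honest participant's claim passes; a bogus one is rejected,
# regardless of which peer submitted it.
assert verify_factor(91, 7)
assert not verify_factor(91, 5)
```

Results with this property let a coordinator accept help from arbitrary volunteer machines, since a wrong answer is caught by verification rather than prevented by access control.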
Joan Feigenbaum is interested in the foundations of electronic commerce and in fundamental problems in complexity theory that are motivated by cryptology. One such problem is the power of “instance-hiding” computations. Can the owner of a private database use the superior processing power of one or more other machines (perhaps for a fee) without having to reveal the database to those machines? In a set of influential papers with Martín Abadi, Don Beaver, Lance Fortnow, and Joe Kilian, Professor Feigenbaum showed: (1) that instance-hiding computations are limited in power if the private-database owner can only consult a single other machine; (2) that they are extremely powerful if the owner can consult multiple other machines; and (3) that instance hiding is closely related to some of the central themes of complexity theory, e.g., interactive provability, average- vs. worst-case complexity, and the inherent communication costs of multiparty protocols.
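The multi-machine case can be sketched concretely for polynomial evaluation, in the spirit of the Beaver–Feigenbaum construction (this is an illustrative toy, with the modulus, polynomial, and server interface chosen for the sketch): the owner hides input x on a random line c(t) with c(0) = x, asks each server for f at one point of the line, and interpolates back to t = 0. Any single server sees only a uniformly random field element.

```python
import random

P = 2**61 - 1  # a Mersenne prime modulus (hypothetical choice for this sketch)

def f(x):
    """The public function to evaluate: a degree-2 polynomial mod P."""
    return (3 * x * x + 5 * x + 7) % P

DEG_F = 2

def hide_and_evaluate(x, servers):
    """Evaluate f(x) using untrusted servers without revealing x.

    Pick a random line c(t) = x + r*t, so c(0) = x.  Server i is asked
    for f(c(i)) and sees only the single uniformly random point c(i).
    Since f(c(t)) has degree DEG_F in t, DEG_F + 1 values determine it,
    and Lagrange interpolation at t = 0 recovers f(x).
    """
    r = random.randrange(P)
    points = []
    for i in range(1, DEG_F + 2):          # need DEG_F + 1 servers
        query = (x + r * i) % P            # uniform; reveals nothing alone
        points.append((i, servers[i - 1](query)))
    # Lagrange interpolation at t = 0, with inverses via Fermat's little theorem.
    result = 0
    for i, (ti, yi) in enumerate(points):
        num, den = 1, 1
        for j, (tj, _) in enumerate(points):
            if i != j:
                num = num * (-tj) % P
                den = den * (ti - tj) % P
        result = (result + yi * num * pow(den, P - 2, P)) % P
    return result


servers = [f, f, f]  # each "server" just computes the public function f
x = 123456789
assert hide_and_evaluate(x, servers) == f(x)
```

The limitation in the single-machine case is visible here too: one server's query is uniformly random only because the other queries are never seen together with it.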
In another direction, Professor Feigenbaum founded the research area of “trust management” in collaboration with Matt Blaze and Jack Lacy. Emerging Internet services that use encryption on a mass-market scale require sophisticated mechanisms for managing trust. E-businesses will receive cryptographically signed requests for action and will have to decide whether or not to grant these requests. In centralized (and small-scale distributed) computing communities, an authorizer can make such a decision based on the identity of the person who signed the request. Global, Internet-scale e-businesses, however, cannot rely on identities. Most merchants will have had no prior contact with a typical prospective customer before receiving that customer's first request. Making authorization decisions in this type of environment requires formal techniques for specifying security policies and security credentials, rigorously determining whether a particular set of credentials proves that a request complies with a policy, and deferring trust to third-party credential issuers. The “PolicyMaker” and “KeyNote” trust-management systems, which she co-invented with Blaze, Lacy, John Ioannidis, and Angelos Keromytis, have had wide-ranging impact on large-scale distributed-authorization mechanisms.
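The core question, “do these credentials prove that this request complies with this policy?”, can be sketched with a deliberately simplified compliance checker (this toy is not the actual PolicyMaker or KeyNote semantics; the key names and tuple format are invented for illustration). A policy directly trusts some keys for an action, credentials let a trusted key delegate that action to another key, and compliance is a reachability check over the delegation chain.

```python
def complies(policy, credentials, requester, action):
    """Decide whether `credentials` prove that `requester` may do `action`.

    policy:      list of (key, action) pairs the authorizer trusts directly.
    credentials: list of (issuer_key, subject_key, action) delegations.
    """
    authorized = {key for key, act in policy if act == action}
    changed = True
    while changed:  # propagate delegations until a fixpoint is reached
        changed = False
        for issuer, subject, act in credentials:
            if act == action and issuer in authorized and subject not in authorized:
                authorized.add(subject)
                changed = True
    return requester in authorized


policy = [("key_root", "purchase")]
credentials = [
    ("key_root", "key_merchant", "purchase"),   # root delegates to merchant
    ("key_merchant", "key_alice", "purchase"),  # merchant delegates to Alice
]
assert complies(policy, credentials, "key_alice", "purchase")
assert not complies(policy, credentials, "key_bob", "purchase")
```

Note that the decision never consults Alice's identity, only the chain of credentials linking her key back to a key the policy trusts, which is exactly what makes the approach workable between strangers.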
Michael Fischer is interested in security problems connected with Internet voting, and more generally in trust and security in multiparty computations. He has been developing an artificial society in which trust has a precise algorithmic meaning. In this setting, trust can be learned and used for decision making. Better decisions lead to greater social success. This framework allows for the development and analysis of some very simple algorithms for learning and utilizing trust that are easily implementable in a variety of settings and are arguably similar to what people commonly use in everyday life.
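One very simple algorithm of this flavor can be sketched as follows (a hypothetical illustration, not Professor Fischer's actual model): an agent keeps per-partner counts of good and bad outcomes, derives a smoothed trust score, and interacts only with partners whose learned trust clears a threshold.

```python
class Agent:
    """An agent that learns trust from outcomes and uses it for decisions."""

    def __init__(self, threshold=0.5):
        self.successes = {}  # partner -> count of good outcomes
        self.trials = {}     # partner -> total interactions observed
        self.threshold = threshold

    def trust(self, partner):
        # Laplace-smoothed success rate; an unknown partner starts at 0.5.
        return (self.successes.get(partner, 0) + 1) / (self.trials.get(partner, 0) + 2)

    def decide(self, partner):
        # Better trust estimates lead to better interaction decisions.
        return self.trust(partner) >= self.threshold

    def observe(self, partner, good_outcome):
        self.trials[partner] = self.trials.get(partner, 0) + 1
        if good_outcome:
            self.successes[partner] = self.successes.get(partner, 0) + 1


a = Agent()
for _ in range(5):
    a.observe("honest", True)
a.observe("cheater", False)
a.observe("cheater", False)
assert a.decide("honest")
assert not a.decide("cheater")
```

Rules of this kind are easy to implement in many settings and resemble the informal bookkeeping people do about one another in everyday life.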
Zhong Shao leads the FLINT group at Yale, which is developing a system for secure mobile code based on authentication logics, proof-carrying code, and type-based certifying compilers. Authentication logics are formal logics that allow one to reason about the properties of systems and protocols that verify the identity of users and decide whether or not to permit various operations. Modeling such systems provides the usual benefits of formal analysis: hidden assumptions are made explicit, redundant features are exposed, and flaws in the system may be found. Proof-carrying code (PCC) allows a code producer to provide a (compiled) program to a host, along with a formal proof of safety. The host can specify a safety policy and a set of axioms for reasoning about safety; the producer’s proof must be in terms of those axioms. Type-based certifying compilers are compilers that use static type information to help generate provably safe target code. These technologies fit together naturally and form the foundation for modern secure mobile-code systems.