Having a paper accepted to a high-caliber conference is a great accomplishment. Having one of those papers withstand the test of time is even better.
Five papers co-authored by Carnegie Mellon researchers, each presented at an IEEE Security & Privacy (S&P) symposium more than fifteen years ago, received IEEE’s Test-of-Time Award at this week’s annual conference. The IEEE S&P symposium initiated the Test-of-Time Award last year to recognize papers that have made a lasting impact on the fields of security and privacy. Nine papers received the award this year.
Listed below are the five papers that received the IEEE Test-of-Time Award this week.
Distributed Detection of Node Replication Attacks in Sensor Networks (IEEE S&P 2005)
Authors: Bryan Parno (CyLab), Adrian Perrig (CyLab), Virgil Gligor (University of Maryland)
- This paper observed that in some networks, an attacker can capture a legitimate network node, extract its secrets, and then introduce many clones of that node back into the network. This gives the attacker significant influence over the network, allowing it to, for example, suppress legitimate alarms or subvert additional nodes. Detecting such a threat is quite difficult, since the clones have all of the access and authentication tokens the original trusted node did.
- To counter this threat, the paper introduces a pair of protocols that detect replication via “emergent algorithms,” which produce a network-level property through the independent actions of many nodes.
- This adversary model and problem formulation captured the community’s interest and led to some remarkable follow-on work, both in sensor networks and in other contexts, e.g., detecting fake accounts in social networks.
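The detection idea can be illustrated with a small sketch. This is not the paper’s actual protocols, which distribute location claims to randomly chosen witness nodes; it only shows the underlying principle that one node ID observed at two distinct locations reveals a clone. All names and data below are invented for illustration.

```python
# Toy illustration of node-replication detection: each node reports a
# location claim (node_id, location); seeing one ID claimed at two
# different locations indicates the attacker has cloned that node.
# The paper's emergent protocols have randomly chosen witness nodes
# collect and compare such claims; here a single observer does it.

def detect_clones(claims):
    """claims: iterable of (node_id, location) pairs; returns cloned IDs."""
    seen = {}          # node_id -> first location observed
    clones = set()
    for node_id, location in claims:
        if node_id in seen and seen[node_id] != location:
            clones.add(node_id)        # same ID claimed in two places
        seen.setdefault(node_id, location)
    return clones

claims = [("n1", (0, 0)), ("n2", (5, 5)), ("n1", (9, 9))]  # n1 appears twice
print(detect_clones(claims))  # -> {'n1'}
```

In the paper, making this comparison emerge from many independent nodes, rather than a central observer, is exactly what makes the protocols robust.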
Random Key Predistribution Schemes for Sensor Networks (IEEE S&P 2003)
Authors: Haowen Chan (CyLab), Adrian Perrig (CyLab), Dawn Song (CyLab)
- This study helped advance security of communications between devices in the Internet of Things (IoT).
- Building on Eschenauer and Gligor’s 2002 work, this paper extends their random key predistribution model, in which sensors establish secure communication using pre-loaded cryptographic keys, and identifies three ways to improve the resilience of the communication scheme.
- The study drew broad community interest, as researchers went on to propose a series of ever-improving schemes. It also helped popularize the threat model of multiple-node compromise in sensor networks and the IoT.
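A minimal simulation can make the model concrete. This is a hedged sketch of the general random key predistribution idea, not the paper’s analysis: each sensor is pre-loaded with a random subset of a global key pool, and two sensors can form a secure link if their subsets share enough keys (the paper’s q-composite variant requires at least q shared keys). The pool and ring sizes are illustrative, not the paper’s parameters.

```python
import random

POOL_SIZE = 1000   # size of the global key pool (illustrative)
RING_SIZE = 75     # keys pre-loaded on each sensor (illustrative)
Q = 2              # q-composite threshold: required number of shared keys

def make_key_ring(rng):
    """Draw a random subset of the key pool for one sensor."""
    return set(rng.sample(range(POOL_SIZE), RING_SIZE))

def can_link(ring_a, ring_b, q=Q):
    """Two sensors can establish a secure link if they share >= q keys."""
    return len(ring_a & ring_b) >= q

rng = random.Random(42)
a, b = make_key_ring(rng), make_key_ring(rng)
print(can_link(a, b))
```

Raising q makes a link harder to form but forces an attacker to compromise more keys to eavesdrop, which is the resilience trade-off the paper studies.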
Efficient Authentication and Signing of Multicast Streams over Lossy Channels (IEEE S&P 2000)
Authors: Adrian Perrig (CMU), Ran Canetti (Boston University), J.D. Tygar (UC Berkeley), Dawn Song (UC Berkeley)
- The core contribution of this paper is TESLA, a protocol for authenticating broadcast messages (also referred to as multicast streams).
- For broadcast authentication to be secure, only the sender must be able to create the authentication information, while every receiver can verify it but cannot create it. This “asymmetry” is difficult to achieve efficiently. TESLA uses time to achieve it: a message is authenticated with a secret key that is only made public at a later point in time.
- Over the past 20 years, TESLA has been used in a variety of real-world applications. Today, it is being considered for authenticating satellite navigation messages.
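The delayed key disclosure at the heart of TESLA can be sketched in a few lines. This is a simplified illustration under stated assumptions: it omits the loose time synchronization and interval schedule that make the real protocol secure, and all keys and messages are invented. It does show the two checks a receiver performs once a key is disclosed.

```python
import hashlib
import hmac

def hash_chain(seed: bytes, length: int):
    """One-way key chain: hashing any key yields the key used before it."""
    chain = [seed]
    for _ in range(length - 1):
        chain.append(hashlib.sha256(chain[-1]).digest())
    return chain[::-1]  # keys are used (and later disclosed) in this order

def mac(key: bytes, msg: bytes) -> bytes:
    return hmac.new(key, msg, hashlib.sha256).digest()

keys = hash_chain(b"secret-seed", 4)   # K_0..K_3, one per time interval

# Interval 2: sender broadcasts (msg, MAC under K_2). K_2 is still secret,
# so receivers buffer the message; they can neither verify nor forge it yet.
msg = b"broadcast payload"
tag = mac(keys[2], msg)

# Later, the sender discloses K_2. A receiver first authenticates the key
# itself (hashing it must yield the already-trusted K_1 from the chain),
# then verifies the buffered MAC.
assert hashlib.sha256(keys[2]).digest() == keys[1]   # key is genuine
assert hmac.compare_digest(mac(keys[2], msg), tag)   # message is authentic
print("verified")
```

The one-way chain is what lets receivers trust each newly disclosed key using only a single earlier commitment from the sender.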
Practical Techniques for Searches on Encrypted Data (IEEE S&P 2000)
Authors: Dawn Song (UC Berkeley, formerly CMU), David Wagner (UC Berkeley), Adrian Perrig (CMU)
- This study presented the first efficient construction to enable an untrusted server to perform a search on encrypted data without leaking the search term.
- With the emergence of cloud computing, the paper inspired a wealth of follow-up research, not only to further improve search operations but also to support a broader variety of operations on encrypted data.
- With these systems, a user can perform operations on encrypted data in the cloud, without needing to trust the cloud operator.
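A greatly simplified sketch can convey the goal. This is not the paper’s actual stream-cipher-based construction; it is a toy token-matching scheme in the spirit of later index-based follow-up work, with invented names throughout. The server stores only keyed hashes of words, and a search query reveals only an opaque token, never the search term itself.

```python
import hashlib
import hmac
import os

class ToySearchableStore:
    """Toy searchable store: server matches HMAC word tokens, not words.
    A real system would also encrypt document bodies and hide repetition
    patterns; this sketch shows only the search-without-disclosure idea."""

    def __init__(self, key: bytes):
        self._key = key          # known only to the client
        self.server_index = []   # (doc_id, set of word tokens) at the server

    def _token(self, word: str) -> bytes:
        return hmac.new(self._key, word.encode(), hashlib.sha256).digest()

    def upload(self, doc_id: str, words):
        self.server_index.append((doc_id, {self._token(w) for w in words}))

    def search(self, word: str):
        t = self._token(word)    # the server sees only this opaque token
        return [doc_id for doc_id, tokens in self.server_index if t in tokens]

store = ToySearchableStore(os.urandom(32))
store.upload("doc1", ["cloud", "security"])
store.upload("doc2", ["privacy"])
print(store.search("security"))  # -> ['doc1']
```

Without the client’s key, the stored tokens and the query token are indistinguishable from random bytes to the server, which is the property the paper’s construction achieves far more rigorously.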
A Sense of Self for Unix Processes (IEEE S&P 1996)
Authors: Thomas A. Longstaff (CMU SEI), Stephanie Forrest (Univ. of New Mexico), Steven A. Hofmeyr (Univ. of New Mexico), Anil Somayaji (Univ. of New Mexico)
- This paper introduced a new method for network anomaly and intrusion detection in which “normal” is defined by short-range correlations in a process’ system calls.
- The authors performed experiments suggesting that this definition of “normal” remains stable across legitimate runs of standard UNIX programs.
- The work was part of a research program aimed at building computer security systems that incorporate mechanisms and algorithms used by natural immune systems.
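The core method can be sketched directly. This is a hedged, minimal version of the idea, with invented call names and window size: record the short system-call windows seen during normal runs, then flag any window in a new trace that never appeared in the normal database.

```python
# Sketch of the paper's core idea: "normal" is the set of short system-call
# windows (n-grams) observed during legitimate runs; windows absent from
# that set are flagged as anomalous. Traces below are invented examples.

def windows(trace, n=3):
    """All length-n sliding windows over a system-call trace."""
    return {tuple(trace[i:i + n]) for i in range(len(trace) - n + 1)}

def train(normal_traces, n=3):
    """Build the database of normal windows from known-good traces."""
    normal = set()
    for trace in normal_traces:
        normal |= windows(trace, n)
    return normal

def anomalies(trace, normal, n=3):
    """Windows in `trace` that were never observed during normal runs."""
    return windows(trace, n) - normal

normal_db = train([["open", "read", "read", "close"],
                   ["open", "read", "close"]])
suspect = ["open", "read", "exec", "close"]   # 'exec' never seen here
print(sorted(anomalies(suspect, normal_db)))
```

Counting how many anomalous windows a trace produces gives a simple intrusion signal, and the paper’s experiments show that for standard UNIX programs this normal database stays small and stable.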