First Time Miss: Low Overhead Mitigation for Shared Memory Cache Side Channels

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Cache hits and misses are an important source of information leakage in cache side channel attacks. An attacker observes a much faster cache access time if the cache line has previously been filled by the victim, and a much slower memory access time if the victim has not accessed this cache line, thus revealing to the attacker whether the victim has accessed the cache line or not. For machines with private caches, this leakage can be mitigated by scheduling the victim and potential attackers on different cores, or by flushing the private caches after use. However, the latter is less practical for the large last-level cache. In this work, we propose a novel yet simple mitigation approach for cross-core attacks, called the FTM (first time miss) approach. In this approach, in order to hide a cache hit in a shared cache, we make the hit behave like a miss when the line is accessed for the first time by a thread. The miss is simulated by buffering the cache line for a time similar to the memory access time (i.e., like a miss penalty), and then sending it to the private cache. From the next access onwards, it is safe to allow cache hits on this cache line because the attacker has already accessed it once, and expects it to be cached anyway. Thus, all cache lines appear to be accessed only by the attacker, and the access patterns of the victim are hidden. The hardware overhead of the FTM scheme is minimal because it only needs a small per-core buffer. Simulation-based evaluation on SPEC and PARSEC benchmarks shows a low performance hit (< 0.1%) because first time misses are rare in most application programs.
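The core idea of the abstract can be sketched as a toy timing simulation: the first access by a core to a line already present in the shared cache is charged a miss-like latency, and only subsequent accesses by that core hit at normal speed. All class and variable names and the latency values below are illustrative assumptions, not the paper's actual hardware design.

```python
# Toy sketch of the FTM (first time miss) idea: a core's first access to a
# shared cache line behaves like a miss, hiding whether another core (e.g. the
# victim) already filled the line. Latency values are assumed for illustration.

HIT_LATENCY = 40    # assumed shared-cache hit latency (cycles)
MISS_LATENCY = 200  # assumed memory access / miss latency (cycles)

class FTMCache:
    def __init__(self):
        self.lines = set()  # cache lines currently present in the shared cache
        self.seen = {}      # core id -> set of lines that core has accessed

    def access(self, core, line):
        """Return the latency observed by `core` when accessing `line`."""
        seen_by_core = self.seen.setdefault(core, set())
        if line not in self.lines:
            # True miss: fill the line from memory.
            self.lines.add(line)
            seen_by_core.add(line)
            return MISS_LATENCY
        if line not in seen_by_core:
            # First time miss: the line is present (possibly filled by another
            # core), but this core touches it for the first time, so it is
            # buffered and charged a miss-like delay, hiding the earlier access.
            seen_by_core.add(line)
            return MISS_LATENCY
        # Subsequent accesses by the same core hit normally.
        return HIT_LATENCY

cache = FTMCache()
cache.access("victim", 0x40)            # victim fills the line (true miss)
lat = cache.access("attacker", 0x40)    # attacker's first access looks like a miss
assert lat == MISS_LATENCY
assert cache.access("attacker", 0x40) == HIT_LATENCY  # later accesses hit
```

With this policy the attacker observes the same first-access latency whether or not the victim touched the line, which is exactly the leakage the FTM scheme aims to close; the cost is one extra miss-like delay per (core, line) pair, which the paper reports as rare in practice.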

Original language: English (US)
Title of host publication: Proceedings of the 49th International Conference on Parallel Processing, ICPP 2020
Publisher: Association for Computing Machinery
ISBN (Electronic): 9781450388160
DOIs
State: Published - Aug 17 2020
Event: 49th International Conference on Parallel Processing, ICPP 2020 - Virtual, Online, Canada
Duration: Aug 17 2020 - Aug 20 2020

Publication series

NameACM International Conference Proceeding Series

Conference

Conference: 49th International Conference on Parallel Processing, ICPP 2020
Country: Canada
City: Virtual, Online
Period: 8/17/20 - 8/20/20

Bibliographical note

Publisher Copyright:
© 2020 ACM.

Keywords

  • cache
  • computer architecture
  • security
  • side channel

