Summary: | Modern computer architectures rely on caches to reduce the latency gap
between the CPU and main memory. While indispensable for performance, caches
pose a serious threat to security because they leak information about a
program's memory access patterns through its execution time.
In this paper, we present a novel approach for reasoning about the security
of cache algorithms with respect to timing leaks. At the core of our approach is
the notion of leak competitiveness, which compares the leakage of two cache
algorithms on every possible program. Using this notion, we prove the
following two results:
First, we show that leak competitiveness is a symmetric relation between cache
algorithms. This implies that no cache algorithm dominates another in terms of
leakage via a program's total execution time. This contrasts with
performance, where such dominance relationships are known to exist.
Second, we show that, when restricted to caches with finite control, the
leak-competitiveness relationship between two cache algorithms is either
asymptotically linear or constant; no other shapes are possible.
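For intuition, the notion can be sketched as follows (the bound below is an
illustrative formalization under assumed notation, not necessarily the paper's
exact definition): algorithm $A$ is leak-competitive with respect to algorithm
$B$ if $A$'s leakage is bounded by a linear function of $B$'s leakage on every
program, i.e.,
\[
  \exists k, c \;\; \forall p:\quad \mathrm{Leak}_{A}(p) \;\le\; k \cdot \mathrm{Leak}_{B}(p) + c,
\]
where $\mathrm{Leak}_{X}(p)$ denotes the amount of information that program $p$
leaks through its execution time when run on a cache managed by algorithm $X$.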
|