How randomized data can improve our security
Huge streams of data pass through our computers and smartphones every day. In simple terms, these devices contain two essential units for processing this data: a processor, which acts as a kind of control center, and the main memory (RAM), which serves as a kind of short-term memory. Modern processors use a cache as a bridge between the two, since memory delivers data much more slowly than the processor can process it. This cache often contains private data that could be an attractive target for attackers. A team of scientists from Bochum, Germany, in cooperation with researchers from Japan, has now developed an innovative cipher that not only offers greater security than previous approaches, but is also more efficient and faster. They are presenting their work at the prestigious USENIX Security Symposium in Anaheim, California (USA).
The team includes Dr. Federico Canale and Professor Gregor Leander from the Chair of Symmetric Cryptography, Jan Philipp Thoma and Professor Tim Güneysu from the Chair of Security Engineering, all from Ruhr University Bochum, as well as Yosuke Todo from NTT Social Informatics Laboratories and Rei Ueno from Tohoku University (Japan).
Cache not well protected against side-channel attacks until now
Years ago, CASA PI Professor Yuval Yarom, who has been at Ruhr University Bochum since April 2023, discovered that the cache is not well protected against a certain type of attack. The serious Spectre and Meltdown vulnerabilities made headlines at the time because they affected all popular microprocessors as well as cloud services. Caches are unobtrusive, but they perform an important task: they store data that is requested very frequently, and their main purpose is to reduce latency. If the CPU had to fetch data from the much slower RAM every time it needed it, the whole system would slow down; this is why the CPU fetches frequently used data from the cache instead. However, attackers can exploit this interplay between CPU and cache. Their method: they overwrite data in the unprotected cache, so that the system can no longer find the victim’s data there and has to request it from main memory, a process that is measurably slower. “In so-called timing side-channel attacks, attackers can measure the time differences and use them to observe memory accesses by other programs. Thus, they can steal private keys for encryption algorithms, for example,” explains Jan Philipp Thoma from the Chair of Security Engineering.
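To illustrate the timing effect such attacks exploit, here is a minimal sketch (our illustration, not the researchers’ code) that measures how long a single memory access takes on an x86-64 CPU, once while the data sits in the cache and once after it has been flushed out. It assumes a GCC- or Clang-compatible compiler with the __rdtscp and _mm_clflush intrinsics; the exact cycle counts vary from machine to machine, but a cache miss is consistently much slower than a hit.

/*
 * Illustrative sketch only: distinguish a cache hit from a cache miss by
 * timing one memory access. This is the basic measurement primitive behind
 * timing side-channel attacks on caches.
 */
#include <stdio.h>
#include <stdint.h>
#include <x86intrin.h>   /* _mm_clflush, __rdtscp (GCC/Clang on x86-64) */

static uint64_t time_access(volatile uint8_t *addr) {
    unsigned aux;
    uint64_t start = __rdtscp(&aux);   /* read time-stamp counter */
    (void)*addr;                       /* the memory access we are timing */
    uint64_t end = __rdtscp(&aux);     /* read counter again after the load */
    return end - start;
}

int main(void) {
    static volatile uint8_t probe[64];        /* roughly one cache line */

    (void)probe[0];                           /* warm up: bring the line into the cache */
    uint64_t hit = time_access(probe);        /* fast: served from the cache */

    _mm_clflush((const void *)probe);         /* flush the line out of the cache */
    uint64_t miss = time_access(probe);       /* slow: fetched from main memory */

    printf("cache hit : %llu cycles\n", (unsigned long long)hit);
    printf("cache miss: %llu cycles\n", (unsigned long long)miss);
    return 0;
}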
Innovative mathematical solution
While patches have been developed to mitigate certain attacks, they have failed to provide provable security. The team from Bochum and Japan has now come up with an innovative solution: “Our idea is to use mathematical processes to randomize the data in the cache,” explains Gregor Leander, who recently received an ERC Advanced Grant for his research. Randomizing the placement of data in the CPU’s caches in this way can help prevent attacks, because attackers can no longer selectively evict a victim’s data from the cache.
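As a rough conceptual sketch of what such randomization means (our simplified illustration, not the actual SCARF design): in a conventional cache, the set an address lands in is computed directly from fixed address bits, so an attacker can work out exactly which addresses collide with a victim’s data. In a randomized cache, the mapping additionally depends on a secret key, so those collisions become unpredictable. The mixing function below is a generic, non-cryptographic stand-in for a real low-latency cipher.

/*
 * Conceptual sketch only: conventional vs. keyed (randomized) cache set
 * indexing. Two addresses that collide under the conventional mapping
 * typically land in different sets under the keyed mapping.
 */
#include <stdint.h>
#include <stdio.h>

#define LINE_BITS 6u                        /* 64-byte cache lines              */
#define SET_BITS  10u                       /* 2^10 = 1024 sets (illustrative)  */
#define SET_MASK  ((1u << SET_BITS) - 1u)

/* Conventional mapping: set index = address bits 6..15. */
static uint32_t classic_index(uint64_t addr) {
    return (uint32_t)((addr >> LINE_BITS) & SET_MASK);
}

/* Randomized mapping: keyed mix of the whole line address, truncated to
 * SET_BITS. A real design would use a dedicated cipher here instead of this
 * generic mixer. */
static uint32_t randomized_index(uint64_t addr, uint64_t key) {
    uint64_t x = (addr >> LINE_BITS) ^ key;
    x ^= x >> 33; x *= 0xff51afd7ed558ccdULL;    /* murmur-style mixing steps */
    x ^= x >> 33; x *= 0xc4ceb9fe1a85ec53ULL;
    x ^= x >> 33;
    return (uint32_t)(x & SET_MASK);
}

int main(void) {
    uint64_t key = 0x0123456789abcdefULL;        /* secret key (illustrative)  */
    uint64_t a = 0x12340;                        /* two addresses 64 KiB apart */
    uint64_t b = a + (1u << (LINE_BITS + SET_BITS)); /* same conventional index */

    printf("conventional: a -> set %u, b -> set %u\n",
           classic_index(a), classic_index(b));
    printf("randomized  : a -> set %u, b -> set %u\n",
           randomized_index(a, key), randomized_index(b, key));
    return 0;
}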
“The interdisciplinary approach, combining cryptography with hardware security considerations, is a novelty in computer security. While there have been previous ideas for randomized cache architectures, none have been very efficient and none have been able to fully withstand strong attackers,” said Tim Güneysu, who heads the Chair of Security Engineering. The new SCARF model uses block cipher encryption, a completely new idea for the field, according to the researchers. “Normally, we encrypt data in blocks of 128 bits; in the cache, we sometimes work with only 10 bits. This is a complex process, because it takes much longer to mix such data with a large key,” said Gregor Leander. The large key is needed because encrypting such small amounts of data with a shorter key could be broken much more easily by attackers.
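To make the “small block, large key” point concrete, here is a toy sketch (our illustration; it is not the SCARF cipher, whose actual design is specified in the paper): a 10-bit value, such as a cache set index, is encrypted with a Feistel network in which every round consumes its own subkey drawn from a much larger key. Because there are only 1,024 possible 10-bit inputs, the security has to come from the key material, which is why such designs use far more key bits than the block is wide.

/*
 * Toy sketch only: a 10-bit Feistel "cipher" with per-round subkeys. The
 * round function is NOT cryptographically analyzed; it merely illustrates
 * how a small block can be mixed with a large amount of key material.
 */
#include <stdint.h>
#include <stdio.h>

#define ROUNDS 8    /* each round uses its own subkey drawn from a larger key */

/* Toy 5-bit round function: key-dependent and non-linear, purely illustrative. */
static uint8_t round_fn(uint8_t half, uint16_t subkey) {
    uint8_t x = (uint8_t)((half ^ subkey) & 0x1F);             /* subkey bits 0..4   */
    x = (uint8_t)((x * 7 + ((subkey >> 5) & 0x1F)) & 0x1F);    /* subkey bits 5..9   */
    x = (uint8_t)((x ^ (subkey >> 10)) & 0x1F);                /* subkey bits 10..14 */
    return (uint8_t)(((x << 2) | (x >> 3)) & 0x1F);            /* 5-bit rotation     */
}

/* Encrypt a 10-bit block with a Feistel network; decryption applies the
 * subkeys in reverse order. */
static uint16_t encrypt10(uint16_t block, const uint16_t subkeys[ROUNDS]) {
    uint8_t left  = (uint8_t)((block >> 5) & 0x1F);
    uint8_t right = (uint8_t)(block & 0x1F);
    for (int r = 0; r < ROUNDS; r++) {
        uint8_t tmp = right;
        right = (uint8_t)(left ^ round_fn(right, subkeys[r]));
        left  = tmp;
    }
    return (uint16_t)(((uint16_t)left << 5) | right);
}

int main(void) {
    /* Per-round subkeys cut from a larger key (values purely illustrative). */
    const uint16_t subkeys[ROUNDS] = {
        0x1F3A, 0x1C47, 0x7B02, 0x65D8, 0x2263, 0x2B19, 0x30F4, 0x5E8C
    };
    uint16_t index = 0x2A7;                        /* a 10-bit cache set index */
    printf("index 0x%03X -> encrypted set 0x%03X\n",
           (unsigned)index, (unsigned)encrypt10(index, subkeys));
    return 0;
}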