Trusted Platform Modules (TPMs) aren’t so trusted today. Intel and STMicroelectronics have had to release fixes for timing attacks in their TPM offerings.
An international team of researchers discovered they could extract “hidden” private keys from these TPMs—and do it surprisingly easily. The team calls their discovery “TPM-Fail.”
Timing attacks have been widely known since at least 1993 (back then, we called it the “clocked adversary” problem). In today’s SB Blogwatch, we stroke gray beards and shake heads.
Your humble blogwatcher curated these bloggy bits for your entertainment. Not to mention: monsters.
What’s the craic? Catalin Cimpanu reports—“vulnerabilities impact TPM chips in desktops, laptops, servers”:
Thanks to efforts from the research team, both vulnerabilities have been fixed, which is a good thing since both issues can be weaponized in doable real-world attacks. … Nowadays, it’s hard to find a device that’s not using a TPM, either in the form of a hardware-isolated chip, or … firmware-based TPMs — also known as fTPMs.
The novelty and danger factor surrounding TPM-FAIL lies in the fact that this attack is also fully weaponizable in a real-world scenario. … After enough observations of the response time, attackers would be able to recover [a] private key.
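To see why “enough observations of the response time” is all it takes, here’s a minimal toy simulation of the leak at the heart of TPM-Fail. It assumes (as the researchers describe) that a non-constant-time scalar multiplication skips leading zero bits of the random ECDSA nonce, so faster signatures imply nonces with known high zero bits. None of this is real TPM code; the numbers and function names are illustrative only:

```python
import random

# Toy model: "signing time" is proportional to the nonce's bit length,
# because leading zero bits are skipped by a non-constant-time loop.
def simulated_signing_time(nonce: int) -> int:
    return nonce.bit_length()  # arbitrary time units

random.seed(1)
samples = []
for _ in range(20000):
    k = random.getrandbits(256)  # fresh 256-bit ECDSA nonce per signature
    samples.append((k, simulated_signing_time(k)))

# The attacker keeps only the fastest 1% of signatures: those nonces
# almost certainly have several leading zero bits, i.e. known MSBs.
fastest = sorted(samples, key=lambda s: s[1])[:200]
avg_all = sum(k.bit_length() for k, _ in samples) / len(samples)
avg_fast = sum(k.bit_length() for k, _ in fastest) / len(fastest)
print(f"average nonce bits: {avg_all:.1f}, fastest 1%: {avg_fast:.1f}")
```

With a few bits of each nonce known across many signatures, standard lattice techniques recover the full private key — which is the step the researchers automate in the real attack.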
The first vulnerability is CVE-2019-11090 and impacts Intel’s Platform Trust Technology (PTT). … PTT is Intel’s fTPM software-based TPM solution … widely used on servers, desktops, and laptops.
The second is CVE-2019-16863 and impacts the ST33 TPM chip made by STMicroelectronics. [It] is incredibly popular … used on a wide array of devices ranging from networking equipment to cloud servers.
Of the two, the issue impacting Intel’s fTPM solution is considered the most dangerous, as it could be exploited remotely. … Applying the Intel PTT firmware updates should be a top priority.
You can say that again. Thomas Claburn quips—“You know what they say: Timing is… everything”:
[TPMs] are not entirely trustworthy. … Boffins from the Worcester Polytechnic Institute … University of California, San Diego … and the University of Lübeck … successfully conducted black-box timing analysis … to recover 256-bit private keys for ECDSA (Elliptic Curve Digital Signature Algorithm) and ECSchnorr signatures that are supposed to remain unobserved within the TPM.
The researchers found that a local attacker can recover the ECDSA key from Intel fTPM in 4-20 minutes.
Yikes. The researchers—Moghimi et al.—describe their work:
TPM serves as a root of trust. … TPM is supposed to protect our security keys from malicious adversaries.
While the key should remain safely inside the TPM hardware, we show how … timing leakage … allows an attacker to recover 256-bit private keys. … We even show that these attacks can be performed remotely on fast networks. … There is a high chance that you are affected.
A hacker can use these vulnerabilities to forge digital signatures … for bypassing authentication, tampering with the OS, and other bad things. … Even rigorous testing as required by Common Criteria certification is not flawless.
The vulnerabilities we have uncovered emphasize the difficulty of correctly implementing known constant-time techniques, and show the importance of evolutionary testing and transparent evaluation of cryptographic implementations. … This is the second time that the CC evaluation process has failed to provide expected security guarantees.
This clearly underscores the need to reevaluate the CC process.
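“Correctly implementing known constant-time techniques” really is harder than it looks. Here’s a minimal sketch (my illustration, not the researchers’ code) of the classic pitfall: an early-exit comparison whose running time depends on the secret, next to the standard fix that does the same amount of work regardless of the data:

```python
# Leaky: returns as soon as a byte differs, so response time reveals
# how many leading bytes of the attacker's guess were correct.
def leaky_equals(a: bytes, b: bytes) -> bool:
    if len(a) != len(b):
        return False
    for x, y in zip(a, b):
        if x != y:
            return False  # early exit = timing side channel
    return True

# Constant-time: always inspects every byte, OR-ing the differences
# together, so running time is independent of where bytes differ.
def constant_time_equals(a: bytes, b: bytes) -> bool:
    if len(a) != len(b):
        return False
    diff = 0
    for x, y in zip(a, b):
        diff |= x ^ y
    return diff == 0
```

In real Python you’d reach for `hmac.compare_digest` instead of rolling your own — and even the “fixed” version above can be betrayed by an interpreter or compiler that optimizes the loop, which is exactly the kind of subtlety the researchers are pointing at.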
And brohee agrees:
I think Paul Kocher first published about timing attacks in 1996. … The fact that industry giants not only sold them, but got them certified says a lot about how much assurance CC and FIPS evaluation buys you.
Wait. Pause. Greg McLearn comments, “Common Criteria does not imply side channel analysis”:
I am a certified CC evaluator. … In no way does CC automatically imply resistance to side channel analysis.
If the product claims resistance to side channel analysis, then the work to get assurance of that claim will only be as good as the evaluator. … Existence of a Common Criteria certificate means nothing unless you read the claims and determine the rigour employed.
But tedunangst says that’s an unfair argument:
[The researchers] call out the fact that it was specifically certified for such use and to be resistant to such attacks.
Many people equate TPMs with user restrictions. JohnFen puts the T in unTrustworthy: [You’re fired—Ed.]
I don’t trust TPM because TPM is most often used to restrict my use of my own machines.
So, sky falling? Jaime2 adds some perspective:
TPM adds a bit of convenience. If TPM wasn’t a thing, then all of my laptop users would have to carry a flash drive with the drive encryption key in order to boot.
The whole idea of, “The sensitive stuff is on your computer, but it’s in a safe place that no one could ever get to,” has always been a bit of a sales job. The truth is that TPM storage is good enough for most people.
Meanwhile, Joshua Yoder—@MrJoshuaYoder—waxes stoic:
Secure today, broken tomorrow. There is no such thing as 100% secure. Ever.
You have been reading SB Blogwatch by Richi Jennings. Richi curates the best bloggy bits, finest forums, and weirdest websites… so you don’t have to. Hate mail may be directed to @RiCHi or email@example.com. Ask your doctor before reading. Your mileage may vary. E&OE.