Simple Illustration of Zoom Encryption Failure

The Citizen Lab report of April 3rd, 2020 broke the news of Zoom's weak encryption and gave this top-level finding:

Zoom documentation claims that the app uses “AES-256” encryption for meetings where possible. However, we find that in each Zoom meeting, a single AES-128 key is used in ECB mode by all participants to encrypt and decrypt audio and video. The use of ECB mode is not recommended because patterns present in the plaintext are preserved during encryption.

It’s a long report with excellent detail on how terrible Zoom’s engineering management practices have been, violating industry standards for safety and product security.

The report includes the famous electronic codebook (ECB) mode penguin, which illustrates why it is considered broken for confidentiality.

I say famous here because anyone thinking about writing software to use AES surely knows of or has seen this image. It’s from a campaign meant to prevent ECB mode selection.

The core problem is simple: ECB encrypts each block independently, so identical plaintext blocks generate identical ciphertext blocks, preserving recognizable patterns. It also means that deciphering one block reveals the contents of every identical block. ECB is thus manifestly the wrong choice for large streams of data meant to stay confidential.
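Since ECB literally means “electronic codebook,” the leak can be reproduced with any deterministic per-block function under a fixed key. A minimal Python sketch (the mapping below is a hypothetical toy, not a real cipher, and the key is made up):

```python
# Toy illustration of why ECB leaks patterns: under a fixed key, ECB acts
# like a giant lookup table ("codebook") from plaintext blocks to
# ciphertext blocks. Any deterministic per-block function shows the leak.
import hashlib

KEY = b"not-a-real-key"  # hypothetical key, for illustration only

def toy_ecb_encrypt_block(block: bytes) -> bytes:
    # Stand-in for a block cipher: deterministic, key-dependent mapping.
    # (Not invertible and NOT secure -- it only mimics ECB's determinism.)
    return hashlib.sha256(KEY + block).digest()[:16]

plaintext_blocks = [b"zoomzoomzoomzoom", b"attack at dawn!!", b"zoomzoomzoomzoom"]
ciphertext_blocks = [toy_ecb_encrypt_block(b) for b in plaintext_blocks]

# Identical plaintext blocks produce identical ciphertext blocks:
assert ciphertext_blocks[0] == ciphertext_blocks[2]
assert ciphertext_blocks[0] != ciphertext_blocks[1]
```

That determinism is exactly what lets the penguin’s outline survive encryption.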

However, while Citizen Lab included the core image to illustrate this failure, they left out a crucial third frame on the right that drives home what industry norms look like compared to Zoom’s unfortunate choice.

The main reason this penguin became famous is that it shows the huge weakness far faster than any written explanation of ECB cracking could. It makes it obvious why Zoom really screwed up.

Now, just for fun, I’ll still try to explain it the old-fashioned way here.

Advanced Encryption Standard (AES) is a U.S. National Institute of Standards and Technology (NIST) algorithm for encryption.

Here’s our confidential message that nobody should see:


Here’s our secret (password) we will use to generate a key:


And then here’s our resulting key:


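The post’s actual secret and key aren’t reproduced here, but the standard way to stretch a password into a fixed-size key is a key-derivation function such as PBKDF2. A minimal sketch with hypothetical values:

```python
import hashlib

# Hypothetical password and salt -- the post's actual values are not shown.
password = b"correct horse battery staple"
salt = b"example-salt"

# PBKDF2-HMAC-SHA256 stretches the password into a 16-byte (128-bit) key,
# the size a cipher like AES-128 expects.
key = hashlib.pbkdf2_hmac("sha256", password, salt, 100_000, dklen=16)

assert len(key) == 16  # 128 bits
print(key.hex())
```

The iteration count exists to slow down offline guessing; the value here is illustrative.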
We take our plaintext “zoom” and use the key to generate the following ciphertext:



I’ve kept the blocks separate above and highlighted the middle two so you can see exactly how the repetitive “zoom” plaintext is reflected in two identical ciphertext blocks.

It’s not as obvious as the penguin, but you still kind of see the point, right?

If we string our blocks together, as if sending them over a network, the result looks deceptively random to the human eye, like this:


And going back to the key, if we run decryption on our stream, we see our confidential content padded out into uniformly sized blocks:


You probably also noticed at this point that anyone who grabs our string can replay it. ECB therefore also carries an obvious copy-and-paste (replay) risk.
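The whole walk-through (key, repeated blocks, padding, decryption) can be sketched end to end with a stdlib-only toy: a 4-round Feistel cipher standing in for AES (it is NOT AES and NOT secure; the key and plaintext are made up), run in ECB mode with PKCS#7 padding, so both the repeated-block leak and the uniform padded blocks are visible:

```python
# A toy 4-round Feistel cipher used in ECB mode, as a stdlib-only stand-in
# for AES. It only reproduces ECB's block-by-block behavior for teaching.
import hashlib

BLOCK = 16  # bytes, same block size as AES

def _f(round_key: bytes, half: bytes) -> bytes:
    return hashlib.sha256(round_key + half).digest()[:BLOCK // 2]

def _xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def _round_keys(key: bytes):
    return [hashlib.sha256(key + bytes([i])).digest() for i in range(4)]

def encrypt_block(key: bytes, block: bytes) -> bytes:
    left, right = block[:BLOCK // 2], block[BLOCK // 2:]
    for rk in _round_keys(key):
        left, right = right, _xor(left, _f(rk, right))
    return left + right

def decrypt_block(key: bytes, block: bytes) -> bytes:
    left, right = block[:BLOCK // 2], block[BLOCK // 2:]
    for rk in reversed(_round_keys(key)):
        left, right = _xor(right, _f(rk, left)), left
    return left + right

def pad(data: bytes) -> bytes:  # PKCS#7 padding to a block boundary
    n = BLOCK - len(data) % BLOCK
    return data + bytes([n]) * n

def unpad(data: bytes) -> bytes:
    return data[:-data[-1]]

def ecb_encrypt(key: bytes, data: bytes) -> list:
    padded = pad(data)
    return [encrypt_block(key, padded[i:i + BLOCK])
            for i in range(0, len(padded), BLOCK)]

key = b"hypothetical-key"   # 16 bytes, illustrative only
plaintext = b"zoom" * 12    # three identical 16-byte plaintext blocks

blocks = ecb_encrypt(key, plaintext)
print([b.hex() for b in blocks])

# The repeated plaintext is plainly visible as repeated ciphertext blocks:
assert blocks[0] == blocks[1] == blocks[2]

# Decryption restores the plaintext, padded out to uniform block size:
recovered = b"".join(decrypt_block(key, b) for b in blocks)
assert unpad(recovered) == plaintext
```

Note there is no nonce or IV anywhere: the same key and plaintext always produce the same blocks, which is also why the captured stream can be replayed or cut-and-pasted block by block.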

A key takeaway, pun intended of course, is that Zoom used incredibly weak protection by choosing AES-128 ECB. That’s bad.

The fact that they claimed AES-256 is worse, because it hides just how weak their protection really is. It misleads customers who might run away if they saw AES-128 ECB (as they should).

Maybe “run away” is too strong, but I can tell you the major cloud providers use AES-256 as a minimum, and the NSA published guidance in 2015 naming AES-256 as the minimum for top secret information.

And on top of all that, the keys for Zoom were being generated in China, even for users in America who were not communicating with anyone in China.

It’s all a mess and part of a larger pattern, pun intended of course: weak confidentiality protections in Zoom engineering.

Here are some more examples to round out the pervasiveness of mismanagement.

Zoom used no authentication for their “record to cloud” feature, so customers were unwittingly posting private videos onto a publicly accessible service with no protection.

If someone chose to add authentication to protect their recorded video, the Zoom cloud allowed only a 10-character password (protip: NIST recommends long passwords; 10 is short), and Zoom had no brute-force protections for these short passwords.
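To put numbers on why a 10-character cap is short: password strength tops out at length times bits-per-character, and this back-of-envelope calculation (the printable-ASCII alphabet size is an assumption for illustration) shows the cap cannot even reach the strength of the 128-bit key it guards:

```python
import math

# Maximum entropy a password policy can deliver is len * log2(alphabet).
# A 10-character cap over printable ASCII (~94 symbols) tops out around
# 66 bits -- well short of a 128-bit key, before users even pick weak
# passwords, and with no rate limiting every bit of that shortfall counts.
bits_per_char = math.log2(94)
print(f"10-char cap:        {10 * bits_per_char:.1f} bits max")
print(f"20-char passphrase: {20 * bits_per_char:.1f} bits max")
assert 10 * bits_per_char < 128 < 20 * bits_per_char
```

This is why a cap hurts even diligent users: it sets a ceiling on achievable strength.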

They also used no randomness in their meeting IDs, kept them short numbers, and left them permanently exposed in the user interface.
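To see why a short, non-random meeting ID matters, here is a back-of-envelope sweep-time estimate (the 9-digit ID size and the probe rate are assumptions for illustration):

```python
# Why a short numeric meeting ID is guessable: assuming a 9-digit ID for
# illustration, the whole space can be swept quickly by an unauthenticated
# scanner probing meeting IDs.
id_space = 10 ** 9
probes_per_sec = 1_000  # modest, hypothetical scan rate (assumption)
days = id_space / probes_per_sec / 86_400

print(f"{id_space:,} possible IDs, ~{days:.0f} days to sweep the space")

# A long random identifier, by contrast, is not enumerable at all:
print(f"{2 ** 128:.3e} possible 128-bit IDs")
```

A scanner only needs to find *some* live meeting, not a specific one, so in practice hits come far sooner than a full sweep.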

Again all of this means that Zoom fundamentally didn’t put the basic work in to keep secrets safe, didn’t apply industry-standard methods that are decades old.

It would be very nice, preferred really, if there were some way to write this off as naive or accidental.

However, there are now two major factors prohibiting that comfortable conclusion.

  1. The first is set in stone: Zoom’s CEO was formerly VP of engineering at WebEx after it was acquired by Cisco, and he later tried to publicly shame Cisco for using his “buggy code”. He was well aware of both safe coding practices and the reputational damage from bugs, since he tried to use exactly that as a competitive weapon against his former employer.
  2. The second is an entirely new development that validates how Zoom ended up where they are today: the CEO announced he will bring on board the ex-CSO of Facebook (now at Stanford, arguably still working for Facebook) to lead a group of CSOs. The last thing Zoom (or anyone, for that matter) needs is twelve CSOs doing steak dinners and golf trips while chatting at the 30,000-foot level about being safe (basically a government lobby group). The CEO needs expert product security managers with their ears to the ground, digging through tickets, seeing detailed customer complaints, integrated deep into the engineering organization. Instead he has announced an appeal-to-authority fallacy (a list of names and associations) with a very political agenda, just like when tobacco companies hired Stanford doctors to tell everyone smoking was safe.

This is not about patching or a quick fix. It really is about an organizational culture that would choose ECB mode for encryption, practice weak secrets management everywhere, and then decide to bring in a disgraced ex-CSO from Facebook for privacy guidance.

*** This is a Security Bloggers Network syndicated blog from flyingpenguin authored by Davi Ottenheimer. Read the original post at: