On November 9, 2020, the Federal Trade Commission (FTC) announced in a press release that it had reached a settlement with Zoom Video Communications, Inc. (Zoom) to resolve allegations that Zoom had engaged in unfair and deceptive acts with regard to its video conferencing services.  Financial institutions and other companies that allowed remote workers to use this platform should carefully assess what impact this consent order may have and what changes may be needed to protect virtual business meetings going forward.

The FTC alleged in its complaint that:

  • Zoom deceptively marketed its services as offering “end-to-end, 256-bit encryption,” when in fact it provided a lower level of security.  The FTC claimed that Zoom’s misleading claims gave users a false sense of security when discussing sensitive topics such as financial information.
  • Zoom deceived some users who wanted to store recorded meetings on the company’s cloud storage by falsely claiming that those meetings were encrypted immediately after the meeting ended.  Instead, some recordings allegedly were stored unencrypted for up to 60 days on Zoom’s servers before being transferred to its secure cloud storage.
  • Zoom engaged in unfair practices when it allegedly installed software that allowed Zoom to automatically launch and join a user to a meeting by bypassing an Apple Safari browser safeguard that protected users from a common type of malware.  The software is alleged to have remained on users’ computers even after they deleted the Zoom app, and would automatically reinstall the Zoom app—without any user action—in certain circumstances without adequate notice or user consent.

Under the terms of the consent order, Zoom has agreed to do the following for the next twenty years:

  • establish and implement a comprehensive security program;
  • assess and document on an annual basis any potential internal and external security risks and develop ways to safeguard against such risks;
  • implement a vulnerability management program;
  • deploy safeguards such as multi-factor authentication to protect against unauthorized access to its network, institute data deletion controls, and take steps to prevent the use of known compromised user credentials;
  • review any software updates for security flaws and ensure the updates will not hamper third-party security features;
  • implement regular security training for all employees, including specialized training for developers and engineers;
  • not make misrepresentations about its privacy and security practices, including about how it collects, uses, maintains, or discloses personal information; its security features; and the extent to which users can control the privacy or security of their personal information;
  • obtain biennial assessments of its security program by an independent third party, which the FTC has authority to approve; and
  • notify the FTC if Zoom experiences a data breach.

In a blog post about the settlement, the FTC justified the need for an enforcement action by noting that, “Even though Zoom has discontinued most of the practices challenged in the complaint, the most effective means for future compliance is a comprehensive security make-over assessed by a qualified third party, monitored by the FTC, and enforceable in court.”

Perhaps foreshadowing what future FTC enforcement actions may look like under a Biden Administration, both of the Democratic FTC Commissioners submitted dissenting statements calling for stronger actions to be taken against Zoom.  In his dissenting statement, Commissioner Rohit Chopra criticized the consent order for failing to provide any remediation for Zoom users or payment of any civil penalties by Zoom.  In her dissenting statement, Commissioner Rebecca Kelly Slaughter also criticized the consent order for failing to require Zoom to take any corrective action to mitigate the harm to customers.  In addition, she asserted that the consent order should have gone beyond security allegations and also addressed privacy, such as by requiring Zoom to engage in a review of the risks to consumer privacy presented by its products and services, to implement procedures to routinely review such risks, and to build in privacy-risk mitigation before implementing any new or modified product, service, or practice.

For more insights on the consent order’s implications, please listen to our upcoming Business Better podcast on this topic, which will be available here.