Saturday, September 12, 2020

TikTok’s Buyer Just Got Another Problem: Child Facial Recognition Class Action

[This post first appeared on MusicTech.Solutions]

There is a long line of copyright infringement cases demonstrating how hard it is to right a situation that starts out wrong. In addition to the copyright problems that TikTok is frantically trying to buy its way out of, TikTok also has a problem with how it treats children. This isn’t the first time around for TikTok’s exploitation of children–the company was previously fined by the FTC, as the plaintiffs note in their complaint. A group of child advocates has also complained to the FTC that TikTok is ignoring the FTC’s orders.

As any TikTok buyer will soon discover, TikTok is the gift that keeps on giving. TikTok has been sued in a multidistrict class action for violating the privacy rights of children and the biometric privacy laws of several states (TikTok’s parent company Bytedance is also named). In their complaint (now In re TikTok, Inc., Consumer Privacy Litigation, Case No. 1:20-cv-04699, Master Docket No. 20 C 4699, U.S.D.C. N. Dist. Ill. East. Div.), the children state:

Part of the reason for TikTok’s popularity, particularly with younger users such as Plaintiff, are the filters and other effects users can apply to their own videos, as well as those uploaded by others. In order to utilize many of these effects, Defendants scan users’ faces and “face geometry” to capture their biometric data, as well as to determine the user’s age using an algorithm.

In collecting and utilizing Plaintiff’s and the Class’ biometric identifiers and biometric information (referred to collectively at times as “biometrics”), Defendants fail to: (1) warn users that the app captures, collects, and stores their biometric data; (2) inform users of the purpose or length of time that they collect, store, and use biometric data; (3) obtain users’ written consent to capture their biometric data; and (4) implement and/or make publicly available a written policy disclosing to users its practices concerning the collection, use, and destruction of their biometric information in violation of the Illinois Biometric Information Privacy Act (“BIPA”), 740 ILCS 14/1, et seq….

Many of the features and effects require scanning the user’s face geometry in order to place effects over the user’s face, swap the user’s face for an emoji or other individual’s faces, or enhance aspects of their facial features.

But Plaintiff and similarly situated users place themselves at risk when they utilize TikTok features that require access to users’ biometric identifiers and/or information. TikTok acknowledges that it shares personal information that it collects from users with third parties, including entities in China. Multiple U.S. military branches and the Transportation Security Administration have banned use of TikTok due to privacy and cybersecurity concerns.

TikTok, and its predecessor, also have a long history of exploiting the millions of minors that make up the lion’s share of TikTok’s user base. In 2019, the Federal Trade Commission settled a case against TikTok for violating the Children’s Online Privacy Protection Act by improperly collecting personal information from children under 13 years old without their parents’ consent. The FTC fined Defendants $5.7 million, the largest COPPA fine in the FTC’s history.

In response to complaints regarding children under 13 years old using the app, TikTok implemented a feature that scans the user’s face to determine if he or she appears to be 13 years old or younger. TikTok compares the geometry and features of the individual’s face to an algorithm to determine his or her age.

In other words, TikTok violates the law by capturing child biometric data, then captures still more child biometric data to feed an algorithm meant to catch itself violating the law.

The case has progressed to the settlement stage. In response to reported allegations that TikTok was attempting to fix the outcome of the settlement by cherry-picking which attorneys participate in the settlement mediation, the court issued this statement in a case management order dated yesterday (9/1/20) (my emphasis):

[T]he Court is informed by certain Plaintiffs’ counsel that progress has been made in settlement discussions with Defendants. Certain other Plaintiffs’ counsel take umbrage with the manner in which those settlement discussions have taken place, stating that they were not allowed to participate in the settlement discussions (for one reason or another) and have not been informed of the terms of any potential settlement. It goes without saying that, before this Court can approve a class-wide settlement of any kind, it must consider the factors set forth in Fed. R. Civ. P. 23(a), (b), and (e). Those factors include, without limitation, whether the settling class representatives and class counsel have adequately represented the class, whether the proposal was negotiated at arm’s length, whether the relief provided for the class is adequate, and whether the proposed settlement treats class members equitably relative to one another.

In the Court’s experience, it often is advisable for the settling plaintiffs to encourage the participation in the settlement process of attorneys who represent other plaintiffs who have brought similar claims in other venues. After all, those attorneys may represent potential class members, possible objectors, or others who may opt out of any settlement class altogether.

So the Court is basically not having it when it comes to TikTok’s tactics. It may be shocking that TikTok even wound up in this situation: getting sued for strong-arming children, and then strong-arming children again in its attempt to escape liability (and potential criminal prosecution).

All of this occurs in the shadow of the US government’s order requiring that Bytedance divest itself of TikTok.  And that leads to the most interesting part of the court’s case management order:

[T]he attorneys for certain Plaintiffs have raised the concern that any upcoming sale of TikTok, Inc., or its assets by its current owner to a domestic company may result in the destruction of relevant documents as that term is used in the Federal Rules of Civil Procedure. The Court wants to make it clear that, in the event that such a sale takes place, any successor-in-interest shall be bound by Paragraph of CMO No. 1 (just as the Defendants are now) and must make all reasonable efforts to preserve any and all evidence in the possession, custody, or control of TikTok, Inc., that is relevant to the claims and defenses raised in this action.

Of course, I’m sure it’s not lost on the Court that much of this evidence may be located outside of the Court’s jurisdiction and would be subject to China’s National Intelligence Law.

Who wants to bet that preservation ship has already sailed?

