Testimony: Biometrics Bill
The NYCDSA Tech Action working group supports the passage of Int. 0213-2026 [1]. The following testimony was submitted on March 4th, 2026, and was preceded by an earlier version presented at City Council on March 2nd, 2026.
NYCDSA Tech Action is deeply concerned with the expansion of mass surveillance and its consequences, including ICE harassment of New Yorkers. Immigration enforcement today relies not only on government authority, but also on private technologies, including facial recognition tools, biometric databases, and large-scale data sharing agreements between technology deployers, developers, data brokers, and government. Extensive academic literature has established that these tools are rife with biases across gender, race, age, and disability [2]. New Yorkers should not be forced to accept biometric surveillance as part of simple, daily activities such as buying groceries or taking their kids to a sports game. Int. 0213-2026 intends to grant New Yorkers the power to give or withhold consent for the use of biometric recognition technology, while simultaneously preventing any place or provider of public accommodation from refusing service, charging different prices or rates, or otherwise penalizing customers who choose to withhold consent. We also laud the strong provisions in the bill that prevent the disclosing, selling, leasing, trading, or sharing of biometric data in exchange for anything of value with any third party.
However, we are concerned that, as written, the bill text underspecifies notice and consent. Underspecification allows places or providers of public accommodation to instantiate superficial notice and consent procedures that manipulate New Yorkers, circumventing the intentions of the bill and allowing mass surveillance to proliferate unchecked. An extensive body of legal and interaction design research has documented and analyzed this exact issue, termed "dark patterns" [3,4]. Dark patterns are manipulative or deceptive interaction design choices. For example, burying terms of consent within broader terms of service or loyalty program agreements may technically satisfy notice requirements without customers ever meaningfully registering what they have agreed to. Online cookie consent practices are also rife with dark patterns that manipulate users into accepting the cookie-sharing preferences of website owners [4]. We call for the bill language around data collection signage and written consent to explicitly prohibit the use of dark patterns, as codified in academic literature [3]. Beyond prohibiting deceptive practices, the bill language should be expanded to mandate best practices. This may necessitate mandating affirmative opt-in rather than opt-out consent procedures, which should include notice as to how the data will be used and for what purpose, as well as notice around consent timelines and revocation, i.e., how long consent is valid and how to revoke or revisit that consent at any time. Relatedly, bill language around data collection signage and written consent must also be expanded to protect all New Yorkers, explicitly incorporating accessibility provisions for individuals with visual impairments, non-English speakers, and minors who may not be able to consent. Ultimately, even with these provisions, individual notice and consent is a fraught privacy mechanism, as individual data sharing has collective impact (e.g.,
an individual sharing their data can help build surveillance tools that hurt their neighbors). In light of this power dynamic, we encourage council members to introduce legislation supporting the formation and rights of community data trusts or data unions, which would facilitate collective bargaining around data rights, enabling more equitable and just privacy protections for all New Yorkers [3,4].
We are also concerned that, as written, this bill does not apply to uses of biometric data beyond identification and verification. Gait, behavior, or sentiment analysis tools leveraging biometric data can be used for applications like crime or threat recognition [5], which are prone to the same gender, race, age, and disability biases that affect all algorithmic tools leveraging biometric data [2]. Algorithmic price discrimination [6] using biometric data can be enacted without necessarily requiring identification. For example, biometric data could facilitate implicit or explicit ethnic group profiling to upcharge grocery items central to cultural cuisines. More expansive bill language addressing the range of biometric applications, both known and unforeseen, is necessary to better protect New Yorkers.
Recommendation
The Council should pass Int. 0213-2026 after addressing the aforementioned concerns. Thank you for the opportunity to submit testimony.
Signed,
Democratic Socialists of America (DSA) Technology Action Working Group
Erik Sandahl, Hasan Khan, Raaid, Samuel Whalen, Shruthi Velidi, Sohini Upadhyay, Tiffany W
References:
[1] Int. 0213-2026
[2] Bias in data-driven artificial intelligence systems—An introductory survey
[3] What Makes a Dark Pattern… Dark?
[4] The FTC and the CPRA’s Regulation of Dark Patterns in Cookie Consent Notices
[5] Threat Recognition from Gait Analysis
[6] AI algorithms, price discrimination and collusion: a technological, economic and legal perspective