This article is part of the 2023 BCLT-BTLJ Symposium.
By Al Malecha, JD 2024
Many of us in the legal field have had an overwhelmingly negative first impression of AI, which is often brought up in the context of copyright infringement, fictitious case law, or shirked liability. Outside of the legal sphere, the perception of AI is not much better. Fifty-two percent of Americans are “more concerned than excited about the increased use of artificial intelligence,”1 and in the instances where AI is relatively harmless, it is little more than a marketing gimmick. Yesterday, I tried a Coke with a new flavor “created with artificial intelligence.”2 It was not very good.
Despite the risks and the gimmicks, AI has improved many of our lives. AI can predict how and when machinery parts wear out, which prevents accidents.3 AI can detect sepsis before it starts, which dramatically reduces one of the primary causes of deaths in hospitals.4 AI has also allowed us to create more accurate climate models to inform policy decisions and address climate change.5
This month, an AI-enabled necklace called the Rewind Pendant launched. According to its creators, it will function as a personal search engine for “everything you’ve seen, said, or heard.”6 The Rewind Pendant works by running constantly in the background, capturing everything on your screen and all the audio around you. Rewind’s founder Dan Siroker created Rewind because of his experience of going deaf in his 20s. After a hearing aid changed his life, Siroker began to “hunt for ways technology can augment human capabilities and give us superpowers.”7
As someone with a disability that can make typing and other fine motor skills painful, something like the Rewind Pendant would change my life. I can think of numerous instances where AI-enabled technology could do similarly incredible things for others. But despite my enthusiasm for AI’s ability to help us create a more accessible world, this product gives me serious pause.
You don’t have to think too hard to imagine how a device that records everything could be easily abused. Under federal law and the laws of thirty-five states, only one party needs to consent to a recording.8 In these states, it would be perfectly legal for someone wearing the pendant to record a conversation with another person who simply thinks the device is a necklace.
Beyond the device’s collection of data, there are insufficient safeguards at the federal level to prevent the information collected on the device from being used by law enforcement and third parties. In California, however, multiple aspects of our privacy laws reduce the risks inherent in such a device.
First, the California Invasion of Privacy Act requires that both parties consent to a recording. This act, codified at California Penal Code §§ 630-638.55, was passed in 1967, five years before Californians voted to amend the state constitution to include an explicit right to privacy. Violation of § 632(a) is a misdemeanor, and § 637.2 further provides that victims of non-consensual recording may bring a civil suit against the person who recorded them.
Since the constitutional privacy amendment passed in 1972, California has added further protections against invasions of privacy facilitated by new technologies. The California Consumer Privacy Act of 2018 gives Californians both the right to know what personal information a business collects about them and the right to opt out of the sale or sharing of that information.9 The California Privacy Rights Act (CPRA) of 2020 goes further and restricts profiling, which is defined as “any form of automated processing of personal information.”10
While files on Rewind are stored locally, if a user wants the device to summarize or process information from their day, Rewind states that the user’s text files are shared with its “LLM partners.” LLM stands for “large language model,” a type of AI that uses deep learning on large datasets to process natural language. It’s not clear what these partners would do with the information derived from audio recordings, and perhaps more importantly, it’s not clear who the partners even are. The CPRA would enable users to understand how their data is being processed by an LLM, understand how their data is incorporated into the algorithm’s learning, and hopefully prevent a user’s profile from being built and accessed by third parties. The California Electronic Communications Privacy Act adds another layer of protection by requiring that law enforcement obtain a circumscribed warrant before compelling the disclosure of information from either a service provider or an electronic device directly.11
For AI-based tools to create a more accessible future, they must be implemented within policy frameworks that recognize the importance of the right to privacy for all, especially disabled individuals and other marginalized groups. The California privacy amendment, even after 50 years, serves as a solid foundation for protecting against abuses by new technologies. We must continue to advocate for policy that protects the right to privacy in California, and for federal legislation like the American Data Privacy and Protection Act, which would prohibit the collection and use of certain “sensitive data” relating to health and disability status.12 We cannot accept the loss of privacy in the name of accessibility; rather, privacy is an essential aspect of an accessible society.