Podcast Transcript:
[SHABRINA] 0:07
Hello, and welcome to the BTLJ podcast. I’m your host, Shabrina Khansa.
[MAHIMA] 0:18
And I’m your host Mahima Madan.
[SHABRINA] 0:20
In today’s episode, we’ll explore the fascinating world of biometrics and its role in a recent case, Barnett v. Apple, which involves a dispute over Apple’s use of facial recognition and Touch ID technology.[1] Biometric authentication is a high-tech way of using one’s physical characteristics, such as their fingerprints or facial structure, to verify their identity. This technology has revolutionized the way we secure our devices, but its increasing popularity has raised concerns about privacy and data protection. That’s where the Biometric Information Privacy Act comes in.[2] The Illinois statute, widely known as “BIPA,” governs the collection, use, and storage of biometric information. It requires companies to obtain written consent from individuals before collecting their biometric data and to take measures to protect that data.
[MAHIMA] 1:14
BIPA is now at the center of a class action suit, Barnett v. Apple.[3] Here, plaintiffs allege that Apple violated BIPA because it offered its users the option of using the facial and fingerprint recognition features without first instituting a written policy regarding the retention and destruction of this biometric data and without getting their written consent. On January 3, 2022, the trial court granted Apple’s motion to dismiss and thereafter, the Appellate Court affirmed its decision.[4] Today, we’ll explore the implications of this decision and the future of biometric privacy legislation with our guest expert, Tatiana Rice.
[SHABRINA] 1:59
Tatiana Rice serves as Senior Counsel with the Future of Privacy Forum’s U.S. Legislation team and leads their biometrics workstream. In her role, Tatiana researches and analyzes legal and legislative trends relating to consumer data privacy, biometric technologies, and privacy enforcement on the federal and state levels. We hope you enjoy the conversation!
[JJ] 2:41
Welcome Ms. Tatiana Rice. Thank you for sitting down with us. I’m JJ.
[JAANVI] 2:46
I’m Jaanvi.
[JJ] 2:48
And we are very excited to explore with you Apple’s use of facial recognition data in relation to Illinois’s Biometric Information Privacy Act, or BIPA, and the recent dismissal of Barnett v. Apple, a class action suit in which plaintiffs allege that Apple’s facial recognition practices violated BIPA. Can you just start off by letting the listeners know, what is biometric information, and why are states interested in regulating biometric information?
[TATIANA RICE] 3:17
Yeah, for sure, and thank you all so much for having me on the podcast. I love talking about BIPA and, it’s funny, there have been a couple of other podcasts, even this week, talking about BIPA and these ramifications. So this is definitely extremely timely. So biometric information is kind of a funky term. In the strict legal sense, biometric information, at least in the US and according to US agencies, is a unique physiological characteristic that can be used to uniquely identify somebody. So, for example, most people think of biometrics as facial recognition, your fingerprint, an iris scan, things like that. However, there is actually a difference in the science world and, for scientific purposes, sometimes researchers will say a biometric is simply a measurable physical characteristic or personal behavioral trait. Because if you just take the words bio and metric, it’s just physical characteristics, and you’re measuring them. And you can see, between these differences, where the scientific understanding departs from the legal or policy understanding, we’re already going to have a little bit of a problem, and I’m sure we’ll talk a bit about that throughout this conversation.
[JJ] 4:38
Would you say that biometric information in general is more sensitive than other types of personal information? And if so, what about biometric data makes it more sensitive?
[TATIANA RICE] 4:50
Biometric data is absolutely more sensitive than other forms of data, if we’re focusing exclusively on that legal definition of a biometric being able to uniquely identify an individual, like your face or your fingerprint. And the reason that it’s so sensitive is because it’s immutable, and it can’t be altered. So that means if it’s breached, you can’t do a whole lot. So, for example, if your social security number is breached, you can often change it. If your credit card is breached, again, you can change it, you can get rid of your credit card. You can’t get rid of your face, you can’t get rid of your fingerprint. So these things, if they are out there, and they pose a security threat, it’s kind of always there.
And even outside the context of BIPA and this understanding of a biometric being tied to your unique identity, if we’re just thinking about body characteristics that can be measured, you can also create these sensitive inferences that are actually not covered by these biometric privacy laws. So, for example, you can do a gait analysis. People can be uniquely identified based on their gait. You can also use that same kind of data to infer whether somebody has a disease like Parkinson’s. And that kind of stuff is not covered by biometric laws. And I think it’s also really interesting, because it’s still a privacy risk. Right?
[JJ] 6:11
Thank you. Yeah, that’s very interesting. Can you tell us about what the Biometric Information Privacy Act is, or BIPA, and why is it significant?
[TATIANA RICE] 6:21
The Illinois Biometric Information Privacy Act is kind of the seminal biometric privacy law in general, probably even globally, because it is one of the first privacy laws really to go into effect. It went into effect in 2008, when I was in eighth grade, even before the invention of the modern cell phone. And it’s still today considered one of the strongest data privacy laws because it has a private right of action, which is pretty unique among, at least in the US, the privacy laws that we have on the books. And how US policymakers outside of Illinois and the courts have thought about the scope of these laws is heavily dependent on BIPA because, again, BIPA contains this private right of action that allows the courts to decide what the boundaries are of what technologies should or should not be within the scope of the law.
In a recent case, Cothron v. White Castle,[5] the Illinois Supreme Court decided what a privacy violation is, and when a violation occurs under the law. And that’s also significant because there just hasn’t been a ruling like that before, and other regulators, from folks in California regulating under the CCPA or CPRA, to the EU, are going to have to decide whether the Illinois Supreme Court’s analysis of what is a privacy violation under BIPA applies to their law, or whether they think that kind of analysis is correct, generally. So it’s a really interesting law because it is such a focal point of privacy generally, even outside of biometric privacy.
[JJ] 7:59
So in the Barnett v. Apple ruling, which we will later discuss in the episode, the court explicitly references Sections 15 and 20 of BIPA. Can you explain to us why these sections are important?
[TATIANA RICE] 8:14
So BIPA has four main sections or provisions that govern how entities can collect and use biometric information. And again, BIPA kind of has these definitions of biometric information and biometric identifiers that are supposed to be aimed towards biometrics that are used to uniquely identify individuals. So, you have Section 15(a), which says that you have to have a retention or destruction schedule that is publicly available that tells people or employees how long you are retaining their data and when you are deleting it, or what the circumstances are under which you are deleting it. So in Illinois, at a minimum, you have to delete it after three years or whenever the processing purpose elapses.
Section 15(b) is really the core of it, I would say, and where a lot of violations occur, which is that you need to obtain written consent to be able to collect biometric information. And again, this law is extremely outdated, and I will always advocate for it to get amended, because even just requiring written consent is quite outdated. Like, having to actually do a wet ink signature when a lot of people nowadays, you know, do electronic signatures. But, I digress.
Section 15(c) and (d) talk about how you are prohibited from selling or sharing biometric data, and then 15(e) is that reasonable security, so you have to have reasonable security to store biometric information.
Section 20, as you were talking about, is the enforcement mechanism. It’s what allows individuals to bring a private right of action against an entity if they are collecting biometric information and they don’t get consent or they don’t have a retention schedule, or whatever. And this is really interesting because, unlike other laws in general, Section 20 says that anyone aggrieved by a violation may bring suit, which the Illinois Supreme Court has interpreted to mean that anybody can bring a cause of action, unlike in other federal courts and other state courts where you actually need to allege an actual injury. Here, there’s what they call statutory damages, where you don’t have to allege any kind of injury to bring suit. It’s enough that a company just violated the law to be able to bring a cause of action.
[JJ] 10:40
Awesome. Thank you for giving us the background of biometric information and BIPA. I want to move on to talk about Apple’s biometric technologies. Could you briefly explain how Apple’s Face ID and Touch ID/ fingerprint tools work?
[TATIANA RICE] 10:58
Biometric technologies in general are really interesting because they come in a lot of different forms. But Apple’s Face ID and Touch ID, lucky for me, are pretty straightforward, because most people do have an Apple iPhone, so they really are at least familiar with the technology or have used it before. The interesting part of when you use Face ID with your Apple phone is that it’s scanning your face, or it’s scanning your fingerprint, to unlock your phone. It’s authenticating your identity. So, specifically, they undergo what is called biometric encryption, or biometric hashing, which means that they aren’t just storing a photo of your face on a server somewhere, or they aren’t just storing a photo of your fingerprint somewhere.
What they do is they take different vectors of your biometric identifier and they convert it into a mathematical representation. So for a face, that could look like the vectors between your nose and your cheek (and I know people who are listening can’t actually see me doing this, but I am doing it), between your eyes, between your nose and your chin. And they convert all of that information, which is unique to each individual person, into another unique mathematical representation. And, according to Apple, they take over 30,000 mathematical vectors when creating these kinds of unique biometric mathematical representations. And this is actually really important because biometric encryption is a really, really strong method of securing biometric data. It’s not totally foolproof, but it is very, very difficult to reverse engineer back into an original biometric identifier. And that is different from other kinds of facial recognition technology out there that is just storing your fingerprint or your face print. And Apple is storing this information on device, which I know we’re going to talk about a little bit in this Barnett case, but it is a fairly privacy-protective way of collecting and storing biometric information.
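Editor’s note: for readers who want to see the idea of a “mathematical representation” concretely, here is a minimal, purely hypothetical Python sketch. It is not Apple’s algorithm (Face ID uses roughly 30,000 depth points and keeps its templates in secure hardware); the landmark coordinates, the pairwise-distance template, and the similarity threshold below are all invented for illustration.

```python
# Hypothetical sketch, not Apple's method: turn facial landmark measurements into a
# numeric "template" and match a new scan against it. What gets stored and compared
# is a vector of numbers derived from measurements of the face, not a photograph.
import numpy as np

def face_template(landmarks: np.ndarray) -> np.ndarray:
    """Convert (x, y) landmark points into a normalized vector of pairwise distances."""
    n = len(landmarks)
    dists = [np.linalg.norm(landmarks[i] - landmarks[j])
             for i in range(n) for j in range(i + 1, n)]
    vec = np.array(dists)
    return vec / np.linalg.norm(vec)  # normalizing makes the template scale-invariant

def matches(enrolled: np.ndarray, new_scan_landmarks: np.ndarray, threshold: float = 0.999) -> bool:
    """Authenticate if the new scan's template is close enough to the enrolled template."""
    similarity = float(np.dot(enrolled, face_template(new_scan_landmarks)))  # cosine similarity
    return similarity >= threshold

# Toy enrollment and verification with made-up landmark coordinates.
enrolled = face_template(np.array([[0.0, 0.0], [2.0, 0.1], [1.0, 1.5], [1.0, 3.0]]))
same_face = np.array([[0.0, 0.0], [2.0, 0.12], [1.0, 1.52], [1.0, 3.01]])  # same geometry, slight sensor noise
other_face = np.array([[0.0, 0.0], [1.4, 0.3], [0.5, 1.1], [0.9, 2.2]])    # a different geometry
print(matches(enrolled, same_face))   # expected: True
print(matches(enrolled, other_face))  # expected: False
```

A real system would use a learned embedding and a calibrated threshold, but the essential point survives the simplification: the stored template is a set of derived numbers, which is why reverse engineering it back into a face image is difficult.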
[JJ] 13:03
Can you talk a little bit about the technologies that are used to read people’s emotions? You described earlier how facial recognition technologies use these vectors, which are mathematical representations to represent the distance between someone’s eyes or their mouth and nose. Can you explain a little bit of how these emotion recognition technologies work?
[TATIANA RICE] 13:30
Yeah, so emotion recognition is what I would call a “characterization technology.” So what they’re doing is they are taking a photo or video, and they are making some kind of inference based on that visual input. So if the computer, or the AI, is able to determine, “hey, this person is smiling,” they will associate that with, “Okay, there is a 95% chance this person is happy.” Or if the person’s eyes are darting, and this is actually something that happens a lot with exam proctoring software, they will track your eye movements to ask, “Okay, is this person cheating, or could they be? Is this a red flag?” They will kind of characterize that and they’ll make inferences based on it. It’s actually very, very different from recognition, which, again, typically is completely identifying, because your eyes, your nose, your face are completely unique to you. How your eyes are darting is less unique to you. But nonetheless, you still are forming these really sensitive characterizations and sensitive inferences, which then can be linked to your identity, whether that’s you’re test taker 1234, or you’re candidate 5678.
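Editor’s note: a toy sketch of what “characterization” might look like in code, with every feature name, weight, and threshold invented for illustration. The output is an inference with a confidence score, not an identification, which is exactly the distinction being drawn here.

```python
# Hypothetical "characterization" sketch: map measured facial features to an inferred
# label plus a confidence score. Nothing here identifies who the person is; it only
# produces an inference (and illustrates how crude such inferences can be).

def characterize(features: dict[str, float]) -> tuple[str, float]:
    """Return (inferred_state, confidence) from a few facial measurements in [0, 1]."""
    # Toy linear score: smiling pushes toward "happy", darting eyes toward a proctoring-style flag.
    happy_score = 0.8 * features.get("mouth_corner_raise", 0.0) + 0.2 * features.get("eye_openness", 0.0)
    flag_score = features.get("gaze_shift_rate", 0.0)
    if flag_score > 0.7:
        return ("flagged for review", flag_score)   # e.g., an exam-proctoring red flag
    return ("happy", happy_score) if happy_score > 0.5 else ("neutral", 1.0 - happy_score)

print(characterize({"mouth_corner_raise": 0.9, "eye_openness": 0.6}))  # roughly ('happy', 0.84)
print(characterize({"gaze_shift_rate": 0.85}))                         # ('flagged for review', 0.85)
```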
[JJ] 14:52
Can you give us a summary or just a quick overview of the case Barnett v. Apple?
[TATIANA RICE] 14:58
Sure, so the case involves three plaintiffs based out of Illinois, which is where BIPA comes from, who sued Apple over its Face ID and Touch ID technologies. So specifically, they alleged that Apple failed to give them a publicly available retention and destruction schedule. So Apple did not tell them how long it was going to be storing their biometric data. But more importantly, Apple did not get their consent to be able to process that data. So generally, BIPA is a pretty plaintiff-friendly law. There are very few cases that actually go against tech companies or defendants, but this is actually one of them. So, procedurally, there was a motion to dismiss, in which Apple claimed that it was not subject to the requirements of BIPA because it technically was not collecting or possessing biometric data, since the data was completely stored on the device.
So again, going back to that mathematical representation that I was just talking about, they store that all on device. That is not getting shipped out to Apple’s servers. It’s not on their cloud. It is completely just on your phone. And so the Illinois trial court actually agreed with Apple’s argument that it was not collecting biometric information, because, again, Apple was converting it into a mathematical representation, which was only stored on the user’s phone. It was never actually shared with Apple’s servers. So the plaintiffs appealed this decision to the Illinois appellate court, but the appellate court actually agreed with the lower court’s decision and agreed with Apple, relying on dictionary definitions of what possession and collection are, because the statute doesn’t actually define those terms. And that’s actually pretty similar to other privacy laws, in that they don’t get into the weeds of what collection and possession are. So courts often have to rely on these kinds of dictionary definitions and understandings of what these terms mean.
[JJ] 16:59
I want to talk a little bit more about this idea of collection, capturing, and possessing. You mentioned that Apple made the argument that its collection of biometric information does not implicate users’ privacy because all of the steps to capture and collect users’ biometric information happen, and the data is stored, within the device. This seems to imply that BIPA is only concerned with preventing companies from sharing consumers’ biometric information, rather than collecting it. Do you agree? And do you believe this is true for privacy law in general? And how might such thinking be dangerous for consumer privacy?
[TATIANA RICE] 17:40
I actually agree with both characterizations. I agree with Apple. And I agree with what you’re saying that it’s dangerous for consumer privacy. And what I mean is the largest concern of BIPA, again, is the security of biometric data to prevent identity theft. So in this case, Barnett v. Apple, Apple was using the gold standard for security of biometric data by encrypting it and storing it on device. Apple could not do a whole lot more in terms of securing the biometric data. And we do want to incentivize companies to use this kind of highest security possible because it is the highest risk kind of data. And we don’t want this kind of extreme liability for companies that are trying to do the right thing.
However, where we do run into problems, and what’s dangerous about Apple’s argument, is that you can’t just say having really good security means that consumers shouldn’t worry about their privacy or shouldn’t have privacy rights like consent, right? It has to be both. So Apple can’t just say, “We have really good security, therefore we shouldn’t need to get consent, and therefore we shouldn’t need to have a retention or destruction schedule.” I don’t agree with that line of reasoning. It has to be both. And it’s actually something that’s also debated under the GDPR as well. The GDPR exempts what they call de-identified data, because that data, again, cannot be re-identified with the consumer.[6] But a lot of experts and a lot of regulators believe that biometric data can just never be outside the scope of a privacy law; it can never really be de-identified, or meet some security threshold that means you don’t have these other privacy obligations, because it is such sensitive information. And it really is, it will always be sensitive no matter what kind of security you’re using on it.
[JJ] 19:35
I want to move forward to a specific contradiction that seems quite apparent in this case, and ask for some clarification on it. So in December 2022, the First District Appellate Court of Illinois ruled that Apple does not capture or collect any biometric information because its users, as individual consumers, are free to stop using Apple’s facial recognition and Touch ID features and are able to delete any information collected by the Apple products.
However, the court also states, and I quote, “Apple is a sole owner of its software, while users are licensees. Users cannot access their own biometrics collected by and stored on their own devices without violating Apple’s software license agreement.”[7] These two statements seem contradictory. One suggests that users have complete control over the biometric information collected, while the other suggests that users have no control over their biometric information. Could you clarify what technical ability and what rights users have over their own biometric data?
[TATIANA RICE] 20:46
Yeah, I love this question, because so often courts don’t understand the technical background of how these technologies work. And I totally can see the contradiction here, so I think it’s worth getting in the weeds on. So as noted by the court, there is a difference between the product and the company, and even the product and the software. So I’ll get a little bit into the weeds. Apple designs the software that powers iPhones. The software is a set of computer programs and associated documentation and data. So in terms of the hierarchy, you have the user, and then you have the application software, and then you have the operating system, and then you have the hardware, which is the iPhone. So Face ID and Touch ID are products. They’re facial recognition systems or fingerprint recognition systems built on both the Apple hardware and software that is contained on the device. So that software license agreement completely governs the software that is on the iPhone, which means that Apple owns the intellectual property of the software on the iPhone. And as a lot of people who do IP know, no one else can use software you hold the IP to unless you permit them to do so. So by selling the phone to users, Apple has allowed users to use and license its software. It actually has very little to do with who owns the data collected by the software.
So when the court says users cannot access their own biometrics, remember, the software is responsible for encrypting their biometrics into these mathematical representations. So the use of that software to convert the identifiers into mathematical representations is protected by Apple’s IP agreement. So in that respect, no, users don’t have a whole lot of control to be able to get access to their biometric identifiers, because that would mean they would need to access the software to know how the encryption works from their biometric identifier to the mathematical representation. But according to the court, again, Face ID and Touch ID are optional features. Users can delete that data and opt not to use the feature. But I do think the court makes an interesting distinction between what is a product and what is the software.
[JAANVI] 23:24
Now, I’d like to transition towards discussing the issue of consent in the present case. The court ruled that users consented to the retention of biometric data from Apple’s face and fingerprint recognition features, even though Apple had not provided a written policy regarding the retention and destruction of the users’ biometric information. Could you explain to us why the court ruled this way?
[TATIANA RICE] 23:47
So BIPA is really fact specific. And I think it’s important to note here that the real holding, which carries legal weight, was that Apple did not collect biometric identifiers or information, so Apple did not need to abide by BIPA’s requirements for consent. But, as you note, the court does spend some time focusing on users’ voluntary participation in using these iPhone features. I would characterize that more as dicta than anything, but it is definitely worth examining. So a couple of facts that the court focuses on are, one, that again, these features are optional, users are opting to use them, and the user is the sole entity that decides to use these features. And to enable these features, the user employs his or her own device.
So the court distinguished these facts, looking at consent, from another case called Ronquillo, where an employer deployed a biometric system for their employees to clock in and out of work, and this is a very, very common fact scenario for BIPA cases.[8] The biometric device was owned by a third party vendor, and it was required for employment. So in that case, the court did find that the entity violated BIPA, and that it needed to get the employees’ consent, because, again, the biometric device was basically being forced on them. They didn’t get consent. And this third party vendor was the one who was storing their data. But again, I think the court’s intent here in providing these facts around voluntary use of these features, and contrasting it with Ronquillo, is to support its overall argument that it was the users, not Apple, who used and controlled these biometric features, and therefore BIPA did not apply to Apple.
[JAANVI] 25:41
As you just noted, the court in this case declared that BIPA was not violated and there was no violation of consumer privacy because users voluntarily participate in facial recognition technology. However, we also recognize that most users don’t quite fully understand how biometric collection technologies work or think through how much information is being collected through the use of these features. Do you think voluntary participation for facial recognition technology or other biometric technologies really rules out consumers’ privacy concerns?
[TATIANA RICE] 26:18
Absolutely not. Definitely a gimme. And I think this is a great point. And I do actually think it works both ways. So to your point, having a user use a biometric system, or any kind of program that is collecting personal data, is not informed consent. Most consumers, most reasonable consumers, know that some form of data is being collected, but they have no idea how it’s getting shared, how it’s getting stored, etc. Most people aren’t reading privacy policies. Even if they were, they would probably not understand the legalese that’s in a lot of them. So, though it’s pretty insufficient by itself to fully protect consumers, transparency into data collection, use, and transfer is extremely necessary to even begin talking about informed consent. And at the same time, it does work in the other direction, in promoting consumer trust, if you’re a company listening to this or thinking about this. So for example, if consumers knew that Apple was encrypting their biometric data in a format that is the gold standard of biometric data protection and almost impossible to hack, consumers probably would have greater trust in companies, knowing that they’re storing their data in this very protective way and actually thinking about them.
[JAANVI] 27:41
As previously seen, the courts have interpreted the statute broadly, in a manner favorable to the plaintiffs, and chipped away at defense arguments in earlier cases. In light of that, did you find the outcome, or any other aspect of Barnett v. Apple, surprising?
[TATIANA RICE] 27:59
Yeah, I do actually find this decision surprising because, as you said, BIPA cases that are actually litigated, and I’d say probably 75% or more are typically settled before they even get to the litigation phase, are usually decided in favor of the plaintiff or the plaintiff class action. So these defendant-favorable decisions are rare. And this on-device distinction is fairly new. The other new argument that I’ve seen come up, a little similar to this, is about where the download of the data occurs. So in another case, I don’t remember which, some large tech company was trying to use a facial recognition dataset in order to actually combat bias and discrimination in its own facial recognition system, because it wanted to use a more diverse training dataset in order to better train its own system.
But what they didn’t know was that this training set had photos of Illinois residents in it. And so they were sued, of course, under BIPA, with plaintiffs saying, “you didn’t get consent, you didn’t tell these people that you were collecting their biometric information.” And the court found that the download of the dataset did not happen in Illinois, and therefore BIPA did not apply. So it’s really interesting to see how some of these decisions are coming out, and what the contours of the law are shaping up to be, because privacy law, even in general, is so much in its infancy that these decisions have a very large impact on how we think about how data moves, how data flows, and when a data violation occurs. And I think it’s really interesting.
[JAANVI] 29:53
So would you say that this decision has provided some optimism for companies defending biometric privacy class actions?
[TATIANA RICE] 30:01
I would not. Anybody that I talk to, and their companies, and honestly, probably 95% of companies, are subject to BIPA, because most companies are using biometric authentication in some way, whether that’s for their employees, or they’re a store retailer and they have a video camera that could do facial recognition if somebody were to commit theft or something. But nobody’s optimistic about BIPA defense. If anything, most folks that I’ve talked to are pretty terrified and will actually go through some effort to actively avoid Illinois, whether that means they don’t offer their products in Illinois, or they’re just straight up not doing business in Illinois, because of how hostile the environment is right now for BIPA in Illinois.
So another recent case that’s worth really focusing on, because it’s making very large waves right now in BIPA-land and the privacy world, is the case that I alluded to before, Cothron v. White Castle.[9] There, the Illinois Supreme Court was deciding, again, what a privacy violation is. And it found that every single time somebody used a biometric system without prior informed consent violated the law, which meant there was a separate claim for damages. So every single time an employee clocks in or clocks out of work, which could be up to four times a day, over the course of a year, over the course of five years, those damages get to be astronomical. In fact, it was estimated that damages for White Castle could be around $17 billion, which would bankrupt them. And that, with a “B,” billion dollars, really scares companies, and they don’t want to take that risk even when trying to comply. A lot of folks will try to, but still end up getting brought into this realm of BIPA litigation one way or the other.
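Editor’s note: a back-of-the-envelope sketch of why per-scan accrual frightens companies. The headcount, scan frequency, and time period below are hypothetical and are not the White Castle figures; the $1,000 and $5,000 amounts are BIPA’s statutory damages for negligent and for intentional or reckless violations, respectively (740 ILCS 14/20).

```python
# Back-of-the-envelope sketch of how per-scan accrual balloons under Cothron's
# "every scan is a separate violation" reading. The workforce size, scans per day,
# and exposure period are hypothetical; $1,000 / $5,000 are BIPA's statutory damages
# for negligent vs. intentional or reckless violations.

employees = 1_000          # hypothetical class size
scans_per_day = 4          # clock in/out plus breaks, as discussed above
workdays_per_year = 250    # hypothetical
years = 5                  # hypothetical exposure period

violations = employees * scans_per_day * workdays_per_year * years
print(f"Violations: {violations:,}")                        # Violations: 5,000,000
print(f"Negligent ($1,000 each): ${violations * 1_000:,}")  # $5,000,000,000
print(f"Reckless  ($5,000 each): ${violations * 5_000:,}")  # $25,000,000,000
```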
[JJ] 32:03
Could you just explain the difference between what consent is versus informed consent?
[TATIANA RICE] 32:09
So, informed consent is actually not a term that comes out of the statute, but I would say most privacy laws are looking for some form of informed consent. So consent is you’re allowing an entity to collect your data, and that kind of stops there. Informed consent adds a condition on top of it, which means you understand that your data is being collected, you understand what the data is being collected for, and, because you know both of those things, you’re able to provide consent that is better informed about how your data is being used.
Informed consent is never perfect, for sure, but it’s definitely more ideal to get informed consent versus just regular consent. Because, as we talked about before, most reasonable consumers don’t actually know how their data is getting used or shared in the ecosystem.
[JJ] 33:04
And how do companies make sure that they’re getting informed consent as opposed to just normal consent?
[TATIANA RICE] 33:11
Yeah, unfortunately, it’s not a science, and some companies are more well intentioned than others. So some vendors will really go the extra mile in making sure that all the information is on the user interface, so that somebody sees: this is how we’re using your data, this is how long we are keeping it, we don’t share it with anybody, etc. Other companies are just trying to say, “Hey, we comply with the law,” just for compliance purposes; they don’t actually really care. They’ll just do a regular privacy policy and say, “If you want to know how we are collecting, using, and sharing your biometric data, here’s a link to our privacy policy. You can click on it.” And I think it’s pretty universal that nobody actually does that, or maybe 2% of the population does that. So it is interesting, and I actually am not sure how the courts have come down on that as it relates to BIPA. I think a lot of times, in privacy law generally, they’ve been pretty lenient, but I can see the tides changing, especially as more folks are concerned about how their data is being used, shared, and sold.
[JAANVI] 34:24
Moving on, facial recognition technology is rapidly becoming more prevalent in both employment and consumer contexts. What are the implications of this ruling for consumers and companies in the future?
[TATIANA RICE] 34:38
Yeah, so again, I think back to my original point at the top. Biometric technologies come in a lot of different forms. So facial recognition gets all the attention because it’s the thing consumers interact with most, and it’s always like the big bad. And absolutely, it is extremely problematic, particularly if it’s used for mass surveillance, if it’s used by ICE, if it’s used for all of these kinds of nefarious purposes that could chill speech, chill activities, make people uncomfortable generally. But there are also a lot of different other forms of biometric technologies that are also really problematic and getting used more and more in the employment and consumer contexts that folks aren’t always thinking about, and others that are actually really valuable in the employment and consumer context.
So I’ll give a couple of examples. One is emotion recognition. Folks are developing this technology, which is, again, very much in its infancy right now. I think most people would say it is not accurate; it’s kind of snake oil at this point. Nonetheless, it is getting used, especially in the context of employment and hiring, where they will use this kind of video technology to try to determine: does this person seem like they’re telling the truth? Do they seem happy or sad? Do they seem like they might be upset at something that was said? And what is really interesting about that, and what BIPA doesn’t cover and other biometric laws don’t cover, is discrimination and bias. So in that emotion recognition scenario, somebody who is differently abled or who is neurodivergent is going to express themselves extremely differently from somebody who is neurotypical. And so there are these kinds of discriminatory effects, even beyond facial recognition, which we know has discriminatory effects, that are getting built into other kinds of technologies that could be called biometric.
And at the same time, we have things like biometric authentication, which can be a really good thing to protect against fraud. So more and more we’re hearing about these AI deepfakes, and how people are getting subjected to revenge porn, and all of these different ways to access people’s bank accounts, and all of that. And biometric authentication has actually been a pretty good way to combat that, and we don’t want to hinder that progress. But we do want to make sure that we’re doing it in a safe way that promotes consumer trust.
[JAANVI] 37:16
In light of what you just mentioned, how can states better protect consumers’ biometric information?
[TATIANA RICE] 37:24
Pass comprehensive privacy law! Given this is kind of my job, I talk a lot to legislators about how US legislation could really help protect consumers better. I purposely say comprehensive privacy law rather than just biometric data privacy law because, again, of this point about how fluid data can be. So data that’s biometric in one context can be health data in a completely different context, like I said before with the gait example. And laws that are designed just for biometrics, in this limited capacity of identifying somebody, are not going to get at those other privacy risks. They don’t account for bias and discrimination in systems. So more comprehensive laws that account for all these different kinds of data uses and all these different kinds of risks are really the better way to protect people’s personal, sensitive, body-based data.
[JAANVI] 38:27
What do you think the future of biometric privacy regulation holds, and what would it look like in the near future?
[TATIANA RICE] 38:35
It’s really hard to tell, unfortunately. A lot of states are trying to follow BIPA, and they’re trying to enact their own biometric data privacy law, but unfortunately, aren’t having a ton of success because of how hostile the BIPA environment in Illinois is. Some folks have tried to enact BIPA without the private right of action. That similarly has been really difficult for lawmakers to parse through because, without meaningful enforcement, a privacy law is kind of moot. And a lot of state AGs don’t always have the expertise or bandwidth to be able to also enforce privacy laws. So I’m not trying to defend companies and saying that states shouldn’t pass biometric privacy laws. It’s just BIPA needs to be reformed, especially since it was passed in 2008. And I don’t see other states being able to pass biometric data privacy laws until BIPA is reformed or there’s a federal national standard, which both are kind of an uphill battle.
[JJ] 39:42
Can you explain a little bit by what you mean that BIPA should be reformed? And you mentioned that BIPA is hostile, could you just explain a little bit more what you mean by that?
[TATIANA RICE] 39:52
Yeah, so BIPA, again, has this private right of action, which is very unique for a privacy law. And, because plaintiffs don’t have to allege any kind of injury, there have been somewhere over 2,000, maybe 3,000, lawsuits just in Illinois under this law. And that has made it very difficult for folks to be able to operate in Illinois, because they’re just scared to death of this law. And I’m not saying that that’s a bad thing, right? BIPA is extremely privacy protective, and I will always say that. And at the same time, right now, it is quite unworkable in a lot of different senses. So, for example, requiring written consent is really difficult if you are operating solely on a user interface basis, right? So even if, let’s say, for the sake of argument, Apple is covered by BIPA for its iPhone features, are you going to write out a paper and sign it, and then mail it back to Apple saying you consent, before you can use the phone? No, that’s not going to happen. It’s unworkable. And another example is: what is a biometric? A lot of people still don’t really understand what a biometric is, because, again, it’s so fluid.
So, for example, there’s a lot of pushback on what are called detection and characterization technologies. And I’ll just use detection because it’s a lot lower risk than anything else. So detection, basically, is a computer just trying to determine, you know, is this a face? Are these eyes? Is this a nose? It does not usually tie that to your identity. And this comes up in the context of, for example, autonomous vehicles. They will have cameras or LIDAR sensors on the outside of cars to determine if there’s a pedestrian outside, to make sure the car doesn’t hit the pedestrian. They’re also getting sued under these laws, even for that kind of technology, because there’s a fundamental misunderstanding about how a lot of these computer vision applications work, and computer vision, again, is how a computer interacts with the visual world through video, photos, etc., and courts are still trying to figure out what the contours of the law are. So BIPA reform has been talked about for a long time. It will definitely be interesting to see if it works out this year, particularly with this White Castle case coming out.
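Editor’s note: to make the detection-versus-identification distinction concrete, here is a minimal sketch using OpenCV’s bundled Haar-cascade face detector. It assumes the opencv-python package is installed and that a local image file named frame.jpg exists; the detector only reports where face-shaped regions are, and nothing in it links those regions to a person’s identity.

```python
# A minimal sketch of pure face *detection* (not recognition): the classifier only
# answers "is there a face in this frame, and where?" It produces bounding boxes,
# not identities or templates.
import cv2

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

image = cv2.imread("frame.jpg")                      # e.g., one frame from a vehicle camera
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:
    print(f"Face-shaped region at x={x}, y={y}, size {w}x{h}")  # no names, no identifiers
print(f"Detected {len(faces)} face(s); nothing here links them to an identity.")
```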
[JAANVI] 42:27
As you just mentioned, in functioning in today’s society, the use of technology involving biometric information seems almost necessary. For people who are concerned about their privacy but still need to use such biometric tools, face and fingerprint recognition specifically, what can they do to protect that information?
[TATIANA RICE] 42:52
Great question. And unfortunately, I don’t have a good answer for it. It is really getting quite difficult, especially because of the transparency issue: consumers don’t really understand how their data is getting collected, used, and shared. And it’s also not really regulated outside of Illinois, Texas, and Washington.
So I actually had an example recently, where I was staying in an Airbnb. It was one of those, I think it was a hotel that was converted into a large Airbnb building. And they required facial recognition to be able to even use their system or log in to the building, which I thought was really interesting. And I tried to ask, “Hey, is there another way that I can authenticate my identity? Can I provide my driver’s license, and you can compare my driver’s license against my physical face?” And they just said, “No, that’s the policy. Too bad. If you don’t want to stay here, you can go somewhere else.” And it’s the day before I’m supposed to check in. And, of course, the prices are a lot higher, and I had no recourse. So I was going to have to risk my biometric data and my face, and I still have no idea how they use it or how they collect it, and I work in this field. And there’s nothing I could really do about it. So unfortunately, it is really difficult.
My best advice for people is, do try to push for alternatives where you can, especially if you don’t know how the data can be used or shared. And the best example is, if you have social media, try to make it private. I’m sure a lot of us have heard about Clearview AI, which is this facial recognition company. By some estimates, they can identify something like 50% of the world’s population, because the way they developed their facial recognition system is just by scraping photos off of people’s social media profiles and then associating those with their identities. And then they sell that to law enforcement. They sell that to whoever wants to buy it, because there is no real regulation around it, beyond whether they’re covered by the GDPR or they’re covered by BIPA. It’s just this kind of patchwork of where things apply.
[JAANVI] 45:06
Thank you, Ms. Rice. I think this has been a very enlightening session for all of us.
[TATIANA RICE] 45:12
Great. Yeah, it was so much fun to talk here. Always happy to chat more about biometrics, it is basically my life.
[MAHIMA] 45:37
Thank you for listening. The BTLJ podcast is brought to you by Podcast Editors Isabel Jones and Eric Ahern. Our Executive Producers are BTLJ Senior Online Content Editors, Katherine Wang and Al Malecha. BTLJ’s Editors in Chief are Dylan Houle and Jessica Li.
[SHABRINA] 45:58
If you enjoyed our podcast, please support us by subscribing and rating us on Apple Podcasts, Spotify, or wherever you listen to your podcasts. If you have any questions, comments or suggestions, write us at BTLJpodcast@gmail.com.
[MAHIMA] 46:13
This interview was recorded on March 15th, 2023. The BTLJ podcast team members who worked on this episode are Jaanvi Rathi, Josiah Young, Shabrina Khansa, Mahima Madan, and Brionne Frazier. The information presented here does not constitute legal advice. This podcast is intended for academic and entertainment purposes only.
Further reading and references:
[1] Barnett v. Apple Inc., 2022 IL App (1st) 220187.
[2] 2007 Ill. S.B. 2400 (Biometric Information Privacy Act).
[3] See 2022 IL App (1st) 220187.
[4] Id.
[5] Cothron v. White Castle Sys., 2023 IL 128004.
[6] Council Regulation 2016/679, 2016 O.J. (L 119) (regarding the processing of personal data and on the free movement of such data).
[7] Barnett v. Apple Inc., 2022 IL App (1st) 220187.
[8] Ronquillo v. Doctor’s Assocs., LLC, 597 F. Supp. 3d 1227 (N.D. Ill. 2022).
[9] Cothron v. White Castle Sys., 2023 IL 128004.