[Gayathri Sindhu] 0:07
Welcome to the Berkeley Technology Law podcast. Hey everyone, this is Gayathri, your host for the day from the Berkeley Technology Law Journal. Today, we are excited to share with you a conversation between Berkeley Law student Paul Wood and Berkeley Law Professor Rebecca Wexler, where they discuss the intersection between reproductive justice and data privacy. In June 2022, the Supreme Court issued its ruling in Dobbs v. Jackson Women’s Health Organization, overturning the holdings of Roe v. Wade and Planned Parenthood v. Casey that the US Constitution grants a right to an abortion. Shortly thereafter, Professor Wexler co-authored an article with Professor Aziz Huq in the New York University Law Review, titled “Digital Privacy for Reproductive Choice in the Post-Roe Era.” Today, Professor Wexler reflects on that piece and on the need for an evidentiary privilege to shield reproductive data from use in criminal investigations. She also warns against the harms inherent in placing the burden on individuals to become “data literate” instead of holding technology companies accountable to fulfill their post-Dobbs commitments to protect reproductive healthcare data. Given how frequently issues pertaining to reproductive healthcare, from IVF to the abortion medication mifepristone, are appearing in court, Professor Wexler’s conversation with us today is an uplifting respite, as she poses innovative ideas on how we can protect access to reproductive healthcare. We hope you enjoy the podcast.
[Paul Wood] 2:09
Good afternoon, Professor Wexler, thank you so much for joining us. Would you please give our listeners a little background on how you became interested in the intersection between digital privacy and reproductive choice?
[Rebecca Wexler] 2:20
Yeah, well, first of all, just thank you so much for having me on the podcast. I really appreciate being here and appreciate your covering this super important topic. So I became interested in the intersection between digital privacy and reproductive choice when the Dobbs decision came down. In the weeks that followed Dobbs, there was a lot of media attention to privacy concerns around reproductive healthcare apps like period trackers, and to data that could reveal the fact or loss of a pregnancy. And this seemed to be what the media was focusing on in the digital privacy realm, and rightfully so; we should be worried about those things. But it occurred to me that my area of scholarly focus is digital evidence, and that there’s actually a ton more digital evidence that’s likely to implicate somebody in seeking reproductive healthcare than just the period tracker apps, like Flo, that were getting all the media attention. And so I started doing some media interviews, and I co-authored an op-ed with Professor Aziz Huq at the University of Chicago that basically said, look, the issue is far broader than just the digital data that are specifically labeled as reproductive health data. The data that could implicate or expose somebody’s effort to obtain or provide abortion care includes location information, text messages, emails, internet searches, keyword searches, biometric data, Fitbit data, associational information, commercial transactional records, your purchase history, your web search history, your medical records. So I just realized it’s a much, much bigger issue than the kind of narrowly focused anxieties that the media had at that moment. The other thing I realized, which really motivated me to weigh in personally, is that a lot of the conversation was about federal privacy laws. So you might think, oh no, I’m really worried that there’s now all this digital evidence that could implicate somebody, and now we’re going to see, post-Dobbs, the criminalization of attempts to obtain or provide abortion care. And all this data that people are spewing out all the time in their everyday activities is going to make them vulnerable to criminal investigations and prosecutions. What we should really have is better data privacy laws. And so the data privacy community, which had been advocating for federal data privacy laws for a very long time, understandably saw Dobbs and the overturning of the constitutional right to an abortion as a motivator to try to get federal data privacy laws passed. And the worry that I had about that is that it turns out that most data privacy laws have exceptions for law enforcement investigations. And so data privacy seemed like the wrong path, the wrong thing to turn to. And I’m an evidence law professor, and I focus on evidentiary privileges. And I thought, no, what we really need is a privilege that can block law enforcement access.
[Paul Wood] 6:05
I want to hone in on something you talked about from the evidentiary perspective: this fear that reproductive data, which is a lot broader than people initially realize, could be collected and used as evidence for a criminal prosecution. I know that, generally, a characteristic of the law of evidence is that you can’t use illegally obtained evidence. But as you just mentioned, there are law enforcement exceptions that apply. So what does aboveboard collection look like in the evidence context here? What should we expect? What does it take to obtain evidence that can be used? What’s legal and what’s not, just as a starting point?
[Rebecca Wexler] 6:42
Great question. Right. So exactly, as you said, if the police go out and violate your Fourth Amendment rights and get something they shouldn’t without a warrant, if they go beyond the scope of a warrant, if they violate some law in obtaining evidence, then you may have the right to suppress that evidence and not allow it to be included in your trial as a criminal defendant. But it’s not hard for law enforcement to collect this kind of stuff aboveboard, as you’re saying. So first of all, lots of this data isn’t protected by the Fourth Amendment at all, because of this thing called the third-party doctrine: if you give information about yourself voluntarily to a corporation, or to your friend, or to any third party, you’ve given up a reasonable expectation of privacy in that information, which means the government can go ahead and collect it from that third party without a warrant. So already, huge swaths of the data that we might be concerned about here aren’t protected by the Fourth Amendment. But even for whatever minimal amount of it is protected by the Fourth Amendment, it’s just not hard for law enforcement to obtain a warrant to get it if they can establish probable cause. They have to establish that there’s probable cause that what they’re going to search or seize is related to a violation of law that they’re investigating, and they have to identify the place to be searched or thing to be seized with specificity. But in a world where obtaining or providing abortion care has been criminalized in a state penal code, law enforcement investigators are going to have no problem establishing that these sources of data surrounding abortion care providers, or people suspected of seeking abortions, are relevant to violations of criminal law. You criminalize abortion, and all of a sudden all this data is relevant to the crime of abortion. So it’s not going to be a heavy lift for law enforcement to establish probable cause and get a warrant for this data. Even if the Fourth Amendment applies, they’re going to be able to get a warrant and get it. Okay, so that’s the first point.
The second point is about statutory privacy laws. If we drop down from the constitutional level to the statutory level, you’ve got laws like HIPAA, the Health Insurance Portability and Accountability Act. People think that HIPAA is going to protect the privacy of their medical records, but guess what? It contains express exceptions that allow law enforcement to get those records. Similarly, the Stored Communications Act is a federal statute that people think will protect the privacy of things like their stored emails, their instant messages, their Facebook direct messages, or, does TikTok have a messaging platform? I don’t even know. For any of those communications data that are stored with third parties, the Stored Communications Act is your go-to for protecting the content of those messages. And yet, once again, the Stored Communications Act has an exception that allows law enforcement to get it from the tech companies. Sure, they have to have some form of legal process: they have to get a warrant, or in the case of content, a warrant plus, meaning they might have to show something even more onerous than just probable cause. But they’re not going to have trouble making that showing, and then they’re going to be able to get that information when we’re talking about people who are actually seeking or providing abortion care. If they’re really engaged in that activity and it’s really criminalized, it’s not going to be hard for law enforcement to use these procedures to get the evidence, in your language, aboveboard. In which case, it will all be admissible in court.
[Paul Wood] 11:10
Professor, given how easy it is for law enforcement to obtain this data, are there any ways individuals can try to protect themselves or prevent this data from falling into the wrong hands? Or would you say the only kind of solution to this problem is more systemic in nature?
[Rebecca Wexler] 11:27
I think it’s a great question. There’s a practical question in there, and there’s also a kind of social policy, “who should bear the burden of fixing this problem” issue in that question. So I think it’s a systemic problem, and it needs to have a systemic solution. There are steps that individuals can take to have more privacy protections and protections from law enforcement. One would be: if you want to have a conversation with somebody, have the conversation in person, and don’t keep a record of it. Don’t have the conversation over text message or chat. There are also people who will tell you that you should use various encrypted programs. I think the only real way to keep digital evidence out of law enforcement’s hands is for it not to exist in the first place. Even end-to-end encrypted programs may only make it slightly more difficult for law enforcement; if they’re sufficiently motivated, there might be opportunities for them to go around the end-to-end encryption and get the data anyway. So for your average person, trying to tech up to such a degree that you have an advantage over law enforcement is not a reasonable solution. I’m not saying there’s nothing individuals can do. Sure, don’t take your phone with you when you go to see your medical care provider; then you won’t have location tracking data from your phone that shows you going there. So you can certainly minimize the digital exhaust trail that’s going to expose you to liability. But, and this is the normative component, it shouldn’t be the responsibility of individuals to protect themselves from what I think is abusive and wrongful criminalization. The people who are going to be targeted by these prosecutions are disproportionately the people who aren’t going to be able to afford to travel to a state where abortion care is legal. So they’re disproportionately going to be communities of people who are poor, who may not have the privilege of time and resources to develop these technical defenses against law enforcement. So I’m not that sympathetic to solutions that tell people to download some fancy encryption program or use Tor or something like that. I don’t think it serves the people most in need. And I don’t think it’s the right approach as a society to protect those communities. Instead, I think that what we need is for the empowered, centralized actors to do better. So in the paper that I co-authored with Professor Aziz Huq, one of the things we lay out is a set of options that big tech companies can, and we think should, adopt to protect their users from having their data exploited to serve criminalization purposes. And one of the things that the big tech companies could do systemically, which would help a lot, is to not collect certain kinds of information.
So, immediately following Dobbs, there was actually one major public commitment to doing exactly this. It was from Google. And I haven’t checked recently what’s going on with that; actually, it’d be interesting to see whether Google has really maintained its promise, or whether there have been any signs that it’s violating it. But immediately after Dobbs, Google announced that it would delete location data from users who visit abortion clinics and other sensitive sites such as domestic violence shelters, and I think they may have put religious institutions in there as well. So it’s a significant commitment, though it’s probably not something that would help an individual who’s already a suspect. If Google’s got location data on you that shows you traveling from Iowa to Illinois, and then disappearing into some kind of black hole within an even larger radius around an abortion clinic, that’s going to look a little bit suspect. So it’s not likely to help those individuals who are already targeted by law enforcement. But what that kind of commitment can do is prevent what are called geofence warrants, or, if you don’t like them, you’d call them general search warrants: these bulk warrants where law enforcement will go to Google and say, here’s a radius around an abortion clinic; tell me every device, every user, that was present within that space in a particular period of time. So it can help prevent these dragnet types of warrants. But the big point is that it should be a systemic solution, and tech companies can do something. The best thing that tech companies could do is not collect the sensitive data in the first place. Alternatively, they could purge the data routinely and quickly. Alternatively, if they do store it, another thing they could do is not volunteer it to law enforcement and not sell it to other intermediaries, like data brokers, who could pass it along to law enforcement. And finally, if law enforcement or civil litigants come calling with legal process, you have to comply with the legal process, but you don’t have to comply right away. You can challenge it. You don’t have to give it speedy, efficient priority and processing. You can take it to court and move to quash, say it’s overbroad, make sure the government has really established probable cause and that the warrant isn’t just a rubber stamp. So really put it to the test, stand up for the users, and test the limits of that. But then, of course, if a court ultimately orders them to hand it over, they have to hand it over. So one systemic solution is from the tech companies, and I would put that in the order of: don’t collect, don’t volunteer, and then push back to the full extent legally allowed. But another systemic solution would be a legislated privilege.
[Paul Wood] 18:32
Well, we’ll get to the legislative solutions in a minute, but I was wondering: in your article, you mentioned this concept of surveillance capitalism. Could you briefly describe this economic logic and how it exacerbates privacy concerns for those seeking information on reproductive healthcare?
[Rebecca Wexler] 18:50
Yeah, definitely. So this is a term that was popularized, at least, by Shoshana Zuboff in her book, The Age of Surveillance Capitalism. And it basically captures what phrases like “data is the new oil” are getting at. The idea is that our economy at this point just runs on tech companies trying to vacuum up as much data about people as possible: vacuum it up, store it, analyze it, and then use it to target individuals and sell targeted advertising. So this is our new model for capitalism, where companies are incentivized to surveil and quantify everything about us: our preferences, our habits, our friends, our associations, our locations. With the Internet of Things (IoT), it’s what we have in our refrigerators and when we run our dishwasher. Our cars are surveilling us now; they didn’t used to. They’re sending location information back, and they might be collecting other data about us. We’ve got Fitbits and other biometric trackers that are collecting information about our blood oxygen levels, our fatigue, our health functions, and our gait. All of this is because we now live in this age of surveillance capitalism, where companies are incentivized to collect as much data as possible.
[Paul Wood] 20:59
Your article also highlights the bilateral economy of digital data flows. Could you break down that concept for our listeners, and its implications for privacy?
[Rebecca Wexler] 21:10
Definitely. I will say, I have to credit my co-author Aziz Huq for the excellent language there; the bilateral economy of digital data flows was definitely something that came from Aziz, and he was a fabulous co-author to get to work with. But the basic idea, in plain Rebecca Wexler speak, is that we live in a digital environment shaped by surveillance capitalism, and this is a big distinction between the pre-Roe era, when access to reproductive healthcare was criminalized, and the post-Dobbs era, when access to reproductive healthcare is criminalized again. Now, post-Dobbs, the first step for anybody who imagines that they might need abortion care and is going to try to seek assistance is going to be to search on a digital device, on the internet, for information, or to communicate with trusted friends, family members, or doctors via a digital device. So part one is that for anybody today trying to access this kind of care, the main channels to obtain it are going to run through these digital portals. And what’s bilateral about it is that in the very act of trying to obtain reproductive care, those same acts are also going to generate digital data trails that will be useful as evidence for prosecutors. So there’s a kind of catch-22. On the one hand, people seeking reproductive care are going to depend on digital devices to obtain it. And on the other hand, through that very process, they will be creating digital evidence that prosecutors and law enforcement can then use to investigate them.
[Paul Wood] 23:25
So this concept you’re talking about, where the main way of accessing reproductive care generates data trails for the prosecution, I think segues pretty well into recent events. I’m sure you’ve heard that on February 16 of this year, the Alabama Supreme Court ruled that embryos created during fertility treatments are children under state law, which permitted two wrongful death suits. Given the potential for these healthcare facility records and online activity to be used in incriminating ways, what advice would you give to clinicians and healthcare facilities to protect their patients and themselves?
[Rebecca Wexler] 24:01
What advice could I give to clinicians? I mean, it’s so depressing and hard. What could a clinician do? A clinician who’s operating in the jurisdiction, I don’t think they have much of an option. So maybe it’s more about clinicians who are operating outside the jurisdiction, where we’re now worried about cross-border investigations and what looks like long-arm reach. And for those clinicians, I think it’s really tricky. Because what are you going to say? You should not collect the information, you should purge the information? Well, that’s going to limit the ability to provide the service, to provide the care. So I can’t give clinicians the same advice I’m advocating for big tech companies, which is: don’t collect this data at all. The idea that there should be limits on what you collect to commercialize for profit doesn’t apply to the healthcare providers themselves, even out of state. What could apply to them is this: just as with the tech companies, if somebody does serve you with legal process for these kinds of records, at the very least bring them to task and challenge the legal process to the full extent lawful. That doesn’t mean you should destroy records, since spoliation of evidence is a crime, and it doesn’t mean you should necessarily go to prison in order to avoid complying with a subpoena. If you refuse to comply with a subpoena, you can be held in criminal contempt and go to jail; journalists do that sometimes to protect anonymous sources, but that would be an extreme example. Much short of that, you can at least move to quash the subpoenas and push back: say it’s overbroad, say it’d be unduly burdensome to provide the records. If it’s a warrant, is there really probable cause? You can challenge those things in court. So I would advise that medical facilities, with all the other things they have to do and deal with, also try to put some resources into making those kinds of motions in court.
[Paul Wood] 26:45
Thank you so much, Professor. I think we’ve talked about the individual level, the healthcare level, and the tech company level of this issue. Now let’s get back to the legislative solution. What do you see as the ideal solution to this problem from legislatures and the government, versus what do you think is more realistic?
[Rebecca Wexler] 27:08
Ah, that’s tricky. What’s ideal versus realistic? Well, I’ll say, I know the ideal. The “ideal” is, as an evidence professor, I wanted the legislature to act. And actually, I worked with a federal congressional office to draft a bill that would establish an evidentiary privilege for data that is relevant to abortion care. So any data that could reveal an effort to obtain or provide an abortion would be covered. And what an evidentiary privilege does is actually block law enforcement from obtaining that information with a warrant, and block the information from being introduced into court as evidence. When you think about it, we talked about how easy it is to get a warrant if the conduct does actually fall within the scope of the penal code. But even information that is evidence of a crime is shielded from law enforcement by evidentiary privileges. The classic example is the attorney-client privilege. You can talk to your attorney about past crimes, and those communications are evidence that you committed the past crime. And yet law enforcement can’t obtain them with a warrant; they can’t come and search and seize those records. And if they happen to obtain them by accident, say they got a warrant to collect a bunch of papers and your attorney-client communications happened to be in there and they didn’t even know, they have to have a firewall. They have to have what’s called a taint team. And they can’t let anybody on the actual prosecution team know about the contents of those privileged records. Then, once you get to trial, if for some reason somebody tried to introduce those records into evidence, the other party could say, hey, that’s privileged, and they’re kept out. That’s the attorney-client privilege. And you might say, from a policy perspective, I’m a little worried about that, because that might be extremely probative evidence, even of very serious crimes. So why would we want to lose that? We want law enforcement and the courts to be able to have access to that evidence. But there are policy considerations on the other side: we want to be able to provide effective assistance of counsel, and we want to incentivize trusted communications between clients and their attorneys. And we think that that’s more important than actually finding the truth and having accuracy in the individual case where these communications are going to be suppressed. And so I thought, hey, now we’re talking. This isn’t HIPAA with a bunch of exceptions. This is real privacy protection with teeth. These are privacy protections that are more powerful than the Fourth Amendment, more powerful than privacy statutes. Let’s get something like that for abortion-relevant data; let’s get an evidentiary privilege. And there’s a similar policy argument here: we want to incentivize safe access to healthcare, and this privilege would enable people to access healthcare, just like the attorney-client privilege incentivizes people to communicate with their attorneys. We as a society could choose that incentive, that safe bubble around accessing healthcare, as more important, so important that we’ll accept losing some evidence in the prosecution of crimes. There’s precedent for this; we do it all the time with evidentiary privileges.
So I actually argued, and Aziz and I worked together with this congressional office to put together this draft bill, which is in the appendix to the article, that the US Congress would have the authority under the Commerce Clause to create this type of privilege for abortion-relevant data that affects interstate or foreign commerce. And that privilege would apply across the board, even in state courts. So it could actually have a huge effect. It could solve this problem not just for federal courts, if we one day had a federal criminalization of abortion care, but it would also bind the courts in Alabama.
[Paul Wood] 32:48
I think there’s a huge amount of good that could be done with an evidentiary protection like that at the federal level. But I wonder if you could speak to the state level: obviously, different states have different levels of protection for reproductive health data being accessed across state lines. I know one of those states, California, is working on certain safeguards and privacy protections. Do you view the protections that California has as sufficient, as a sort of roadmap for other states? Or do you think that there’s more that could be done at the individual state level while we wait for a more permanent solution like an evidentiary privilege?
[Rebecca Wexler] 33:28
Great. So, just to remind myself, I think what we’re talking about is that California modified its rules for law enforcement cooperation across borders. The concern is, let’s say tech companies live in California and store all the sensitive data in California, and prosecutors from Alabama go to California to get the tech companies to give them the data. The issue is that, generally, states have agreements to comply with warrants issued from another state to produce records. There are legal obligations for that, and to have an effective, mutual, national law enforcement investigation program, you need to have that. In fact, there are also other laws that similarly require courts in different states to enforce each other’s subpoenas. So there’s the warrant angle and there’s the subpoena angle. And in the wake of Dobbs, what California did was create an exception to this out-of-state warrant compliance rule, just for anti-abortion investigations. So now, if a tech company in California, or anybody else in California, knows or should have known that an out-of-state warrant relates to an anti-abortion investigation, they’re not allowed to turn over records in response to that warrant unless there’s some attestation that the seeking party isn’t actually going to use them for an anti-abortion investigation. So that’s what California did. I think it was a great step, and I think other states should do the same thing. I’m not sure if any other states have followed suit in the wake of California; I don’t know the answer, but I would love to find out. But what it doesn’t do is in any way limit the admissibility of information in court. All it does is restrict cooperation across borders to share the data, to get it into the hands of law enforcement. If law enforcement is able to get the data another way, say they purchase it from a data broker or get it in some other form, then there’s no limitation on their admitting it and relying on it in court. So what a privilege could do, I think, is similar but bigger; it would do double duty. The privilege would give you a way of holding back information from a warrant, because you don’t have to hand over what’s privileged. And it would also mean that if somehow somebody got the data anyway, there would be a way to block their use of it. So I think a privilege would be stronger. But I think what California did is already fantastic, and more states should do that as well.
[Paul Wood] 37:05
Professor, in your opinion, is there a realistic fear of a challenge to this new law or modification working its way through the same court system that handed down the Dobbs decision?
[Rebecca Wexler] 37:15
Yeah, it’s a great question. It’s really interesting. I don’t know. I mean, on the one hand, these are local state laws, right? The Uniform Act for witnesses, the subpoena rules, these laws that require you to comply with out-of-state warrants are state laws. They’re not constitutionally grounded. So I don’t see any way that the Supreme Court could tell California that it can’t modify its own state law however it wants. So your question really, on a deeper level, goes to this other issue, which is: is there some constitutional demand on states to obey or enforce another state’s legal process? And this is definitely beyond my expertise. I think this would go to a constitutional law scholar who has expertise in state-to-state conflicts. But it’s a really interesting question, because, really, there’s one state’s preference pitted against another state’s preference. So what’s the constitutional ground for picking one over the other? I don’t know the answer, but that’s where I would go to try to research it. If there hasn’t been a challenge yet, I’m hoping that means there’s not a clear constitutional hook. And then it doesn’t really matter if it’s this court or another court sitting in the Supreme Court building in DC; if there really isn’t a constitutional hook, then it’s not going to get before them.
[Paul Wood] 39:22
Well, we’ll put a pin in that for now. Professor, what role do you believe public awareness and digital literacy should play in protecting reproductive rights?
[Rebecca Wexler] 39:32
Oh, I think public awareness is huge. I don’t think digital literacy should play much of a role at all. Public awareness is huge because I think we need legislative solutions; we need legislators to stand up and do what they can in state legislatures and the US Congress. Public awareness could also help with consumer pressure on tech companies to do a better job. So the public has power here. But digital literacy has this ring to it of blaming the victim, as we were talking about a little bit before. Why require this kind of Sisyphean task? Sisyphus, that guy who’s rolling the boulder up the mountain, and it falls back down, and he’s constantly doing it over again. I feel like giving individuals the responsibility of safeguarding their own digital data from law enforcement investigation through digital literacy, as a prerequisite to being able to access abortion care, is a Sisyphean and cruel task bound to fail. It’s just a way for us to pawn off the responsibility onto the people least able to address it, and then call it a day. And we’ve seen this with data privacy laws more generally. One of the big critiques of data privacy in the US, apart from reproductive care, is that we have this notice and consent regime, which says companies can collect essentially anything they want as long as they inform the user and the user consents. But as you and everybody else know, we get informed about these data collection practices in click-through agreements, and it’s impossible to read them all. Nobody could possibly spend their life becoming an expert in data privacy law, nor should they. A few people, like Paul Schwartz, can become experts in data privacy law, and that’s it. He should, and we want him to, but that’s not something that everybody should have to do in order to just go about their lives and have other priorities. And I really feel that this is the same thing here. Now, I did my 1L internship at the Electronic Frontier Foundation. It’s a nonprofit organization that combines tech activists, policy activists, and lawyers all working together. And they have a lot of tools that are designed to empower individuals to keep their data private and protect their data. And I’m all for that. Why not make digital literacy available to those who are interested, to those who are privileged to have the resources and the time to study up and spend their time that way? But I am kind of allergic to the idea that digital literacy can be in any way a solution for accessing reproductive care, because the people who are in need of reproductive care should not have to develop digital literacy skills in order to obtain it safely and securely.
[Paul Wood] 42:50
Your answer focused on the victim-blaming aspect of this, and I know your past answers have touched on how this kind of issue can particularly affect marginalized groups. Do you think there are any specific means we could use to protect them, beyond the evidentiary privilege that we’ve discussed?
[Rebecca Wexler] 43:11
Oh, that’s interesting. That’s an interesting question. I hadn’t thought about it quite that way. Because yes, the people who need abortion care and can’t obtain it, and can’t cross borders to obtain it, are going to be the people who don’t have resources and are marginalized in a variety of ways. I’m just thinking off the cuff; I hadn’t thought of this before. But tech companies have all the information needed to identify who those more marginalized people are. So one thing I imagine they’re technically capable of doing, even if they’re not willing to, say, purge location data for everybody, is to purge location data for particular communities that are more marginalized. And maybe they won’t collect location data from individuals in certain age ranges who might have the capacity to become pregnant and need abortion care, or who live far away from abortion care providers. They could probably identify pretty well who’s most likely to be targeted by these new criminal investigations, and they could choose to be extra protective of the data from those users through non-collection or purging. Because, at the end of the day, the tech company has to respond to legal process, okay? But they don’t have to give something up if they don’t have it in the first place. So non-collection or routine purging is 100% effective. And totally lawful.
[Paul Wood] 45:16
Thank you, Professor. Do you have any last words of advice you’d like to pass on to law student advocates, the people being affected by this intersection of privacy rights and reproductive care, and those working in the space?
[Rebecca Wexler] 45:32
Well, I have generic advice that I give to law students, and I’d love to use this platform to give it. One of my generic pieces of advice to law students is that you should publish op-eds. So the fact that you and your team are working on a podcast is already a great thing. But the idea is that once you have been in law school even for a year or two, you already have more access to, and understanding of, this language of power that runs our society than most people in the country. So think about that, and think about what you can do with that, while you’re still in school, to share some of that knowledge and apply it to important current issues in the public sphere. There’s an organization that I really like, that had a big influence on me. I did one of their workshops; they train you to do public speaking and media as well as written op-eds. It’s called “The OpEd Project,” and they’re committed to diversifying voices in the public sphere. They’ll work with anybody who shares that mission; they don’t discriminate based on who you are. They want to help people who share the mission of a diverse conversation in the public sphere. And I think they have really excellent structures and advice about how to write op-eds and pitch them. This whole article that we’re talking about really came out of an op-ed that I wrote with Professor Huq, which then spun into news outlet interviews, congressional testimony, and ultimately the article. So op-eds are a portal to other kinds of opportunities, and I recommend using them.
[Paul Wood] 47:30
Well, thank you so much, Professor Wexler, for joining us today and talking with us. It was very insightful for me, and I think our listeners will agree.
[Rebecca Wexler] 47:36
Thank you so much for having me and thanks for doing this podcast. I really appreciate it.
[Gayathri Sindhu] 47:48
You have been listening to the Berkeley Technology Law Journal podcast. This episode was created by Paul Wood, WanYi Lin, and Gayathri Sindhu. The BTLJ podcast is brought to you by Editors Eric Ahern, Meg O’Neill, and Juliette Draper. Our Executive Producer is BTLJ Senior Online Content Editor Linda Chang. BTLJ’s Editors-in-Chief are Will Kasper and Yuhan Wu. If you enjoyed our podcast, please support us by subscribing and rating us on Apple Podcasts, Spotify, or wherever you listen to your podcasts. Write to us at btljpodcast@gmail.com with questions or suggestions for who we should interview next. This interview was recorded on March 8, 2024. The information presented here does not constitute legal advice. This podcast is intended for academic and entertainment purposes only.