[00:00:02] Speaker 03: Good morning, Your Honors, and may it please the Court. [00:00:04] Speaker 03: I'd like to please reserve three minutes for rebuttal time. [00:00:07] Speaker 03: Sure. [00:00:14] Speaker 03: This case involves the Illinois Biometric Information Privacy Act, Section 15B of which states that no private entity may collect, capture, purchase, receive through trade, or otherwise obtain a person's or a customer's [00:00:30] Speaker 03: biometric identifier or biometric information. [00:00:34] Speaker 03: The district court in granting summary judgment in favor of Facebook here erred by creating a judicial exception to the plain language of the statute and carving out, writing out persons from the coverage of the statute. [00:00:51] Speaker 02: Counsel, I wanted to ask you about that because I think your point is well taken that the statute does say persons. [00:01:00] Speaker 02: They argue, and we'll hear from Meta, but they argue that this is a straw man argument because I don't think their argument hinges on the fact that it just didn't apply. [00:01:15] Speaker 02: They're saying, on its terms, it doesn't apply to non-users. [00:01:21] Speaker 02: They're saying we just didn't violate the statute. [00:01:27] Speaker 02: I mean, I guess I'm wondering, do you actually win the case if we say, oh, yeah, well, Section 15B does apply to non-users? [00:01:37] Speaker 02: Isn't there a lot more we still have to figure out? [00:01:40] Speaker 03: Your Honor, Facebook is raising arguments in this appeal that the district court did not rule in their favor on. [00:01:50] Speaker 02: The district court's decision- But we can affirm the district court for alternative, on an alternative basis. [00:01:56] Speaker 02: This is coming up on summary judgment, right? [00:01:58] Speaker 03: Or, well- It's an appeal from a grant of summary judgment. [00:02:02] Speaker 03: Yeah. [00:02:02] Speaker 03: They move for summary judgment on multiple grounds.
[00:02:04] Speaker 03: All of those grounds were denied except one ground, which is this judicial exception that the judge created, where he read out the word persons and limited the application of BIPA to just [00:02:17] Speaker 03: people with pre-existing relationships with the defendant. [00:02:22] Speaker 03: We submit that that is clearly in contrast and contradiction to the plain language of the statute. [00:02:26] Speaker 02: I think I agree with you. [00:02:28] Speaker 02: I think I agree with you, but I don't think you win based on that. [00:02:31] Speaker 02: You still have to show that they violated the statute as to non-users. [00:02:36] Speaker 02: And that's what I'm struggling with is [00:02:39] Speaker 02: you know, is there a material dispute here? [00:02:41] Speaker 02: Because what is your basis for how they violated the statute as to non-users? [00:02:47] Speaker 02: Because their argument seems to be, we don't have any information that could identify a non-user. [00:02:53] Speaker 02: The data we have is a bunch of code. [00:02:55] Speaker 02: We run it against, if it matches up to a Facebook user, we flag it, they can be tagged. [00:03:02] Speaker 02: If it doesn't match up, it's not like they could look out and say, I mean, if I were not on Facebook, they could not use that information. [00:03:09] Speaker 02: to come and identify me as an individual. [00:03:12] Speaker 02: Do I have the facts wrong in my understanding? [00:03:15] Speaker 03: Yes, you do, Your Honor. [00:03:17] Speaker 03: Their characterization of the factual record on those issues is an overstatement of what the record is. [00:03:24] Speaker 02: So I'm not wrong in the arguments they're making. [00:03:26] Speaker 02: You're saying there's a factual dispute that proves there's a dispute about whether that's accurate or not. 
[00:03:34] Speaker 03: Not only am I saying that, the district court [00:03:37] Speaker 03: explicitly said in the summary judgment decision that there were quintessential factual disputes on all of those contentions. [00:03:44] Speaker 02: I understand and the district court may have been correct and it may not have been correct on those issues. [00:03:49] Speaker 02: So I want to know what are the facts that suggest that the data that they're collecting as to non-users or, well, I say collecting, I got to be careful. [00:03:59] Speaker 02: I don't want to suggest that I'm saying collecting in the sense of violating the statute, but the data that they have as to non-users, what is your evidence in the record that shows there's a dispute about whether that can be used to identify a non-user? [00:04:17] Speaker 03: I just want to be clear, Your Honor, that this question that you've posed, and I'm happy to answer it, and I will momentarily, it's not what the district court ruled on. [00:04:27] Speaker 02: I understand that. [00:04:28] Speaker 02: But we can affirm based on any reason in the record. [00:04:32] Speaker 02: So I want to know whether we can affirm on this basis to say that the district court was wrong in finding a factual dispute on this issue. [00:04:40] Speaker 02: So tell me what the factual dispute is that a non-user could be identified with the information that they collect. [00:04:47] Speaker 03: So they raise two arguments here. [00:04:50] Speaker 03: One is that the information that they obtain doesn't count as a biometric identifier. [00:04:58] Speaker 03: That's number one. [00:04:59] Speaker 03: And number two, they say, whatever it is, we didn't keep it long enough, and therefore it shouldn't count. [00:05:05] Speaker 03: Those are the two arguments that they make on these factual disputes, which, again, the district court found to be in dispute. 
[00:05:11] Speaker 02: And so let's go to the first argument, because my understanding on the first argument, and I'm sorry to ask the question, but in my mind, this is pretty complicated. [00:05:19] Speaker 02: I mean, this is not intuitive as to what the definition of biometric information or biometric identifiers are. [00:05:25] Speaker 02: But as I boil it down, it seems to me that it has to be data that can identify somebody. [00:05:32] Speaker 03: Am I wrong about that? [00:05:35] Speaker 03: Let me answer it this way. [00:05:36] Speaker 03: Maybe you are in a sense, but maybe not. [00:05:40] Speaker 03: It depends on the- Okay, biometric information. [00:05:42] Speaker 02: Let's start with basics, okay? [00:05:44] Speaker 02: Yes. [00:05:44] Speaker 02: The definition of biometric information is any information, regardless of how it is captured, converted, stored, or shared. [00:05:50] Speaker 02: We've got a separate issue there. [00:05:51] Speaker 02: Are you talking about information or identification? [00:05:53] Speaker 02: I'm talking about information based on an individual's biometric identifier used to identify an individual. [00:06:02] Speaker 02: doesn't that suggest that this information has to be able to be used to identify an individual? [00:06:09] Speaker 03: That biometric information definition does contain the qualifier used to identify. [00:06:16] Speaker 03: But that's not what the case is about. [00:06:18] Speaker 03: If they've led you to believe that, Your Honor, they've misled you, because the contention is the other definition. [00:06:24] Speaker 03: Biometric identifier. [00:06:25] Speaker 02: Biometric identifier. [00:06:26] Speaker 02: Which does not contain that. [00:06:27] Speaker 02: Specific types of data, which include retina or iris scan, [00:06:31] Speaker 02: fingerprints, voice prints, or scan of hand or face geometry. [00:06:35] Speaker 02: And you're arguing this is face geometry. [00:06:37] Speaker 03: Yes. 
[00:06:37] Speaker 03: And there is no additional language, like there is in the biometric information definition, where in that latter definition, it adds the qualifier used to identify. [00:06:50] Speaker 03: The biometric identifier definition doesn't have that additional condition to it. [00:06:55] Speaker 03: All it has to be is one of the listed things. [00:06:58] Speaker 03: And what happened here is Facebook creates a scan of face geometry from every photo of a face that is uploaded to their system during the relevant time period. [00:07:14] Speaker 03: So that qualifies as a scan of face geometry. [00:07:18] Speaker 03: And this court actually had no problem concluding that exact same fact in the Patel case, which involved the exact same technology. [00:07:27] Speaker 03: In fact, if you look at the background discussion in the Patel case, in page 1268 of this court's opinion in Patel, where it's talking about the Facebook facial recognition technology, it says it makes the face scan from the uploaded photos [00:07:43] Speaker 03: It creates first what it calls a face signature. [00:07:46] Speaker 03: This is Facebook's terminology. [00:07:47] Speaker 03: The face signature, this court said was a scan of face geometry. [00:07:52] Speaker 03: That's what we have here. [00:07:54] Speaker 03: And they don't dispute that. [00:07:56] Speaker 04: Mr. Kerry, is there a difference in the Facebook case where that information, which involved users, was held onto? [00:08:06] Speaker 04: And Facebook here is saying it's not a violation of the statute. [00:08:10] Speaker 04: I read this as a statutory construction argument, that BIPA does not [00:08:17] Speaker 04: create a cause of action for information that's temporarily grabbed in order to see if there's a match and then deleted. [00:08:25] Speaker 04: And so I think that's a distinction that Facebook is drawing between this case and the other case. [00:08:32] Speaker 04: Why are they wrong in your view? 
[00:08:33] Speaker 03: Well, they're wrong on the law and they're wrong on the facts of this particular case. [00:08:37] Speaker 03: So let me start with the [00:08:40] Speaker 03: I don't know if you want me to start with the facts, but I could start with either one. [00:08:43] Speaker 04: Start with the law. [00:08:44] Speaker 04: I think I'm more interested in your interpretation of the statute. [00:08:48] Speaker 03: There is no temporal requirement in the statute for how long you have to keep something in order for there to be a violation. [00:08:55] Speaker 03: They are trying to write that [00:08:58] Speaker 03: caveat into the statute. [00:09:00] Speaker 03: They are trying to engage in more judicial lawmaking. [00:09:04] Speaker 03: Well, I don't know. [00:09:05] Speaker 04: I mean, if you look at 15A, it talks about in possession of and needing a written policy in order to either retain or destroy that information. [00:09:14] Speaker 04: And there are other provisions as well that talk about possession. [00:09:18] Speaker 04: And as I take their argument, the words collect, capture, or obtain involve some sort of possession of that data and then misuse of it. [00:09:30] Speaker 04: So why is that incorrect? [00:09:31] Speaker 03: It's interesting, Your Honor. [00:09:32] Speaker 03: They devote some discussion in their brief as to what they suggest, how they think the word. [00:09:38] Speaker 03: So you just mentioned 15A, which talks about possession of biometric identifiers. [00:09:44] Speaker 03: That 15A triggers the duty to publish [00:09:49] Speaker 03: a retention schedule. [00:09:50] Speaker 03: That's a different section of the law than 15B, which is the getting the biometric identifier violation. [00:09:59] Speaker 03: So let's just focus on the getting the information, because that's what we're talking about here. [00:10:03] Speaker 03: I believe that was Judge Nelson's question.
[00:10:06] Speaker 03: So that one prohibits collection, capture, or otherwise obtain. [00:10:12] Speaker 03: Those are key words in the statute. [00:10:14] Speaker 03: They devote [00:10:15] Speaker 03: It's interesting. [00:10:17] Speaker 03: They conveniently try to define and ask this court to construe what they think the meaning of collect and capture should be. [00:10:25] Speaker 03: And their definitions have a [00:10:28] Speaker 03: an undercurrent of holding onto it for a while, for lack of a better term. [00:10:33] Speaker 03: They don't define anywhere in their briefs, otherwise obtain, which is broader than collect and capture by its nature. [00:10:40] Speaker 03: And if you look, and they want to rely on Merriam-Webster dictionary definitions for collection and capture. [00:10:46] Speaker 03: That's what they're citing to you in their section of the brief on what those terms should mean. [00:10:51] Speaker 03: The Merriam-Webster dictionary definition of obtain is to gain [00:10:57] Speaker 03: or attain, A-T-T-A-I-N, usually by planned action or effort. [00:11:06] Speaker 03: There's nothing in, like, when they define capture for you, they give you a definition that says recorded in a file. [00:11:12] Speaker 03: Well, let me ask this. [00:11:13] Speaker 03: But obtain doesn't have that definition. [00:11:15] Speaker 03: Let me ask this. [00:11:15] Speaker 04: It just means to obtain it. [00:11:17] Speaker 04: Do you mind if I interrupt you? [00:11:19] Speaker 04: The district court thought that it would be absurd to construe the statute in a way that would require Facebook to try to obtain the consent of millions of non-users in the state of Illinois.
[00:11:32] Speaker 04: And I think this connects a little bit to the question I've been asking because if you don't construe the statute in a way that has some sort of holding onto it or possessing it, why was the district court wrong to think that it would be absurd to require a company to go and try to obtain the consent of all these non-users? [00:11:53] Speaker 03: Well, a couple of things there, Judge. [00:11:56] Speaker 03: Let me make one other comment here. [00:11:59] Speaker 03: We're getting back now with this question to the district court's rationale, which is the imposition of the new requirement that the statute is limited to people who have a preexisting relationship with the defendant. [00:12:13] Speaker 03: Therefore, people that the defendant doesn't know who they are, the statute can't apply to them. [00:12:16] Speaker 03: That's what the district court said. [00:12:18] Speaker 03: That's wrong because it reads out [00:12:20] Speaker 03: the plain words of 15B, which is any person or customer. [00:12:25] Speaker 03: Any person or customer obviously is more than a customer or a user or someone with a pre-existing relationship. [00:12:32] Speaker 03: Otherwise, the term any person would be completely superfluous. [00:12:35] Speaker 03: We can't do that. [00:12:36] Speaker 03: That decision from the district court and what they're continuing to urge here is wrong. [00:12:42] Speaker 03: I want the court to understand that on the facts of the storage, they have a footnote in their brief. [00:12:49] Speaker 03: that, so they say what they do is the photos get uploaded, they create face signatures of everybody, they run the face signatures against their database of what they call face templates for users, and if there's a match, they suggest a tag, you can tag a friend. 
[00:13:07] Speaker 03: They admit in their brief that the system isn't perfect, that sometimes non-user faces for which they create scans of face geometry called face signatures [00:13:19] Speaker 03: are matched to people that are users by mistake. [00:13:23] Speaker 03: And guess what happens in that scenario? [00:13:24] Speaker 03: Because there's a match, they're converted into face templates. [00:13:27] Speaker 03: And they keep those for however long they keep them. [00:13:30] Speaker 03: And this court in Patel even talked about dealing with the same technology, identified face templates as being face signatures that are matched. [00:13:39] Speaker 02: Can you point me to the evidence in the record that supports what you're just saying? [00:13:44] Speaker 03: Yes. [00:13:45] Speaker 03: And I'm going to make one more point, and then I'm going to give you the citations, Judge. [00:13:50] Speaker 03: Mr. Clayton Zellmer, my client's face signature, was one of the ones that was matched to a user and converted into a face template. [00:14:00] Speaker 02: Please, give me that. [00:14:01] Speaker 02: Because all I read was we had four face templates of your client between 2013 and 2020. [00:14:08] Speaker 02: I did not get this nuance that you're now saying that they stored those for some lengthy period of time. [00:14:15] Speaker 03: at supplemental excerpts of record 37, supplemental excerpts of record 126 and 127. [00:14:23] Speaker 03: Those passages, that's a citation to Facebook summary judgment motion and their deposition transcript of their expert who stated that one of the four Clayton Zellmer photos was matched to a Facebook user [00:14:38] Speaker 03: And we know from other parts of the record that when a face signature is matched, that's how we get face templates. [00:14:44] Speaker 02: That's what I'm asking for. [00:14:46] Speaker 02: I think we're missing a link here. 
[00:14:48] Speaker 02: And this is key, because you have to show a material dispute of fact. [00:14:53] Speaker 02: And you've said a lot of things, but we need to be able to trace this back in the record. [00:15:00] Speaker 02: And I'm not sure I see that connection that, I mean, you're combining like two different pieces of information. [00:15:07] Speaker 02: You're saying he matched four times. [00:15:09] Speaker 03: No, they took his face scan four times. [00:15:11] Speaker 03: OK, how many times did he match? [00:15:13] Speaker 03: And they created four face signatures from those four scans. [00:15:17] Speaker 03: OK, I understand that. [00:15:17] Speaker 02: One of those four was matched. [00:15:19] Speaker 02: OK, so one of those matched. [00:15:21] Speaker 02: And when it matches, where's in the record that it says that when they match, even if it's an incorrect match, they hold onto that data? [00:15:29] Speaker 03: OK. [00:15:29] Speaker 03: supplemental excerpts of record 42 to 43, they say that for non-users where face signatures are not matched, they do not store them as face templates. [00:15:42] Speaker 03: So for face signatures that do match... You're saying by implication. [00:15:47] Speaker 03: Well, I'm not only saying it. [00:15:49] Speaker 03: I've got another [00:15:50] Speaker 03: Another reference for you, Your Honor. [00:15:51] Speaker 03: It's Patel at page 1268. [00:15:54] Speaker 03: This court in Patel addressing this same technology defined a face template as a face signature that was matched. [00:16:03] Speaker 03: So what happens is they first create the face signatures. [00:16:07] Speaker 03: And if there are matches, they convert the face signature and save it as a face template. [00:16:11] Speaker 01: So the problem I think is- And they have one of those for my client. [00:16:13] Speaker 01: So counsel, the problem with your argument that I'm having is that the match was a mismatch. 
[00:16:19] Speaker 01: And that's in the record. [00:16:20] Speaker 01: I read the expert report this morning. [00:16:22] Speaker 01: That's in the record. [00:16:23] Speaker 01: They knew it was a mismatch because Mr. Zellmer was matched to a woman user. [00:16:26] Speaker 00: And they knew it right away. [00:16:28] Speaker 01: So we come back to face identifier and matching to the actual plaintiff who's upset. [00:16:35] Speaker 01: It's a mismatch. [00:16:37] Speaker 01: And if it's never going to be, if there's no templates of non-users, it's never going to match correctly. [00:16:45] Speaker 03: Your Honor, face scans, by the definition of biometric identifier, inherently are identifiers. [00:16:52] Speaker 03: They don't have to actually be used to find somebody yet, just like a fingerprint found at a crime scene. [00:16:57] Speaker 01: You need to stop talking over the judges. [00:17:00] Speaker 03: I'm sorry. [00:17:01] Speaker 01: The way I understand what has been presented to us is that these signatures, which are something less than a template, can be used to eliminate [00:17:11] Speaker 01: but they don't actually affirmatively match for non-users. [00:17:16] Speaker 01: Why is that not true? [00:17:19] Speaker 03: They are scans of face geometry of non-users and users, and they qualify as biometric identifiers under the statute as scans of face geometry. [00:17:30] Speaker 03: And that fact was acknowledged in the court's prior Patel decision. [00:17:36] Speaker 03: And because they are face scans, they inherently can be used to identify whoever's face it is. [00:17:43] Speaker 03: Does the defendant actually have to go and track it down at a particular moment in time for it to count as a biometric identifier? [00:17:49] Speaker 03: No. [00:17:49] Speaker 03: It just has to be capable of identifying somebody. [00:17:52] Speaker 03: Just like a fingerprint taken from a potential suspect at a crime scene.
[00:17:55] Speaker 03: We haven't matched it yet. [00:17:56] Speaker 03: It's still an identifier for that suspect. [00:17:59] Speaker 03: I see that I'm into my rebuttal time. [00:18:00] Speaker 02: Yeah. [00:18:01] Speaker 02: Well, I will give you some time for rebuttal. [00:18:02] Speaker 02: Thank you. [00:18:02] Speaker 02: Thank you. [00:18:22] Speaker 00: Good morning. [00:18:22] Speaker 00: May it please the court, Lauren Goldman for defendant Facebook. [00:18:27] Speaker 00: BIPA is a very stringent statute. [00:18:30] Speaker 00: Our client paid $650 million in Patel to settle the claims brought by users asserting that Facebook didn't obtain the necessary informed consent before storing their templates. [00:18:43] Speaker 00: But even BIPA has limits. [00:18:46] Speaker 00: this case is outside those limits. [00:18:49] Speaker 00: There is, sorry, Your Honor. [00:18:50] Speaker 02: Well, I just want to know what's on the table here, because are you arguing that non-users are just not covered by BIPA? [00:19:00] Speaker 00: Absolutely not. [00:19:01] Speaker 02: That's what I thought. [00:19:02] Speaker 02: That's what I thought. [00:19:03] Speaker 02: That's what I thought. [00:19:04] Speaker 02: So if we write in our opinion in one paragraph and say it clearly applies to persons, you're not going to dispute that point. [00:19:11] Speaker 02: You're making a more nuanced argument, as I understand it. [00:19:15] Speaker 00: We are making a more narrow argument. [00:19:17] Speaker 00: Let me make clear what we argued below and what we're arguing here and what I actually think Judge Donato held, although the court, of course, can affirm on any ground, as the court has observed. [00:19:29] Speaker 00: The way that we frame this below is it's excerpted at page 30 of our appellate brief. [00:19:35] Speaker 00: There's a block quote. [00:19:36] Speaker 00: And we said, we're not saying that BIPA doesn't apply to any claim brought by a non-user.
[00:19:45] Speaker 00: What we're saying is that BIPA doesn't apply to the incidental processing of a non-user in the course of recognizing an enrolled and consenting user. [00:20:00] Speaker 00: That's the narrow rule that we asked for below and that we are asking for on appeal. [00:20:04] Speaker 04: And what's your statutory basis for that reading? [00:20:08] Speaker 04: Because it's not from person, right? [00:20:11] Speaker 04: You agree that it's not in the use of the word person or customer. [00:20:16] Speaker 04: So where does that come from? [00:20:18] Speaker 00: It's in two places, Your Honor. [00:20:20] Speaker 00: To just pause on person for a second. [00:20:23] Speaker 00: So, person doesn't mean any person, any time, in any circumstances. [00:20:27] Speaker 00: It doesn't mean a person walking past a security camera that's out in front of a school. [00:20:32] Speaker 00: Person can mean a user. [00:20:33] Speaker 00: It can mean a customer. [00:20:35] Speaker 00: It can mean an employee. [00:20:36] Speaker 00: Most BIPA cases are brought by employees who are not customers. [00:20:40] Speaker 00: So, person can have a meaning that is less than every person in Illinois. [00:20:46] Speaker 00: In terms of the statutory hook, I would direct the court to two provisions. [00:20:50] Speaker 00: The first is section five of the statute, which is written right into the text and makes it very clear that what the legislature was doing here was regulating biometric technologies. [00:21:02] Speaker 00: It was not banning them, as the Supreme Court of Illinois and the Seventh Circuit have repeatedly observed. [00:21:09] Speaker 00: And now Mr. Zellmer admits that his proposed rule will constitute a ban on all [00:21:15] Speaker 00: biometric identifying technologies. [00:21:18] Speaker 00: But the other hook that the court is probably more interested in is right in 15B, which is the operative provision here.
[00:21:25] Speaker 00: Because 15B talks about the collection and capture of biometric identifiers and biometric information. [00:21:35] Speaker 00: This statute protects two interests. [00:21:37] Speaker 00: There have been, I don't know, hundreds of BIPA opinions, right? [00:21:41] Speaker 00: They all talk about two things. [00:21:43] Speaker 00: They talk about privacy. [00:21:45] Speaker 00: And they talk about identity theft. [00:21:48] Speaker 00: The incidental processing of a non-user, when that data is indisputably, and we can get to the record on that, deleted immediately if there is no match, it doesn't invoke those things. [00:22:01] Speaker 00: And as the court was getting at with the court's questions about what identifier means and what biometric information means and what collect and capture means, [00:22:11] 15B doesn't apply here. [00:22:13] Speaker 02: So that's where my hang up is because it seems to me that that's clear as to biometric information. [00:22:22] Speaker 02: Right. [00:22:23] Speaker 02: There is a hook that they have to be able to be identified. [00:22:25] Speaker 02: But then there is also, the statute does say you cannot collect or capture. [00:22:31] Speaker 02: We've got a separate issue about whether it's collect or capture, but whether biometric identifiers. [00:22:36] Speaker 02: And identifiers is just, I mean, I guess your position would be you could have a whole database of biometric information that [00:22:48] Speaker 02: But as long as it could never be used to identify anybody, it wouldn't be violated. [00:22:52] Speaker 02: OK. [00:22:53] Speaker 02: Tell me what. [00:22:54] Speaker 00: Well, because you have to. [00:22:57] Speaker 00: It's necessary to read the different provisions. [00:22:59] Speaker 00: That is not this case, right? [00:23:01] Speaker 00: This is not the Orwellian database case.
[00:23:03] Speaker 00: We are not saying that it is perfectly fine to go out and collect [00:23:09] Speaker 00: face signatures for lots of strangers and save them. [00:23:11] Speaker 02: So if you held on to these non-user templates for a lengthy period of time, you would agree that would be a violation? [00:23:18] Speaker 00: No. [00:23:18] Speaker 00: I'm not conceding that. [00:23:20] Speaker 00: I'm just saying that's not this case, because we didn't do that. [00:23:23] Speaker 00: I mean, that Orwellian data, and in fact. [00:23:26] Speaker 02: No, but it seems to me we have to give some meaning to identifier that's different than information. [00:23:32] Speaker 00: Yes. [00:23:33] Speaker 02: How do we do that? [00:23:34] Speaker 00: Thank you, Your Honor. [00:23:35] Speaker 00: Let me address that. [00:23:38] Speaker 00: BIPA works in an interesting way. [00:23:41] Speaker 00: So what the legislature did is it said, we're going to regulate these six things, iris scan, retina scan, scan of hand or face geometry, fingerprint, voice print. [00:23:51] Speaker 00: We're going to regulate those things. [00:23:52] Speaker 00: Those things, we're going to call them biometric identifiers. [00:23:56] Speaker 00: Then if you take one of those things and you turn it into derivative data, like you turn it into a number, you take a fingerprint and you turn it into a number, [00:24:05] Speaker 00: we want to make clear that that's still regulated. [00:24:08] Speaker 00: So that's biometric information. [00:24:11] Speaker 00: But what the courts have said about the term biometric identifier is that it's still something that needs to identify a person. [00:24:18] Speaker 04: Well, can I push back a little bit? [00:24:19] Speaker 04: Because it's converted as one of the words in biometric information. [00:24:24] Speaker 04: But it's also stored or shared, regardless of how it's captured. [00:24:28] Speaker 04: So it's more than just conversion of the data.
[00:24:30] Speaker 04: It could be what Judge Nelson was referring to as the storage of a bunch of biometric identifiers. [00:24:37] Speaker 00: No, because of the words derived from. [00:24:40] Speaker 00: Right. [00:24:40] Speaker 00: Biometric information is information that is derived from a biometric identifier and can be used to identify an individual. [00:24:49] Speaker 00: The reason why the legislature didn't have to make clear in the term biometric identifier that that, too, had to be something that identified a person is that it's right there in the name, biometric identifier. [00:25:01] Speaker 01: Well, maybe not only that. [00:25:02] Speaker 01: I mean, I'm not sure that this is near as complicated as [00:25:04] Speaker 01: We're talking about it. [00:25:05] Speaker 01: I mean all the things listed under biometric identifier can identify a person because they're all unique things to a person, a fingerprint, a voice, a face. [00:25:17] Speaker 01: Is it just that simple? [00:25:19] Speaker 00: It is. [00:25:19] Speaker 00: I mean, this case is simple and narrow. [00:25:22] Speaker 00: You can get at it from any one of these ways, right? [00:25:25] Speaker 00: You can get at this case from a super common sense point of view. [00:25:30] Speaker 00: that if the legislature wanted to ban biometric technologies, it would have written a much shorter statute. [00:25:36] Speaker 00: It would have said, biometric technologies are hereby banned in the state of Illinois, except for police forces and public schools. [00:25:44] Speaker 00: It didn't do that. [00:25:44] Speaker 00: It wrote this very sort of nuanced statute. [00:25:48] Speaker 00: And Mr. Zellmer, as we talked about, has admitted that if you have a rule that a company can't try to match a face to a database [00:26:00] Speaker 00: That pertains to how all of these systems work. [00:26:03] Speaker 00: That's why we had this amicus brief from the Security Industry Association. 
[00:26:06] Speaker 00: They don't care about tag suggestions. [00:26:08] Speaker 00: They're concerned that all of these useful security applications of biometrics would be outlawed in Illinois if a security camera outside a school, if the school were liable to passersby for checking their faces against an enrolled database and then dumping the data. [00:26:28] Speaker 00: So that's one way. [00:26:30] Speaker 00: to get at this. [00:26:31] Speaker 01: I want to focus in a little bit on the issue that I'm struggling with the most, I think, and that is trying to understand, and this relates to some of Judge Nelson's questions, I think, about the technology and whether we have a dispute of fact or just a dispute of law with regard to these facial signatures, and can signatures themselves be used to identify someone or not? [00:26:54] Speaker 00: We do not have any disputes of fact. [00:26:56] Speaker 00: I'm happy to walk through both of the things that Judge Donato talked about. [00:27:00] Speaker 00: So first on face signatures. [00:27:04] Speaker 00: There was a dispute about, there was no dispute, actually, of fact. [00:27:09] Speaker 00: We walked through this in our brief and we gave the record cites there, but there was no dispute that A, Facebook could not and did not identify non-users with the face signatures. [00:27:19] Speaker 00: Judge Donato actually found that that's all over pages five to seven of his opinion, that it was impossible for Facebook to identify these people because there was no match to a saved template. [00:27:29] Speaker 00: What the judge said, which is actually not [00:27:32] Speaker 00: true on the record is that there was a dispute about whether Facebook's face signatures were interoperable. [00:27:37] Speaker 00: In other words, if Facebook had given them to some other company, that other company could potentially have used them to match to somebody who was enrolled in their database.
[00:27:48] Speaker 00: There was no dispute of fact on that. [00:27:50] Speaker 00: Judge Donato pointed to two [00:27:52] Speaker 00: statements from the same witness, the second one of which didn't address interoperability. [00:27:57] Speaker 00: So there was no dispute on that. [00:27:59] Speaker 00: But on the main point... Hold on, back up. [00:28:01] Speaker 00: There was no dispute that that couldn't be done, or that that wasn't, [00:28:04] Speaker 00: it definitely wasn't being done. [00:28:06] Speaker 00: Mr. Zellmer has never alleged that that was ever done. [00:28:10] Speaker 00: And Judge Donato didn't find that it was ever done. [00:28:12] Speaker 00: Right. [00:28:12] Speaker 02: I understand that. [00:28:13] Speaker 02: But it could, I mean, it does raise an interesting question, because if you, I mean, this is one thing I have. [00:28:20] Speaker 02: Say there were some global database out there. [00:28:24] Speaker 02: Right now, Facebook, they only have the data on their users. [00:28:28] Speaker 02: They run it against their users. [00:28:29] Speaker 02: Say there was a global database where you could run it through that and it would capture non-users. [00:28:34] Speaker 02: Would that change this case? [00:28:37] Speaker 02: Because then those non-user templates presumably could be used to identify a non-user. [00:28:45] Speaker 02: I know that that's not here, but I got to get this in my head because I don't understand where the limits are here. [00:28:51] Speaker 00: Right. [00:28:55] Speaker 00: If I'm understanding the court correctly, the question, I think, is actually addressed by the two cases that we cited as supplemental authority, which both hold that in order- The district court opinions. [00:29:06] Speaker 00: The district court opinions, but they're from Illinois, and they're useful in terms of thinking about this.
[00:29:11] Speaker 00: in order for something to be a biometric identifier, the defendant has to be able to identify the person. [00:29:17] Speaker 02: It really can't... But that's my point. [00:29:18] Speaker 02: What if there was a Google program and you could send this data through Google and you could find it? [00:29:24] Speaker 02: Now, Facebook says, we didn't do that. [00:29:26] Speaker 02: We don't intend to do that. [00:29:28] Speaker 02: What if they could? [00:29:29] Speaker 02: Would that make this then a biometric, [00:29:33] Speaker 02: would it turn this biometric identifier into biometric information? [00:29:38] Speaker 02: Because with the click of a button, presumably you could run it through another database. [00:29:43] Speaker 00: I think it would be possible that if Facebook could collect face signatures that it could match to a database that it had access to, [00:29:58] Speaker 00: that might make it a biometric identifier, but still plaintiff wouldn't win. [00:30:02] Speaker 00: First of all, those are not our facts. [00:30:03] Speaker 00: But second, plaintiff wouldn't win. [00:30:05] Speaker 00: Because then plaintiff would have a second problem, which is that Facebook would have to do this in a way that would constitute collect, [00:30:14] Speaker 00: capture, or otherwise obtain. [00:30:16] Speaker 02: Right, which then we come back to the ephemeral collection issue. [00:30:21] Speaker 02: Right. [00:30:21] Speaker 02: Can you address this issue? [00:30:22] Speaker 00: Yes. [00:30:24] Speaker 02: Can you walk me through what the evidence says about when there were four face signatures created of Mr. Zellmer? [00:30:33] Speaker 02: You agree with that, right? [00:30:34] Speaker 00: Yes. [00:30:35] Speaker 02: Between 2010 and 2022 or whatever it was. [00:30:38] Speaker 02: Yes. [00:30:39] Speaker 02: One of those apparently matched, [00:30:41] Speaker 02: incorrectly, to a female user. What happened to that face signature?
[00:30:47] Speaker 02: It was still dumped? Because if it was still dumped, and this is what I was getting at and why I was pushing back, the implication from the evidence he's reading is, well, they only said they dumped it when it didn't match. What about when it does match? [00:31:03] Speaker 02: So where in the record does it say that that was done? [00:31:05] Speaker 00: Okay, so I want to point the court to SER 110 [00:31:11] Speaker 00: to 111. [00:31:13] Speaker 00: So this is testimony from Mr. McCoy, who is a Facebook engineer who was in charge of tag suggestions. [00:31:21] Speaker 00: And he testifies at 110, face signatures are not saved or stored. [00:31:26] Speaker 00: They exist only briefly during the next step of the process, classification, and are immediately discarded. [00:31:33] Speaker 00: At 111, he testifies that the comparison process during which face signatures exist takes only a tiny fraction of a second to complete. [00:31:41] Speaker 00: The comparison between a face signature and a template is performed in a few clock cycles of the CPU, which lasts just a few nanoseconds. [00:31:50] Speaker 00: And at 1.3435, his deposition, he says, he's asked, how long does a face signature exist? [00:31:56] Speaker 00: Oh, like very, very tiny fractions of a second. [00:32:00] Speaker 00: Once we're done with it, we toss it, we discard it. [00:32:03] Speaker 00: There is no face signature for Mr. Zellmer [00:32:07] Speaker 00: that was ever found. [00:32:09] Speaker 02: Even the one that inadvertently. [00:32:10] Speaker 00: Even the one that, no. [00:32:12] Speaker 04: Can I clarify that for a second? [00:32:14] Speaker 04: Because everything that you described is about if there is no match. [00:32:18] Speaker 04: But I take. [00:32:19] Speaker 00: Even if there is a match. [00:32:21] Speaker 04: So where in the record does it support that even if there's a mismatch, that data will be dumped?
[00:32:29] Speaker 00: I believe that in the expert testimony that the court was referring to, that Judge Forrest was referring to, [00:32:36] Speaker 00: He just said all that was saved was this erroneous tag. [00:32:40] Speaker 00: So what happened was that somebody else, there was a match to another user, that other user was tagged, a tag was suggested for that other user who was a woman in that photo. [00:32:53] Speaker 00: Nothing was saved about Mr. Zellmer. [00:32:55] Speaker 04: No, no, Mr. Zellmer's facial scan is not saved because of this mismatch? [00:33:00] Speaker 04: No. [00:33:01] Speaker 00: There was no face signature for Mr. Zellmer that was saved. [00:33:05] Speaker 02: And so to rule in your favor, do we have to find that the nanosecond holding is not a collection or capture or otherwise obtaining? [00:33:19] Speaker 00: No, because I think that the court could find, as Judge Donato did, that that just simply doesn't apply to this. [00:33:27] Speaker 00: But yes, the court could also find, and it's not about the nanosecond. [00:33:30] Speaker 02: Well, but I thought Donato said it didn't apply because it would be absurd to say you have to go get consent, and therefore he kind of read out person. [00:33:39] Speaker 02: I'm actually struggling with that aspect of Donato's opinion. [00:33:43] Speaker 02: So that's why I'm looking to say. [00:33:45] Speaker 00: Understood. [00:33:45] Speaker 02: If I don't agree with that, [00:33:48] Speaker 02: then we do have to squarely find that the nanosecond, whatever it is, does not constitute capture. [00:33:54] Speaker 02: Because if it did constitute capture, then we'd have a problem, right? [00:33:57] Speaker 00: I don't think so, Your Honor, because there are all these alternative arguments that we're making here. [00:34:02] Speaker 00: But I just want to point the court to a couple of cases on what constitutes that the Illinois courts are. 
[00:34:07] Speaker 02: OK, and the other argument is it could never be used to identify, and so therefore it. [00:34:12] Speaker 00: Correct. [00:34:13] Speaker 00: OK, yeah. [00:34:13] Speaker 00: And that the Illinois [00:34:15] Speaker 02: So if we drop that argument and say, it could never be used to identify, then we don't have to get into the collection issue. [00:34:23] Speaker 02: We just say, it's not covered by the statute because of that. [00:34:27] Speaker 00: You could do it that way. [00:34:28] Speaker 00: I want to point the court, because it's not really about nanoseconds, though that [00:34:34] Speaker 00: is a great way to think about this, because it was dumped after nanoseconds. [00:34:37] Speaker 00: That point isn't disputed. [00:34:38] Speaker 02: Well, I'm just going based on what the expert said. [00:34:40] Speaker 02: I mean, I don't know. [00:34:41] Speaker 02: I don't know if he ever timed it. [00:34:42] Speaker 02: I don't know if we have timers for nanoseconds. [00:34:44] Speaker 00: They did. [00:34:47] Speaker 00: It was a couple of turns of the CPU, Mr. McCoy said. [00:34:51] Speaker 00: But I want to get back to why it's not just about nanoseconds. [00:34:56] Speaker 00: The Illinois courts, we cited the Barnett case and the Hurd case for what collect and capture [00:35:02] Speaker 00: mean in this context. [00:35:04] Speaker 00: And I know I'm out of time and over. [00:35:06] Speaker 00: This is helpful. [00:35:07] Speaker 00: You can always be helpful to the court. [00:35:08] Speaker 00: OK. [00:35:09] Speaker 00: Thank you, Your Honor. [00:35:10] Speaker 00: The Barnett case and the Hurd case explain that collect and capture in 15B and possess in 15A connote something more than just the ephemeral existence of something. [00:35:26] Speaker 00: What they connote is a degree of [00:35:29] Speaker 02: Control. [00:35:31] Speaker 00: Intentionality and control. [00:35:33] Speaker 00: They say it involves.
[00:35:34] Speaker 02: What about otherwise obtain? [00:35:36] Speaker 02: He points to that language as well. [00:35:38] Speaker 00: Right. [00:35:38] Speaker 00: So courts have held that otherwise obtain, when you have it in a list of things like collect and capture, has to be read. [00:35:46] Speaker 02: Has to derive from the basic gist of those. [00:35:49] Speaker 00: It can't just be "and everything else." [00:35:51] Speaker 00: It has to be read in the context of those other words, which have a very specific meaning. [00:35:57] Speaker 00: On that point, Your Honor, I would point the court to the Barnett [00:36:00] Speaker 00: and Hurd cases. Barnett is from the Illinois Appellate Court. [00:36:04] Speaker 04: Can I ask, so I've wrestled a little bit with your argument that the data has to be used to identify for biometric identifiers. [00:36:15] Speaker 04: I get the point that this data is capable of identification, right? [00:36:19] Speaker 04: As Judge Forrest mentioned, you know, fingerprints, voice, you know, those are unique characteristics of a body that are capable of identifying a person. [00:36:30] Speaker 04: But your argument seems to be that if Facebook doesn't at the moment have the capacity to identify a non-user, it's therefore not a biometric identifier. [00:36:41] Speaker 04: And my concern is, couldn't there be a claim made about the misuse of biometric identifiers in some other way, even if Facebook can't identify them right then and there, such as [00:36:54] Speaker 04: collecting that data or selling it to a third party, and someone else can go and identify the non-user because they happen to have the information in their database. [00:37:04] Speaker 04: Why wouldn't that be an actionable claim under BIPA? [00:37:08] Speaker 00: Well, those would be other provisions of the statute, right? [00:37:11] Speaker 00: Sections 15C and 15D talk about transferring of biometric identifiers.
[00:37:17] Speaker 00: But in this situation, a face signature, which is this [00:37:22] Speaker 00: ephemeral thing that Facebook can't use to identify anybody, is just not a biometric identifier within the meaning of the statute. [00:37:31] Speaker 00: Mr. Zellmer's argument is that it is, because literally, and I would dispute this, this is something that would have gone to a jury. [00:37:39] Speaker 00: This is the one jury point in the whole case. [00:37:43] Speaker 00: We would dispute whether a face signature even is a scan of face geometry, but putting that to one side, even assuming that it is. [00:37:50] Speaker 00: What the Illinois courts have said over and over again, and I would point the court to the Rivera decision on this, is that you can't just say something's literally a scan of face geometry. [00:38:00] Speaker 00: You have to say, okay, but is it a biometric identifier? [00:38:06] Speaker 00: If it can't be used to identify, then it's not a biometric identifier. [00:38:11] Speaker 04: But if that same face scan happens to be matched to a user, then you can identify the facial scan. [00:38:19] Speaker 04: Well. [00:38:20] Speaker 00: Is that, yes? [00:38:21] Speaker 00: Then you can use, no. [00:38:22] Speaker 00: No. [00:38:24] Speaker 00: Then you can identify the user, but the thing that's actually the biometric identifier is the saved template that's associated with an actual person's identity. [00:38:33] Speaker 00: That's frankly why the users had a much stronger claim. [00:38:36] Speaker 00: Judge Donato defined the class in the user case as people who had stored templates. [00:38:43] Speaker 00: He said it's the storage. [00:38:44] Speaker 00: And the users in that case claimed that it was the stored template that was a biometric identifier because it was associated with their Facebook profile. [00:38:53] Speaker 00: It was associated with them and identified them. [00:38:55] Speaker 01: So this is a question I have related to that.
[00:38:57] Speaker 01: Can you take a face signature and with only the information in that signature create a template? [00:39:04] Speaker 00: No, because a template is actually something else entirely. [00:39:08] Speaker 00: A template is something, it's a hyperplane that analyzes. [00:39:15] Speaker 00: Oh, that clarifies it. [00:39:19] Speaker 00: Thank you. [00:39:21] Speaker 00: A template is something that is saved and stored, and it is constructed analyzing multiple face signatures from the same person. [00:39:31] Speaker 00: That's the definition of it. [00:39:32] Speaker 00: That's in Dr. Turk's report. [00:39:34] Speaker 00: But it's outside the bounds, I think, of this particular appeal. [00:39:38] Speaker 01: So putting it in sort of very low-level common speak, you need something more. [00:39:43] Speaker 01: You need more data, more information than what is contained in a signature. [00:39:47] Speaker 00: And you have to know who it belongs to. [00:39:48] Speaker 00: And fundamentally, Your Honor, that was the problem here, is that there's no dispute that Facebook didn't know who it belonged to because it didn't know who Mr. Zellmer was. [00:39:57] Speaker 00: And that's why it wasn't a biometric identifier, because it couldn't be used to identify him. [00:40:04] Speaker 00: And there's nothing in the record here. [00:40:05] Speaker 01: I guess that argument, I'm not quite sure that that part of the argument makes sense. [00:40:09] Speaker 01: Because if you had a template, for whatever reason, if Facebook had a template of a nonuser, and then that nonuser gets uploaded in a picture of a user, you can match and you can identify that person. [00:40:20] Speaker 01: Why wouldn't that be true? [00:40:22] Speaker 00: This is not a class action, though. [00:40:24] Speaker 00: This is, I mean, if you had a template for a nonuser, sure. [00:40:28] Speaker 01: Right, so that's what I'm just saying. 
[00:40:29] Speaker 01: As a matter of technology, if you actually had a template of a non-user, you could be identifying non-users. [00:40:39] Speaker 01: You don't have to have the template that is associated with an account. [00:40:44] Speaker 01: This account holder, here's his name and here's his email and here's his phone number and here's his template. [00:40:49] Speaker 01: You don't have to have all of that organized in such a way. [00:40:52] Speaker 01: If you have a template and that person ever shows up in some sort of photo in the system, you can do a match with that information. [00:41:01] Speaker 00: If you saved a template from them and knew who they were, sure, had some way of figuring out who they were. [00:41:05] Speaker 01: That's why I'm asking, is the signature, if the information in the signature can be used to create the template, then I think your argument is weaker. [00:41:15] Speaker 01: But your response to that is whatever is in the signature is not enough to make the template. [00:41:19] Speaker 00: Not by itself, Your Honor. [00:41:23] Speaker 02: Tons of questions, but we better let you sit down. [00:41:25] Speaker 02: Thank you. [00:41:26] Speaker 00: I very much appreciate the court's time. [00:41:27] Speaker 00: Thank you. [00:41:34] Speaker 02: Yeah, is that what he has left? [00:41:38] Speaker 02: Is it two and a half, two? [00:41:39] Speaker 02: Oh, you gave extra time. [00:41:40] Speaker 02: Oh, yeah, yeah, give him three minutes. [00:41:43] Speaker 03: Thank you very much, Your Honor. [00:41:45] Speaker 03: A few points right off the bat. [00:41:48] Speaker 03: I want to push back on the commentary that was given to the court about face signatures and what they are and what was done with them here. [00:42:00] Speaker 03: The citations that were given to the court [00:42:03] Speaker 03: a few moments ago by opposing counsel to supplemental excerpts of record 110 and 111.
[00:42:10] Speaker 03: That discussion relates exclusively to face signatures, not face templates, face signatures only. [00:42:18] Speaker 03: The question had been, do you keep anything? [00:42:21] Speaker 03: Did you keep anything of Mr. Zellmer's? [00:42:23] Speaker 03: And the citation to what is done with face signatures doesn't answer that question. [00:42:28] Speaker 03: Because here, the face signature was converted to a face template. [00:42:32] Speaker 03: And they've said that that's what happens to face signatures. [00:42:36] Speaker 03: They said it in the citations that I read earlier. [00:42:39] Speaker 03: And in Patel, a very, very succinct one sentence I can read to the court. [00:42:44] Speaker 03: What was said in that case at page 1268: the technology extracts the various geometric data points that make a face unique, such as the distance between eyes, nose, and ears, to create a face signature. [00:42:59] Speaker 03: So right there, the court in Patel is acknowledging that a face signature is a scan of face geometry. [00:43:04] Speaker 03: And the decision goes on to further note: [00:43:07] Speaker 03: The technology then compares the face signatures to faces in Facebook's database of face templates, i.e., face signatures that have been matched to profiles. [00:43:24] Speaker 03: And then if you look at the other citations I gave earlier, [00:43:27] Speaker 03: the evidence that we've got here is that face signatures are created off of every photo uploaded, whether user or nonuser. [00:43:35] Speaker 03: The comparison is run to the database of other stored templates for known users. [00:43:43] Speaker 03: And then if a match is created, the signature is converted to a face template. [00:43:48] Speaker 03: That's their language in their motion to dismiss, okay? [00:43:54] Speaker 03: And the signatures are thrown out [00:43:56] Speaker 03: if there's no match. [00:43:58] Speaker 03: So that's what happened.
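The process counsel just described, a signature is computed for every face in an uploaded photo, compared against the stored templates of known users, used (or, on counsel's account, converted and kept) on a match, and thrown out on no match, can be sketched roughly as follows. This is a minimal illustration only: every name, type, and the toy similarity function below is invented for exposition and is not Facebook's actual implementation or the record's terminology.

```python
# Hypothetical sketch of the matching flow described in argument.
# All names here (Template, compute_signature, suggest_tag) are invented
# for illustration; this is not the actual system at issue.
from dataclasses import dataclass

@dataclass
class Template:
    user_id: str     # a stored template is tied to a known, enrolled user
    vector: tuple    # summary said to be built from multiple signatures

def compute_signature(face_pixels) -> tuple:
    """Stand-in for feature extraction (the 'scan of face geometry')."""
    return tuple(face_pixels)

def similarity(a: tuple, b: tuple) -> float:
    # Toy similarity: 1.0 for identical vectors, 0.0 otherwise.
    return 1.0 if a == b else 0.0

def suggest_tag(face_pixels, stored_templates, threshold=0.9):
    """Return a matched user_id for a tag suggestion, or None.

    Per the testimony quoted above, the signature exists only for the
    duration of this comparison and is discarded either way.
    """
    signature = compute_signature(face_pixels)
    best_id, best_score = None, 0.0
    for tpl in stored_templates:
        score = similarity(signature, tpl.vector)
        if score > best_score:
            best_id, best_score = tpl.user_id, score
    del signature  # dropped after the comparison, matched or not
    return best_id if best_score >= threshold else None
```

The disputed step in the argument is precisely what this sketch does not model: whether, on a match, the signature is merely used for the suggestion and dropped, or is converted into a retained template.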
[00:43:59] Speaker 03: So in that one photo of Mr. Zellmer in this case, where his signature was created, and by the way, that alone is a scan of face geometry that qualifies as a biometric identifier. [00:44:10] Speaker 03: But beyond that, if we want to get into this ephemeral defense, that signature was converted in that one instance at a minimum into a template that was kept. [00:44:21] Speaker 03: It turned out to be a mismatch. [00:44:24] Speaker 03: The system didn't even know it was a mismatch until this case. [00:44:28] Speaker 03: Otherwise, how would they go back and find out that it was a woman? [00:44:31] Speaker 03: They obviously had the information still there. [00:44:35] Speaker 03: So we later found out it was a mismatch, but they create these mismatches, who knows how often. They characterize it as rare. [00:44:43] Speaker 03: But one out of four, perhaps it's not that rare. [00:44:48] Speaker 03: The second point I want to make: [00:44:49] Speaker 03: the legal argument being put forward here is they want a new exception. [00:44:55] Speaker 03: Judge Donato granted the exception that, I'm only going to apply BIPA to less than all persons, to persons who have a pre-existing relationship with the defendant. [00:45:06] Speaker 03: And since that's recognized to be a faulty argument by everyone, I believe, the new argument is, we want a different exception now. [00:45:13] Speaker 03: We want an exception that says, well, BIPA section 15B says nobody can obtain a biometric identifier or biometric information without consent. [00:45:26] Speaker 03: And this is the caveat that they want. [00:45:28] Speaker 03: Except if it's in the process of incidental processing [00:45:34] Speaker 03: of some comparative data. [00:45:36] Speaker 03: That incidental processing exception is a new exception, which isn't in the statute. [00:45:42] Speaker 03: It could have been written into the statute, but it's not there. [00:45:45] Speaker 04: And it's not for the court to...
Well, I think the argument that Facebook is making is if you don't read it that way, then you destroy an entire technology [00:45:54] Speaker 04: beyond just this case, and that the legislative findings indicated, and Judge Donato mentioned this as well, that the law was intended to regulate, not eliminate, you know, these applications. [00:46:08] Speaker 04: So why is that a crazy way to read it? [00:46:11] Speaker 03: Well, nobody's asking for a ban. [00:46:13] Speaker 03: They could have always limited the system to users who consented. [00:46:19] Speaker 02: Um, but by applying the face scanning technology, I'm not sure that's actually true, because you wouldn't know whether it's a user who'd consented until after you ran it. [00:46:29] Speaker 03: Well, they could, they could have prompted a user to confirm that before uploading a photo. [00:46:33] Speaker 02: So then you could only upload photos that were of users. [00:46:39] Speaker 02: Well, you could never upload a photo of me and my family. [00:46:42] Speaker 02: And then if four of my family, uh, [00:46:45] Speaker 03: are Facebook users and one isn't, I couldn't upload that photo. That's true, because Illinois has made a legislative decision to give individuals control over the collection and use of their biometrics. And if you can't get their consent, the statute is to promote the consensual use of biometrics, not the indiscriminate use of it here. And they're not, they're not a security company, [00:47:12] Speaker 03: and they're not a company that, but the analogy would apply the same there. That's the problem. You know, this isn't a security screening case. The statute is what it is, and now you're just saying, oh, well, we're not going to read in this exception, but we're going to go ahead and read in a security screening case, uh...
[00:47:28] Speaker 02: exception. I don't, that's not this case. This is, this is about, we've got to interpret this in a way, we've got to be mindful of, however difficult this case is, and maybe it's not, at the end of the day it's going to get more complicated. I mean, the technology is going to increase, and so we've got to get this right. I would submit, Your Honors, that the statute as clearly written should be enforced and applied. [00:47:57] Speaker 03: If there needs to be exceptions, if there are reasonable exceptions that could be drawn, the legislature is the place to go for that, [00:48:05] Speaker 03: not the court of law. [00:48:06] Speaker 03: And as to the facts, I believe we've shown that the facts, where all inferences should be drawn in our favor, are sufficient. [00:48:13] Speaker 03: I don't believe that this court should substitute its judgment for Judge Donato's, who's very familiar with the technology after having presided over the prior Facebook case and the detailed record in this case. [00:48:25] Speaker 03: I don't believe that the court should take Facebook's invitation to overrule Judge Donato's [00:48:30] Speaker 03: conclusion that there was not enough factual development to make these factual findings. [00:48:35] Speaker 03: So thank you. [00:48:37] Speaker 02: We have your argument. [00:48:38] Speaker 02: Thank you. [00:48:38] Speaker 02: Thank you very much. [00:48:39] Speaker 02: Thank you to both counsel for your arguments in the case. [00:48:41] Speaker 02: The case is now submitted.