[00:00:33] Speaker 01: Okay, the next argued case is number 18-1470, StrikeForce Technologies, Incorporated, against SecureAuth Corporation. [00:00:56] Speaker 03: Mr. Dreimeier. [00:00:57] Speaker 03: May it please the Court. [00:00:58] Speaker 03: As this court held in Ancora, which we addressed in the 28J letter, improving computer security can be a non-abstract improvement if done by a specific technique that departs from earlier approaches to solve a specific computer problem. [00:01:17] Speaker 03: And that is precisely what we have here. [00:01:19] Speaker 03: In fact, the analogy between the systems and improvements here and in Ancora is very close. [00:01:25] Speaker 03: In Ancora, as here, the problem was that the software license was subject to hacking. [00:01:31] Speaker 03: That's at 1344. [00:01:33] Speaker 03: Now some systems had tried to improve upon the prior art, such as with a unique identification code. [00:01:41] Speaker 03: And that's true here as well, as in the Wessinger patent that is described and disparaged in the specification of these patents. [00:01:53] Speaker 03: There, attempts had been made to place the license in a specific place, a non-modifiable ROM. [00:02:00] Speaker 03: What the claims that were upheld there did was break with that traditional approach by placing the license in a different place, in a non-modifiable part of the memory in the BIOS. [00:02:15] Speaker 03: That is the same thing as we have here. [00:02:17] Speaker 03: Prior art systems had the problem of [00:02:22] Speaker 03: all of the identification information coming in through the access channel, whether it was in-band or even whether it was a partially out-of-band system. [00:02:32] Speaker 03: In all of those systems, the identification information was coming back in through the access channel, which made it particularly susceptible to hacking. [00:02:42] Speaker 03: And in a self-authenticating environment, where everybody is really operating in an anonymous fashion, if you have the credentials, [00:02:51] Speaker 03: you look like the real person, whether you may be a hacker or not, and there was no way to authenticate that the person who presented the credentials was who they said they were. [00:03:03] Speaker 03: And so these claims solved that problem by relocating the authentication to a different place, just as was true in Ancora. [00:03:14] Speaker 03: In contrast to the other systems where you had either in-band or even partially out-of-band, always coming back through the access channel, [00:03:21] Speaker 03: here you have two separate channels. [00:03:25] Speaker 02: You have the access channel. [00:03:26] Speaker 02: Suppose we were to disagree with you on that and to say that the idea of having two separate channels is an abstract idea, that the complete separation doesn't involve an inventive concept. [00:03:41] Speaker 02: Is there anything about the diversion, the interception, that is an inventive concept? [00:03:49] Speaker 03: I think here, Your Honor, it is true that the specific solution, because here we don't purport to preempt the notion of complete out-of-band communication or even complete out-of-band authentication, because you could have a system in which the accessor initiates the communication with the security computer and establishes the authentication channel. [00:04:15] Speaker 03: That would not be covered by these claims [00:04:18] Speaker 03: if it simply at that point goes directly from the security computer to the host computer and says, grant access.
[00:04:26] Speaker 02: Okay, but I understand that you're arguing that that means that there's not complete preemption here, but are you arguing that the diversion, the interception, is itself an inventive concept? [00:04:37] Speaker 03: I think that the interception device really underscores the inventive aspect of this, though it's not of itself an inventive concept. [00:04:47] Speaker 03: Not in and of itself, but what it does is underscore that, as in DDR, you're overriding the traditional method in which identification is verified. [00:05:00] Speaker 03: In the traditional method, it goes straight down the access channel, and all you're able to do is establish that the credentials are credentials that match up with authorized credentials, but you're not able to establish that the person presenting those credentials [00:05:16] Speaker 03: is in fact the authorized user. [00:05:19] Speaker 03: What this does, and the interception device I think is sort of the exclamation point on it, is it cuts that off; it overrides that traditional process by sending it off into the separate authentication channel. [00:05:33] Speaker 03: And then in the separate authentication channel there are specific components that are arranged in a way that allow you to know that the [00:05:41] Speaker 03: accessor is in fact the authorized accessor, and that's because you have a predetermined address or telephone number that you reach out to, to establish that authentication channel, and the fact that the person answers and is able to supply the predetermined data in response, again through the authentication channel, allows you to know that it is in fact the authorized user. And in fact, [00:06:11] Speaker 03: unlike the prior systems, some of which did use biometric data, use of biometric data in a system in which all the information is coming back through the access channel doesn't really significantly improve security. [00:06:27] Speaker 03: Because if you have a hacker there and they capture the biometric data, your fingerprint, all that is is digitized data. [00:06:35] Speaker 03: It's the same as any other digitized data, and it can be replicated by the hacker in a later attempt to access. [00:06:42] Speaker 03: If, however, you have two physically separate channels, the authentication channel completely physically independent of the access channel, then if you use biometric data, it's not going to be captured unless the hacker simultaneously is hacking two different channels at the same time. [00:07:03] Speaker 03: In this way, you know that the person who's inputting the biometric data in the authentication channel is in fact the authorized user. [00:07:12] Speaker 03: It's those improvements over the prior art that are described, and they have to be accepted as true at this point, which is a motion to dismiss, under Berkheimer, Aatrix, and Bascom. [00:07:26] Speaker 03: It has to be accepted as true. [00:07:29] Speaker 03: And I think really, again, as in Ancora, the court said, [00:07:33] Speaker 03: how are we to know whether these are in fact improvements over the prior art, as they are claimed to be in the specification? [00:07:41] Speaker 03: We have to accept that that's true at this point in the litigation. [00:07:46] Speaker 03: What the court has consistently advised against is treating the 101 inquiry as though it were 103-lite. [00:07:56] Speaker 03: And that warning is particularly apposite here because these claims have already survived [00:08:02] Speaker 03: multiple 103 challenges.
[00:08:04] Speaker 03: There were two IPRs, each with multiple grounds that were not instituted, because the prior art that they had to cobble together, three or more references, did not disclose all of the limitations of these claims. [00:08:21] Speaker 03: And so where you have something that is sufficiently robust and inventive, non-obvious enough to survive multiple 103 challenges, [00:08:29] Speaker 03: to allow somebody to waltz in and say, well, it's a lot like when you try to go into a preschool, and try to somehow map it onto that. [00:08:40] Speaker 01: But they did say that every step was known and that the combination of steps was known. [00:08:46] Speaker 01: I mean, it isn't that clear as a given. [00:08:51] Speaker 01: Is it that these other aspects were set aside? [00:08:56] Speaker 03: Your Honor, I want to be clear. [00:08:58] Speaker 03: The preschool analogy that they use to suggest that all these steps were already known is not our system. [00:09:06] Speaker 03: In that hypothetical that they put forward, Aunt Sally, whoever the relative is that's coming, all they are able to do is to verify that Aunt Sally has permission to pick up the child. [00:09:21] Speaker 03: They're not able to verify that Aunt Sally is Aunt Sally. [00:09:24] Speaker 03: It's the latter problem that is the real problem of computer technology, where you have a man in the middle, the hacker, in the self-authenticating environment. [00:09:35] Speaker 03: Because whoever comes in and says, I'm Aunt Sally, looks just like Aunt Sally. [00:09:40] Speaker 03: They're anonymous. [00:09:41] Speaker 02: They're just presenting the credentials. [00:09:43] Speaker 02: All you need to do is change the hypothetical and ask the parent what Aunt Sally looks like. [00:09:50] Speaker 03: Well, Your Honor, but again, I think because of the unique [00:09:54] Speaker 03: nature of computer technology, it could be Aunt Sally's twin. [00:10:00] Speaker 03: Aunt Sally has authorization, but Aunt Bertha, who is Aunt Sally's twin, does not. [00:10:05] Speaker 03: And you can't distinguish them from the credentials. [00:10:09] Speaker 03: From the credentials, they each have the same driver's license. [00:10:13] Speaker 03: You can't distinguish them. [00:10:14] Speaker 03: That's the true problem of user authentication. [00:10:20] Speaker 03: Authenticating that the person is who they claim to be. [00:10:24] Speaker 03: All of the other systems are designed to make sure that the credentials are authorized credentials. [00:10:32] Speaker 03: They don't authenticate that the user, the accessor, is the authorized accessor. [00:10:36] Speaker 03: And that's what these claims do, precisely because of the particular architecture that they use. [00:10:43] Speaker 03: It may be that the elements of it, a router, a security computer, et cetera, are known elements, [00:10:52] Speaker 03: but they are arranged in a way here that allows you to do something that the prior art was not able to do. [00:10:58] Speaker 03: It's that improvement that is claimed. [00:11:01] Speaker 03: And again, it's not all out of band. [00:11:05] Speaker 03: There are lots of out of band systems that don't do it the way that is claimed here. [00:11:09] Speaker 03: In fact, Wessinger is a version of out of band. [00:11:12] Speaker 03: It's partially out of band because you send the information out over an out of band channel, but it comes back through the access channel. [00:11:20] Speaker 03: And again, as I said before, it doesn't even preempt all completely out of band systems.
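For readers following the architecture counsel describes, here is a minimal sketch, in Python, of the two-channel idea: credentials arrive on an access channel, access is withheld until a security computer completes a check over a separate out-of-band channel reaching a predetermined address, where predetermined data must be supplied. All names here (LoginAttempt, SecurityComputer, the callback parameters) are hypothetical illustrations of the concept being argued, not elements drawn from the claims.

from dataclasses import dataclass
from typing import Callable

@dataclass
class LoginAttempt:
    username: str
    password: str

class SecurityComputer:
    """Checks the person, not just the credentials, over a second channel."""

    def __init__(self, subscriber_db: dict):
        # Maps username -> {"oob_address": ..., "expected_data": ...}, i.e. a
        # predetermined out-of-band address and predetermined data on file.
        self.subscriber_db = subscriber_db

    def authenticate_out_of_band(self, username: str,
                                 send_challenge: Callable[[str], None],
                                 receive_response: Callable[[str], str]) -> bool:
        record = self.subscriber_db.get(username)
        if record is None:
            return False
        # Reach out to the predetermined address (e.g., a phone) on the separate channel.
        send_challenge(record["oob_address"])
        # The answering user must return the predetermined data on that same channel.
        return receive_response(record["oob_address"]) == record["expected_data"]

def handle_login(attempt: LoginAttempt, credential_store: dict,
                 security: SecurityComputer,
                 send_challenge: Callable[[str], None],
                 receive_response: Callable[[str], str]) -> bool:
    # Access is never granted on the credentials alone; the request is held
    # until the out-of-band check on the second channel succeeds.
    if credential_store.get(attempt.username) != attempt.password:
        return False
    return security.authenticate_out_of_band(attempt.username,
                                             send_challenge, receive_response)

The only point of the sketch is the separation: capturing traffic on the access channel alone would not reveal the predetermined data exchanged on the second channel.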
[00:11:25] Speaker 03: But what this completely out of band system does, arranged as these elements are arranged, is it allows you to ensure that the person seeking access is in fact the authorized accessor. [00:11:41] Speaker 01: What allows you to do it is the advancement, the capability of technology. [00:11:45] Speaker 01: And what they say, and what the court held, was that this is an old idea. [00:11:49] Speaker 01: It's just that we now are able to implement it. [00:11:53] Speaker 03: No, Your Honor. I think that what's true, I mean, again, is the sending of the PIN out through the mail in the bank example, which they give, and then you bring the PIN back in. [00:12:04] Speaker 03: All the PIN is is a password. [00:12:06] Speaker 03: That's the old art. [00:12:08] Speaker 03: All the PIN is is the password, because it's coming back in through the access channel. [00:12:13] Speaker 03: You may be an imposter, but if you were there when the mail was delivered and you picked up the PIN, you're able to use the PIN. [00:12:19] Speaker 03: So you don't know whether the person is who they claim to be, only that they have the right credential. [00:12:26] Speaker 03: What this system does is organize things, especially, and I think it's critical again, the physical separation of the channels, so that no hacker can get the authenticating information. [00:12:39] Speaker 03: Because the hackers are in the access channel and they're not in the authentication channel, there's nothing there for them. [00:12:45] Speaker 03: There's no information there. [00:12:46] Speaker 03: They're not in the authentication channel. [00:12:48] Speaker 03: Because they're not in the authentication channel, if, for example, you use the biometric data, the fingerprint, they don't get it. [00:12:57] Speaker 03: And you know that the accessor is the accessor. [00:13:00] Speaker 03: It's that unique combination. [00:13:02] Speaker 03: And the specification uses that language. [00:13:06] Speaker 03: And in Berkheimer, in Aatrix, in Bascom, the court has said that you have to accept these allegations: that it is an improvement, [00:13:14] Speaker 03: that it is not well-understood, routine, and conventional, [00:13:18] Speaker 03: and that it improves on the prior art. [00:13:24] Speaker 03: Those have to be accepted as true at this point in the process. [00:13:28] Speaker 03: So at most, we would be entitled to a remand to establish that those facts as alleged in the specification are true. [00:13:38] Speaker 03: I think that you can actually resolve this at step one for the reasons that I've already stated. [00:13:43] Speaker 03: I would like to reserve the balance of my time. [00:13:46] Speaker 03: Okay, thank you. [00:13:48] Speaker 01: We'll hear from Mr. Anacole whenever you're ready. [00:14:05] Speaker 00: May it please the Court. [00:14:07] Speaker 00: The decision below should be affirmed because the District Court correctly held that the focus of the claims here is out-of-band authentication, and that idea is not specific to computer technology. [00:14:19] Speaker 00: And we can see that from this court's cases, such as Secured Mail and Prism, both of which were addressed in the briefs, as well as the case that came out after our briefing was complete, Asgari-Kamrani, No. 2016-2415, which also dealt with a form of out-of-band authentication.
[00:14:37] Speaker 00: So there have been three cases where specific ways of performing authentication in a computer network were held to be abstract, and those are the cases that we're relying on. [00:14:48] Speaker 00: The cases that opposing counsel, StrikeForce's counsel, is pointing to do not even relate to authentication. [00:14:55] Speaker 00: They're just simply not as analogous as the cases that we're relying on. [00:14:59] Speaker 00: And the reason that Secured Mail, Prism, and Asgari-Kamrani held the claims there ineligible is because authentication, even when using different communication channels, is not a computer-specific problem. [00:15:14] Speaker 00: And the use of multiple channels [00:15:16] Speaker 00: is not a computer-specific solution. [00:15:18] Speaker 00: Communication channels could be paper mail, they could be telephones, they could be all manner of different communication channels, and that's what our preschool analogy illustrates. [00:15:29] Speaker 00: And I'll note that in Prism and Secured Mail, this court relied on analogies to determine that the ideas at issue were not computer-specific, and that's what we've done here. [00:15:41] Speaker 00: I do want to address StrikeForce's criticism of our analogies. [00:15:46] Speaker 00: They mention this idea of authenticating who Aunt Sally is. [00:15:51] Speaker 00: I don't see anything in the claims that requires a verification of the identity of a person apart from verifying that they have the correct information. [00:16:03] Speaker 00: So there's nothing in the claims that reflects this idea of authenticating who Aunt Sally is as opposed to just that they have the right login identification. [00:16:17] Speaker 00: And so I think that that distinction just doesn't hold water. [00:16:26] Speaker 00: I'd also like to address Ancora. [00:16:31] Speaker 00: So that case, again, found that the claims were not abstract because they were directed to a computer-specific solution to a computer-specific problem. [00:16:43] Speaker 00: Specifically, there was this BIOS memory, [00:16:45] Speaker 00: and it was alleged that the structure of this BIOS memory, which is of course unique to computers, is what allowed the solution there to prevent hacking. [00:16:54] Speaker 00: So specifically, the BIOS memory had multiple areas, the BIOS ROM and the BIOS E-squared PROM. [00:17:02] Speaker 00: And that latter part is where this license verification structure was stored. [00:17:07] Speaker 00: And the reason that that provided an improvement to computer security was that, as the patent explained, [00:17:15] Speaker 00: the BIOS E-squared PROM is more difficult for a programmer to access. [00:17:23] Speaker 00: And when you do access it improperly, there's a higher risk that the computer will be disabled. [00:17:28] Speaker 00: So they're relying on this very specific computer structure, a memory within the computer that has different traits from other memories that make tampering with that memory less likely to be successful for a hacker. [00:17:43] Speaker 00: Here, we're not [00:17:44] Speaker 00: seeing any structure that is computer-specific that's being relied on to provide the alleged benefits. [00:17:53] Speaker 00: The supposed resistance to hacking comes from the use of two separate communication channels. [00:17:58] Speaker 00: Communication channels, as I've said, are not computer-specific. [00:18:02] Speaker 00: And using two separate communication channels improves security outside of computers as well.
[00:18:08] Speaker 00: And we saw that in Secured Mail. [00:18:11] Speaker 00: We saw that in Asgari-Kamrani. [00:18:13] Speaker 00: And let me just explain a little bit more, since that wasn't in the briefs, because it hadn't come out yet, why there were two channels at issue there. [00:18:21] Speaker 00: So I do want to note that that's a Rule 36 affirmance. [00:18:23] Speaker 00: So we don't know what the reasoning of this court was. [00:18:26] Speaker 00: But we do know. [00:18:28] Speaker 00: And it's not precedent. [00:18:29] Speaker 00: That's correct. [00:18:29] Speaker 00: It's not precedent, but it is persuasive, I believe, under Federal Circuit Rule 32.1. [00:18:35] Speaker 00: So what we know is the content of the claim there. [00:18:37] Speaker 02: So Rule 36 is persuasive? [00:18:40] Speaker 02: Non-precedential dispositions are persuasive? [00:18:46] Speaker 00: I believe, well, I guess Your Honor is of course free to decide however persuasive you think it is, but I'll just explain what the facts were there and what the claim said. [00:18:59] Speaker 00: The claim said that you have a central authority, an external... I'm sorry, a central entity, an external entity, and a user. [00:19:09] Speaker 00: The user is trying to access the external entity. [00:19:12] Speaker 00: So they first start a transaction with that entity, and then in order to authenticate that transaction, they separately communicate with the central entity to get a code. [00:19:25] Speaker 00: So there are communications between the external entity and the user, that's one communication path, and communications between the central entity and the user. [00:19:36] Speaker 00: And those two separate communication paths are both needed to perform the authentication. [00:19:43] Speaker 00: But setting that aside, since it is a Rule 36 judgment, let's take a look at Secured Mail. [00:19:49] Speaker 00: Again, that's a case that, as we've pointed out, uses not just out-of-band authentication but completely out-of-band authentication, which is what StrikeForce is relying on here. [00:19:59] Speaker 00: And in that case, a person receives a piece of mail [00:20:05] Speaker 00: through the normal postal service and then wants to validate or authenticate that mail. [00:20:11] Speaker 00: And so they initiate a computer connection, and over the computer connection you authenticate that the mail is authentic. [00:20:21] Speaker 00: And that's a bi-directional out of band communication channel because it's separate from the paper mail. [00:20:28] Speaker 01: You're presenting arguments under 102 or 103, not 101. [00:20:34] Speaker 00: No, I disagree, Your Honor. [00:20:36] Speaker 00: I'm explaining that in these previous cases, Secured Mail, Prism, Asgari-Kamrani, this court found that all of the claims were directed to abstract ideas, not because they're old. [00:20:50] Speaker 01: But then most cases depend on their facts. [00:20:53] Speaker 01: I know you're drawing an analogy of the facts, but in the 101 cases, the abstract ideas, lines have been drawn as to when [00:21:03] Speaker 01: it is or isn't an abstraction. [00:21:08] Speaker 00: The point that I'm trying to make, Your Honor, is that what made the ideas in those claims abstract is that they did not provide a technology-specific solution to a technology-specific problem. [00:21:23] Speaker 01: But your friend says that in this case, technology-specific details sufficiently have been provided.
[00:21:31] Speaker 00: And the details [00:21:33] Speaker 00: which opposing counsel provides that are allegedly technology-specific come down to this use of two separate channels. [00:21:39] Speaker 00: And that's why I'm pointing to Secured Mail to show that the use of two separate channels to perform authentication is not technology-specific. [00:21:47] Speaker 00: Secured Mail negates the idea they're proposing, that the use of two separate channels to perform authentication is technology-specific. [00:21:55] Speaker 01: So what would have been needed? [00:21:57] Speaker 01: The code, the computer code, to overcome this gap? [00:22:03] Speaker 00: It's not a question of inadequate disclosure of the use of two separate channels. [00:22:10] Speaker 00: It's that the use of two separate channels itself is not technology-specific, because the use of communication channels is not limited to computers. [00:22:19] Speaker 00: And that's why we provide this analogy to show that the use of multiple channels for authentication is performed outside of the computer context. [00:22:27] Speaker 00: So they would need a completely different focus of their claims. [00:22:31] Speaker 00: They would need something like [00:22:33] Speaker 00: exploiting the structural traits of the BIOS, as in Ancora, as distinguished from other computer memory. That is a computer-specific solution because it arises specifically from the structure of a computer component. [00:22:46] Speaker 00: Here there's nothing they're pointing to that arises specifically from the structure of a computer component that they're relying on for the solution. [00:22:53] Speaker 01: The whole focus is computer hacking. [00:22:56] Speaker 01: I'm trying to understand where the line appropriately should be drawn [00:23:02] Speaker 01: between adequately overcoming 101 and not doing so. [00:23:08] Speaker 00: I think characterizing this as hacking, Your Honor, is too general of a characterization to be useful. [00:23:14] Speaker 00: What they mean by hacking here is simply intercepting data that's in transit, right? [00:23:19] Speaker 00: Intercepting information that's being communicated. [00:23:20] Speaker 01: There's no dispute that that's the target. [00:23:24] Speaker 01: Sorry? [00:23:25] Speaker 01: Interception. [00:23:26] Speaker 01: To avoid interception. [00:23:29] Speaker 00: Right, to avoid interception of information being communicated. [00:23:33] Speaker 00: That problem is not computer-specific because information is routinely intercepted in communication channels. [00:23:42] Speaker 02: Well, it's computer specific, but it's applying the same concept that exists outside of computers in the computer context. [00:23:51] Speaker 00: The word hacking is a computer-specific word. [00:23:57] Speaker 00: But I'm saying the substance of what they were using that word to refer to here, interception in a communication channel, is not computer-specific. [00:24:06] Speaker 00: And that's why, you know, they refer to Ancora, because Ancora used the word hacking, but it was using that word to refer to something different. [00:24:13] Speaker 00: It was using that word to refer to tampering with data stored on the computer to obtain rights to run software that you did not own. [00:24:22] Speaker 01: So it would have solved the 101 problem if they had explicitly said we're looking at computer hacks? [00:24:30] Speaker 00: No, Your Honor.
[00:24:34] Speaker 00: The word computer... the word hacking in this context simply means intercepting communications in a communication channel, which is not computer-specific. [00:24:46] Speaker 01: So that would have had to say intercepting computer communications. [00:24:51] Speaker 01: Is that the distinction that you're drawing? [00:24:53] Speaker 00: No, no, that's just a verbal formulation, Your Honor. [00:24:56] Speaker 00: I'm talking about the substance of what they're pointing to. [00:24:59] Speaker 01: I'm trying to get at the substance of where, when it's crystal clear they're talking about computers and hacking computer communications. [00:25:09] Speaker 01: So I'm trying to understand the gap between 101 and 103. [00:25:16] Speaker 00: 103 is about what's old. [00:25:18] Speaker 00: 101 is about what's computer-specific. [00:25:21] Speaker 00: And the reason I say that the so-called hacking here is not computer-specific is because it's no different than intercepting communications in a non-computer communication channel, unlike the hacking in Ancora, which was specifically about accessing computer memory and the use of different types of computer memory. [00:25:46] Speaker 00: Let me also make this point. [00:25:48] Speaker 00: Hacking is the problem, right? [00:25:50] Speaker 00: In order to have a non-abstract solution, you need both the problem and the solution to be computer-specific. [00:25:59] Speaker 00: So I think, for the reasons I've already explained, that they don't have a technology-specific problem here. But, you know, that's not the only issue. [00:26:08] Speaker 00: The other issue is, do they have a computer-specific solution? [00:26:12] Speaker 00: And the answer to that is also no. [00:26:15] Speaker 00: Because their alleged solution is to use two separate communication channels in case the communications in one channel are intercepted. [00:26:26] Speaker 00: And that solution is not specific to computers, because it works with paper mail, as we saw in Secured Mail. [00:26:33] Speaker 00: It works with telephones, as we see in our preschool analogy. [00:26:37] Speaker 00: And so even if they did have a computer-specific problem of hacking, their way of preventing that hacking [00:26:45] Speaker 00: is not a computer-specific solution. [00:26:47] Speaker 00: And I do want to also address this interception issue because this is... [00:26:55] Speaker 01: You can't mean that. You say even if we all agree that this is a computer problem, don't bother looking at computer issues and solutions. [00:27:07] Speaker 00: If they had a computer-specific solution, [00:27:10] Speaker 00: Your Honor, that would be patent eligible. [00:27:13] Speaker 01: But why does that convert the issue into an abstract idea? [00:27:18] Speaker 00: That is the dividing line that this court has identified between abstract ideas and patentable computer-specific improvements. [00:27:29] Speaker 00: So in Ancora, in Enfish, in Finjan, the common trait that [00:27:38] Speaker 00: links all of those cases, Your Honor, is: do you have a computer-specific solution to a computer-specific problem? And they don't here. And that's why the fact that the solution is implemented by a computer is not enough; implementing an information-based solution in a computer environment is not enough. And we pointed to, for example, Intellectual Ventures I versus [00:28:05] Speaker 00: Capital One Financial for that proposition, but that's well established in the case law.
[00:28:09] Speaker 00: Just as implementing escrow or intermediated settlement in a computer environment under Alice didn't prevent the claims from being abstract. [00:28:20] Speaker 00: And that's what we have here. [00:28:22] Speaker 00: A communication channel solution that's simply implemented in a computer environment. [00:28:28] Speaker 00: I don't deny that it could be useful, just as implementing intermediated settlement in a computer environment could be useful, [00:28:35] Speaker 00: but it doesn't make it computer-specific, and that's a crucial dividing line in the case law. [00:28:40] Speaker 00: Before I run out of time, I just want to address this interception issue. [00:28:45] Speaker 00: The agreed construction of interception device is something that prevents the host computer from receiving. [00:28:51] Speaker 00: That is a purely functional and generic construction, and such purely functional and generic elements cannot provide an inventive concept as a matter of law. [00:29:01] Speaker 00: I also want to point out that the interception device [00:29:04] Speaker 00: does not even have to be separate from the host computer, according to StrikeForce itself, for example, at appendix 799. [00:29:12] Speaker 00: And there are embodiments that StrikeForce has referred to in its reply brief showing that the interception device and the host computer may be the same device. [00:29:21] Speaker 00: So to say that the interception device shields the host computer is false, because StrikeForce has said that the interception device may be part of the host computer. [00:29:29] Speaker 00: And even if that wasn't the case, because it's purely functional [00:29:33] Speaker 00: and generic, it cannot provide an inventive concept as a matter of law. [00:29:41] Speaker 01: Thank you. [00:29:42] Speaker 01: Mr. Dreimeier, you have four minutes. [00:29:49] Speaker 03: Thank you, Your Honor. [00:29:49] Speaker 03: I want to start with the question of where you have to draw the line, because as this Court has noted, if you draw it at too high a level of generality, the exception swallows the rule. [00:30:00] Speaker 03: And my opponent suggested that these claims are not about user identification, but that fails to recognize the critical feature of these claims. [00:30:11] Speaker 03: And I want to just point to several points where that is clear. [00:30:15] Speaker 03: In this specification, column 2, lines 20 to 22, it says that the technical problem that's addressed is that the hacker is in a self-authenticating [00:30:25] Speaker 03: environment. [00:30:28] Speaker 02: In other words, the hacker appears to be who they are. [00:30:31] Speaker 02: That's true. [00:30:33] Speaker 02: He was wrong about that. [00:30:33] Speaker 02: But the fact is that in the non-computer context, we have the same problem with the same solution. [00:30:39] Speaker 03: Well, Your Honor, I think the case that says this best is a recent case that we did not cite, Data Engine Technologies v. Google, 906 F.3d 999 (Fed. Cir. 2018), [00:30:55] Speaker 03: which came down after briefing, and made clear that it's not sufficient just to be able to trace to some real-world analogy. [00:31:05] Speaker 03: It has to be an apposite one.
[00:31:08] Speaker 03: And here the preschool analogy is not apposite, because in order to actually practice these claims, in every instance when anybody, even if they're on the permitted list, comes to pick up the child, you would have to call that person's cell phone, [00:31:24] Speaker 03: which they would have to have in hand, to be able to verify that they are in fact the person who's on the list and is an authenticated user. [00:31:32] Speaker 03: That never happens. [00:31:34] Speaker 03: So all they're doing is they're sort of waving their hands. [00:31:37] Speaker 03: Something looks like it. [00:31:39] Speaker 03: It's the precise problem of who wants to be linked. [00:31:41] Speaker 02: It sure sounds like something that happens pretty frequently in the real world in the era before computers. [00:31:46] Speaker 02: I mean, take the example of somebody picking up cash from a bank [00:31:51] Speaker 02: with a note from the depositor, and the bank decides to call the depositor's landline to ask if the person who's picking it up is the person who's authorized to pick it up. [00:32:03] Speaker 02: I mean, you can come up with lots of different examples of this dual-channel utility. [00:32:08] Speaker 03: But once again, Your Honor, what that doesn't do is authenticate that the person who's there is in fact the employee, as opposed to someone who's masquerading as the employee. [00:32:19] Speaker 01: And this goes back to whether... [00:32:21] Speaker 01: Every method of authentication without limit, isn't that part of the problem? [00:32:28] Speaker 01: And I think part of the foundation for the decision that this was abstract. [00:32:36] Speaker 03: Well, Your Honor, what's critical is that with these claims, because of these separate authentication channels, you reach out to a cell phone or PDA that's already identified in a database, and the person [00:32:51] Speaker 03: who answers it has to give back predetermined information. [00:32:55] Speaker 03: This is along a second channel. [00:32:57] Speaker 03: So it's not somebody who's sitting in the access channel and may have gotten it before. [00:33:03] Speaker 03: That is a significantly higher degree of certainty. [00:33:07] Speaker 02: In the bank example, you say, put the guy on the phone. [00:33:10] Speaker 02: I want to listen to his voice to see if this is the person that I sent. [00:33:15] Speaker 02: Is it voice recognition? [00:33:16] Speaker 03: Again, the fact that you could conceive of a system in the real world, in the brick and mortar world, that is not sufficient. [00:33:28] Speaker 03: That was true in DDR, where you could conceive of the store within a store, but that wasn't the problem that was being addressed, and it didn't work the same way. [00:33:39] Speaker 03: In DDR, as here, there's the interception, you bring it over to the other system, and I just want to quote, [00:33:45] Speaker 03: this is at column 12, lines 28 to 33, which reflects that in a computer network system, in a corporate system, everybody has their user ID and password, and verifying that they are an authenticated user in that sort of first-level way is not that significant. [00:34:08] Speaker 03: What is significant is verifying, and this is what it says: [00:34:12] Speaker 03: authentication of the person seeking access is the most significant. [00:34:17] Speaker 03: And that is what these claims do.
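The callback step counsel describes, reaching a device already on file and checking that the answering user supplies predetermined data, could be modeled generically as the keyed-digest comparison sketched below in Python. The hash comparison is only a stand-in for checking a PIN or a digitized biometric sample, an assumption for illustration rather than a mechanism recited in the patents, and the sample values are made up.

import hashlib
import hmac

def verify_predetermined_data(stored_digest: bytes, supplied_sample: bytes, key: bytes) -> bool:
    # Digest what the user supplied over the authentication channel and compare
    # it, in constant time, against the digest enrolled in the subscriber database.
    supplied_digest = hmac.new(key, supplied_sample, hashlib.sha256).digest()
    return hmac.compare_digest(stored_digest, supplied_digest)

# Example: enroll a sample at setup time, then verify a later submission.
enrollment_key = b"per-subscriber-secret"
enrolled = hmac.new(enrollment_key, b"fingerprint-template-bytes", hashlib.sha256).digest()
assert verify_predetermined_data(enrolled, b"fingerprint-template-bytes", enrollment_key)

Because the sample travels only on the authentication channel in this model, a listener on the access channel would have nothing to replay, which is the separation counsel emphasizes.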
[00:34:19] Speaker 03: They improve the other systems by providing a completely physically separate authentication process using all of these elements, the interception device, the security computer, the subscriber database, the authentication channel that's completely physically separate and shares no facilities with the access channel, [00:34:42] Speaker 03: reaching out to a predetermined number where the user has to input back in, back through the authentication channel, predetermined data, including biometric data. [00:34:55] Speaker 03: That system provides a level of certainty; it may be true, Your Honor, that no system can give 100% certainty that the user is the user. [00:35:06] Speaker 03: This system improves it. [00:35:08] Speaker 03: That was true in Ancora as well. [00:35:10] Speaker 03: It wasn't absolutely impenetrable, but the same is true here. [00:35:16] Speaker 03: What you've done is improved dramatically the security and the ability to identify the user as the user. [00:35:22] Speaker 03: At the very least, that is what the specification says is true. [00:35:26] Speaker 03: Whether that is or is not factually true is something that should be determined as a matter of fact on remand. [00:35:32] Speaker 01: Thank you very much. [00:35:33] Speaker 01: OK. [00:35:33] Speaker 01: Thank you. [00:35:34] Speaker 01: Thank you both. [00:35:35] Speaker 01: Your case is taken under submission.