[00:00:38] Speaker 04: Okay, go ahead. [00:00:40] Speaker 04: Good morning. [00:00:43] Speaker 04: May it please the court. [00:00:44] Speaker 04: My name is Carrie Goldberg, and I have the honor and privilege of representing John Doe, age 15 at the time. [00:00:52] Speaker 04: This case boils down to duty. [00:00:55] Speaker 04: I'm asking the court to hold that the duty Grindr breached arises from its obligations to design a reasonably safe product and exercise reasonable care to not increase the risk of child rape. [00:01:08] Speaker 04: These duties are unrelated to Grindr's status as a publisher and do not oblige Grindr to monitor content. [00:01:16] Speaker 04: The case should be reversed and remanded because Section 230 does not immunize Grindr. [00:01:22] Speaker 04: In my limited time, I would like to talk about the Grindr product, the well-worn Barnes duty test, which applies when a defendant tries to make a Section 230 immunity excuse, and then the trafficking claim.

[00:01:39] Speaker 04: Grindr is the world's largest gay hookup app. [00:01:43] Speaker 04: It connects nearby strangers for offline sexual encounters based on sexual preferences, and it ranks its customers based on how near they are to one another, [00:01:54] Speaker 04: extracting locations from users' phones and then basically organizing them from nearest to farthest, indicating exactly how near they are, to the very foot. [00:02:07] Speaker 04: Grindr makes money from charging for its premium services and also from its in-app advertising. [00:02:13] Speaker 04: The more users, the more money Grindr makes.

[00:02:17] Speaker 04: A subset of Grindr's customers are children. [00:02:21] Speaker 04: They advertise on social media products frequented by children, like TikTok and Instagram. [00:02:27] Speaker 04: There are screenshots in the Excerpts of Record, page 171, paragraphs 35 to 41, [00:02:35] Speaker 04: where they're using adolescents in PE class and at recess.

[00:02:44] Speaker 04: Grindr has long known that children who use the app get sexually assaulted. [00:02:49] Speaker 04: As the complaint shows, there have been over 100 criminal cases related to child rape on Grindr. [00:02:57] Speaker 04: There's been a handful of civil cases dating back to at least 2014 [00:03:02] Speaker 04: against Grindr for child sexual assault. [00:03:06] Speaker 04: We talk about at least 14 different articles referencing the crisis of gay children being raped on Grindr. [00:03:14] Speaker 04: And there's even been a national investigation by the UK into Grindr's lax age restrictions. [00:03:21] Speaker 04: A peer-reviewed study, which we cite in our brief, says that over half of gay adolescents have their first sexual encounter with an adult they meet on Grindr.

[00:03:32] Speaker 04: Meanwhile, Grindr represents itself as safe. [00:03:35] Speaker 04: On page 148, we quote: "At Grindr, we've created a safe space where you can discover, navigate, and get zero feet away from the queer world around you."

[00:03:48] Speaker 04: Now, against this backdrop, plaintiff, a 15-year-old child who lived with autism and ADHD, got swept up into Grindr's marketing funnel. [00:04:00] Speaker 04: He thought he, a socially awkward kid who'd never dated and had very few friends, could meet gay friends through Grindr.
[00:04:10] Speaker 04: So he frictionlessly downloaded the app while he was at school, without needing to identify himself or age-verify, and was immediately matched with a man who lived next door to the school because they were so geolocation proximate. [00:04:28] Speaker 04: And that day the man raped him. [00:04:30] Speaker 04: The following day, Grindr matched him with a different adult nearby, and that man raped him. [00:04:36] Speaker 04: And then the next day it happened, and then the next day it happened, until his parents discovered. [00:04:42] Speaker 04: Three of the four men were criminally convicted. [00:04:46] Speaker 04: John Doe sued for strict product liability, defective design and manufacturing, failure to warn, negligence, negligent misrepresentation, and trafficking. [00:04:57] Speaker 04: The lower court dismissed because, but for the content, none of the harms would have injured John Doe. [00:05:05] Speaker 04: But we know that's the wrong test.

[00:05:09] Speaker 05: The design case that we have is the speed filter case, where, I guess, Snapchat created [00:05:21] Speaker 05: a product that was separate from the publishing function. [00:05:26] Speaker 05: How is the publishing, which is what goes on with Grindr, how does that analogize to the speed filter that Snapchat created?

[00:05:40] Speaker 04: So like in Lemmon v. Snap, the duties underlying Doe's claims [00:05:46] Speaker 04: differ markedly from the status of a publisher. [00:05:50] Speaker 04: Manufacturers, like in Lemmon, have a specific duty to refrain from designing a product that poses an unreasonable risk of injury. [00:05:59] Speaker 04: Meanwhile, entities acting solely as a publisher, those that review material, make decisions about what to publish and what to withdraw, have no similar duty. [00:06:10] Speaker 04: Like in Lemmon v. Snap, none of our claims have to do with the publication of content. [00:06:17] Speaker 04: I mean, all apps, all social media products involve publication. [00:06:22] Speaker 04: But not all duties that they breach necessarily are resulting from content created by a third-party content provider. [00:06:31] Speaker 04: So we're not trying to say that Grindr was a content provider in any of the claims. [00:06:38] Speaker 04: Certainly Grindr did provide the content when it comes to the geolocation, and the lower court got that fact wrong, because Grindr extracts location information, nobody, no users publish [00:06:53] Speaker 04: where they are, and Grindr says on the profiles how near and far users are.

[00:07:01] Speaker 05: Is that distinguishable from Dyroff, where the court, where we held that neutral facilitation is also part of a publisher's duty or a publisher's activity?

[00:07:17] Speaker 04: Right. [00:07:17] Speaker 04: Dyroff also looked almost solely at Prong 3, whether the plaintiff was trying to basically expand what an information content provider is, similar to Your Honor's concurrence in the Roommates case. [00:07:39] Speaker 04: And basically, Dyroff and Roommates, they were saying, well, there's illegal content here, and that somehow should transform a user's content into the responsibility of the defendant provider. [00:07:53] Speaker 04: And that's just not what we're dealing with here. [00:07:55] Speaker 04: But also, I'll add that unlike in Dyroff or in the Bride v. YOLO case, we're not dealing with that kind of content-neutral defendant, where, [00:08:08] Speaker 04: on the platforms themselves, people could write anything to anybody.
[00:08:16] Speaker 04: Which brings me really, I want to kind of talk about [00:08:20] Speaker 04: the basic interpretation of the Barnes test in Calise, because I think it sheds light on the difference here. [00:08:31] Speaker 04: Because Calise clarifies Barnes: it says that there's basically a two-step duty analysis when deciding whether or not a plaintiff is treating a platform as a publisher. [00:08:43] Speaker 04: And first we examine the right from which the duty springs. [00:08:47] Speaker 04: Does it spring from the platform's status as a publisher or from somewhere else? [00:08:51] Speaker 04: And then, does that duty require, and "require" is really important here, the platform to moderate content? [00:08:59] Speaker 04: If so, 230 attaches. [00:09:01] Speaker 04: But footnote three in Bride, it emphasizes the word "required." [00:09:07] Speaker 04: Immunity only applies if moderation of content is the only option, not just if it's one of many options. [00:09:15] Speaker 04: And in both Dyroff and also Bride v. YOLO, the court on this second prong said, well, monitoring content really is the only way that we could have stopped these harms from happening. [00:09:31] Speaker 04: Because in YOLO, we would have needed to look to see [00:09:35] Speaker 04: that these were harassing and bullying messages. [00:09:38] Speaker 04: But in Grindr, we have a really different scenario. [00:09:41] Speaker 04: We have a company that has sort of an amalgam of defects. [00:09:49] Speaker 04: It markets to children, and then it has no age verification. [00:09:53] Speaker 04: It geolocates them and ranks them based on the nearest to furthest adults. [00:10:01] Speaker 04: It matches them. [00:10:02] Speaker 04: There's no segregation between adults and children. [00:10:05] Speaker 04: And then it doesn't warn children, the very foreseeable customers based on who it's advertising to, that there's a real problem with child rape on the platform.

[00:10:17] Speaker 05: In Internet Brands, which was a failure to warn case, [00:10:27] Speaker 05: the court there said, we said very carefully, that the warning was of information it had obtained independently of the function of the website. [00:10:41] Speaker 05: So here, is there any independent knowledge, or is this all part of just the activity of the website?

[00:10:47] Speaker 04: Yes, there's extensive independent knowledge that we talk about with regard to Grindr knowing that child rape happens with a lot of frequency. [00:10:58] Speaker 04: We talk about how there have been over 100 criminal cases, a number of civil cases. [00:11:04] Speaker 04: The peer-reviewed journal study. [00:11:06] Speaker 04: Grindr's own executives have said that they are looking into whether or not they can do age verification. [00:11:17] Speaker 04: They had a great amount of information at their hands, but they made a concerted business decision to not protect kids and instead to increase the risk of harm to them by having such cruddy age verification, and also using this geolocation method, having no warnings.

[00:11:37] Speaker 00: So what things should they have done in designing their website?

[00:11:41] Speaker 04: So they should have, and this really goes to the issue of what are the things that they should have done that are not publishing functions. [00:11:50] Speaker 04: And there's a series. [00:11:51] Speaker 04: Under the defective design claim, they should not have designed a product that was not reasonably safe.
[00:11:56] Speaker 04: So they should have created a geolocation feature that doesn't allow, sorry, let me start over. [00:12:06] Speaker 04: First of all, they should have had age verification. [00:12:09] Speaker 04: They should have had...

[00:12:10] Speaker 00: And how would they do that?

[00:12:13] Speaker 04: Well, they could do more than just say, how old are you? [00:12:17] Speaker 04: You have to be 18, and then ask them how old they are. [00:12:20] Speaker 04: All kids are going to say that they're 18 if they want to use it.

[00:12:23] Speaker 00: So what should they do? [00:12:24] Speaker 00: Are there other websites that have created something that would verify ages?

[00:12:29] Speaker 04: Yes, there are third-party apps that can age-verify, that can check identity. [00:12:36] Speaker 04: But they can also work with the distributors of their product, the App Store and Google Play, to make sure that young people are not using the app. [00:12:46] Speaker 04: They should not have had a default...

[00:12:48] Speaker 00: I guess I don't understand what they would do at the App Store.

[00:12:51] Speaker 04: Well, the App Store knows who the customers are. [00:12:53] Speaker 04: So that's where people download the product and where Grindr markets itself.

[00:12:59] Speaker 00: Does the App Store know how old Doe was?

[00:13:01] Speaker 04: They may well have. [00:13:02] Speaker 04: I don't know that. [00:13:03] Speaker 04: It would come out in discovery. [00:13:05] Speaker 04: But the thing is that there's a standard of care. [00:13:07] Speaker 04: It's coming up in the series of social media cases against Google, TikTok, Meta, [00:13:15] Speaker 04: and Snap, where in the MDL and also the JCCP here in California the courts are saying that not having effective age verification is a product defect. [00:13:29] Speaker 04: It's up to these geniuses in the Silicon Valley offices to figure out how to do it. [00:13:36] Speaker 04: But we know that it's broken, because a tremendous number of kids are going onto Grindr and getting sexually assaulted.

[00:13:44] Speaker 05: Do you want to save some time for rebuttal?

[00:13:47] Speaker 04: Oh, I have two minutes left, and that includes my rebuttal. [00:13:50] Speaker 04: OK.

[00:13:50] Speaker 05: Do you want to save that time for rebuttal?

[00:13:53] Speaker 04: What's that?

[00:13:53] Speaker 05: Do you want to save that time for rebuttal?

[00:13:55] Speaker 04: Well, I do want to save time for rebuttal, and I'll also talk about the trafficking claim during that time. [00:14:00] Speaker 04: OK. [00:14:01] Speaker 04: Thank you.

[00:14:27] Speaker 03: Good morning. [00:14:27] Speaker 03: May it please the court. [00:14:29] Speaker 03: Grindr is the online destination...

[00:14:31] Speaker 05: Please identify yourself.

[00:14:33] Speaker 03: Oh, I'm so sorry. [00:14:35] Speaker 03: I'm Ambika Kumar, on behalf of Grindr. [00:14:39] Speaker 03: Grindr is the online destination for gay and bisexual men looking for community. [00:14:44] Speaker 03: Much like Facebook, Grindr has become a staple of social life for gay and bisexual men, with one out of four of them reporting that they met their spouse or long-term partner on Grindr. [00:14:55] Speaker 03: Grindr is also for adults only. [00:14:57] Speaker 03: Grindr makes serious efforts to stop minors from publishing profiles and exchanging messages with adults, but no screening method is foolproof. [00:15:06] Speaker 03: Doe found a way around Grindr's methods.
[00:15:09] Speaker 03: And while what happened to him is wrong, he cannot sue Grindr for failing to monitor for or block his profile [00:15:16] Speaker 03: or the messages he exchanged with adults. [00:15:19] Speaker 03: Doe's claims are deficient for three reasons. [00:15:22] Speaker 03: First, Section 230 of the Communications Decency Act provides immunity in cases like this, where the only way to avoid liability would be not to publish the profile or messages. [00:15:34] Speaker 03: This court's decisions in Dyroff and Bride are dispositive. [00:15:38] Speaker 03: Second, Doe has failed to allege essential elements of his claims. [00:15:42] Speaker 03: And finally, the First Amendment bars the claims. [00:15:46] Speaker 03: I'll move on to Section 230 specifically, but welcome questions from the court.

[00:15:51] Speaker 01: Section 230, if Grindr just published the geolocation information but did not allow its users to exchange messaging, would Section 230 apply?

[00:16:09] Speaker 03: I guess it would depend on how...

[00:16:13] Speaker 01: I'm not trying to answer that question, it depends on, are you asking if... Well, they're arguing that this isn't just about publishing content, a third party's content, that there's something defective in the product itself and that this geolocation information isn't content. [00:16:27] Speaker 01: It's just nothing that a third party publishes, it's just extracted from their phone. [00:16:32] Speaker 01: So if they're not messaging, but I want to know what this geolocation is, is that something that can be considered the publication of content?

[00:16:38] Speaker 03: Yes, and we know that from the Herrick case in the Second Circuit. [00:16:42] Speaker 03: But more generally, if we take a step back, [00:16:45] Speaker 03: Ms. Goldberg has accused Grindr of matching adults and children. [00:16:48] Speaker 03: I think it's important to understand how the app works in that way. [00:16:51] Speaker 03: To get onto the app, you have to create a profile.

[00:16:53] Speaker 01: They're not saying that Grindr is intentionally matching children and adults. [00:16:58] Speaker 01: They're saying it's a negligent product. [00:17:00] Speaker 01: There's a product defect in that it allows this to happen, because Grindr advertises in locations that would entice children, and then doesn't have adequate age filters, and then with respect to geolocation is not publishing content.

[00:17:17] Speaker 03: Well, everything you just mentioned are ways of pleading around Section 230's bar. [00:17:24] Speaker 03: So specifically with respect to geolocation information, yes, it is content. [00:17:28] Speaker 03: It is information about where the users are located. [00:17:30] Speaker 03: And to be sure, Grindr doesn't say, if you walk this way, you will run into somebody on the app. [00:17:36] Speaker 03: It says, this person's approximately [00:17:40] Speaker 03: 1,000 feet from you. [00:17:41] Speaker 03: Not only that, but you can turn off the location function on Grindr. [00:17:45] Speaker 03: It is not mandatory that you share your location. [00:17:48] Speaker 03: So users are willingly and voluntarily doing this. [00:17:52] Speaker 03: They're voluntarily allowing Grindr to publish their location information. [00:17:55] Speaker 03: It's no different than when I go on Google Maps and I search for nearby drugstores. [00:18:00] Speaker 03: I'm allowing Google to know where I am, to provide me the information that I'm seeking.
[00:18:04] Speaker 03: Again, Herrick involved the exact same allegations about geolocation information [00:18:09] Speaker 03: and concluded that the publication of that geographic location information was content. [00:18:15] Speaker 03: Another case not discussed in the briefing is a DC Circuit case on gun sellers, I believe. [00:18:23] Speaker 03: It's like Marshall's. [00:18:24] Speaker 03: I can't remember. [00:18:25] Speaker 03: I can get it for you. [00:18:25] Speaker 03: But the point is, this is not a new concept. [00:18:29] Speaker 03: It is not new to allege that a defendant published the location of where somebody is and thereby enabled them to meet. [00:18:37] Speaker 03: Mind you, they would have to exchange messages to meet, right? [00:18:40] Speaker 03: So you asked that question. [00:18:42] Speaker 03: There is no way that the men who interacted with Doe could have found him without messaging with him, asking him where he was, and arranging to meet. [00:18:54] Speaker 03: This is quintessential publishing content. [00:18:57] Speaker 03: It's no different than Dyroff. [00:19:00] Speaker 03: Judge Ikuta asked about that there. [00:19:02] Speaker 03: The court affirmed dismissal of a minor's parents' claim that the minor met a drug dealer by using the site. [00:19:08] Speaker 03: Now, how did that happen? [00:19:09] Speaker 03: That happened because they exchanged messages. [00:19:11] Speaker 03: So too here. [00:19:13] Speaker 03: Doe exchanged messages with these men and arranged to meet offline. [00:19:18] Speaker 03: And the duty that all of the claims seek to impose is a duty for Grindr not to publish the profile of Doe, including his location or approximate distance from other users, as well as the messages he exchanged with adults. [00:19:32] Speaker 03: And I would also point the court to Doe versus MySpace, where the Fifth Circuit affirmed dismissal of largely identical claims. [00:19:39] Speaker 03: A 13-year-old girl met a 19-year-old man [00:19:41] Speaker 03: on MySpace after lying about her age to create a profile. [00:19:45] Speaker 03: They exchanged personal information. [00:19:47] Speaker 03: They met. [00:19:48] Speaker 03: The man assaulted her. [00:19:49] Speaker 03: And the parents said, why didn't you age-verify? [00:19:52] Speaker 03: This is not about publishing content. [00:19:54] Speaker 03: This is about age verification. [00:19:56] Speaker 03: And the court held, no, you can't get around Section 230 by purporting to hold someone responsible for something like age verification. [00:20:04] Speaker 03: What you're really asking them to do is not publish the content at issue. [00:20:15] Speaker 03: I think to analyze the, I mean, I think to go back to what this court does when analyzing whether Section 230 applies, it doesn't do a but-for test. [00:20:24] Speaker 03: We agree with that. [00:20:25] Speaker 03: We're not advocating that. [00:20:26] Speaker 03: And the district court didn't do that. [00:20:27] Speaker 03: What we are advocating is looking at the duty to see whether it would require moderation of content. [00:20:34] Speaker 03: And that's exactly the case here. [00:20:35] Speaker 03: So if you look at the complaint, starting with the product liability claims, you can see what the duties are that they're alleging and where that comes from. [00:20:44] Speaker 03: The complaint alleges that Grindr could have prevented the harm through monitoring the content of accounts, including direct messages.
[00:20:52] Speaker 03: That's paragraph 107D. [00:20:53] Speaker 03: That's exactly the kind of conduct that Section 230 is designed to protect. [00:20:59] Speaker 03: The complaint faults Grindr for failing to verify the age of users. [00:21:02] Speaker 03: But again, all that means is you didn't stop Doe from coming onto the platform, publishing a profile, and exchanging messages. [00:21:09] Speaker 03: The complaint also suggests Grindr should have geofenced, presumably to...

[00:21:15] Speaker 01: Wait a minute. This issue with age verification, I understood your briefs to be making the point that it would require invasion of privacy, and it's much more difficult than perhaps the plaintiffs are suggesting. [00:21:25] Speaker 01: But if there were age verification, and Grindr does have a rule, you're supposed to be 18 to be on this site, and a person could not access the site because they weren't old enough, [00:21:37] Speaker 01: there's no taking down publication or publishing. [00:21:41] Speaker 01: There's nothing to publish, because the person isn't allowed access to the site. [00:21:45] Speaker 01: So it seems a step removed from whether the complaint is actually asking Grindr to moderate or censor publication.

[00:21:54] Speaker 03: Well, they're not asking for age verification in a vacuum, right? [00:21:57] Speaker 03: It's not as if Grindr could verify the age of everybody on their site, allow everybody on, and they would be fine. [00:22:03] Speaker 03: What they are alleging, and what the MDL court that [00:22:05] Speaker 03: Ms. Goldberg is talking about found, is where a website or where a provider is sued for failing to stop a minor [00:22:14] Speaker 03: from accessing content and publishing content, through age verification or whatever other means we talk about, that is protected by Section 230. [00:22:22] Speaker 03: So the only reason they're asking for age verification, [00:22:25] Speaker 03: or asking Grindr to do age verification, which, as you noted, is not as simple as they might suggest, is to stop them from publishing content and exchanging messages with adults. [00:22:37] Speaker 03: Now, to be sure, Grindr does take steps to keep minors off its platform. [00:22:42] Speaker 03: Those steps are not in the record, but they do exist. [00:22:45] Speaker 03: They use state-of-the-art technology. [00:22:48] Speaker 03: They do monitor messages. [00:22:49] Speaker 03: And I think what's important here is Section 230 is designed to incentivize exactly that type of activity, [00:22:55] Speaker 03: to incentivize sites like Grindr, web providers like Grindr, to take steps to block inappropriate content. [00:23:05] Speaker 03: And the fact that Grindr didn't do it perfectly here is the reason why the plaintiff is suing. [00:23:09] Speaker 03: Again, all of the duties in each of the claims would seek to hold Grindr responsible for publishing the profile and the messages exchanged between users. [00:23:24] Speaker 03: That's true not just of the product liability claim, but the failure to warn claim and the misrepresentation claims. [00:23:30] Speaker 03: This court held in Dyroff and Bride, you can't get around Section 230 by claiming you didn't warn of risks of using the platform. [00:23:41] Speaker 03: And misrepresentation, again, Doe can't escape Section 230 by purporting to rely on a statement by Grindr that its app was designed to provide a safe, authentic, and law-abiding community. [00:23:51] Speaker 03: This court addressed that as well in Beckman versus Match.
[00:23:55] Speaker 03: And finally, turning to the negligence claim, [00:23:58] Speaker 03: again, the complaint faults Grindr for, quote, matching Doe with other users. [00:24:01] Speaker 03: But again, the matching is simply making content supplied by users available. [00:24:05] Speaker 03: Content users can elect to share or not to share. [00:24:08] Speaker 03: The only way not to match would be to refrain from publishing that content.

[00:24:16] Speaker 00: Counsel, let's suppose, let's suppose that somebody, not Grindr but somebody else out there, was able to devise some mechanism for age verification that was sort of simple and effective. Would Grindr then have an obligation to [00:24:31] Speaker 00: incorporate that into its system?

Speaker 03: Grindr might choose to do that, but it would not have an obligation.

Speaker 00: It would not have an obligation, even if it was easily available?

Speaker 03: Yes.

Speaker 00: I think that would be a design... wouldn't that be a design defect?

[00:24:43] Speaker 03: No, because again, the duty to age-verify all comes down to stopping minors from accessing the platform, publishing profiles, and exchanging messages. [00:24:53] Speaker 03: The only way to get onto Grindr is by publishing a profile. [00:24:56] Speaker 03: And the only way to meet other people in person, live, is by exchanging messages. [00:25:02] Speaker 03: Any duty that seeks to stop minors from accessing the platform, well, perhaps [00:25:07] Speaker 03: something, again, that Grindr would absolutely adopt if it were easily available, simply boils down to holding Grindr responsible for allowing a minor to go onto the platform and publish a profile and exchange messages with adults. [00:25:20] Speaker 03: Again, that's what the Doe versus MySpace court held in the Fifth Circuit. [00:25:26] Speaker 03: That's what Dyroff holds as well. [00:25:28] Speaker 03: Even in Dyroff, in Bride, all of these plaintiffs, they try to...

[00:25:32] Speaker 00: So even if a company had an easy way of age-verifying and keeping minors off of its platform, it would not have an obligation to adopt something easily incorporated in order to prevent the minors from getting on there. [00:25:48] Speaker 00: And then all of the allegations come in, of course, out of the complaint. [00:25:51] Speaker 00: That is, that Grindr knows that there are minors who are coming onto its website and that it is benefiting from that. [00:26:00] Speaker 00: So for example, what happens if Grindr had a box that said, now we want to verify your age, you're supposed to be 18, and please check the following box: are you willing to tell us that you're 18?

[00:26:12] Speaker 03: I'm sorry?

[00:26:15] Speaker 00: What if the question said, check this box: are you willing to tell us that you're 18?

[00:26:22] Speaker 02: Yes.

[00:26:23] Speaker 00: I mean, that would all be OK?

[00:26:25] Speaker 02: Yeah, that would all be OK.

[00:26:28] Speaker 00: Even if you weren't asking [00:26:29] Speaker 00: the question, are you at least 18? [00:26:31] Speaker 00: The question was, are you willing to tell us that you're 18?

[00:26:34] Speaker 03: I see. [00:26:36] Speaker 03: I think no matter what, all of the claims, look, Grindr, I mean, I can tell you Grindr would not do that. [00:26:44] Speaker 03: Grindr does not do that. [00:26:45] Speaker 03: Grindr takes its responsibilities and its policies very seriously.
[00:26:49] Speaker 00: We're just trying to figure out what the limits are of a negligent design claim.

[00:26:55] Speaker 03: I think the important part is, what does the duty seek to require Grindr to do? [00:27:01] Speaker 03: How do you avoid liability? [00:27:03] Speaker 03: If the only way you avoid liability is by censoring content, that is protected by Section 230. [00:27:09] Speaker 03: Grindr chooses to be an adults-only platform. [00:27:11] Speaker 03: It doesn't have to be. [00:27:13] Speaker 03: Just like Facebook and other apps choose to be open to minors, Grindr doesn't do that.

[00:27:18] Speaker 00: So Grindr could open this up to anyone?

[00:27:22] Speaker 03: It could, sure. [00:27:23] Speaker 03: It's not, you know, [00:27:26] Speaker 03: Grindr is a community for gay and bisexual men. [00:27:30] Speaker 03: It is, it's not, I mean, the other side will characterize it as a hookup app, maybe even as a sex app, but that's not what it is. [00:27:38] Speaker 03: It is a community for some vulnerable members of communities across the country and, in this case, in Canada, [00:27:44] Speaker 03: who have no other access to that type of community. [00:27:48] Speaker 03: Grindr, again, chooses to be adults only. [00:27:50] Speaker 03: It doesn't have to do that, but it does. [00:27:52] Speaker 03: And it does what it can to stop minors from accessing the platform. [00:27:55] Speaker 03: But the law doesn't allow the court, doesn't allow holding Grindr responsible for failing to block profiles and messages exchanged with adults, which, again, slipped through Grindr's systems here. [00:28:11] Speaker 03: The law encourages Grindr to do what it can [00:28:14] Speaker 03: to enforce its content policies without being afraid that doing so will expose it to liability. [00:28:28] Speaker 03: I'm happy to keep talking about Section 230 or move on to FOSTA or any of the other claims, but very much want to address any remaining questions that the court has.

[00:28:37] Speaker 05: All right, go ahead.

[00:28:40] Speaker 03: OK. [00:28:41] Speaker 03: I think on the FOSTA claims, [00:28:43] Speaker 03: or the exception to Section 230 under FOSTA, [00:28:47] Speaker 03: I think the Reddit case forbids, forecloses liability here. [00:28:52] Speaker 03: There is no allegation that Grindr knew that Doe was a minor. [00:28:56] Speaker 03: In fact, Doe represented that he was an adult. [00:28:58] Speaker 03: Or that Grindr knew that Doe would be assaulted by men on its platform. [00:29:05] Speaker 03: The type of trafficking that is unlawful under 1591 is more analogous to the case we just heard. [00:29:11] Speaker 03: It's not simply creating a place where people can meet. [00:29:16] Speaker 03: I also, Grindr would also ask that if the court decides to reverse on Section 230, which of course it should not, that it affirm on other grounds that are well within the record and easily questions of law that the court can decide. [00:29:31] Speaker 03: Thanks very much.

[00:29:34] Speaker 05: I think we have some time for rebuttal.

[00:29:46] Speaker 04: I just want to remind Your Honors that this case was dismissed on a 12(b)(6). [00:29:51] Speaker 04: So it's plaintiff's facts that matter, and not all the new facts that I just heard from my friend on the other side. [00:29:59] Speaker 04: Listening to this argument from Grindr, if I didn't know better, I would almost believe that Grindr believes it has no duty whatsoever. [00:30:10] Speaker 04: But that can't be.
[00:30:11] Speaker 04: Grindr must have some duty. [00:30:13] Speaker 04: The fixes to Grindr that would not require third-party monitoring are age verification, dividing child and adult populations, not providing proximity information for underage users, geofencing around schools and playgrounds, identity verification, common-sense measures with distributors, warnings to underage people that rape [00:30:37] Speaker 04: is a real risk of using the product, and not advertising to children. [00:30:42] Speaker 04: I heard [00:30:47] Speaker 04: no reliance on Ninth Circuit precedent here. [00:30:50] Speaker 04: I heard reliance on Herrick, which uses a but-for test: but for the content, would any of these harms have occurred? [00:30:58] Speaker 04: Herrick is an unpublished Second Circuit opinion. [00:31:01] Speaker 04: And then I heard MySpace, which is a Fifth Circuit opinion dating back to 2008, and does not use the Barnes test. [00:31:08] Speaker 04: And it demonizes the child rape victim and her mother and kind of uses an anachronistic way of speaking about rape victims and their decision to hold third parties responsible.

[00:31:24] Speaker 04: I'd like to talk about the trafficking claim. [00:31:29] Speaker 04: No specific mens rea is required. [00:31:32] Speaker 04: I would direct Your Honors to the case A.M. versus Omegle, where Judge Mosman in the District of Oregon made an extensive analysis about how we apply the venture liability trafficking test. [00:31:50] Speaker 04: Here, unlike in Doe v. Reddit, [00:31:54] Speaker 04: where they only sued for beneficiary liability, [00:31:59] Speaker 04: in this case and in Omegle, the plaintiff sued for actual knowledge that trafficking was happening on the platform.

[00:32:08] Speaker 05: All right, your time has expired. [00:32:10] Speaker 05: Thank you. [00:32:11] Speaker 05: Okay. [00:32:12] Speaker 05: The case of Doe v. Grindr is submitted, and the court for this session stands adjourned.

[00:32:33] Speaker 05: This court for this session stands adjourned.