[00:00:17] Speaker 02: May it please the court, Your Honor. [00:00:19] Speaker 02: My name is Juyeon Han. [00:00:21] Speaker 02: I represent the appellants and plaintiffs, the estate of Carson Bride and other children. [00:00:29] Speaker 02: The district court erred when it held that Section 230 of the Communications Decency Act applied in this case [00:00:46] Speaker 02: I offer here arguments to say that the claims of misrepresentation and products liability did not treat YOLO as a publisher of third-party content, and thus the Communications Decency Act did not apply here. [00:01:14] Speaker 02: for itself as a promisor, not as a publisher, when they stated that they will reveal and ban users who harass and bully other users on the platform. [00:01:28] Speaker 03: The complaint frames that as a tort claim, however, not as a contract or promissory [00:02:06] Speaker 02: quote, no bullying. [00:02:08] Speaker 02: If you send harassing messages to our users, your identity will be revealed. [00:02:14] Speaker 02: This is a conspicuous pop-up notification that all users had to read before they could use the YOLO app. [00:02:22] Speaker 03: Are you asking us to re-characterize the claim in your complaint as a contract claim? [00:02:30] Speaker 03: Is that what you're [00:02:39] Speaker 02: However, I draw this analogy of the content of the statement as a promise because, solely because, there was a reliance upon the statement, which was so identifiably concrete. [00:02:52] Speaker 02: It did not say we may ban users, or we will try to ban users, or there is a possibility that we may ban and reveal users. [00:03:02] Speaker 02: It said we will reveal and ban the users. [00:03:10] Speaker 02: that every user had to see. [00:03:13] Speaker 02: YOLO has no tolerance for objectionable content or abusive users. [00:03:17] Speaker 02: You will be banned for any inappropriate behavior. 
[00:03:40] Speaker 02: all of the users had remembered, relied upon, which we have included in the pleadings. [00:03:46] Speaker 02: This is very integral to the factual situation that distinguishes YOLO from all the other social media apps that perhaps have come to this Court's purview. [00:03:57] Speaker 02: The second fact that is very important is that YOLO [00:04:09] Speaker 02: used anonymity, designed as a one-sided anonymous app, which was made for senders to send anonymous messages to target non-anonymous receivers. [00:04:23] Speaker 03: So the receiving is all publishing. [00:04:25] Speaker 03: So why isn't that immunized? [00:04:27] Speaker 03: I mean, Lemmon [00:04:36] Speaker 03: that it had nothing to do with content that was published, but your negligent design claim is all related to content that was published. [00:04:45] Speaker 02: It may not always be related to content. [00:04:48] Speaker 02: I understand that there may be some nexus to content-related material, but I do want to pose that there are so many [00:05:05] Speaker 02: However, what it meant is that [00:05:51] Speaker 02: That still goes to the content. [00:05:55] Speaker 02: It's not always related to content. [00:06:20] Speaker 03: in Fair Housing, there was a structure where a question and answer had to be input, and that was... The template in Fair Housing was actually a violation of the Fair Housing Act itself because it required disclosure of gender and the like. [00:06:37] Speaker 03: But like in Carafano v. Metrosplash, simply providing a template [00:06:55] Speaker 03: a violation of some law? [00:06:57] Speaker 02: It is not a violation of law, and as Your Honor had also recognized in the Fair Housing decisions and the en banc decision, the violation of the law itself in that case of Fair Housing was something that was separate from whether the CDA applies to the scenario. 
[00:07:19] Speaker 02: But the takeaway from that is that there are sometimes designs [00:07:34] Speaker 02: And here, there are even more dangerous features. [00:07:37] Speaker 03: There was Carafano v. Metrosplash, which allowed a user to give intimate information about who they were, their profile, et cetera. [00:07:48] Speaker 03: That, we said, was fine. [00:07:50] Speaker 03: That did not cause the publisher to lose immunity. [00:07:54] Speaker 03: It was only in Fair Housing, where they were actually engaged in violating the Fair Housing Act, that we said they were a [00:08:22] Speaker 02: profile setting in Carafano. [00:08:42] Speaker 02: This was marketed and sold to teenagers. [00:08:47] Speaker 02: So this is about a one-sided anonymous app where people cannot talk to each other, but only one person is receiving anonymous messages, and they are all 14-, 15-, 16-year-olds. [00:09:00] Speaker 03: And YOLO knew exactly what it was designed for. [00:09:03] Speaker 03: We need to know why that isn't publishing content, because either a message has a name on it, or the message doesn't have a name on it, and that all sounds like content to me. [00:09:14] Speaker 02: Actually, if we were to put it in the context of a benign message, let's say that I checked into a hotel yesterday, and I get a note slipped under my door that says, I'm going to come get you. [00:09:29] Speaker 02: And it could be any kind of content. [00:09:31] Speaker 02: It might be that my clients were going to come and get me for dinner. [00:09:35] Speaker 02: But if I didn't know where it was coming from, it was just on a blank piece of paper, then I may be confused as to who is going to come and get me. [00:09:44] Speaker 02: Maybe it's a threat. 
[00:09:45] Speaker 02: So regardless of what content it is, regardless of whether it was a publisher function or not, the system that is designed to allow for one-sided anonymity with no recourse [00:10:01] Speaker 02: The safety feature that did not work, that is a problem here. [00:10:45] Speaker 02: from cyberbullying, there would be so many other ways to understand the back end of the system in order to understand the harm, but also why the design was input in such a way and how that affected the profitability. [00:11:07] Speaker 02: that the duty that YOLO had was simply to follow what they had stated they would do. [00:11:12] Speaker 02: There's nothing about publisher, just like in Barnes. [00:11:16] Speaker 02: Barnes was all about removal of a post of obscene material that a woman's ex-boyfriend had posted. [00:11:25] Speaker 02: And the duty that followed was to do what they had promised her they would do. [00:11:30] Speaker 03: Oh, so in Barnes, we said the negligent undertaking claim failed. [00:11:38] Speaker 03: It was the promissory estoppel claim that we said succeeded, and you have just told me that you don't intend to have your claim re-characterized as contract or promissory estoppel. So under Barnes, it would fail as a negligent undertaking. [00:12:38] Speaker 02: that the CDA does not apply. [00:12:41] Speaker 02: How do you get around Dyroff? [00:12:43] Speaker 02: Dyroff is another case that simplistically talks about anonymity. [00:12:47] Speaker 02: And Dyroff was a completely different website. [00:12:51] Speaker 02: That was where everybody had the benefit of anonymity, but they also had a profile attached to every single person. [00:12:59] Speaker 02: So it wasn't a one-sided complete anonymity like the one here. [00:13:09] Speaker 02: and website. [00:13:12] Speaker 02: And it reinforced communication between multilateral parties. [00:13:17] Speaker 04: They had a personality index or whatever you called it. 
[00:13:23] Speaker 04: The posters were still anonymous, were they not? [00:13:27] Speaker 02: The posters could have an anonymous way of characterizing [00:13:39] Speaker 02: from which you can reply and they can see somebody had replied [00:14:03] Speaker 02: You cannot even reply without publicly humiliating yourself and saying, I got this note from someone. [00:14:10] Speaker 04: Everyone, please take a look and... But a buyer in Dyroff wouldn't know who was offering the drugs at what place. [00:14:18] Speaker 04: Isn't that true? [00:14:19] Speaker 02: That is true, but that is basically true of all profile-making websites in this digital era. [00:14:26] Speaker 02: The distinguishing factor here is that [00:14:31] Speaker 02: in this case, without publicly showing what had been received. [00:14:37] Speaker 04: How would someone using the Dyroff platform reply to a drug peddler saying, I'll buy it at five instead of ten? [00:14:46] Speaker 02: Oh, because there was a way to communicate both ways in Dyroff, in the site that Dyroff was about. [00:14:55] Speaker 04: Without knowing who he was communicating with? [00:14:57] Speaker 02: That is true. [00:15:00] Speaker 02: For example, it would be that if, on our mobiles, you and I were on the Experience Project app, in Dyroff's case, we wouldn't know that we're talking to one another. [00:15:13] Speaker 02: But I can still direct messages to you, and you can direct messages to me. [00:15:18] Speaker 02: But in the case of YOLO, I would be exposed. [00:15:22] Speaker 02: You would know that you're [00:15:38] Speaker 04: to the room that he's willing to pay five rather than ten? [00:15:42] Speaker 02: No, there is a way that they can communicate. [00:15:44] Speaker 04: Directly? [00:15:45] Speaker 04: All right. [00:15:46] Speaker 04: And? [00:15:47] Speaker 04: That's the distinction. [00:15:48] Speaker 04: There's direct communication with the sender, the anonymous sender, in Dyroff, and there's no direct communication here. 
[00:15:55] Speaker 04: How does that affect anything? [00:15:57] Speaker 02: It affects the contextualization of these facts. [00:16:00] Speaker 02: There are harms that come [00:16:04] Speaker 02: The abuser wants to be the abuser, so you're telling him, don't abuse me? [00:16:29] Speaker 02: Here, this was a hotbed for cyberbullying. [00:16:32] Speaker 02: There was no way to ask any guardians, school officials. [00:16:38] Speaker 02: No law enforcement could ever track who had been sending what. [00:16:42] Speaker 02: You didn't know if there was one person sending it to you, if it was a particular person sending it to you, or if the entire room felt the same opinion of you. [00:16:51] Speaker 02: This was how messy this app was. [00:16:54] Speaker 02: But it was targeted, again, to teens. [00:16:58] Speaker 03: Do you want to save some time for rebuttal? [00:17:25] Speaker 05: Good morning, Your Honors, and may it please the Court. [00:17:28] Speaker 05: Nick Pucci on behalf of Yolo Technologies, Incorporated. [00:17:35] Speaker 05: During the hearing in the Dyroff matter, there was significant time spent explaining how the product worked. [00:17:43] Speaker ?: How did the underlying product in Dyroff connect the users to talking to each other? [00:17:49] Speaker ?: I think we need to spend a few minutes just to understand [00:18:15] Speaker 05: At least most of you are familiar with that platform from prior cases. [00:18:19] Speaker 05: The YOLO app sits on top of Snap. [00:18:38] Speaker 05: look at the photo of the turban I'm wearing today, what do you think of the color? [00:18:41] Speaker 05: You could post that, and I could post that, and I would get anonymous feedback from people. [00:18:47] Speaker 05: Here's the critical point for background. [00:18:50] Speaker 05: The people who see those posts are people that I have agreed to connect with on Snap. [00:18:55] Speaker 05: The way Snap works is a double opt-in. 
[00:18:58] Speaker 05: If I wanted to connect with Judge Bea on Snap, he would have to agree to it and I would have to agree to it. [00:19:05] Speaker 05: And then I could broadcast my photos to you [00:19:13] Speaker 05: Absolutely, Your Honor. [00:19:15] Speaker 05: You can simply delete the app, or you can stop making posts that are asking for anonymous feedback from people you've agreed to connect with on Snap. [00:19:23] Speaker 05: There's control there, and I think that's important. [00:20:12] Speaker 05: certain times, for example, and this wouldn't be in the record, but at certain times the app would use different types of technology filters to try to scan for bullying content and try to automatically filter those contents. [00:20:25] Speaker 05: There's also instances... Would they find out who's posting it? [00:20:30] Speaker 05: Potentially. [00:20:31] Speaker 05: This is designed to be a guideline, Your Honor. [00:20:33] Speaker 05: It says, you know... This is a guideline that's not followed. [00:20:37] Speaker 05: Well, it's not spelled out clearly. [00:20:39] Speaker 05: That doesn't say when we're going to reveal the identities. [00:20:41] Speaker 05: That doesn't say when we're going to ban them. [00:20:43] Speaker 05: It says that there is no bullying on this platform. [00:20:46] Speaker 05: And if you do it, your identity will be revealed and you will be banned. [00:20:51] Speaker 05: Right. [00:20:51] Speaker 05: It does say that. [00:20:52] Speaker 05: That doesn't say what. [00:20:53] Speaker 05: It doesn't say how. [00:20:54] Speaker 05: And also, the problem with that is, [00:21:02] Speaker 05: or greater abusive communications, that as an act is protected directly under Section 230. [00:21:07] Speaker 04: What you're saying is you will be banned and you will be revealed, parenthesis, but maybe not by us. [00:21:15] Speaker 04: It could be. 
[00:21:16] Speaker 04: It could be by law enforcement who, of course, we've interacted with. [00:21:20] Speaker 04: So this is a wish, not a promise, is what you're saying. [00:21:24] Speaker 05: I think it is a guideline that we've provided. [00:21:26] Speaker 04: The reason why, if we take a step back and say, why did you look at it? [00:21:37] Speaker 04: by the company of what will happen, right? [00:21:43] Speaker 04: But not necessarily by us who make the representation, is that what you're saying? [00:21:49] Speaker 04: It could mean that because our instances... It does mean that because you're not saying, you're not saying we will investigate, reveal, [00:22:18] Speaker 04: and it may be by us or it may be by somebody else, and it may happen or it may not happen. [00:22:24] Speaker 04: That would be a more full and complete representation, right? [00:22:30] Speaker ?: Something along those lines probably would have been better if the app was going to make that. [00:22:33] Speaker 05: No, no, it would have been better to interview. [00:22:35] Speaker 05: Right. [00:22:35] Speaker 05: But again, I mean, keep in mind, this is a guideline where the fact is the app is sticking their finger out and saying, no bullying, right? [00:22:42] Speaker 05: That's the idea. [00:22:43] Speaker 03: Why isn't it like the promise in Barnes? [00:23:15] Speaker ?: I don't know for sure. [00:23:18] Speaker 05: But they had two versions of the complaint. [00:23:20] Speaker 05: They had over a year, I believe, to put up the second version of the complaint. [00:24:01] Speaker 05: This court is well aware of, certainly from its opinions, how we feel about that. [00:24:06] Speaker 05: In Roommates.com, this court said that websites are complicated and there's always going to be close cases where a clever lawyer can try to find an argument around it. [00:24:16] Speaker ?: You can be artful and try to circumvent Section 230. 
[00:24:28] Speaker ?: dig in even deeper. [00:24:29] Speaker 05: What exactly were we talking about with the language a minute ago? [00:24:32] Speaker 05: What are they saying that we misrepresented? [00:24:35] Speaker 05: They're saying that you promised or you told us that you would ban people or reveal their identities and there should be no bullying. [00:24:42] Speaker ?: What they're basically saying is, hey, you did not regulate content. [00:24:45] Speaker ?: You didn't moderate content. [00:24:47] Speaker ?: How would we determine if somebody was bullying somebody? [00:24:50] Speaker ?: How would we determine if there's abusive language? [00:24:53] Speaker ?: We would have to look at the content. [00:25:03] Speaker 05: is even though they try to get around 230, if you look at the actual substance, they back right into it. [00:25:10] Speaker 05: I don't know if that's accurate, Your Honor. [00:25:16] Speaker 05: I mean, I certainly understand the court's holding in Barnes regarding the promissory estoppel, but in this situation, they're asking for content moderation. [00:25:51] Speaker 05: He shouldn't have said no bullying. [00:25:53] Speaker 05: He shouldn't have made the threat or the or [00:26:19] Speaker 05: is if you look at the language, it says no bullying. [00:26:21] Speaker 05: What promises no bullying? [00:26:24] Speaker 05: That is clearly an instruction to say, hey, come correct, act appropriately when you're using this app. [00:26:30] Speaker 05: By the way, the same app that people use to solicit the anonymous feedback they're receiving. [00:26:50] Speaker 05: When I... What do you think of this photo? [00:26:52] Speaker 05: You know, they post a photo themselves on the door and say, put a note underneath and tell me what you think about it. [00:26:57] Speaker 05: That's a little different than receiving a creepy note from somebody who you have no connection to whatsoever and whose feedback you haven't solicited. 
[00:27:21] Speaker 05: This court has found directly against that in Dyroff, specifically. [00:27:27] Speaker 05: So that alone would suggest, you know, it basically guts their products liability. [00:27:35] Speaker 03: Dyroff didn't really focus on anonymity, did it? [00:27:38] Speaker 03: Yes. [00:27:38] Speaker 03: Actually, the website was anonymous, but it didn't really focus on that as being an issue. [00:27:45] Speaker 05: I can give you the citation, Your Honor. [00:28:00] Speaker 05: It's on 1095 to 96 of that decision, where it says an anonymity feature alone is a content-neutral feature with no inherent danger or predictability of harm. [00:28:18] Speaker ?: I think, so, I mean, the idea that because something's anonymous makes it inherently dangerous is not true, especially when you have a content-neutral messaging platform like YOLO. [00:28:48] Speaker ?: when they communicate with each other, that's always going to happen. [00:28:50] Speaker 05: And so when they say, oh, because your messaging platform had an anonymous feature versus one that didn't have an anonymous [00:29:15] Speaker 05: And just to be clear, Your Honor, it's not people who they know, it's people who have double opted in to connect with each other. [00:29:21] Speaker 05: So it could be strangers, in all honesty. [00:29:23] Speaker 03: Where is that cite? I'm sorry, I don't see it. [00:29:27] Speaker 05: Sure. [00:29:28] Speaker 05: You have the case citation, 934 F.3d... I'm looking at 1093. [00:29:35] Speaker 05: 1093; it's 1095 and 1096. [00:29:37] Speaker 05: I can try to pull up the specific language if you like. [00:29:40] Speaker 03: This is the background section, and I don't see any holding at all. [00:29:52] Speaker 05: page 1100. [00:29:54] Speaker 05: I'm sorry, the pin cite should be 1100. [00:29:56] Speaker 05: If I may, when we talk about some of the other cases out there, let's look at the underlying technology. 
[00:30:29] Speaker 05: We don't do any of that. [00:30:31] Speaker 05: All we do is provide an anonymous messaging app that allows the user to post content and say, hey, what do you guys think of this? [00:30:39] Speaker 05: They can do a poll. [00:30:40] Speaker 05: They can say, do you like it? [00:30:42] Speaker 05: Do you not like it? [00:30:42] Speaker 05: Right? [00:30:43] Speaker 05: And so when you solicit anonymous feedback or conduct a poll, inherently, you're actually asking. [00:30:49] Speaker 05: It's very likely you could get positive comments or negative comments back. [00:30:53] Speaker 05: That's how polls work. [00:30:55] Speaker 05: And so this is, if you look at, [00:31:10] Speaker 05: So that's just with regards to those cases. [00:31:14] Speaker ?: The question was also asked, if we allowed this case to proceed and discovery was allowed to be completed, [00:31:21] Speaker 05: what would you look for? [00:31:43] Speaker 03: I didn't find your quote in the case, by the way. [00:31:46] Speaker 03: I'll look again. [00:31:48] Speaker 03: And the duty to warn is independent from moderating content. [00:31:56] Speaker 03: In other words, they could have just told parents, by the way, this is a feature that allows anonymous communications. [00:32:04] Speaker 03: And studies have shown that this is harmful to children. [00:32:07] Speaker 03: Why does that claim fail? [00:32:31] Speaker 03: So I don't see why you can't have a failure to warn claim in this context, so could you address that? [00:32:45] Speaker 05: Sure. [00:32:46] Speaker 05: In order to have a failure to warn claim, I think I'd go back to the idea that just because this allows some anonymous messaging, [00:33:05] Speaker 03: could lose on the merits. [00:33:06] Speaker 03: The question is whether it gets past the CDA immunity. [00:33:12] Speaker 05: Right, but then I guess the next point would be, what are we warning of? 
[00:33:17] Speaker 05: We're warning of communications, the content of communications, and then they have a problem with Section 230. [00:33:23] Speaker 05: Because again, the way the YOLO app works is as a neutral platform that [00:33:34] Speaker 05: for people they've connected with on Snapchat. [00:33:37] Speaker ?: Where is the duty to warn in that? [00:33:39] Speaker 05: Why would a failure to warn claim survive given the protections afforded under Section 230? [00:33:47] Speaker 05: It would require us to go above and beyond and engage in activities that we would not be required to do, or activities that we could do that are protected in good faith. [00:34:10] Speaker 05: claims. [00:34:54] Speaker 05: They are saying, you said you were going to do something. [00:34:57] Speaker 05: Well, what was that thing that we were going to do? [00:34:58] Speaker 05: It was to regulate the content. [00:35:00] Speaker 05: It was to analyze whether someone's being a bully, to decide whether to take them and/or their communications off the platform. [00:35:06] Speaker 05: That is perforce immune under Section 230. [00:35:09] Speaker 05: So again, you have to boil it down, because it's trying to circumvent 230, but that's what it comes out to. [00:35:14] Speaker 05: Section 230 also directly bars appellants' claims for content moderation. [00:35:31] Speaker 03: Do you want to address the negligent design claim? [00:35:36] Speaker 05: Sure. Well, the negligent design claim again rests upon the idea that having some type of anonymous messaging feature somehow makes it inherently dangerous. [00:35:48] Speaker 05: Now, their description of the platform has evolved over time, but even if we break out the full description, I don't see how you can have a negligent design claim when you have users who are soliciting anonymous feedback and can [00:36:12] Speaker 05: That's, I think, where it becomes problematic. 
[00:36:13] Speaker 05: I think when you look at the design, whatever design they're suggesting is actually existing on the platform is not accurate. [00:36:19] Speaker 04: The user didn't request that the feedback be anonymous, did he? [00:36:24] Speaker 05: Yes, they did, because the whole idea, Your Honor, is you post something. [00:36:29] Speaker 04: I see it. [00:36:30] Speaker 04: YOLO represented the feedback would be anonymous. [00:36:32] Speaker 05: Correct. [00:36:33] Speaker 05: That's what the person took on. [00:36:34] Speaker 05: You say, how do you like my shirt today, guys? [00:36:36] Speaker 05: And then all of a sudden, and you do it via YOLO, you allow a YOLO sticker to be on there. [00:36:41] Speaker 05: People who you've connected with, [00:36:47] Speaker 03: I don't think that's the issue, right? [00:36:49] Speaker 03: The question is whether a negligent design claim is barred because of CDA immunity. [00:36:55] Speaker 03: That's the question. [00:36:56] Speaker 05: Correct. [00:36:57] Speaker 05: And of course it is. [00:36:58] Speaker 05: It falls directly within the protection of the CDA. [00:37:04] Speaker 03: We have Lemmon v. Snap, which allowed a negligent design claim to go forward. [00:37:23] Speaker 05: a carrot and a stick; there was a carrot, and they said, speed and post about it. [00:37:28] Speaker 05: Here, there's nothing like that. [00:37:29] Speaker 05: We don't control any of the content whatsoever. [00:37:32] Speaker 05: The only person who controls the content is the one posting, saying, what do you think about this, or give me your feedback on it. [00:37:38] Speaker 05: Section 230 is clear. [00:37:40] Speaker 05: The YOLO app is obviously an interactive computer service, and it should not be treated as a publisher of their [00:37:51] Speaker 05: that allow users to solicit anonymous feedback on their own. [00:37:58] Speaker ?: All right. 
[00:37:58] Speaker 03: I think we have your argument. [00:37:59] Speaker 03: Thank you. [00:38:00] Speaker 05: Thank you. [00:38:11] Speaker 02: Your Honor, may I move to strike everything that has been argued that is off the record, that there had [00:38:24] Speaker 02: are from the record, and those things properly belong to the realm of discovery, which we were barred from. [00:38:31] Speaker 02: Please disregard those arguments, and I move to strike those parts of the oral argument. [00:38:40] Speaker 02: Circumvention of the Communications Decency Act is what's going on here. [00:38:47] Speaker 02: The circumvention [00:39:17] Speaker 02: the state. [00:39:51] Speaker 02: and the harms here. [00:39:54] Speaker 02: Carson Bride, in the remaining last minutes of his life, was trying to activate the revelation of the identities of his abusers per [00:40:27] Speaker 02: for the misrepresentations and the negligent design of the safety feature that never worked. [00:41:00] Speaker 02: because they probably were lured, they were burned and harmed, and then they would withdraw. [00:41:05] Speaker 02: So that initial use and misleading and being harmed by the misrepresentation is still ongoing, regardless of whether there was an opportunity after that. [00:41:28] Speaker 02: content moderation and