[00:00:00] Speaker 01: Thank you, Your Honor. [00:00:01] Speaker 01: May it please the Court, Joel Kurtzberg from Cahill Gordon & Reindel on behalf of X Corp. [00:00:06] Speaker 01: I'd like to reserve five minutes for rebuttal. [00:00:08] Speaker 02: We'll try to help you. [00:00:10] Speaker 01: Thank you. [00:00:11] Speaker 01: At first blush, AB 587, the law at issue in this case, may seem like an innocuous transparency measure. [00:00:18] Speaker 01: And that, of course, is precisely what the government would like you to believe. [00:00:21] Speaker 01: But mandated transparency measures like AB 587 are much more problematic from a First Amendment standpoint than first meets the eye. [00:00:30] Speaker 01: That is perhaps best made clear by the extraordinary diversity of the amici that have filed briefs supporting our position that AB 587 is deeply threatening to well-established First Amendment interests. [00:00:45] Speaker 01: We have people from all over the political spectrum, the Reporters Committee for Freedom of the Press, the US Chamber of Commerce, First Amendment scholars, conservative entities like the Washington Legal Foundation, all [00:00:58] Speaker 01: reflecting a deep concern about a statute that on its face is aimed at pressuring social media companies to change their content moderation policies so as to carry less or even no expression that's viewed by the state as injurious to its people. [00:01:19] Speaker 02: If we were to determine that the content moderation sections of AB 587 were unconstitutional, as you suggest, what do we do with sections such as the high-level statistics at 22677 or the penalty provision at 22678? [00:01:38] Speaker 02: In other words, those that don't bear on content moderation and are strictly statistical in nature. [00:01:44] Speaker 02: What do we do with those? [00:01:46] Speaker 01: I think that the statistical, I'm sorry, I'm not sure which provisions you're referring to, the statistical requirements? [00:01:52] Speaker 02: I was referring specifically to 22677 and 22678. [00:01:56] Speaker 02: There are others, but I just point those out as two examples. [00:02:00] Speaker 01: So the statistics in 22677 are geared towards specific categories of content, and that is what we would suggest is what makes them problematic. [00:02:12] Speaker 02: So just the reporting of the numbers, in your mind, makes it content moderation? [00:02:17] Speaker 01: It's not that it is just reporting of numbers. [00:02:21] Speaker 01: Those provisions require social media companies to define these specific categories of content or decline to do so. [00:02:29] Speaker 02: That portion of it I totally get. [00:02:32] Speaker 02: And putting it without reference to specific code sections, it seems to me you've got different portions of this bill. [00:02:39] Speaker 02: You've got some that are content moderation, which are the ones that I think we're primarily fighting about, or you're fighting about. [00:02:45] Speaker 02: And then you've got those that are more typically commercial speech type. [00:02:52] Speaker 02: I'm not saying it is, but that type of request. [00:02:57] Speaker 01: The challenge that we have is as to the terms of service report, which requires either the definition or the declination to define those categories. [00:03:06] Speaker 02: And I get that.
[00:03:07] Speaker 01: And also requires statistics, which is what I thought you were getting at, about speech that was so-called actioned that falls within those categories of content. [00:03:20] Speaker 01: We suggest, and it's not just because they're statistics, but the structure of the law [00:03:25] Speaker 01: is designed to apply pressure in a way that violates the First Amendment. [00:03:30] Speaker 02: That was actually stated by the legislature and the governor and others as well, right? [00:03:34] Speaker 01: That is absolutely correct. [00:03:36] Speaker 01: And the law does that in two significant ways. [00:03:39] Speaker 01: It does it indirectly by forcing companies to make these disclosures about what the legislative history makes clear are the most controversial categories of so-called awful but lawful speech, [00:03:53] Speaker 01: as part of an effort that the legislative history and the attorney general openly admit was to get the public to pressure companies to limit these disfavored but constitutional categories. [00:04:06] Speaker 03: Mr. Kurtzberg, I think the O'Brien case, famously, and you can give me a review of our court's own precedent, told us not to look at, what was it, the legislative history, for purposes of evaluating the facial constitutionality of a provision. [00:04:22] Speaker 01: I think here one can look at the legislative history, notwithstanding O'Brien, because the very structure of the law itself makes very, very clear that it is singling out these categories of disfavored speech, and the law itself implies, it's very, very clear, that these are categories that [00:04:43] Speaker 01: people don't want to look at, which is what the attorney general admits. [00:04:46] Speaker 03: Let me ask you a question. [00:04:48] Speaker 03: I'm going to ask it to the state as well about how this statute may work. [00:04:55] Speaker 03: The key provision here, subsection (3) of 22677, requires a statement of whether the current version of the terms of service defines each of the following categories of content, [00:05:06] Speaker 03: and if so, those categories. [00:05:09] Speaker 03: Does your client believe it would comply with the statute to turn in, as it already has, its terms of service? [00:05:18] Speaker 03: Say, there's our definitions. [00:05:20] Speaker 03: You decide for yourself. [00:05:21] Speaker 03: We have decided not to define these things. [00:05:23] Speaker 03: The statute only requires us to say whether we do it. [00:05:25] Speaker 03: If we do define those things, we are in full compliance, no problem. [00:05:29] Speaker 01: We think that that is problematic, Your Honor, and it is problematic for the reason that the state is compelling disclosures about [00:05:36] Speaker 01: specific categories of speech that they have used to frame this debate, and they have selected those categories in a way to try to generate controversy that will apply. [00:05:48] Speaker 03: Well, whatever reason they might have tried to do it, your terms of service are already posted by you, correct? [00:05:54] Speaker 03: I mean, they don't work if they're not. [00:05:56] Speaker 01: We are not challenging on this appeal the requirement that we post our terms of service. [00:06:02] Speaker 04: You haven't abandoned that claim, though. [00:06:04] Speaker 04: You just simply haven't appealed. [00:06:05] Speaker 01: That is correct. [00:06:06] Speaker 01: We have not abandoned it. [00:06:08] Speaker 04: We're on a PI denial. [00:06:09] Speaker 01: Correct.
[00:06:10] Speaker 03: So your terms of service either define those categories or not. [00:06:17] Speaker 01: I think some of them do, some of them don't. [00:06:19] Speaker 03: And those are your definitions that you've already disclosed to your users. [00:06:23] Speaker 01: That is correct. [00:06:24] Speaker 03: And if they don't define those categories, where is the mandate that you're claiming from the law, the compulsion, to define those categories in ways that are any different from the way your terms of service define them? [00:06:38] Speaker 01: They don't mandate us to define the categories in a specific way, but they mandate us to take a position [00:06:45] Speaker 01: on what the legislative history makes clear are the most controversial categories to moderate and define. [00:06:51] Speaker 03: So I guess it can't be both. [00:06:55] Speaker 03: It can't both be that the law requires you to just explain whether your terms of service have these categories, if they do or not, and that it also requires you to define them. [00:07:07] Speaker 03: Which is it? [00:07:08] Speaker 03: Why isn't this statute susceptible of a reading [00:07:12] Speaker 03: that you simply, if you have already defined those categories, whatever California thinks, if you have decided to define those categories, great, here they are. [00:07:22] Speaker 03: If you haven't, no problem. [00:07:25] Speaker 01: Your Honor is not wrong that we are entitled to respond to the statute by saying we don't define hate speech or racism. [00:07:35] Speaker 01: But the report also asks about policies that are supposedly, quote, intended to address those categories, which is a judgment call that is different than just saying whether you do. [00:07:47] Speaker 03: And this is very helpful. [00:07:50] Speaker 03: So even if you have not yourself defined those categories in the terms of service, you read the law as requiring you to opine on or discuss those categories, even if they're not part of your own terms. [00:08:04] Speaker 01: If the categories are not part of our terms, then 22677 [00:08:09] Speaker 01: (a)(4)(A) requires us to disclose any existing policies that are, quote, intended to address those categories. [00:08:18] Speaker 04: So you, part of your position is you are required to tell California, essentially, your views on hate speech, extremism, harassment, foreign political interference, how you define them or don't define them, and what you choose to do about them. [00:08:35] Speaker 01: That is correct, [00:08:36] Speaker 01: and that those categories were chosen, as the legislative history makes clear, because they are the most, quote-unquote, fraught with political bias and the most difficult to define. [00:08:49] Speaker 01: And the intent, which is also very clear and which the Attorney General has conceded, in part, is to have [00:08:57] Speaker 01: that controversy get the public to pressure the social media companies to do more to limit or eliminate those categories of content. [00:09:12] Speaker 02: There's an assumption that, based on what you say, you will change the content accordingly. [00:09:17] Speaker 02: So for example, if somebody is talking about from the river to the sea or about Hunter Biden's laptop, they want to know what you're going to do about it under your policy, right? [00:09:29] Speaker 02: Which requires you to make a value judgment. [00:09:32] Speaker 01: We do think it requires us to make a value judgment.
[00:09:35] Speaker 01: It requires us to either define the categories that are flagged as the most controversial, [00:09:40] Speaker 01: or to decline to define them, which also could be controversial, to not define them. [00:09:46] Speaker 03: Mr. Kurtzberg, to rewind to the beginning, at the top of your argument, you talked about transparency. [00:09:51] Speaker 03: And you probably saw this coming. [00:09:52] Speaker 03: We're thinking about Moody. [00:09:57] Speaker 03: Why aren't some of the transparency provisions, if we take out all the stuff we've been talking about the last 10 minutes, if we take out those categories which you assert are problematic, [00:10:09] Speaker 03: how is the remainder of the statute distinguishable from the Texas and Florida laws that the Supreme Court left untouched in Moody? [00:10:22] Speaker 03: And why would we welcome a circuit split on something where it seems like Florida, Texas, and California are all agreed on and that the Supreme Court has left alone? [00:10:33] Speaker 01: Well, first of all, your hypo excludes the main distinction between these laws and the laws in the NetChoice or Moody cases, right? [00:10:41] Speaker 01: So this law, unlike those, specifies categories of content that are designed, that are picked, to be controversial. [00:10:50] Speaker 03: Stipulated, severable. [00:10:51] Speaker 01: Okay. [00:10:51] Speaker 01: So there's very little left if, in fact, you are going to remove those requirements from the terms of service report requirement in 22677. [00:11:01] Speaker 01: We would essentially be submitting the current version of our terms of service, [00:11:08] Speaker 01: and any changes to the terms of service. [00:11:10] Speaker 01: But all of the statistics and the descriptions are the remainder of the provision that we are challenging. [00:11:17] Speaker 03: Well, the statistics, if the statistics are untied to the categories, I forget whether it's Florida or Texas, [00:11:28] Speaker 03: but those laws, which again are on the books and have been left alone by the Fifth and Eleventh Circuits as well as the Supreme Court, [00:11:36] Speaker 03: how is this different once we take those problematic categories out? [00:11:39] Speaker 01: I don't think you can take them out, because the statistics... As a matter of First Amendment law or California severability law? [00:11:45] Speaker 01: As a matter of severability. I also do think, as a matter of First Amendment law, that the Fifth and the Eleventh Circuits did get it wrong, but I think that this statute is distinguishable for the reasons that Your Honor has excepted out in your hypothetical. But to address that hypothetical, [00:12:03] Speaker 01: the statistics and the detailed description are the guts of the terms of service report that we are challenging. [00:12:10] Speaker 01: You cannot somehow reread this statute under a severability analysis as being about providing statistics for something other than what the statute says the statistics are to be provided for. [00:12:24] Speaker 04: Did the Attorney General argue below the statute was severable? [00:12:29] Speaker 01: Yes, Your Honor, the Attorney General did argue that. [00:12:32] Speaker 03: So I guess that's the severability question. [00:12:35] Speaker 03: You've also said you have a First Amendment point and are asking us to disagree. [00:12:41] Speaker 03: Do we want to invite the Supreme Court to revisit these issues so soon after Moody?
[00:12:45] Speaker 01: We think you don't have to, Your Honor. [00:12:47] Speaker 01: And your hypothetical actually addressed the main distinction here between this case and the Moody cases. [00:12:53] Speaker 01: So we do think that this statute is more problematic as to these [00:12:58] Speaker 01: compelled disclosures about content moderation than those in Texas and in Florida. [00:13:03] Speaker 03: Okay, so those laws, as I understand, I think the Texas law, upheld by the Fifth Circuit, not touched by the Supreme Court, requires statistics of actions, content moderation actions. [00:13:17] Speaker 03: How is this different, again, once we take out the categories? [00:13:20] Speaker 01: Oh, if you take out the categories, that is the main distinction, and the statistics here... [00:13:24] Speaker 03: That's clarifying. [00:13:24] Speaker 01: But just to be clear, it's not just as a matter of severability law. [00:13:29] Speaker 01: The statistics that are asked for here and the descriptions are completely linked to the categories. [00:13:35] Speaker 01: It does not say in the statute that you should just provide statistics about anything that was flagged about any topic for any reason. [00:13:45] Speaker 01: It targets specific categories of content [00:13:49] Speaker 01: that were selected for the very reason that they are the most controversial ones with the most difficult decisions. [00:13:56] Speaker 03: Well, it does both, does it not? [00:13:57] Speaker 03: I mean, actioned, content moderation, those are defined terms that refer to other statistics. [00:14:04] Speaker 03: In other words, your client would have other things they could disclose without pulling out those mandated categories. [00:14:11] Speaker 01: If you look at the statistics sections of 22677, they are all linked to [00:14:19] Speaker 01: a detailed description of the content moderation practices that are intended to address the content described in paragraph three, which are those categories. [00:14:30] Speaker 01: There are some disclosures about how automated content moderation is used, or whether human review is involved. [00:14:42] Speaker 01: Those disclosures are so unmoored from what this whole thing is about that there's practically nothing left. [00:14:49] Speaker 01: All of the disclosures are essentially about providing statistics about what was actioned or flagged as to those categories of content. [00:14:59] Speaker 01: So when you except the categories of content in the hypothetical, in my mind there's very little left in actual practice when you look at what this terms of service report section of the statute, which is what our appeal is about, requires. [00:15:15] Speaker 02: Let me ask you this. [00:15:17] Speaker 02: We've talked about Moody a little bit, but my colleague, I think, suggested that Moody left the laws from Texas and Florida intact. [00:15:29] Speaker 02: My understanding is all they said is they didn't apply facial analysis properly, and sent it back without deciding whether either law survived. [00:15:39] Speaker 02: What's your analysis of Moody? [00:15:41] Speaker 01: So I think Your Honor is correct. [00:15:44] Speaker 01: They didn't make an ultimate decision. [00:15:46] Speaker 01: And I should qualify my answer. [00:15:50] Speaker 01: The section of the laws that dealt with disclosures about content moderation decisions, cert was not granted as to that point, except as to a very limited point, a provision that required detailed explanations of every content moderation decision.
[00:16:11] Speaker 01: On that point, the court simply was very, very clear. [00:16:16] Speaker 01: The court applied Zauderer, and the concurring opinions made very clear that it did so because there was a forfeiture of the argument. [00:16:23] Speaker 01: No one had argued anything otherwise. [00:16:25] Speaker 01: So I think it is a completely open question as to that. [00:16:29] Speaker 01: But the Fifth and Eleventh Circuit opinions... I see, Your Honor. [00:16:32] Speaker 03: No, well, yes. [00:16:33] Speaker 03: I think we're in vehement agreement on that. [00:16:35] Speaker 03: What I was referring to were the disclosure requirements; that was the question that was not granted in the petition. [00:16:42] Speaker 03: When I say they left them alone, I mean the question was not granted. [00:16:45] Speaker 03: But just a quick tie-up question here. [00:16:50] Speaker 03: But the majority says Zauderer applies, unqualified. [00:16:54] Speaker 03: Why should we read so much into the concurrence's statement that it's forfeited, rather than more justices telling us that Zauderer applies even to these content moderation questions? [00:17:08] Speaker 01: I think it is clear, you could look at the briefings yourself, Your Honor, there was no briefing on that question. [00:17:15] Speaker 01: It was not contested. [00:17:16] Speaker 01: And I think when an issue comes before a court without argument, and that's why the concurring opinions noted it, it is a judicially noticeable fact that it wasn't fully briefed and contested, and I don't think a holding [00:17:30] Speaker 01: that Zauderer applies in that context really counts as a full holding of the court, at the PI stage especially. [00:17:37] Speaker 03: Sure. [00:17:37] Speaker 03: So just to close the loop here, I believe the court viewed NIFLA's consideration of controversial speech, which was part of your argument, as an application of Zauderer. [00:17:50] Speaker 03: So for the court to say that Zauderer applies to content moderation would not exclude an application of the NIFLA concern that you raise, is that right? [00:18:00] Speaker 03: Is NIFLA, in other words, do you view NIFLA as an application of Zauderer? [00:18:05] Speaker 01: I think it's, that's a metaphysical question almost. [00:18:08] Speaker 03: That's why I ask it, yes, I need help. [00:18:09] Speaker 01: I understand where you're coming from, Your Honor, I do. [00:18:11] Speaker 01: I think that it is clear that Zauderer does not apply if the speech is [00:18:18] Speaker 01: not purely factual and uncontroversial. [00:18:20] Speaker 01: Now, whether one considers that a precursor to the test as to whether it applies, or part of the test, is metaphysical, but the practical effect is the same. [00:18:29] Speaker 01: And here, it is unequivocally clear that it is controversial. [00:18:34] Speaker 01: In fact, the law is intended to require [00:18:38] Speaker 01: disclosures about the most controversial content topics, the decisions that raise the most controversy, and it is also clear that it is designed to pressure the companies to change their policies, which under another decision from the Supreme Court recently, the Vullo decision, is not permitted. [00:18:56] Speaker 01: The court there said you can't coerce people, and the other part of NetChoice, of course, said that social media companies have a [00:19:04] Speaker 01: First Amendment right to exercise editorial discretion as to what their feeds look like, and the government can't interfere with that.
[00:19:10] Speaker 01: But Vullo also said that they can't do indirectly what they can't do directly. [00:19:16] Speaker 01: And if NetChoice makes clear they can't do it directly, Vullo makes clear they can't do it indirectly. [00:19:22] Speaker 01: And here they do both. [00:19:23] Speaker 01: They try to do it directly and indirectly. [00:19:27] Speaker 02: Your contention, I gather, is that Zauderer does not apply here because it is not just factual, it's controversial; it doesn't meet the qualifications, as Justice Thomas said it. It's a commercial speech issue, and this isn't commercial speech from your perspective. You think that this is First Amendment covered and that strict liability, strict scrutiny, applies, right? [00:19:51] Speaker 01: Strict scrutiny, Your Honor, yes. That is correct. [00:19:52] Speaker 01: I have not even gotten to the fact that there's direct pressure applied in this context, and that part of our challenge here is as applied. [00:20:01] Speaker 01: We provided a letter that went from the attorney general that contained a threat. [00:20:08] Speaker 01: It's a November 2, I never remember, November 3 letter that was sent to the CEOs of all the big social media companies, including my client. [00:20:17] Speaker 01: And the letter, the sole purpose of the letter, was to urge the social media companies to change the way they moderate two of the categories of AB 587, disinformation and misinformation. [00:20:30] Speaker 01: And in that letter, the AG reminds the social media CEOs that AB 587 is now on the books and he will not hesitate to enforce that law [00:20:45] Speaker 01: once it comes into effect, and the evidence in the record is that that letter was interpreted as a threat. [00:20:54] Speaker 01: It is a letter that says you're not doing enough about getting certain content off of your platforms, and, [00:21:02] Speaker 01: by the way, I have this power under this statute to enforce it, and keep that in mind when you make your decision. [00:21:09] Speaker 02: Your time is up. [00:21:10] Speaker 02: Let me ask my colleagues. [00:21:11] Speaker 02: Does either have additional questions? [00:21:13] Speaker 02: All right, very well. [00:21:14] Speaker 02: Let's hear from the state. [00:21:15] Speaker 01: Thank you. [00:21:26] Speaker 00: Good morning. [00:21:28] Speaker 00: Gabrielle Boutan on behalf of Attorney General Rob Bonta. [00:21:34] Speaker 00: The terms of service report requirements at issue here require platforms to disclose high-level information about their existing voluntary user terms of service and content moderation practices. [00:21:47] Speaker 00: This requirement merely compels commercial [00:21:50] Speaker 00: product disclosures and is subject to Zauderer scrutiny. [00:21:54] Speaker 04: I'm sorry, Counsel, and I'm sorry to interrupt you so early in your argument. [00:21:58] Speaker 00: That's all right. [00:21:58] Speaker 04: But when I look at subsection three here, (a)(3), where there's a requirement of a disclosure of whether you define, and then later [00:22:14] Speaker 04: sort of why, why not, but of some of the most controversial aspects of speech, hate speech, racism, extremism, et cetera, et cetera, et cetera. [00:22:31] Speaker 04: It's just hard for me to see that this is commercial speech, just random commercial speech that's part of a commercial transaction. [00:22:41] Speaker 04: So how can requiring the companies to do all this on categories A through E get to Zauderer and commercial speech?
[00:22:58] Speaker 00: Your Honor, under CTIA, a disclosure is uncontroversial even if it can be tied in some way to a controversial issue, and even if it can be used by others to support arguments in a heated political controversy. [00:23:12] Speaker 02: Forgive me. [00:23:13] Speaker 02: I want to be sure I understand your position. [00:23:15] Speaker 02: I thought that the state had in effect stipulated that [00:23:24] Speaker 02: the intent here was to compel people to moderate what they're doing with respect to hate speech, et cetera, and the other things my colleague referred to. [00:23:33] Speaker 02: Is that wrong? [00:23:35] Speaker 00: That's not correct, Your Honor. Our position is- [00:23:37] Speaker 02: What about the letter from the AG that was referred to a minute ago? [00:23:40] Speaker 00: Your Honor, that letter has been grossly distorted in this action. [00:23:44] Speaker 02: What does the letter say? [00:23:45] Speaker 00: The letter is eight pages long. [00:23:47] Speaker 00: The statute is barely mentioned once, in a footnote. [00:23:50] Speaker 00: He never suggested or said he would use the statute to coerce any particular content moderation policies. [00:23:57] Speaker 00: All he said is he noted the statute was upcoming over a year later, [00:24:01] Speaker 00: and that the platforms would then have to comply with the statute's transparency obligations. [00:24:06] Speaker 00: In other words, they would have to disclose to the public what they're actually doing. [00:24:10] Speaker 00: There was no threat saying, if you do not use these categories for content moderation, then we'll investigate you under this statute. [00:24:18] Speaker 03: Ms. [00:24:18] Speaker 03: Boutan, I guess unlike CTIA, there is an identity here of the law and the controversy. [00:24:26] Speaker 03: I think that's the claim here. [00:24:29] Speaker 03: What does, [00:24:31] Speaker 03: what do the categories add to the disclosure? [00:24:34] Speaker 03: I spoke with your friend a little bit about some of what Florida and Texas were doing with respect to disclosures. [00:24:41] Speaker 03: You already have their terms of service. [00:24:45] Speaker 03: The terms of service either provide these categories or they don't. [00:24:52] Speaker 03: Why is that not enough for California? [00:24:56] Speaker 00: Your Honor, the categories are [00:25:00] Speaker 00: categories of content that are often, just as a matter of fact, moderated by social media platforms. [00:25:09] Speaker 00: By including them in a terms of service report, you're putting all of these reports in one place on the attorney general's website. [00:25:15] Speaker 00: It is provided to the public as required by the statute, and therefore in one place the public can compare what, if anything, [00:25:23] Speaker 00: each of these companies may be doing with respect to certain types of categories of information. [00:25:28] Speaker 03: Then I guess to come back to where I started with your friend, do you view the law, setting aside the disclosures, [00:25:35] Speaker 03: does the law require a company to define those categories and provide those categories in response to it, or could it simply just provide its terms of service agreement? [00:25:47] Speaker 00: Your Honor, all it would have to do in the report is to say that we do not define this category, and therefore they would not have to provide any definition of that category.
[00:25:55] Speaker 03: Then how does it provide the disclosures that are required by the next subsection? [00:26:00] Speaker 00: The statistics? [00:26:01] Speaker 03: The statistics. [00:26:02] Speaker 00: Oh, the statistics. [00:26:03] Speaker 00: Well, subdivision 5A says that they must provide information on content that was flagged by the social media company as content belonging to the categories described in paragraph three. [00:26:17] Speaker 00: So that's factual. [00:26:19] Speaker 00: They can only report statistics that they have, based on actions that they actually took, based on categories that actually exist. [00:26:27] Speaker 03: Categories that actually exist in the state's mind or in the company's mind? [00:26:30] Speaker 00: Oh, in the company's mind. [00:26:31] Speaker 03: Because this refers to... And then only as provided in their terms of service. [00:26:34] Speaker 03: So if their terms of service don't speak to the categories the way the California legislature has spoken to the categories, X and other companies are free not to disclose. [00:26:45] Speaker 00: Well, as content belonging to any of the categories described, I mean, I think there may be an assumption that if they're in the terms of service as categories that they moderate content by, then 5A would be applicable, [00:27:02] Speaker 00: I think, certainly as a facial challenge. [00:27:04] Speaker 00: I think if we back up for a minute. [00:27:07] Speaker 04: But before you do that, I mean, when you look at 4 and 4A, there has to be a detailed description of content moderation practices used, including, but not limited to, all of the following: any existing policies intended to address the categories in paragraph 3. [00:27:28] Speaker 04: So they have to give you everything that they've got on what they believe to be hate speech or racism, extremism, harassment, foreign political interference, because those aren't defined terms, right? [00:27:42] Speaker 00: All of that means is, is there, as a matter of fact, [00:27:47] Speaker 00: an actual company policy that exists to address those categories. [00:27:52] Speaker 04: However they define them, because they're not defined in the statute. [00:27:56] Speaker 00: If the company chooses to define those categories as an official matter, as an official company policy, in essence, it's only if [00:28:06] Speaker 00: it's essentially an undisclosed terms of service, right? [00:28:10] Speaker 04: But if they, for example, have nothing where they address what they consider to be hate speech, they have to tell you that, right? [00:28:20] Speaker 04: They have to essentially say, we have nothing that addresses anything that we consider to be hate speech, right? [00:28:32] Speaker 00: If they do not have an official company policy about hate speech, yes. [00:28:36] Speaker 00: I would also add, however, that it is not compelled speech to compel that. [00:28:41] Speaker 00: Yes, but it's a purely factual matter whether or not they have an official policy. [00:28:45] Speaker 03: So were the health care related disclosures in NIFLA, but those were unconstitutional. [00:28:50] Speaker 03: How would you distinguish that? [00:28:52] Speaker 00: The health disclosures. [00:28:53] Speaker 00: Well, I think, so it depends what prong of Zauderer we're referring to. [00:28:57] Speaker 03: Well, those were, I think California in those arguments said, purely factual, and the Supreme Court said maybe, but also controversial, and unconstitutional. [00:29:08] Speaker 00: Sure.
[00:29:08] Speaker 00: And I think the difference here is that in NIFLA, the court was focused on whether or not the [00:29:18] Speaker 00: entity or person was required to speak a message that's not about its own products or services and that's contrary to its own viewpoint. [00:29:30] Speaker 00: And that description of what the test is for uncontroversial is very much [00:29:37] Speaker 00: reiterated in CTIA. [00:29:39] Speaker 00: Again, it's not a matter of, is this a controversial topic? [00:29:43] Speaker 00: It's, are we forcing the speaker to make a disclosure about something other than their own product or services, [00:29:50] Speaker 00: the commercial product or services? [00:29:51] Speaker 03: Well, if their product or service is, for whatever reason, not one that purports to moderate foreign political interference content or hate speech or racism, [00:30:04] Speaker 03: aren't you forcing them to then disclose, to speak, to say, we are a company that permits hate speech, or, we are a company that does not permit hate speech? [00:30:16] Speaker 03: Isn't that disclosure required, and how is that different? [00:30:19] Speaker 00: It's not different, because we have to be thinking of, because I think we've gotten away from the fact that these platforms- Yeah, if it's not different from NIFLA, I think you lose. [00:30:28] Speaker 00: Well, it is different from NIFLA in the uncontroversial respect, because again, controversial has to do with whether you're forced to disclose a message about someone else's services, [00:30:38] Speaker 00: that's not about your own product, which this is. [00:30:41] Speaker 00: This is about their own product. [00:30:43] Speaker 04: How is forcing them to disclose either, here's how we define hate speech and what we do about it, or, we don't really care about hate speech, we have no policies on it because we let anybody say [00:30:59] Speaker 04: anything, whether we consider it to be hate speech or not, not compelled speech that's subject to strict scrutiny, when they basically have to tell you what their position is on, for example, hate speech and whether they want to do anything about it or not? [00:31:15] Speaker 00: Your Honor, it's not a matter of, company, [00:31:17] Speaker 00: what is your ideology on hate speech? [00:31:20] Speaker 00: This is, [00:31:21] Speaker 00: the terms of service are a contract between the company and the user, and X Corp very clearly says that in its own terms of service. [00:31:28] Speaker 00: It's a contract for use of a product. [00:31:30] Speaker 04: Would it be the same if the statute said you need to put in the terms of service how all of your officers voted in the last election, that that wouldn't count as compelled speech because you've put it in a bucket called terms of service? [00:31:45] Speaker 00: No, there's absolutely a difference here. [00:31:48] Speaker 00: One, the distinction is that you apply Zauderer [00:31:51] Speaker 00: when it has to do with a disclosure specifically about a product or service. [00:31:56] Speaker 00: How board members voted, the subjective opinions of officers within the company, that's very clearly outside of that definition, but consumers have the right to understand, what are the rules if I'm using this service, and what am I going to have in front of me? [00:32:07] Speaker 03: Well, they get that from the terms of service. [00:32:09] Speaker 03: Why do they have to add these categories?
[00:32:10] Speaker 03: If the categories aren't in the terms of service, again, how is it not a compulsion that the state is saying, [00:32:17] Speaker 03: your terms of service may be what they are, but we also want you to tell us about these controversial topics, even if they're not in your terms of service? [00:32:25] Speaker 00: Well, I think we're getting away from the doctrinal analysis here for Zauderer. [00:32:31] Speaker 00: And I think what you're getting to is, is there a rational, I want to get the test right, is it reasonably related to a substantial state interest? [00:32:40] Speaker 00: That's all that's required under Zauderer. [00:32:42] Speaker 02: You have to show [00:32:45] Speaker 02: that it's purely factual and uncontroversial, and as you've heard, we're all concerned about the fact you've got this list of things that have to be disclosed. [00:32:54] Speaker 02: Companies could have terms of service that don't discuss any of them. [00:32:59] Speaker 02: The state is saying you must tell us what, basically, what you think about each of these things, and if you don't, you've got to tell us that too. [00:33:08] Speaker 02: Why is that not compelled speech? [00:33:10] Speaker 00: Because again, it is not what you think about these things. [00:33:13] Speaker 00: It is, what does your product do? [00:33:15] Speaker 00: How is your use of the product affected? [00:33:18] Speaker 02: You're still compelling them to say that. [00:33:22] Speaker 00: Correct. [00:33:22] Speaker 00: But again, it's commercial product disclosure. [00:33:25] Speaker 02: Before, they didn't have to say anything. [00:33:27] Speaker 02: It's a private company. [00:33:29] Speaker 02: They have terms of service. [00:33:30] Speaker 02: They may or may not talk about things. [00:33:31] Speaker 02: Say they don't say anything. [00:33:33] Speaker 02: This statute says, you've got to tell us. [00:33:36] Speaker 02: Even if you don't have a policy, you can say, we don't have a policy. [00:33:39] Speaker 02: And that's compelled speech, is it not? [00:33:41] Speaker 00: That's right, and you apply Zauderer in the situation of compelled speech. [00:33:44] Speaker 02: You apply Zauderer if it's purely factual and non-controversial. [00:33:47] Speaker 02: And what we're concerned about is it's neither. [00:33:50] Speaker 00: Well, and again, Your Honor, I think one needs to go back to the rule for how uncontroversial is defined. [00:33:57] Speaker 00: It's not about whether or not the disclosure itself, which again is a statistic or a statement of a matter of fact about whether there's an official policy about what the product does or does not do, [00:34:07] Speaker 00: is related to a controversial issue about whether platforms should moderate this content. It's, are you being forced to state a message [00:34:19] Speaker 00: not about your product and that is contrary to your message? And I'd like to give a hypothetical to compare this. So let's say there's a company [00:34:28] Speaker 00: that produces food or some kind of supplies, and they take an ideological stand, they're very vocal about it, about, if it's food, it's non-GMOs, or only recycled materials, those kinds of things, right? [00:34:45] Speaker 00: Would you be able to have a statute, particularly in the food context, mandating that they disclose what the ingredients are, including GMOs? [00:34:57] Speaker 02: That's wholly different.
[00:34:59] Speaker 02: That, tobacco and so on, if they have to say what's in there, that's not controversial. [00:35:05] Speaker 02: It's what's in there. [00:35:05] Speaker 02: It's a fact. [00:35:07] Speaker 02: But here, you're compelling people to say things that they may or may not want to talk about at all. [00:35:13] Speaker 02: And there are a whole list of them. [00:35:14] Speaker 02: You've got representatives on both sides from the whole First Amendment bar of the United States that's concerned about this, which in and of itself says this is controversial. [00:35:25] Speaker 03: Can I maybe, can I try a slightly different example? [00:35:29] Speaker 03: The San Francisco Chronicle's op-ed page or letters to the editor policy, could California require purely factual disclosures of how they're planning to address ideological balance on their letters to the editor page? [00:35:45] Speaker 00: How they're planning to address. [00:35:47] Speaker 03: And how is that different from, that is the old-fashioned form of content moderation. [00:35:52] Speaker 03: It is clearly protected under Red Lion and other cases. [00:35:58] Speaker 00: I mean, I think you would have to go through the Zauderer analysis. [00:36:04] Speaker 03: And where does that end up? [00:36:06] Speaker 03: It may be a distinction without a difference. [00:36:08] Speaker 03: Sure, we might start with Zauderer. [00:36:09] Speaker 03: The question is, where do we end up? [00:36:11] Speaker 03: And why wouldn't we end up in the same place with that law as we would with this law, that under NIFLA, it's controversial, even if fact-based, and unconstitutional? [00:36:25] Speaker 00: Again, I think the focus, I guess I don't know how many more times to say I don't think the controversial analysis that you're describing is in line with the way that CTIA describes what that word means. [00:36:36] Speaker 03: Okay, maybe, could you talk to severability a bit? [00:36:39] Speaker 00: I would, and may I also speak to Moody just very briefly, because we have not had a chance to respond to what the court actually says in Moody. [00:36:46] Speaker 00: I would just like to say it's not a pure assumption, [00:36:51] Speaker 00: because the issue was briefed. [00:36:52] Speaker 00: It was briefed in the Texas case. [00:36:54] Speaker 00: The issue, whether Zauderer applies, was briefed by NetChoice in the Texas case in the Supreme Court briefing, and it was briefed by both sides below in the circuit court briefing. [00:37:04] Speaker 00: The majority itself does not say that they're making an assumption. [00:37:07] Speaker 00: They state it twice that Zauderer applies to content moderation. [00:37:10] Speaker 03: But then I believe it goes on to the Fifth Circuit's approach to the content moderation; it kind of calls that out and says that was incorrect, there are more constitutional problems here than we foresee. [00:37:21] Speaker 03: Why is that not analogous, at least with respect to the categories here? [00:37:24] Speaker 00: Because, well, I think I understand your question. [00:37:32] Speaker 00: The court found that there had been mistakes in the application, or there may have been mistakes in the application, of Zauderer scrutiny, but they didn't say that the Fifth Circuit was incorrect to apply Zauderer. [00:37:41] Speaker 00: So I'm referring to the fact that the Fifth Circuit certainly said that Zauderer applies. [00:37:46] Speaker 00: That is my point here.
[00:37:48] Speaker 03: That's the starting point, but we have to reach the end point, which is, where does Zauderer take us? [00:37:52] Speaker 00: I understand. [00:37:52] Speaker 00: No, and so Moody takes us, I don't mean to suggest that Moody takes us further than establishing the appropriate standard of review. [00:38:00] Speaker 02: But Zauderer can also tell us what is not commercial speech. [00:38:03] Speaker 02: That's the standard we're looking at. [00:38:05] Speaker 02: But if we're talking about something that's not controversial, strictly numerical, that's one thing. [00:38:13] Speaker 02: But again, even if you look at Zauderer, how does that really help the state in this case? [00:38:22] Speaker 00: If you look at Zauderer, how does that help the state in the sense of how the prongs apply? [00:38:26] Speaker 02: You're claiming it's just regular commercial speech? [00:38:29] Speaker 02: It's like a tobacco disclosure. [00:38:31] Speaker 02: What's in the product? [00:38:33] Speaker 00: Well, again, I would point to Moody, and I would point to both NetChoice cases, in which the same type of speech was at issue, facts related to content moderation. [00:38:43] Speaker 02: It's a little different than those. [00:38:44] Speaker 02: This statute's a little different than both the Texas and the Florida case, as I recall. [00:38:51] Speaker 02: Matter of fact, I may be thinking of the earlier case, but isn't this statute modeled after one in England? [00:38:58] Speaker 02: It might have been the other case. [00:39:02] Speaker 02: I'm not sure. [00:39:07] Speaker 03: Thanks for sticking around. [00:39:11] Speaker 03: Could you speak to severability? [00:39:12] Speaker 00: Yes, thank you. [00:39:14] Speaker 04: Before you speak to it, can you add to your answer to Judge Johnstone's question whether your brief to us discussed severability? [00:39:21] Speaker 00: I think we do mention that in our brief to you. [00:39:25] Speaker 04: Yes, your brief on appeal. [00:39:28] Speaker 00: I don't recall if we discussed that. [00:39:30] Speaker 00: We certainly raised that below, and the district court did not make any finding on that issue. [00:39:37] Speaker 00: So if the court- [00:39:39] Speaker 04: Please answer. That's fine as to my question, but please answer Judge Johnstone. [00:39:42] Speaker 00: Oh, certainly. [00:39:44] Speaker 00: I mean, we certainly think that the provisions within the terms of service report are severable. [00:39:49] Speaker 00: We think they're grammatically severable. [00:39:51] Speaker 00: We think that the attorney general would still want the information [00:39:54] Speaker 00: that may not be as much at issue here today. [00:39:58] Speaker 03: Okay, to get a little more, I guess, granular, maybe just like the last case, we've got maybe four buckets again. [00:40:05] Speaker 03: The reporting of the terms of service, the description of content moderation practices, the data reporting, and then the categories that we've spent most of the morning talking about. [00:40:17] Speaker 03: Which parts of those are salvageable, assuming the content categories are struck down? [00:40:25] Speaker 00: Well, Your Honor, they should all be salvageable, but. [00:40:28] Speaker 03: But how? [00:40:28] Speaker 03: So for example, with the data reporting, do you agree with your friend that the data reporting is entirely keyed to the categories so that if the categories don't exist, there's nothing to report?
[00:40:41] Speaker 00: So I believe you're referring to subdivision 5A, and it says information on content that was flagged by the social media company as content belonging to any of the categories described in paragraph three, including all of the following. [00:40:58] Speaker 00: Your Honor, as I sit here, I'm hesitant to make a snap judgment on that. [00:41:02] Speaker 00: I think that, again, because the district court didn't decide this issue, I think it would be better to remand. [00:41:07] Speaker 00: If the court were to decide that any part were subject to a preliminary injunction, I certainly would ask that the court remand as to that issue. [00:41:24] Speaker 00: And I would just bring up one other case for your consideration, Your Honor. [00:41:29] Speaker 00: And admittedly, this was not in our briefing, but I believe Judge Bennett was on the panel that issued this opinion, and that's Twitter v. Paxton. [00:41:38] Speaker 00: And I'm sorry, let me, it's 56 F.4th 1170, Ninth Circuit, 2022. [00:41:47] Speaker 00: And in that case, [00:41:51] Speaker 00: Twitter brought suit because there had been a civil investigative demand by the Texas AG relating to their content moderation policies. [00:42:01] Speaker 00: This court found that the injuries asserted were not ripe. [00:42:07] Speaker 00: And part of the reason for that is because the idea of subjective pressure on the social media platform was not sufficient to constitute a cognizable injury for the challenge. [00:42:20] Speaker 04: That case didn't have a required report semi-annually or in any other time frame, did it? [00:42:27] Speaker 00: I think that's a fair distinction, but I think it's also notable that [00:42:31] Speaker 00: many of their theories and allegations of injuries and part of their challenge also have to do with how the attorney general can and may enforce the disclosure requirements and conduct, hypothetically, an investigation. [00:42:47] Speaker 00: So to the extent that any of their injuries are premised on the burden of any investigation, we don't think that's ripe. [00:42:55] Speaker 00: I also would just add, as far as creating pressure, I also don't think [00:43:00] Speaker 00: that is a concrete injury that's been shown here. [00:43:05] Speaker 00: Again, that Twitter case states that a subjective statement generally of chilling and of feeling pressure is insufficient. [00:43:15] Speaker 00: And in Twitter, there actually had been meetings and discussions inside the company about whether and how to change their policies because of the civil investigative demand. [00:43:24] Speaker 00: There's not even evidence of that here. [00:43:25] Speaker 02: Before we go over time, do either of my colleagues have additional questions on that? [00:43:30] Speaker 02: We used up your time, so we're going to give you a couple of minutes of rebuttal, okay? [00:43:35] Speaker 01: Thank you, Your Honors. [00:43:36] Speaker 01: Just a few points. [00:43:38] Speaker 01: First, I do want to focus on the November 3, 2022 letter, [00:43:43] Speaker 01: which they said was exaggerated. [00:43:45] Speaker 01: It is in the record at ER 1067 through 74. [00:43:50] Speaker 03: Mr. Kurtzberg, as you answer this, should it make a difference that Vullo involved executive branch action and this involves legislative branch action? [00:44:01] Speaker 03: In terms of the challenge, right?
[00:44:02] Speaker 03: In Vullo, they were challenging a state official, whereas you're not challenging the letter, you're using the letter as a backdrop for your challenge to a statute. [00:44:11] Speaker 03: Should that matter? [00:44:12] Speaker 01: I don't think that it does matter, Your Honor. [00:44:14] Speaker 01: Here, there is the very person who is authorized to enforce a law making a threat to enforce the law in the context of a letter that, again, they say mentions AB 587 only in a footnote. [00:44:29] Speaker 01: That is not correct. [00:44:30] Speaker 01: But in the context of a letter, the sole focus of which [00:44:34] Speaker 01: is about how the social media platforms are not doing enough, and they have a duty and an obligation, and it's a moral duty and a legal duty, to do more to keep disinformation and misinformation off of their platforms. [00:44:49] Speaker 01: And in the midst of the letter that says you have to do something to change your policies because it's not good enough, there is a section, at ER 1070, that is entitled, [00:45:00] Speaker 01: the State of California has a mandate [00:45:02] Speaker 01: to protect its citizens' voting rights. [00:45:05] Speaker 01: In that section, that's where AB 587 is mentioned. [00:45:09] Speaker 01: There are eight California statutes that are mentioned. [00:45:13] Speaker 01: And after a sentence that is about AB 587, in the text it says, in 2024, social media platforms will also have additional transparency obligations as required by recent state legislation [00:45:27] Speaker 01: that requires disclosures on content moderation practices as it relates to extremism or radicalization, disinformation or misinformation, and foreign political interference, among other areas. [00:45:38] Speaker 01: The very next sentence of the letter, very next sentence, says the California Department of Justice will not hesitate to enforce these laws against any individual or group who violates them. [00:45:49] Speaker 01: The whole purpose of the letter is to get the companies to change their policies. [00:45:54] Speaker 01: If this is nothing other than a transparency statute, why did it occur to the Attorney General, why did it occur to him, to even reference a transparency statute in a letter [00:46:06] Speaker 01: the sole purpose of which is to get them to change their policies about two of the categories that are covered by the statute? [00:46:15] Speaker 01: We submit. [00:46:15] Speaker 02: We've given you a couple extra minutes. [00:46:16] Speaker 02: I know you could talk a lot longer, both of you. [00:46:19] Speaker 02: But let me ask my colleagues, do either of you have additional questions? [00:46:22] Speaker 02: Thank you for your arguments. [00:46:23] Speaker 02: They're very helpful. [00:46:24] Speaker 02: This is very useful to us. [00:46:28] Speaker 02: We will act upon it soon. [00:46:32] Speaker 02: Before the case just argued is submitted, I know we have a number of, I think it's students who are here today. [00:46:39] Speaker 02: The court's unfortunately not going to be able to meet with you, but our law clerks will do so. [00:46:44] Speaker 02: So if you'll please stand by for them, the court stands adjourned for the day. [00:46:49] Speaker 01: Thank you, Your Honor.