Arkansas Social Media Age Verification Law Likely Violates First Amendment

From Judge Timothy Brooks’ opinion yesterday in NetChoice, LLC v. Griffin (W.D. Ark.):

This case presents a constitutional challenge to Arkansas Act 689 of 2023, the “Social Media Safety Act” …, a new law that aims to protect minors from harms associated with the use of social media platforms. Act 689 … requires social media companies to verify the age of all account holders who reside in Arkansas. Self-reporting one’s age (a common industry practice) is not sufficient; Arkansans must submit age-verifying documentation before accessing a social media platform.

Under Act 689, a “social media company,” as defined in the Act, must outsource the age-verification process to a third-party vendor. A prospective user of social media must first prove their age by uploading a specified form of identification, such as a driver’s license, to the third-party vendor’s website. A verified adult may obtain a social media account. Minors, however, will be denied an account and prohibited from accessing social media platforms, unless a parent provides express consent—which will require more proof to confirm the parent’s age, identity, and relationship to the minor….

The court held that the law likely violated the First Amendment:

Deciding whether Act 689 is content-based or content-neutral turns on the reasons the State gives for adopting the Act. First, the State argues that the more time a minor spends on social media, the more likely it is that the minor will suffer negative mental-health outcomes, including depression and anxiety. Second, the State points out that adult sexual predators on social media seek out minors and victimize them in various ways. Therefore, to the State, a law limiting access to social media platforms based on the user’s age would be content-neutral and require only intermediate scrutiny.

On the other hand, the State points to certain speech-related content on social media that it maintains is harmful for children to view. Some of this content is not constitutionally protected speech, while other content, though potentially damaging or distressing, especially to younger minors, is likely protected nonetheless. Examples of this type of speech include depictions and discussions of violence or self-harming, information about dieting, so-called “bullying” speech, or speech targeting a speaker’s physical appearance, race or ethnicity, sexual orientation, or gender. If the State’s purpose is to restrict access to constitutionally protected speech based on the State’s belief that such speech is harmful to minors, then arguably Act 689 would be subject to strict scrutiny.

During the hearing, the State advocated for intermediate scrutiny and framed Act 689 as “a restriction on where minors can be,” emphasizing it was “not a speech restriction” but “a location restriction.” The State’s briefing analogized Act 689 to a restriction on minors entering a bar or a casino. But this analogy is weak. After all, minors have no constitutional right to consume alcohol, and the primary purpose of a bar is to serve alcohol. By contrast, the primary purpose of a social media platform is to engage in speech, and the State stipulated that social media platforms contain vast amounts of constitutionally protected speech for both adults and minors. Furthermore, Act 689 imposes much broader “location restrictions” than a bar does….

Having considered both sides’ positions on the level of constitutional scrutiny to be applied, the Court tends to agree with NetChoice that the restrictions in Act 689 are subject to strict scrutiny. However, the Court will not reach that conclusion definitively at this early stage in the proceedings and instead will apply intermediate scrutiny, as the State suggests. Under intermediate scrutiny, a law must be “narrowly tailored to serve a significant governmental interest[,]” which means it must advance that interest without “sweep[ing] too broadly” or chilling more constitutionally protected speech than is necessary, and it must not “raise serious doubts about whether the statute actually serves the state’s purported interest” by “leav[ing] [out]” and failing to regulate “significant influences bearing on the interest.”

Since Act 689 clearly serves an important governmental interest, the Court will address whether the Act burdens adults’ and/or minors’ access to protected speech and whether the Act is narrowly tailored to burden as little speech as possible while effectively serving the State’s interest in protecting minors online.

Burdens on Adults’ Access to Speech …

Requiring adult users to produce state-approved documentation to prove their age and/or submit to biometric age-verification testing imposes significant burdens on adult access to constitutionally protected speech and “discourage[s] users from accessing [the regulated] sites.” Age-verification schemes like those contemplated by Act 689 “are not only an additional hurdle,” but “they also require that website visitors forgo the anonymity otherwise available on the internet.” …

Burdens on Minors’ Access to Speech

The Supreme Court instructs:

[M]inors are entitled to a significant measure of First Amendment protection, and only in relatively narrow and well-defined circumstances may government bar public dissemination of protected materials to them. No doubt a State possesses legitimate power to protect children from harm, but that does not include a free-floating power to restrict the ideas to which children may be exposed. Speech that is neither obscene as to youths nor subject to some other legitimate proscription cannot be suppressed solely to protect the young from ideas or images that a legislative body thinks unsuitable for them.

Neither the State’s experts nor its secondary sources claim that the majority of content available on the social media platforms regulated by Act 689 is damaging, harmful, or obscene as to minors. And even though the State’s goal of internet safety for minors is admirable, “the governmental interest in protecting children does not justify an unnecessarily broad suppression of speech addressed to adults.”

Act 689 is Not Narrowly Tailored

The Court first considers the Supreme Court’s narrow-tailoring analysis in Brown v. Entertainment Merchants Association, which involved a California law prohibiting the sale or rental of violent video games to minors. The state “claim[ed] that the Act [was] justified in aid of parental authority: By requiring that the purchase of violent video games [could] be made only by adults, the Act ensure[d] that parents [could] decide what games [were] appropriate.” The Brown Court recognized that the state legislature’s goal of “addressing a serious social problem,” namely, minors’ exposure to violent images, was “legitimate,” but where First Amendment rights were involved, the Court cautioned that the state’s objectives “must be pursued by means that are neither seriously underinclusive nor seriously overinclusive.”

“As a means of protecting children from portrayals of violence, the legislation [was] seriously underinclusive, not only because it exclude[d] portrayals other than video games, but also because it permit[ted] a parental … veto.” If the material was indeed “dangerous [and] mind-altering,” the Court explained, it did not make sense to “leave [it] in the hands of children so long as one parent … says it’s OK.” Equally, “as a means of assisting concerned parents,” the Court held that the regulation was “seriously overinclusive because it abridge[d] the First Amendment rights of young people whose parents … think violent video games are a harmless pastime.” Put simply, the legislation was not narrowly tailored.

In the end, the Brown Court rejected the argument “that the state has the power to prevent children from hearing or saying anything without their parents’ prior consent,” for “[s]uch laws do not enforce parental authority over children’s speech and religion; they impose governmental authority, subject only to a parental veto.” “This is not the narrow tailoring to ‘assisting parents’ that restriction of First Amendment rights requires.” The Court also expressed “doubts that punishing third parties for conveying protected speech to children just in case their parents disapprove of that speech is a proper governmental means of aiding parental authority.” “Accepting that position would largely vitiate the rule that ‘only in relatively narrow and well-defined circumstances may government bar public dissemination of protected materials to [minors].'”

The State regulation here, like the one in Brown, is not narrowly tailored to address the harms that the State contends are encountered by minors on social media. The State maintains that Act 689’s exemptions are meant to precisely target the platforms that pose the greatest danger to minors online, but the data do not support that claim.

To begin with, the connection between these harms and “social media” is ill defined by the data. It bears mentioning that the State’s secondary sources refer to “social media” in a broad sense, though Act 689 regulates only some social media platforms and exempts many others. For example, YouTube is not regulated by Act 689, yet one of the State’s exhibits discussing the dangers minors face on “social media” specifically cites YouTube as being “the most popular online activity among children aged 3–17” and notes that “[a]mong all types of online platforms, YouTube was the most widely used by children ….”

Likewise, another State exhibit published by the FBI noted that “gaming sites or video chat applications that feel familiar and safe [to minors]” are common places where adult predators engage in financial “sextortion” of minors. However, Act 689 exempts these platforms from compliance. Mr. Allen, the State’s expert, criticized the Act for being “very limited in terms of the numbers of organizations that are likely to be caught by it, possibly to the point where you can count them on your fingers….” He then stated that he did not “want to be unkind to the people who drafted [Act 689],” but at least some exempt platforms are ones that adult sexual predators commonly use to communicate with children, including Kik and Kik Messenger, Google Hangouts, and interactive gaming websites and platforms.

The Court asked the State’s attorney why Act 689 targets only certain social media companies and not others, and he responded that the General Assembly crafted the Act’s definitions and exemptions using the data reported in an article published by the National Center for Missing and Exploited Children (“NCMEC”). This article lists the names of dozens of popular platforms and notes the number of suspected incidents of child sexual exploitation that each self-reported over the past year. The State selected what it considered the most dangerous platforms for children—based on the NCMEC data—and listed those platforms in a table in its brief.

During the hearing, the Court observed that the data in the NCMEC article lacked context; the article listed raw numbers but did not account for the amount of online traffic and number of users present on each platform. The State’s attorney readily agreed, noting that “Facebook probably has the most people on it, so it’s going to have the most reports.” But he still opined that the NCMEC data was a sound way to target the most dangerous social media platforms, so “the highest volume [of reports] is probably where the law would be concentrated.”

Frankly, if the State claims Act 689’s inclusions and exemptions come from the data in the NCMEC article, it appears the drafters of the Act did not read the article carefully. Act 689 regulates Facebook and Instagram, the platforms with the two highest numbers of reports. But the Act exempts Google, WhatsApp, Omegle, and Snapchat—the sites with the third-, fourth-, fifth-, and sixth-highest numbers of reports. Nextdoor is at the very bottom of NCMEC’s list, with only one report of suspected child sexual exploitation all year, yet the State’s attorney noted during the hearing that Nextdoor would be subject to regulation under Act 689.

None of the experts and sources cited by the State indicate that risks to minors are greater on platforms that generate more than $100 million annually. Instead, the research suggests that it is the amount of time that a minor spends unsupervised online and the content that he or she encounters there that matters. However, Act 689 does not address time spent on social media; it only deals with account creation. In other words, once a minor receives parental consent to have an account, Act 689 has no bearing on how much time the minor spends online. Using the State’s analogy, if a social media platform is like a bar, Act 689 contemplates parents dropping their children off at the bar without ever having to pick them up again. The Act only requires parents to give express permission to create an account on a regulated social media platform once. After that, it does not require parents to utilize content filters or other controls or monitor their children’s online experiences—something Mr. Allen believes is the real key to keeping minors safe and mentally well on social media.

The State’s brief argues that “requiring a minor to have parental authorization to make a profile on a social media site … means that many minors will be protected from the well-documented mental health harms present on social media because their parents will have to be involved in their profile creation” and are therefore “more likely to be involved in their minor’s online experience.” But this is just an assumption on the State’s part, and there is no evidence of record to show that a parent’s involvement in account creation signals an intent to be involved in the child’s online experiences thereafter….

Finally, the Court concludes that Act 689 is not narrowly tailored to target content harmful to minors. It simply impedes access to content writ large….

Age-verification requirements are more restrictive than policies enabling or encouraging users (or their parents) to control their own access to information, whether through user-installed devices and filters or affirmative requests to third-party companies. “Filters impose selective restrictions on speech at the receiving end, not universal restrictions at the source.” Ashcroft v. ACLU (II) (2004). And “[u]nder a filtering regime, adults … may gain access to speech they have a right to see without having to identify themselves[.]” Similarly, the State could always “act to encourage the use of filters … by parents” to protect minors.

In sum, NetChoice is likely to succeed on the merits of the First Amendment claim it raises on behalf of Arkansas users of member platforms. The State’s solution to the very real problems associated with minors’ time spent online and access to harmful content on social media is not narrowly tailored. Act 689 is likely to unduly burden adult and minor access to constitutionally protected speech. If the legislature’s goal in passing Act 689 was to protect minors from materials or interactions that could harm them online, there is no compelling evidence that the Act will be effective in achieving those goals.

And the court held that Act 689 was likely unconstitutionally vague:

A “social media company” is defined as “an online forum that a company makes available for an account holder” to “[c]reate a public profile, establish an account, or register as a user for the primary purpose of interacting socially with other profiles and accounts,” “[u]pload or create posts or content,” “[v]iew posts or content of other account holders,” and “[i]nteract with other account holders or users, including without limitation establishing mutual connections through request and acceptance.” But the statute neither defines “primary purpose”—a term critical to determining which entities fall within Act 689’s scope—nor provides any guidelines about how to determine a forum’s “primary purpose,” leaving companies to choose between risking unpredictable and arbitrary enforcement (backed by civil penalties, attorneys’ fees, and potential criminal sanctions) and trying to implement the Act’s costly age-verification requirements. Such ambiguity renders a law unconstitutional….

The State argues that Act 689’s definitions are clear and that “any person of ordinary intelligence can tell that [Act 689] regulates Meta, Twitter[,] and TikTok.” But what about other platforms, like Snapchat? David Boyle, Snapchat’s Senior Director of Products, stated in his Declaration that he was not sure whether his company would be regulated by Act 689. He initially suspected that Snapchat would be exempt until he read a news report quoting one of Act 689’s co-sponsors who claimed Snapchat was specifically targeted for regulation.

During the evidentiary hearing, the Court asked the State’s expert, Mr. Allen, whether he believed Snapchat met Act 689’s definition of a regulated “social media company.” He responded in the affirmative, explaining that Snapchat’s “primary purpose” matched Act 689’s definition of a “social media company” (provided it was true that Snapchat also met the Act’s profitability requirements). When the Court asked the same question to the State’s attorney later on in the hearing, he gave a contrary answer—which illustrates the ambiguous nature of key terms in Act 689. The State’s attorney disagreed with Mr. Allen—his own witness—and said the State’s official position was that Snapchat was not subject to regulation because of its “primary purpose.”

Other provisions of Act 689 are similarly vague. The Act defines the phrase “social media platform” as an “internet-based service or application … [o]n which a substantial function of the service or application is to connect users in order to allow users to interact socially with each other within the service or application”; but the Act excludes services in which “the predominant or exclusive function is” “[d]irect messaging consisting of messages, photos, or videos” that are “[o]nly visible to the sender and the recipient or recipients” and “[a]re not posted publicly.” Again, the statute does not define “substantial function” or “predominant … function,” leaving companies to guess whether their online services are covered. Many services allow users to send direct, private messages consisting of texts, photos, or videos, but also offer other features that allow users to create content that anyone can view. Act 689 does not explain how platforms are to determine which function is “predominant,” leaving those services to guess whether they are regulated.

Act 689 also fails to define what type of proof will be sufficient to demonstrate that a platform has obtained the “express consent of a parent or legal guardian.” If a parent wants to give her child permission to create an account, but the parent and the child have different last names, it is not clear what, if anything, the social media company or third-party servicer must do to prove a parental relationship exists. And if a child is the product of divorced parents who disagree about parental permission, proof of express consent will be that much trickier to establish—especially without guidance from the State.

These ambiguities were highlighted by the State’s own expert, who testified that “the biggest challenge … with parental consent is actually establishing the relationship, the parental relationship.” Since the State offers no guidance about the sort of proof that will be required to show parental consent, it is likely that once Act 689 goes into effect, the companies will err on the side of caution and require detailed proof of the parental relationship. As a result, parents and guardians who otherwise would have freely given consent to open an account will be dissuaded by the red tape and refuse consent—which will unnecessarily burden minors’ access to constitutionally protected speech.

Plaintiff is represented by Erin Murphy, James Xi, Joseph DeMott, and Paul Clement (Clement & Murphy, PLLC) and Katherine Church Campbell and Marshall S. Ney (Friday, Eldredge & Clark, LLP) (not to be confused with Marshal Ney).