Jeff Landry, Louisiana Gets It Right on App Store Accountability
Why the App Store Accountability Act balances child protection with constitutional rights
I have been incredibly hard on Governor Jeff Landry of late. But part of doing what I do is calling out bull when I see it and giving credit when it's due. So when Landry signed Louisiana's App Store Accountability Act into law on June 30th, making Louisiana the third state to require app stores to verify users' ages and obtain parental consent for minors, I was glad to see it.
Of course, the predictable complaints started immediately. Groups like NetChoice and the Chamber of Progress argued the law "undermines First Amendment rights" and "compromises the privacy of all users." But what those critics aren't telling you is that this law represents exactly the kind of measured, constitutional approach to child protection that courts have been demanding. And it's long overdue.
The Problem Is Real, and It's Getting Worse
The facts should alarm every parent in Louisiana. Pew Research shows that 48 percent of teenagers now say social media has a mostly negative effect on people their age, up from just 32 percent in 2022. World Health Organization data reveals problematic social media use among adolescents jumped from 7 percent in 2018 to 11 percent in 2022, with girls showing higher rates than boys.
This isn't abstract policy debate material. School counselors across Acadiana deal daily with kids whose mental health collapsed after exposure to anonymous harassment, dangerous challenges, or algorithms that fed them increasingly extreme content. Parents watch their children struggle with depression, anxiety, and self-harm ideation fueled by platforms designed to maximize engagement rather than well-being.
The problem got worse in 2020, when COVID-19 shut down schools. Across Louisiana and the rest of the country, kids were dismissed and sent home for the remainder of the school year. It was two and a half months of isolation, and kids were glued to their screens for any shot at social interaction. Instead of connection, what many of them got was a deeper dependence on toxic social media apps and habits.
Meanwhile, our current system of "age verification" consists of children lying about their birthday on a form. A 13-year-old can download Instagram, agree to the terms of service written by corporate lawyers, and start receiving targeted advertising based on psychological manipulation techniques—all without their parents knowing it happened.
Louisiana's Constitutional Approach
Louisiana's App Store Accountability Act learned from the constitutional failures of earlier attempts at online child protection. When federal courts struck down social media age verification laws in Arkansas and other states, they focused on two main problems: the laws were too vague about which platforms they covered, and they imposed content-based restrictions on speech.
Louisiana's law sidesteps both issues. Instead of targeting specific types of speech or particular platforms, it focuses on the distribution mechanism: the app stores themselves. Every app, regardless of content, gets the same treatment. This isn't the government saying, "This speech is dangerous for children." This is the government saying, "Parents should be involved when their minor children enter into legal contracts with corporations."
The constitutional distinction matters. Courts have consistently held that the government can regulate commercial transactions involving minors without violating the First Amendment. We require parental consent for everything from school field trips to medical treatment. Somehow, downloading apps that collect personal data and serve targeted advertising has been exempt from this common-sense principle.
How It Actually Works
Despite the scare tactics from Big Tech lobbyists, Louisiana's law doesn't require dystopian government surveillance. Representative Kim Carver, who authored the legislation, has been clear that existing infrastructure can handle age verification in most cases. When parents set up phones for their children, they already provide age information to Apple and Google. When they link credit cards to app store accounts, they've already verified they're adults.
The law requires app stores to verify a user's "age category"—child (under 13), younger teen (13-15), older teen (16-17), or adult—and share this information with app developers so they can provide age-appropriate experiences. For minors, the app store must link their account to a parent's account and obtain verifiable parental consent before downloads.
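The four brackets and the consent rule described above can be sketched in a few lines. This is purely my own illustration of the logic; the function names and category labels are hypothetical and do not correspond to any statutory text or actual app store API.

```python
from datetime import date

def age_category(birthdate: date, today: date) -> str:
    """Map a birthdate to the law's four age brackets (labels are illustrative)."""
    # Subtract one year if this year's birthday hasn't happened yet.
    age = today.year - birthdate.year - (
        (today.month, today.day) < (birthdate.month, birthdate.day)
    )
    if age < 13:
        return "child"
    elif age <= 15:
        return "younger_teen"
    elif age <= 17:
        return "older_teen"
    return "adult"

def download_allowed(category: str, parent_consent: bool) -> bool:
    """Adults may download freely; any minor bracket requires verifiable parental consent."""
    return category == "adult" or parent_consent
```

The point of the sketch is how little the app store needs to know: a single bracket, not a full birthdate, is what gets shared with developers.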
This isn't about helicopter parenting or government overreach. It's about giving parents the tools they need to make informed decisions about their children's digital lives. Right now, Apple and Google offer family controls, but they're optional and buried in settings most parents never find. Louisiana's law makes parental oversight the default, not the exception.
The Industry Fight Behind the Scenes
The corporate drama around this law reveals Big Tech's priorities. Meta supports app store age verification laws because it shifts responsibility away from their platforms and onto Apple and Google. Apple and Google oppose these laws because they don't want to be the ones collecting sensitive personal information and dealing with the liability.
None of them wants to admit they've all profited from a system that treats children like adult consumers while externalizing the mental health costs onto families and society.
Actually, scratch that. Social media doesn't see us (or our kids) as consumers. We are the product, our information sold off to advertisers and data brokers.
Meta's support isn't entirely cynical—their platforms, particularly Instagram, are often the primary targets of criticism about teen mental health. They've recently implemented "Teen Accounts" with stricter privacy settings for users under 18. But it's telling that they're pushing for these protections to be implemented at the app store level rather than taking full responsibility themselves.
The truth is, all these companies have the technical capability to implement robust age verification and parental controls. They've chosen not to because it might reduce engagement and advertising revenue. Louisiana's law forces their hand.
Privacy Concerns Are Manageable
The privacy objections to Louisiana's law aren't entirely without merit, but they're overstated. Google's Kareem Ghanem warned that app stores might be "forced to ask for government IDs, government documents, other invasive forms of ID verification."
But this assumes the worst-case scenario while ignoring simpler alternatives. Credit card verification works for most adult users. Parental attestation works for children. Biometric verification is already standard for unlocking phones. The infrastructure for privacy-protecting age verification already exists—what's been missing is the legal requirement to use it.
More importantly, we need to weigh privacy concerns against the documented harms of the current system. Is the theoretical risk of data collection really worse than the demonstrated reality of children accessing content that contributes to depression, eating disorders, and self-harm?
Why This Matters for Louisiana
Louisiana has a particular stake in getting this right. We have large school districts across Acadiana dealing with the fallout from unregulated social media use. We have families struggling with teenage mental health crises that correlate strongly with social media exposure. We have a cultural tradition of community responsibility for child welfare that's been undermined by the "digital wild west" approach of the past decade.
The App Store Accountability Act represents Louisiana values: family authority over government control, local responsibility over corporate profit, and child protection over abstract ideology. It gives parents tools without mandating specific outcomes. It protects children without limiting adult access to information.
The Broader Stakes
This isn't just about Louisiana, and it's not just about apps. The fundamental question is whether we're going to allow technology companies to continue treating children as independent consumers, or whether we're going to insist that childhood means something in the digital age.
The old model—where 13-year-olds could agree to complex terms of service for platforms designed to be psychologically addictive—was never sustainable. The mental health data makes that clear. The question was whether the change would come from responsible regulation or from a backlash that could threaten legitimate technological innovation.
Louisiana chose the responsible path. We've created a framework that protects children while respecting constitutional rights, that empowers parents without substituting government judgment for their own, and that addresses real harms without limiting legitimate speech.
Other states are watching. Congress is considering federal legislation. The success or failure of Louisiana's approach could determine whether we get sensible reform or regulatory overreach.
The Right Choice
Critics will continue to argue that Louisiana's App Store Accountability Act goes too far or doesn't go far enough. They'll raise constitutional concerns that courts have already addressed. They'll warn about privacy implications while ignoring the privacy violations children already face from unregulated data collection.
But this law does something simple and important: it says that parents should be involved when their minor children download apps and agree to legal contracts with billion-dollar corporations.
If that principle makes Silicon Valley executives uncomfortable, perhaps they should ask themselves why they've been fighting so hard to keep parents out of the loop in the first place.
Louisiana has chosen to side with families over corporations, child protection over corporate profits, and parental authority over tech industry convenience. The App Store Accountability Act won't solve every problem with teenage social media use, but it's a crucial first step toward treating children like children rather than products.