Emily Jones: Alabama parents, not Apple, should decide what kids download


Across the country, more and more parents, educators, and everyday Americans are starting to ask an uncomfortable but necessary question: why can minors download powerful, adult-facing apps without parental approval?

This isn’t about being anti-technology, banning smartphones, or dictating how parents should raise their kids.

This question matters because it confronts a hard truth: our digital systems are designed to bypass parents — and children are paying the price.

Most kids in America now get their first smartphone around age 11. Social media access often follows immediately — even though these platforms were never designed with children in mind.

At the same time, rates of anxiety, depression, self-harm, and suicide among children and teens have surged. This is no longer a coincidence we can ignore.

Against that backdrop, requiring parental approval before minors can download certain apps is a commonsense step to establish parental authority in a space where it never truly existed.

From the beginning, app stores have allowed children to bypass parents entirely.

A parental-approval requirement enforces age verification and parental consent before minors can download certain apps.

It doesn’t ban technology or micromanage families; it simply brings transparency and accountability back into a system that currently lacks both.

The accountability loophole no one talks about

To understand why this problem is so widespread, it helps to understand how app downloads actually work today.

Most app stores rely on self-reported ages. When a child sets up a device or an app store account, they can enter any birthdate they choose — and in most cases, no verification follows.

Once that account exists, age-based app restrictions are largely meaningless. Apps rated for older teens or adults can still be downloaded with a single tap.

Developers face virtually no consequences if minors access content that is clearly inappropriate for their age. App stores disclaim responsibility, developers point back to the platform, and parents are often left unaware until harm has already occurred.

This is the loophole: children can access adult-facing apps because no one in the system is required to confirm who the user actually is — or to notify a parent before access is granted. The system is designed for convenience, not child safety.

Closing that loophole doesn’t require banning apps or devices across the board. It requires something far simpler — and more honest: parental approval for most apps, and clear age-based restrictions for the most harmful ones.

The stories behind the statistics

This debate often stays abstract — until you see what actually happens to kids.

In the United Kingdom, a 14-year-old girl died after being exposed to a steady stream of self-harm and suicide-related content on Instagram.

Her family later learned that algorithms repeatedly pushed graphic and emotionally damaging material into her feed, even as her mental health deteriorated.

What began as casual social media use became a constant exposure to content no child should ever see — much less be encouraged to consume.

In another case in the United States, a family has sued TikTok after their 16-year-old son died by suicide.

According to the lawsuit, the platform’s “For You” page repeatedly inundated him with videos about self-harm and suicide, amplifying harmful content rather than intervening. His parents say they had no idea what he was being shown until it was too late.

These are not rare anomalies. Stories like these are becoming far too common in our technology-heavy world.

They are predictable outcomes of giving children access to platforms powered by engagement-driven algorithms — often without meaningful adult oversight.

The data is clear — and disturbing

I recently read 10 Rules for Raising Kids in a High-Tech World by Jean M. Twenge, PhD. It is an extremely insightful book — and one every parent should read before giving children unfettered access to the internet.

In it, Twenge cites national data that is, quite frankly, terrifying for any parent:

  • One in five girls aged 13 to 15 has been sexually propositioned through social media
  • 37% report being exposed to unwanted nudity
  • The majority of teen girls using Instagram or Snapchat have been contacted by a stranger who made them uncomfortable

None of this happens by accident. It is the result of platforms intentionally designed to maximize engagement — not to protect children. It's a system more concerned with profit than with safety.

Parental rights come with parental responsibility

That responsibility matters most when it comes to social media.

While parents should absolutely have the authority to approve or deny most apps, social media is different.

The evidence is overwhelming that social media platforms are uniquely harmful to children’s mental health, emotional development, and sense of self.

These apps are engineered to exploit insecurity, reward comparison, and keep users — especially young users — hooked.

For that reason, we should be honest about what real protection looks like: social media apps should not be accessible to children under the age of 16 – no exceptions.

This brings us back to a crucial distinction: parents absolutely have the right to decide whether their child gets a smartphone — even as the evidence increasingly shows they shouldn’t.

The government should not ban devices, but restricting access to social media until 16 and requiring parental approval for all other apps should be the bare minimum when it comes to protecting children online.

We also need to stop pretending that just because something is a parental right, it’s automatically a good idea.

Yes, parents can decide if their children receive smartphones, but it is becoming increasingly clear that children don’t need one until age 16 — or later. And even then, access to social media should not be treated as a rite of passage.

Giving young children smartphones with unrestricted app access is extremely risky — full stop.

Normalizing early and unsupervised tech use hasn't made kids more resilient or connected. It has made them more anxious, more isolated, and far more vulnerable to exploitation. To be completely honest, it is destroying our children, and we are sitting idly by allowing it to happen.

Good parenting sometimes means saying no — even when it’s inconvenient or unpopular.

A parental-approval requirement doesn't force parents to restrict apps; it simply ensures parents are aware, involved, and empowered to make informed decisions. Age-based limits on social media, by contrast, would set a clear boundary that reflects what the data already tells us. What parents do with that authority is their choice — but they deserve to make those choices with their eyes wide open.

Big Tech’s incentives are not aligned with kids’ safety

To understand why this problem persists, it’s worth being honest about who benefits from the status quo.

Big Tech makes billions by keeping kids scrolling, clicking, and engaging for as long as possible — often at the expense of their mental health.

App stores and developers have no financial incentive to slow kids down, verify ages, or notify parents. Their profits depend on frictionless access, not child safety.

When society shrugs and looks the other way, it isn’t protecting freedom. It’s protecting a business model that treats children as a revenue stream.

Why the rules must catch up

Part of the challenge is that technology advances far faster than the norms, safeguards, and expectations we place around it.

Many of the basic assumptions we still rely on were formed decades ago — long before smartphones, social media algorithms, or AI-driven manipulation existed.

By the time adults recognize a problem, children have often already been exposed.

We can either adapt our standards to reflect modern risks, or continue pretending that yesterday’s guardrails are sufficient for today’s digital world.

Requiring parental approval for app downloads is a step toward the former — and protecting children should not be a partisan issue.

This is the floor, not the ceiling

No single policy or standard will solve every problem. But setting clear expectations establishes a baseline: parents matter, kids aren’t adults, and tech companies shouldn’t get the benefit of the doubt when children are involved.

Protecting kids sometimes means limiting access — and that shouldn’t be controversial.

When forced to choose between Big Tech’s profits and children’s safety, the answer should be obvious.

Emily Jones is a mom from Jackson County and a Republican candidate for State Board of Education in 2026. She founded the first Moms for Liberty chapter in the state to fight for the preservation of parental rights and the protection of our children.
