Spend enough time around our political system, and you’ll eventually watch the right thing fail for no good reason.
That is where I find myself today, reflecting on how House Bill 347 – a straightforward, carefully written piece of legislation to protect Alabama families – failed to pass in Montgomery. Unfortunately, the people who will pay the price for this inaction are ordinary Alabamians, including women and children, whose images can be taken without consent and weaponized against them right now, today, with no meaningful remedy under state law.
Let me explain what HB347 would have done, because the details matter.
Under current Alabama law, technology developers enjoy broad immunity when their tools are used to create or distribute what the law calls private images — non-consensual intimate imagery, including AI-generated content. That immunity made sense in an earlier era, when platforms were genuinely passive, and the content that appeared on them was created entirely by users.
However, it doesn’t meet the moment now. For example, xAI’s platform Grok has already been documented generating criminal imagery of women and minors at tremendous scale. During just a nine-day period this year, the platform posted 4.4 million images, of which over 40% were sexualized images of women.
HB347 would have changed that. It would have held companies like xAI accountable when they knowingly design and profit from tools whose foreseeable output is the sexual exploitation of real people. That is exactly the kind of common-sense protection Alabama families deserve.
The bill also would have required platforms to establish a clear, accessible process for victims to request removal of illicit material — and to act on those requests within seventy-two hours. That deadline matters. Grok operates on a platform that has systematically dismantled its own safety and content moderation infrastructure, leaving victims with nowhere to turn. Watchdogs have documented the same failures across other platforms, which is exactly why Alabama cannot afford to leave these protections off the books.
Artificial intelligence tools capable of taking ordinary photographs and generating non-consensual sexual imagery from them are already here, already accessible, and already being used by millions of people. Alabama had a bill on the floor that addressed some of the largest problems with this software, drafted with enough legal precision to withstand constitutional scrutiny, sponsored by a broad bipartisan coalition of representatives who understood the stakes.
I understand that legislation is complicated. I understand that there are competing interests, competing priorities, and a limited amount of floor time in any session. What I do not understand is how protecting Alabama women and children from AI-generated sexual exploitation failed to clear that bar.
Artificial intelligence is advancing faster than any technology this generation has seen, and the decisions we make right now about how to govern it will shape what it becomes. That is a conversation worth having. But if Alabama cannot pass a bill protecting its own citizens from AI-generated child sexual abuse material — if that clears the bar of ‘too complicated’ or ‘not the right time’ — then we should be honest with ourselves about whether we are prepared for what comes next.
State Rep. Ben Harrison is a Republican representing Lauderdale County.