The Wire

  • New tunnel, premium RV section at Talladega Superspeedway on schedule despite weather


    Construction of a new oversized vehicle tunnel and premium RV infield parking section at Talladega Superspeedway is still on schedule to be completed in time for the April NASCAR race, despite large amounts of rainfall and unusual groundwater conditions underneath the track.

    Track Chairman Grant Lynch, during a news conference Wednesday at the track, said he’s amazed the general contractor, Taylor Corporation of Oxford, has been able to keep the project on schedule.

    “The amount of water they have pumped out of that and the extra engineering they did from the original design, basically to keep that tunnel from floating up out of the earth, was remarkable,” Lynch said.

  • Alabama workers built 1.6M engines in 2018 to add auto horsepower


    Alabama’s auto workers built nearly 1.6 million engines last year, as the state industry continues to carve out a place in global markets with innovative, high-performance parts, systems and finished vehicles.

    Last year also saw major new developments in engine manufacturing among the state’s key players, and more advanced infrastructure is on the way in the coming year.

    Hyundai expects to complete a key addition to its engine operations in Montgomery during the first half of 2019, while Honda continues to reap the benefits of a cutting-edge Alabama engine line installed several years ago.

  • Groundbreaking on Alabama’s newest aerospace plant made possible through key partnerships


    Political and business leaders gathered for a groundbreaking at Alabama’s newest aerospace plant gave credit to the formation of the many key partnerships that made it possible.

    Governor Kay Ivey and several other federal, state and local officials attended the event, which celebrated the construction of rocket engine builder Blue Origin’s facility in Huntsville.

1 month ago

Justice Will Sellers: Magna Carta’s peer review


If the 4th of July has a pre-game, it is June 15.

On that date in 1215, the Magna Carta was signed, beginning a gradual process of defining individual rights and limiting the power and authority of the British crown. The Declaration of Independence, which outlined the colonists’ desire for freedom from the edicts of King George, is a direct descendant of the Magna Carta.

It would be foolish to argue that the Magna Carta anticipated all the rights and freedoms we enjoy today, but it certainly planted those seeds 806 years ago. Those seeds sprouted in the form of the development of common law and fully germinated in the U.S. Constitution.


Perhaps one of the most unusual rights secured by the Magna Carta is the right to a jury trial; it is such an integral part of the court system that those who live in common law countries often fail to recognize its uniqueness. Most modern countries have limited the use of juries, as most governments prefer having only a judge make decisions. Juries can be unpredictable and messy, with uncertain results at times contrary to the desires of the powerful; regardless, English-speaking countries recognize the importance of having individual citizens decide factual disputes and mete out punishment.

Judges were typically appointed by the King, and, while learned in the law, were often unfamiliar with the standards of the community they judged. Juries were developed to provide a commonsense approach to the law, to provide a check on the crown’s power expressed through judges and to achieve a sense of factual fairness.

The international community discusses the rule of law and debates fundamental human rights, yet most countries restrict the jury system. Inasmuch as individual rights are touted and basic freedoms extolled, juries are given short shrift. Fair trials are demanded in a host of international legal agreements, but in the Universal Declaration of Human Rights and the European Convention of Human Rights, there is no right to a trial by jury.

The Magna Carta set in motion a concept that in order for the law to be fair, community leaders needed to be a part of the process. Fairness is easy to demand, but assuring fairness and the acceptability of decisions requires a jury. The signatories to the Magna Carta understood what moderns seem to forget.

The populism reflected in the jury system is simply not something the legal processes of other countries tolerate; most are afraid of the common sense of their average citizens.

The language in the Magna Carta provided that punishments, proceedings and prosecutions required “… the lawful judgment of peers and by the law of the land.” This idea was cultivated by the English legal system until it expanded to include not only criminal cases, but civil cases, as well.

And the decision of a jury was respected to the point that any factual finding by a jury could not be altered by a judge. This served to limit judicial power and balance the interest of the crown with the community.

Before the American Revolution, various taxes imposed on the colonists were enforced in courts with appointed judges but no juries. Omitting juries outraged colonial lawyers, who realized the crown’s end run was the denial of fundamental rights they previously enjoyed.

These rights were further limited when royal governors attempted to override jury verdicts when the results were unfavorable to the crown. Just like dictators today, autocrats throughout history were afraid of a commonsense citizenry integrally involved with the legal system.

One argument against allowing any review of a jury’s decision was that doing so substituted the royal governor for a “jury of peers.” When the King limited the use of juries, the colonists were so antagonized that many leading citizens began to contemplate a separation from England. But, in any separation, the system of common law and the right to a trial by jury would remain, albeit unrestrained by a monarch. In a sense, diminishing jury trials started the move toward independence.

When the U.S. Constitution was drafted following the hard-fought revolution, there was no guarantee of a trial by jury. To remedy this omission, and not wanting to rely on custom alone, an enumerated bill of rights was proposed that guaranteed the right to a trial by jury in criminal matters in the Sixth Amendment. But the Bill of Rights went one step further and, with the Seventh Amendment, provided that the right to a trial by jury would also apply in civil cases:

In suits at common law … the right of trial by jury shall be preserved, and no fact tried by a jury shall be otherwise re-examined in any court of the United States, according to the rules of common law.

Under the English legal system, juries were used in civil cases, but in an abundance of caution, the founders wanted to make certain this fundamental right could not be eliminated by a later government; a written bill of rights guaranteed this.

Thus, a legal system impaneling a jury is so accepted that we find it odd when other countries use only a judge. Former U.S. Supreme Court Chief Justice William Rehnquist acknowledged, “The right to trial by jury in civil cases at common law is fundamental to our history and jurisprudence.”

The jury system we rely on to resolve disputes assigns average citizens an important role in our legal system. Beginning with the Magna Carta, the right to a trial by jury, though limited, was initiated as part of the common law that grew into the system we accept wholeheartedly today.

Some 806 years ago, the concept began as a test of strength between the monarchy and nobles, spawning a fundamental right unique in the world.

Our Declaration of Independence and Constitution owe much to their ancestor, the Magna Carta.

Will Sellers was appointed as an Associate Justice on the Supreme Court of Alabama by Gov. Kay Ivey in 2017.

2 months ago

Justice Will Sellers: Loyalty still matters


Always the catch-all political crime, an accusation of treason is used to punish rivals and remove them from civic engagement. Autocrats use the insinuation of treason with brutal efficiency to banish, if not execute, a political problem or inconvenient idea.

While treason is bandied about to characterize someone with whose political beliefs we disagree, our founders made treason a particularly difficult crime to prove. As with so much of the Constitution, the terms were specifically written to prevent abuses witnessed by colonials. Article III, Section 3 not only provides safeguards that treason not be used to silence political opponents, but it also limits the extent of any punishment.

Because of these strictures, we often forget what real treason looks like and fail to fully appreciate loyalty to country or creed. While national ties are not unlike family bonds, this intrinsic loyalty to place or relations is often weakened by opportunity or ideology. Few people today really know a traitor to their country. There may be disagreements on any number of levels, but seldom do acts fully rise to the level of treason within the Constitutional definition. Treason in the United States is more than a lazy term of derision occasioned by mere policy disagreements.


Seventy years ago, when highly placed British diplomats surreptitiously defected to the Soviet Union, treason was made manifest.

In May 1951, the Cold War was escalating between the capitalist West and the communist East. The United States had witnessed hearings before the House Un-American Activities Committee, and citizens were rocked by allegations of Soviet agents operating within our nation’s government.

Alger Hiss had been convicted of perjury, which fanned the flames that other government employees had divided loyalties and worked for the Russians. But many of the accused denied any involvement in espionage; for every accusation, there was denial and not always crystal-clear evidence of treason.

In a sensational trial held in March 1951, Julius and Ethel Rosenberg were convicted of espionage, but there was hardly a uniform consensus about the extent of their guilt, and there was enough doubt to question the appropriateness of the death penalty. In similar cases, the accused were defiant and vociferously expressed innocence. Thus, the country was divided about whether the treason was actual and whether those accused were more political dissenters than disloyal Americans.

The actions of the British diplomats and the subsequent revelations after their defection left no doubt that our former allies, the Russians, had for years spied on us and penetrated both British and American governments at a very high level.

At Cambridge University in the 1930s, several undergraduates, including Guy Burgess and Donald MacLean, were recruited by the Soviets to provide information about Great Britain. They were from privileged families and considered among the elite attending a premier university. Nothing in their background gave the slightest hint that their loyalties had shifted from King and Country to Stalin and the Bolsheviks.

MacLean joined the British foreign office in 1934 and almost immediately began supplying information to the Russians. Until his defection in 1951, he delivered more than 4,500 documents to his Soviet handlers.

Burgess was initially employed by the BBC, then by the British Secret Service and, later, the foreign office. While working as a spy, he supplied the Soviets with more than 4,600 confidential or top-secret documents.

Using information obtained from MacLean, the Russians leaked a copy of a letter from Churchill to President Truman which included an embarrassing assessment of Stalin. The FBI believed the leak had come from the British Embassy and suspected MacLean, but they were unable to confirm their suspicions.

Later, as Western intelligence services began decrypting old Soviet traffic between Washington and Moscow, MacLean emerged as a leading suspect given his access to a host of sensitive documents about the U.S., British and Canadian committee on the development of atomic weaponry. Recalled to London, MacLean was tipped off by fellow Cambridge spy Kim Philby (who was stationed in DC with knowledge of the investigation) that he was under suspicion.

Given the stress of his dual identity, MacLean started drinking heavily and was viewed as so unstable that, once accused, he would confess and implicate others. Not wanting to risk exposure, Burgess and Philby explained to Moscow that MacLean must leave Britain, and Burgess began making plans for MacLean to defect.

At this same time, Burgess was dismissed from the foreign service based on conduct unrelated to his espionage. With his career at an end, he decided to accompany MacLean. Moscow felt a dual defection with mutual support could be successful. Others disagreed and argued that two defections would prompt counterintelligence to begin connecting dots to uncover seemingly loyal British citizens who served Stalin’s workers’ paradise.

By a series of feints and head fakes, Burgess and MacLean successfully defected; their conspicuous absence prompted the secret service and other agencies to assess the situation. They soon realized their slow response to American inquiries had given the spies time to depart without exposing their accomplices.

The situation quickly began to unravel as guilt by association caused suspicion to fall on others who had served with Burgess and MacLean. Most importantly, trust between the U.S. and British intelligence agencies deteriorated, which may have been even more significant than the disclosure of state secrets. The instability caused by these defections led both the CIA and MI-6 into a frenzied self-examination, placing former colleagues under suspicion and disrupting normal operations in search of disloyalty. An inordinate amount of time was consumed by Western allies chasing spies who either did not exist or were not in positions to supply actionable intelligence.

Burgess, MacLean and their ilk committed treason by being unambiguously disloyal to their country. There remains no question that their actions led to deaths and seriously compromised military and diplomatic secrets. They also sowed seeds of discord among Western security agencies.

While many in the United States questioned accusations against American citizens, the defection of Burgess and MacLean laid bare with absolute certainty that the Russians were creating an atmosphere of distrust, destruction and deception aimed solely at achieving Soviet hegemony. In 1951, this reality shocked the world into confronting communist deception forcefully and directly.

The survival of the Western democracies depended upon blunting communist expansion; in 1989, the appropriateness of their actions was confirmed.

Will Sellers is a 1985 graduate of Hillsdale College and an Associate Justice on the Supreme Court of Alabama.

3 months ago

Justice Will Sellers: Remembering the Bay of Pigs and its aftermath


When great powers stub their toe on foreign policy, the initial pain, though slight, often causes a loss of focus, a stumble and sometimes a more serious accident.

Sixty years ago, the United States sponsored an unsuccessful invasion of Cuba, and the colossal failure ultimately damaged our nation’s reputation, emboldened our enemies, worried our allies, and clouded our vision of proper objectives for foreign relations.

President John Kennedy’s inauguration was a cause for much optimism as a young, vibrant breath of fresh air would lead America in a new direction. His inaugural address was an inspiring call to a new nationalism of service to the world at large, and he promised that the United States would do all in its power to protect freedom around the globe.


The naivety of his rhetoric was not apparent, however, until he was challenged by an energized Russian bear ready to test the mettle of the young president.

At the beginning of the new administration, America had every reason to be hopeful that the world was moving towards greater freedom. The Eisenhower administration had successfully used covert action to change the governments of Iran and Guatemala, some hotspots of communist insurgency had been stopped, and there was stability in the Philippines and Vietnam.

When the torch was passed to the Kennedy administration, the world appeared stable and controllable.

During his transition from electoral success to governing, Kennedy reached out to some of the smartest and most capable individuals in business and academia. These whiz kids promoted a theory that the machinery of government was a science, and if the formulas were correct, the results would be both predictable and successful.

But, while genius in government is great, practical simplicity is always better. Understanding and assessing people and personalities often trumps academic articulation. Within a matter of months, President Kennedy was to learn this the hard way.

By failing to understand the difference between ideology and interests in diplomacy, the Kennedy administration embarked on a path that reflected an impractical view of the world as they wanted it to be and failed to appreciate that an effective foreign policy must reflect a national self-interest to deal with the world as it is.

Even before the Bay of Pigs, members of Kennedy’s foreign policy team decided on a covert coup to oust Portugal’s dictator.

This plan made little sense.

There was no overarching U.S. interest at stake, any local opposition to the regime was minimal, and, to make matters worse, Portugal was a NATO ally. Thankfully, the coup never got off the ground, the covert action was scrapped, and the instigators departed before any real damage was done.

But the thought process, or lack thereof, was troubling. And any further ideas about forced regime change should have been put on hold until a comprehensive foreign policy was developed and measured objectives approved.

But rather than seriously considering American interests, the excitement of covert action and the thrill of cloak and dagger operations distracted the young administration and set in motion one of the biggest disasters that was as open to ridicule as it was notorious for ineptness.

When U.S.-sponsored Cuban exiles landed at the Bay of Pigs, nothing went according to plan. There was no expected popular uprising, and, more importantly, Kennedy had canceled any air support. With limited engagement from the Navy, the landing party hardly got off the beach.

The conflict was a total rout with almost the entire invasion force killed, wounded, or captured. In retrospect, any casual observer would question the need to invade Cuba, our national interest there, and any thoughtful steps to take to achieve our goals short of force. The after-action report was devastating and served as a proof text for Murphy’s law.

The Bay of Pigs served as a shakedown cruise for the new administration, and the evaluations of its first four months were resoundingly negative. Allowing a small country like Cuba to thwart an American-sponsored invasion emboldened our enemies to take full advantage of the geniuses who attempted to advance the national policy of a new administration.

After the Bay of Pigs, the stature of the United States was substantially reduced in the eyes of the world; perhaps for the first time, we were vulnerable, and our enemies probed and tested our resolve.

Indeed, for the rest of his presidency, Kennedy’s foreign policy exploits would be an attempt to overcome this defeat in Cuba. Sensing distraction, our enemies took full advantage of us.

In Europe, the Soviets approved building a barrier between East and West Berlin, and when Kennedy signaled that he would take no actions to stop construction, the barrier became the solid, fortress-like wall, which was improved and secured to provocatively divide the people of Berlin.

In Southeast Asia, Russia amped up its support of the Pathet Lao in a proxy war for control of Laos. Khrushchev rhetorically decimated Kennedy at the Vienna Summit some months later.

Atoning for the loss of prestige at the Bay of Pigs, Bobby Kennedy became obsessed with Cuba, diverting resources in any number of attempts to topple the Castro regime. In fact, some of the most preposterous assassination plans cooked up by the CIA were aimed at Castro.

Rather than destabilizing Cuba, Kennedy’s singular focus forced Castro into a strong alliance with Russia, resulting in a Soviet base 90 miles from Florida. The obsession with Cuba led to the Cuban Missile Crisis, which was the closest the world has yet come to a nuclear war.

But perhaps the most significant legacy from Kennedy’s bruised ego was his desire to reveal his machismo and show he could draw a line in the sand against communism.

The place he chose to show resolve was Vietnam.

The Bay of Pigs represented not only a defeat of U.S. interests, but a disaster in creating a foreign policy that was rooted in a personal quest to show a powerful America and decisive administration. By focusing on goals and objectives that had little relation to the permanent interests of the United States, Kennedy ultimately followed a path leading to humiliation and defeat.

Engaging on the world stage requires critical thinking about America’s goals and the strategies to achieve them. Foreign policy must be practical and focused on long-term interests and not the distractions of ideological whims.

Will Sellers is an Associate Justice on the Supreme Court of Alabama.

4 months ago

Justice Will Sellers: What’s in a name?


There has been much debate lately about how we name public buildings and whether we should remove some names because of long-ago actions that no longer conform to contemporary societal practices.

Public buildings are always tricky to name as evidenced by the fact that just a couple of years ago, the University of Alabama Law School was named after Hugh Culverhouse, Jr. in acknowledgment of a very generous donation. However, Culverhouse’s donation was later returned and his name was chiseled from the law school’s facade.

At Alabama State University many years ago, in-fighting and disputes among the trustees resulted in the Joe L. Reed Acadome being renamed.

Scandals and criminal convictions have caused other public facilities to suffer the same fate. A variety of buildings once named after HealthSouth founder Richard Scrushy no longer sport his name.


Similarly, there is no longer an Enron Field; ditto the MCI WorldCom Center.

So, before we take further action to name or rename any public buildings, I would like to suggest a new criterion that should be used in the future.

It is always complicated to name something after a person who is still alive because the curriculum vitae is not complete. As long as someone is alive, there is more than adequate opportunity for them to act inappropriately, commit wrongdoing or be revealed to possess feet made of clay rather than marble.

In order to ensure that a facility named in someone’s honor avoids becoming an embarrassment to the institution – a retreat as opposed to an advancement – perhaps a good rule would be to wait until the honoree has been dead for at least five years. That way their entire record is complete, most statutes of limitations have expired, and their legacy should be secure.

Of course, one problem with this proposal is that people tend to contribute more while they are alive and are attempting to establish a lasting legacy for which, along with a satiated ego, only their name on a building will suffice. Heirs tend to experience a legacy more in selfish monetary terms; naming opportunities thus abound for the living.

So, while donations might decrease, naming a building or whatever after someone who is deceased seems like a safe bet … or at least it used to.

I say “seems” because now we are in a local and nationwide frenzy to fully explore the lives of people for whom structures have long ago been named. And in doing this, any improper conduct is scrutinized based on our current understanding of what should have been acceptable, polite behavior during the life of the honoree.

Perhaps we are finally embracing Shakespeare’s observation that “the evil that men do lives after them; the good is oft interred with their bones.” No one can argue that commemorating someone who supported a criminal enterprise is acceptable. Not many cities have an Al Capone Avenue or a Benito Mussolini Drive. Open, obvious and socially unacceptable behavior, no matter how sizable the donation or political influence at the time, can never reach the level of deserving commemoration.

But we must be very careful, because the finer the tooth on the comb and the higher the magnification of the glass, the more details and information emerge that might be better left undiscovered.

In fact, few people look saintly when every nook and cranny of their life is fully examined.

Several times a year a new book comes out revealing that a well-known figure was a member of a socially unacceptable movement. Families under the yoke of an authoritarian government are shocked to discover an informer in their midst, a closeted Nazi or mafia hitman.

Years ago, France was so stunned when a documentary revealed more collaborators than resistance fighters that the film was banned from television because it “destroys myths that the people of France still need.” And who can forget when, decades after World War II ended, United Nations Secretary-General Kurt Waldheim was exposed as a former Nazi intelligence officer? Others have discovered past family connections to organized crime. DNA testing often confirms embarrassing assignations.

But in discovering a not-so-auspicious past, few communities completely jettison a native son, and families still embrace wayward kin. Most simply use the exposure to show the failed humanity that affects all of us.

I once heard a businessman on a panel about ethics comment that “there is a little larceny in all of us.” In saying so, he was not refusing to condemn bad behavior, but was pointing out the permanence of original sin. Thornton Wilder penned the famous quote: “There is so much good in the worst of us, and so much bad in the best of us, that it behooves all of us not to talk about the rest of us,” which is probably a good thing to remember when we critique others.

I don’t know anyone who likes seeing their face in a mirror with magnification!

The lives of humans are complicated, complex and contradictory. Many seemingly mainstream people can have odd ideas about the world. In hindsight, old fads look sinister; the basis of past popular culture rarely survives contemporary scrutiny.

Political correctness changes with the wind and whims. A thorough background check unearths activities that are at the same time sublime, ridiculous and embarrassing. So, if we are going to name public buildings after people, we should be very careful not only in who we choose to honor, but how we choose to judge them.

We must also make certain that judging ourselves by the same standards does not reveal more than a measure of hypocrisy, sanctimony, and pretense.

Rather than naming buildings after people, whether they are dead or alive, I have a much better idea. Perhaps we could name our edifices, parks and public squares after non-violent animals, flowers and friendly fruits and vegetables.

The Alabama State Capitol already sits on Goat Hill, and the University of Alabama has the Rose Administration Building; both provide precedent for my proposal.

Why not continue this trend for other buildings?

Who could be offended by Hippo Dining Hall, Squash Hall of Justice or Daffodil Dormitory? If we move away from recognizing humans and instead honor non-flawed animals and plants, we will have one less issue to divide us and hand sign manufacturers a real boon to their business.

Happy April Fools’ Day to you all!

Will Sellers is an Associate Justice on the Supreme Court of Alabama.

5 months ago

Justice Will Sellers: Liberty of conscience didn’t come easy


We take freedom of conscience for granted, but, 500 years ago, accepting and practicing beliefs outside of the mainstream was deadly.

The 1521 Diet of Worms was a legislative gathering held in Worms (one of the oldest cities in Europe) to consider Martin Luther’s theology.

The stakes were extraordinarily high as Luther, a mere monk, parried with the leading Roman Catholic scholars of his day. The ramifications of this meeting, while couched in religious terms, had clear political underpinnings. So much so that Holy Roman Emperor Charles V presided over the “meeting”, which allowed the trappings of his office to validate the ultimate decision.


Martin Luther’s heretical writings were to be publicly reviewed and examined; and while he was given safe conduct to attend the diet, that he was a heretic was a foregone conclusion. The issue before the diet was not the persuasiveness of Luther’s argument, but whether he would recant.

When Luther appealed to his conscience and argued that his conviction about his beliefs was firm and not subject to change, he set himself in the crosshairs of the 16th century religious and political establishment. Heresy and blasphemy were capital offenses. Keeping the doctrines of the Church pure and undefiled was taken quite seriously and anyone advocating a different belief system was considered an outlaw with no legal process available for protection.

Inappropriate beliefs about religion might lead others to perdition, so political power was enlisted to stop errant beliefs and prohibit any doctrine that was not officially sanctioned. But mandating beliefs or emotions fails to consider human advancement in rationally considering ideas, accepting some while rejecting others and developing a personal system of faith and knowledge.

The most critical idea inadvertently let loose by the Edict of Worms was that in matters of faith, people could think for themselves and choose a belief system appealing to the conviction of their conscience. While it would be easy to accuse the Holy Roman Empire of attempting to eliminate competing faiths, protestants held an equally monolithic view resulting in religious wars that seemed to miss the point of the faith each side advocated.

Protestants, while wanting to believe as they wished, were not willing to extend this liberality to others within their realm of influence. The Puritan poet John Milton would see his writings banned by Cromwell’s Commonwealth, causing him to write one of the first essays against censorship and in favor of Christian liberty. Said Milton, “Let Truth and falsehood grapple; whoever knew Truth put to the worse in a free and open encounter?”

But even in the new world, liberty of conscience was slow to catch on.

The Dutch Reformed governor Peter Stuyvesant attempted to limit the worship of Quakers within his jurisdiction. Refusing to submit, the residents of Flushing, New York published the Flushing Remonstrance, which advocated freedom of conscience not only for Quakers, but also for “Jews, Turks and Egyptians.”

Controlling beliefs and limiting ideas was nothing new for religion, but once institutional religion was de-coupled from government, limiting dissent came to be applied more frequently in the realm of politics. Often, when a new political regime became ascendant in countries without a history of personal freedoms, dissent was stifled, disagreements became illegal and hagiographic propaganda replaced information. One area on which politicians around the world agree is how much they despise opposing viewpoints.

Even in seemingly democratic countries with a history of freedoms supported by the rule of law, politicians simply hate criticism and will attempt to restrain if not eliminate it. Usually, this takes the form of mild disgust, but at times it can prove to be both personally and financially costly to oppose the ruling elite. We expect this from authoritarian governments, but when we find democratically elected governments engaging in censorship and limiting dissent, we should be troubled.

Consider the Reuther Memorandum of the 1960s. Though seemingly intended to advocate fairness and equality on the public airwaves, the Reuther brothers’ report was used by Bobby Kennedy to curtail political opposition with breathtaking success. But this is not a liberal vs. conservative issue, as regulation and limitation of speech are subtly advocated by all sides.

Even Founding Father John Adams had Congress pass the Sedition Act, making it a federal crime to speak, write, or print criticisms of the government deemed false, scandalous, or malicious. Numerous newspaper editors were arrested, and some were even imprisoned, under this act.

In recent memory, both conservative and liberal groups have advocated using government regulations to limit their respective definition of offensive speech. A journalist of the Jacksonian era, William Leggett, warned: “if the government once begins to discriminate as to what is orthodox and what heterodox in opinion, what is safe and unsafe in tendency, farewell, a long farewell to our freedom.”

Ideas, whether in opposition to or in support of a politician or political ideology, should never be restricted. It is much better to have ideas aired and let people decide. Bad ideas typically die a quiet death, but good ideas live on and become part of the progress of democratic government.

Rather than restrict poorly conceived ideas, it is better to let them be proclaimed loudly and watch them disintegrate on impact. Many years ago, Jerry Rubin encouraged student protestors to burn down a historic academic building. The more he shouted, the more the crowd responded, but no one was willing to act because no one was willing to torch a building to support academic freedom.

Bad ideas are like that. Giving someone a megaphone and unlimited time can be their undoing. Public debate, like sunlight, is a great disinfectant. Debate and open discussion force ideas to compete against practical reality, thus winnowing out the ill-conceived and fostering workable solutions.

So, 500 years ago, Western civilization moved toward allowing people to think and believe as they followed their convictions, but even Luther and his followers failed to see the broad implications of their success. Various events would slowly chip away at autocrats who forced their subjects to subscribe to approved beliefs.

Instead, in a triumph of experience and personal liberty, states moved toward greater freedom. While elites might try to restrict the thoughts of others and limit criticism, free societies realize that allowing open discussion and a variety of viewpoints lays the foundation for a community in which all rights are respected, and belief systems are not censored but allowed to sink or swim based on the truth of their results.

Will Sellers is an Associate Justice on the Supreme Court of Alabama.

6 months ago

Justice Will Sellers: The necessity of American leadership in a post-COVID world


Thirty years ago, the world seemed like a more stable place.

The United States was at the height of international prowess and had deftly negotiated with almost the entire world to oust Saddam Hussein from Kuwait. President Bush and his foreign policy team had built an international coalition to acknowledge that aggression against another sovereign state would not be tolerated. Even those countries that did not physically participate in the military coalition agreed to refrain from public dissent and allow the United Nations to live up to its charter.

The United States did not engage in appeasement or unilateral military action but initiated a precedent-setting effort that used international resolutions to express the world’s outrage. But unlike most U.N. resolutions, which aren’t worth the paper they are printed upon, the United States simultaneously organized a global military effort to give teeth to the resolutions.


At the same time, in another part of the world, East and West Germany unified. Long the focal point of Cold War tensions, the two Germanys agreed to combine and become a single country. This, too, was no small feat and required a significant amount of fancy footwork to make sure reunification didn’t ruffle the feathers of friend or foe.

A few NATO allies were not at all excited about a unified Germany and its impact on the balance of power in Europe; and, for obvious reasons, several Warsaw Pact allies shared similar concerns. But once Russia, the 800 lb. gorilla of the Warsaw Pact, gave its blessing, reunification could officially begin. Inasmuch as reuniting Germany was akin to the final prisoner exchange at the end of a war, Gorbachev consented to reunification and acknowledged the Cold War was over as the world moved in a vastly different direction.

Rather than dancing upon the grave of Communism, the United States engaged in careful, intentional diplomacy. It would have been tempting to replay the famous Nixon-Khrushchev kitchen debate, engage in triumphalism, and feed the narrative of American jingoistic swagger; but this was not the style of the Bush administration. When the Berlin Wall fell and Eastern Europe began to roll back the Iron Curtain, the Bush team worked behind the scenes to provide aid and comfort to the newly liberated countries, but care was always taken to avoid offending the Russians.

This polite diplomacy allowed the defeated communists to save face and find a new, less aggressive role on the world stage.

Russia had been a longstanding ally of Hussein’s Iraq, supplying it with weapon systems and financing the regime. One might expect that Russia would support Saddam and, perhaps, come to his defense, but Gorbachev chose first to broker a peace deal, which failed, and ultimately supported the conclusions of the U.N. resolutions. Perhaps most importantly, Russia stayed on the sidelines and did nothing to resist coalition forces.

This lack of action was unprecedented and signaled a complete change of tack by Russia. Unlike other Cold War hot spots, the Gulf War did not become a proxy war, but was a global effort; a real “united” United Nations coalition to act as a world police force to enforce the UN Charter militarily.

And it worked. Not only was there diplomatic consensus, but there was also military cohesion. The various and sundry coalition partners, with their myriad commitments and commands, worked together, which resulted in Saddam being both isolated by the international community and overwhelmed by a lethal coalition force.

The Gulf War was over almost before it began. The sovereignty of Kuwait was restored, Saddam was humiliated, and it was reasonable to assume his international thuggery was over.

In fact, with the end of the Gulf War and the reunification of Germany, the world community seemed on the brink of a new paradigm in international relations. The United States was the only superpower left standing, and, reluctant as Americans were to take on a leadership role, the Bush administration was on the cusp of achieving what others had longed for: a stable and peaceful world with international cooperation and a global consensus on the role and implementation of the rule of law.

The world clearly seemed to be moving away from sentimental regionalism and toward a global economy with greater freedoms from governments that appeared to be democratically elected. The communism attributed to Marx and Lenin had died under the weight of a competitive international economy. Even China was developing a hybrid political and economic system that seemed to embrace some facets of capitalism while maintaining state control.

Regrettably, this brave new world that appeared truly transformed was only a mirage. Even though the United States was at the pinnacle of power, its leaders quit before solidifying their gains. As a peace-loving republic, America cashed in her peace dividend before maturity and took a holiday from history.

Too embarrassed to be assertive and advocate for our values, U.S. leadership chose to lead from behind. They assumed, to our detriment, that if our nation throttled back its role, others would join us. Rather than use our overwhelming might to force change on recalcitrant nations that questioned liberty and freedom, they chose consensus over taking charge, finding the lowest common denominator for action.

Even though past actions indicated otherwise, our leaders chose to bank on the good intentions of other countries with no history of personal freedom or democracy, much less the rule of law. Saddam was still in power, and like any good despot, he refused to accept defeat, consolidated what he had left, and continued to abuse his people with his power.

The U.S. foreign policy apparatus chose to look the other way when China reduced freedoms and clamped down on peaceful protests by killing demonstrators. Perhaps worst of all, Russia was allowed to drift away from true democratic reform and once again embrace autocratic rulers who used the trappings of democracy to gain power and whittle away at citizen self-rule.

Rather than continuing to advocate for American values on the international stage, our leaders were content to be part of a timid chorus rather than standing as a loud voice for reason, practical diplomacy and strength.

The world is safer when America is fully engaged. The international community needs a strong America to provide leadership and, when necessary, to use the overwhelming might of its military. Our foreign policy aims must be clear and focused on self-interest and the significant implications of trade to expand our economy. With enlightened self-interest, America can provide global leadership and peace to the world. The United States can help create a stable world by advocating for a global framework which allows individuals within nation states to pursue their respective political economies, peacefully and under rules of enforceable equity.

Will Sellers is an associate justice on the Supreme Court of Alabama.

6 months ago

Justice Will Sellers: The future of America is undiminished by circumstance


It was President Harry Truman who said, “The only thing new in the world is the history you do not know,” and King Solomon, perhaps the wisest man ever, stated pretty much the same thing a few millennia ago when he recorded in Ecclesiastes 1:9 that “there is nothing new under the sun.”

Recent studies have shown that people look fondly upon the era one to two decades prior to their birth as the “good old days,” but few take time to really examine what made those days seem so good and why we regard times in which we never lived as better than the present.


Viewing current events through the lens of times that are a distant memory can yield many disappointments, but recognizing that the past was flawed and often filled with misery can offer comfort that the future might not be as dim as we imagine. If, as Solomon intoned, there is nothing really new under the sun and the only new things require greater learning and studying on our part, perhaps a good rule of thumb would be to worry less and study history more.

Presidential elections are contentious and have been since the founding of our republic. Don’t believe me? Read Winston Groom’s last book’s discussion of the 1800 election! Some elections are more spirited than others, but every four years there is an opportunity for hope, disappointment, disgust and even advancement. While it is true that elections have consequences and can clearly change the trajectory of our country, history shows us that most changes are not nearly as bad as we fear or as good as we had hoped. Some changes that occurred years ago, while viewed as earth-shattering at the time, now look fairly benign, as we have come to accept them and, in retrospect, view them as appropriate, even insignificant.

In the almost 245 years of our country’s existence, there have been probably a dozen presidential elections that stand out as marking the end of an era — some might say “error” — and the beginning of a new phase in the American experiment. But because America is an open society with freedoms that many across the globe envy, the clash of divergent viewpoints is not only helpful, but good.

If you believe that the arc of the universe is long and bends toward truth, then the testing of ideas politically and otherwise is necessary for progress and keeping the country as dynamic and free as possible. If we believe that right eventually emerges from conflict, then there is little to worry about.

Starting almost 100 years ago, totalitarian systems were the rage. Both communism and fascism — two sides of the same coin — were seen as eclipsing liberty and democracy, jettisoning the best of Western liberating thought. But these false ideologies could not stand the test of time, and over the last 30 years, systems granting greater freedoms have emerged from former dictatorial regimes. Those authoritarian governments failed precisely because they were rooted in lies and deceptions rather than the firm foundation of truth and liberty.

Truth is not only the best defense, but it is also the most buoyant and will eventually float to the top of any tempest. Time, though, is the magic ingredient; things need time, and truth needs space to resist bruising, bullying and battering. Lies, deceit and fakery carry with them a sharp razzle-dazzle that distracts us from seeing the truth, but there is a point, after time, when truth emerges as the victor.

So, too, are laws of nature, economics and physics unchanged by feelings or perceptions. People may like to think that the magic of government will suspend all these laws, but that never happens. Perhaps for a season there is an appearance of suspension, but that is really a recalibration to equilibrium anticipating a collapse which validates the offended law.

The British discovered this long ago, when sterling was the measure for global trade and commerce. But as the British Empire and economy contracted while monetary policy expanded, the value of the pound collapsed until the International Monetary Fund had to save the currency. So, while you may spend more than you take in by increasing the money supply, after a time the economy is impacted, and spending policies must be reconciled with the resulting deficits.

This maxim proves true in other areas, too, as there is only so far anyone can go without incurring the restrictions of practical laws that explain the universe as much as they limit government.

The critical thing for any country is the flexibility to withstand change and adjust to violations of these practical laws. The expansion of liberty and freedom of expression is critical to maintain a vibrant political system that marks forward progress based on a consensus from representative government, but is restrained by the good ideas from minority opinion so that, on balance, we are never overextended.

The future of the United States is as bright as we allow it to be, so long as we promote constructive debate and maintain an open dialogue in which a position may be argued, no matter how vociferously, against diametrically opposed ideas with respect, dignity and decorum. Allowing the clash of ideas is critical so that policies grounded in practical experience are expressed and implemented; but, when failure occurs, other views are constantly considered to keep the country intact and moving forward.

I remain confident that our brightest days are ahead of us, and the promise of America continues to burn in the hearts of freedom lovers around the world.

Will Sellers is an Associate Justice on the Supreme Court of Alabama.

7 months ago

Justice Will Sellers: History’s most significant event casts a long shadow


The late great U.S. Supreme Court Justice Antonin Scalia often told the story of his oral exam at Georgetown University. To graduate as a history major, he was required to answer questions from the faculty to demonstrate the sufficiency of his education and entitlement to a diploma.

Asked to cite the most significant event in history, he thought it was a softball question and picked an event he considered important. Wrong! His inquisitor corrected him and disapprovingly stated: “No, Mr. Scalia; it was the Incarnation,” which means the birth of Christ, the Son of God as both fully divine and fully human.

So it is that this time of year the seminal event on the Christian calendar is celebrated by the faithful and unfaithful alike. Some merely acknowledge it in their actions and time off, while others take the religious significance to heart and fully participate in the seasonal countdown of Advent.


And while the secular celebration with all its associated trappings now largely undermines the religious significance, even the most secular cannot deny its importance and the overarching ideas it spawned.

The Incarnation was a religious hurricane so powerful that it spun off secular tornados that impacted the world in very subtle ways.

We see these secular manifestations most notably in buying, giving and receiving gifts; God purchased our redemption by giving the gift of his Son; we replicate His example in a small way by the sacrifice of buying and then giving to others. But the secular fallout occurs in other things, too, that while hardly religious in themselves, are directly connected to the Incarnation. In fact, it would not be a stretch to conclude that most high-minded ideals trace their roots here.

Think of self-sacrifice, scrupulousness, generosity, service to others and the unity of family. While these virtues are secularized, they are lauded as worthy, important and so critical to civilization that they form the basis for moral and character education. So even if the religious aspects of Christmas may not be acknowledged, the secular fallout, being so intertwined, cannot escape the cultural ramifications of the Incarnation.

The Incarnation as a historical event is so widely known that few movies recount the story, but, never one to miss a beat, Hollywood, and before it other mass media, capitalized on the themes of Christmas. Initially, the obvious religious metaphors were depicted in some books and magazines, but now films that open in December achieve commercial success by harking back to themes based on the Incarnation.

Be it nostalgia, sentimentality, or affairs of the heart, films during Christmas tend to bring people together and generally present a morality play focused on one or several virtues. And, while the religious overtones may be completely obscured, they are there nonetheless and easy to spot. People want optimism and hope that the future is brighter and that the new year will be better than the past — more so than ever before in this time of COVID. But films have helped us experience these things emotionally and provided us a means to feel happy at least for the duration of the show.

To feel good and confident even for a moment is the spark of commercial success for films. Frank Capra, the great film director of “It’s A Wonderful Life,” used this formula well and wisely throughout his long and storied career. But he didn’t direct films that told a good story and gave a momentary emotional rush merely for box office profits. Rather, he believed in fundamental values, permanent things and timeless ideals and considered it his duty to give hope, provide optimism, and exalt the individual human actor over the various manifestations of greed expressed in impersonal bigness, not only in government, but also in business, religious institutions and communities.

Capra’s films achieved both artistic and commercial success because they gave Depression-era audiences something with which to identify. Capra’s themes used the background drag of the Great Depression to give his hero an obstacle that was overcome not solely by individual effort, although that always played a critical part, but also by the collective efforts of friends who inspire, encourage and become part of a unified effort to defeat evil.

Whether it is George Bailey v. Mr. Potter, Mr. Smith v. Sen. Paine, Longfellow Deeds v. Lawyer Cedar or John Doe v. D. B. Norton, each conflict created a crisis of conscience and a crucial decision requiring action. But Capra’s films show that action is not unilateral and is, instead, aided by the love, support and encouragement of friends.

Overcoming and achieving was not a singular endeavor, but a subtle spiritual effort where virtue ultimately triumphs. And the success of the hero gave audiences a renewed sense of purpose; that no mountain was too great or hurdle too high, but success in the defeat of adversity was possible by rightness of cause, individual commitment and assistance from others.

The hilarity of the film “You Can’t Take It With You” comes at the expense of stereotypes of corporatist drones, corrupt officials and unanchored peons. And, in showing the conflicts on every level, Capra in many ways uses the fruits of the Incarnation to not only entertain, but to give hope, encouragement and purpose.

So many who saw his films had little hope and diminished prospects. His films lifted people up, marginalized the mean spirited, and showed what true friendship meant and how happiness in and of itself is substantial and more important than material things, social status, or political influence.

If the Incarnation seems passé, look around you. This one event spawned a completely new era that permeates most of the things we do. And, if you look at successful entertainment, such as movies, commercial achievement is often directly related to expressing the secular aspects of the Incarnation to show the entirety of a challenged, but hopeful and confident humanity.

Frank Capra’s films do this, and other successful Christmas films follow suit.

Merry Christmas!

Will Sellers is an Associate Justice on the Supreme Court of Alabama.

8 months ago

Justice Will Sellers: The enduring legacy of Margaret Thatcher


Thirty years ago this week, the longest-serving British prime minister of the 20th century resigned. Margaret Thatcher, having governed since 1979, saw her leadership challenged, but rather than continue to fight, she was gaslit into believing she was losing her grip on her party and would lose her office in an embarrassing vote.

None of that was true.

In fact, the very men who rode to leadership positions on her coattails and hid behind her skirts during controversy allowed their greed for power to debase their loyalty to the Iron Lady. Dejected, she resigned and thus, quietly exited British politics.

Prior to Thatcher’s leadership, Britain was in decline and, by all economic measures, sliding into second-rate status. Rather than controlling its own financial destiny, Britain needed the International Monetary Fund to help shore up her accounts. Socialism dominated, with anti-capitalist trade unions and nationalized industries weighing down any real economic growth.


But in the winter of British discontent, Thatcher emerged to lead the minority Conservative Party into the majority. For more than a decade thereafter, she was the face of the party, and even when she left the scene, the imprint of “Thatcherism” would remain a dominant political ideology.

Thatcher’s political program relied upon a simple appeal to the British sensibilities. She believed in limited government, liberty of the individual, and the rule of law. But rather than relying solely on rhetoric, she acted on her beliefs and ushered in a golden age that changed not only Britain but the entire world. Indeed, the world she inherited in 1979 stood in stark contrast to the world in 1990. She caused the contrast.

Unlike many political leaders who espouse high minded principles, she pursued hers with what some considered reckless abandon. Thatcher took significant steps to push back the suffocating hand of state control and return the economy to a true free market. Government intervention was replaced with individual responsibility and human action.

There are five significant events revealing what Thatcher believed by how she acted. And the impact of her actions had ramifications that still affect both British and international politics.

Thatcher organized her government to firmly oppose state-sponsored terrorism and declined to allow the cloak of diplomatic immunity to cover subversive activities. When the Libyan Embassy in London was used to harbor snipers to shoot protestors and ended up killing British policewoman Yvonne Fletcher, the prime minister terminated diplomatic relations and used special forces to clear the embassy and send the terrorists masquerading as diplomats packing.

She would similarly send a large contingent of Russians home when it was clear their embassy was a cover for supporting domestic terrorists and spying on military and industrial targets. These actions rankled some in the diplomatic community who wanted her to be more deferential, but by sending a message of British resolve, she earned the respect of the world community.

Perhaps the one thing cementing Britain’s return to power was the Falklands campaign. While the islands were only a small British outpost near Argentina, Thatcher recognized that the Argentine invasion was not simply a threat to the islanders but also a challenge to international British interests. Unwilling to concede anything, she ordered the unequivocal liberation of the islands and effectively threw down a marker that she would defend and protect British subjects and interests anywhere at any cost.

Accepting the Argentine invasion would have been the easy course, but while some in Britain were embarrassed at her saber-rattling and projection of military power, the vast majority saw her actions as patriotic and a reminder of the former greatness of empire. After the Falklands victory, Thatcher’s popularity soared, and when a general election was called, she achieved a landslide victory, establishing a Conservative majority that lasted until 1997.

On the domestic front, Thatcher knew from the beginning of her administration that she faced a reckoning with trade unionism, whose power had grown so strong and influential that strikes could paralyze the country. But rather than take the unions on directly, the wily strategist first worked to pass laws that prevented union corruption and inappropriate strikes.

Once those laws were in place, she realized that the first challenge would come from the coal miners’ union. At that time, coal miners in Britain were a large part of a socialist network that had grown in influence because coal was so critical to energy and the economy. But a minority of the unions were not part of this network, and Thatcher allied herself with them, stockpiling coal to outlast the socialists.

So, when the coal miners decided to strike, she was prepared: first with lawsuits that deterred sympathy strikes from other unions by exacting fines, and then with the resources to close unprofitable mines and wait until the unions were unable to hold out. The coal miners were the first step, but gradually she reduced the unions’ economic stranglehold and began to privatize state-owned industries, which made the British economy more dynamic and competitive.

With an established Church, the parameters of separation of church and state are not debated in Parliament; in fact, the prime minister was involved in approving ecclesiastical promotions. Unlike other politicians, who rarely addressed religious issues directly, Thatcher had no such reticence. When she became alarmed at the liberal bent of the established Church, she found an opportunity to explain to the professional clergy exactly how she viewed their role in society.

Addressing the General Assembly of the Church of Scotland, she boldly stated, “Christianity is about spiritual redemption, not social reform.” She chastised church leaders for failing to appreciate capitalism and the spiritual benefits it provided. It is hard to imagine another political leader with the intestinal fortitude to attend a denominational gathering, articulate a theology and take ministers to task for, in essence, failing in their mission.

Her speech, known derisively as “The Epistle to the Caledonians,” is readily available on the internet. Some Sunday when virtual church is off, watch Thatcher explain her version of Christianity and see her sense of faith boldly defended and publicly exhibited.

Perhaps the one thing that both defined Thatcher and also led to her resignation was her idea of Britain’s place in the European community. She viewed Europe with an eye toward free trade and the removal of regulations and restrictions on the free flow of goods and services. She saw Europe not as a melting pot where states and people lose their currency and cultural identity, but, rather, as a mosaic where nations and people maintained their unique cultures within a framework of collaboration centered on trade.

As the idea of a united Europe moved toward a common currency, democratic socialism and a heightened regulatory environment, Thatcher stood her ground and refused to participate. Her speech to the College of Europe at Bruges explained succinctly her concerns and her vision of developing a strong capitalist Europe. Like her speech to the Church of Scotland, this speech, too, is worth a listen as it is prescient considering the current status of Europe, Brexit and NATO.

Lady Thatcher’s political demise was ushered in by disloyal cabinet members who were willing to subjugate British hegemony to an amalgamated Europe. Nothing they would say or do could detract from her legacy. In her retirement as she traveled to the former Warsaw Pact countries, throngs of people venerated her as the force that helped liberate them from Soviet domination.

If Thatcher was not honored in her own country, the voices of the children freed from totalitarianism offered honor enough to the Iron Lady who held to her principles, saved her country from irrelevance, and ushered in a new world order based on liberty of the individual and the rule of law.

Will Sellers is an Associate Justice on the Supreme Court of Alabama.

9 months ago

Will Sellers: In defense of the Electoral College


I came of age politically with the 1968 presidential election. Alabama Governor George Wallace was running as an independent against Richard Nixon and Hubert Humphrey. My parents were Nixon supporters, and I, their five-year-old son, hopped on the Nixon bandwagon with gusto. The dinnertime conversations in the month preceding the election were all about whether Wallace’s third-party candidacy could work.

This all fascinated me, so I asked my mother to let me watch her vote on Election Day. She agreed, but to my dismay, when I joined her in the voting booth, I did not see Nixon, Humphrey or Wallace listed on the ballot. This made no sense to me; I thought we were there to vote for Richard Nixon. My mother then explained that we didn’t vote for the presidential candidate directly. Instead, we voted for men and women called presidential electors. These people were well-regarded and appointed for the special privilege of casting the deciding votes in presidential elections. This system seemed out of place to me, because in every other election the candidates were listed by name on the ballot. Why not for president? Why should my mother vote for nine people, who would then vote later for president, instead of voting directly for the president? This was my first encounter with the Electoral College. It would not be my last.


The first electoral college was a medieval construct dating back at least to the 12th century, when specific princes were chosen to elect the Holy Roman Emperor. They were influential noblemen, who, because of the importance of their respective kingdoms, were given the hereditary title of “elector.” After the death of the emperor, they met, much like the College of Cardinals, to choose a successor. Whether this idea influenced the deliberations of the Constitutional Convention is speculation, but, like most of the other aspects of the Constitution, the mechanics of the new government were based on historical facets of self-government. The new American nation was built on traditions of representative government expressed in the English parliamentary system, the organization of Protestant church government, and the colonial experience with various local governments in the New World.

Important questions necessarily arose during the Constitutional Convention concerning the process of electing the president. How exactly would a president be chosen, and to whom or what would he owe allegiance? Some advocated for election to take place in the House of Representatives, or in the Senate, or even in the several states. The obvious problem with these proposals is that they would create an axis between the president and the electing body. If the states elected the president, then the larger, wealthier, and more populous states would receive greater attention and more favorable treatment by the executive branch than would the smaller, less populous states. A similar imbalance of power would occur were the president chosen by the House or the Senate. Thus, the mechanics of electing the chief executive required balancing various interests to give the executive branch the requisite independence from other political bodies, while maintaining co-equality.

According to the chosen scheme, each state would appoint “electors” based on the number of House and Senate members comprising the state’s congressional delegation. These electors were appointed for the sole purpose of electing the president, and a simple majority of their votes would decide the election. This created another means by which the spheres of Congress and the federal government were balanced and divided from that of the states. The Constitutional Convention viewed electors as not necessarily aligned with a faction, but as citizens of honesty, integrity, and political acumen.

Originally, electors voted for two people; the person with the most electoral votes became president, and the runner-up became vice president. Flaws in this system became evident with the presidential election of 1796, when John Adams was elected president and his archrival, if not nemesis, Thomas Jefferson, was elected vice president. Four years later, Jefferson and Aaron Burr received the same number of electoral votes; neither had the required majority. This unworkable situation was remedied by the 12th Amendment to the Constitution, which prescribed that electors would cast separate ballots for president and vice president. Later, as the two-party system grew, state legislatures, as they were constitutionally permitted, allowed electors to run as proxies for the presidential and vice presidential party nominees.

For at least the first 100 years, the system worked well, and, other than the 12th Amendment, no major attempts were made to alter the process of electing the president and vice president. Several times, the election was submitted to the House of Representatives after the electors failed to achieve a majority vote for president. For example, in 1824, the election was submitted to the House, where power plays resulted in the election of John Quincy Adams, though Andrew Jackson won significantly more of the popular and the electoral vote. Rutherford B. Hayes, a Republican, lost the 1876 popular vote to Samuel Tilden, a Democrat, but became president because he had prevailed in the electoral vote, though voter fraud in some jurisdictions seemed certain.

Many Democratic candidates running for federal office embraced the idea of abolishing the Electoral College, not least Sam Rayburn, who, in his first congressional election in 1912, advocated electing the president by popular vote. If there was any momentum for this aspect of the Progressive movement, it lost steam as other, more critical issues advanced.

Today, the constitutional method for electing the president is under siege. The result of the 2016 election — with Donald Trump winning the presidency despite losing the popular vote — led pundits and politicians to call for the presidential election to be based on the popular, not electoral, vote. But lamenting results that saw two presidents in recent memory fail to win the popular vote obscures the effect that abolishing the Electoral College would have on a national campaign. A presidential campaign aimed at achieving a popular vote majority would completely ignore most states and focus, instead, on a few populous states containing the nation’s largest cities. This urban-centric strategy would silence the political voice of most regions of the country.

The Electoral College guarantees that successful presidential candidates will appeal to large swaths of the American landscape, and that the president himself will reflect the diversity of various regional ideas. It orchestrates the American chorus so that every section of the country will be heard by a serious presidential candidate. We might not always like the outcome; it is always frustrating when your candidate loses, especially if he or she won the popular vote. Nevertheless, the remedy is not to change the rules, but rather to master the nuances of the rules in order to organize a presidential campaign so that it attracts supporters — and votes — from all portions of the country.

My personal quest to understand the Electoral College better led to my service as an alternate elector in 2000 for George W. Bush. The controversial nature of that election focused national attention on each state’s canvass of presidential electors. The practice of scrutinizing each elector, and the attempts made to shake loose a few electors in order to change the outcome, caused the question to be asked again: Is the Electoral College the right system for modern America?

I served as a presidential elector in 2004, 2008, 2012 and 2016. My initial experience as an elector was that no one, certainly not the media, cared about the Electoral College. Perhaps some cub reporter was sent to cover the meeting of the electors, but that was about all the publicity we garnered.

That changed dramatically in 2016. Starting about two weeks before the electors met, I received thousands of letters from people across the country asking me, if not begging me, to change my vote. It did not matter to them that I had pledged to support my party’s nominee. I was even lectured by legal scholars about how my pledge was not actually binding. Several people sent me copies of the Federalist Papers, the Constitution, and even local petitions. Others left voicemails that, looking back, I wish I had saved. Few, if any, of these communications expressed any mature understanding of the American electoral system. But, in a way, the volume of communication, at least as compared with other years, showed that the role of an elector now seemed to matter again to the American people.

In 2020, if there was one issue each of the initial contenders for the Democratic nomination agreed upon, it was the necessity of abolishing the Electoral College and replacing it with election by popular vote. But the consequences of this change are largely ignored. To be consistent, candidates calling for the end of the Electoral College ought also to call for the end of primary elections. Since the election of the president is the only national election, eliminating the Electoral College would change each party’s strategy for victory. It is akin to amending the rules of football so that a score is obtained not by touchdowns or field goals but by first downs.

Again, a popular election of the president would reduce the need for a diversified platform; the candidates would favor metropolises and ignore the heartland of America. Minority voters would get pushed aside, since only the votes of the majority would matter. The executive branch would be weakened as the center of federal political power shifted toward Congress. While the president’s agenda would reflect only the interests of the 51% who elected him, Congress would continue to appeal, at least in theory, to the entire nation. And who’s to say that the president would need to win a majority of the popular vote? Would we have runoffs, ranked-choice voting, or just “first past the post,” where the top vote-getter in a crowded field takes the prize?

While the Electoral College can sometimes appear to achieve a skewed result, we must remember that it has served America well by providing a political balance to the three branches of government. Directly electing a president by popular vote sounds great, but a deeper examination reveals the toll that it would exact upon American republicanism. Instead of the “winner take all” system that most states use, perhaps adopting the Maine and Nebraska models would be an effective compromise. Under those models, two electoral votes reflect the statewide majority of the presidential ballots, while the rest of the electors are chosen by congressional district preference. This method diffuses power and is perhaps something the Framers might applaud, though the change would have to be accomplished one state at a time, as the selection of electors is still very much a state and not a federal function.

The Electoral College has weathered many storms, but the nation is still together and is still debating the limits of self-government. All told, it’s a pretty good track record.

Will Sellers is an Associate Justice on the Supreme Court of Alabama.

Editor’s note: This piece originally appeared in City Journal.

9 months ago

Air superiority then, space superiority now — The Battle of Britain 80 years hence


Eighty years ago this week, hurricane season ended when the Royal Air Force won the Battle of Britain by stopping the Nazi war machine at the edge of the English Channel. Before the summer of 1940, Hitler had derided Great Britain as a nation of shopkeepers. Göring’s seemingly superior Luftwaffe pilots were outdone by the young pilots of the RAF, aided by friendly forces — not the least of which was a squadron of Polish pilots. They showed the world that the Nazi juggernaut could be countered through perseverance, aided by the novel design of quick and lethal airplanes: the Spitfire and the Hurricane.

Churchill named this battle when he declared after Dunkirk that with the conclusion of the Battle of France, the Battle of Britain would begin. Unlike past battles, the critical objective was as amorphous as it was strategic: the achievement of air superiority. It was a testament to the fact that warfare had changed forever, tilting the scales in favor of technology over brute strength.


Even Hitler and his retinue of yes-men knew that subjugating Britain would require a risky and complex invasion. The English Channel, though relatively narrow at some points, served as a giant moat that required amphibious landings on slow-moving vessels, which would be vulnerable to attack from above. Nazi control of the air would be the key to a successful invasion. With proper preparations for a seaborne invasion many months out, Göring pushed for an air campaign, and Hitler approved.

The Luftwaffe’s first objective was to destroy RAF airfields, but Luftwaffe planes were not designed for this mission, and their pilots — though experienced — were no match for the RAF’s pilots in Spitfires and Hurricanes. These planes had unmatched maneuverability, and home-field advantage played an equally important role. The British had a superior early-warning radar system that enabled them to plot the likely flight path of incoming enemies and to scramble their fueled and fully armed planes efficiently. Over Britain, each downed German represented not only a lost airplane but also a lost pilot. Maintaining air superiority was a fight for survival, and the British pilots knew that the fate of freedom for their island, and perhaps for civilization, rested on their shoulders. They turned the tide of the war in fighting, as Churchill noted, “undaunted by odds, unwearied in their constant challenge and mortal danger.”

While the concept of air superiority was initially academic, the Battle of Britain proved it critical to modern military success. Since then, the need for air superiority has remained unquestioned. A country might not win with air superiority alone, but without it, defeat was all but guaranteed. The use of airpower to master the skies has been the first order of business in every major conflict since World War II. Even today, with the development of defensive missile shields and the capability of intercepting incoming aircraft and missiles, air superiority is and will remain a critical objective in any conflict. But air superiority is starting to give way to space superiority.

As we become more and more dependent on satellites, and as human activity in space becomes less of a novelty, controlling space will be critical not only for commercial and economic success, but also for global stability and the defense of our nation. The nation that controls space will control the destiny of the entire world. To be dominant in space is to be dominant period, and the dominating nation will have the final say over many aspects of our lives.

Those who would object to the militarization of space do not understand, or refuse to see, today’s reality. The activities of the Chinese Communist Party (CCP) in space are similar to those of the nations who sought to control the sea in the 19th century and the air in the 20th century. At present, these activities are largely unchecked by other nations and international organizations.

There was a time when the United Nations was capable of limiting space to peaceful uses. As with the control of nuclear weapons, the United Nations provided a means of achieving an international consensus that limiting weapons in space was beneficial for all nations. But, as with any large organization attempting to achieve consensus among diverse groups, the only real agreement among nations became the lowest common denominator. Thus, UN restrictions on the militarization of space are weak and ineffective.

This void of international leadership is being filled by a resurgent communist China, intent on achieving world domination — a long-term national goal. With few international limitations, the CCP is seeking space superiority to impose its ideas on the world and thereby supplant civilization’s shared liberal principles. The UN has been unwilling or simply unable to check China’s dreams of space superiority. While the CCP has yet to obtain the domination it seeks, it is clearly on track, pursuing covert military missions and developing its own GPS system to aid in obtaining space superiority.

The United States cannot let this happen. Students of history know that many of the great and terrible military conflicts could have been prevented or mitigated with proper foresight and preparation. Unless the United States acts soon to check CCP aggression in space, we may have extremely limited choices in the future.

Our new Space Force must explain the seriousness of this threat and develop strategic plans to protect space from the domination of any one country. This grand effort will require allies who not only understand the threat, but who are financially able to join with the United States to dominate space for peaceful purposes. The free world’s shared cultural and civic traditions could form the basis for ensuring that space can never be dominated by one country.

During World War I and in the following decades, Churchill stressed the importance of developing radar, the tank and the airplane. Without these developments, the Battle of Britain would have ended much differently. As we celebrate the 80th anniversary of victory at the Battle of Britain, and as we understand the strategic necessity of air superiority in protecting the island nation from foreign invasion, we should recognize the strategic necessity of space superiority today.

The United States and her friends cannot allow a country that is utterly opposed to freedom to control space and, in turn, Earth. The free world must develop space first and create enforceable laws to allow space to be an extension of the liberty we currently enjoy. In order to do that, we must overhaul our outdated legal regime concerning the development and deployment of space technologies, support the private development of space properly, and remove the bureaucratic barriers hindering important breakthroughs. We must not surrender space to totalitarians who would use it to subjugate free peoples around the globe. If we heed the call to action and engage in this new endeavor, we can ensure that the limitless possibilities of space are secured for future generations.

Will Sellers is an Associate Justice on the Supreme Court of Alabama.

10 months ago

The vice presidential debate that never was


Over the last few election cycles, we’ve become accustomed to seeing the candidates for vice president square off in a debate. Perhaps this acknowledges the greater responsibilities performed by modern-day vice presidents. I’ve always regretted that 60 years ago, vice presidential hopefuls Lyndon Johnson and Henry Cabot Lodge, Jr. didn’t debate. It would have been a show of contrasts, and with the election so razor-thin, it just might have made a difference. I’d like to imagine the refined and striking Cabot Lodge gracefully walking onto the debate stage and standing adroitly behind the podium, poised and ready for repartee.

The scion of a blue-blood Boston family, Lodge was a dedicated public servant, having served his country in the House and then in the Senate, as his family had done for generations. While he lost his Senate seat in 1952 to Jack Kennedy, he continued to serve his country as Ambassador to the United Nations. In this role, he became the embodiment of Eisenhower foreign policy.

In stark contrast, think of Lyndon Johnson: lanky and awkward, not especially polished, with suits that weren’t precisely tailored. If there was another side of the tracks, that is where Lyndon grew up. The hardscrabble life he embodied, his limited education and his inarticulateness were traits even the Kennedys described as “hick” and “cornpone.” Johnson’s entry into politics was less a calling to public service and more a way out of insignificance. In fact, he won his Senate seat by a mere 89 votes; rumors of fraud haunted him, earning him the nickname “Landslide Lyndon.” Lodge, by contrast, entered the Senate with a decisive victory, and any thought of impropriety was unfounded.


But in 1960, Johnson was majority leader of the Senate and not only possessed power but exercised it as absolutely as his mentor Sam Rayburn did in the House. Johnson wielded enormous influence. Lodge had been in the minority for most of his tenure in the Senate. He too wielded power, but his was a mastery of the nuances of the rules and of personal persuasion that allowed him to pass legislation that was by its nature bipartisan. Using the rules to impose majority rule is easy when you have the votes. Johnson’s role as majority leader was to corral his fellow Democrats into line and to balance the more progressive factions of the party from the Northern states against the conservatives from the South. That he did this well is evident in how the Senate operated. Lodge’s task was harder; he was neither in the majority nor in a leadership position, and he had to weave and bob gracefully through the Senate’s rules and personal relationships to be effective.

If the debate featured questions about military service, Johnson would have been embarrassed. While he wore his Silver Star lapel pin, the story behind his valor had less to do with action in combat and more to do with political influence. If competent journalists had probed the record and the incident further, they would have discovered that, contrary to Johnson’s recitation of his heroism, he had in fact been on the ground in a malfunctioning B-26 when other planes in the same squadron were attacked by the Japanese. While Johnson was supposed to be an observer on a bombing run over Lae, his plane developed engine trouble and had to return to base. Somehow Johnson created a myth that he had engaged the enemy and taken actions of such magnitude that he was awarded the Silver Star. It would have been uncomfortable, for sure, if the Swift Boat Veterans for Truth had had their sights on Johnson. Lodge, on the other hand, had the distinction of being the first sitting senator since the Civil War to resign from the Senate and serve on active duty. And Lodge’s service was not in the rear echelon; he was engaged in combat and even captured a German patrol. He went on to assist General Devers in France and was a liaison officer to the Free French commanding general. Any questions about military service and any comparison of war records would have favored Lodge on every level. For him, active duty meant just that, and his medals and citations were real and deserved. Even after the war, he continued to serve with distinction in the reserves.

While Johnson was classified as a Southerner, he was much more of a populist and New Dealer. For a Republican, Lodge was very progressive and did not find many aspects of the New Deal objectionable. Probably ahead of his time, he was more of a globalist and understood the need for the United States to be and stay involved in world affairs; foreign affairs was his bailiwick, and he had ably advocated U.S. policy in the United Nations and sparred frequently with Russian disinformation. Johnson was more of a domestic policy man, and his view of domestic policy was finding programs with large price tags that could be implemented to benefit his family, friends and supporters. Not coming from money, Johnson used his power to create an empire of radio and TV stations that somehow escaped effective regulation by the FCC. If Lodge had a self-interest, it was advocating for the United States. And his advocacy wasn’t always appreciated by American allies, as when he took the British and French to task over the Suez Canal. Communist countries especially resented Lodge’s unashamed dedication to peace and freedom and his advocacy for stability and against hostilities.

But the one policy that created the starkest and most significant divide was race relations and civil rights. Had there been a debate, the money question, the one sure to garner the most viewers, would have come when the moderator asked each candidate for his position on civil rights. The question would have been a trap for Johnson. He had voted against every civil rights bill during his entire time in federal office. While the Kennedy team pointed to his help in passing the Civil Rights Act of 1957 to assuage liberal constituents, most people knew that Johnson had watered down the bill so much that it was only window dressing and had limited impact. Lodge was a progressive on race and had supported any number of bills to end discrimination and enforce desegregation. On the campaign trail, he even suggested that he was in favor of having a black man in the cabinet. In fact, it was Lodge who suggested that Ralph Bunche would make a wonderful ambassador to Moscow. Such progressive thinking in 1960 was hardly well-received in all quarters.

So, if a debate had taken place, anyone viewing or listening would have seen two different visions of American progress. But the debate didn’t occur, and we can only imagine what might have happened. Oddly enough, Johnson’s record on civil rights was embarrassing to the Kennedy clan; and, while Nixon was a strong supporter of civil rights, he had to distance himself from some of Lodge’s more progressive ideas.

Knowing how close the 1960 election was, and given the allegations of voter fraud in Chicago and Texas, had Johnson and Lodge debated, the election might well have had a different outcome.

Will Sellers is an Associate Justice on the Supreme Court of Alabama.

11 months ago

75 years after ending World War II: Celebrating a lasting peace


Seventy-five years ago today, World War II officially ended. After six years of global conflagration, the guns fell silent and the lights, a barometer of civilization, began to once again chase the darkness from the world.

The war left Europe in ruins, some 60 million people dead worldwide, and the islands of Japan smoldering piles of rubble and ash. Although victory in Europe had been secured four months earlier in May, it took the horrific devastation of two atomic bombs to convince the Japanese that continued resistance was futile. In the years that immediately followed, the American occupiers punished Japanese war criminals while exercising restraint not to humiliate or dishonor the Japanese people. Perhaps the finest moment in the United States’ ascension to superpower status was its treatment of the vanquished Empire of Japan. The plan to occupy, restore, and rehabilitate Japan transformed the nation from fierce enemy to valuable ally.


The occupation of Japan contrasts sharply with the experience in Europe. There, Germany and its capital, Berlin, were divided among the four major Allied powers, with France, Britain and the United States overseeing West Germany and the Soviet Union controlling East Germany. This geographic and political division immediately set the stage for the Cold War.

In Japan, there was only one occupying power – the United States – and it gave near absolute authority to General Douglas MacArthur to organize and deploy a systematic plan to bring democracy to the Japanese people. Other allied nations attempted to insert themselves so as to influence Japan’s future, but MacArthur would have none of it. In fact, the Russians, who conveniently declared war on Japan less than a month before Japan surrendered, planned to invade Hokkaido, Japan’s northernmost and second-largest main island. Imprudently, Stalin notified President Truman of his intention, and Truman emphatically responded that all of mainland Japan would be placed under General MacArthur’s control. At the surrender ceremony in Tokyo Bay, MacArthur reportedly told a Soviet general that he would not tolerate a divided Japan and would use military force against any attempt to place Russian troops on Japanese soil. The Soviets backed down, and MacArthur proceeded to rebuild Japan completely free from Russian interference.

MacArthur approached his mission to win the peace in Japan with the same tenacity he exhibited when fighting the Japanese during the war. After securing for the Japanese people the basic necessities of food and shelter, he set about to secure their trust. To do so, he made the bold move of permitting Emperor Hirohito to remain the titular head of state. This did not sit well with a number of MacArthur’s contemporaries and allies, who viewed Hirohito as only a notch below Hitler on the evil-dictator scale. MacArthur understood that if the Emperor publicly approved of MacArthur’s plans, the Japanese people would acquiesce peacefully and without objection. An example of MacArthur’s keen understanding of Japanese culture, which revolved around shame and honor, took place when he allowed the Emperor, in his own time, to visit him and accord him the respect of a hereditary monarch. Such steps taken by MacArthur went a long way toward gaining trust and cooperation with the people of Japan.

MacArthur’s plan for post-war Japan stands in stark contrast to the treatment of Germany after World War I. Following the Treaty of Versailles, Germany was required to pay reparations amounting to $12.5 billion in today’s currency. The German economy was so weak that only a small percentage of reparations were ever paid, and what little was paid may have contributed to the hyperinflation Germany experienced in the 1920s. Having fought bravely in World War I, MacArthur learned many lessons from observing first-hand the failure of the Allied powers to enforce the treaty and secure lasting peace in Europe. Following Japan’s defeat in World War II, MacArthur refused to exact a crippling, retributive fine from the Japanese people to fund his plan to rebuild Japan. Instead, he tapped the United States Treasury to finance the occupation. Some 75 years later, we can be proud that our policy was to rehabilitate and not humiliate. MacArthur wisely realized that Japan was an anchor in the Pacific and, as an ally, would be of great utility in providing stability to the region. What may have appeared at the time as an excessively charitable approach toward conquered Japan has proven incredibly prudent. The plan to forgive, rebuild, and democratize gained the United States a key ally in the Asia-Pacific Rim.

MacArthur became a modern-day Moses, effectively writing a constitution, encouraging collective bargaining and installing a market-driven economy to bring Japan’s industries to their pre-war production level. His Civil Liberties Directive is the clearest example of how radical his plan had to be in order to transform Japan’s feudalistic society into one of democracy and liberty. This directive lifted all restrictions on political, civil, and religious rights; political prisoners were freed, and censorship of the press was abolished. MacArthur authorized free elections and not only gave women the right to vote but saw 38 women elected to the Diet, Japan’s equivalent of Congress. Up to that point in Japan, property rights were practically nonexistent. Most Japanese farmers worked under a system of virtual slavery, in which they were forbidden from purchasing their own land but were required to give a disproportionate amount of their crops to a small group of landowners. MacArthur extinguished this last vestige of feudalism by requiring the government to buy land at fair prices and then sell parcels to farmers on affordable terms. After the land reform program was fully implemented, nearly 90% of all farming land was owned by the people who lived on and cultivated it.

Seventy-five years ago, the mighty Japanese Empire, which initiated a war that killed millions of soldiers and civilians, was brought to heel and surrendered unconditionally on the deck of the USS Missouri in Tokyo Bay. From the ruins of total defeat began the process of total reconstruction. The United States, through the command of General MacArthur, guided the Japanese people as they beat their spears into plowshares and started down the path toward modernization and alliance with the West. Americans can be proud of the far-sighted policy of Gen. MacArthur who totally and unconditionally won the peace. When MacArthur left Japan, ordinary citizens spontaneously lined the route of his departure, most with thankful tears in their eyes for an American soldier who changed their country, secured their rights and gave them a stable constitutional government that stands today as the high mark of benevolent conquest.

Will Sellers is an Associate Justice on the Supreme Court of Alabama.

12 months ago

Will Sellers: Alabama’s finest hour

Gov. Kay Ivey lays a wreath at the casket of Congressman John Lewis as he lies in state at the Capitol on Sunday, July 26, 2020 in Montgomery, Ala. (Governor's Office/Hal Yeager)

In describing his constituents, George Wallace used to say that “the people of Alabama are just as cultured, refined and gracious as anyone else in America.” Whether it was true when he said it or not, it made Alabamians stand a little taller and feel better about their circumstances.

If actions speak louder than words, then on Sunday the people of Alabama, in memorializing John Lewis, demonstrated to the nation how truly refined, gracious and cultured we really are.

While other parts of the nation were literally on fire and factions seethed with hate, Alabamians provided a stark contrast in honoring Congressman Lewis.


Where 55 years ago state troopers severely beat John Lewis, on Sunday fully integrated law enforcement officers saluted him and gave him the dignity and respect he earned and deserved. Where once the governor of Alabama prevented civil rights marchers from entering the Capitol, on Sunday Governor Kay Ivey stood silently near Jefferson Davis’ star and respectfully, solemnly saluted and welcomed the casket of the 80-year-old congressman.

In other parts of America, Democrats and Republicans engage in angry debates, neither giving nor receiving quarter. In Montgomery on Sunday, members of both parties came together, transcended partisanship and found common ground in recognizing someone who lived a faithful life in support of peace, justice and mutual understanding.

Indeed, in some cities in our country federal law enforcement officials, without invitation or consent from mayors or governors, were engaged in riot control. At the Capitol in Montgomery, federal officials were not only invited but attended and participated in a memorial service. Federal troops came, not with a show of force, but as an honor guard to drape the mortal remains with an American flag as a pall to lie in state. While federal marshals were present, they were there to pay their respects and mourn Congressman Lewis, not to protect federal property from destruction.

On Sunday, Alabama taught the world what racial harmony looks like; Alabama showed an integrated community embracing a hopeful future.

Any outsider saw clearly that Alabama is no longer tied to a past anchored in division, but is a mosaic of people from all walks of life coming together, laying aside their differences and agreeing that when a great man dies, the brightness of his sun setting reveals a glorious legacy for all to pause, reflect and regard in all its majesty.

Sunday was a testament to dreams anticipated and while not yet fulfilled, much closer to reality. The celebration of John Lewis in his native Alabama served to acknowledge the legacy of the civil rights movement that still motivates us to judge people not on their externals, but on the internals of kindness reflected in the content of each one’s character.

Progress for unity comes in fits and starts. Sunday in Alabama was a giant leap forward and a day that helps define our future.

Will Sellers is an Associate Justice on the Supreme Court of Alabama.

1 year ago

Let’s celebrate the Magna Carta!


In just a few weeks, fireworks will illuminate the night sky, parades will proceed down Main Streets, and the American people, even while social distancing, will pause to celebrate the Fourth of July, or Independence Day. And what event are we commemorating? Not a military victory, not a birth or death, but a mere vote! That vote, once and for all, declared the American colonies free and independent from British domination.

The many grounds justifying this vote are famously spelled out in the Declaration of Independence. John Adams wrote to Abigail in July 1776 predicting that the vote for independence would be “celebrated by succeeding generations as the great anniversary festival,” and “commemorated … by solemn acts of devotion to God Almighty,” accompanied by “bells, bonfires, and illuminations from one end of this continent to the other.” How prescient of John Adams, and how appropriate that we Americans continue to celebrate our independence even 244 years after the vote for independence was announced.


The Declaration of Independence rightly holds a preeminent place in American history; yet, there is another, much older document from history worth celebrating too. That document is Magna Carta, “the Great Charter,” signed this day [June 15] in the year 1215 A.D. by English barons and King John. It is not an exaggeration to say Magna Carta changed the concept of government forever. In fact, never before had a ruler, in what was almost a bloodless coup, agreed to limitations on royal power. Magna Carta changed the dialogue about the divine right of kings and absolute power. We would do well, 805 years later, to pause and reflect on what civilization has achieved by limiting the power of government and giving liberty to the governed.

Besides chartering a peace between some rebellious barons and the King of England, what did Magna Carta do? To be clear, it did not establish the concept of government by democracy; the Greeks had managed their affairs by majority vote well over a millennium earlier. Rather, Magna Carta planted the first seeds of constitutional government. A constitutional government recognizes the truth that all citizens, including those in the government, are under the law. No one, not even the king, is above the law. In medieval times, this innovative concept challenged the regimes that vested absolute power in the monarchy, which were so prevalent in Europe and the rest of the world. Magna Carta placed the ruler under the law, forbidding him from dictating to his subjects beyond the limitations of the law.

Magna Carta calls this supreme law the “law of the land.” This law is not necessarily written down. Rather, it reflects the rights and customs of the people populating the land. From this novel concept came what we call “the common law.” The common law is built not at once, but as any structure is built – brick by brick, case by case. Each judgment handed down by the court sets a precedent which will inform the next judgment of the same kind. In societies embracing the common law, judges do not create the law of the land. Rather, they declare what it already is and apply it to each situation. And how do they know what the law is? They look to prior judgments, to immemorial custom, and to the fundamental rights of the people. In short, they look to practical experience, the tried and true, over the philosophical and speculative.

Magna Carta itself and the common law jurists and statesmen who followed conceived of rights in negative terms. Property rights, for example, are the natural corollaries of other people’s duties not to steal or destroy. Everyone besides the property owner has a duty not to trespass on the property owned by another, which means that owners have a right to the exclusive use and possession of their property. Fundamentally, rights are not invented by the government; they are inherent in what it means to be human. If the government has the power to create rights, then it can just as easily take them away. Magna Carta reflected fundamental rights and reduced them to writing, thus acting as a fence to clearly mark the boundaries between the government and the governed.

Magna Carta was viewed as so foundational to constitutional government that it featured prominently in the early American colonies. For example, the first Massachusetts code of law explicitly cites Magna Carta as the source of the laws comprising that code. Additionally, South Carolina, when separating from North Carolina in the early 18th century, enacted a statute that incorporated the English common law, as established by Magna Carta, into its own set of laws. Alabama, like many other states, followed this trend. Furthermore, William Penn, of Pennsylvania fame, arranged for the first printing in America of Magna Carta, and the seal used by the Massachusetts Provincial Congress contained the image of a patriot with a sword in his right hand and a copy of Magna Carta in his left.

These historical tidbits evidence the importance Magna Carta held for our American ancestors, but the best evidence is our own written constitution. That document, like Magna Carta, places the law of the land above the government and recognizes certain individual rights, which the government must never infringe upon, much less violate. If the government ever acts “above the law” by exceeding its enumerated powers granted by the Constitution, it ceases to be a proper government. Under constitutional government, laws have parameters in which to operate, but they cannot curtail rights clearly expressed in both our federal and state constitutions.

For today, its 805th anniversary, let us never forget the grandfather of our Constitution, Magna Carta. We should celebrate the concept of constitutional government it ushered into the world and the growing impact of its civilizing influence. Under Magna Carta and its offspring, the United States Constitution and the Alabama Constitution, we should always hold our own elected officials accountable to govern according to and under the “law of the land.” And, we must always remember that government exists not to create our rights, but to protect the rights we inherently possess. When King John transgressed these rights, he set in motion a movement to constrain government by recognizing pre-existing rights and enumerating them lest future rulers forget their limitations. That is something well worth celebrating!

Will Sellers is an Associate Justice on the Supreme Court of Alabama.

1 year ago

It’s time to take a stand against China


Napoleon predicted that China’s “wokeness” would move the world; returning the compliment, the Chinese contend that it is too soon to measure the impact of the French Revolution. Today, China is very much awake and is revealing the dangerous ideas unleashed by the French Revolution.

Although halfway around the globe, China continues to command our daily attention. After all, it is the most populous country on the planet, the second wealthiest and, most recently, the holder of the dubious distinction of being the birthplace of the coronavirus. It truly is a remarkable nation.

In half a century, it has transformed itself from a third-world country into an international superpower, competing on the world stage against the biggest players: the European Union, Russia, and even the United States. The machine that is China may appear to contain a well-oiled and durably built engine powering the country up the hill of international clout. But soon the strain of its flagging economy will cause this Chinese engine to lock up, bringing the machine to a jarring halt before it begins its backwards slide.


History and economics teach us that the writing is on the wall for China’s recent trend of success. President Xi Jinping is ignoring the warnings, and his administration’s expansion and bolstering of the government’s authoritarian powers will accelerate China’s decline.

The root of China’s woes lies in the centralized, dictatorial control its government exercises on its nation’s citizens and industries. This creates log jams, stifles real growth, and throttles the creative potential of the Chinese people.

Under this authoritarian system, citizens have little motivation to take initiative, and those who do are met with an impenetrable barricade of bureaucratic red tape. Their entrepreneurial spirit, which years ago appeared unleashed, has now been squashed, and all that remains is a stagnant pool of government-issued status quo. Throughout history, the Chinese people have displayed a vibrant and creative spirit, but Chinese-style communism has sacrificed this spirit at the altar of power, efficiency, and uniformity.

Any system of government that stifles the human spirit is doomed to failure. Because people naturally yearn to be free, authoritarian regimes require armies of watchers surveilling the populace’s every move. The government must then enlist watchers to watch the watchers! So long as it exists, this kind of absurd societal structure engenders a culture of fear that paralyzes individual initiative.

Parents can no longer trust their children, who have been educated, or rather brainwashed, to report any violation occurring within the family to the state. The inevitable consequence is that China will fall behind those nations where the inalienable rights of the people are protected and where the spirit of ingenuity and entrepreneurship is encouraged, rewarded, and supported.

A prime example of China’s looming decline is its infamous practice of stealing intellectual property from more technologically advanced countries. While the Chinese people have proven to be experts at reverse engineering existing tech, they lack the creative freedom to envision the infrastructure necessary to implement a new generation of technology.

This strategy necessarily results in China playing technological catch-up to the rest of the first world superpowers. That only works for so long. Much like a student who passes a class by cheating will later suffer the consequence of not being able to compete in the professional world, China has hamstrung itself by failing to establish the research infrastructure necessary to develop, much less envision, independent technologies for the future.

A practical consequence of this strategy is a second-rate military, technologically inferior to its adversaries. China’s military is massive; there is no question about that. But in the 21st century, military strength is less about quantity and more about precise weapons systems delivering violent power with limited risk to military personnel.

Invading Korea with a million-man army may have worked 70 years ago, but times have changed. Simply put, because China has stolen the technology for its weapon systems, it does not and will never possess the infrastructure required to maintain and improve on those systems. To wage a 21st century-style conflict requires military personnel to make snap decisions in an asymmetrical environment. China’s bureaucracy could never support a winning strategy in the modern era of warfare.

Furthermore, we are now beginning to see the inevitable result of a centrally-controlled market – a crumbling infrastructure. China’s government has attempted not only to predict the nation’s internal growth, but to force growth to conform to the government’s direction and design. This leads to cities being built in government-projected locations with no inhabitants moving there.

Imagine the huge waste of resources involved in such a strategy. Economic expansion cannot be mandated by a government; growth is fundamentally organic and is tied to human action and human decision. As a government increasingly inserts itself into the ebb and flow of the market, waste begins to accrue and its accumulation further limits economic growth. Eventually, the system itself will crumble under its own weight.

This is starting to happen.

The only hope for China is for the central authority under President Xi to change course and adopt policies giving the Chinese people more control of their government and greater personal freedoms. As liberty and the evolution of self-government are ingrained in our nation’s development and explicitly inscribed in our Constitution, we can help.

First, we should curtail the economic dislocation of a trade war. Retaliatory tariffs serve no purpose other than empowering central governments and increasing the costs of goods for Chinese and Americans alike.

Second, we should strengthen our military alliances in the Far East. If the People’s Liberation Army decides to take action in its hemisphere of the world, it must be met with nothing but resistance from surrounding countries. The United States must not only project influence but also be a reliable partner.

Third, we should aggressively enforce international agreements regarding intellectual property. China must pay the price for choosing to steal rather than to invest in its own technological development.

Finally, we should clandestinely support Hong Kong and its struggle to regain liberty. Hong Kong’s culture of independence can spread like fire throughout China if fueled, and America can supply that fuel. The Chinese people are not so cut off from the rest of the world as to be ignorant of Hong Kong’s suffering.

There is hope for the country, but, ultimately, it must come from the bottom up, when the people demand the liberty to choose their own government.

Will Sellers is an Associate Justice on the Supreme Court of Alabama.

2 years ago

As Alabamians prepare to watch ‘It’s a Wonderful Life,’ a reflection on the unabashedly patriotic films of Frank Capra


As Thanksgiving morphs into Christmas, the December television schedule will be filled with the usual assortment of Christmas classics, not the least of which is Frank Capra’s It’s a Wonderful Life. I’ve lost count of the number of times I’ve seen this movie, and unlike some classics that grow tiresome, Wonderful Life always grabs me. The idea of selfless giving is made manifest when the entire community comes to George Bailey’s aid. I think every small business owner secretly views his business as the Building and Loan and himself as George Bailey!

But Wonderful Life was not Capra’s masterpiece. His pre-war films all exalt the humble everyman taking on the various goliaths of the age. If you like Wonderful Life, let me suggest a Capra trilogy to enjoy with your family over Christmas: You Can’t Take It With You, Mr. Smith Goes to Washington and Meet John Doe. Each of these movies plants the seed of a theme that culminates in Wonderful Life. I don’t think you can watch any of these movies without a renewed sense of what it means to be an individual pitted against a soulless property developer, corrupt political leaders or a manipulative, selfish tycoon.


Capra was a master of giving Depression-era people a toehold in a uniquely American system that made Davids believe Goliath could be defeated. But the undoing of the strong was the happiness that radiated from the seemingly powerless little man. Though possessed of limited resources, he had the intangibles that faithful people know as the fruit of the spirit: love, joy, peace, patience, kindness, etc. In fact, all of Capra’s movies are really morality plays that inspire people to take on the challenges of their lives and to stand up to the shameless bullies who wield power mainly for power’s sake and for the ego that comes with flexing muscles to show off.

The strain of populism so ingrained in the lives of Americans is perfectly reflected in Capra’s films. His focus was on the human action of simple, everyday people making decisions based on visions of simple moral clarity. He lifted up the permanent things that are so often neglected in favor of the temporary glitz and glamour of material gain. Each film contains a large dose of middle American values magnified time and again against the traps and situations of a complicated, impregnable bureaucratic world. And in each case, the little guy wins, and the big mules not only lose face but are publicly shamed into accepting, if not participating in, their own defeat.

These films are in many ways a large mirror reflecting not only the tenor of the times, but also the impact of original sin on human nature struggling for freedom. In short, people can see themselves in these films and identify with the characters. Everyone wants to see the characteristics of the white-hatted hero in themselves, but conscience reminds them that some of the villain’s traits are part of their psyche too. Everyone hopes that, within their personal OODA loop, they will make wise and prudent choices when faced with decisions of moral consequence. Everyone in Capra’s films has a shot at redemption, but not every character accepts the offer; the developing conflicts are what make each film so entertaining.

Capra’s films had consequence when they were first screened, uplifting average people and giving them hope and a feel-good sense of their personal significance. Perhaps the greatest tribute to the impact of Capra’s films is that Mr. Smith was the last American film shown in France before the Nazi occupation’s ban on American films took effect. To the consternation of almost all of the American political class (including Ambassador Joseph Kennedy), the French were so inspired by a country that allowed dissent, vigorous debate and free speech that, as the lights of their freedom were dimming, they chose to see America at its best in the person of Jefferson Smith. There is no way to measure the number of French resistance fighters emboldened by this film.

If you liked Wonderful Life, be inspired by the unabashedly patriotic films of Frank Capra. You’ll be motivated and perhaps even challenged to identify with a character and to live out the American dream in simple community with others who also struggle against human nature to find goodness and selfless service in their daily lives.

Will Sellers is an Associate Justice on the Supreme Court of Alabama.