The Consequences Of Being A Meat Computer

Part One: The Hardware

Phineas Gage and the Fragile “Character”

On September 13, 1848, a railroad foreman named Phineas Gage was packing explosive powder into a hole when the charge detonated early. A tamping iron—three feet seven inches, thirteen pounds—shot through his left cheek, behind his eye, through the front of his brain, and out the top of his skull.

He didn’t die. Within minutes he was sitting up, talking. He walked to an oxcart and rode upright to town. He told jokes to the doctor.

But something had changed.

Before the accident, Gage was his employers’ most capable foreman: responsible, shrewd, reliable. After the accident, that man was gone. The new Gage was fitful, profane, couldn’t hold a job or stick to a plan. His friends said he was “no longer Gage.”

The story often gets exaggerated in textbooks, but the core point survives scrutiny: damage to frontal systems reliably disrupts planning, restraint, and social judgment—the very stuff we call “character.”

The Tumor That Turned the Brakes Off

A more recent case: a man in his forties—married, employed, unremarkable—began collecting child pornography. He propositioned his stepdaughter. He was arrested, convicted, and sentenced to rehabilitation.

He couldn’t complete the program. He kept breaking the rules. He complained of headaches.

A brain scan found a tumor the size of an egg pressing on his orbitofrontal cortex—a region involved in impulse control, in the ability to translate “I shouldn’t” into “I don’t.”

Surgeons removed the tumor. The urges disappeared.

A year later, the urges returned. Another scan: the tumor had grown back.

They removed it again. The urges vanished again.

This case gets cited a lot, sometimes sensationally. But the important detail isn’t the content of the urges—it’s the mechanism.

His moral knowledge never disappeared. He knew what he was doing was wrong. He said so. He couldn’t stop.

The tumor didn’t make him evil. It broke his brakes. The part of the system that’s supposed to inhibit action, to override impulse with judgment—that part was being physically crushed.

When the pressure was relieved, the brakes worked again. When the pressure returned, they failed again.

Charles Whitman and the Hijacked Vehicle

On August 1, 1966, Charles Whitman climbed to the observation deck of a tower at the University of Texas at Austin. He had already killed his wife and mother. Over the next ninety minutes, he shot forty-six people, killing fourteen.

Before the rampage, he left a note. It’s one of the strangest documents in the history of American violence:

I do not really understand myself these days. I am supposed to be an average reasonable and intelligent young man. However, lately (I cannot recall when it started) I have been a victim of many unusual and irrational thoughts…

After my death I wish that an autopsy would be performed on me to see if there is any visible physical disorder.

They performed the autopsy. They found a tumor—a glioblastoma—pressing against his amygdala, the brain’s center for fear and aggression.

Did the tumor “cause” the massacre? That’s debated. The tumor was real, but Whitman also had other problems—amphetamine use, family trauma, possible abuse. The official report was cautious about drawing a direct line.

But here’s what isn’t debated: Whitman himself sensed something was wrong inside his machinery. He didn’t write about ideology, or grievance, or revenge. He wrote about irrational thoughts he couldn’t explain and asked doctors to look for a physical cause.

Part Two: Viruses and Patches

You might respond: Those are extreme cases. Broken brains. Mine is fine.

But “fine” still means chemical.

Ozempic and the Vanishing “Food Noise”

For decades, society told obese people they just lacked willpower. Then came GLP-1 agonists like Ozempic. These drugs mimic a satiety hormone, and suddenly, patients reported the “food noise”—the constant intrusive thoughts about eating—went silent.

Even more telling, these drugs are showing signs of curbing other “choices,” like gambling, shopping, and alcohol addiction. If an injection can turn off your “desire” to drink whiskey, was it ever really your desire, or just a hormonal signal you couldn’t ignore?

ADHD and the Fuel-Injection Problem

We moralize focus as a virtue. But focus depends on reward circuitry. An ADHD brain has dysregulated dopamine signaling; it can’t “borrow” enough satisfaction from future reward to push through a boring present task.

Stimulants like Adderall don’t just “increase attention”; they flood the brain’s reward centers with dopamine. They essentially “pre-reward” the brain, making a boring task feel chemically satisfying so the biological machine will engage with it.

Lithium and the Salt That Stabilizes

Bipolar mania can turn a person into a reckless stranger—money evaporates, relationships explode, judgment vanishes. Lithium, an element on the periodic table, can stabilize the system.

The Zombie Drug: Scopolamine

Derived from a tree in Colombia, this terrifying drug can cause delirium, impaired memory formation, and abnormal compliance. Victims enter “automatism”—they are conscious and can talk, but their brain loses the ability to form intentions, inhibit impulses, resist coercion, track consequences, remember what’s happening, and act in line with goals. They may obediently empty their bank accounts for thieves and remember nothing later.

The Slow Poisons: Sleep, Fatigue, and Identity

Not every hardware problem is dramatic.

Consider sleep apnea. You stop breathing dozens of times a night. You never fully rest. You wake up exhausted, irritable, short-tempered. This goes on for years.

Eventually, you don’t think “I’m grumpy because I’m sleep-deprived.” You think “I’m a grumpy person.” The symptom becomes identity.

How many personality traits—pessimism, impatience, low frustration tolerance—are actually symptoms of undiagnosed physical conditions? How many people are walking around being blamed for what is, underneath, a hardware issue?

The Inheritance: Traits You Didn’t Choose

And then there’s the genetic layer.

We’ve known for decades that many psychological traits are substantially heritable: extraversion, neuroticism, conscientiousness, intelligence, even political orientation and religiosity to some degree.

Some males have an extra Y chromosome (47,XYY). Early sensational claims basically treated it as a “criminal chromosome.” That story got inflated by biased samples (studying prisons and institutions will obviously skew outcomes).

Still, in broader research, you do see patterns that can matter: higher rates, on average, of language and learning difficulties, ADHD-like impulsivity, and social challenges. And in some large population studies, higher conviction rates show up—typically not a clean “violence” signal, and often explained largely by those developmental and social pathways.

The important part isn’t “XYY causes crime.” It doesn’t, and genes aren’t destiny. Environment matters. But the starting conditions weren’t chosen, and starting conditions constrain where you can go. Tiny changes in neurodevelopment ripple through school, work, relationships, and self-control, and those ripples shift life outcomes. Not destiny. Probability.
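The “probability, not destiny” point can be made concrete with a toy model (the numbers here are illustrative, not drawn from any study): give two otherwise identical cohorts of random walks a tiny per-step tilt and watch the outcome distributions drift apart over a lifetime of steps.

```python
import random

def life_outcome(bias, steps=1000, seed=None):
    """Sum of many small chance events, with a tiny per-step tilt."""
    rng = random.Random(seed)
    return sum(rng.uniform(-1, 1) + bias for _ in range(steps))

# Two cohorts identical in every random draw, differing only by a 0.01 tilt.
neutral = [life_outcome(0.00, seed=i) for i in range(2000)]
tilted  = [life_outcome(0.01, seed=i) for i in range(2000)]

avg = lambda xs: sum(xs) / len(xs)
# Any single step is dominated by noise, but the tilt compounds:
# the expected gap between cohorts is bias * steps = 10 units.
print(round(avg(tilted) - avg(neutral), 2))  # → 10.0
```

No individual step looks biased, and no individual outcome is fixed in advance. The tilt only shows up in the aggregate, which is exactly what “starting conditions shift the odds” means.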

You didn’t pick your genes. You didn’t audition for your temperament. You woke up inside a system that was already biased in certain directions—toward anxiety or calm, novelty-seeking or caution, optimism or dread—before you had any say in the matter.

Rabies: The Pathogen That Needs You to Bite

Rabies is a masterpiece of evolutionary horror.

The virus spreads through saliva. It needs its host to bite another mammal to continue its lifecycle. So what does it do? It rewires the host brain.

Infected animals become aggressive. They lose fear. They attack. And—here’s the creepy part—they develop hydrophobia, an aversion to swallowing, which keeps saliva concentrated and infectious in the mouth.

The animal isn’t “choosing” to be violent. It’s being piloted by a parasite that benefits from violence.

Rabies in humans, if untreated, follows a similar trajectory: agitation, confusion, aggression, hydrophobia. The person may still be conscious. They’re not in control.

Toxoplasma: The Parasite That Tweaks Risk

Toxoplasma gondii is less dramatic but more common.

This single-celled parasite reproduces in cats. If it infects a rodent, it alters the rodent’s brain chemistry in ways that reduce fear of cat odor—making the rodent more likely to be eaten, completing the parasite’s lifecycle.

Humans get infected too. Possibly around a third of the global population carries it. And some studies—debated, imperfect, but not trivial—suggest associations with slower reaction times, increased traffic accident risk, and altered risk-taking behavior.

You don’t need the strongest version of this claim for it to matter. The weaker version is enough: a microscopic organism you never knew you had might be subtly adjusting your risk settings.

Syphilis: The Madness in the History Books

Before antibiotics, tertiary syphilis was a common cause of insanity.

The bacterium, left untreated for years, invades the brain. Patients develop personality changes, paranoia, grandiosity, cognitive decline. Some historians speculate that erratic behavior in certain historical figures—rulers, artists, generals—might have been neurosyphilis.

The point isn’t to excuse anyone. It’s to notice: what we called “madness” or “evil character” sometimes had a microbial cause.

The Gut: Your Second Brain

Most of your body’s serotonin is made in the gut, and your intestinal microbiome helps regulate it, along with dopamine signaling. It’s in constant chemical conversation with your brain.

Early controlled studies suggest that altering gut bacteria can shift behavior, including social decision-making in economic games. Your sense of fairness, your willingness to punish cheaters, might depend partly on which bacteria are thriving in your intestines.

Your “self” isn’t sealed. It’s an ecosystem with weak borders.

Part Three: The Software

The World That Writes You

If the brain is hardware, the environment is the programmer.

We worship the myth of the Self-Made Man—two people face the same choice, one picks virtue, the other picks vice, purely because of “character.”

That’s a story we tell because it feels fair. It gives us heroes and villains.

Data doesn’t care about fairness.

The Coding Phase: Childhood Trauma as Installation

The most critical programming of a human being happens before they are old enough to vote, drink, or leave the house. In the 1990s, the CDC and Kaiser Permanente conducted the massive Adverse Childhood Experiences (ACE) study. They measured 10 types of childhood trauma (abuse, neglect, household dysfunction) and correlated them with adult outcomes. The results showed a terrifyingly consistent dose-response relationship: the more trauma, the worse the outcomes.

A person with an ACE score of 4 or higher is roughly seven times as likely to struggle with alcoholism, and significantly more likely to engage in violence or criminal behavior, than someone with a score of 0.

This isn’t just about “bad memories.” Toxic stress during childhood physically alters the developing brain. It can shrink the hippocampus (memory and learning) and enlarge the amygdala (fear and threat response).

If a child grows up in a war zone (or a violent home), their brain wires itself for survival (hyper-vigilance, aggression, quick reaction), not for suburbia (patience, long-term planning, negotiation). When that child grows up and assaults someone, they are running the “survival software” their environment installed. They didn’t “choose” to be violent; they were built to survive a violent world.

The Bandwidth Tax: Poverty as a Cognitive Load

We scold the poor for “bad decisions.” Junk food. Payday loans. Lottery tickets.

But scarcity isn’t just a lack of money. It’s a mental environment.

When you’re constantly worried about rent, food, and debt, those worries occupy cognitive bandwidth—attention, working memory, executive control. You’re not just dealing with problems. The problems are sitting in your head, running in the background like malware.

Studies on sugarcane farmers found their cognitive performance dropped measurably before harvest (when money was tight) compared to after harvest (when they were flush). Same people. Different financial state. Worse mental function.

Poverty doesn’t make people stupid. It makes them busy—so busy that there’s less cognitive capacity left for restraint, planning, and resisting temptation.

The Invisible Poison: Lead and the Lost Generation

Sometimes “environment” isn’t metaphorical. Sometimes it’s in the air.

For decades, we added tetraethyllead to gasoline. We pumped a neurotoxin into the atmosphere. Lead exposure in childhood is strongly linked to lower IQ, attention problems, and impaired impulse control.

Economist Rick Nevin and others found striking correlations between childhood lead exposure and violent crime rates roughly twenty years later—when those poisoned children became young adults. The pattern held across multiple countries with different policing, cultures, and demographics.

Crime is multi-causal. You can’t reduce it all to lead. But the narrower claim is damning enough: we poisoned children’s brains at scale, damaging the very systems we then blamed them for lacking.

Zip Code Destiny: The Neighborhood as Fate

If you want to predict someone’s future, you don’t need their “character.” You need their address.

Raj Chetty’s Opportunity Atlas mapped outcomes—income, incarceration, teen pregnancy—by childhood neighborhood. The results were brutal: where you grow up is one of the best predictors of where you end up.

The Moving to Opportunity experiment made it even starker. Families who moved from high-poverty to lower-poverty neighborhoods saw their young children’s future earnings rise and incarceration rates drop—but only if they moved before about age 13.

If “criminality” were a stain on the soul, a voucher wouldn’t matter.

If it’s a symptom of habitat—schools, peers, violence exposure, opportunity structure—then of course changing the habitat changes the outcome.

Thermal Determinism: The Weather as Trigger

Finally, something almost insulting in its simplicity.

Crime rates spike during heatwaves. In Los Angeles, extremely hot days (above 85°F) are associated with measurable increases in violent crime, with stronger effects in poorer neighborhoods.

A portion of what we call “self-control” evaporates when the body overheats. People do things in August that they wouldn’t do in November—and later explain it with reasons that sound noble.

The trigger was physiological. The story came after.

Part Four: The Physics Never Change

The Unbroken Chain

We’ve stayed close to the human scale so far: brains, parasites, neighborhoods, heat. But there’s a layer beneath all of that.

Atoms. Energy. Laws.

And once you take physics seriously, “free will” stops looking like a mysterious gift and starts looking like a category error—trying to find a ghost in a gearbox.

The Cosmic Pool Table

Imagine the moment of the break.

You call the shot “uncertain” because you can’t compute all the angles. But your uncertainty isn’t the ball’s freedom. If you knew every detail—force, friction, spin—you could predict the outcome. The table doesn’t “decide.” It unfolds.
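That unfolding can be sketched in a few lines. Below is a toy ball in a box, advanced under fixed rules; the geometry and friction constant are arbitrary. The point is only that identical inputs yield identical outcomes, every single run.

```python
def roll(x, y, vx, vy, steps=10_000, friction=0.999):
    """Advance a ball in a unit box under fixed, deterministic rules.
    Walls reflect the velocity (with a simple clamp); felt drains speed."""
    for _ in range(steps):
        x, y = x + vx, y + vy
        if not 0.0 <= x <= 1.0:
            vx, x = -vx, min(1.0, max(0.0, x))
        if not 0.0 <= y <= 1.0:
            vy, y = -vy, min(1.0, max(0.0, y))
        vx *= friction
        vy *= friction
    return x, y

# The same "break", run twice. No decisions, only unfolding.
a = roll(0.2, 0.3, 0.013, 0.007)
b = roll(0.2, 0.3, 0.013, 0.007)
print(a == b)  # → True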

Now scale up.

The Big Bang is the break: matter and energy thrown outward. Everything after—galaxies forming, planets cooling, organisms evolving, you reading this—is the same kind of unfolding: events following from prior events under stable rules.

For “free will” (the magical version most people mean), you would need the ability to step outside that unfolding. You’d need a moment where the chain pauses and a non-physical “you” grabs the wheel and turns it without being caused to do so.

The Material Mind

The intuitive protest: But I’m conscious. I think. I feel like I’m choosing.

Sure. You’re also physical.

Your brain is neurons. Neurons are made of molecules. Molecules are atoms. Atoms don’t have opinions. They respond to forces.

A “decision,” at the level where it actually happens, is electrochemical: ions crossing membranes, electrical spikes, neurotransmitters released into synapses, networks tipping into one state rather than another.

Where does “free will” enter? At what point does a thought force matter to behave differently than physics says it should?

If you claim “the mind overrides the atoms,” you’re proposing a new kind of force—a wish-force—that can push particles off their lawful trajectories. We don’t see that force anywhere in nature.

What we see is a system that feels like it’s choosing while it’s running the only program it could run.

The Quantum Escape Hatch

People grab for quantum mechanics like a trapdoor.

“Quantum events are random. Therefore the universe isn’t determined. Therefore I’m free.”

First, I don’t believe quantum effects are truly random; on my reading, the wave equation that governs quantum systems evolves deterministically and spans the entirety of the universe. Second, the brain is unlikely to be steered by individual quantum events. The brain is warm, wet, and enormous; quantum effects wash out at the scale where neurons fire and decisions form.

Third, randomness isn’t agency. If your actions are the inevitable result of prior causes, you’re a machine. If your actions are partly the result of irreducible randomness, you’re a machine with dice inside it.

Neither option gives you the authorship people want—a self that could have done otherwise in the exact same state of the universe.

Randomness doesn’t create responsibility. It just adds noise.
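The dice point can be made concrete with two toy agents (the names and probabilities are purely illustrative): one whose output is fixed entirely by its input state, and one that mixes in random flips. Neither consults anything like a self that could have done otherwise; the second is merely harder to predict.

```python
import random

def determined_agent(state):
    """Output fixed entirely by prior causes (the input state)."""
    return "act" if state % 2 == 0 else "refrain"

def dicey_agent(state, rng):
    """The same machine with irreducible noise bolted on."""
    return determined_agent(state) if rng.random() < 0.9 else "refrain"

# Determinism: same prior state, same output, every single time.
assert all(determined_agent(7) == "refrain" for _ in range(100))

# Dice: the outputs now vary, but the variation comes from the
# random number generator, not from anything like authorship.
rng = random.Random(42)
outcomes = {dicey_agent(4, rng) for _ in range(1000)}
print(sorted(outcomes))
```

Swapping the deterministic rule for a noisy one changes predictability, not ownership: the flip happens to the agent, exactly as the prior causes did.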

Part Five: Competing Amongst Yourself

The Crowd Behind Your Eyes

Here’s where it gets stranger.

We’ve been talking as if there’s a single “you” that might or might not be free. But even that framing is probably wrong.

The brain isn’t a dictatorship. It’s more like a parliament—or a crowd of competing drives, modules, and impulses, each with its own agenda.

The Interpreter and the Confabulator

In split-brain patients (where the connection between hemispheres has been surgically severed), researchers can send a command to one hemisphere that the other can’t access.

Flash “walk” to the right hemisphere. The patient stands up and walks. Ask the left hemisphere—the one that can speak but didn’t see the command—why they’re walking.

They don’t say “I don’t know.”

They invent a reason: “I was going to get a Coke.”

The speaking part of the brain didn’t make the decision. It made a story about the decision. And it believed its own story.

That’s not a rare malfunction. It looks like a window into what the conscious mind does all day: narrate, justify, smooth chaos into a coherent story called “me.”

The Committee Meeting

Think about a moment of genuine internal conflict. You want the dessert; you’re trying to lose weight. You want to yell at the person; you know it’s a bad idea. You want to stay in bed; you have obligations.

Who is the “you” in that situation?

It feels like there are multiple voices—multiple preferences, multiple selves—arguing inside your head. And then one “wins,” and you call that the decision.

But which one is you?

If “you” are the diet self, then the dessert-wanting self is a hijacker. If “you” are the dessert-wanting self, then the diet self is an oppressor. If “you” are somehow the arbiter standing above both, where does that voice come from? What makes it authoritative?

We experience an internal crowd. We experience conflict. We experience one voice “winning.” And we narrate that as “I chose.”

But we didn’t author the competitors. We didn’t design the arena. We didn’t pick the rules for who wins. We just… watched the outcome and took credit.

The Experiment You Can Try Right Now

Stop thinking for ten seconds.

Not “think about stopping.” Not “try to quiet down.” Actually stop having thoughts. Full silence. Complete blankness.

Most people can’t do it. Thoughts arise. Unbidden. Unwanted. From somewhere.

If you were the author of your thoughts, you should be able to stop. You should be able to choose silence. The fact that you can’t suggests something uncomfortable:

Thoughts happen to you. The sense that you’re producing them is another part of the narration.

Part Six: Deliver Us from Ignorance

Consider some of the deadly misconceptions of the last two millennia:

Bloodletting

Bloodletting originated in ancient Egypt over 2,000 years ago and rested on the theory that the body’s “humours” (blood among them) had to be kept in balance; draining excess blood was thought to prevent or cure illness and disease. The practice spread to medieval Europe, where it became the standard treatment for conditions from plague and smallpox to epilepsy and gout. It famously may have been a major cause of George Washington’s death in 1799, after he lost an estimated 40% of his blood.

Trial by ordeal/Trial by combat

In medieval Europe, in cases with no witnesses (accusations such as treason, adultery, witchcraft, or heresy), evidence from an absolute, divine point of view was sought. In trial by ordeal, the accused was subjected to a painful and usually dangerous test, such as walking across fire or red-hot iron, dipping a hand in boiling water, or consuming poison, on the premise that God would perform a miracle on behalf of the innocent. Similarly, in trial by combat, the two parties (or designated “champions” acting on their behalf) would fight, and the loser, or the party the losing champion represented, was deemed guilty or liable.

Witch Tests

One such ordeal was the test for witchcraft. In witch swimming, the accused was tied up and dunked into a body of water: sinking indicated innocence (and, often, death by drowning), while floating indicated guilt. The practice appears as early as the 9th century in Western Europe, was banned by King Henry III in 1219, then reappeared in 16th-century Europe and Scotland. King James VI was a proponent, claiming that witches had shaken off the benefit of baptism, and that water, the element of baptism, would therefore refuse to receive them and prevent them from sinking. In 1710, the swimming test was used as evidence against a Hungarian woman named Dorko Boda, who was beaten and burned at the stake as a witch.

Prayer test: witches were thought incapable of speaking scripture aloud, so accused sorcerers were made to recite selections from the Bible—usually the Lord’s Prayer—without making mistakes or omissions.

Touch test: victims of sorcery were thought to have a special reaction to physical contact with their evildoer.

Witch’s mark: suspects were stripped and publicly examined for an unsightly blemish, the supposed mark of the devil.

Scratch test: possessed people supposedly found relief by scratching the person responsible with their fingernails until they drew blood.

Causes of the black plague

The Black Plague, or “Black Death,” was the most fatal pandemic recorded in human history, killing an estimated 75–200 million people across Eurasia and peaking in Europe between 1347 and 1351. Scholars of the time offered many theories for its cause. Muslim scholars held that, for believers, it was “martyrdom and mercy” from God, assuring their place in paradise; for non-believers, it was punishment. Christian flagellants whipped themselves on the back with needle-sharp knotted whips three times a day for 33 days (the age of Jesus) to invite God to show mercy. Medical masters in Paris turned to astrology, blaming the conjunction of Saturn, Jupiter, and Mars under the moist sign of Aquarius in 1345, which followed both solar and lunar eclipses. The terrestrial cause was thought to be air poisoned by noxious gases released during earthquakes or by rotting carcasses in swamps.

Human Sacrifice

Throughout ancient Central and South America, Europe, Africa, and Asia, child sacrifice was an all-too-common means of appeasing the gods: to ensure a good harvest, to win wars, to end droughts or plague. In the Hebrew Bible, for example, the king of Moab is said to have given his firstborn son and heir as a whole burnt offering (olah, the same word used of the Temple sacrifice). Another practice, live burial, persisted from antiquity. It was frequently considered necessary to entomb living creatures, even people, within the foundation of a building, as a sacrifice to the soil that had to bear the structure’s weight. By this cruel custom people hoped to attain permanence and stability for great buildings. “We read in Thiele (Dänische Volkssagen, I, 3) that the walls of Copenhagen always sank down again and again, although they were constantly rebuilt, until the people took an innocent little girl, placed her on a chair before a table, gave her toys and sweets, and while she merrily played, twelve masons covered the vault and finished the wall, which since that time remained stable.” –P. Carus

How should we view these deadly misconceptions?

Before we criticize these ancient people for their seemingly barbaric practices, let’s put ourselves in their shoes for a moment (did they have shoes?). War was frequent. Systems of government and policing were weak if not non-existent, producing a might-makes-right culture.

Many of these practices also revolved around illness. These societies were unaware of the causes of disease and therefore nearly powerless to prevent or treat it. Staying alive was a difficult endeavor: stillbirth, maternal infection, food insecurity, and disease (plague, dysentery, typhoid, cholera, and more, due in part to open sewage and unclean drinking water) were constant. Around one-quarter of infants died in their first year of life, and around half of all children died before they reached the end of puberty [ref]. People had little alternative but to pray and give sacrifice to imagined gods for protection from mysterious illnesses. It wasn’t until Ignaz Semmelweis’s antiseptic work and Louis Pasteur’s experiments established germ theory in the 19th century that we understood microorganisms to be the cause of disease.

Many societies also did not see death as final. They believed in an afterlife, so human sacrifice was not considered as cruel as we view it today. And these societies were deeply ignorant of the workings of nature: weather, agriculture, the basics of civil engineering. Consider that Newton’s laws were not formulated until the late 17th century. The neuron, the source of our thoughts, emotions, and behavior, was not identified until the late 19th century. And the structure of DNA, the molecule that encodes what makes us human, wasn’t determined until 1953.

Knowing how deeply ignorant these ancient societies were should give us pause about sourcing our beliefs from the stories that have endured from those times. Some beliefs were easily falsifiable, such as the sun being the wheel of Apollo’s chariot. Others are unfalsifiable by their nature. We cannot go back in time to prove Jesus was not born of a virgin and did not rise from the dead, but knowing those ancient societies were so often wrong should leave us, at the very least, highly skeptical. It seems obvious we should not hold the beliefs of these times as absolutely true; it is unlikely such deeply ignorant people had perfected morality. We heed no warning about the abominable shellfish, and most of us (at least in America) do not believe adultery or homosexuality to be worthy of death. And we know slavery to be immoral. Really: the Bible denounces shellfish, but couldn’t make it clear that slavery was wrong? In fact, it did quite the opposite, implicitly endorsing slavery by giving instructions on who could be enslaved, how slaves should be treated, and telling slaves to obey their masters.

This is not to say that science is infallible or has a monopoly on truth, but that we should give more weight to our recent discoveries and deeper understanding, and that we should practice methodologies that rely on evidence and are open to change when new evidence arrives. Take the ancient Roman aqueducts. They were an engineering marvel for their time, and what remains is still magnificent to behold. But to think a civil engineer today should learn only the state of the art from that era would be absurd. Determining morality and meaning today from ancient texts is a similar case. Searching the Bible, Torah, Bhagavad Gita, Qur’an, and the like is to resort to sources deprived of modern information and understanding, motivated by superstition and desperation.

We see the consequences of such beliefs when Christian Scientists’ children die for lack of lifesaving vaccinations or treatments, when priests resort to pedophilia rather than allow themselves to love another man, when women are forced to wear burkas or be “honor killed” by their parents or the “morality police,” when nations justify the extermination of other nations because they believe in the wrong God, and when suicide bombers kill a busload of infidels in heroic ecstasy, believing they are about to reach paradise. Tragically, these ancient traditions persist today. Right now, child sacrifice is being practiced in Uganda. Right now, over 200 million women and girls in 27 African countries, Indonesia, Iraqi Kurdistan, and Yemen have been subjected to genital mutilation: 98% of girls in Somalia and 97% of girls in Guinea have had their clitoris removed because of an ancient practice meant to control a woman’s sexuality and maintain her purity. Female circumcision is not even in a first-tier religious text, yet most of us find it reprehensible.

To denounce these practices is to recognize the folly of clinging to the misguided traditions of our ancestors, while realizing that a new secular morality can be built on logic, facts, compassion, and reason.

Part Seven: The Way Forward

If we’re meat computers shaped by hardware we didn’t choose and software we didn’t write, with viruses and faulty patches—if even our moral traditions are the guesswork of people who didn’t know what brains were made of—then what’s left?

More than you might think.

Suffering doesn’t need a soul to be real.

Determinism doesn’t make pain hurt less. A child with lead poisoning still struggles. A person with a hijacked reward system still can’t stop using. A brain wired for survival in a violent home still flinches at loud noises decades later.

The causes are mechanical. The suffering isn’t metaphorical. It’s the most real thing there is.

And if suffering is real, reducing it matters—not because a deity commands it, but because suffering is bad. That’s not a claim that needs external justification. It’s the starting point.

Blame is a blunt instrument. Engineering is sharper.

Once you see behavior as output—of genetics, chemistry, environment, and physics—the question shifts from “who deserves punishment?” to “what’s actually going on, and how do we change it?”

When a car’s brakes fail, you don’t deliver a sermon to the car. You fix the brakes. Or you keep it off the road.

Humans are more complicated, but the principle holds.

If childhood trauma rewires the brain for threat, fund early intervention. If poverty drains cognitive bandwidth, reduce the scarcity. If a drug can quiet compulsive urges, prescribe it without moralizing. If a neighborhood produces worse outcomes, change the neighborhood.

This doesn’t mean ignoring harm or abandoning consequences. It means adding better tools to the kit. Diagnosis alongside judgment. Repair alongside punishment. Prevention alongside blame.

The fellowship was real. We need to rebuild it.

Religion did something right: it gathered people. It marked births and deaths and marriages. It gave structure to grief and celebration. It made people feel like they belonged to something larger.

When we discard the metaphysics, we shouldn’t discard the community.

The challenge is building secular institutions that serve the same needs—without requiring belief in things that aren’t true. That’s not impossible. It’s a design problem. Support groups, civic organizations, secular rituals, chosen families—these exist. They need to be strengthened, multiplied, made as accessible as a church on every corner.

Humans are social animals. We wither in isolation. The answer to religions that once thought epilepsy was demon possession isn’t atomized individualism. It’s better community.

You will keep feeling like you’re choosing.

Here’s the strange part: understanding that you’re a deterministic system doesn’t change the experience of being one.

You’ll still deliberate. You’ll still weigh options. You’ll still agonize. The narrator in your head won’t shut up just because you’ve learned it’s not the author.

The question was never whether to “believe in” free will. The system will keep running its program either way. The question is what to do with the knowledge.

And the answer is: use it to be less cruel.

To others—because their failures are as determined as yours. To yourself—because your failures aren’t proof of a rotten soul, just a system under strain.

What you are.

You didn’t choose your genes, your childhood, your parasites, or your neighborhood the day you lost your temper. You didn’t design the competing voices in your skull or pick which one would win.

You are a process. A pattern of matter briefly organized into something that can ask what it is.

That’s not a curse. It’s what it means to be real—to be made of the same stuff as stars and weather and everything else, running on the same physics, embedded in the same web of causes.

The universe doesn’t care about you. But you can care. About others. About reducing suffering. About building something better than what you inherited.

That might be the only thing that matters. And it’s enough.
