Height advantage in hiking

2 min read

For an outdoorsy, not-so-tall girl, it’s not uncommon to wind up at the back of a pack of significantly taller, male hiking companions. Sweaty and panting, I watch their backpacks recede farther up the trail, and even the sweep guy might abandon his role to bolt around me. In an endurance situation, mental fatigue sends the foggy brain into rhythmic, ineffectual loops. Unable to do mental arithmetic while moving, one can only see that the negative-space triangle formed by others’ legs is larger for taller people, and imagine that this reflects some advantage… but how much advantage?

Later, off the trail, pen and paper in hand, one can focus on calculating how much this height advantage adds up to, in terms of explaining how a physically fit person might lag so far behind:

- Height of the taller person: __ inches
- Walking cadence: __ steps per minute
- Stride angle (angle between legs at full extension point): __ degrees
- Height of the shorter person: __ inches
The taller of the two hikers, being __ inches tall, has an assumed leg length (measured from the hip-joint pivot point) of __ inches. Given his stride angle of __ degrees, he takes steps that are __ inches long. At his walking cadence of __ steps per minute, he thus hikes at a rate of __ miles per hour.

Meanwhile, the shorter person has an assumed leg length of __ inches. Despite using the same stride angle and walking cadence as her companion (i.e., putting in the same amount of effort), each of her steps is smaller, and she therefore covers ground more slowly, merely due to being shorter!

In order to keep up, the shorter person must work harder, by either:
(a) taking her small steps more rapidly, at a faster cadence of __ steps per minute; or
(b) matching her companion's walking cadence but making each step longer, by using a wider stride angle of __ degrees. (As efficiency-conscious runners well know, increasing step length beyond what is optimal for one's height has a dramatic effect on tiredness.)

Alternatively, if the shorter person exerts only the same effort as her taller companion, she will fall behind by __ miles per hour. In that case, the taller person will have to pause (and get to rest!) for __ minutes every hour while the shorter person catches up.

[Note: We’ve made the simplifying assumption of leg length as a fixed proportion (45%) of overall height – a reasonable constant, given that average ratios of leg length to height, and step length to leg length (a function of stride angle, which correlates positively with speed) enable trackers to infer height from footprints.]
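The interactive calculator’s arithmetic can be reconstructed from the note above. Here is a minimal sketch in Python: leg length is taken as 45% of height, and step length is the chord of the isosceles triangle formed by the legs at full extension. The example inputs (a 74-inch and a 63-inch hiker, 110 steps per minute, a 30-degree stride angle) are my own hypothetical values, not the calculator’s original defaults.

```python
import math

LEG_RATIO = 0.45      # assumed leg length as a fraction of height (see note above)
IN_PER_MILE = 63360

def step_length(height_in, stride_angle_deg):
    """Chord of the isosceles triangle formed by the legs at full extension."""
    leg = LEG_RATIO * height_in
    return 2 * leg * math.sin(math.radians(stride_angle_deg) / 2)

def speed_mph(height_in, cadence_spm, stride_angle_deg):
    """Hiking speed implied by height, cadence, and stride angle."""
    return step_length(height_in, stride_angle_deg) * cadence_spm * 60 / IN_PER_MILE

def matching_cadence(h_short, h_tall, cadence):
    """Option (a): the faster cadence the shorter hiker needs at the same stride
    angle. Speed scales linearly with height, so the factor is h_tall / h_short."""
    return cadence * h_tall / h_short

def matching_angle(h_short, h_tall, stride_angle_deg):
    """Option (b): the wider stride angle the shorter hiker needs at the same cadence."""
    target = step_length(h_tall, stride_angle_deg)
    return 2 * math.degrees(math.asin(target / (2 * LEG_RATIO * h_short)))

# Hypothetical example: 74-inch and 63-inch hikers, 110 steps/min, 30° stride angle
tall, short = speed_mph(74, 110, 30), speed_mph(63, 110, 30)
gap = tall - short              # mph by which the shorter hiker falls behind
rest = 60 * (1 - short / tall)  # minutes the taller hiker waits per elapsed hour
```

Because every length in the model scales with height, the taller hiker’s speed advantage at identical effort is exactly the height ratio (here 74/63, about 17% faster), whatever cadence and stride angle are plugged in.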

Other factors driving differential physical effort between two companions are undoubtedly afoot during a hike: aerobic fitness, anaerobic endurance, strength-to-weight ratio, movement/form efficiency, backpack contents, stomach contents, sufficiency of recent sleep, injuries, performance of clothing/gear, and who’s chatting more than listening. Still, the point here is that leg length alone has a substantial impact on rate of travel. Regardless of which physical issues contribute to the exertion asymmetry, the optimal solution for both hikers (assuming they value fairness and social interaction) is to “put Herbie in front” — i.e., have the disadvantaged hiker set the pace.

Eli Goldratt’s 1984 classic The Goal vividly illustrates this principle of operational efficiency with…a hiking example! Herbie (the fat kid in the book; in our case, the short hiker) is the bottleneck. When the fast kids hike at their own pace with Herbie at the back of the single-file line of boy scouts, Herbie falls behind. They impatiently wait for him at trail intersections, only to take off hiking again the moment he catches up, before he catches his breath. Herbie gets more and more tired, and thus even more physically disadvantaged, since fatigue initiates a self-reinforcing feedback loop in terms of physical performance. Meanwhile, the fast kids get periodic rest, and so the effort differential increases from both directions. Putting Herbie at the front of the line — combined with distributing his backpack load among the fast kids — ensures that the hikers stay together and evenly spaced, and that the physical effort difference is somewhat lessened. (The effort the fast kids forgo by hiking below their capabilities is less than the effort Herbie saves by no longer being in chronic, desperate catch-up mode.)

Posted in <5 min read, Interactive calculators, Main, Math is everywhere!, [All posts]

Atheist Grace

4 min read

On Thanksgiving Day, Americans of diverse spiritual affiliations follow the tradition of saying grace: atheists who have never subscribed to supernaturalism, atheists who deconverted from a religion they are now pushing against, post-theists who reject the theist-atheist dichotomy, and theists of Jewish, Muslim, Christian, Hindu, Sikh, Buddhist and other faiths. According to household surveys, theists do not have a monopoly on pre-dinner blessings. Still, non-theists may be uncomfortable with a ritual that seems religious. We shouldn’t be.

According to the Hebrew Bible’s overarching theme, all historical outcomes are controlled by God. In the Book of Deuteronomy,[1] a dying Moses reminds his people to never mistakenly believe that the might of their own hands is responsible for a success. Numerous Biblical legends about improbable, underdog victories reinforce this message: long odds can only be overcome with supernatural intervention. Israelites carried the ark of the covenant into battle, hoping for divine empowerment. But, when opponents were evenly matched, God withheld victory,[2] lest the Israelites conclude that their own hands had won them the day.[3] Iron Age people saw most of life as outside their control. The righteous suffer. The wicked prosper. Hard work and cleverness aren’t enough.

The truth in that 2600-year-old text has been reconfirmed by 21st-century research on class mobility, racial and gender privilege, and drivers of longevity and wealth accumulation. Today, we know that men walk more freely through the world than women do, with risks further away and opportunities in closer reach. We know that, compared to people of color, white people enjoy less fear of unwarranted police brutality and false incarceration, and more access to education, jobs, and promotions. We know from empirical data that childhood adversity explains much of later life outcomes. We understand that mental illness is a genetic sentence, exacerbated by exogenous circumstances. We know that men often earn more than women in the same job, while their lives cost less than women’s do; as a result, their higher net disposable income enables them to invest more, spend more on recreation, and afford to take greater physical and professional risks. And, we now know that the particularities of birthplace, graduation year, and one’s parents’ economic circumstances have more influence on net worth than raw intelligence does.

Compared to our pre-scientific, Iron Age ancestors, we can explain far more of our unmeritocratic world. And, in so doing, we concur with them that hard work and cleverness are not enough.

Thus, we must humbly also recognize that our own hands did not win us this feast here today. For that great portion of life that is indeed outside our control, today we come together to express gratitude. Giving thanks motivates some 90% of Americans to gather for a special meal on the fourth Thursday of November, and 15% of us to travel out of town to do so. Our people are moved by this beloved holiday.

So, who is it that we are thanking?

The religious impulse has been explained as gratitude without a concrete target, directed instead toward an abstract deity.[4] Humans conjure a supernatural controller for that great portion of life that is outside our control and that we cannot otherwise explain. Randomness causes angst, and religion offers comfort.

Thanksgiving began in 17th-century America as a Protestant Christian holy day, to thank God for a good harvest and for other positive events which had no known explanation. Most human societies have had an autumn harvest festival to thank deities for food supplies. Also, most religions feature sacramental meals, making Thanksgiving easily conceptualized as ecumenical and inclusive.

Alternatively, one can instead be grateful for statistical happenstance, without invoking any supernatural target. Non-theists accept that the great portion of life is both uncontrollable and unexplainable. Gratitude need not have an object. We are simply thankful for what is here today, mindful of the capriciousness and preciousness of human existence. Piety has been described as an awareness of absolute dependence on the infinite.[5] In that sense, non-theists and theists around the table today share piety as well as gratitude.

From Thanksgiving’s Christian Vorlage to today’s secular national holiday, we also carry forward the emotional value of ritual. Food serves as an identity marker; sharing food promotes bonding; and, drama intensifies experience. By sanctifying — literally “setting apart” — this dinner from the mundane consumption of everyday food, we enable a more profound sense of gratitude, be it of the object-directed or non-directed variety.

Therefore, whether one wishes to thank good fortune or thank divine providence, we can all agree and pause to reflect:

For the food we are about to receive

and for the fellowship at this table,

may we feel truly and humbly grateful.

 

Happy Thanksgiving!

— Denver 2017

[1] Deuteronomy 8:17

[2] Judges 7:2

[3] Psalms 44:3

[4] Richard Dawkins, contemporary British evolutionary biologist

[5] Friedrich Schleiermacher, German theologian active at the turn of the 19th century

Posted in <5 min read, Religion, [All posts]

Foreskin: A 3000 Year Epic

4 min read

The Bible is an anthology of foreskin stories and penis parables.

The mythical history of Israel begins in Genesis with Abraham, the legendary Iraqi-born patriarch of Hebrew patriarchs. Living in northern Syria, he agrees to practice circumcision in exchange for a divine land grant of several thousand square miles in Palestine [Genesis 17]. Abraham cuts off his 99-year-old foreskin, that of 13-year-old Ishmael, and later the tiny foreskin of 8-day-old Isaac. According to this “Abrahamic Covenant”, the punishment henceforth for a Hebrew man keeping his foreskin will be excommunication.

From the Hebrew Tanakh through the Christian New Testament, the Bible’s narrative arc hinges on foreskin removal policy: If Abraham hadn’t gone under the knife to secure the divinely-promised land, then Moses couldn’t later lead Hebrew-speakers from Egypt back to that land, and thus Judaism wouldn’t have become a thing in Palestine, and therefore a dissident Palestinian Jew wouldn’t one day be posthumously deemed a salvific demi-god, et cetera.

Ritual circumcision was common to many of the Bronze and Iron Age cultures around Biblical Canaan. The practice may have originated as a substitute for child sacrifice. For Hebrew-speaking Canaanites, removing the foreskin symbolized submission to and bond with God. Accordingly, their scriptures exhort men to also remove the “foreskin of their hearts” [Jeremiah 4, Deuteronomy 10].

The Hebrew Bible portrays male genitalia as potent. Damaged testicles disqualify a man from Yahveh worship [Deuteronomy 23]. Divine law says that, if a woman grabs a man’s junk to stop him beating up her husband, her hand can be amputated as punishment [Deuteronomy 25]. Back in the opening story in Genesis, women and men are created concurrently and equally. But, in the second version of creation that follows, God fashions the first woman out of a man’s “lateral limb” (tsela) [Genesis 2]. Hebrew scholars lack certainty about the word’s meaning and have suggested it may be the penis.

Biblical-era men could rape their way into marriage, given a 50-shekel fee to the assault victim’s father [Deuteronomy 22]. In contrast to the Bible’s pervasive penis focus, knowledge of female genitalia is entirely absent. For example, if a bride’s hymen was already broken (as we now know most naturally are) or she didn’t bleed on the wedding night (which we now know is caused by unlubricated intercourse, rather than a physiological condition) she could be executed by stoning [Deuteronomy 22].

Penis entitlement also shows up as Sodomites demanding to gang rape male visitors [Genesis 19] and Benjaminites demanding to gang rape a fellow male Israelite traveler [Judges 19]. In both cases, a lower-value woman is offered to the mob. Logically, the Bible regulates male homosexual sex because it was prevalent. It’s described as an “abomination” (toevah) only as horrifying as the “abominations” of uneven scales [Proverbs 20], incense [Isaiah 1], and Egyptians dining with Hebrews [Genesis 43].

The Hebrew Bible writers apparently considered penis power dynamics too self-evident to necessarily warrant a full explanation. For example, the story of Moses violating the Abrahamic Covenant remains one of the Torah’s most perplexing passages. God is about to kill off Moses for neglecting to circumcise his infant son; but, the resourceful Zipporah snips off the baby’s foreskin just in time [Exodus 4]. That one bloody, little foreskin changes history: Moses survives to later liberate some 1.5 million Israelites out of Egypt and receive divine legislation that shapes the course of Western civilization. Zipporah doesn’t get her hand cut off, thankfully; but she doesn’t get acknowledged as a heroine either.

The ponderous “Law of Moses” emphasizes the importance of foreskin removal. In fact, the “rest-on-the-Sabbath-or-die” commandment [Exodus 31] is interpreted to exempt 8th-day circumcision work, which is theologically more important than Sabbath observance [Mishnah, Gospel of John 7]. Even so, the Israelites fall off the circumcision wagon during their legendary time in Sinai. After wandering in the desert for 40 years, the Israelites launch a genocidal (and now known to be non-historical) invasion of the territory purportedly promised to Abraham six centuries prior. D-Day for Canaan begins with a pit stop at the “Hill of Foreskins” to circumcise 40,000 Israelite troops [Joshua 5]. 

Once living in the Promised Land, the Israelites spend centuries skirmishing with the uncircumcised Philistine “Other”. Archaeology identifies the Philistines as an amalgam of Aegean and European peoples who maintained a culturally-distinct civilization on the Canaanite coast from about 1200 BCE until 604 BCE, when they were permanently wiped out by Babylonian conquest. In the Hebrew Bible, when Philistines capture the sacred ark of the covenant, Yahveh punishes them with an unspecified genital affliction [1 Samuel 5]. Archaeologists suggest that Philistine gold phallus totems may relate to this tale.

Next, an ambitious young Bethlehemite named David trades 200 dead Philistines’ foreskins for permission to marry princess Michal [1 Samuel 18]. David’s circumcision-murders pave his way to kingship…and to a thousand-year lineage that will be credited with producing a Messiah named Jesus. Today, Jesus is considered a god or important prophet by 55% of the world population – a massive “butterfly effect” dependent on butterflying enemy foreskins.

In the 8th century BCE, the Bible narrative begins to align with history. Assyria subsumes Israel, whose refugees trigger a renaissance in Judah. The Hebrew cultic ritual policy triad (foreskins-bacon-idols) oscillates between liberalism and conservatism for a few generations, until the caesura of Babylonian conquest. Exile of the elites — a trauma so definitive that all subsequent Jewish scripture will be generalized as “post-exilic” — is also explained within the familiar penis-power framework: Donkey-sized penises producing voluminous ejaculate tempted the body of Israel away from her husband-god Yahveh [Ezekiel 23].

When Hellenist darkness descends upon late 4th century BCE Palestine, circumcision suddenly constitutes a capital offense. Foreskin restoration also becomes a thing, so that Jewish men can participate in naked sports without offending the Greeks [1 Maccabees 1]. But, the foreskin policy pendulum soon swings to the opposite extreme, when the Levitical Jewish Hasmoneans assume power in the mid-2nd century BCE. Jews forcibly circumcise lapsed brethren at home and conquered Gentiles in modern-day Lebanon, western Jordan, and southern Israel [1 Maccabees 2, Josephus]. 

The turn of the millennium era in Jerusalem produces the secular erotic poetry book “The Song of Solomon”. In stark contrast to the exploitative, non-consensual sex elsewhere in the Bible, this text celebrates mutual devotion and sexuality between an earnest, tender man and an empowered, eager woman. Penises make an appearance (as sweet fruit), but so do vaginas (as a pomegranate orchard). Foreskins are not mentioned and presumably absent.

Around the same timeframe, a certain Torah-abiding couple in Bethlehem has their infant son Jesus circumcised on the traditional 8th day. Over subsequent centuries, multiple alleged foreskins of Jesus surface in the sacred relics circuit.

The New Testament doesn’t ascribe any penis talk to Jesus. For once in the long history of Canaan, penises and penis sheaths are not the main focus. Nonetheless, a 2nd or 3rd century CE lost scripture describes Jesus presenting a handful of his own semen to Mary Magdalene after pulling out of another woman [The Greater Questions of Mary].  

After Jesus’ untimely death, foreskin policy controversy erupts again. Many of his early followers assert that spiritual salvation requires foreskin removal — even for Greeks eager to join the new movement. Another faction notes that pre-Abrahamic Hebrews like Noah and Enoch had fared just fine with foreskin-covered penises. Some aver that foreskin removal is not just unnecessary, but spiritually harmful. Symptomatic of this rancorous debate, the New Testament mentions circumcision almost as often as crucifixion.

The Apostle Paul (himself a circumcised Jew) has the game-changing epiphany that, as a practical matter, adult male circumcision inhibits conversion rates. His advocacy of foreskin preservation enables Christianity’s rapid spread across Europe and the Middle East. “We are the circumcision!” goes one placard-worthy argument [Philippians 3].

Paul treks 400 miles from Antioch south to Jerusalem to consult authority figures regarding his radical policy of foreskin retention [Acts of the Apostles 15]. Because Paul wins his case, Christian men will be identifiable by their intact foreskins for the next couple of millennia – a feature exploited to unspeakable horror in the Holocaust. As one exception, circumcision has been common among British Christian aristocrats, because the royal house of Windsor traces its lineage back to the Jewish House of David.

Some six hundred years later, the Quran incorporates many Jewish and Christian ideas, but not the theological centrality of foreskins, which are not mentioned. If there were a historical Muhammad, he likely would have had an intact foreskin, living in the Christian-majority 7th century CE Roman province of Arabia Petraea. It was in post-Quranic commentary that Islam adopted the Jewish custom of male circumcision.

In the 20th century, non-religious circumcision suddenly became popular in the West, fueled by pseudo-scientific health claims. By the 1950s, circumcision rates peaked at 90% in the United States (compared to 20% in Britain). Today, the rate has declined to 60% (5% in Britain). Increasingly, even Biblical literalists decide the issue on medical or aesthetic grounds. Abrahamic religion’s 3000-year-long foreskin story is over.

Posted in <5 min read, Religion, Sex and relationships, [All posts]

Feminist kiteboarding

4 min read

A friendly shout out to the other weekend’s all-male kiteboarding gang:  Thank you for not assaulting me! 

Seriously.  I am sincerely grateful to have been treated like a mundane, non-gendered human being.  Nobody hurt me, treated me like fresh meat, or shunned me for declining to be cajoled into sex. 

Over the past 6 years, I’ve taken 16 unaccompanied, overnight kiteboarding and snowkiting trips, traveling from 250 to 5000 miles from home to enjoy my all-time favorite sport.  On 10 of those trips (63%), I’ve been hurt, in my capacity as a woman, by a male kiter. 

That’s the world we live in.  Also, that’s how much I love kiting. 

The most painful thing they do is proposition you and ostracize you when you decline.  Maybe that doesn’t sound that bad?  Do you need to hear about the more salacious incidents to feel moved?  Being cast out of the village is an age-old punishment for non-compliance with the social order.  It hurts deeply.  Twice I’ve sat alone on a dark beach on Christmas Eve, sobbing into my dog’s furry shoulder.  For me, the invalidation and invisibility of that type of injury pains me more than physical aggression.

Not only is social isolation existentially painful, but in any case, kiteboarding is by definition a group sport.  One needs other people:  to drive multiple cars for downwinders, to exchange tips about aerial tricks, to take cool photos of one another, to pool emergency spare parts on a beach far from kite shops, and — most importantly — to share après-kiting camaraderie.  Then there is the teamwork prudently required to launch and land kites (the highest-risk step of kiting).

More often than not, when I have explained this reality, the listener suggests I should stop kiting. In 2017, as has been the case for millennia, the default solution to male aggression is for women to curtail life activities…not for men to refrain from aggression.

The media reports how a girl in an Islamist regime defies death threats to continue playing her beloved sport of basketball. Americans reflexively decry such barbaric curtailment of freedom, liberty, self-expression, and self-determination. But the same people apply a double-standard, admonishing me to abandon my kiteboarding hobby. The Muslim girl’s transgression is less threatening to you, because she’s not here asking you to include her as a peer; instead, she’s far away and unthreateningly playing on a team with her own kind.  It’s easier to have moral courage from afar and to support civil rights in the abstract.  In America, similarly to less wealthy and less democratic nations, women face higher external risks than men do to play the same games and enjoy the same weekend activities. Girls in Somalia should be just as free as boys to play basketball. Girls in America should be just as free as boys to kiteboard. That’s what feminism means. 

Despite being born with different genitals, we women have the same desire as men — to play, to explore the world, to experience intensity and challenge in sports, to be physical in our bodies, to revel in nature.  Are you mentally preparing a few contrarian data points to insist otherwise?  Don’t.  This is something we know – as much as scientific method can enable us to know a thing.  Study after study shows that, to the limited extent that gender-correlated differential desire to play sports exists, it reflects strategic adaptation to social constraints and cultural norms.  Women love sports as much as men do, but we’ve accommodated to constraints from an early age. Essentialism is the lazy, responsibility-abdicating rationalization of the unconsciously-privileged and ethically-compromised. 

Women are asked to change and constrain normal human behavior, to change our walking route, our clothing, our housing situation, our hobbies, and our vacations in order to reduce hate crimes against us.  Society is slowly awakening to the absurdity of demanding that would-be victims self-segregate and self-censor.  Rational people will someday soon agree that it is men who must change their behavior and make different choices.  Rape is only and entirely the fault of the rapist – it is not in any way ever the fault of a woman exercising her equal human right to stay out late at a party, enjoy a starlit walk on a beautiful beach, or wear a tank top in hot weather.  Ostracizing or badmouthing a woman who ignores or deflects advances is only and entirely a moral failing of the man – it is not in any way ever the fault of a woman joyfully pursuing a beloved pastime. 

Consider that I, too, find adrenaline sports intensely erotic; but, I don’t believe I’m automatically entitled to co-opt your body and your experience in service of mine.  Remembering that everyone you encounter is living a life as complex as your own, with their own oceanic stories, fears, and desires – that’s the hard and simple trick to treating women like people.

Did you know?  In this country, women are sometimes killed for trying to leave a relationship and upon filing for divorce.  Every year, there are incidents where a woman is physically attacked or murdered for not saying yes to a date, giving out a phone number, accepting a car ride, or consenting to sexual activity with a stranger on the street.  Our physical integrity is neither universally socially respected nor politically guaranteed.  “Consent. Or I’ll make you consent” is a widespread attitude among men.  Consequently, for women, the better part of valor sometimes is acquiescing to undesired interactions, rather than risk a battle.  (Hence, the gross underestimation of the prevalence of non-consensual sex. And, hence, the welcome new custom of seeking affirmative consent.)  More often, in response to being rebuffed, the man questions your sexual orientation, calls you ugly names, attacks you online, sends you a torrent of hateful threatening messages, tells everyone he slept with you anyway, starts dangerous rumors about you, or socially ostracizes you.  Saying “no” is risky. 

So, I am sincerely grateful for that blissfully drama-free weekend at the beach. 

Though, it didn’t stay that idyllic…

But first, an aside on gratitude:  I am tired of being told to be “grateful” for the “compliment” of not being too old or too ugly to be a sex object.  Indeed, the only other female kiter I’ve since met at that kite spot (whom others reference by shorthand as “the heavyset one”, when actually all they need to say is “the woman”, just like in the snowkiting paradise of Sanpete County, Utah one only needs to ask where “the bar” is because there’s only one…) insisted to me that she has never once been excluded or hurt from male kiting gangs, and I thus have suspiciously bad luck. So, the boys are correct that I could allow myself to feel grateful; when they exclude me it’s only because I’m appealing, whereas they don’t exclude her because she’s not appealing.  In their worldview (lately normalized on a grand stage by President Trump), women’s worth is tied to appearance.  But in my experience of human existence, a superficial compliment doesn’t cancel out the pain of exclusion.  The two categories have no exchange rate.  Apples and ocelots.  One has absolutely nothing to do with the other.

One must also be thoughtful to recognize that “gratitude” is the infamous benediction from rapists; they are known to tell victims to feel grateful for being good enough to be chosen.  On one kiting trip, I passed out from drinking (for the first time in 20 years, which I resentfully feel compelled to stipulate here, because we judge women more harshly for smaller transgressions), and woke up to a man telling me I should be grateful that he didn’t do anything to me. In his estimation, I was good enough to be spared. Either way, they choose for us, and we are deflated by the reminder of our chronic vulnerability.

The fantastical, heart-warming uneventfulness of that particular kiting weekend at the lake has since been superseded by a more familiar dynamic:  About half the kiters (“male kiters” being virtually redundant) shrugged off my offer to trade contact info. The privileged dismissal goes that I should passively await “word of mouth” to find out about kiting. (However, the men coordinate amongst themselves using telecom tools, not via word of mouth. It’s an unveiled blow-off.)  Others pulled a Mike Pence, shunting me off to connect with the non-kiter girlfriend with whom I have little in common. Another weekend, I was left wandering down a beach one moonless midnight looking for someone who invited me to kite, but then inhospitably ghosted me when I arrived, leaving me to pitch a tent in the parking lot until the light of dawn welcomed me instead. He said it had nothing to do with me politely declining to date him, but he never spoke to me again. Another weekend, one guy texted me that he wasn’t going to the lake; but it turned out that, at the same time, he told the rest of the gang that he was going (and did go). Later, I was the only kiter from the group not invited to a snowkiting networking social night in town, despite two kiters asking the host to invite me (and me having more snowkiting experience than most of them). One of the non-kiting girlfriends was the only woman at that gathering; she wondered why I wasn’t there but stayed silent, despite knowing full well how much I longed to be included. (So, not only is it #notallmen, but it is #somewomen who can’t relate their abstract civil rights platforms to concrete situations right in front of them where they could make a difference. Overcoming the bystander impulse is hard…even for those who rail against the people who don’t overcome the bystander impulse.) 
Now, a few weeks after my ecstatic weekend of feeling like I might finally be accepted as just another human kiteboarder, I’m left with the tired realization that every one of those Denver kiters has blown me off.

Kiteboarding sub-cultures have no reason to be immune to the biases and bad behaviors of wider society. Nonetheless, running into that social imperfection in an otherwise idyllic, recreational context surprises me every time.  For me, kiteboarding is an incomparable peak experience where the boundaries of the Self dissolve, the chaotic brutality of wind and waves elevates nature to the sublime, and through the sacrament of this sport, I glimpse the infinite. I won’t ever quit. It is totally worth it. 

Posted in <5 min read, Kiteboarding, Personal, Social issues, [All posts]

Ezekiel Bread and Reading Comprehension

4 min read

Food for Life Baking Company makes sprouted whole-grain “Ezekiel 4:9 Bread”.  The southern California-based company characterizes its product as “crafted in the likeness of the Holy Scripture verse Ezekiel 4:9 to ensure unrivaled honest nutrition and pure, delicious flavors”.

Who is Ezekiel?

Ezekiel was an ecstatic prophet of doom in 6th-century BCE Judah.  Canonicity of the scripture attributed to him was controversial among Second Temple Jews; but, it was ultimately included in the Hebrew Bible.  Consequently, Ezekiel is considered a major prophet in Judaism, Christianity, and Islam.

The Book of Ezekiel is filled with fanciful, proto-apocalyptic imagery.  In Jewish tradition, the opening chapter’s strange, psychedelic vision is so “dangerous” that it should only be read by (male) adults.  The majority of the famously-bizarre New Testament Book of Revelation is borrowed directly from The Book of Ezekiel (along with borrowings from the Book of Isaiah, Book of Daniel, Book of Psalms, and several extra-canonical Jewish apocalyptic scriptures).   

What does Ezekiel 4:9-13 actually say?

Ezekiel is four chapters into describing an extended religious vision:  The angry Israelite god, Yahveh, tells Ezekiel to make bread from a combination of then-common cereal grains and legumes, to be baked over human feces.  Yahveh explains that Ezekiel’s suffering from eating this offensive bread constitutes a divine sign to all the Israelite people – symbolizing their imminent, horrific divine punishment for incomplete devotion to Yahveh.  Yahveh says his wayward people will experience the demolition of Jerusalem, violent death of 2/3 of the population, foreign exile, and having to eat ritually impure food while in exile.  In subsequent chapters, Ezekiel goes on to preach to the Israelites that they must accept impending annihilation by the Babylonians as just punishment for their transgressions against Yahveh.

The bread recipe in Ezekiel 4:9 is explicitly for punishment, not nourishment.

What should Biblically-accurate “Ezekiel bread” contain?

  1. hitta = durum wheat
    • Food for Life’s Ezekiel 4:9 Bread substitutes modern common wheat
  2. seorah = barley
  3. pol = fava beans (aka broad beans)
    • Food for Life’s Ezekiel 4:9 Bread substitutes soybeans, which weren’t grown in the Near East / Middle East until a few centuries after The Book of Ezekiel was written
  4. adasa = lentils
  5. dohan = probably millet (the Paleo-Hebrew word is a hapax legomenon – i.e., occurs only once in known writings — so its meaning had to be inferred by scholars from an Akkadian homolog)
  6. kussemet = emmer wheat (aka true farro)
    • Food for Life’s Ezekiel 4:9 Bread substitutes spelt, which is a softer-hulled hybrid of emmer

The Hebrew text says nothing about sprouting the grains.  Pre-soaking has always been a technique to make edible food from unmilled grain.  However, the Hebrew text says nothing about whether to keep the grain whole or mill any of the bread ingredients.

The Hebrew text also says nothing about yeast or salt, which are ingredients in Food for Life’s Ezekiel 4:9 Bread. 

Health claims

Ezekiel bread is certainly healthy – but not because it follows a (misinterpreted) 2500-year-old Bible verse. 

First, commercial Ezekiel bread doesn’t contain any additives or preservatives — which is why it’s sold in the refrigerated section of grocery stores.  It is widely understood that this is likely healthier — though impractical for mass distribution and feeding the planet.

Second, Ezekiel bread uses some “ancient grains” that were long-ago displaced by much higher-yielding, hybrid grain species that can tolerate a wide range of climates and require less processing effort.  That critical crop innovation (combined with irrigation) made possible the past few thousand years of rapid human civilization growth.  Today, wealthy Westerners are re-discovering these grains because they’re tasty and also have higher protein content relative to gluten.  But, a bread recipe using high carbon-footprint cereal grains and vegetables (and baked using smog-creating excrement fuel) cannot feed the world – and neither reflects lost natural wisdom nor evidences divine knowledge about human nutrition.  (Note that Hebrew biographers of the omniscient Yahveh didn’t record him mentioning the gluten-free “superfood” grains amaranth and teff, which were thriving crops in the Americas and Africa during Biblical times.)

For all of human history, coarse foods were for poor people and refined foods were for rich people.  However, humans have recently realized that whole grains are more nutritious than refined grains (bran and germ removed).  In a relatively abrupt reversal of millennia of food economics, whole-grain bread is now more expensive than refined-flour bread.  So, if Ezekiel 4:9 implies use of whole grain flour (it doesn’t) or unmilled whole grains (it doesn’t), that would have been meant to convey low quality – not purity.  Accuracy in Biblical literalism requires reading the text through the lens of the time period in which it was written.

(Note: Anachronistic reading of the New Testament Gospel of Mark and Gospel of Matthew leads to similar interpretive confusion:  Just before being crucified, Jesus declines wine (oinos) laced with bitters.  That sounds awful…unless you know that, at that time, wine adulterated with bitter substances was used to dull pain.  Is the story writer’s point that Jesus declines the drink because it’s unappetizing (i.e., offered to him out of cruelty), or because its anesthetic properties would fog his experience of suffering (i.e., offered to him out of sympathy)?  We have no way of knowing.  Later, Jesus is offered “sour wine” (oxos) in a contextually-clear gesture of sympathy.  This is also confusing for modern readers unless you know that, at that time, sour wine was a common beverage perceived as refreshing.)

Background on wheat species

  • Einkorn wheat (aka “farro piccolo”)

Triticum monococcum

Hulled diploid wheat

Domesticated ~8000 BCE

  • Emmer wheat (aka “farro medio” or “true farro”)

Triticum turgidum dicoccum

Hulled tetraploid wheat; natural hybrid of two wild grasses

Domesticated ~8000 BCE

  • Durum wheat

Triticum turgidum durum

Naked (no hull) tetraploid wheat

Developed by human artificial selection from emmer wheat ~7000 BCE

5% of today’s global wheat crop

  • Spelt wheat (aka “farro grande”)

Triticum aestivum spelta

Soft-hulled hexaploid wheat; natural hybrid of emmer and another unknown grass

Domesticated ~5000 BCE

  • Common wheat

Triticum aestivum aestivum

Naked hexaploid wheat; hybrid of durum and spelt wheat

Developed ~1000 BCE; cultivars of this species then bred for increasingly higher gluten content

95% of today’s global wheat crop

Posted in <5 min read, Food, Religion, [All posts]

Decision analysis of friending versus dating

3 min read

Online matching platforms are best conceptualized as friend-finders rather than date-finders.  Given how difficult it is to find compatible people in the world, it is irrational to rule out the possibility of friendship (or activity buddy, or professional connection) being the optimal path with a cool person you meet online.  

Let’s say that, for a particular guy I’ve just met and like, my assessments of the future are:

  • Probability of a long-term romantic relationship success = 33%
  • Probability of remaining friends after a short-term dating fling = 25%
  • Value to me of a friendship with him = 80 (on a scale of 1-100)
  • Value to me of short-term dating with no ongoing friendship = 30
  • Value to me of a long-term romantic relationship/partnership with him = 100

Let’s visualize my assumed probabilities and values of potential outcomes as a decision tree [click on image to open larger in a new tab]:

decision tree_outcomes

If I ignore risk (i.e., probability of success), I might jump to the conclusion that dating is a “better” strategy than friendship.  So, let’s calculate the expected values (i.e., probability % * outcome value) of my two strategy alternatives:

decision tree_expvalues

The expected value of dating works out to 36.  The expected value of friendship is simply the 80 I assigned it, since befriending him is a certain outcome in this model.

Therefore, if my preference is to maximize expected value, I should befriend this guy instead of dating him.  This would be the risk-neutral, probability-weighted, “rational” strategy.  

However, if I prefer a strategy of maximizing potential value, I should date him.  I have a 50% chance of extracting more value from the connection by dating him than by friending him.  I could end up with a high value of 110 (16.8% chance) or 100 (33% chance)…or a low value of only 30 (50.3% chance).  This is a risk-seeking, high-Beta gamble for me.

Risk tolerance and decision optimization approach (i.e., strategy preference) are specific to each person in each circumstance.

Now let’s say the guy in question has done his own personal decision analysis as well (by filling out the interactive calculator below!), and he decides his preferred strategy is dating me.  He may value friendship less than I do — perhaps because he has lived in this city for a long time and already has plenty of friends.  Or, he may think I’m the best thing since sliced bread; so, he sees immensely more value in a romantic partnership compared to a friendship, and he’s perhaps also pretty confident that it would work out between us.  

He can’t do much about my decision criterion of preferring to maximize expected value.  But, he could still convince me to date him if he provides information that updates my initial assessments of the future.  For example, as I spend time with him, I might estimate a higher likelihood of long-term romantic relationship success.  As I meet his ex-girlfriends with whom he is still friends, I might perceive an increased probability of maintaining a friendship after dating him (thus limiting the downside risk of getting romantically involved).  As I realize he’s an upstanding feminist gentleman, I might update my valuation of a short-term fling.  [See my essay “Game theory of hookups” for detail on how critical a man’s demonstrated integrity is for increasing his appeal to women for short-term intimacy.]  Most importantly, as I get to know him and experience his appealing qualities, I might see him in a different light and come to value a long-term romantic outcome with him more than I do initially.  In other words, the more he starts to seem worth the risk and/or represents a lower risk, the more likely I’d want to take the risk.  And, of course, this all applies equally to both parties considering the friendship-versus-dating question.  

Input your own assumptions in this interactive calculator!:

[forthcoming]

Input your assumptions:

  • Probability of long-term romantic relationship = __33__ %
  • Probability of friendship after short-term dating = __25___ %
  • Value of friendship = __80___ [1-100 points]
  • Value of short-term dating = __30___ [1-100 points]
  • Value of long-term romantic relationship = __100___ [1-100 points]

Calculated results and recommendations:

  • Incremental value of long-term over short-term romantic connection = __70__
  • Expected value of dating = __36___
  • Maximum potential value among all theoretical outcomes = __110___
  • If you prefer a strategy of maximizing expected value, then choose ___friendship_______
  • If you prefer a strategy of maximizing potential value, then choose ___dating_______
  • Probability that potential-value-maximizing strategy yields better outcome than expected-value-maximizing strategy = __50___ %
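Until the interactive calculator arrives, the decision-tree arithmetic can be sketched in a few lines of Python.  The additive best case (fling value 30 plus friendship value 80 = 110) and all variable names below are my assumptions, mirroring the worked example above — not an official calculator:

```python
# Sketch of the friend-vs-date decision tree from the worked example above.
# Assumption (mine): the best-case outcome is additive, fling (30) + friendship (80) = 110.

p_longterm = 0.33       # probability of long-term relationship success
p_friends_after = 0.25  # probability friendship survives a short-term fling
v_friendship = 80       # value of friendship alone
v_fling = 30            # value of short-term dating with no ongoing friendship
v_longterm = 100        # value of a long-term romantic partnership

# The three possible outcomes of choosing to date, with their probabilities
p_best = (1 - p_longterm) * p_friends_after         # fling, then friends: value 110
p_mid = p_longterm                                  # long-term success: value 100
p_worst = (1 - p_longterm) * (1 - p_friends_after)  # fling only: value 30

# Chance that dating ends up beating the certain friendship value of 80
p_dating_better = p_best + p_mid

print(f"P(value 110) = {p_best:.4f}")                          # 0.1675, i.e. ~16.8%
print(f"P(value 100) = {p_mid:.4f}")                           # 0.3300
print(f"P(value 30)  = {p_worst:.4f}")                         # 0.5025, i.e. ~50.3%
print(f"P(dating beats friendship) = {p_dating_better:.4f}")   # ~0.4975, i.e. ~50%
print(f"Maximum potential value = {v_fling + v_friendship}")   # 110
```

Note how the two “dating wins” branches (16.8% + 33%) are what produce the roughly 50% figure in the results above.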

 

Posted in <5 min read, Decision quality, Interactive calculators, Sex and relationships, [All posts]

Suffering and its discontents: Reflections on the Bronze Age Collapse

8 min read

Once upon a time, there was a super-regional trade network of specialized economies that supported widespread prosperity and prolific innovation and creative output.  Then the climate shifted abruptly to cause drought and plummeting agricultural yields, stateless marauders suddenly appeared on the scene, and a sequence of earthquakes crumbled cities along a major fault line.  In less than one century, trade routes were severed, wealthy cities burned to the ground and were permanently abandoned, powerful governments dissolved, and refugee populations wandered the region. 

That was the Bronze Age Collapse of ~1225 to ~1125 BCE.  Civilization around the Mediterranean and Near East fell abruptly into a Dark Age of scarcity and bloodshed, from which it took 200 to 500 years to recover, depending on the particular region.

Societies are known to have survived acute droughts and famines in the past.  Societies have survived foreign invasions.  Societies have survived devastating earthquakes. 

But not all at once.  

All together you get one of the largest-scale social system collapses in the history of humanity.  Scholars suggest it was at least as dramatic as the fall of the Roman Empire and subsequent European Dark Ages. 

On an individual scale, we can also note that people have been known to survive losing a loved one.  People have survived being financially wiped out.  People have survived losing a home, or being displaced from their homeland. People have survived the cruelty of being excommunicated from their family, or the isolation of being suddenly ostracized from a social circle.  People have survived job loss, being shut out of their career, and thorny legal entanglements.

But not all at once.

All together you get the hyperbolic trials of Job – an evisceration of life so improbably comprehensive that it only exists in Biblical folklore to make a dramatically unsubtle point about the inscrutability of innocent suffering.

A recap of the Hebrew Bible’s Book of Job, written ~400-~350 BCE in Jerusalem:

  1. A man named Job loses everything at once: his children, his livestock and human slaves, his health.
  2. Smug “friends” of Job flail around for an explanation. They themselves have never suffered as comprehensively and profoundly as Job. They self-servingly say it must all be morally deserved suffering.  But the text has gone to great pains to impress upon the reader that poor Job is impeccably righteous and undeserving of any punishment whatsoever.
  3. Job is left bewildered by the false accusations of the self-satisfied bystanders. Since there’s no afterlife (the Hebrews didn’t believe in one), he doesn’t even get some lame platitude about posthumous scale-balancing in which to take meager comfort.  (Hellenistic mystery cults will shortly invent the theological platitude of individual salvation and afterlife.  A sect of apocalyptic Judaism will then re-brand the idea as “Christianity”.)
  4. God never answers the question of why Job suffers. He unsatisfyingly says (in a later interpolation appended to redeem the otherwise atheist-sounding text) to shut up and not complain or wonder about suffering.

What’s obvious to the contemporary reader is that it’s all random.  There is no deity meting out good and bad fortune.  Fortune is fortune.  That’s why we call it fortune.  If life circumstances are coin flips, and the 4th century BCE Judean village is a handful of coins, then some poor villager will end up with a “suspicious” run of 100 tails.  A pre-scientific village practicing immature ancient theism will understandably flail around for supernatural explanations of something that any statistician knows was expected.  The gaps of human understanding in which their god(s) abide will shrink over the next two millennia, until science obviates the psychological drive to posit a creator.  We now have a causal model of the world that accounts for all of the data — including the extreme outliers like Job, whom religion never succeeded in satisfactorily explaining.

Below is a list of 25 types of life tragedy.  How many have hit you… and simultaneously?  Just one can cause depression.  Most any two officially constitute childhood adversity.  Three or four is plenty for a memoir and motivational speaker gig.  

Simultaneous Life Tragedies Checklist  (X = happened to me; O = has not)

X Death or disappearance of spouse
X Unwanted legal divorce proceeding
X Loss of sibling
X Legal bankruptcy proceeding
X Loss of parent(s)
X Criminal or civil legal proceedings
O Loss of child
O Enslavement or false conviction and incarceration
O Loss of pet
X Acute cash flow strain
X Homelessness
X Major unrecoverable property loss
X Forced geographic displacement
X Complete financial wipe-out ($0 savings + $0 retirement + $0 income + $0 assets + $0 credit capacity + high nondischargeable liabilities)
X Victim of violence or other crime
X Loss of personal safety and security
X Emotional trauma / diagnosed PTSD
X Loss of social circle
O Serious physical health problem onset or terminal diagnosis
X Unexpected job loss
O Mental illness onset
X Permanent loss of career
O Catastrophic injury or permanent disability
X Loss of professional network to build new career
X Loss of ability to ever have children

[Notice that “relationship breakup” doesn’t even make the above list.  And “acute cash flow strain” is only on there so that people with mild problems won’t check “complete financial wipe-out”.  Getting dumped by a boy/girlfriend and having trouble paying off your credit card balance are an order of magnitude less traumatic than losing a spouse or being completely wiped out.  Failing to see that is a Type I fallacy (see below).]

In the space of 1 ½ years, I experienced nineteen of these twenty-five types of life tragedy.  Most were clustered in a period of four months.

In that period of time, I unexpectedly lost my beloved husband, my parents and only sibling, and my ability to ever fulfill my dream of having children.  I was rendered homeless in my hometown and then lost my autonomy as a refugee in someone else’s.  They eviscerated my social network and professional network, ended my career and the business I had been building, and put me in a position where many years later I’ve been as yet unable to resurrect a prior career.  Within that short timeframe, I was the victim of violence, revenge porn, hacking and online impersonation, and SWATting.  Overnight, I was left with no income, assets, or savings.  For a long time, I was sometimes hungry.  I began waking up screaming at night, which abated three years later.

There are certainly worse things that could have happened to me, which aren’t checked off above.  My floor isn’t by any means the bottom of the pit of human suffering.  The onslaught wasn’t accompanied by a catastrophic accident that permanently disabled me.  Unlike my ex-husband, I don’t have a mental illness.  Though my physical health was adversely affected by the turmoil, I wasn’t diagnosed with an incapacitating or terminal illness.  In the aftermath, I lost the power of free choice in unspeakably soul-crushing ways; but, I wasn’t falsely convicted and incarcerated for a crime.  And, I am not dead. 

Most of all, I saved my beloved puppy.  That has been everything — the fulcrum of restoration, my orienting purpose and incentive to persevere.  Six years on, the social isolation continues, due to my far-from-recovered economic circumstances.  And, as my now-aging dog’s health fails, the remnant heartlessly pulls away.

You who callously and inaccurately relativize others’ suffering by saying that “everyone has problems”… you who flippantly dismiss pleas for help and blame victims in order to maintain the fragile plausibility of your personal narrative of meritocracy…you who pay lip service to lofty liberal activism, but refuse help to a friend facing existential risk at home…you whose preoccupation with one or two personal setbacks displaces your capacity for empathy… You are the “friends” of Job.  

Most people — because they have checked “only” two or three boxes at once and were leveled by it — cling to a worldview that life is supposed to be fair and pleasant.  Theistic language or not, that is what they believe and express.  Their words reflect faith in mean reversion and a vague expectation that overwhelming suffering resolves eventually in compensatory hidden benefits and fairness.

People have four options for response to a friend’s suffering.  (See table below.)  It is a rare person who can acknowledge the non-relativism of unjust suffering around them, and can accept that it just is….and then can join me in finding contentment in such a world (Type IV).  I have learned to avoid people who refuse to acknowledge my experience (Type I), who reductively blame the victim (Type II), or whose price of acknowledging my experience is their unwelcome, projected darkness and stultifying pity (Type III).  Die Sonne scheint noch.

Typology of Responses to Suffering

(Click on table image below to open larger in a new tab)

Typology of Responses to Suffering

The Bronze Age Collapse metaphor extends thusly:  In the power vacuum that was created by armageddon in the early 12th century BCE Mediterranean and Near East, new civilizations took root.  People could no longer make bronze, because copper and tin deposits in the Near East are separated from one another by 2,000 miles.  Such a concentrated, vulnerable supply chain broke as soon as anarchy cut off the trade route.  So, people turned to harder-to-smelt but readily-available, single-ingredient iron…and then carbon-tainted iron (a.k.a. steel) — which enabled lighter, sharper, stronger objects that greatly extended human power.  Though causality is speculative, we soon got phonetic alphabets, re-imaginations of a transcendent Ultimate, and democracy.  A long half-millennium after a precipitous one-century collapse, global social development measures recovered.

Of course, we can’t know what would have happened without that tablet-wiping catastrophe.  (This goes also for the late 5thc CE fall of the Roman Empire, and the 14thc CE halving of the European population due to plague.)  The Bronze Age Collapse made warfare ubiquitous, which advantaged the physiologically stronger gender, which invited loss of women’s social status, which we see in archaeological evidence of societies around this time ceasing to feed women meat and no longer burying them with the nifty artifacts that accompany male corpses.  It turns out that women haven’t always been so maligned and oppressed, and gender-equality changes haven’t been monotonically positive over the 100,000 years of Homo sapiens’ existence.  It seems that the Bronze Age Collapse not only set back social development by several centuries, but also disproportionately knocked women back in a way that takes much longer to recover from.  Social status collapse is stickier than economic collapse.

Nonetheless, today’s liberal historicism makes a temporally- and culturally-biased argument that the miserable and bloody 12thc BCE Bronze Age Collapse was, centuries later, ultimately “beneficial” in the imagined “arc” of deterministic history.  And so the theory is that something “better” can also eventually, after some number of years, come out of the 21stc CE evisceration of my own life.

Posted in 5-10 min read, Personal, Religion, Social issues, [All posts]

Nature, nurture, and disease transmission

4 min read

Bipolar disorder afflicts about 3% of people.  Half of them have a bipolar parent.

Therefore, what is the chance that a bipolar parent will produce a bipolar child?

The pendulum of belief about the causes of mental illness has swung from nurture to nature and back to the middle: a non-dualistic view that it’s both nurture and nature.  People used to describe mental illness as a character weakness or choice.  Then, we realized mental illness is highly heritable.  Now, psychiatric science is focused on how environmental factors activate genes.

In the case of schizophrenia, researchers have determined that childhood environmental stress increases the likelihood of manifesting mental illness more than having the gene does.  Childhood environmental stress includes things like abuse, neglect, brain damage, exposure to toxins, cannabis use, bullying, exposure to violence, or death of a parent.  Even having immigrant parents and living in a city are evidently associated with higher incidence and earlier onset timing (though specific causal relationship is unclear).  Genetics is not simple destiny.  Genes have to get turned on. 

That means that backward-looking data about heritability isn’t perfectly predictive of the future in individual cases.  Knowing about his own diagnosis and the mechanism of triggering gene expression, a bipolar parent can put concerted effort into preventing childhood adversity, and thus lowering the risk of “passing it on”.   Over time, the prevalence of mental illness among children of the mentally ill (C) could trend downward in the direction of overall population prevalence (A). 

Psychiatrists considering a bipolar diagnosis always ask if the patient’s parents (or other relatives) have bipolar disorder.  Inviting a patient to reflect on their childhood through that lens can generate an epiphany; it’s often observed that perhaps half of bipolar patients have a bipolar parent.  But…arithmetically, that does not translate to half of bipolar parents having a bipolar child!  You can calculate the actual arithmetic result and play with input assumptions using this calculator. (Note:  “Bipolar parents” means either 1 or 2 bipolar people. This doesn’t distinguish the evidently higher risk from having 2 bipolar parents.)

Assumptions: 

A: __3__ % of individuals are diagnosed with bipolar disorder  (30/1000)

B: __50_ % of diagnosed bipolar individuals have one or two bipolar parents (15/30)

[Interactive calculator forthcoming, where you can change assumptions]

Calculated result:

C: __25_ % probability that a bipolar person’s child will have bipolar disorder (15/59)

D: __2__% probability that a non-bipolar couple’s child will have bipolar disorder (15/941). Note that D < A, because a genetic linkage means that prevalence across all individuals (A) is higher than prevalence only among offspring of non-bipolar parents.

E: _8.5__x higher likelihood that a bipolar person’s child will have bipolar disorder (C/A), compared to if the disease were randomly distributed with no genetic link

                          Child
                   Bipolar   not Bipolar   Total
Parents   Bipolar     15          44          59
      not Bipolar     15         926         941
            Total     30         970        1000
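The table’s figures follow from assumptions A and B alone.  Here is a minimal Python sketch of that arithmetic; the two-parent independence assumption behind the 59-children figure, and the variable names, are my reading of the numbers rather than anything stated explicitly:

```python
# Reconstructing the contingency-table arithmetic from A and B alone.
# Assumption (mine): each child has two parents whose diagnoses are independent,
# so P(at least one bipolar parent) = 1 - (1 - A)^2 -- which appears to be
# where the 59-per-1000 figure comes from.

A = 0.03   # population prevalence of bipolar disorder
B = 0.50   # share of diagnosed bipolar individuals with a bipolar parent
N = 1000   # children, to match the table's per-1000 counts

bipolar_children = A * N                           # 30
bipolar_with_bp_parent = B * bipolar_children      # 15
children_of_bp_parents = (1 - (1 - A) ** 2) * N    # ~59.1, shown as 59 in the table

C = bipolar_with_bp_parent / children_of_bp_parents                             # ~0.254
D = (bipolar_children - bipolar_with_bp_parent) / (N - children_of_bp_parents)  # ~0.016, rounded up to 2% above
E = C / A                                                                       # ~8.5

print(f"C = {C:.0%} chance a bipolar parent's child is bipolar")
print(f"D = {D:.1%} chance for a child of non-bipolar parents")
print(f"E = {E:.1f}x elevated risk versus random distribution")
```

Changing A or B at the top reproduces the effect of the forthcoming interactive calculator.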

Personal context:  In 2002, I married a man who was subsequently diagnosed with bipolar disorder.  The diagnosis (and second through seventh opinions to confirm it) was a comfort, as he had “always felt like there was a screw loose”.  He was, understandably, terrified of passing on the disease.  In 2007, his immigrant parents wrote him off as a “shame to the family” (for the diagnosis) and “addicted to drugs” (for taking prescription anti-psychotics).  My husband’s mother wasn’t diagnosed, but he felt she had some type of mood disorder; also, his maternal grandfather, who died back in the old country, was described suggestively as an eccentric who had gambled the family fortune away.

So, we planned to adopt a child… as soon as we recovered from being financially wiped out from keeping my unraveling husband alive and functioning in a pre-ACA medical system.  Even our 401ks had to be liquidated to pay for treatment.  I quit my career to manage our chaotic life.  Then, the whole economy collapsed, adding insult to injury.  My conservative parents wrote me off for having “chosen” to make the proverbial marriage bed I was lying in.  Environmental stress skyrocketed; his symptoms got worse.

One spring day in 2011, my husband went off his meds and vanished.  In the disorienting aftermath, I tried to have my eggs frozen, so that no matter what happened in the uncertain future, I’d at least retain the choice to have a biological child. 

The hospital ethics board deliberated as to whether I should be allowed to undergo the surgical procedure, given the possibility that my missing bipolar husband might return and I might have him fertilize the eggs.  But, they realized that refusing me the procedure would constitute a violation of my rights – not to mention that it smacked offensively of eugenics.  So I got the go-ahead.  Nonetheless, the procedure never happened.

The missing bipolar husband resurfaced in a violent manner.  A trial was set for the same day as the procedure.  Both proved to be strictly immovable due to judicial machinations and ovulation timing.  (It wasn’t until a year later that a mounting list of adverse coincidences made it no longer paranoid to suggest this scheduling overlap hadn’t been a coincidence.)  My main witness was my semi-estranged mother.  She threatened to sabotage the trial if I had her appear without me there in person.  My second witness was an acquaintance who wouldn’t guarantee he’d take time off work to show up.  (Crossing the brown line of interracial marriage meant certain people refused me sympathy or protection, seeing my suffering as a platform to share essentialist beliefs about how “those people” act; long-suppressed bias surfaced as relatives suggested I leave a man who had in fact already left me.)

Thus, I faced a Sophie’s choice between my near-term safety and a hypothetical future child.  Fear drove me to the courtroom instead of the hospital.  By the time the doctor reset my reproductive cycle to reschedule the procedure 4 months later, my late-30s ovaries had abruptly and irrevocably slipped off the cliff of fertility.  And, as of this writing 6 years later in 2017, the long shadow of my beloved husband’s mysterious disappearance and violent return is such that I still don’t have a stable job or a new husband, and thus am not qualified to adopt, either.

Today, people callously tell me to be “glad” I don’t have a child — glad because the child could have been bipolar if my husband had come back, and glad because when he didn’t come back I might have opted for the tough road of single motherhood if I had had the eggs.  But choice is a transformative thing that shouldn’t be so flippantly discounted.  Choice makes the difference between sex and rape, between car-camping and homelessness, between sharecropping and slavery, between euthanasia and homicide, and between a couple electing to be child-free and…. a tragedy for which our culture doesn’t even have a word:  a woman being denied the possibility of motherhood due to the actions of others.

Posted in <5 min read, Decision quality, Personal, Sex and relationships, Social issues, [All posts]

Legal originalism

19 min read

Canonization of written law creates three problems over time:  social realities and ethical norms naturally evolve to be incongruous with a static legal code, new behaviors and technologies arise that weren’t contemplated by the original authors, and internal inconsistencies in the text come to light.

The Hebrew Torah (“law”) exemplifies these problems: 

  1. Evolution of norms. The Levitical age of 25 years (Numbers 8:24) was later reduced to 20 years (1 Chronicles 23:24), in order to yield more priests within a decimated post-exilic population.
  2. Unanticipated situations. The Torah provides detailed instructions for designing a mobile tabernacle (Exodus 25-27), but not for building and maintaining a fixed central temple.  In the post-exilic period, Jerusalem temple worship and priestly activities became the centerpoint of Judahite identity, requiring updated instructions (1 Chronicles 21-29 and 2 Chronicles 2-5).
  3. Internal inconsistency. Is Passover meat to be eaten boiled (Deuteronomy 16:7), or only roasted and never boiled (Exodus 12:9)?  The proper technique was later harmonized as roasting (2 Chronicles 35:13).

The five long-lost papyrus scrolls eventually canonized as the Torah were written in the 8th to 6th centuries BCE.  First, the core of Deuteronomy was likely written in mid-8thc BCE Samaria (northern Canaan) and revised in late 7thc BCE Jerusalem (southern Canaan).  In the late 8thc refugee-flooded Assyrian vassal state of Judah, the Book of Genesis strategically combined northern and southern oral traditions to ensure broad resonance.  Next, Leviticus, most of Numbers, and the “priestly” half of Exodus were written in 6thc BCE Babylonian exile. 

Reality in the 4th-century BCE Persian vassal state of Judah was quite different.  It took Jerusalem four centuries to rebound from its devastation by the Babylonians in 587 BCE, which had reduced the prosperous capital city of some 20,000 inhabitants to perhaps no more than 1,000.  In the 6th century, the territory of Judah contracted from some 3,000 square miles to around 100, and didn’t resume meaningful territorial expansion until the 2nd century BCE Hasmonean period.  Solidifying Judahite ethno-religious identity in this bleak, post-exilic context was the primary purpose of new scripture.

So, rather than suffice with a legal code from two to four centuries past, priests wrote new guidelines.  The 4th-century BCE Book of Chronicles repeatedly justifies re-interpretation, harmonization and amendment of the sacred Torah with the phrase k’mishpat (“according to tradition”).  The semantic range of the Biblical Hebrew word mishpat includes written laws/regulations/ordinances as well as community interpretation of such rules.  Its meaning encompasses both what is statutory and also what is customary. [1]  In contrast to today’s ontology, law and tradition were not dichotomous. 

Legal hermeneutics

Judicial originalists consider the meaning of a legal document’s original words to be fixed.  They aim to interpret law according to how the words were understood at the time they were written, and without regard to the law’s current consequences.  Within the originalist camp, people differ as to whether to consider the writers’ intent (intentionalists/interpretivists/constructionists) or not (textualists/formalists/strict constructionists).  Intentionalists may use supplementary texts to attempt to ascertain intent, or they may ascribe intent subjectively.  Textualists believe adequate meaning is extractable from the words on the page.

Judicial pragmatists (non-originalists/purposivists/loose constructionists) consider precedent and consequences.  They understand the legal document’s writers’ intentions to have been varied, contextual, and impure.  People can’t anticipate everything in the future, so the legal code should be viewed as “living”.  Interpretive pragmatism recognizes that there was debate before the ink was dry, and so it’s artificial, long afterward, to retroject stasis onto something that wasn’t static at its origin, indulging in nostalgia for a certainty that never existed.  Wishing you had certainty doesn’t give you the right to pretend you have it.

Certain types of legal texts naturally demand an interpretive approach weighted more toward either empathic pragmatism (consideration of the “forest”) or disciplined textualism (focus on the “trees”).  A reader may subscribe to elements of all approaches, depending on text and context.  Collective legal text interpretation by a group is best served by a diversity of individual interpretive approaches. 

Pragmatism, intentionalism, and textualism are unequally spaced points on a spectrum of faith in interpretive objectivity.  Pragmatists accept that objectivity, though desirable, is illusory.  The 21stc revolution in cognitive science and behavioral economics supports this amply.  Originalists still believe that objectivity is possible (either with consideration of probable authorial intent, or without, in the more extreme case of textualists).  Post-modernism overturned the Enlightenment’s subject-object dichotomy, recognizing that interpretation changes a text.  We now have a broad, cross-disciplinary understanding that “truth” is shifty, the interpretive lens distorts, pure objectivity is unattainable, and reading a text is an act of imposition more than extraction of meaning.

No serious person occupies either extremity of this spectrum.  Nobody claims such nihilistically complete subjectivity that leaves nothing knowable and everyone’s law books tossed out the window.  And, nobody claims such childishly pure objectivity that they believe to have achieved transtemporal telepathy with deceased authors and thus immunity from socio-political-cultural bias.  However, simplistic slander by each side accuses the other of inhabiting the absurd spectral endpoint.

Jewish non-originalism

Judaism is philosophically non-originalist.  Classical four-part Jewish hermeneutics asks readers to consider much more than the historical-grammatical meaning of a scripture text.  Importance is placed on a text’s unintended allegorical possibilities, tangential parallels with other texts, and mystical associations supernaturally revealed to (i.e., invented by) the reader.  Ascertaining the human author’s original intent isn’t the goal.  Sacred time is circular.  Thus, a reader might legitimately use a later text to inform the meaning of an earlier one, despite authorial dependence going the other direction.  Turn-of-the-millennium Jewish scribes imaginatively repurposed old scriptures as midrash, unhistorically embellishing and expanding stories to make a theological point independent of the original text’s plain meaning.  That gave us, for example, The Book of Jubilees, The Book of Enoch, The Gospel of Matthew, and The Book of Revelation.

We are grateful that the writers of the Hebrew Bible chose to include opposing views, preserve some of the authentic diversity of belief that characterized all periods of Canaanite history, and refrain from overwriting past scriptures when they produced new ones.  We inherit an astonishingly rich text, replete with contradictions that evidence accretive composition over time by different schools of thought. The canonical Hebrew Bible contains strata of 2ndc BCE Hasmonean propaganda, 3rdc Hellenism, 4thc and 5thc Persian Zoroastrian influence, 6thc exilic nationalism, 7thc poly- versus heno-theistic tensions, 8thc pan-Israelite ideology, 9th and 10thc Canaanite oral traditions, and distant memories of earlier Bronze Age experience.  Authority in those times didn’t come from a text being static.  The Hebrew Bible’s redactors believed that precisely because it is sacred, religious law must be updated and adapted to current circumstances to remain relevant. [2] 

There are 233,000 Hebrew words in the Tetrateuch (Genesis, Exodus, Leviticus, Numbers) and Deuteronomistic History (Deuteronomy, Joshua, Judges, Samuel, Kings).  The rest of the Nevi’im (“prophets”) plus the Ketuvim (“writings”) total 395,000 words.  Thus, the Hebrew Bible contains 628,000 words.  The Christian New Testament – most of which is midrash on the Hebrew Bible – contributes 138,000 Greek words of content. [3]  Then, the 3rd – 5th century CE Talmud adds 1.8 million more words of interpretation and analysis.  

Strict originalism with the Torah would have left the Jews endlessly carrying the Ark of the Covenant around in a tented tabernacle, instead of building (and re-building) a centralized temple.  The pragmatic re-interpretation of legal text supported the 7th-century BCE monotheizing centralization of the Yahveh cult in Jerusalem and the 5th-4thc BCE institutionalization of priestly temple worship.  Contemporary Western social and political order is dependent on that legacy.  Without legal text revision and re-interpretation, our “10 commandments” would refer to the 8thc BCE “Ritual Decalogue” of Exodus 34 (firstborn animal sacrifice, wheat harvest festival, no covenants with polytheists, etc), rather than the 7th and 6thc “Ethical Decalogue” of Deuteronomy 5 and Exodus 20 (no murder, no adultery, no lying, etc).

Originalism regarding Biblical law today would be absurd.  We don’t kill every child who swears at a parent.  We don’t morally condemn athletes wearing blended-fiber clothing or military veterans with tattoos.  We are right to now punish rapists instead of rape victims, and to abhor the Old- and New Testament-sanctioned institution of human slavery.  We are right to now embrace the innate homosexual orientation of ~6% of our fellow human beings.  Biblical originalism would leave us without the holidays of Hanukkah, Easter, and Christmas.  Christians wouldn’t have the Trinity, the Immaculate Conception, or the penal substitution understanding of atonement.  A majority of contemporary Christian popular songs would be recognized as heretical, in that their theology is inconsistent with the Bible text and derives instead from post-Biblical tradition.

Conservative Protestant Christianity’s struggle with non-originalism

No matter our individual religious affiliations or lack thereof, we Americans all live under a distinctly Christian “sacred canopy” (sociologist Peter Berger’s 1967 term).  Every society’s sacred canopy is both universal and invisible.  Growing up in America means unconsciously absorbing the Christian cultural and epistemological paradigm – to an extent only somewhat mediated by one’s nuclear family of origin.  

“Judeo-Christian tradition” is an ideologically-loaded 20th century American idea — loaded originally with the pretense of non-exclusion of Judaism, and more recently with a (historically inaccurate) nativist othering of Islam.  In practice, our American “tradition” is Judaic only in that 1st century Judaism was the direct source for almost all of what became Christian theology (with the rest sourced from Hellenistic philosophy and mystery cults).  Indeed, the hermeneutic of the 1st-2ndc CE Jews who wrote the Christian scriptures was identical to that of the 2ndc BCE – 1stc CE Jews who contemporaneously wrote the Hebrew Ketuvim and apocrypha.  (Moreover, their theology was also so congruent that the classification of some period scriptures as “Christian” versus “Jewish” is debatable.)  However, the subsequent exegetical traditions diverged.  

Exegetical praxis in Judaism has remained explicitly non-originalist over the millennia.  Somewhat similarly, Catholic and Orthodox Christianity emphasizes post-Biblical convention and patristic mediation of “living” scripture.  Mainstream liberal Protestant Christianity embraces a “higher” critical perspective and thus adaptability to scholarly revelation about its sacred text.  However, conservative Protestant Christianity stands alone in unqualified opposition to pragmatic interpretation of scripture.  

Because of their position against Biblical non-originalism, the Protestant literalists have particular trouble with judicial non-originalism.  Protestant Biblical literalists account for just 15% of global Christians.  But it is their disproportionately loud voices that frame popular American political and philosophical discourse…and influence secular legal hermeneutics. 

In our Christian-influenced culture, authority comes from stasis.  Hence, ex-Christian anti-theists delight in mocking the Bible for how profoundly its message changed over time.  They are correct that the Bible is not inerrant.  It’s full of orthographic errors, copyist omissions and rogue scribal glosses, anachronisms, externally-attested historical factual errors, mis-citations of other scripture, words nobody today knows the meaning of, pseudepigraphs, plot holes, unmarked interpolations, and unresolvably conflicting non-original manuscript fragments…as well as substantive internal contradictions where the same event or topic is addressed more than once.  However, regarding the contradictions, the mockers are anachronistically applying their 21st-century interpretive paradigm to 1st-2ndc CE (Christian New Testament) and 8th-2ndc BCE (Hebrew Bible) texts.  Despite their reasoning being critically unsound, plenty of contemporary Christians have de-converted in response to noticing Biblical contradictions.  For the same reason, conservative Christians are hostile to scholarly higher criticism and engage in convoluted harmonizations (“apologetics”) to preserve the illusion of consistency.  And, it’s that same normative reverence for stasis that threatens politicians with ridicule if they modify a policy stance based on new information.  Being labeled inconsistent is a cutting insult in our culture.

Although Christian fundamentalists have difficulty embracing a non-originalist interpretive orientation, they may themselves be theologically non-originalist.  Some avow a docetic Christology: Jesus was a pre-existing divine being who only appeared to be human but wasn’t.  For example, I recently endured a Brazilian Pentecostal’s tirade when, in passing, I referred to Jesus as a “person”.  However, docetists like my Brazilian friend were called “anti-Christs” in the canonical Epistles of John (~100-120 CE), their celebrity proponent Marcion-of-Sinope was excommunicated in 144 CE, and docetism was officially condemned as heretical at the 325 CE Council of Nicaea. [4]

As a practical matter, Biblical originalism is impeded by the fact that we don’t have the original manuscripts.  Academic specialists use advanced critical methods to hypothetically reconstruct what the original likely said.  But, they don’t all agree which of the extant manuscripts is closest to the lost original for any given corrupted passage.  Moreover, because of the internal contradictions and lack of systematic theology in either the Hebrew Bible or Christian New Testament, there is a wide range of scripturally accurate “original” belief.  Technically, being a Biblical literalist Christian doesn’t require belief that Jesus existed, that he was divine, or that he was a predicted Messiah.

“Original Christianity” faithful to Jesus’ own religion could mean following the 613 stipulations of Mosaic law: male circumcision, eschewing bacon and lobster, interest-free lending, premarital sexual abstinence, no worship during menstruation or a week after ejaculation, no work on Saturdays, no neutering of puppies or cross-breeding of livestock, etc.  Nonetheless, their unwittingly non-originalist reading of the Bible can lead nominally originalist Christians to reject not only antiquated elements of the Levitical holiness code, but the Old Testament in its entirety.  Such Paul-centric, quasi-Gnostic, anti-Semitic “Marcionism” was condemned as heresy in the 2ndc CE when it arose.  Still, modern-day Marcionites exist – and, ironically, are known to selectively quote from the Old Testament to argue from implied authority (with further irony of ascribing certainty and relevance to a contradictory, composite text).

This evidences the core problem with originalism in any domain:  it’s usually about convenience, not consistency.  Originalists are liable to make disingenuous arguments.  For example, the Hebrew Bible is overwhelmingly clear that we are to proactively care for the poor, and the Christian New Testament advocates plainly socialist redistributive policies.  Yet so-called literalists are typically hostile to social welfare programs and critical of, rather than compassionate toward, the poor.  Similarly, the Bible has nothing to say about reproductive rights; yet, so-called literalists invoke the Bible in campaigning to restrict women’s dominion over our own bodies.  Originalists choose which passages to privilege and then must explain away or ignore other inconveniently contradictory passages.  “Anything in the Bible that is not literally true must be an allegory, because the Bible is always literally true” goes the apologist’s unapologetically circular argument. 

Constitutional hermeneutics

Tuesday April 18, 2017, was Neil Gorsuch’s first day on the United States Supreme Court.  The self-described originalist caricatured originalism with his monothematic interjections: “Look at the plain wording.” “Just read the words.”  Other justices, including originalists, countered that it’s not so simple.  Journalists noted Gorsuch’s “shots across the bow” as pre-emptive assertions of rigid originalism to come.  By Day One, his “just the text” catechism was already tiresome. [5]

Textualism privileges words over their consequences, maintaining that there’s no subjective act of interpretation occurring.  But language is inherently ambiguous and thus interpretation is necessary.  For example:

  • Do “privileges and immunities” refer to citizens’ rights at the local, state, and/or federal level, and/or rights derived from natural law and/or by custom, and are they substantive and/or procedural rights?
  • Are white women and non-white people “persons” deserving “equal treatment”? Apparently not, as it took an amendment two years later to give black men the right to vote, and another amendment fifty years after that to give women the right to vote.  Yet, according to tradition (k’mishpat), we now read “person” as encompassing all human beings.  
  • What did “cruel”, “unreasonable”, “probable”, and “liberty” mean in 18thc colonial American English? The document is full of aspirationally vague words with indeterminate meaning, necessitating a value judgment by later readers.  
  • Do wiretapping and drone surveillance constitute a “search”? Does revenge porn posted online constitute “speech”?  “Dead document” textual originalism can become risible.  And, when uniformly applied, it doesn’t always lead to a politically conservative outcome. 

For these reasons, Constitutional textualists are — like Bible proof-texters – notoriously selective with textual evidence.  Whether its proponents are failing to offset subconscious ideological confirmation bias, or are consciously and disingenuously fitting text to personal ideology, textualism is objectively not the objective approach it purports to be.  Textualist Supreme Court justices’ departures from textualism have been well documented, and we expect the same selective hermeneutical approach from Gorsuch.  After all, the court doesn’t keep a scholar of 18thc American English linguistics on call to inform its deliberations.  [6]

Just as the Bible is not inerrant, neither is the United States Constitution.  Since the framers knew they couldn’t predict the future, they resisted including specific policies – with the exceptions of income tax policy added via amendment, and alcohol policy added and then subtracted via amendment.  Even so, the document made (and eventually corrected) what we now accept was an egregious moral error regarding the specific policy of slavery.  The Constitution is a short document of 7,591 words (compared to 4,200 in this essay), and in it the framers were smart to mostly remain general and abstract.  However, that well-intentioned textual magnanimity translates into greater subjectivity in interpretation later on.  This parallels how the Bible’s parabolic abstractions and poetically non-specific language (plus errors, contradictions, non-systematic message, and composite structure) force subjectivity in its readers.  By their natures, both texts provoke endless interpretive controversy.

Object informs interpretation.  Though most texts don’t announce what hermeneutic to use in interpreting them, both the Bible and the US Constitution helpfully do so.  The 9th amendment is famously troubling to constitutional textualists, in that it recommends non-originalism.  It states that failure to specify a right in the Constitution shouldn’t translate into restricting that right.  The document knows it isn’t comprehensive.  In addition to being unfeasible in practice, strict “originalism” is both non-Biblical and non-Constitutional.

Law has purpose.  Interpretation should further the law’s purpose — as distinct from the law-writers’ intent, and as distinct from imagining how the law-writers would have solved a problem of today that they never could have contemplated.  The purpose of the United States Constitution is stated up front: “in order to form a more perfect union, establish justice, insure domestic tranquility, provide for the common defense, promote the general welfare, and secure the blessings of liberty to ourselves and our posterity.”  Fidelity to purpose requires adaptation to match changing social and ethical norms.

Faithfulness to the overall purpose of a document also requires reading it as a unified whole, rather than as isolated passages.  This is considered best practice for legal documents as well as religious scripture.  For example:  

  • The ~700 BCE text of Isaiah 7:14 depicts a court prophet in 734 BCE explaining to King Ahaz of Judah that a young woman having conceived is a comforting divine sign that the Israel-Aram coalition against Judah will fail. If one digests the Book of Isaiah in its entirety, and together with the Deuteronomistic History from which it copies some content, its multi-century historical context and anthology structure are apparent.  However, reading it in decontextualized excerpts, many have claimed that the words predict Jesus.  The interpretive fallacy is aided by the 3rdc BCE translators’ choice — either careless, ignorant, or ideological — to (a) change a present perfect Hebrew verb to a future tense Greek verb and (b) replace the specific Hebrew word for “young woman” (almah) with an imprecise Greek word that can mean either “young woman” or “virgin” (parthenos).
  • The 2nd Amendment of 1791 states its purpose as “the security of a free state”, and thus the right to bear arms as conditional. In the overall context of the Constitution, the potential need for a militia to effectuate that security was due to the new republic’s lack of a standing army.  In an era that also lacked local police forces, domestic tranquility got some help from private citizens with muskets that fired 2 times per minute.  Today, domestic tranquility is harmed by easy access to guns firing 600 times per minute.  Weapon technology and social reality have changed in unanticipated ways, demanding updated common sense interpretation of how to achieve the 2nd amendment’s purpose.  Is this an oversimplification…or is it actually the “plain wording”?  [7]  Honest originalism would restrict the right to bear arms today. 

Intentionalism is less extreme than textualism, but leads to its own thorny problems.  In interpreting some legal texts (e.g., contracts and wills), the writer’s intent carries significant interpretive weight.  Less so for the Constitution.  The purpose of our Constitution is not to further the disparate and unstated intentions of 39 upper-class, East coast, Anglo-Saxon, Protestant, heterosexual, slave-owning, pre-industrial, 18th-century male lawyers. [8]  For example, it is clear from extra-textual evidence that the intention of the 14th Amendment in 1868 was definitely not to prohibit racial segregation; yet, we’ve (thankfully) interpreted it as such ever since 1954.  A Constitution interpreted strictly in light of authorial intent — whether imputed or extra-textually researched — doesn’t necessarily provide the best foundation for a more perfect union, in that it may not remain consistent with the Constitution’s own purpose: the welfare and liberty of contemporary citizens.  The temporal and cultural distance between 1787 and 2017 is vast.  As has often been asserted, the Constitution should be wiser than its writers.

Humans are human because we can transmit valuable learnings across generations.  This is a feature only known in hominins. [9]  Tool manufacturing started 3.3 million years ago with hominins (interestingly, among primates outside our own genus homo).  Today, intentional tool-making and its corollary of high-fidelity knowledge transmission isn’t seen in any living creatures other than homo sapiens.  We humans use tools to make other tools, rather than just appropriate found objects as tools.  Crucially, we then teach what we learn, so our children don’t have to start over knapping stones into axeheads and re-inventing the wheel every generation.  The fundamental fallacy of strict legal originalism is that it repudiates the wondrous capacity of humanity to learn, accrue wisdom and become more efficient, moral, prosperous, cooperative, and healthy over time.  When it comes to Constitutional law, we can do so much better than to “just read the words”.

April 2017

 

Notes

[1]  One example of its statutory usage: In the Book of Deuteronomy, Moses recapitulates mishpat, which we now refer to as “the ten commandments”. 

[2] The Christian New Testament continues the Hebrew scripture practice of incorporating opposing belief traditions without harmonization.  It teems with theological contradictions and incomplete articulations of core doctrinal points.  A systematic Christian theology only coalesced two centuries after the last Biblical scriptures were written.  Hence, the need for so much textual harmonization today among people who can’t tolerate that contradictory diversity. 

[3]  For example, 70% of the Book of Revelation’s 404 verses are decontextualized Hebrew scripture references; and much of the remaining 30% is borrowed from extracanonical Jewish apocalyptic writings and Babylonian mythology.  Similarly, 20% of the Gospel of Matthew’s 1,071 verses are Hebrew scripture citations and references; and much of the remaining 80% are scriptural plot parallels, plus ideas from Greek Cynicism philosophy.  More broadly, none of the Old Testament passages that Christians now read as predictions of Jesus actually refer to a future Messiah at all.  Rather than being ignorant or deceptive, the writers of the Gospels and the Book of Revelation were engaged in the then-common practice of pesher – a sub-type of midrash that divorces old scriptures from their original intended meaning and creatively repurposes them to explicate current events.  However, in short order, rapidly-expanding early Christianity forgot its own original hermeneutic.  For the next 17 centuries, the wider Gentile world mistakenly read Christian scripture as historical rather than midrashic.

[4] The Christologies of Jesus’ posthumous followers included docetism, separationism, incarnation, adoptionism, as well as the belief that Jesus wasn’t divine.  All but incarnation were eventually declared heretical.  The canonical synoptic gospels reflect resurrectionist adoptionist and baptismal adoptionist Christology – another example of intentional preservation of contradictory and heterodox belief traditions.

[5]  Gorsuch was raised Catholic and now attends a liberal Episcopalian church – an example of the imperfect correlation of nominal religious affiliation with stance on judicial hermeneutics.  To the extent that exegetical paradigms influence legal interpretive paradigms, they are defined more by the sacred/cultural canopy than by individual worship community membership.

[6]  The long deliberations often dive into mind-numbing semantic minutiae.  Recall that President Bill Clinton was excoriated by the media in 1998 for caveating acknowledgement of an extramarital affair based on “what the meaning of ‘is’ is”.  This was plausibly earnest logic from a cerebral law school graduate who knows that significant constitutional decisions have hinged on the meaning of a single word.

[7] The accusation of oversimplification is used to silence commentary on certain topics, including constitutional hermeneutics, by laypeople.  However, any topic can be addressed in a paragraph, an essay, a book, or an entire shelf of texts.  It’s a self-serving cognitive bias to believe one’s own professional domain uniquely defies summarization and explication.  Constitutional law is not a mystery beyond the comprehension of those who didn’t attend law school.  Justices:constitution::clergy:bible.  Both law and religion benefit from thoughtful disintermediation.

[8]  Although the signers were nominally Christian Protestants, some held non-Trinitarian and Deist beliefs that most Christians today would label “non-Christian”.

[9]  “Hominin” denotes the taxonomic tribe Hominini, within the Great Ape family, in the Anthropoid sub-order of the Primate Order.  It comprises the genus homo, with its multiple extinct species and one extant species sapiens, plus other genera of extinct pre-human primates.

Posted in >10 min read, Religion, Social issues