Monday, November 26, 2018

Christianity by State: The Religious Dimension of Federalism

According to the 2010 U.S. Religion Census: Religious Congregations & Membership Study by the Association of Statisticians of American Religious Bodies, less than 50 percent of the people living in the United States identified themselves as Christian adherents in 2010: just over 150.6 million out of roughly 310 million. Even so, candidates for the U.S. presidency still felt the need to vocalize the fact that they were Christian (while implying that the opponent didn't quite measure up in that respect). President Obama made a point during his first two years in office of stressing his Christianity as if it were the membership card to the Oval Office. It would seem that the litmus test was already antiquated and thus needlessly constrictive on potential candidates.
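For readers who want to verify the arithmetic behind the "less than 50 percent" claim, here is a minimal sketch in Python; the national figures and the state percentages are simply the ones quoted in this post, and the variable names are my own illustrative choices rather than anything taken from the census itself.

```python
# A minimal sketch checking the arithmetic quoted in this post: roughly 150.6 million
# Christian adherents out of a population of about 310 million. The state shares are
# those quoted below and are included only for illustration.

adherents_millions = 150.6
population_millions = 310.0

national_share = adherents_millions / population_millions
print(f"National share of Christian adherents: {national_share:.1%}")  # about 48.6%, i.e., under 50%

# State-level shares quoted in the essay (highest and lowest), sorted for comparison.
state_shares = {
    "Mississippi": 0.59, "Utah": 0.57, "Alabama": 0.56, "Louisiana": 0.54,
    "Vermont": 0.23, "New Hampshire": 0.23, "Maine": 0.25, "Massachusetts": 0.28,
}
for state, share in sorted(state_shares.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{state}: {share:.0%}")

# The spread between Mississippi and Vermont underscores the essay's point about
# heterogeneity among the states.
spread = (state_shares["Mississippi"] - state_shares["Vermont"]) * 100
print(f"Spread (Mississippi minus Vermont): {spread:.0f} percentage points")
```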
The study also discerned differences between the states, once again giving evidence of the heterogeneous nature of the empire of fifty republics. The states with the highest percentages of residents identifying as Christian were either in the South or in the Midwest. Mississippi (59%), Utah (57%), Alabama (56%), and Louisiana (54%) top the list. Interestingly, the two clusters of dark red on the map point to two distinct Christian cores in the United States. It would be interesting to study whether or how the two cores differ. One indication that they do is that all of the states in the southern core had the death penalty at the time, whereas two of the three states in the northern core did not. I suspect that the southern core was more ideologically conservative in nature (i.e., on social issues), the northern core's conservatism being moderated perhaps by the tradition of Hubert Humphrey.
States where less than 36% of the residents identified as Christian were in the West and New England. Vermont (23%), New Hampshire (23%), Maine (25%), and Massachusetts (28%) had the fewest Christians on a percentage basis (the map incorrectly shows MA as bright red). In these states, it would be particularly untenable were public services and offices to close on Good Friday and the following day. Unlike Christmas, which is recognized as a national holiday in the U.S. in its secular dimension, Easter does not have a secular holiday component. In fact, the sheer extent of the difference between states like Mississippi (high 50s) and those like Vermont (low 20s) suggests that holidays should be declared at the state rather than the federal level. The question of whether the U.S. Constitution's establishment clause applies at the state level is also relevant.
In short, the multi-colored map of the U.S. in terms of Christianity suggests that a "one size fits all" approach through Congress has the considerable downside of ignoring significant cultural and religious differences. In other words, to take Vermont and Mississippi and "split the difference" is not likely to fit the conditions in many states. That those two states are in the same union testifies to the union itself being on the empire level, meaning that its member states are commensurate with countries around the world that are not themselves empire-scale. It should be no surprise, then, that federalism, which originated at the empire level in alliances, is a system of governance particularly well-suited to accommodate the sort of diversity that exists in the U.S. (and E.U.). The key to enabling federalism to accommodate the different religious makeups of the states is to keep the imperial-level government from stifling the state governments. In other words, the states need to have enough space to produce legislative action that is tailored to their respective religious cultures. In states that are dark red, it makes sense that Good Friday would be a recognized holiday, whereas such a practice in Vermont or Oregon would impose too much on too many people.

On Federalism in America and Europe, see Essays on Two Federal Empires: Comparing the E.U. and U.S., Essays on the E.U. Political Economy: Federalism and the Debt Crisis, and British Colonies Forge an American Empire, all available at Amazon.

Source:

The Huffington Post, "Most and Least Christian States in America," May 29, 2012. 




Sunday, November 25, 2018

God's Gold through the Centuries

In the wake of the financial crisis that came to a head in September of 2008, people might have been wondering whether sufficient moral constraints on Wall Street's greed were available, or even possible. The ability of traders to create complex derivative securities that are difficult for regulators to understand, much less regulate, may have people looking for ethical or even religious constraints. It would be only natural to ask whether such “soft” restraint mechanisms really have the puissance to do the trick. Here’s the rub: the tricksters are typically the last to avail themselves of ethical or religious systems, and yet the wrongdoers are the ones in need of the restraint. Lloyd Blankfein said of his bank, Goldman Sachs, that it had been doing God’s work. About a week after saying that, he had to walk the statement back and admit that the bankers had done some things that were morally wrong. Although divine omnipotence is by definition not limited by human ethical systems, it is hard to imagine a divine decree telling bankers to tell their clients one thing (buy subprime mortgage derivatives) while taking the opposite side in the bank’s proprietary trading (shorting the derivatives, beyond merely being a counterparty to clients). Divine duplicity seems to represent an oxymoron on a megascale rather than a justification for greed. As the crisis erupted and was subsequently managed by public officials in government and new managers brought in to salvage AIG, I was researching the history of Christian thought on profit-seeking and wealth. I have since published an academic treatise on the topic. As that text is too recondite for sane people (i.e., readers outside of academia), I am writing a nonfiction book on the topic for a broader readership, which develops the treatise further. To whet the appetites of those of you who are waiting for something more readable than a recondite treatise, I present a brief account of my original research on the topic here.


For the full essay, see "God's Gold through the Centuries."
________________________

See related essay: "Religious Sources of Business Ethics"

The academic treatise: Godliness and Greed: Shifting Christian Thought on Profit and Wealth 

Larry Summers Bowed Out of the Race for Fed Chair: “Advice and Consent” Triumphant

On September 15, 2013, the White House announced that Larry Summers, Barack Obama’s former chief economic advisor and Secretary of the U.S. Treasury during the Clinton administration, no longer wanted to be considered to fill the upcoming vacancy as chairman of the Federal Reserve. In the announcement, Obama (or an advisor) wrote, “Larry was a critical member of my team as we faced down the worst economic crisis since the Great Depression, and it was in no small part because of his expertise, wisdom and leadership that we wrestled the economy back to growth and made the kind of progress we are seeing today.”[1] Unfortunately, this statement suffers from a sin of omission, one that the media had likewise minimized. Accordingly, the Democrats in the U.S. Senate who had just come out against a Summers nomination can be regarded as having done the nation a vital service. Moreover, the “check” in the checks-and-balances feature of the U.S. Senate’s confirmation power worked.
With all its open points of access, democracy can have a splintering effect on a point worthy of public debate. The media dutifully plays its “scattering” role before any reasoned train of thought can gain traction in the public domain. For example, at the time of Summers’ retraction, the New York Times reported that a “reputation for being brusque, his past comments about women’s natural aptitude in mathematics and science, and his decisions on financial regulatory matters in the Clinton and Obama administrations had made [Summers] a controversial choice.”[2] A reader could be forgiven for concluding that personal vendettas, public gossip, and single-issue (not even monetary policy!) political activists have points just as important as the matter of Summers’ past decisions on financial regulation.
Any political interest having succeeded in getting its point to a microphone is deemed just as relevant or decisive as any other. Democracy is the great relativizer. The great equalizer. Is rule by "members of the club" the only practical alternative, or is American democracy merely its front, only superficially relativizing and equalizing the relevant and the less-than-relevant?

Alan Greenspan, Larry Summers, and Robert Rubin. In the late 1990s, they pressed Congress to keep financial derivatives unregulated. A case of government doing Wall Street's bidding? Image Source: pbs.org


Obviating the scattering effect, we can zero in on Obama’s attribution of “expertise, wisdom and leadership” and ask whether they apply to Summers when he joined Alan Greenspan and Robert Rubin in the late 1990s to lobby Congress to remove the CFTC’s authority to regulate financial derivatives. At the time, Brooksley Born, chairwoman of the CFTC, was recommending to Congress that over-the-counter derivatives be regulated. That the unholy triumvirate, blessed by Clinton and supported by Wall Street, succeeded with the help of Sen. Phil Gramm in discrediting Born kept the financial system vulnerable to being blindsided by a collapse in any of the securitized derivatives markets.
It is difficult to fathom how financial regulatory expertise, wisdom, or leadership could possibly pertain to Summers in his lobbying capacity during the Clinton administration, even if he did go on to help Obama mop up the financial fallout after Lehman's bankruptcy. Yet, sadly, Summers’ downright disgraceful bullying of Born, and his being wrong on top of that about whether financial derivatives should be regulated, were barely mentioned in the public discourse leading up to Summers’ decision to withdraw his name from the president's consideration.
Fortunately, the public can rely on the informed advice and consent power of the U.S. Senate. Is it just a coincidence, however, that bullish financial markets answered Summers' decision? Or was it the other way around: Summers' decision being the answer to a message he had received from Wall Street?
To be sure, Summers was "generally considered to be in the pocket of Wall Street."[3] Citigroup had been paying him for what the Federal Reserve refers to as "participation" in "Citi events."[4] Additionally, his efforts to keep financial derivatives, including CDOs, unregulated enabled Wall Street banks to make "kazillions of dollars."[5] However, bankers have short memories, given the inertia of the financial interest of the moment. Traders and bankers tended to favor Yellen, according to a poll conducted by CNBC; whereas Yellen got 50 percent, Summers came in at a mere 2.5 percent.[6] Besides sporting an abrasive (Harvard?) manner, Summers had given vocal hints that he might tighten monetary policy, even reduce the Fed's bond-purchase program sooner than Yellen would.
Particularly given Summers' nature, I have trouble believing that he woke up one nice fall morning and decided to cave in to anticipated "political obstacles" to his getting confirmed by the U.S. Senate. Put another way, more was likely behind his loss of support among Democrats on the Senate committee than even the concerns the senators might have had about Summers' prior participation in turning Congress against Born's plea to regulate derivatives. Could it be that Wall Street CEOs were pulling the strings, reaching from Boston to Capitol Hill without even leaving fingerprints?


1. Annie Lowrey and Michael D. Shear, “Summers Pulls Name from Consideration for Fed Chief,” The New York Times, September 15, 2013.
2. Ibid.
3. Mark Gongloff, "Larry Summers' Withdrawal from Fed Race Is Good News for Wall Street and the Economy," The Huffington Post, September 16, 2013.
4. Reuters, "Fed Contender Larry Summers Cancels Citigroup Events," CNBC, September 14, 2013.
5. Gongloff, "Larry Summers."
6. Mark Gongloff, "Wall Street Overwhelmingly Favors Yellen Over Summers for Fed Chair: CNBC Poll," The Huffington Post, July 26, 2013.

The Banks’ Consultants: Guarding the Hen House

Leaving it to consultants hired by mortgage servicers to right the wrongs that the servicers had inflicted on foreclosed homeowners was the unhappy consequence of bank regulators giving ambiguous guidance and failing to install viable oversight mechanisms. According to the Government Accountability Office, “regulators risked not achieving the intended goals of identifying as many harmed borrowers as possible.” Even if the reviews had been completed, there was no guarantee that wronged mortgage borrowers would have received any compensation. On the other side of the ledger, the banks had received billions from the U.S. Treasury with no strings attached. Whether intentional or not, the banking regulators put too much stock in the consultants, who, after all, had been hired by the mortgage servicers.
"The report confirms that the Independent Foreclosure Review process was poorly designed and executed," Rep. Maxine Waters (D-Calif.) said. The "report confirms what I had long suspected – that the OCC’s oversight of the supposedly independent consultants hired by the servicers was severely deficient. The report should serve as a wake-up call.” By referring to the consultants as supposedly independent, Rep. Waters implied that the flawed review process put the consultants in the position of being able to exploit a conflict of interest.
On the one hand, the flawed oversight meant that the consultants were the ones left to protect the public interest; the public had to rely on them to correct the wrongs done to the foreclosed. We can label this the consultants’ “public interest” role. On the other hand, the consultants worked for the servicers. As Benjamin Lawsky, the superintendent of New York's Department of Financial Services, pointed out, "The monitors are hired by the banks, they're embedded physically at the banks, they are paid by the banks and they depend on the banks for future business." This second role can compromise the first.
In a conflict of interest, one role can undermine another, more legitimate role. Even if the conflict is not exploited (with the more legitimate role taking the hit), a person or institution in such a position can be reckoned as unethical, according to some scholars. Other scholars argue that only the actual exploitation of the more legitimate role by the other is unethical.
In my view, if exploitation is possible in the relation between two roles, the relation itself is unethical. Put another way, it is unethical to create or perpetuate a position in which a person or organization can exploit a conflict of interest, even if actual exploitation never occurs.
It follows that the government has a moral responsibility to eliminate the conflicting roles even if they are not being exploited. Furthermore, the person or organization holding two such conflicting roles as constitute a conflict of interest is also ethically obliged to pick one role or the other. Often this is not convenient from a short-term financial standpoint, so business practitioners tend to look the other way, rationalizing that since no actual exploitation of the conflict of interest has occurred, there are no ethical issues. They are wrong. The mortgage servicers should never have been the ones to hire the consultants charged with monitoring them. That circularity is itself problematic, ethically speaking.
The regulators rather than the consultants should have been the key enforcers, and thus protectors of the public interest. The regulators can be faulted not only for their lack of competence, but also ethically for having allowed the conflict of interest to exist. Regulators should not allow a business to guard its own hen house. To permit this is unethical even if no hens are eaten.


For more on institutional conflicts of interest, see my book, Institutional Conflicts of Interest: Business & Public Policy, available at Amazon.

Sources:

Ben Hallman and Eleazar Melendez, “GAO Foreclosure Report Finds Bank Regulators Failed to Provide ‘Key Oversight’,” The Huffington Post, April 3, 2013.

 Dan Fitzpatrick, "'A Dose of Healthy Competition' For Banking Regulators," The Wall Street Journal, April 18, 2013.

Monday, November 19, 2018

China: Mandating the Virtue of Filial Piety by Law

The founders of the United States, most notably Thomas Jefferson, John Adams, and Ben Franklin, held that for a republic to long endure, its citizenry must be virtuous and of a minimum education. Public education would be established, such that the common man could render a reasoned judgment at the ballot box. The dictum that the popular sovereign (i.e., the electorate) should be broadly educated resulted in law and medical schools in the U.S. requiring entering students to have earned a bachelor's degree in another school before beginning the bachelor's-level degree in the professional school. In short, public policy is an effective means of providing a people with the opportunity to gain an education, which at least in theory enhances the wisdom of a self-governing people.
Virtue is another story. Law seems ill-equipped to form a virtuous people. It is one thing to outlaw vice in its outward conduct; how can legislation instill virtue within a soul?  Mandating virtuous conduct, such as in Massachusetts’ “Good Samaritan” law, may be possible where the conduct is in public and thus readily enforceable. Virtue within the home is far more difficult for the law to reach and thus foster. Even vice behind closed doors, such as incest as well as physical and emotional abuse more generally, is difficult for police to catch. To an extent, property rights enable such vice and allow people the option of not being virtuous in a family context.  Yet in countries in which an authoritarian state trumps even property rights, such as China, the question becomes whether legislation is the sort of thing that can foster or mandate virtuous conduct and even a virtuous character.[1] 
On July 1, 2013, China's Protection of the Rights and Interests of Elderly People law took effect; it lists the required duties of adult children toward their parents. The duties stipulated include visiting one’s parents “often” and occasionally sending them greetings. The intent of the legislation is to enforce filial piety, or Xiào 孝, by requiring adult children to meet the “spiritual needs of the elderly.”[2] Man, it would seem, does not live on bread alone.[3] This apparently includes the elderly.
Although the government can enforce without much trouble the law’s stipulation that companies must give sufficient time off for employees to visit their parents on a regular basis, how are “greetings” to be enforced? The law itself contains no penalties. Moreover, is kinship, or what Cicero calls amicitia (friendship that is closest in the home), the sort of thing, being a sentiment, that can even in theory be nudged by a law? Not according to Guo Cheng, a novelist, who writes, “Kinship is part of human nature; it is ridiculous to make it into a law. It is like requiring couples who have gotten married to have a harmonious sex life.”[4] Forcing gays not to have sex, mandating or outlawing birth control, and prohibiting a person of one race from falling in love with someone of another can be added under the heading, “Difficult to Enforce” (or, “Should We Even Be Doing This?”). How do the police or a judge assess whether a son or daughter has given sufficient “mental support” to a parent?
Should a daughter who has been sexually molested as a girl by her father give him spiritual or mental support? Even worse, should such care be obligatory as enforced by the state?  Is filial piety unconditional? An innocent girl having been sexually molested repeatedly by her father might stand in a shower for hours at night while her parents are out. She might have the mistaken sense that she is the person who must be dirty and thus in need of a cleansing. In actual fact, it is her father who is soiled by sickness from within. Further, the daughter might have the mistaken impression that she is afraid when her dad is away rather than at home. Should the state require her to do certain things for her dad when she is an adult (assuming she survives well into adulthood without self-destructing)?  
What about a grown man who as a boy had been beaten and emotionally abused by his dad? What does that son owe his father in terms of care and contact? Should the state dare even step foot into that tortured decision? What if the boy's mother had tried, albeit unconsciously, over the years to get her son to feel so bad about himself that he might commit suicide to answer for some imagined slight against her?
Any competent psychologist would advise all of those adult children to avoid their respective pathological parents rather than continue to suffer from abuse while offering care and support. Should a law mandate that the victims undergo further injury from parents who cling out of fear and pride to their old ways or simply don't know any better?

Filial piety, one of the fundamental Confucian virtues
 Image Source: WUJIFA

What exactly is the virtue of filial piety? For six centuries in China, The 24 Paragons of Filial Piety, by Guo Jujing, has taught the importance of respecting and caring for one’s parents. Here the interior is married to exterior conduct. In 2012, the Chinese Government came out with a new edition, brought up to date with suggestions like: teach your parents how to use the internet.[5] It seems rather dogmatic, or arbitrary, to provide such specificity to the virtue. How uncomfortable would it be for the woman who had been repeatedly molested by her unjust father to sit down next to him decades later to teach him keystrokes at such close range? The obligatory weight of a six-hundred-year-old text might push her over an emotional cliff.
Generally speaking, law is like a thunderstorm’s wind. It does not discriminate between one light object and another in blowing through. A law is not directed to an individual; nor is law capable of discerning between individual cases. Even in common law, the law coming out of a case is meant to apply beyond the particular plaintiff and defendant. Roe v. Wade, for example, applies to any woman in the U.S. who wants to have an abortion.  
In contrast, virtue interacts with the idiosyncratic nature of the human psyche to form a unique moral character, which the person then applies to particular social situations. Filial piety in practice might mean one thing to you and something else to me even if we agree on the internal essence of the virtue. Furthermore, how we choose to express the virtue externally involves our own particular histories and situations. You might be obligated morally to visit your parents, while I face a psychological and moral obligation to avoid mine. Law is not a sufficiently fine-tuned instrument to accommodate both of us, even as we share the same virtue.
Given human nature, law is both necessary and limited in its capacity. Even in the case of an autocratic state such as China, the law can only do so much in touching a citizen’s interior life, the life of the soul. Instead of issuing particular requirements to foster the virtue of filial piety in society, the Chinese government could have put resources into helping Chinese adults with living parents assess how to apply the virtue to their particular situations, rather than determining one size to fit all.


[1] An alternative means, which I do not discuss here, involves the future possibility of scientists being able to “tweak” the human genome to make human beings less inclined to vice and more virtuous. For example, if greed is an instinct or urge to “get still more,” perhaps through genetics that instinct can be expunged from human nature. In terms of virtue, genetics might make being generous more pleasurable. Where such genetic treatments are available to everyone, it seems to me that a good ethical argument could be made on their behalf.
[2] Edward Wong, “A Chinese Virtue Is Now the Law,” The New York Times, July 2, 2013.
[3] Bible, Deut. 8:3. See also Matt. 4:4.
[4] Edward Wong, “A Chinese Virtue Is Now the Law,” The New York Times, July 2, 2013.
[5] Andrew Jacobs and Adam Century, “As China Ages, Beijing Turns to Morality Tales to Spur Filial Devotion,” The New York Times, September 5, 2012.

Kant on the NSA Lying to Congress

James Clapper, Director of U.S. National Intelligence, told the U.S. Senate Intelligence Committee in March 2013 that the National Security Agency was not gathering any type of data at all on millions, and even hundreds of millions, of Americans. After leaked documents showed that Clapper had misled the committee in stating, “There are cases where they could inadvertently perhaps collect, but not wittingly,” he issued an apology to the committee for having made the comment that was “clearly erroneous.”[1] U.S. Senator Dianne Feinstein, chair of the committee, praised Clapper as an honest and direct man.[2] The discerning reader realizes the full implications of the difference between being in error and lying. To err is human, but to deliberately fabricate for the benefit of oneself or one’s group is a matter on which particular humans can and do differ morally.
Kant is especially well-known for having claimed that it is never ethical to lie. This makes Kant's ethics difficult to accept in terms of "white lies," which are told for another person's benefit rather than for selfish reasons. In terms of universalizing lying as a practice, were everyone to decide to lie on a regular basis, the truth would lose its value because no one would trust it. Lying would no longer have its intended value either, as it would be expected and therefore ignored. In other words, universalizing the maxim of lying would be self-contradictory concerning lying itself; lying universalized would involve a logical contradiction. Put another way, lying as a practice universalized would insult reason itself, and thus be unethical to any rational nature. As human beings, we have such a nature.
It is admittedly strange to think of "unethical" in terms of a logical contradiction being contrary to reason. It is easier to think that a logical contradiction is simply not possible for a being having a rational nature. Kant is saying that because we are rational beings, it is unethical for us to do something if universalizing the practice, such that everyone does it, would involve a logical contradiction. Is this criterion simply an expedient method for determining whether a given practice is ethical or unethical, or was Kant really thinking of ethics differently than we ordinarily think of the ought? If the latter, even saying that a logical contradiction is unethical because it insults reason would introduce emotion where there is none; Kant's notion of what it means for some action to be unethical would be that reason is being used against its own rational nature.
Put in Kant's other formulation, which is much easier to understand, lying involves treating other people as one’s means only, rather than also as ends in themselves. You can still treat someone as your means, as long as you also treat him or her as an end in himself or herself. This formulation has been likened to the Golden Rule: Do unto others as you would have done to you. However, I wouldn't like to be anyone's means, and yet Kant permits this as long as I'm also treated as an end in myself (as a rational being). So I don't think Kant's ethic is as idealistic as the Golden Rule.
Applying Kant to the case at hand, James Clapper either testified in ignorance or to deceive the senators. If the former, the NSA should not have sent him to testify. Either the agency was at fault for sending the wrong guy or Clapper should have known of the program but did not. If during an NSA meeting covering the program he had been daydreaming of spying on the woman living next door, he is culpable. He would have been using his boss as a means rather than also as an end in himself.
Alternatively and more likely, if Clapper knowingly deceived the committee, either on orders from the NSA or from his own will, he used the senators as means rather than also as ends in themselves. Put another way, I doubt that Clapper likes to be lied to; neither does Dianne Feinstein or any of the other senators. Nor do the people of the states that those senators represent, or the general public for that matter. An agent knowingly misleading his or her principal is a particularly sordid instance of lying; not only is it selfish and inconsiderate, it is also insubordinate.
Whether out of ignorance or deceit, Clapper’s error or lie points to insufficient democratic accountability of the NSA to Congress. If NSA chief Gen. Keith Alexander lied to Congress in saying that the NSA could not determine how many U.S. communications were being gathered at the time when in fact the NSA was using its auditing tool Boundless Informant precisely to determine the number of such communications, a disturbing pattern rather than a single incident of faulty testimony would characterize the NSA.[3] In particular, the agency could have developed an organizational culture in which the elected representatives in Congress and even truth-telling itself are insufficiently respected and valued. Such anti-democratic values may be the underlying culprit behind what could be a cavernous hole in democratic accountability, a breach that would of course maintain the illusion of ongoing accountability.


1. Kimberly Dozier, “James Clapper: Answer on NSA Surveillance to Congress Was ‘Clearly Erroneous’,” The Huffington Post, July 2, 2013.
2. Jeremy Peters, “Feinstein’s Support for N.S.A. Defies Liberal Critics and Repute,” The New York Times, July 1, 2013.
3. Kimberly Dozier, “Edward Snowden: NSA Lying, Collecting All Communications Into and Out of U.S.,” The Huffington Post, July 8, 2013.

Friday, November 16, 2018

When Partisanship Takes on Science on Global Warming: The Part before the Whole

Thomas Jefferson and John Adams concurred on the following preference: a natural aristocracy of virtue and talent over the artificial sort of birth and wealth. Talent here is not merely skill, but also knowledge. Hence the two former U.S. presidents agreed that citizens ought to be given a broad basic education in free schools. The corollary is that as a citizenry lapses in virtue and knowledge, decadence will show up in public discourse and consequently in public policy. If left unchecked, the tendency is for the republic to fall.
Therefore, as governor of Virginia, Jefferson proposed a Bill for the More General Diffusion of Knowledge in 1779. His rationale was that because even “those entrusted with power” who seek to protect individual rights can become tyrants, popular education is necessary to render a republic secure. Jefferson’s hope was that by teaching “the people at large” examples of despots in history, the electorate would be more likely to recognize despots in their own time and throw the bastards out on their noses. As for those whom voters put in public offices, Jefferson believed that “laws will be wisely formed, and honestly administered, in proportion as those who form and administer them are wise and honest.” Hence, “those persons, whom nature hath endowed with genius and virtue, should be rendered by liberal education worthy to receive, and able to guard the sacred deposit of the rights of their fellow citizens.” This is why, beginning around 1900, law schools in the American states began to admit applicants to the undergraduate degree in law (LL.B. or J.D.) who had already earned an undergraduate degree in the liberal arts and sciences. It was not as though the undergraduate degree in law had been promoted to graduate status.
Having had largely self-governing, popularly-elected colonial legislatures for much of the seventeenth century, the nascent American republics would stand on the two pillars of virtue and talent (including knowledge) instilled in the self-governing peoples themselves as well as in their elected and appointed public officials. It is said that the only constant is change, as in the extent to which an electorate is virtuous and generally knowledgeable, as well as in the related rise and fall of republics. One notable example is ancient Rome, which went from being a republic to a dictatorship under the purported exigencies of war. Lest the rise and fall of republics seem a bit too dramatic to be considered realistic, I offer the more modest thesis that a decline in virtue and knowledge among an electorate renders public policy increasingly deficient in dealing with contemporary problems. The matter of climate change is a case in point.
According to a study at Yale in April 2013, Americans’ conviction that global warming was happening had dropped by seven percentage points over the preceding six months, to 63 percent. The unusually cold March, quite a reversal from the previous March, explains the drop, according to the poll’s authors. The cold may actually have resulted from the arctic jet stream loosening southward, like a rubber band whose elasticity has been compromised, due to more open water in the Arctic Ocean and thus less of a temperature differential in the air. Even so, only 49% of Americans believed that human activities were contributing to global warming. In fact, only 42% of Americans believed at the time that most scientists had concluded that global warming is really happening. Thirty-three percent of Americans were convinced that “widespread disagreement” existed among scientists.
In actuality, a study of more than 4,000 articles touching on human-sourced climate change showed that 97% of the scientists who had written the articles concluded that human-caused warming was already happening. Less than 3% either rejected the notion or remained undecided. “There is a gaping chasm between the actual consensus and the public perception,” one of the study’s authors remarked. “It’s staggering given the evidence for consensus that less than half of the general public think scientists agree that humans are causing global warming. This is significant,” the author concludes, “because when people understand that scientists agree on global warming, they’re more likely to support policies that take action on it.” Going back to Jefferson and Adams, ignorance among the electorate in a republic can divert enough political will that legislation needed to fix a societal (or global) problem is thwarted.
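To make the "gaping chasm" concrete, here is a minimal sketch in Python that uses only the figures quoted above from the Cook et al. study and the Yale poll; the variable names and the rounding are my own illustrative assumptions, not anything from the study itself.

```python
# Illustrative arithmetic based only on the figures quoted in this post.
articles_reviewed = 4000          # "more than 4,000 articles" (Cook et al. 2013)
consensus_share = 0.97            # 97% concluded human-caused warming was already happening
public_perception = 0.42          # 42% of Americans believed most scientists had so concluded

endorsing_articles = round(articles_reviewed * consensus_share)
gap_points = (consensus_share - public_perception) * 100

print(f"Articles endorsing the consensus: about {endorsing_articles} of {articles_reviewed}")
print(f"Scientific consensus: {consensus_share:.0%}; public perception of consensus: {public_perception:.0%}")
print(f"Gap: {gap_points:.0f} percentage points")
```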
Perhaps some of the apparent ignorance on global warming in 2013 could actually have been partisan angst. If President Obama favored policies predicated on the assumption that human-sourced global warming was then already underway, just his support alone could have been enough for some Republicans to hold firm in their denial of even other-sourced global warming. In holding knowledge hostage to score cheap partisan points, citizens and their representatives do not demonstrate much respect for knowledge or for virtue; the vice of partisanship subdues the good of the whole in preference for the good of a part.
If Jefferson and Adams were correct that a virtuous and knowledgeable citizenry is vital to the continuance of a republic, the extent of ignorance and partisan vice related to global warming, in spite of the nearly unanimous scientific conclusion and the huge stakes involved, suggests that the American republics and the grand republic of the Union may be on borrowed time (and money). Moreover, that the ignorance and vice pertain to global warming enlarges the implications to include the continuance of the species. That is to say, a virtuous and educated species may be necessary for its very survival.
See this PSA on global warming: http://www.thewordenreport.blogspot.com/2013/05/global-warming-psa.html


Academic Sources:
Philip Costopoulos, “Jefferson, Adams, and the Natural Aristocracy,” First Things, May 1990.
Yale Project on Climate Change Communication, “Americans’ Global Warming Beliefs and Attitudes in April 2013,” Yale School of Forestry and Environmental Studies, 2013.
John Cook, Dana Nuccitelli, Mark Richardson, et al., “Quantifying the Consensus on Anthropogenic Global Warming in the Scientific Literature,” Environmental Research Letters 8, no. 2 (2013).
Press Source:
Tom Zeller, “Scientists Agree (Again): Climate Change Is Happening,” The Huffington Post, May 16, 2013.

Wednesday, November 14, 2018

The Gettysburg Address: Shaped by Small Pox?

By the time Lincoln was back on the train returning to Washington, he was down with a high fever from smallpox. I’m thinking the illness did not grip the president the second he stepped on the train. Already distraught over Mary’s fall from a horse-carriage, his son Tad’s grave illness, and the old, tired war, the president was almost certainly already stricken when he delivered the address and perhaps even when he wrote it the day and evening before. I suspect that the Gettysburg Address would not have been only 272 words long had Lincoln been well.
I make a point of getting a flu shot every year now. Contracting the illness was particularly costly academically when I was in graduate school. Typically, I would ration any accumulated energy to going to class. Back in bed, I found writing to be quite arduous, and sustained reading to be almost as exhausting. In terms of writing, editing particular words or sentences was easiest, for it takes far less energy to think than to write on and on.
I suspect that Lincoln wrote such a short speech because thinking up just the right word or phrase was easier than writing a lot. Smallpox is much more serious than the common cold. Lincoln was likely already exhausted and feeling bad on the train to Gettysburg and in the bedroom that night before the day of the address. Lincoln’s emphasis on diction rather than length was likely a function of the illness rather than political calculus.
Lincoln's address was so short that the photographer only caught the president as he was returning to his seat. In the photo, Lincoln's head (below the leafless tree, just above the crowd-level, and facing the camera) is down, perhaps because he was already not feeling well. Image Source: Wikimedia Commons.
By the end of the twentieth century and into at least the first decades of the twenty-first, U.S. presidents typically relied on a speech-writing staff to write many speeches, the vast majority of which were long. One effect of this trend is the shift in presidential leadership from broad principles to incremental legislative reform. In this context of technician presidents, the attendant speech-inflation resists any feasible restraint. Strangely, presidents overlook Lincoln’s short address as a precedent and act more like Edward Everett, the famous orator who spoke for two hours just before Lincoln. In spite of the obvious lesson from Gettysburg, the notion that a very short speech can be more powerful than a long one has been lost on the American political elite.
The explanation may lie in Lincoln’s address being a function of his being ill rather than of any political calculus. Even so, a discovery is a discovery, even if it comes about by accident. That the subsequent political success of the Gettysburg Address did not give rise to an ongoing practice in political rhetoric suggests that such a short, extremely thought-out speech runs against the current of politics of the moment, and even a year or two out. Stature achieved by hard-thought reputational management, by intensely investing in word choice, or diction, is nevertheless of value even within the space of a four-year term, especially if the incumbent has courageously taken on a few vested interests by moving society off a “sacred cow” or two. Even if neither statesmanship nor politics accounts for the severe brevity of Lincoln’s address, I contend that much political gold is waiting for the leader, whether in the public or private sector, who radically alters his or her rhetorical style and preparation.

On the History of Thanksgiving: Challenging Assumptions

We humans are so used to living in our subjectivity that we hardly notice it or the effect it has on us. In particular, we are hardly able to detect or observe the delimiting consequences of the assumptions we hold on an ongoing basis. That is to say, we have no idea (keine Ahnung) of the extent to which we take as unalterable matters that are actually quite subject to our whims, individually or as a society (i.e., shared assumptions). In this essay, I use the American holiday of Thanksgiving, specifically its set date on the fourth Thursday of November, to illustrate the following points.
First, our habitual failure to question our own or society’s assumptions (i.e., not thinking critically enough) leaves us vulnerable to assuming that the status quo is binding when in fact it is not. All too often, we adopt a herd-animal mentality that unthinkingly “stays the course” even when doing so is, well, dumb. In being too cognitively lazy to question internally or in discourse basic, operative assumptions that we hold individually and/or collectively, we unnecessarily endure hardships that we could easily undo. Yet we rarely do. This is quite strange.
Second, we tend to take for granted that today’s familial and societal traditions must have been so “from the beginning.” This assumption dutifully serves as the grounding rationale behind our tacit judgment that things are as they are for a reason and, moreover, lie beyond our rightful authority to alter. We are surprised when we hear that some practice we had taken as foundational actually came about by accident or just decades ago.
For example, modern-day Christians might be surprised to learn that one of the Roman emperor Constantine’s scribes (i.e., lawyers) came up with the “fully divine and fully human,” or one ousia, two hypostases, Christological compromise at the Nicene Council in 325 CE. Constantine’s motive was political: to cease the divisions among the bishops, with the objective of furthering imperial unity rather than enhancing theological understanding.[1] Although a Christian theologian would point out that the Holy Spirit works through rather than around human nature, lay Christians might find themselves wondering aloud whether the Christological doctrine is really so fixed and thus incapable of being altered or joined by equally legitimate alternative interpretations (e.g., the Ebionite and Gnostic views).
Let’s apply the same reasoning to Thanksgiving Day in the United States. On September 28, 1789, the first Federal Congress passed a resolution asking that the President set a day of thanksgiving. After an improbable win against a mighty empire, the new union had reason to give thanks. A few days later, President George Washington issued a proclamation naming Thursday, November 26, 1789 as a "Day of Publick Thanksgivin."[2] As subsequent presidents issued their own Thanksgiving proclamations, the dates and even months of Thanksgiving varied until President Abraham Lincoln's 1863 Proclamation that Thanksgiving was to be commemorated each year on the last Thursday of November. Here, the attentive reader would be inclined to jettison the “it’s always been this way” assumption and mentality as though opening windows on the first warm day of spring. The fresh air of thawing ground restores smell to the outdoors from the long winter hibernation and ushers in a burst of freedom among nature, including man. Realizing that Thanksgiving does not hinge on its current date unfetters the mind even if just to consider the possibility of alternative dates. Adaptability can obviate hardships discovered to be dogmatic in the sense of being arbitrary.[3]
The arbitrariness of Lincoln’s proclaimed date was not lost on Franklin Roosevelt (FDR). Concerned that the last Thursday in November 1939, which fell on the last day of the month, would weaken the economic recovery on account of the shortened Christmas shopping season, he moved Thanksgiving to the penultimate (second to last) Thursday of November. He defended the change by emphasizing "that the day of Thanksgiving was not a national holiday and that there was nothing sacred about the date, as it was only since the Civil War that the last Thursday of November was chosen for observance.”[4] Transcending the common assumption that the “last Thursday of November” attribute was essential to Thanksgiving’s very nature, even sacred, as though solemnly passed down from the Founders by some ceremonial laying on of hands, FDR had freed his mind to reason that an economic downside need not be accepted as necessary; he could fix a better date without depriving Thanksgiving of being Thanksgiving.
To be sure, coaches and football fans worried that even a week’s difference could interrupt the game’s season. In a column in The Wall Street Journal in 2009, Melanie Kirkpatrick points out that "by 1939 Thanksgiving football had become a national tradition. . . . In Democratic Arkansas, the football coach of Little Ouachita College threatened: 'We'll vote the Republican ticket if he interferes with our football.'"[5] Should Christmas have been moved to April so as not to interfere with college basketball? Sadly, the sheer weight attached to the “it’s always been this way” assumption could give virtually any particular inconvenience an effective veto power even over a change for the better, generally (i.e., in the public interest).
Unfortunately, most Americans had fallen into the stupor wherein Thanksgiving just had to be on the last Thursday of November. “The American Institute of Public Opinion, led by Dr. George Gallup, released a survey in August showing 62 percent of voters opposed Roosevelt's plan. Political ideology was a determining factor, with 52 percent of Democrats approving of Roosevelt's move and 79 percent of Republicans disapproving.”[6] Even though the significance of the overall percentage dwarfs the partisan numbers in demonstrating how pervasive the false assumption was at the time among the general population, the political dimension was strong enough to reverberate in unforeseen ways.
With some governors refusing to recognize the earlier date, only 32 states went along with Roosevelt.[7] As a result, for two years Thanksgiving was celebrated on two different days within the United States. In his book, Roger Chapman observes that pundits began dubbing "the competing dates 'Democratic Thanksgiving' and 'Republican Thanksgiving.'"[8] Sen. Styles Bridges (R-N.H.) wondered whether Roosevelt would extend his powers to reconfigure the entire calendar, rather than just Thanksgiving. "I wish Mr. Roosevelt would abolish Winter," Bridges lamented.[9] Edward Stout, editor of The Warm Springs Mirror in Georgia (where the president traveled frequently, including for Thanksgiving) said that while he was at it, Roosevelt should move his birthday "up a few months until June, maybe" so that he could celebrate it in a warmer month. "I don't believe it would be any more trouble than the Thanksgiving shift."[10] Although both Bridges and Stout were rolling as though drunk in the mud of foolish category mistakes for rhetorical effect, the idea of moving a holiday that has at least some of its roots in the old harvest festivals so that it actually coincides with harvests rather than winter in many states could itself be harvested once the “it’s always been this way” assumption is discredited. Just as a week’s difference would not dislodge college football from its monetary perch, so too would moving the holiday merely to the third week of November barely make a dent in easing the hardship of travel or in bringing the holiday anywhere close to harvest time in many of the American republics. As one of my theology professors at Yale once said, “Sin boldly!” If you’re going to do it, for God’s sake don’t be a wimp about it. Nietzsche would undoubtedly second that motion.
Why not join with Canada in having Thanksgiving in mid-October (Canada’s falls on the second Monday of the month)? Besides having access to fresh vegetables and even the outdoors for the feast, the problematic weather-related travel would be obviated and Americans would not come to New Year’s Day with holiday fatigue. Of course, we wouldn’t be able to complain about the retailers pushing Christmas over Thanksgiving in line with the almighty dollar, but amid the better feasts and perhaps colorful leaves we might actually allow ourselves to relish nature’s splendors (or maybe even give thanks!) rather than continue striving and complaining.
To be sure, resetting Thanksgiving to autumn in several of the states would translate into summer rather than harvest time in several others. Still other states are warm even in the last week of November, and harvest time might be December or March. Perhaps instead of carving the bird along partisan lines, Thanksgiving might be in October (or even the more temperate September!) in the “Northern” states and later in the “Southern” states, given the huge difference in climates. Remaining impotent within an antiquated assumption that lives only to forestall positive change, while retailers continue to enable Christmas to encroach on Thanksgiving, reeks of utter weakness.
Giving serious consideration to the notion of different states celebrating Thanksgiving at different times might strengthen rather than weaken the American union. Put another way, invigorating the holiday as a day of thanksgiving amid nature’s non-canned bounty might recharge the jaded American spirit enough to mitigate partisan divides, because more diversity has been given room to breathe. For the “one size fits all” assumption does not bode well at all in a large empire of diverse climes. Indeed, the American framers crafted an updated version of federalism that could accommodate a national federal government as well as the diverse conditions of the republics constituting the Union. Are the states to be completely deboned as though dead fish on the way to the market at the foot of the Lincoln Memorial? Is it so vitally important that everyone does Thanksgiving on the same day when “by state” enjoys a precedent?
Engulfed in the mythic assumption that the current fixed date in late November is a necessary and proper fit for everyone and everywhere, Americans silently endure, as if out of necessity, all the compromises we have been making with respect to the holiday. Perhaps changing the date or returning the decision to the states would free up enough space for the crowded-in and thus nearly relegated holiday that people might once again feel comfortable enough to say “Happy Thanksgiving” in public, rather than continuing to mouth the utterly vacuous “Happy Holidays” that is so often foisted on a beguiled public.
Like Christmas and New Year’s Day, Thanksgiving is indeed now an official U.S. holiday. It would remain a holiday even were the states to establish it as their respective residents see fit. As push-back against FDR’s misguided attempt to help out the retailers and the economy, Congress finally stepped in about two months to the day before the Japanese attacked Pearl Harbor in Hawaii (whose harvest time escapes me). The U.S. House passed a resolution declaring the last Thursday in November to be a legal holiday known as Thanksgiving Day. The U.S. Senate modified the resolution to the fourth Thursday so the holiday would not fall on a fifth Thursday in November, lest the Christmas shopping season be unduly hampered as it rides roughshod over Thanksgiving. Roosevelt signed the resolution on December 26, 1941, the day after Christmas, finally making Thanksgiving a legal holiday alongside Christmas and New Year’s Day.[11] Interestingly, the U.S. Commerce Department had found that moving Thanksgiving up a week had had no impact on Christmas sales.[12] In fact, small retailers actually lamented the change because they had flourished under the “last Thursday” Thanksgiving rubric; customers fed up with big-name department stores like Macy’s being so overcrowded during a truncated “Christmas season” would frequent the lesser-known stores in relative peace and quiet. Charles Arnold, proprietor of a menswear shop, expressed his disappointment in an August letter to the president. "The small storekeeper would prefer leaving Thanksgiving Day where it belongs," Arnold wrote. "If the large department stores are over-crowded during the shorter shopping period before Christmas, the overflow will come, naturally, to the neighborhood store."[13] This raises the question of whether a major legal holiday is best treated as whatever results from the tussle of business forces oriented to comparative strategic advantage as well as overall sales revenue.
Lest the vast, silent majority of Americans continue to stand idly by, beguiled by the tyranny of the status quo as if it were rooted in the permafrost of “first things,” it is worth remembering that things are not always as they appear or have been assumed to be. We are not so frozen as we tend to suppose with respect to being able to obviate problems or downsides that are in truth dispensable rather than ingrained in the social reality.


1. Jaroslav Pelikan, Imperial Unity and Christian Division, Seminar, Yale University.
2.  The Center for Legislative Archives, “Congress Establishes Thanksgiving,” The National Archives, USA. (accessed 11.26.13).
3. The other meaning of dogmatic is “partial” in the sense of partisan or ideological more generally. Given the extent to which a person can shift ideologically through decades of living, might it be that partisan positions are not only partial, but also arbitrary?
4. Sam Stein and Arthur Delaney, “When FDR Tried To Mess With Thanksgiving, It Backfired Big Time,” The Huffington Post, November 25, 2013.
5. Melanie Kirkpatrick, “Happy Franksgiving: How FDR tried, and failed, to change a national holiday,” The Wall Street Journal, November 24, 2009.
6. Sam Stein and Arthur Delaney, “When FDR Tried To Mess With Thanksgiving, It Backfired Big Time,” The Huffington Post, November 25, 2013.
7. Ibid.
8. Roger Chapman, Culture Wars: An Encyclopedia of Issues, Viewpoints, and Voices (Armonk, NY: M.E. Sharpe, 2010).
9. Sam Stein and Arthur Delaney, “When FDR Tried To Mess With Thanksgiving, It Backfired Big Time,” The Huffington Post, November 25, 2013.
10. Ibid.
11. The solely religious holidays in November and December are private rather than legal holidays. As Congress cannot establish a religion on constitutional grounds, Christmas is a legal holiday in its secular sense only. Therefore, treating Christmas as a legal holiday as akin to the private religious holidays (including Christmas as celebrated in churches!) is a logical and legal error, or category mistake. Ironically, Thanksgiving, in having been proclaimed by Lincoln as a day to give thanks (implying “to God”), is the most explicitly religious of all the legal holidays in the United States.
12. Ibid.
13. Ibid.