Tuesday, September 30, 2014

The New York Fed: A Case of Regulatory Capture

According to The Wall Street Journal, a study sponsored by the Federal Reserve Bank of New York in 2009 uncovered “a culture of suppression that discouraged regulatory staffers from voicing worries about the banks they supervised.”[1] Whereas the report points to excessive risk aversion and group-think as the underlying problems, a fuller explanation is possible—one with clear implications for public policy.

The New York Fed is the primary regulator of many large financial institutions; this regulatory power was increased under the Dodd-Frank financial reform law of 2010. Put another way, the American public relies heavily on the New York Fed to keep the sort of massive credit-freeze that happened in September 2008 from happening again. Therefore, regulatory capture, wherein an agency’s regulators have been “captured” by the institutions they regulate, would heap mounds of systemic risk on the financial system.

Unfortunately, Carmen Segarra, a former examiner at the New York Fed, released tapes in 2014 catching regulators going soft on the banks presumably being regulated. In fact, she claims she was fired in 2012 for “refusing to overlook Goldman’s lack of a conflict of interest policy and other questionable practices that should have brought tougher regulatory scrutiny.”[2] Generally speaking, her tapes make the central bank look “deferential and ineffectual, and apparently concerned above all with accommodating the banks it [is] supposed to regulate.”[3] This has regulatory capture written all over it.

As easy as it is to point to Fed staffers seeking to cash in by going soft on a regulated bank now in order to land a high-paying job at it later, consider that Segarra’s superiors had her change meeting minutes she had taken to eliminate a Goldman Sachs executive’s statement that “once clients become wealthy enough, certain consumer laws [don’t] apply to them.”[4] To be sure, the executive was correct; wealthy individuals can qualify with the S.E.C. as accredited investors. Even so, that the New York Fed went to the trouble of eliminating the executive’s comment from the record—as well as dissuading Segarra from pressing the banker on his statement—may suggest that the light treatment comes from the top.

Stephen Friedman, the chairman of the New York Fed’s board of directors, sat on Goldman’s board and had a large holding in the bank’s stock even though such a financial relationship violates Federal Reserve policy. William Dudley, who became the New York Fed’s president in 2009, had been a partner at Goldman Sachs. Moreover, the bank’s employees had given the Obama ’08 campaign roughly $1 million—among the largest totals from any single company. All of this points to why institutional (or structural) conflicts of interest are inherently unethical, as well as dangerous to a republic.

During a press conference in 2014, President Obama notably included himself in saying that no one in Washington is clean; the amounts needed to mount a successful electoral campaign mean that every elected officeholder is beholden to powerful interests that are regulated. This may explain why the president caved on a public option in the health-insurance reform that would become known as the Affordable Care Act. Similarly, he acted contrary to former Fed chair Paul Volcker's recommendation that the Dodd-Frank legislation include a provision to break up the largest banks on account of their systemic risk to the entire financial system and economy. Being beholden to wealthy interests in the status quo makes a mockery of anything resembling "real change."

So the White House may have been putting pressure on New York Fed presidents who, as regulators, would have been inclined anyway to go easy on the banks. If Barack Obama is correct, then at least part of a solution is either mandatory public funding of campaigns or requiring television and radio companies, as a condition of using the public airwaves, to provide ample free ad time for candidates before an election. The problem of the revolving door would still need to be dealt with, but at least the officials with tremendous regulatory discretion would not feel obliged to pay back the regulated instead of demanding more vigorous scrutiny.

[1] Pedro N. Da Costa, “N.Y. Fed Staff Afraid to Speak Up, Secret Review Found,” The Wall Street Journal, September 28, 2014.
[2] Sam Levine, “Elizabeth Warren Calls for Investigation of NY Fed Over Secret Tapes,” The Huffington Post, September 28, 2014.
[3] Ben Walsh, “The Fed Is Even Afraid to Ask Goldman Sachs the Easy Questions,” The Huffington Post, September 30, 2014.
[4] Levine, “Elizabeth Warren.”

Wednesday, September 24, 2014

The Last Emperor: A Curious Case of Limited Absolute Power

People either obey a powerful government official or rebel. A rebellion does not typically include continued loyalty to the sovereign; the French Revolution demonstrates this point. Yet in China in the 1910s, as the Qing dynasty lost power, the authority of the emperor became more complex—or maybe it had been so throughout the dynasty.

The full essay is at “The Last Emperor” 

CEO/worker Pay: Perceptual Shortcomings

According to one study of people around the world, people of different cultures, incomes, religions, and other differences show “a universal desire for smaller gaps in pay between the rich and poor” than was actually the case at the time of the survey in 2014.[1] Interestingly, the respondents didn’t have a clue how much of a gap actually existed in their respective economies. This difficulty in estimation means that the public discourse on economic inequality has been rife with erroneous assumptions. Where the error lies in the direction of minimizing the gap, we can postulate that public policy allows for greater economic inequality than would otherwise be the case.

The United States, for example, surged past Peter Drucker’s wall of 20 to 1 (CEO compensation to average worker pay), hitting 40 to 1 in 1994 and then 400 to 1 in 2005. Why would America’s silent majority put up with such economic inequality? The short answer might lie in the power of corporations, working through media corporations, to lull television viewers into supposing that the difference in compensation is not very significant—significance involving not only perception, but judgment as well. That is to say, whether the gap is perceived as significant is a value judgment that can be subtly manipulated.

In spite of an actual gap of 350 to 1 (CEO compensation to unskilled worker pay) in 2014, the Americans surveyed estimated the ratio to be 30 to 1.[2] Such a perceptual judgment could have been influenced by the lack of media attention to the topic. The ideologization of American broadcast journalism—the blurring of the lines between reporting and advocating—points to just how much estimates of significance can be subject to external influence.

Considering the relatively wide actual gap allowed to exist in the United States as of 2014, what would public policy have looked like had the perceptions of the American public been adjusted up to 350 to 1? Would decentralized individual voters, forming majoritarian blocs demanding limits, have put enough pressure on their elected representatives to mitigate the power of wealth in the halls of legislatures as elections loom?

[1] Gretchen Gavett, “CEOs Get Paid Too Much, According to Pretty Much Everyone in the World,” The Huffington Post, September 24, 2014.
[2] Ibid.

Monday, September 15, 2014

The Scottish Referendum: A Political Analysis

Any political analysis of the Scottish referendum on secession from Britain should include not only the Scottish National Party (SNP) and Westminster, but also other large E.U. states and even the E.U. powers at the federal level. Such an analysis may leave the cynic wondering whether the question could even conceivably be decided by the Scots themselves—so much being on the line for state and federal officials and their respective institutions.

In May 2011, the Scottish National Party (SNP) won 69 out of 129 seats in the Scottish Parliament, with about 45 percent of the vote, up by more than 12 percentage points. Their three main rival parties—Labour, the Conservatives, and the Liberal Democrats—all lost ground.[1] John Curtice, a professor at Strathclyde University, opined that the SNP victory appeared to stem partly from dissatisfaction with the other political parties, particularly the Labour Party, rather than from an overwhelming desire to secede from the state. Surveys had consistently shown support for secession at between a quarter and a third of voters, he said.[2] According to a survey by the Scottish Sunday Mail newspaper published the previous month, only 33 percent of the adult residents in the region would back secession from Britain while 43 percent preferred the status quo, even though a bare majority favored a referendum on the question. Even so, one news outlet reported that “the unionist campaign is in disarray and the nationalists boast a leader who even his opponents admit is a highly skilled political operator.”[3]

How much say do the voters really have? Are they actually pawns being moved without their knowledge? Perhaps large vested interests are the real deciders. David Cheskin (AP)

In June 2011, MSNBC anticipated nonetheless that opposition from the state government in London to the northern region’s secession would be “fierce, not least because of its economic consequences.”[4] Most notably, in the three years between 2008 and 2011, the British government took in 28.74 billion pounds ($47.10 billion) in revenues from oil and gas production, the bulk of which is based in the North Sea.[5] Were the Scots to secede, they would presumably take with them their $38 billion oil industry, including the oil-rich offshore areas. Additionally, Westminster would lose about 10% of its tax revenue.[6] To be sure, a cross-subsidy went in the other direction, thanks largely to London’s wealthy taxpayers. Even so, Westminster had a strong economic incentive to put money behind its efforts to skew the vote. It is no surprise, therefore, that Mark Carney, the Bank of England Governor, said that keeping the British pound would be incompatible with “sovereignty.”[7] I would not be surprised to learn that Westminster was behind this timely warning to the Scots.

Politically, Britain less Scotland would have less influence in the European Council. As the most stridently Euro-Skeptic state in the E.U.—the South Carolina, as it were—Westminster had a huge incentive to maintain its influence in the European Council by retaining the Scottish population (8% of the UK’s total). For the Scots, a majority of those polled wanting to remain part of the E.U., secession from the Euro-Skeptic state could mean that Scotland would become an E.U. state in its own right even were Britain eventually to secede from the Union. That the president of the E.U.’s executive branch, the European Commission, said it would be “extremely difficult” for Scotland to gain statehood, even as a former president of the European Parliament, the E.U.’s lower legislative chamber, confirmed the Scottish National Party’s claim that Scotland could indeed accede, may point to Westminster’s influence over at least one very powerful federal official, José Barroso.

It is indeed strange that Barroso would seek to douse the Yes vote in the referendum. For E.U. federalists, anything that weakens a resolute anti-federalist state such as Britain should be encouraged. With a majority of Scots wanting to stay in the E.U., the E.U. itself—as well as its federalist-oriented states such as Germany, Belgium, and the Netherlands—had a political interest in strategically encouraging the Yes vote in the hope that Scotland would come in and the rest of the UK would leave. Even apart from the federalist E.U. states, the large states would benefit should Britain move down a tier, given that the states compete for influence in the Union. Barroso’s words of discouragement may thus confirm the lengths to which the British prime minister would go, and the extent of his political reach, given Westminster’s stake in the game being played out in Scotland.

In America, U.S. President Obama came out against Scotland gaining independence from Britain.[8] Doubtless he would also have opposed the secession of eastern Colorado, southern Illinois, and northern California, even though those regions would likely become new states, just as Scotland would in the E.U. The oddness of Obama's position stems from American colonial history, for even though Scotland is a region of the state of Britain in the E.U., akin to northern California in the state of California in the U.S., the theme of freedom from Britain ought to resonate with Americans in a way that Europeans cannot feel. Accordingly, I suspect that Barack Obama's public position is part of a larger deal or quid pro quo with David Cameron. Obama would have gotten something important to the U.S. in exchange for doing something else on a matter bearing much less on American interests.

Amidst all the influence of the political heavy-weights, a cynic could easily look in vain for a democratic basis for a determination on the question. It might even be said that “determination by the people” is a quaint cover for a power-struggle between major players in the E.U., with Westminster having so much on the line that even bribery could not be ruled out.

Even within Scotland, the political parties had a huge stake in the game. In the 2010 British general election, Labour won 41 of the Scottish region’s seats in the House of Commons, while the Conservative Party won just one—the Liberal Democrats and the SNP taking the rest.[9] Because the Conservatives do better at the state level, the Conservative Party in the Scottish region had a large stake in defeating the referendum, while the Labour Party had a likewise stake in seeing the measure pass.

What say is left for the Scots themselves among all the political heavy-hitters? Even though it may be idealistic to want the general will of the Scots themselves to decide the question, I cannot give up hope, even naïvely, that the outcome will be the one preferred by a majority in the region, clear of manipulation and outside money. While the political parties in Scotland have a legitimate stake in the game, for Westminster to agree to give the Scots a chance to decide the question only to attempt to undercut their say can only count as manipulative deceit. At the very least, the state government faced an inherently unethical institutional conflict of interest in using threats and other unsavory devices to manipulate the vote in the state’s own interest.

1. Ian MacKenzie, “Scottish Pro-Independence Party Wins Majority,” msnbc.com, May 6, 2011.
2. Ibid.
3. Ibid.
4. Ian Johnston, “Scotland to Split from UK and ‘Be a Nation Again’?” msnbc.com, June 7, 2011.
5. Ibid.
6. Kim Hjelmgaard, “Scotland Vote Comes Down to the Wire,” USA Today, September 17, 2014.
7. Ibid.
8. Jill Lawless, "Scotland Independence Vote Commences," Associated Press, September 18, 2014.
9. Paul Vale, “5 Reasons Why Many Scots Will Vote For Independence,” The Huffington Post, September 16, 2014.

Saturday, September 13, 2014

Beyond Breaking California Up into Six States: A Federalist Alternative

In any epoch and in any culture, the human mind displays a marked tendency to accept the status quo as the default—being so ensconced in fact that efforts at real change almost inevitably face formidable road-blocks. In this essay, I analyze the 2014 failed ballot-petition that would have put the proposal of breaking California into six separate states to Californians. I contend that the proponents could alternatively have taken up a more optimal alternative—one much easier to put into effect. Interestingly, that idea comes from the E.U. rather than the U.S.

When the U.S. constitutional convention hammered out a constitution in 1787, the thirteen states together had about 7 million people. In 2014, at the time of this writing, California alone sports a population of roughly 38 million. In terms of population alone, California would amount to more than five of the early United States. George Washington had held his tongue in the convention so he could preside neutrally until the very end, when he asked the delegates to change the minimum size of a district in the proposed U.S. House of Representatives from 40,000 residents to 30,000, the House being the repository of democracy in the federal government. Washington would likely shake his head in utter disbelief at such populous House districts as exist today, federally as well as at the state level. Additionally, the direct election of U.S. Senators is popularly taken to mean that California’s two senators represent 38 million people (technically, the senators represent California as a polity rather than its citizens directly). In contrast, the two U.S. senators from Wyoming “represent” only roughly 600,000 residents, which, while much better than 38 million, is still significantly more than 30,000.

In terms of territory-related diversity, California rivals the divide that existed between New England and the southern states in 1787. Redding is not exactly San Francisco or L.A., and the Central Valley is a world away from Santa Cruz. In terms of public policy, the practical necessity of compromise can easily result in a lowest common denominator very distinct from any of the regional preferences. In other words, no one is happy with the result that gets through the intractable blockages.

So it is perhaps no surprise that a proposal to split California into six states sought the light of day in 2014, and neither is it any surprise that the ballot-petition fell short. As much as the proposal might make sense, it would have run into the institutional conflict of interest facing U.S. senators from other states in deciding whether to approve the ten additional senators from what had been California. According to three Californians who work on such issues, “The other states, through their congressional representatives, have no incentive to approve a subdivided California; indeed, they have a huge disincentive to do so.”[1] This is so not only in terms of the U.S. Senate, but also the Electoral College, as having more U.S. senators would mean more electoral votes in presidential elections. Furthermore, the movement for smaller states would need to deal with James Madison’s point that a political minority is more likely to be oppressed in a smaller republic—although the six states would on average each have significantly more people than did the thirteen states in 1787. Even so, the obstacles to a member-state in the U.S. splitting into two or more states are certainly greater than in the E.U. The state of Britain does not need federal approval to split into four states, for example, or to have the Scottish region secede from the state.

Interestingly, Tim Draper, the venture-capitalist Californian who was behind the failed Six-State ballot-petitions, suggested in a talk at The Commonwealth Club that the counties could alternatively be given more of California’s retained and even residual sovereignty (the federal government’s being limited, or enumerated). The counties could then build support for eventual regional states from the ground up. Counties in Northern California would doubtless legalize marijuana in a split second and Napa would pass legislation enacting a grape-studded flag, while L.A. county would legalize drunk celebrities in public places and San Diego county would ok bad weather on a probationary trial-basis as if it were akin to stem-cell research.

While this strategy is preferable to Draper’s own from a democratic standpoint, because in both process and outcome the proposals would be a closer fit to the people, the “home-grown” proposals would still face the inexorable political obstacles at the federal level.

As still another doable alternative, Californians could look at the E.U. states that are themselves federal systems, such as Belgium, Germany, and Austria. The independent state of Switzerland could also serve as a model, should Californians want most of their state’s governmental sovereignty at the county or regional level. That is to say, California could have a confederal or (modern) federal system of governance. California would still be a member-state in the U.S., so U.S. approval would not be necessary, and Californians would benefit from the check-and-balance that federalism within their state could give them, as the counties or regions could check excesses of the California federal government (and vice versa). Sadly, such a mechanism had already succumbed to a lop-sided federal system at the state–U.S. Government level. In short, federalism can be applied not just as it was long ago at the empire or alliance level, but also at the modern-day member-state level—a large and diverse state being a federal system unto itself. I suspect that George Washington would agree.

[1] David Carrillo, Jack Citrin, and Ethan Rarick, “Carving Up California: Been There, Tried That,” Daily Journal, July 22, 2014.

Friday, September 12, 2014

Ebola in Liberia: The Government’s Fault?

With the Ebola virus “spreading like wildfire” in Liberia, “devouring everything in its path,” Brownie Samukai, the state’s defense minister, told the U.N. Security Council on September 9, 2014 that “Liberia is facing a serious threat to its national existence.”[1] With more than half of the epidemic’s deaths in that state—1,224 out of at least 2,296 in West Africa as of September 6, 2014—and new cases “increasing exponentially,” the World Health Organization (WHO) declared that “the demands of the Ebola outbreak have completely outstripped the government’s and partners’ capacity to respond.”[2] Meanwhile, the International Monetary Fund (IMF) reported that the illness had severely handicapped the mining, agriculture, and service sectors of the state’s economy.[3] Quite understandably, pleas for the government to do more pealed like frightened bells across the state. “The patients are hungry, they are starving. No food, no water,” a terrified woman told journalists. “The government needs to do more. Let Ellen Johnson Sirleaf do more!”[4] Even if valid, such blame is hypocritical to the extent that the people themselves had been refusing to do what is necessary to stop such a virus from spreading.

Concerning the validity of the woman’s charge that the government was not doing nearly enough, Samukai pointed out that the “already weak health infrastructure” was overwhelmed.[5] That is to say, the government had to work with an already-insufficient healthcare system. Why insufficient? Two theories of development give different answers. According to dependencia, or dependency theory, the infrastructure of a colony is oriented to getting commodities out to the colonizer rather than to developing an internal, web-like system. In a coastal colony, for example, priority goes to roads that run from the interior to the coast, where ships can pick up the goods and transport them to the core economy (e.g., Europe). The former colonies and even their successor governments cannot initially be blamed for the lack of internally-oriented infrastructure, yet at some point, after a sufficient amount of time as a sovereign state, the lack of any progress is surely blameworthy.

The modernization theory says that what holds a developing country back is not its colonial infrastructure, but, rather, things like tradition and ignorance that a people stubbornly cling to even when offered a better way. Superstition, for example, may keep people from working on certain days while tradition has it that a person should stop working as soon as he or she has enough for subsistence living. This inverse of the Protestant work ethic can keep capital from accumulating to the point that reinvestment can broaden an agrarian economy to include manufacturing industries. Rigidly sticking with the custom that puts child labor above education, a people can keep its young from becoming professionals, business entrepreneurs, and managers. Quite understandably, executives of foreign corporations are hesitant to start operations where such a base labor pool exists and reinforces itself.

Taken together, dependencia and modernization theory can account for the weak health infrastructure in Liberia and other former colonies in Africa. Healthcare for the natives had not been a priority of the colonizers. Additionally, education and domestic investment, as well as foreign direct investment, may be lacking even though they would contribute much to building a sound healthcare system.

Applied to the Ebola outbreak, we can look beyond the government and healthcare infrastructure to apply modernization theory to the people themselves. The funeral custom, for example, of touching the body of the deceased friend or relative is great for the virus, which spreads by touch rather than through the air. Even so, the Africans who have this tradition stubbornly and/or ignorantly held to it even as the epidemic was spreading. Additionally, villagers took to hiding sick residents rather than allowing visiting healthcare workers to take the infected to makeshift treatment facilities, out of fear that people go to such places only to die; meanwhile, the villagers themselves could become infected. In some cases, villagers even attacked the visitors, stubbornly ignoring their pleas.

Scared villagers in Liberia stand far away from the healthcare worker, even as they risk getting the virus by rubbing up against each other--ignoring the worker's pleas. (Image Source: The Washington Post)

Simply maintaining a distance from other people, rather than continuing to touch them, would have done a lot to smite the Ebola virus. Especially sordid is the assumption that the healthcare workers and government officials don’t know what they are talking about, especially if the person also assumes that he or she cannot be wrong—such as in knowing that touching a dead body brings with it benefits that can keep the person healthy or safe. Ignorance that cannot be wrong, backed up by tradition, can indeed be a silent killer, the odor of which can only be pleasing to the Ebola virus. Blaming the government rings hollow from such a putrid drum, even if officials could be doing a better job in mopping up the mess.

[1] Abby Ohlheiser, “Ebola Is ‘Devouring Everything in Its Path.’ Could It Lead to Liberia’s Collapse?The Washington Post, September 11, 2014.
[2] WHO, “Ebola Situation in Liberia: Non-Conventional Interventions Needed,” September 8, 2014; Elahe Izadi, “Ebola Death Toll Rises to 2,296 as Liberia Struggles to Keep Up,” The Washington Post, September 9, 2014.
[3] Anna Yukhananov, “IMF Says Ebola Hits Economic Growth in West Africa,” Reuters, September 11, 2014.
[4] Abby Ohlheiser, “Ebola.”
[5] Ibid.

Wednesday, September 10, 2014

Letter to the Scots: Read between the Lines

The answer may be staring you in the face. Such might be the best feedback the rest of the world could give the Scots as they discern whether their region should break off from the state of Britain. How do the English feel about the Scots? The answer is presumably relevant, as who wants to remain where they are not liked? On this matter, the Scots could do worse than read between the lines of a poll done roughly a month before the referendum on what the English think should be Scotland’s relation to Britain if the region leaves and if it stays.[1]

"It is striking how tough people in England are on Scotland whatever the referendum outcome," Jeffery said. "The message appears to be, 'Vote yes, by all means, but if you do, you're on your own.'" In the poll, two in three respondents in England said they would not want Scotland to use the British pound even though the Queen would continue as the head of state (i.e., Scotland would be in the British Commonwealth of nations--a partial residual of the British Empire). Only 1 in 4 were in favor of Britain helping an independent Scotland negotiate its accession as a state alongside Britain in the European Union and membership in NATO.[2]

If the residents in the Scottish region vote against breaking off from the state, English voters would overwhelmingly be in favor of giving the region more autonomy from the state government. Lest this seem too good to be true, those voters "also want to cut funding to Scotland and prevent Scottish members of the British Parliament from voting on issues concerning only England." The message here, according to Jeffery, one of the study's authors, is: "By all means have more devolution, but you can't then have a role at Westminster you do now, and don't expect any funding to flow northwards from England."[3]

Either way, the not-so-subtle message for the Scots is that they are hardly welcome. Such tension between two groups that both self-identify as a people within one state is doubtless counterproductive from the perspective of the state itself; two separate states in the E.U. would be more optimal, for the E.U. federal system permits both homogeneous political subunits, or states, and a diverse, empire-scale polity—hence the advantages of both. A state of two contending peoples, proverbially at each other's throats, is thus far from optimal for the federal system, not to mention for the state itself. Put another way, to argue that the UK is just the sort of political arrangement that works best with such a basic, contentious difference in group identification is to treat the E.U. state as if it were like the E.U. (or U.S.) itself, rather than a state thereof. A state in the E.U. cannot logically be equivalent to the E.U., or a subunit would be commensurate with that of which it is a subunit.

For the Scots, the simple message is that it is not good to remain in close quarters with a people who want the worst rather than the best for you. Reading between the lines, the English want you out. I submit that this factor ought not be a trivial one as the Scots deliberate on whether their region should break off from the E.U. state to become a new, relatively homogeneous one, and thus more conducive to both Britain and Scotland as states, and to the E.U. as well.

1. YouGov conducted the survey of 3,695 adults living in England via the internet on April 11-12, 2014.
2. Katrin Bennhold, "How Scottish Independence Relates to Larger Tax Fights," The New York Times, August 21, 2014.
3. Ibid.

Tuesday, September 9, 2014

Oceans Arising on Edifices of Arrogance

A study published in late November 2012 in the journal Science estimates that the melting of ice sheets in Antarctica and Greenland had raised global sea levels by 11.1 millimeters (0.43 inch) since 1992. That represents one-fifth of the total sea-level rise in that period. Other contributors include the expansion of sea water from warming and the melting of glaciers, as for instance on mountains. In the 1990s, melting of the polar ice sheets in Antarctica and Greenland was responsible for about 10 percent of the global sea-level rise, but by 2012 the effect had risen to 30 percent.[1] The study does not, however, uncover the underlying cause, or association, lying in a complexity in human nature itself. Our species has vaulted to the top of the food chain and leveraged a brain capable of engineering technological advances that would have seemed magical even in the nineteenth century, and yet we seem hard-wired to accelerate our course toward a self-destructive extinction. This lack of balance is reflected in the increasing extremes in the global climate. In this essay, I begin with the study and steadily work toward uncovering the underlying, subterranean culprit.

In Greenland, melted ice, or water, headed to the Atlantic Ocean. NYT

The study can be interpreted as essentially “firming up” what had hitherto been left to guesswork. “It allows us to make some firm conclusions,” Andrew Shepherd of the University of Leeds said. “It wasn’t clear if Antarctica was gaining or losing ice. Now we can say with confidence it is losing ice.”[2] This is significant because the combined ice of Greenland and Antarctica holds hundreds of feet of potential sea-level rise, and a substantial portion of that rise could occur in just two centuries. Unlike the melting of sea ice, meltwater from land ice adds to the ocean’s volume and is thus particularly salient in the rise of sea level.

Although correlation is not necessarily causation, global emissions of carbon dioxide were at a record high in 2011, having jumped 3 percent from the previous year. The international goal of limiting the ultimate warming of the planet to 3.6 degrees Fahrenheit was all but disregarded at the time as unrealistic, according to researchers at the Global Carbon Project. Slowly falling emissions in some developed economies, including the U.S., were more than matched by continued growth in developing countries like China and India. Coal use was growing fastest, with related emissions jumping more than 5 percent in 2011 from the previous year.[3]

Moreover, the level of carbon dioxide, the most important heat-trapping gas in the atmosphere, had increased 41 percent since the beginning of the Industrial Revolution. Meanwhile, the temperature of the planet had increased about 1.5 degrees Fahrenheit since 1850. The New York Times reports that, at the time of the release of the 2011 figures, scientists expected that further “increases in carbon dioxide” would “likely . . . have a profound effect on climate, . . . leading to higher seas and greater coastal flooding, more intense weather disasters like droughts and heat waves, and an extreme acidification of the ocean.”[4] The volume of carbon dioxide in the atmosphere in 2013 was 396 parts per million, 2.9 ppm higher than in 2012; this represents the largest year-to-year increase since 1984, when reliable global records began.[5] As a result, “[C]oncentrations of nearly all the major greenhouse gases reached historic highs in 2013, reflecting ever-rising emissions from automobiles and smokestacks but also, scientists believe, a diminishing ability of the world’s oceans and plant life to soak up the excess carbon put into the atmosphere by humans.”[6] Accordingly, climatologists were predicting more accelerated ice-melt.

To be sure, distinguishing a causal connection from the sort of natural cycle that was responsible for Greenland being green in medieval times has been the fulcrum of much controversy and debate; it is not as though scientists can treat one earth by increasing the carbon dioxide in its atmosphere while leaving another earth as a control group, and then measure the differing climatic consequences. As the philosopher David Hume argued in the eighteenth century, we actually know much less about cause and effect than we think we do. The human brain naturally treats a strong positive correlation accompanied by a logical rationale as good enough to pronounce a causal relationship.

Regardless of whether our use of fossil fuels is a contributing factor in the melting of the ice sheets, that the sea level was rising even in 2011 and that so much of humanity lives within fifty miles of a sea coast suggest that major dislocations will be necessary within one or two hundred years, and perhaps even sooner given the record-high level of carbon dioxide in the planet’s atmosphere in 2013. As much as a third of Florida could be underwater again—whether that eventuates as a result of a natural climatic cycle or carbon dioxide emissions, or both. That the acceleration in the ice-sheet melting reported in 2012 was five times what scientists had earlier supposed suggests that the data from the following year may result in even more dramatic headlines. That the levels of CO2 and of methane (from leaks in wells and distribution, as well as from permafrost melt) were not only increasing, but doing so at unprecedented rates, suggests that we humans have literally outdone ourselves. It seems a fantasy to expect prudent measures that would obviate beforehand even just some of the anticipated damage. Even in the wake of Hurricane Katrina, the U.S. Government could have given Louisiana a financial incentive to rebuild New Orleans farther inland, above sea level. That would have been an easy decision compared with what to do about Manhattan, and yet the unquestioned, knee-jerk response was to rebuild on site.

Accordingly, even the study on ice-melt reported in late 2012 and the 2014 report of the record increase in CO2 in the atmosphere were unlikely to affect public policy significantly, at least in the United States. Enabling such negligence, Andrew Shepherd of Leeds said in 2012, “The signals suggest there is no immediate threat.”[7] Meanwhile, carbon emissions were actually increasing from the previous year. This represents two degrees of separation from a reduction. The emissions targets in the UNFCCC agreement signed in 1992 had included stabilization at 1990 levels for some countries and reductions for others by the year 2000. In other words, we as a species seem pretty clueless, even as we promote ourselves as the highest and most developed of species.

We are, it can be said, a species of today. The stock markets demonstrate this innate propensity clearly enough. For a complex organism not known for quick evolutionary adaptation to a changing environment, it is dangerous to be so “hard-wired” for today when our artifacts can collectively shift a planetary equilibrium beyond its natural cycle. It is not a given that we will be able to rely on our prowess at technological development to make up for our convenient habit of looking on and even making a situation worse. I have personally felt this in momentary lapses from my new-found better diet, when I eat one chocolate cookie after another after having fallen with the first. My mind succumbs to the fallacy that, having lapsed, I might as well open the flood-gates. The next morning, I make myself go running, as if the presumed cause-effect relation might prevent any future lapses. Nietzsche may have been right in suggesting that thoughts are really instinctual urges and that reasoning is their tussling for dominance.

As Mark Twain observed, speaking through an angel in The Mysterious Stranger, “Man’s mind clumsily and tediously and laboriously patches little trivialities together and gets a result—such as it is.”[8] Yet so conceited is that mind, and in other matters too. In spite of having a moral sense, and perhaps because of our knowing right from wrong, our “paltry race [is] always lying, always claiming virtues which it hasn’t got.”[9] From an angel’s point of view, Homo sapiens—arrogantly self-named the “wise man” species—can only be “dull and ignorant and trivial and conceited, and so diseased and rickety, and such a shabby, poor, worthless lot all around”[10] as to be met with utter indifference from the angelic perches—the pathos of distance being hollow rather than filled with empathy or even sympathy.[11] Twain’s angel does not mince words. Humans “have nothing in common with me—there is no point of contact; they have foolish little feelings and foolish little vanities and impertinences and ambitions; their foolish little life is but a laugh, a sigh, and extinction.”[12] It is not merely the staying power of the moneyed commercial caste that moves us as a species toward our own extinction; all of us are complicit.

We have built our mammoth edifices and modern conveniences on such a scale, and we use them so much like junkies on a drug-fix, that we have outstripped our own capacity as a species even to mop up after ourselves. This vulnerability becomes truly dangerous now that we are capable of having a significant impact on the planetary ecosystem, including its atmosphere. Even so, we continue to single our species out as “Made in the Image of God,” and we preach our moral sense, ignorant of the probability that a more intentionally cruel and self-destructive race has never roamed on the land or swum in the sea. Our reckless conceit, it would seem to all outward appearances, is in such denial of its own existence that we naturally assume we cannot be wrong—that we affirm with such factuality, “I know what I know.” If only the ice on this towering edifice would melt from global warming; if only we could be so lucky.

1. Gautam Naik, “Polar Ice Melt Is Accelerating,” The Wall Street Journal, November 30, 2012.
2. Ibid.
3. Justin Gillis and John Broder, “With Carbon Dioxide Emissions at Record High, Worries on How to Slow Warming,” The New York Times, December 3, 2012.
4. Ibid.
6. Joby Warrick, "CO2 Levels in Atmosphere Rising at Dramatically Faster Rate, U.N. Report Warns," The Washington Post, September 9, 2014.
7. Gillis and Broder, “Carbon Emissions at Record High.”
8. Mark Twain, The Mysterious Stranger, in The Mysterious Stranger and Other Stories (New York: New American Library, 1962), p. 212.
9. Ibid., p. 192.
10. Ibid., p. 172.
11. Ibid., p. 176.
12. Ibid., p. 211.

Sunday, September 7, 2014

Natural Rights in Europe and America: Shoring-Up Each Other’s Weak Spots

The Declaration of Independence made by the thirteen newly sovereign American states in 1776 recognizes “that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.” These rights are not dependent on any government, and thus exist equally in the state of nature. The Declaration of the Rights of Man and of the Citizen, made in Europe thirteen years later, omits any mention of a creator-deity: “Men are born and remain free and equal in rights.” The equality here is more limited, being solely in terms of rights, “man’s natural and imprescriptible rights” in particular. These “are liberty, property, security, and resistance to oppression.” We can thus compare and contrast the two sets of rights, with important implications for public policy in both America and Europe.

The entire essay is at "Natural Rights in Europe and America."

Tuesday, September 2, 2014

The Scots Weigh Independence from Britain as the British Consider Leaving the E.U.

The debate over whether the Scottish region of Great Britain should secede from the UK extends beyond whatever provincial interests unite and divide the state’s regions; it "is also part of a larger question that extends well beyond Britain, to Texas and Colorado, for example, and elsewhere: What are the benefits and drawbacks of larger, politically diverse countries, compared with smaller, more homogeneous ones?"[1] Yet is Britain really a large, heterogeneous country, even as it is a state in the European Union? Texas is much larger, and yet it too is a state in a union of relatively homogeneous states.

Moreover, rather than treating large, diverse countries as being on one side of a spectrum and small, relatively homogeneous countries on the other, the two political types can be viewed as qualitatively different—that is, as lying on two different political levels. Generally speaking, an empire is a very large, diverse country or union that consists of relatively homogeneous, “kingdom-level” (i.e., smaller) countries or states. Indeed, modern federalism allows both empires and their respective states to have elements of “countryhood,” even as two distinct levels are involved: the federal level and that of the constituent states.

Because an empire consists of a large number of such subunits, neither the UK nor Canada can be said to be an empire, whereas the E.U. and U.S. can. So comparing Britain to Texas and Colorado in terms of secession should be done carefully; done wrongly, a category mistake may be involved. Such a mistake treats a state in one empire as being on the same level as an entire empire elsewhere, or treats an entire empire as if it were merely a state in one’s own.

According to The New York Times, UK spending in Scotland benefits from wealthy London taxpayers. The article then wades into the troubled waters of comparison. "There is an echo of this debate in the United States, even if the political sides tend to be switched. In Colorado, some conservative rural residents have raised the prospect of breaking off from the rest of [Colorado], which is both more liberal and more affluent. More broadly, many conservative states in the United States, like South Carolina and North Dakota, receive many more tax dollars from Washington than they send there."[2] It is the “more broadly” pivot that is problematic, for the move from intra-Colorado or intra-Britain dynamics to interstate dynamics in either the E.U. or the U.S. involves taking on additional (interstate-only) dynamics.

Specifically, it is one thing for a region of a state to split off, and quite another for a semi-sovereign member state of a union to secede. For one thing, international principles apply only to the union, and they complicate any intended comparison with a state. Furthermore, it is one thing for a region to justify leaving a state with a low or moderate amount of regional diversity relative to that of a union of many formerly independent states, and quite another for one of those states to justify leaving an inherently (i.e., territorially) diverse union.

Scotland breaking off from Britain, I contend, is commensurate with Eastern Colorado breaking off from Colorado, or with Southern Illinois, which is called Egypt, breaking off from the Chicago-dominated Illinois. Scotland is not like South Carolina seceding from the U.S. in 1861; rather, the UK is like South Carolina in deciding whether or not to secede from the European Union. Just as the Federalists and Anti-Federalists debated in the early U.S. whether the states or the federal level should have the balance of power, so too Euro-federalists and Euro-skeptics have debated the same question in the first fifty years of the EEC/EC/E.U. Attempting to liken the arguments on the Scottish question to those being made in Britain on whether the state should secede from the E.U. is likely to be an exercise in futility; the vast qualitative difference between the two political levels cannot but elude such an exercise.

[1] Katrin Bennhold, "How Scottish Independence Relates to Larger Tax Fights," The New York Times, August 21, 2014.
[2] Ibid.