Saturday, February 9, 2019

Behind Cameron's Referendum on Britain's Secession from the E.U.

Governors of other E.U. states reacted quickly to David Cameron's announcement that if his party were re-elected to lead the House of Commons, he would give his state's residents a chance to vote yes or no on seceding from the European Union. The result would be decisive, rather than readily replaced by a later referendum. Cameron said the referendum would also be contingent on his failing to renegotiate his state's place in the Union. This renegotiation in particular prompted some acute reactions from the governments of other "big states." Behind these reactions was a sense that the British government was being too selfish. This was not fair, I submit, because the dispute was grounded in the nature of the E.U. itself as a federal system.
David Cameron, PM of Britain
With the basic or underlying difference still intact, it should be no surprise that the renegotiation did not go well. German Foreign Minister Guido Westerwelle said at the time that Britain should not be allowed to "cherry pick" from among the E.U. competencies only those that the state likes. What then should we make of the opt-outs at the time, provisions from which states other than Britain benefitted? Surely one size does not fit all in such a diverse federal union (that goes for the U.S. as well). Westerwelle was saying that Cameron had abused a practice that was meant as an exception rather than the rule. Britain was exploiting this means of flexibility in the Union because people in that state tended to view the E.U. as a confederation or, worse, a trade "bloc," even though the E.U. and its states each had some governmental sovereignty.
The president of the European Parliament, Martin Schulz, said the approach of the British government would work to the detriment of the Union. Specifically, he warned of "piecemeal legislation, disintegration and potentially the breakup of the union" if Britain were allowed to be bound only to the E.U. competencies that the party in power in the House of Commons liked. A player joining a baseball team undermines the game by demanding that he only bat, because that is the only part he finds fun. In higher education, the education itself would be incomplete if students could limit their classes to only what interests them. Such a player or student would essentially have a different view of the sport or the education, respectively. A view so at odds with the fundamentals of a thing severely undercuts the thing itself. This is what had been going on as Britain navigated within the E.U.
Carl Bildt, the Swedish foreign minister, also touched on the detriment to the whole from what he erroneously took to be the selfishness of a part. He said that Cameron's notion of a flexible arrangement for his own state would lead to there being "no Europe at all. Just a mess." French foreign minister Laurent Fabius said that "Europe à la carte" would introduce dangerous risks for Britain itself. So if the British government was being selfish, it could have been to the state's own detriment, though of course I contend that selfishness does not go far enough as an explanation.
In short, the visceral reactions in other states to Cameron's announcement manifested a recognition of the selfishness of one part at the expense of the whole. Those reactions were rash and, even more importantly, lacking in recognition of the underlying fault-line erupting between Britain and the rest of the Union somewhere out in the Channel. Cameron and plenty of other Brits viewed the E.U. simply as a series of multilateral treaties in which sovereign states could pursue their respective interests. "What he wants, above all," according to Deutsche Welle, "is a single market." Therefore, he "wants to take powers back from Brussels" to return the E.U. to a network of sovereign states. It followed from this view that each state, being fundamentally sovereign, "should be able to negotiate its level of integration in the EU." Such would indeed be the case were the E.U. merely a bundle of multilateral international treaties, or a network to which Britain was a party, rather than a federal union of semi-sovereign states and a semi-sovereign federal level. Herein lies the real conflict of ideas within the E.U. Cameron's strategy was selfish only on the assumption that the E.U. is something more than a network to which Britain happens to belong.
Ultimately the problem was the uneasy co-existence of two contending conceptions of what the union was in its very essence. The real question was whether the E.U. could long exist with both conceptions being represented by different states. The negative reaction from officials of other states who held the "modern federal" conception (i.e., dual sovereignty) of the E.U. suggests that Cameron's conception of the E.U. was ultimately incompatible with the union's continued viability, given what the union actually was at the time.

Sources:
"EU Leaders Hit Out Over Cameron Referendum Pledge," Deutsche Welle, 23 January 2013.
"Cameron Wants Another EU," Deutsche Welle, 24 January 2013.

Essays on Two Federal Empires, available at Amazon.

Essays on the E.U. Political Economy, available at Amazon.

Greek Austerity: Pressure on the Environment

"While patrolling on a recent cold night, environmentalist Grigoris Gourdomichalis caught a young man illegally chopping down a tree on public land in the mountains above Athens. When confronted, the man broke down in tears, saying he was unemployed and needed the wood to warm the home he shares with his wife and four small children, because he could no longer afford heating oil. 'It was a tough choice, but I decided just to let him go' with the wood, said Mr. Gourdomichalis, head of the locally financed Environmental Association of Municipalities of Athens, which works to protect forests around Egaleo, a western suburb of the capital."[1] Tens of thousands of trees had disappeared from parks and forests in Greece during the first half of the winter of 2013 alone, as unemployed Greeks had to contend with the loss of the home heating-oil subsidy as part of the austerity program demanded by the state's creditors. As impoverished residents too broke to pay for electricity or fuel turned to fireplaces and wood stoves for heat, smog was just one of the manifestations, the potential loss of forests being another. On Christmas Day, for example, pollution over Maroussi was more than twice the E.U. standard. Furthermore, many schools, especially in the northern part of Greece, faced hard choices for lack of money to heat classrooms.
Greek forests were succumbing in the winter of 2012-2013 to Greeks' need to heat their homes as austerity hit. Source: Getty Images
Essentially, austerity was bringing many people back to pre-modern living, perhaps including a resurgence in vegetable gardens during the preceding summer. At least in respect to the wood, the problem was that the population was too big—and too concentrated in Athens—for the primitive ways to return, given the environment's capacity. 
To be sure, even in the Middle Ages, England had lost forests as the population (and royal plans) grew. In December 1952, the coal smoke from the fireplaces of many Londoners combined with fog to blanket the city in deadly smog. Thousands died, and Parliament eventually responded with the Clean Air Act of 1956, which restricted smoke-producing fuels in the city. No one probably thought to ask whether the city had gotten too big, and too dense. No policy was enacted that would result in a shift in population out of the region.
Generally speaking, human population levels made possible by modern technology and medical advances have become too large for a return to pre-modern ways of life. Given the extraordinarily large size of the modern city, including Athens, suddenly removing modern supports, including government subsidies, is especially problematic, for many people are then forced to fend for themselves to meet basic needs. The efficiency of modern technology, including in regard to utilities and food distribution, is often taken for granted, even by governments, so the impacts on the environment when masses of people "return to nature" can be surprising. Nature has become "used to" seven billion humans on the planet in large part because we have economized via technology, so the full brunt of the population's size is not felt. Particularly in industrial countries, societies are reliant on modern technology because without it the bulging population is unsustainable.
Put another way, we have distanced ourselves from nature, and our growth in numbers in the meantime has made it impossible for us to "get back to nature" in a jolt, especially en masse. It is in this sense that governmental austerity programs that cut back on sustenance are dangerous not only for society, but also for the ecosystems in which humans live. Accordingly, by mid-January 2013, the Greek government was considering proposals to restore heating-oil subsidies. It is incredible that the financial interests of institutional creditors, including other governments, were even allowed to put the subsidies at risk.
In ethical terms, the basic sustenance of a people takes priority over a creditor's "need" for interest. The sin of usury goes back to the origins of lending as an instance of charity rather than of money-making, whether from the plight of the poor or from profitable uses of the borrowed funds.[2] When a person in antiquity was in trouble financially, someone with a bit of cash would lend some with the expectation that only that sum would be returned. The demand for interest on top was viewed by the historical Church as adding insult to injury (i.e., the bastardization of charity into a money-making ruse). Then exceptions were made for commercial lending, wherein a creditor could legitimately demand a share of the profit made from the borrowed money in addition to the return of the principal. As commercial lending came increasingly to characterize lending, the demand for interest became the norm, even on consumption loans from which no profit would ensue to pay off the loan with interest. The notion that interest is conditional on a borrower having enough funds was lost, causing much pain to many in the name of fidelity of contract, as if contract or the creditor's financial interest were an absolute. Put another way, the benefit of the doubt has swung over from borrowers to lenders to such an extent that society may look the other way as people literally have to cut down trees to heat their homes because creditors have demanded and won austerity touching on sustenance programs.
Therefore, especially in Christian Europe, pressuring state governments in the E.U. to make debt payments at the expense of people's basic sustenance, even in the context of a financial crisis, can be considered ethically untenable. I am not suggesting that states should be profligate with borrowed funds. Rather, just as Adam Smith's Wealth of Nations is bracketed by his Theory of Moral Sentiments, so too an economy (and financial system) functions best within moral constraints.

1. Nektaria Stamouli and Stelios Bouras, "Greeks Raid Forests in Search of Wood to Heat Homes," The Wall Street Journal, January 11, 2013.
2. Skip Worden, God's Gold, available at Amazon. 

Friday, February 8, 2019

Second-Term Inaugural Addresses of American Presidents: Of Transformational or Static Leadership?

According to a piece in the National Journal, "George Washington might have had the right idea. Second inaugural addresses should be short and to the point. Of course, speaking only 135 words as Washington did in 1793 might be a little severe."[1] Consider how short, and yet how momentous, Lincoln's Gettysburg Address was. The challenge for second-term presidents, whether Barack Obama or the sixteen two-term presidents before him, is "how to make a second inaugural address sound fresh, meaningful and forward-looking." Almost all of Obama's predecessors failed at this. "Only Abraham Lincoln and Franklin D. Roosevelt made history with their addresses. One stirred a nation riven by civil war; the other inspired a country roiled by a deep depression. All but forgotten are the 14 other addresses, their words having been unable to survive the test of time. Even those presidents famed for their past oratory fell short."[2] This is a particularly interesting observation: surviving the test of time being the decisive criterion. Even a president whose silver tongue mesmerizes a people of his or her time may not deliver ideas that survive beyond being a cultural artifact of the president's own time. What of an address that is quite meaningful in its immediate time yet does not pass the test of time so as to be recognized as a classic?

The full essay is at "Inaugural Addresses: Of Leaders?"

1. George E. Condon, Jr., “The Second-Term Inaugural Jinx,” National Journal, January 20, 2013.
2. Ibid.

Increasing Income Inequality in the U.S.: Deregulation to Blame?

Most Americans have no idea how unequal wealth, as well as income, is in the United States. This is the thesis of Les Leopold, who wrote How to Make a Million Dollars an Hour. In an essay, he points out that economic inequality increased through the last quarter of the twentieth century. His explanation hinges on financial deregulation. I submit that reducing the answer to deregulation does not work, for it does not go far enough.
In 1928, the top one percent of Americans earned more than 23% of all income. By the 1970s the share had fallen to less than 9 percent. Leopold attributes this enabling of a middle class to the financial regulation erected as part of the New Deal in the context of the Great Depression. In 1970 the top 100 CEOs made $40 for every dollar earned by the average worker. By 2006, the CEOs were receiving $1,723 for every worker dollar. In between was a period of deregulation, beginning with Carter's deregulation of the airline industry in the late 1970s and Reagan's more widespread deregulation. Even Clinton got into the act, agreeing to shelve the Glass-Steagall Act, which since 1933 had kept commercial banking from the excesses of investment banking. The upshot of Leopold's argument is that financial regulation strengthens the middle class and reduces inequality by tempering the wealth and income of those "on the top." Deregulation has the reverse effect.
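To put Leopold's numbers in perspective, here is a back-of-the-envelope calculation. The 1970 and 2006 ratios are his figures as cited above; the implied compound growth rate is my own arithmetic, not a number from the essay.

```python
# A minimal sketch of the pay-ratio arithmetic. The 1970 and 2006 ratios are
# from Leopold's figures; the annualized rate below is a derived estimate.
ratio_1970 = 40      # top-100 CEO pay per $1 of average worker pay, 1970
ratio_2006 = 1_723   # same ratio, 2006
years = 2006 - 1970  # 36 years

growth_factor = ratio_2006 / ratio_1970            # roughly a 43-fold increase
annual_growth = growth_factor ** (1 / years) - 1   # implied compound rate

print(f"The ratio grew {growth_factor:.0f}-fold over {years} years,")
print(f"an implied compound growth of about {annual_growth:.1%} per year.")
```

Put in words: the pay ratio rose roughly 43-fold over 36 years, or about 11 percent per year compounded.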
The increasing role of the financial sector in the second half of the 1900s meant that finance itself could claim an increasing share of compensation.
Leopold misses the increasing proportion of the financial sector in GDP from the end of World War II to 2002. The ending of the Glass-Steagall Act in 1999 does not translate into more output on Wall Street relative to other sectors. Indeed, the trajectory of the increasing role of finance in the U.S. economy is independent even of the deregulatory period. Leopold's explanation can be turned aside, moreover, by merely recognizing that the "young Turks" on Wall Street have generally been able to run circles around their regulators. Even though financial deregulation can open the floodgates to excessive risk-taking, such as in selling and trading subprime-mortgage-based derivatives and the related insurance swaps, I suspect that the rising compensation on Wall Street has had more to do with the increasing role of the financial sector in the American economy.
The larger question, which Leopold misses in his essay, is whether the "output" of Wall Street is as "real" as that of the manufacturing and retail sectors, for example. Is there any added value in brokering financial transactions, which in turn are means to investments in such things as the plants and equipment used to "make real things"? Surely there is value in the function of intermediaries, but as that function takes on an increasing share of GDP, it is fair to ask whether the overall value of "production" suffers.
Given the steady increase of the financial sector as a percent of GDP, one would expect a more steady divergence of these two lines. Reagan's deregulation fits the divergence pictured, though one would expect a further increase in divergence after the repeal of the Glass-Steagall Act in 1999. Source: Les Leopold

As for the rising income and wealth of Wall Streeters, increasing risk, which is admittedly encouraged by deregulation, is likely only part of the story. If the financial products are premium goods, as distinct from the goods sold at Walmart, for instance, then as the instruments become increasingly complex one would expect the compensation to increase as well.
Leopold is on firmest ground in his observation that Americans are largely oblivious to the extent of economic inequality in the United States. Few Americans have a sense of how much more economic inequality there is in the U.S. than in the E.U., where the ratio of CEO to average-worker compensation is much lower. One question worth asking is what in American society, particularly in what the society values, allows or even perpetuates such inequality, both in absolute and relative terms. The relative terms suggest that part of the explanation lies in the particular salience of certain cultural values in American society. Possible candidates include property rights and the related notion of economic liberty, the value placed on wealth itself as a good thing, and the illusion of upward mobility that allows for sympathy for the rich from those "below."
In short, beyond actual regulations, particular values esteemed in American society and the increasing role of the financial sector in American GDP may provide us with a fuller explanation of why economic inequality increased so much during the last quarter of the twentieth century and showed no signs of stopping during the first decade of the next century. Americans by and large were wholly unaware of the role of their values in facilitating the growing inequality, and even of the sheer extent of the inequality itself. In a culture where political equality has been so mythologized, the acceptance of so much economic inequality is perplexing. At the very least, the co-existence of the two seems like a highly unstable mixture from the standpoint of the viability of the American republics "for which we stand." Yet absent a re-calibration of societal values, the mixture may be an enduring paradox of American society, even if the democratic element succumbs.

Source:
Les Leopold, “Inequality Is Much Worse Than You Think,” The Huffington Post, February 7, 2013.

Thursday, February 7, 2019

A U.S. Senator Aiding a Contributor While Averting a "Fiscal Cliff": Turning a Crisis into an Opportunity

The law passed by Congress on January 3, 2013 to avert the across-the-board tax increases and "sequester" (i.e., across-the-board budget cuts) was "stuffed with special provisions helping specific companies and industries." While many of the provisions would increase the U.S. Government's debt, at least one would decrease it. Is the latter any more ethical because it is in line with the more general interest in reducing the federal debt? Put another way, does the end justify the means? Do good consequences justify bad motives? These are extremely difficult questions. The best I can do here is suggest how they can be approached through the analysis of a particular case study.
In the legislation, a provision reduced the Medicare reimbursement rate for a radiosurgery device manufactured by the E.U. company Elekta AB. The cut was pushed by a competitor, Varian Medical Systems. Senate Majority Leader Harry Reid asked Sen. Max Baucus, chair of the Senate Finance Committee, to write the cut into the legislation. While both senators could point to the public interest in the debt-reducing result of the cut, their relationship with Varian made their motives suspect. Specifically, they may have acted on personal conflicts of interest that eclipsed a more expansive duty to the wider (i.e., not private, or personal) public interest.
While it is perhaps simplistic to relate campaign contributions to a senator's subsequent action, it is significant that Varian spent $570,000 in 2012 on lobbying. The company added the firm Capitol Counsel, which had contacts with Sen. Baucus. Varian already had connections to Reid through Cornerstone Government Affairs lobbyist Paul Denino, a former Reid deputy chief of staff. Additionally, the leading beneficiary of the contributions of Varian executives and the company's PAC over the previous four years was Sen. Reid, whose committees received $21,200. Varian's lobbyists added $42,700 more to Reid's campaign.[1] While Sen. Reid's subsequent urging of the reimbursement-rate cut could have been unrelated to these contributions and contacts, the senator's involvement compromised him ethically. Put another way, it was at the very least bad form, or unseemly. It implies that companies making political contributions and hiring lobbyists connected to public officials do so (or worse, should do so) to have special access to those particular officials in order to turn upcoming legislation to the companies' financial advantage. Even if the public also benefits, it can be asked whether the companies deserve their particular benefits. In the case of Varian, it may be asked whether the company deserved the cut in the reimbursement rate going to Elekta.
As could be expected, spokespersons at both companies sought to argue the merits of their respective cases in the court of public opinion. It is more useful to look at the regulators' rationale for increasing the reimbursement rate for Elekta's "Gamma Knife" in the first place. Originally, the knife and Varian's linac machines were lumped together by the Centers for Medicare and Medicaid Services (CMS) under the same CMS code. In 2001, the Centers separated the devices in terms of data collection so an analysis could be conducted on whether the devices should receive different reimbursement rates. The Huffington Post reported that the reimbursement rate for the Gamma Knife was increased because "it typically requires only one treatment, while the linacs often require multiple treatments." Also, "Gamma Knives machines are more expensive to obtain and maintain due to the storage of radioactive cobalt and regulation by both the Nuclear Regulatory Commission and the Department of Homeland Security. Linacs don't use nuclear material and are regulated by the Food and Drug Administration."[2] So, due to the cost and use differential, CMS increased the Gamma Knife reimbursement in 2006 to $7,000. From the standpoint of the regulators' criteria, the data-collection and analysis method and the resulting rationale were legitimate. In contrast, because neither the use nor the cost differential had changed by January 2013, the cut in the reimbursement rate cannot enjoy such legitimacy. Hence it is possible that exogenous factors, such as the political influence of Varian's lobbyists and campaign contributions, were behind the change. From the standpoint of the previous rate differential, the change cannot be justified. Neither Sen. Reid nor Sen. Baucus could justify their actions (and motives) by the substance of the case. However, they could still appeal to the salubrious budget-cutting effect as justifying their involvement.
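A simple per-course comparison may clarify the regulators' single-treatment rationale. Only the $7,000 Gamma Knife rate comes from the reporting cited above; the linac per-session rate and session count below are hypothetical placeholders, since the article gives neither figure.

```python
# Per-course cost comparison: single-session vs. multi-session radiosurgery.
# The $7,000 Gamma Knife rate is from the article; the linac figures are
# hypothetical, chosen only to illustrate the arithmetic.
gamma_rate, gamma_sessions = 7_000, 1   # one treatment typically suffices
linac_rate, linac_sessions = 2_500, 4   # hypothetical rate and session count

gamma_course = gamma_rate * gamma_sessions   # $7,000 per completed course
linac_course = linac_rate * linac_sessions   # $10,000 under these assumptions

print(f"Gamma Knife course: ${gamma_course:,}")
print(f"Linac course:       ${linac_course:,}")
```

Under these assumed numbers, the device with the higher per-session rate is nonetheless the cheaper one per completed course of treatment, which is the kind of comparison on which the 2006 rate increase rested.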
The question here is whether the favorable consequences of the cut on the government's subsequent deficits mitigate the shady scenario of a senator acting on behalf of a company that had contributed to his or her campaign. I would advise a member of Congress to avoid even the appearance of a conflict of interest. If the result in this particular case is in the public interest (i.e., reducing the deficit), does this positive consequence justify the senators' actions and even the questionable appearance? It's a no-brainer that the senators would immediately point to the public interest in the consequence, but does it effectively remove the taint of immoral political conduct (and perhaps motive)?
The link between the company-senator relationship, the senators' action from which the company stood to benefit financially in a material way, and the financial benefit to the company itself can be distinguished ethically from a good consequence to the public. A bystander would naturally view the consequence to the public as salubrious even while having a sentiment of disapprobation toward the company's benefit as well as the senators' action and relation to the company. In other words, the favorable impact on the public does not remove the stain on the company and the senators. To be sure, that stain would be greater were the public harmed rather than helped, but even with the positive general consequence the senators may have acted for the private benefit. Also, their action could have come from other senators, which would have obviated the ethical problem. In short, the public interest does not extricate either senator from the ethically problematic situation that they decided to occupy. Even if their motive had been solely the public interest, they failed to avoid the appearance of unethical motive and conduct.
"The end justifies the means" is a slippery slope in terms of what the human mind can rationalize as legitimate. Great harm has seemingly been justified by great ideals. Even in the face of the ideals, the harms provoke a sentiment of disapprobation in the observer (sociopaths excepted). This suggests that the ideals cannot completely justify unethical means. It may indeed be that unethical means are necessary in some particular cases, but this does not render such means ethically pure. Ethical principles do not know practical compromise. Rather, people do.


1. Paul Blumenthal, “Varian Medical Systems Used Fiscal Cliff Deal to Hurt Competitor,” The Huffington Post, February 8, 2013.
2. Ibid.

On the Impact of Political Rhetoric: From “Global Warming” to “Climate Change”

Words matter in politics. The side that can frame a question by definitively naming it in the public mind enjoys a subtle though often decisive advantage in the debate, and thus in any resulting public policy as well. For example, "pro-choice" privileges the pregnant woman, while "pro-life" defines the abortion debate around the fetus. Similarly, "global warming" implies a human impact, whereas "climate change" defines the issue around nature. Even though the shift from "global warming" to "climate change" is more in keeping with the evolving science and won't be bumped off by a cold winter, political players have been the driving force, for language is hardly immune to ideological pressure.
Regarding the weather shifting popular perception on the issue, research published in Public Opinion Quarterly in 2011 claimed that a bad winter can indeed discredit the "global warming" label.[1] The Washington Policy Center claimed that same year that the heavy snowfall during the latest winter had led to "climate change" replacing "global warming."[2] The cold refusing to relent in March of 2013 and hitting North America hard in January of 2019 seemed to undercut the scientific "global warming" hypothesis, even though climatology, an empirical science, always demands long-term data.
However, in looking back at the name-change, we must consider the influence of political actors, who are prone to manipulate the public's perception in part by using words to frame the debate. In 2002, for example, Frank Luntz wrote a confidential memo to the Republican Party suggesting that, because the Bush administration was vulnerable on the climate issue, the White House should abandon the phrase "global warming" in favor of "climate change."[3] As if by magic, although "global warming" appeared frequently in President Bush's speeches in 2001, "climate change" populated the president's speeches on the topic by 2002.[4] In other words, the president's political vulnerability on the issue was answered by changing the label to reframe the debate. Not missing a beat, critics charged that the motive was political: downplaying the possibility that carbon emissions were a contributing factor.[5] Both Bush and Cheney had ties to the oil and gas industry. In fact, Cheney's ties through Halliburton may have played a role in the administration's advocacy in favor of invading Iraq under the subterfuge that Iraq had been involved in the attack on the Pentagon and the World Trade Center in 2001.
The Obama administration likely went with "climate change" rather than "global warming" because the former was less controversial. The corporate Democrat tended to hold to the center politically; after all, Goldman Sachs employees had contributed about a million dollars to his first presidential campaign in 2008. In September 2010, the White House decided to replace the term "global warming" with "global climate disruption."[6] The administration subsequently dropped that term.
So much attention on the matter of a mere label indicates just how important what you call something is to its outcome. Labels are not always neutral. For instance, the term "African American" was making inroads at a time when "Black American" was hardly ever heard. "African" slips in ethnicity, whereas "Black," or negroid, refers to race. Shifting the axis on which the controversy hinged thus favored ethnicity over race. Meanwhile, the American public didn't notice the artful conflation of ethnicity (i.e., culture) and race. Obama used the ethnic term and applied it to himself even though his mother was Caucasian. He also claimed Illinois as his home state even though he moved to Chicago only after college. He could benefit politically from the support of Black Americans and Illinoisans.
Similarly, Obama could benefit politically from adopting "climate change." As the academic journal Public Opinion Quarterly reported in 2011, "Republicans are far more skeptical of 'global warming' than of 'climate change,'" whereas the vast majority of Democrats were indifferent to the label being used.[7] With "global warming" carrying "a stronger connotation of human causation, which has long been questioned by conservatives," Obama stood to gain some Republican support simply by changing how he referred to the issue.[8] That support was part of the president's ability to straddle the center in American politics.
Given the effort that has gone into labels, it is amazing that more time in Congress has not gone into debating them. I am also curious why the American people did not realize that they were being manipulated by the choice of label. If "climate change" allows for the contention that human-sourced carbon emissions into the atmosphere have not been a cause of the warming of the oceans and air, then it is possible that the very survival of the species could be in jeopardy because of the choice of a label made for short-term economic and political reasons.

1. Tom Jacobs, “Wording Change Softens Global Warming Skeptics,” Pacific Standard, March 2, 2011. 
2. Washington Policy Center, “Climate Change: Where the Rhetoric Defines the Science,” March 8, 2011.
3. Oliver Burkeman, “Memo Exposes Bush’s New Green Strategy,” The Guardian, March 3, 2003.
4. Ibid.
5. Washington Policy Center, “Climate Change: Where the Rhetoric Defines the Science,” March 8, 2011.
6. Erik Hayden, "Republicans Believe in 'Climate Change,' Not 'Global Warming,'" The Atlantic Wire, March 3, 2011.
7. Tom Jacobs, “Wording Change Softens Global Warming Skeptics,” Pacific Standard, March 2, 2011.
8. Ibid.

Tuesday, February 5, 2019

An Empire's Economic Scale Demands a Market System: The Case of China

A trend of increased-scale economies can be observed through history, as city-states gave way to the greater military power of centralized medieval kingdoms. Many of those expanded into Early Modern kingdoms as advances in military technology made it possible for kings to extend the territory under their control. Even empires have gotten bigger. Modern-day Germany was once considered an empire, as were Switzerland and the Netherlands. Today these polities are states in a modern form of empire, the E.U. Similarly, the emergent United Colonies of America was considered to be an empire within the British Empire, with the individual colonies being viewed on both sides of the Atlantic as Early Modern kingdom-level polities on par with the states of the E.U. in the twentieth century. Likewise in China, as kingdoms were added, an old form of empire took shape. Because these enlargements came about gradually over centuries, it has been difficult for the human mind to recalibrate how modern, large, empire-scale economies should be designed to take into account the distinct challenges of the scale. We can see such an adjustment in the case of China, where economic centralization came to be replaced by regulated markets, albeit with the government still sizably involved in the economy.
Communism, for lack of a better word, has somehow morphed into capitalism in China, as if a genetic mutation had taken hold through mitosis. This reflects an important trend that can be traced back to Deng Xiaoping (1904-1997), who "abandoned many orthodox communist doctrines and attempted to incorporate elements of the free-enterprise system into the Chinese economy" beginning in the late 1970s, according to the Encyclopedia Britannica. Decades later, upon becoming prime minister in 2013, Li Keqiang announced that the central government would reduce the state's role in the economy. The Chinese government issued a set of policy proposals to reduce "government intervention in the marketplace" and give "competition among private businesses a bigger role in investment decisions and setting prices."[1] According to the proposals, a tax on natural resources would be expanded, market forces would play a larger role in determining bank interest rates, and, according to the government, policies would be enacted to "promote the effective entry of private capital into finance, energy, railways, telecommunications and other spheres."[2] Foreign investors would be given more opportunities to invest in finance, including banking, logistics and healthcare. Foreign-exchange controls would also be loosened further.
The proposals were enough for Stephen Green, an economist with Standard Chartered, to remark, “This is radical stuff, really.”[3] Huang Yiping, chief economist at Barclays, pointed to lower growth projections and massive amounts of debt as giving the Chinese government a rather practical motive in continuing the trend of refurbishing communism. Many experts doubted, however, whether the Communist Party would “abandon the state capitalist model, break up huge, state-run oligopolies or privatize major sectors of the economy that the party considers strategic, like banking, energy and telecommunications.”[4] Additionally, corrupt government officials would doubtlessly resist losing what the New York Times called their “secret stakes in companies,” not to mention all the bribes.[5]
Even so, it is astounding that the prime minister, a communist, would say: "If we place excessive reliance on government steering and policy leverage to stimulate growth, that will be difficult to sustain and could even produce new problems and risks. The market is the creator of social wealth and the wellspring of self-sustaining economic development."[6] Marx and Lenin would hardly recognize the Chinese Communist Party. Because China has over a billion people, the old "command-and-control" economic model based on centralized directives on production quotas and prices had become increasingly difficult to coordinate. Bottlenecks in supply, causing shortages on the shelves, could eventually occur, with political instability increasingly likely. The sheer scale of China, an empire of former kingdoms, has rendered centralized control highly inefficient.


The Emperor Kangxi of the Qing Dynasty. He ruled for 61 years, greatly expanding the size of the empire. Source: Chinahighlights.com


Interestingly, even as Emperor Kangxi (1654-1722), the second emperor of the Qing Dynasty (1644-1911), expanded the empire by taking over central Asian Muslim kingdoms, he resisted the preceding Ming Dynasty's laissez-faire policy on internal trade and industry by turning some crucial industries into monopolies. John D. Rockefeller would probably have concurred, based on his own theory that the coordination in a monopoly in a vital industry such as oil could put an end to destructive competition. In any case, Kangxi apparently saw no contradiction between expanding the empire and centralizing some important sectors of the economy. Similarly, Mao saw no internal tension in collectivized consolidation on a large scale. As tempting as centralization has been for Chinese dictators seeking increased control and thus power, government regulation of competitive markets is eminently better in empire-scale economies, not only China's, but also those of the E.U., U.S., and Russia.


1. David Barboza, “China Plans to Reduce the State’s Role in the Economy,” The New York Times, May 24, 2013.
2. Ibid.
3. Ibid.
4. Ibid.
5. Ibid.
6. Ibid.