Tuesday, January 31, 2012

State Fiscal Governance in the E.U.: Toward Balanced Union

At the end of January 2012, 25 of the 27 E.U. states (all but Britain and the Czech Republic) indicated at a meeting of the European Council that they would accept in principle (i.e., subject to ratification) a new limit on the states’ “structural budget deficits” of 0.5% of gross economic output over a complete economic cycle, and that they would strengthen the E.U.’s enforcement mechanisms against states that breach either that deficit limit or the existing ceiling of 60% of gross output on accumulated state debt. The Wall Street Journal reported at the time that the “0.5% deficit limit would, if obeyed, mark a revolution in [the states’] fiscal policies, ending more than 30 years of steadily rising [state] debt.” This statement is immediately undercut, however, as the Journal goes on to describe the extent of the loopholes in the treaty. The reader is left wondering whether the E.U. will actually use the proposed enforcement mechanisms, such as having the European Court of Justice impose fines on states already hard up for cash, and whether fining a state government would make any difference.
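As a rough illustration of how the two headline thresholds described above would interact, here is a minimal sketch; the figures for the hypothetical state are invented for the example, and the check is not the treaty’s actual enforcement test.

```python
# Illustrative sketch only: invented figures, not the treaty's actual enforcement test.
def breaches_fiscal_limits(structural_deficit_pct_gdp, debt_pct_gdp,
                           deficit_limit=0.5, debt_limit=60.0):
    """List which of the two headline limits a hypothetical state would breach."""
    breaches = []
    if structural_deficit_pct_gdp > deficit_limit:
        breaches.append(f"structural deficit {structural_deficit_pct_gdp}% exceeds {deficit_limit}% of output")
    if debt_pct_gdp > debt_limit:
        breaches.append(f"accumulated debt {debt_pct_gdp}% exceeds {debt_limit}% of output")
    return breaches

# A hypothetical state with a 3.1% structural deficit and debt at 120% of output
print(breaches_fiscal_limits(3.1, 120.0))
```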

Sarkozy, Merkel and Monti at the European Council Meeting          
Philippe Wojazer/Reuters


The full essay is at "Essays on the E.U. Political Economy," available at Amazon.


1. Marcus Walker, “Budget Treaty: Neither Panacea Nor Poison,” The Wall Street Journal, January 31, 2012. 

Saturday, January 28, 2012

A Future of Regulators at Fault?

The typical case in the U.S. is that the industry being regulated resists being regulated, while the regulators insist on enforcing the regulations. To be sure, particularly strong firms in an industry may propose incremental regulations for strategic advantage—knowing that smaller or less profitable firms in the industry would have more trouble bearing the cost of compliance. The strategic use of regulation is an under-appreciated phenomenon in the de-regulation movement. Perhaps even more bizarre is the case of an industry complaining about lax enforcement of existing regulations and demanding even more. What industry might fit the bill? As a hint, look for a major scandal that did reputational harm to an industry.

In the industry, the firms’ calls for a crackdown must contend with a legacy of a light regulatory touch. People in the industry question whether regulators have been “too gentle” on the firms. If a firm “runs afoul of the rules, regulators largely rely on the firms to report their own wrongdoing.”[1] It sounds like Andy Taylor’s jail in Mayberry—the keys are hanging on the wall; help yourself, Otis (he was the town drunk). From 1996 to 2011, regulators penalized only ten firms, letting scores of others off the hook “because [the] regulators deemed their violations accidental.”[2] That sounds like the work of Andy’s deputy, Barney—finding the loot only to lose the criminal.

I am describing the futures industry, whose firms, “ordinarily loath to accept regulation,” decided in the wake of MF Global’s collapse and the loss of $1.2 billion of customer money to spearhead efforts for new oversight “as they try to heal the black eye.”[3] The existing regulations are nearly non-existent. For example, “firms need not inform customers of the whereabouts of their money.”[4] It is no wonder that prospective customers would be hesitant to enter that arena—hence the firms’ financial interest in additional regulations. The question is perhaps why the regulators had not already recommended any to Congress.

Without the usual vested interests thwarting prospective regulations bearing on themselves while Congress stands by and takes the contributions, any additional regulations “spearheaded” by the industry can be expected to be enacted. The question, I suppose, is whether the regulators will feel like enforcing them. With no headwind from the industry, lax enforcement would be an interesting nut to crack. Could it be that regulators exist who don’t believe regulations are very important? Such a mentality would be the opposite of public service—something like agnosticism in religion, only as held by clerics. Such a thing simply is not expected, hence it is worth investigating.

The most likely explanation is that the regulatory agency was “captured” by its industry, and ultimately it was in the interest of the industry itself to come up with something stronger. If so, might it be that once the new regulations are up and running, particular firms or the industry as a whole will want to capture the agency again? Even in following the industry’s “spearheading,” the agency is essentially “captured.” Is it the case, in the American system more generally, that agencies are captured by their regulatees whether enforcement is being increased or decreased? In other words, might the business and financial sectors be too powerful over government for their own good—like children telling their parents when to discipline them and when to let them get away with something? The sectors—and in particular their largest firms—may be too powerful for a viable republic: TBTG, meaning too big to govern. TBTF might not be the biggest elephant in the living room after all.


1. Ben Protess and Azam Ahmed, “Insiders Call For Oversight of Futures,” The New York Times, January 26, 2012.
2. Ibid.
3. Ibid.
4. Ibid. 

Wednesday, January 25, 2012

Reining in Corporate Pay: Europe as a Model of Fairness for America

Corporate compensation—executive pay in particular—represents a “clear market failure,” so said Vince Cable, the business secretary in the E.U. state of Britain.[1] While the failure had long been suspected, the sheer explicitness, or blatancy, of this verdict is itself noteworthy. Moreover, it stands as an opportunity for the E.U. to surpass the U.S. on economic fairness, which is a type of justice (see John Rawls). That is to say, Europe had an opportunity at the time of Cable’s statement to set the E.U. on a trajectory that would make the unfairness in the American system more transparent.

The business secretary, a Liberal Democrat in a coalition government with the Conservative Party, told the British House of Commons in January 2012 that business and investors “recognize that there is a disconnect between top pay and company performance and that something must be done.”[2] In New York at the time, the disconnect was generally taken as a fact of life, given the power of the managerial elite in corporate capitalism (as well as in American legislatures). As if attempting to pop this stygian balloon filled with the noxious air of denial, Vince Cable continued, “We cannot continue to see chief executives’ pay rising at 13 percent a year while the performance of companies on the stock exchange languishes well behind . . . (a)nd we can’t accept top pay rising at five times the rate of average workers’ pay as it did [in 2010].”[3] The unfairness, in other words, comes at the expense of not just the corporation’s owners, but also the (other) employees—executives being employees too.

It could be argued that the 13% annual increases in executive pay are a function of an increasing proportion of company stock options in the compensation. If the profits are languishing, the theory goes, the value of the options should be zero (i.e., they would go unexercised). However, what if the next group of executives, or even the economy overall, “performs” such that the options held by the previous executives then become valuable? Moreover, what do options cost non-management stockholders in terms of dilution? I suspect that options are “an easy way out” relative to cash compensation. The question is thus whether the practice can be reined in.
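To make the dilution point concrete, here is a minimal sketch with invented numbers (the share counts, strike price, and later price are assumptions for illustration, not figures from the article): if options granted while profits languish are exercised after a later recovery, the executives’ gain comes partly out of the existing holders’ ownership stake.

```python
# Illustrative arithmetic with invented numbers, not figures from the article.
shares_outstanding = 100_000_000   # shares held by existing owners
options_granted = 3_000_000        # at-the-money options granted to executives
strike_price = 20.00               # share price when the options were granted
later_price = 35.00                # share price after a later recovery

# Executives' paper gain if they exercise at the later price
option_gain = options_granted * (later_price - strike_price)

# Existing holders' collective share of the company before and after exercise
stake_before = 1.0
stake_after = shares_outstanding / (shares_outstanding + options_granted)

print(f"Executives' gain on exercise: ${option_gain:,.0f}")
print(f"Existing holders' collective stake: {stake_before:.1%} -> {stake_after:.1%}")
```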

Under Vince Cable’s proposals, shareholder votes on executive pay would be binding. Seventy-five percent, rather than a mere majority, would be needed for approval. I have never understood why the American states limit such votes to “non-binding,” as if the business judgment rule trumps property rights on compensation. Given that CEOs typically control their boards, whose job it is to oversee the executives on behalf of the stockholders, treating the property owners of the corporate wealth as if they were a focus group or a meaningless straw poll in Iowa seems misguided at best—and supportive of an institutional conflict of interest centered on the executives. To be sure, the suggestion made by Chuka Umunna, Cable’s counterpart in the Labour Party, to put employees on executive compensation committees also entails a conflict of interest—one centered on the employees, who have an interest in “payback” for decades of unfair compensation.[4] Conflicts of interest can work both ways—inflating and deflating deserved compensation.

My main point is the following: Were Europe to strengthen investors’ property rights—including disallowing proxies held by managements for such votes, on account of the conflict of interest—the fault running through American political economies and civil societies would become more transparent (i.e., more obvious). “We have been clear that executive pay must always be fair and transparent, and that high pay must be for outstanding, not mediocre, performance,” John Cridland of the Confederation of British Industry—a business lobby group—said. “Millions [of pounds] for mediocrity does a disservice to the reputations of hard-working businesses.”[5] Indeed, it does a disservice to the society as a whole—particularly in terms of what it stands for. Can you imagine the U.S. Chamber of Commerce coming up with such a statement? It is a pity, particularly in terms of systemic risk, that the vested interests in America show so little enlightened self-interest on such a “clear market failure.” For this reason, “Europe as a Model” is not such a bad thing, certain rhetoric (of the usual suspects) to the contrary.

In Vermont and Wisconsin at the time of Vince Cable’s speech in Britain, movements were underway to amend the respective constitutions (as well as that of the U.S.) to make it clear that corporations are not “legal persons.” I believe it would follow that money is not speech, though the amendments ought to make this explicit, given the tendency of justices to invent legal doctrines and the sway of money in legislative halls. Polls at the time showed 71% of people across the United States opposed to the Citizens United (unlimited corporate political spending) decision handed down by the U.S. Supreme Court two years before (almost to the day). Lest those movements get cocky, their leaders should be aware that the huge corporate “war chests” used to buy politicians, commentators, and air time may mean that even such a supermajority’s popular will is not sufficient. If so, it is unlikely that anything like Vince Cable’s proposals would see the light of day in America. Accordingly, I have pointed out here the value in merely having an alternative displayed in Europe, even if the benefit is limited to waking Americans up to the grip of corporate capitalism in American societies.


1. Julia Werdigier, “British Government Works to Rein in Corporate Pay,” The New York Times, January 23, 2012. 
2. Ibid.
3. Ibid.
4. Ibid.
5. Ibid.

Sunday, January 22, 2012

Credit Downgrades in the E.U.: Blaming the Messenger?

On January 13, 2012, “S&P stripped France and Austria of their prized triple-A credit ratings and reduced the ratings of seven other [states in the euro-zone], including Italy, Spain and Portugal. Germany, Finland, the Netherlands and Luxembourg were spared, along with Belgium, Estonia and Ireland.”[1] Italy was downgraded from single-A to triple-B-plus. "We think there are elements missing in their analysis … when it comes to the growth strategy … there is no space for maneuver for fiscal impetus but we believe that a growth strategy will have to rely mainly on structural reforms," Olivier Bailly, an E.U. Government spokesman, told reporters.[2] “Bailly also called the timing of the S&P decisions ‘very odd’ citing fiscal policies adopted to weather the crisis in the downgraded countries as well as the two successful debt auctions in Spain and Italy last week. ‘We think that there is a strange timing in this announcement considering the signals from the markets,’ Mr. Bailly said.”[3] The “very odd” and “strange timing” remarks hint at a tacit political motive behind S&P, which, the European officials pointed out, is an American company.

The full essay is at "Essays on the E.U. Political Economy," available at Amazon.


1. Christopher Emsden, Matina Stevis, and Bernd Radowitz, “E.U. Leaders Focus on ‘Progress’,” The Wall Street Journal, January 16, 2012.
2. Ibid.
3. Ibid.

Friday, January 13, 2012

Britain and Its Scottish Region: Should a State Split?

A region of one of the large E.U. states may split off to become a new state. For a U.S. state to split into two would require the approval of the Congress and presumably the U.S. President. I also assume the E.U.’s legislative and executive branches would have to sign off on the addition of a new state. I am not referring to Bavaria, or even northern Italy. The region to which I refer is known as Scotland, in the state of Britain. An independent Scotland would presumably have to apply to become a state of the E.U. 


The full essay is at Essays on Two Federal Empires.

Thursday, January 12, 2012

Assessing a “Funded Right” to Education as Constitutional in the U.S.

According to the Texas constitution, the government must provide funds for a “general diffusion of knowledge.” This is a worthy purpose in a representative democracy, as an educated electorate is generally presumed better able to self-govern by voting for candidates and even on policy-oriented referendums. Thomas Jefferson and John Adams had their differences to be sure, but they both believed that an educated and virtuous citizenry is vital to a republic. Accordingly, the “Texas constitution imposes an affirmative obligation to provide adequate financial resources for education, whatever the economic cycle,” according to Mark Trachtenberg, an attorney who represents more than seventy school districts that sued the government of Texas.[1] Altogether, four funding suits were pending in Texas as of January 2012. Five hundred districts, which together educate more than half of all public school students in Texas, were involved in those suits at the time. In 2010, the Texas legislature had cut more than $5 billion from school district budgets. In the wake of the cuts, the districts claimed that they lacked the resources to provide the level of education required by the constitution. One major question is whether the courts are the proper venue for this matter.

Critics of the lawsuits say it is the prerogative of legislatures to make the call on school funding. From a democratic standpoint, the representatives of the people should decide, rather than a few unelected judges. “There are more-appropriate venues for a vigorous and informed public debate about the state’s spending priorities,” according to Colorado’s head of state, John Hickenlooper.[2] Meanwhile, Washington’s Supreme Court ruled that the Washington legislature must come up with a plan for additional educational spending. The ruling can be interpreted as an indictment ultimately of Washington’s citizens, as they had elected the legislators who had (in the justices’ view) insufficiently funded general education. It is ironic that unelected justices would find that the people had fallen short in seeing to it that the republic of Washington remains viable with respect to “government of the people” and “government by the people.”

Constitutionally speaking, whether basic law (i.e., a constitution) should contain substantive funding requirements is an interesting question. If so, then the courts have every right to intervene, as part of their role is to interpret constitutions. The underlying question may be whether substantive rights, such as the right of free speech, should be expanded to what we might call “funded rights,” such as the right to a funded education. In a “funded right,” the funding itself is moved up from being a matter of policy to being a function of government. Other “funded rights” could be access to funded health care, food stamps and guaranteed housing. Such “funded rights” can be justified as basic in terms of human rights. Furthermore, the “funded right” to a job, which may be implied in the Employment Act of 1946, could be written into a constitution.

In short, “funded rights” can be oriented to a “floor” of sorts below which neither a republic nor a human being can survive. Taking such rights out of the policy arena by elevating them constitutionally to basic law would protect the vulnerable from the momentary selfishness of the “haves”—particularly the “one percent.” That is to say, a social contract that is not a mere reflection of the wealthy can perhaps exist and be protected even under moneyed political pressure. After all, the courts are supposed to protect the rights of the individual (and the minority) against the tyranny of the majority (Madison). Whereas it is clear that humans need medical care, food and shelter to survive, perhaps part of the dispute in Texas is whether a republic really needs an educated citizenry to be viable—or whether it is just a cherry on the sundae. Relative to medicine, food and shelter, a general education is certainly not necessary for survival, even if it is important. The question is perhaps whether constitutional protections extend to the latter—or even whether they include that which is necessary but not necessarily in the foreseeable interests of the “haves.”

1. Nathan Koppel, “Schools Sue States For More Money,” The Wall Street Journal, January 7-8, 2012. 
2. Ibid.

Tuesday, January 3, 2012

On the U.S. Presidency's Campaign “Season”

The overextension, or hypertrophy, of one part of a governmental system—whether a level, a branch or even a particular office—can be seen in the overemphasis alone of the process by which it is filled. Whether obsessed over or merely elongated, the selection process can come to take on a life of its own. Indeed, it can even eclipse governing. If, in referring to a particular office that has a four-year term, one expects a window of only a year or perhaps two for governing before the selection process revs up again, then there is reason to suspect that the office has too much power in the system of government. Of course, it could also be that the selection process is simply flawed, but why then would so many people either tacitly approve of it or even maintain that it is necessary?

In the context of the Iowa caucuses, which formally kick off the nomination process for the U.S. presidency—after at least six months of media-complicit campaigning and debates during which candidates rise and fall without a single vote being cast—the vested interests in Iowa defend the value, even the necessity, of what is essentially Me first! Listen to me, watch me! What is mentioned in the media as an aside, if at all, is the small fact that Iowa’s delegates to the parties’ nominating conventions are selected months later at a state convention. The selection of delegates is not a reflection of the popular vote. One would not suspect this from all the attention given to the “results” on caucus night. In actuality, the value of the results lies chiefly in providing the first snapshot of what some voters in one small, unrepresentative state feel about the various candidates. While preferable to having pundits misinterpret polls in order to grab headlines by elevating some candidates while effectively marginalizing or even pushing others out well before any votes are cast, the Iowa caucuses are a straw poll of sorts, whose value is largely perceptual. Considering that the Iowa straw poll, which occurs about six months before the caucuses, can be far off the mark as a predictor of their results—much less of who is actually nominated by the two major parties (e.g., 2011-2012)—we might want to reassess the value of the caucuses even as little polls.

I contend that the overemphasis of the “first in the nation” caucuses, far beyond their significance even to the parties’ eventual nominating conventions, stems from the overemphasis of the office of the U.S. presidency itself. Whether the power of that office grew too much through roughly the last three-quarters of the twentieth century or the media and the people have built up too much “personality hype” around the office, to obsess over its selection process from even midway through the four-year term is at the very least excessive. Most significantly, the sheer length of the campaign “season” can compromise or even thwart governing. In other words, we are treating a means as though it were an end in itself. We do not select a president so he (or she) can turn around and start campaigning for re-election. Perhaps we should not even allow presidential second terms, though we would still have the multi-year campaign “season.”

In short, I contend that the way in which the two major American political parties nominate their respective candidates for the office of the U.S. president is deeply flawed, if not broken. Moreover, the entire campaign “season” is entirely too long, and growing longer—at least as of 2012. Similarly, the commercial Christmas, or “Happy Holidays,” season was also being pushed further back into November and even October with our tacit approval (or at least without our confronting the offending retailers). Another national holiday, Thanksgiving, which falls on the fourth Thursday in November, has come to be eclipsed or slighted in a way that is similar to how governance comes to be relegated or even ignored as the cameras turn to the “upcoming” campaign. After months of daily coverage even before the Iowa caucuses, the media proclaim that the campaign season is about to begin. How, one might wonder, can a journalist make such a statement with a straight face? You have been covering the race every day for at least six months, maybe longer! About to begin? Haven’t you been paying attention to even what you have been doing? Nobody can be that stupid. One can reasonably wonder, therefore, whether it is in the interest of some power behind the scenes that the presidential campaign be turned into a business in its own right, or whether some other purpose is served that would be put at risk were the fiction uncovered and the system fixed.

Of course, as I allude to above, all of the attention paid to the presidential campaigning could be due to the increased power of the office itself. In his book on the presidency, Arthur Schlesinger refers to the inordinate power of the office as “the imperial presidency.” The presidency, which includes the role of commander in chief, gained power from World War II through the Vietnam War, in part owing to what can be termed the “cycle of influence” involving the increasingly powerful military-industrial complex. In general terms, the increasing amount of money being “donated” to successive presidential campaigns suggests that at least some wealthy donors view the office as sufficiently powerful to warrant the higher investment in influence.

One should not expect a candidate for president to campaign on reducing the powers of the presidency. At the very least, such a platform would suffer from a rather obvious conflict of interest. According to the International Herald Tribune’s 2012 New Year’s edition, even the “limited government” Republican candidates for president held “expansive views about the scope of the executive powers they would wield if elected—including the ability to authorize the targeted killing of U.S. citizens they deem threats and to launch military attacks without congressional permission.” Most of the candidates saw “the commander in chief as having the authority to lawfully take extraordinary actions if he decides doing so is necessary to protect national security.”[1] This view would turn out to be convenient should any of those candidates become president.

The military rise of the U.S. around the world since World War II—even fighting two wars at once in the first decade of the twenty-first century!—has made the American presidency more powerful in absolute terms as well as relative to the other offices, branches and even levels of government in the United States. This trend is particularly dangerous because it is difficult to hold a president accountable. Making reference to the preceding “decade of disputes over the scope and limits of presidential authority,” which itself could have been a reaction to that authority’s increase, the Tribune points out that “executive branch actions are often secret and courts rarely have jurisdiction to review them.”[2] We are lucky there have not been more “Tricky Dick” Nixons occupying the White House. So the conventional wisdom is that the candidates must be thoroughly vetted, even if by the pundits and the press instead of being winnowed down by the voters closer to the election (i.e., within the same year!). Fear can be a powerful motivator in “staying the course” with the status quo.

The sheer duration of the presidential election cycle, plus the seemingly consequential Iowa caucuses as its first snapshot of some of the electorate, is said to be valuable, even necessary. This claim is a galaxy away from my perspective that the selection process is fundamentally flawed as well as broken. How is it that a European campaign season—admittedly at the state level, though the states play a larger role at the E.U. level than their American counterparts do at the U.S. level—lasts only a few months? Do the Americans make better decisions on account of the longer duration and the related additional attention? Typically, any possible “constitutional moment” wherein the general public focuses on a major matter of governance or policy during the campaign is quickly eclipsed by the report that someone called someone else a bad name. As if by instinct, we feel we need to hear what the other guy said in reply, so we are hooked, like addicts, totally unaware of the opportunity costs both in terms of substantive public debate and in our own lives—lives that, I submit, should not be lived vicariously through a soap opera of bad actors willing to go on and on if it will get them the power they crave. So, finally, it can be asked whether they are worth all of the time we give them. Maybe if the office were not so powerful, or if structural changes were made to how a person is selected as president, we might get a break from the seemingly unending series of reports on the campaigns.

Most of us do not even know that the U.S. Constitution mandates the Electoral College as a way to check excess democracy—ironically, by making use of elected representatives. The state legislatures were to appoint the electors, who would then vote by state to elect the president. Even though the U.S. House was to be the repository of popular election in the U.S. Government and the Electoral College was meant to check excess democracy (e.g., the impact of momentary passions) for our own good, American citizens living long before the twenty-first century made the electors in each state subject to whichever candidate wins the popular vote in that state. In other words, the system is not only out of control; it was never designed to function as a multi-year electoral “season.” Even so, it goes on as if nothing were wrong—even as if the status quo were, in effect, being worshipped. At the very least, to be comfortable with a broken status quo indicates that something is wrong societally—with us. To admit that it is broken but be resigned to it is almost to deserve it. It is as though the United States is so big, as an empire of fifty republics comparable to European states, that only the sheer mass of its inertia can fuel its momentum. No other force, even from within, dares challenge the powerful vested interests who insist not only that the system is working, but that it must! We in turn do not really question it. We are so accustomed to going along with the status quo, as if its mere operation meant it is viable, that we hardly even notice.

On a clear, sunny day in early 2012, a century after the Titanic sank, few if any Americans were viewing the presidential contest then underway as fundamentally broken; the attention was on the personalities running. It is as if American society itself were unconsciously flirting with an iceberg ahead, sipping tea oblivious to the danger—or, stranger still, looking straight at the mass of ice while pausing to rearrange the deck chairs as if to get the best view. No need to report the sighting or figure out a change of course. As the Iowa caucuses demonstrate, it is important to be first—to be sitting in front. This is ironic in lands so filled with churches (even in the campaigns). Being able to secure one’s place means the system is working—serving the good of the whole, the public good.

See related essay: “Picking a President by Polls”


1. Charlie Savage, “Limited Government But Far-Ranging Presidency,” International Herald Tribune, December 31, 2011-January 1, 2012.  
2. Ibid.

Sunday, January 1, 2012

On the Plight of the Euro

Relative to the U.S. dollar, the euro of the E.U. was not in as dire shape in 2011 as was typically presumed. As the euro marked its ten-year anniversary on January 1, 2012 at $1.29, a ten-year perspective could assuage the irrational exaggeration of fear over the currency’s impending demise. Besides the human propensity to develop tunnel vision—looking only straight ahead—we tend to over-dramatize some things (while ignoring others).


The full essay is at "Essays on the E.U. Political Economy," available at Amazon.