Monday, June 30, 2014

An Ethical Meltdown in Japan: On the Toxicity of Tepco's Nuclear Power

Without doubt, Japan’s largest power provider, Tokyo Electric Power Co. (Tepco), faced the biggest challenge of its 50-year history in "recovering from the damage done to its nuclear facilities and power systems by a devastating earthquake and tsunami" in 2011.[1] The New York Times reported that "foreign nuclear experts, the Japanese press and an increasingly angry and rattled Japanese public are frustrated by government and power company officials’ failure to communicate clearly and promptly about the nuclear crisis. Pointing to conflicting reports, ambiguous language and a constant refusal to confirm the most basic facts, they suspect officials of withholding or fudging crucial information about the risks posed by the ravaged Daiichi plant."[2]

When a spokesperson for Tepco said early in the morning on March 16th "that a fire had broken out at the Daiichi plant’s No. 4 reactor, a reporter naturally asked how the fire had begun, given that just the day before the company had reported putting out a fire at that same reactor. The executive’s answer: ‘We’ll check. . . . We don’t have information here,’ he explained. After about two hours, the Tepco representative had the information: Turned out the smoke was coming not from reactor No. 4, but from reactor No. 3. If Tepco’s information had been delayed and vague, the reporters’ response was quick and direct. ‘You guys have been saying something different each time!’ one shouted. ‘Don’t tell us things from your impression or thoughts, just tell us what’s going on. Your unclear answers are really confusing!’"[3]

Tepco executives leave one of the many press conferences held during the disaster in 2011 (Image Source: The Wall Street Journal)

Furthermore, "the fire confusion followed Tepco’s failure to confirm that the water level in at least one of its fuel-rod storage pools had plummeted, which the media had started reporting citing government sources. Only after several hours, by which point it had started pumping in new water, did the company finally confirm that the level was low. . . . (W)hen the company changed its explanation of conditions at the reactor, one frustrated reporter said, ‘You guys think we’re ignorant [about nuclear operations] so you can make your explanation very vague, but we are not!’ The government may not be any more satisfied than the press is with Tepco’s disclosure practices. Local media reports say the prime minister scolded the company’s executives for not calling him after an explosion at the plant. He had to learn about it from the TV.”[4] On March 20th, The New York Times reported that questions had arisen on whether Tepco executives had "waited too long before pumping seawater into the plant, a measure that would ruin a valuable investment."[5]


Tepco evinces an ethical meltdown, which is to say, a toxic lack of credibility caused by a series of unethical actions long enough to be viewed as a pattern indicative of a sordid psychology. Secondarily, the company illustrates the dangers to Japan in the incestuous nature of Japanese business-government relations, otherwise known as amakudari, wherein regulators retire to better-paid jobs in the very industries they once policed. This system operates for private advantage at the expense of the Japanese people, whose fortitude and self-restraint in the wake of the earthquake and tsunami provide the world with an enduring model. Any residual resentment among descendants of the Allies in World War II against the Japanese people must surely have melted away in the early spring of 2011, along with the last remaining dirty snow of that arduous albeit non-nuclear winter. In other words, the Japanese have the respect and admiration of the world, even if we are critical of the Japanese officials in business and government who have repeatedly forsaken the public good for their own private advantage. According to Susumu Hirakawa, a professor of psychology at Taisho University, the Japanese people were just as skeptical: “The mistrust of the government and Tepco was already there before the crisis, and people are even angrier now because of the inaccurate information they’re getting.”[6] In other words, an ethical meltdown had occurred, its toxic radiation infecting the polite, patient people just when the situation at the Daiichi plant was most dire.

According to The New York Times at the time, “the confusion is emblematic of days of often contradictory reports about what is happening at the plant.” Tepco “cannot know for sure what is happening in many cases because it is too dangerous for workers to get close to some reactors.” With 750 workers evacuated, a skeleton crew of a mere 50 was struggling “to keep hundreds of gallons of seawater a minute flowing through temporary fire pumps into the three stricken reactors, Nos. 1, 2 and 3, where overheated fuel rods continued to boil away the water at a brisk pace.” As the small crew of technicians braved radiation and fire, they “became the only people remaining at the Fukushima Daiichi Nuclear Power Station on [March 14th] — and perhaps Japan’s last chance of preventing a broader nuclear catastrophe.” They could hardly be blamed for not being at the world’s beck and call for information; they were literally putting their lives at risk “to prevent full meltdowns that could throw thousands of tons of radioactive dust high into the air and imperil millions of their compatriots.”[7] That is, they were tasked with averting a catastrophe and thus saving Japan (and perhaps even the American republics downwind). At the same time, if their bosses at a safe distance were intentionally manipulating the data and delaying the use of seawater to save money and minimize blame, the verdict would be different in spite of the workload and stress at the time, especially given Tepco’s track record of self-aggrandizing behavior (e.g., lying).

Contributing to the frustration were undoubtedly memories of Tepco’s checkered past with regard to being truthful with the public about safety precautions and even about its own culpability. For example, back in the summer of 2003, The New York Times had reported that Tepco “was forced to close all 17 of its nuclear plants temporarily after admitting that it had faked safety reports for more than a decade.”[8] The year before, The Japan Times had noted that MITI had “found evidence of falsified records from the late 1980s to early 1990s regarding cracks at Tepco's Kashiwazaki-Kariwa nuclear plant in Niigata Prefecture, and the No. 1 and No. 2 Fukushima nuclear plants in Fukushima Prefecture.” The Economy, Trade and Industry minister Takeo Hiranuma reacted to the news by telling reporters that “Tepco should take seriously the fact that it betrayed the people's confidence in nuclear power. . . . It is absolutely abominable that this incident caused the people's confidence to be largely lost in nuclear energy, which is a pillar of the nation's energy policy."[9] More than being a pillar, nuclear energy is inherently so dangerous that the nuclear industry ought to be the last to tolerate fabrication—particularly on safety! In any industry, nothing undercuts credibility more than a series of lies, for the lies point to the involvement of sordid personalities that are tenaciously and notoriously intractable. For such a personality to be invested with power in the nuclear power industry is something the human race can hardly afford. To the extent that the Japanese government has not pressured Tepco's board to replace upper management, and has even enabled Tepco by keeping accidents from the public, the government officials (and parties) should be held accountable. In short, the rest of the world was justified in holding the Japanese government, and ultimately the people, responsible for allowing Tepco to continue in its furtive ways.

"Everything is a secret," said Kei Sugaoka, a former Tepco nuclear power plant engineer in Japan who has since moved to California; “There's not enough transparency in the industry."[10] In 1989 Sugaoka had "received an order that horrified him: edit out footage showing cracks in plant steam pipes in video being submitted to regulators. Sugaoka alerted his superiors in the Tokyo Electric Power Co., but nothing happened -- for years. He decided to go public in 2000. Three Tepco executives lost their jobs."[11] Even after this belated (and all too rare) societally induced accountability, company executives refused to allow the International Atomic Energy Agency (IAEA) to conduct inspections after a magnitude 6.8 earthquake hit the nuclear plant in Niigata in July of 2007. Such a defensive stance could be expected from persons who lie to cut corners. It took the prefecture to insist that the inspection be done despite Tepco’s objection.

Actually, the “government was initially reluctant to let the IAEA inspect the plant but changed its stance after receiving petitions from local officials eager for a third-party assessment to ease public concern over the safety of Japan's nuclear plants.”[12] Moreover, the nuclear power industry in Japan had been "in a comfy relationship with government regulators often willing to overlook safety lapses."[13] This is why the firing of the three top executives had undoubtedly been societally rather than governmentally induced. Had Tepco not gotten away with lying about its safety reports for years, the local officials urging the IAEA inspection (probably themselves pressured by worried citizens) might not have been so adamant that the prefecture intervene, even if officials at that level were too cozy with Tepco.

Therefore, in the wake of the tsunami in 2011 the Japanese media and people had more than sufficient reason to suspect that the dearth and confusion of information from the plant nearing meltdown reflected more than honest uncertainty or unobtainability. According to The New York Times, government officials were "almost completely reliant" on Tepco for information on the Daiichi plant.[14] If the government officials need not have been so reliant, they may have been guilty of mistaken, and perhaps even negligent, complicity, or at least naiveté, given Tepco's track record of distorting and falsifying information submitted to the government. Tepco’s reputational capital had suffered such a meltdown by 2011 that even the mere possibility of subterfuge naturally claimed the high ground in the public’s eye; the record of lies had deprived the company of the benefit of the doubt, even as its employees were heroically risking their very lives to save Japan. Such is the severity of the toxicity of an ethical meltdown: even averting a natural catastrophe and saving millions of people is not enough to undo it. Once credibility has been lost, it is extremely difficult to build back up. Even if expedient strategic choices seem convenient in the short term, they can be very costly in the long term.

Lest business practitioners around the world looking back at Tepco’s trajectory feel secure in complacency, knowing that their respective companies could not suffer a similar ethical meltdown because they have instituted codes of ethical conduct and ethical procedures, it should be pointed out that Tepco had instituted a rather sophisticated system in 2002. One might remember, moreover, the delegates’ discussion in the U.S. Constitutional Convention regarding the feebleness of mere parchment in holding power back when it is not checked against itself in a separation of powers, interest pitted against interest. The mere existence of a corporate code of ethics and an “ethics line” in a company with a squalid corporate culture is no check on unethical conduct. In fact, the PR use of such an apparatus can actually enable sordid, narcissistic managers to be even more unethical because the window-dressing can absorb the slack. For a time, the public's perception of a company's commitment to "corporate citizenship" can act as a default having its own momentum in blocking recognition of the onslaught of unethical conduct. Unfortunately, unsavory executives know all too well how to take advantage of this sociological phenomenon of groupthink. In the case of Tepco, lies over decades had depleted any such PR from the company's organizational ethical infrastructure. Accordingly, it made no difference to the frustrated people in Japan (and around the world) who instinctively doubted the executives’ willingness to deliver information rather than self-serving propaganda even in the face of a catastrophic nuclear meltdown. What kind of a person is that self-absorbed in such a context? Can a corporate code of ethics stand up to such a psychology?

Even if not intended as mere window-dressing, corporate ethical statements, procedures, and organizational design are enervated or even impotent relative to a corporate culture formed by people all too comfortable taking the road easiest travelled when the travelling gets bumpy. According to Tepco’s own website, “In September 2002, TEPCO implemented countermeasures to guard against a reoccurrence of incidents with regard to inspection and maintenance operations at our nuclear power stations. At the same time, the Company announced four commitments in the interest of creating a ‘Corporate system and climate of individual responsibility and initiative.’ The actualization of the four commitments has been adopted as our social mission, and the entire Company is deeply involved in the effort.” This includes the following imperatives: “Disclose information on the management and operation of our nuclear power stations, so the public is able to confirm that our plants are being operated safely” and “Creating systems to ensure the observance of ethics.”[15]

From Tepco’s website, announcing the company's ethical system in 2002.

Tellingly, Tepco’s corporate ethical system, although organizational in design and formal extent, was geared to individual responsibility—meaning that individual employees should take responsibility for their actions; nothing is said about corporate responsibility—executives and company spokespersons taking responsibility for corporate mistakes. Moreover, as the company’s record attests, simply having a formal ethical code, a “social mission,” and even a formal intent to disclose inconvenient information does not necessarily have any actual bearing in flesh-and-blood terms where motives at the moment are aligned with power. That is to say, the tendency to hide bad information from the public out of fear is real because it is felt, whereas something written down on a plaque or in an organizational chart is mere parchment. The challenge is to deal with the way top executives, individually and as a group, handle fear and discomfort when the company itself screws up or performs badly, financially or otherwise, because they typically have the power to act in moments of crisis as they will. In the end, it may come down to the type of people who are hired (ultimately by the board of directors). It is unlikely that a company with a bad habit of ethical lapses can change without a wholesale change in management, at least at the top and middle levels, and in the people who have done the hiring for those levels.

The salient point is that the systemic risk of an ethical meltdown should not be presumed catastrophic only in the case of nuclear energy. The additional examples of BP executives lying about safety, and of Lehman managers using Repo 105 to understate the bank’s debt and cost-based real-estate valuations to overstate the value of the bank's real-estate-based assets even after that market had tanked, strongly suggest that mankind entered a new era in the twenty-first century. Specifically, the wherewithal, or puissance, of big business to cause large-scale or systemic devastation from ethical meltdowns had arrived. Ultimately, beyond even the question of whether regulatory agencies have been captured by industries too big to fail, the human race is perhaps ready to confront the possibility that we have allowed private capital to reach such immense concentrations that its organizations can take on such inherently large and systemically dangerous tasks as holding highly radioactive fuel rods on the shore, drilling deep-water wells far beyond human reach, and inventing sophisticated toxic derivatives of unknown depth, the collapse of which could give rise to the end of the global financial system “by Monday.” Has the human mind yet adjusted to, let alone comprehended, what catastrophic damage its elongated artificial arms can produce even without being fueled by the hydraulic fluid of ambition and greed? The sheer scale of mankind's modern ventures warrants much greater trepidation and humility than is the case, especially given the lessons that humanity is capable of learning from looking systemically at what occurred in September of 2008, April of 2010, and March of 2011.
Lest we have faith in our written parchments to prevent ethical meltdowns in such cases, we have only to look at the presumptuousness inherent in human nature to motivate us as a species to redouble our efforts to protect ourselves from ourselves by restraining our appetite for more, bigger, and larger. Higher, farther, stronger! Without end? Really? If so, what a pity for us . . . our little humanity. Sometimes, less is more.

1. Atsuko Fukase, “Tepco Versus the Media,” The Wall Street Journal, March 16, 2011.
2. Hiroko Tabuchi, Ken Belson, and Norimitsu Onishi, “Dearth of Candor from Japan’s Leadership,” The New York Times, March 16, 2011.
3. Fukase, “Tepco Versus the Media.”
4. Ibid.
5. Keith Bradsher and Matthew Wald, “Executives May Have Lost Valuable Time at Damaged Nuclear Plant,” The New York Times, March 19, 2011.
7. Hiroko Tabuchi and Keith Bradsher, “Japan Says 2nd Reactor May Have Ruptured with Radioactive Release,” The New York Times, March 15, 2011.
8. James Brooke, “Four Workers Killed in Nuclear Plant Accident in Japan,” The New York Times, August 10, 2004.
9. “Koizumi, Hiranuma Blast Tepco over Alleged Nuclear-hazard Coverup,” The Japan Times, August 31, 2002.
10. Associated Press, “Scandal-Ridden Energy Company Behind Japan’s Nuke Crisis,” CBS News, March 17, 2011.
11. Ibid.
12. Kyodo News, “IAEA Begins Followup Examination of Quake-hit Niigata Nuclear Plant,” The Japan Times, January 29, 2008.
13. Ibid.
14. Tabuchi et al., “Dearth of Candor from Japan’s Leadership.”

Tuesday, June 17, 2014

Legislating Politeness: The Chinese Government Takes on a Dysfunctional Culture

In June 2014, the “Capital Civilization Office” of the Chinese government began a half-year campaign to “encourage Beijing’s 20 million residents to behave better.”[1] Targets include “people who are noisy, smoke in public, curse at sports events, fail to line up for buses, run red lights, drink while they drive, and drive aggressively.” It seems to me this list could equally apply to Miami and, at least in terms of driving, to the entire northeast coastline of the United States. Perhaps urban modernity is to blame, or maybe it is simply the old truism about the rise and fall of great empires, and thus of cities as well. Chinese history is no stranger to this cycle, in the form of a succession of dynasties. Perhaps we would be wise to view the modern city in such terms too.

The Chinese civilization office informed Beijing residents that they are to “dress properly, show grace in speech and manner and say ‘hello,’ ‘thank you’ and ‘sorry’ more often.” Lest this list conjure up images of Big Brother in George Orwell’s 1984, the gargantuan task can instead be regarded as futile at best. Although Chinese officials were set to utilize “guidance, education and relevant laws and regulations,” including fines, some of the unseemly conduct was at the time not against the law. The underlying problem may be whether arresting cultural decadence is even possible once it has come to characterize the culture in a particular geographical area.

Visiting Beijing a number of years ago, I was immediately struck by the lack of lines, or queues, in public places. It was as if the phenomenon of waiting one’s turn had been lost in a society whose culture stretches back thousands of years. Once the Confucian ethic has been lost, how does a society regain it? A government cannot very easily legislate “building civilization,” though Jan Longbin, deputy director of the Capital Civilization Office, cited “immense results” from the Olympic manners campaign six years earlier. “Building civilization is not something that can be done in a single day,” he admitted. This may be a tremendous understatement, owing to the self-reinforcing mechanism that is a part of human culture.

Decadence in a society, which is to say, among people who live in the same geographical area, may have a downward, intensifying dynamic that makes any reversal especially arduous. Once a particular assumption, such as that it is acceptable to ignore the people in line and go directly to the front, reaches a critical mass in terms of the proportion of people having adopted it, they proceed with added confidence in doing so. This second assumption operates as a kind of protective bubble by giving individuals the mistaken sense that the primary assumption cannot be wrong. Efforts to force those people to wait in line must contend with this “gravity,” such that when the enforcement ends the original rudeness is likely to re-emerge virtually unscathed.

I have witnessed this descending cycle of cultural decadence through visits to my hometown in Illinois. In spite of the city having been hit hard by the loss of machine-tool plants in the early 1980s, and the subsequent increases in crime and unemployment, most newcomers and visitors such as myself cite the predominant attitude of a high proportion of the local inhabitants as particularly problematic. The sordid demeanor, based on a lack of education—the city being among the least educated in the U.S.—can be characterized as ignorance that cannot be wrong, backed up by whatever authority it thinks it has.

Ironically, the decadence manifests particularly in the low-level office workers and in the service sectors. Bus drivers, for example, are said to be particularly boorish and even aggressive, and office workers tend to be rigidly obsessed with their tiny policies to the utter detriment of common sense, not to mention plain decency and workability. Neither unemployment nor crime can be blamed for the pretension; rather, the arrogance of ignorance, operating as a sort of self-entitlement as if on stilts during a flood rather than rightfully submerged, seems to be the particularly intractable pathology plaguing the rust-belt city.

Given the rigid defense mechanisms protecting the dysfunctional mentality that is shared by many there, any civic effort to render the city more livable would almost certainly be fraught with difficulty, if not founder at the outset. As it is, the healthy people—those who recognize the banality so widespread—tend to leave town, if they can, and this self-selection out intensifies the proportion of sickness in the town, which of course pushes any fix even further away. The decadence, or garden-variety crassness, is as though predestined to pursue its own path until the black hole can be distended, or bloated, no further.

In short, once a squalid mentality, or set of assumptions regarding interpersonal and organizational behavior, sets in in a particular geographical area, the decadence is likely to continue downward until it hits rock-bottom rather than be staved off by government intervention. The tools available to government are no match for the self-reinforcing defense mechanisms of a dysfunctional culture.

For example, even though my banal hometown consistently ranks near the bottom of livable cities in the U.S., local residents in the thick of the pathology (i.e., in utter denial) try to claim that every city has the same mentality, which in terms of the rankings is a mathematical impossibility. The denial feeds the arrogance, and the pathogens go on in their ways with a misplaced confidence, rather like bloated fish walking on stilts during a flood. When rudeness, or even meanness, as newcomers tend to characterize it, is backed up by the sheer presumptuousness of ignorance that cannot be wrong, a hard shell is formed that may succumb only once the rot at the core has reached it, rather than from external efforts at reform.

However, it does seem theoretically possible that an influx of enough healthy souls could put the indigenous weakness on the defensive, essentially knocking it off its perch in workplaces and public spaces. In a city of 20 million, such as Beijing, such an intervention would be far too huge to be realistic; and so, as with any great civilization, modern cities too are subject to the rise and fall of being human, all too human after all.

[1] For this quote and all others in this essay: Calum MacLeod, “Be More Polite, Beijing Residents Told,” USA Today, June 11, 2014. 

Thursday, June 12, 2014

U.S. House Majority Leader Cantor’s Election Loss: On the Corporate Connection

It is not every day that the majority leader in the U.S. House of Representatives loses—and badly at that—to a primary challenger in an intra-party contest. In the wake of Eric Cantor’s defeat in June 2014, journalists wasted no time in reducing the defeat to one issue: immigration. Such a reductionist ex post facto divination of voter intent—as if an electorate were one monolithic mind writ large—is fraught with difficulties. Beyond the sheer artifice, such an interpretation offers an easy cover for less convenient, subterranean political shifts underway and expressed in the vote.

Leaping to the immigration rationale, whereby arch-conservative, or “Tea Party,” Republican voters punished Cantor for having been willing to work with Democrats on a compromise involving amnesty in some form, Ali Noorani of an immigration lobbying group opined that the primary’s result would make it tougher to get a bill through the U.S. House in the current session.[1] Frank Sharry of America’s Voice “said Cantor’s loss blows up a last-minute attempt by Republicans to organize support for an immigration bill.”[2] No doubt both Noorani and Sharry were relying on media reports, aired as early as election night, that immigration reform had brought the majority leader down.

To be sure, more elaborate analyses, such as one by The Washington Post, broadened the explanation to include Cantor’s votes in general and his lack of attention to his district.[3] Preoccupied with gaining power over his colleagues by raising money, Cantor missed the warning signs, and this alone could have annoyed Republican voters back home. Additionally, as put by Jamie Radtke, co-founder of the Virginia Tea Party, Cantor had “made an enemy of his friends.”[4] Put another way, pundits wanting to extrapolate the primary results onto the national stage may be overlooking the idiosyncratic elements that do not generalize or go forward.

For all the points needlessly cut off by the reductionism to immigration, or even to Cantor’s willingness to compromise with Democrats, the most important received scarcely any media coverage. Specifically, interviews with some voters suggest that Cantor lost in part because he had done the bidding of big business at the expense of the small businesses in Richmond. Stung by strings-free bailouts of the large Wall Street banks that had dangerously over-leveraged themselves on risky mortgage-backed securities, and by a dearth of prosecutions for fraud, Republican voters could not have missed the apparent collusion between corporate and campaign coffers; Cantor had raised nearly $5.5 million, giving him a 27-to-1 financial advantage over his challenger.[5] As astonishing as Dave Brat’s win was, given this financial disparity, the real lesson from the primary may be that elected officials willing to cut the financial strings to corporate America may actually do better at home, because their positions and votes would be more closely tailored to their constituents.

Put another way, voters may be hungry for political courage that shakes the conventional “wisdom.” Teddy Roosevelt must have discovered as much when he went after mighty trusts like Standard Oil in the early twentieth century, and Andrew Jackson nearly a century earlier when he vetoed the recharter of the Second Bank of the United States for fear of encroaching federal power. This is a message that the powers that be in 2014 doubtless would not want getting out, and the media dutifully complied with the subterfuge that immigration reform was in all probability dead for the time being.

[1] Alan Gomez, “Immigration Bill Looks Doubtful,” USA Today, June 12, 2014.
[2] Ibid.
[3] David Fahrenthold, Rosalind Helderman, and Jenna Portnoy, “What Went Wrong for Eric Cantor?” The Washington Post, June 12, 2014.
[4] Ibid.
[5] Susan Davis and Catalina Camia, “Contests Loom for Top GOP Posts,” USA Today, June 12, 2014.

Friday, June 6, 2014

GDP and Poverty: Is Economic Growth the Answer?

From 1959 to 1973, the American economy grew 82 percent per person. It is easy to assume this is why the poverty rate decreased from 22 percent to 11 percent.[1] From roughly 1985 to 1990, and again from 1995 to 2000, heady growth rates are also positively correlated with declining poverty rates. But correlation is not causation. Indeed, had the correlation of the 1959-1973 period continued, the subsequent per capita growth would have ended poverty by 1986. What, then, are we to make of the relationship between GDP and poverty?
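The counterfactual above can be checked with a back-of-the-envelope sketch. The sketch below is my own illustration, not the article's calculation: it takes the two data points given in the text (22 percent in 1959, 11 percent in 1973) and extrapolates the decline linearly over time; the article's figure of 1986 presumably ties the decline to per-capita growth rather than to the calendar, so the endpoint differs by a year or so.

```python
# Crude illustration of the extrapolation in the text: if the 1959-1973
# decline in the U.S. poverty rate (22% -> 11%) had simply continued at
# the same pace, when would the rate have hit zero?

def zero_poverty_year(y0, p0, y1, p1):
    """Extrapolate the straight line through (y0, p0) and (y1, p1) to p = 0."""
    slope = (p1 - p0) / (y1 - y0)   # percentage points per year (negative = decline)
    return y1 + (0 - p1) / slope    # year at which the line reaches zero

year = zero_poverty_year(1959, 22.0, 1973, 11.0)
print(round(year))  # in the same ballpark as the article's 1986
```

The same pace that halved the rate in fourteen years would have eliminated it in another fourteen, which is exactly why the post-1973 flattening stands out.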

According to Heidi Shierholz, an economist at the Economic Policy Institute, the “very tight relationship between overall growth and fewer and fewer Americans living in poverty” broke apart in the 1970s.[2] In spite of the OPEC oil cartel’s inflationary shocks in 1973 and 1979, the poverty rate remained relatively constant through the decade of “stagflation,” spiking only once Reagan took office—perhaps on account of David Stockman’s domestic budget cuts, which hit the poor especially hard, and Paul Volcker’s high interest rates at the Fed (to bring down inflation), which increased the cost of borrowing money. To be sure, the recessions of the early 1980s and early 1990s are associated with increases in the poverty rate, which even lagged behind the subsequent recoveries, and the rate fell as the economy hummed along in the late 1980s and 1990s. Even so, eleven percent seems to be the rate’s floor. Perhaps this is why the relationship broke apart in the 1970s.

According to Thomas Piketty, the period from World War I to the 1970s was unusual economically because the shocks reduced returns on capital relative to the growth rates of income and GDP. The economist suggests that inequalities in income and even wealth narrow under this rather artificial arrangement; typically, returns on capital have outpaced increases in income, thus widening the inequalities. Perhaps the inverse relationship we are looking at holds only when the rate of return on capital is low relative to the rate of GDP growth. However, as I indicate above, heady growth periods after the 1970s can be found in which the poverty rate was decreasing, and recessionary periods in which the rate was increasing.
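The r-versus-g logic can be made concrete with a toy simulation. This is a sketch of my own, not Piketty's formal model, and the parameter values (a starting capital-to-income ratio of 4, full reinvestment of capital income) are hypothetical: when the return on capital r exceeds income growth g, the ratio of capital to income climbs without bound; when shocks push r down to g, as in the period Piketty flags, the ratio holds steady.

```python
# Toy illustration (not Piketty's formal model): evolve the ratio of
# capital to national income when capital compounds at rate r and
# income grows at rate g. 'savings' is the hypothetical fraction of
# capital income that is reinvested rather than consumed.

def capital_income_ratio(r, g, years, ratio0=4.0, savings=1.0):
    ratio = ratio0
    for _ in range(years):
        # capital grows by reinvested returns; deflate by income growth
        ratio = ratio * (1 + savings * r) / (1 + g)
    return ratio

# r well above g: the ratio climbs steadily (here roughly doubling in 20 years).
print(round(capital_income_ratio(r=0.05, g=0.015, years=20), 2))

# r pushed down to g (the 1914-1970s "shock" regime): the ratio holds steady.
print(round(capital_income_ratio(r=0.03, g=0.03, years=20), 2))  # -> 4.0
```

The point of the sketch is only the divergence between the two regimes; nothing here fixes the poverty rate itself, which is why the 11 percent floor discussed below remains a separate puzzle.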

So we are back to the 11 percent floor. When the relationship broke apart in the early 1970s, the poverty rate was indeed at 11 percent. Instead of continuing downward, the rate hovered, moved up a bit, then drifted slightly downward to just above 11 percent before heading starkly upward in 1981.[3] What might be behind the floor? It seems doubtful to me that 11 percent of the adults in America are simply unable to work for non-economic reasons. It seems more likely that the market mechanism, which can hold wages at the minimum wage without any upward pressure (e.g., from labor shortages), functions at an equilibrium short not only of full employment but also of rising wages (relative to returns on profits and stock appreciation). In other words, we cannot look to the market to “grow” us out of poverty. While undoubtedly a help, the market is only part of the solution.

The question is thus how we can fill in the rest of the solution. What, outside of the economy, can make up the difference? As the source and enforcer of societal rules, government is not intrinsically the answer, for being the “man in charge” does not in itself entail supplying materials (including jobs). Yet the government could direct that materials and/or jobs be provided outside of private industry, such as by non-profit organizations receiving funds from tax revenues and thus operating directly under government oversight. Even now, the Employment Act of 1946 makes promoting maximum employment a task of the federal government to oversee: everyone who wants a job should be able to have one. Lest employment be viewed as an end in itself rather than as a means, we could stipulate that everyone have at least a minimum amount of money and wealth. Hence the unemployable veteran, for example, would not have to suffer in poverty. Lest breaking through the 11 percent floor make it more difficult for Walmart or McDonald’s to pay cashiers the minimum wage of just over $7 an hour, we might remember that a part of the solution is not in itself more important than the solution itself.

That is to say, hits taken at the margins by one part of the solution as the rest of the solution is added should not outweigh the solution itself, for the end is more important than any one of the means. Too often, I fear, the American public discourse obsesses over a particular means’ marginal downsides, being all too willing to preempt a full solution just so that means is not slighted in any way. I suppose this is a sort of tunnel vision, with greed playing a supporting role. Death and taxes may be inevitable, but surely poverty is not. Indeed, poverty may be viewed as an artificial byproduct of human society and thus as fully within our power, and responsibility, to eliminate rather than take as a given.

[1] Neil Irwin, “Growth Has Been Good for Decades. So Why Hasn’t Poverty Declined?” The New York Times, June 4, 2014.
[2] Ibid.
[3] Ibid.

Tuesday, June 3, 2014

What We Know about Coal and Natural Gas: The EPA’s Coal Emissions Targets

Coal is the bad guy. At least it is the antagonist in the U.S. Environmental Protection Agency’s 645-page carbon-emissions plan unveiled in early June 2014. Although the plan’s 30% reduction in CO2 emissions from 2005 levels is set for 2030, critics showed their oligarchic focus on today by pointing to the likely near-term costs: electric bills increasing $4 or so a month in West Virginia; lost jobs, as if the criteria of capital were also those of labor. In short, short-term inconveniences without a hint of the other side of the ledger. I submit that this is precisely the element in human nature that can be likened to the proverbial “seed of its own destruction” in terms of the future of our species. As menacing as such “reductionism to today” is, the assumption underlying the EPA’s proposal that coal is the definitive obstacle, and that we are not missing some other huge but invisible danger, is just as problematic from the standpoint of the species’ survival.

I have in mind the EPA’s projections by fuel type from 2012 out to 2030. Coal’s share of the electricity generated is projected to fall from 37% in 2012 to 32% in 2030.[1] While at least the direction is downward relative to the other fuel types, climatologists would doubtless say a larger drop is necessary to stave off more than a 2-degree-Celsius global increase. As damning as this “okay but not good enough” scenario is for coal, the more damning feature of the report is that it may be very wrong about something it takes to be an improvement.

For example, the natural gas category is projected to grow from 30% in 2012 to 35% in 2030.[2] That’s good, right, because it’s the clean fuel? Not so fast. Independent empirical studies of leaks in Utah, Los Angeles, and Washington, D.C. have found much higher levels of methane escaping into the atmosphere than the 1% touted by producers and adopted (without independent confirmation) by the EPA; in some of the observations, the actual leakage rates reached double digits. The problem is that the break-even point with coal in impacting global warming is about 3 percent leakage. Methane, which escapes from natural gas systems before the gas is burned, turns out to have roughly ten times the warming impact of coal.

My point is that even though by now we are used to the contingents that put today’s convenience above the risk to the future of the species, we don’t know what we don’t know, and that ignorance can be even more dangerous. In other words, what we assume to be a good thing may in actual fact be doing a lot of damage right under our noses. We may not have a clue as to how what we are doing today is impacting the planet’s atmosphere. This may be one reason why scientists have repeatedly had to move up their projections of when the ice sheets at the poles would melt.

Human nature may be much more problematic from the standpoint of the species’ own survival than we know. Not only have 1.8 million years of natural selection engrained in us a focus on today (e.g., fight or flight) at the expense of tomorrow; we may also be very wrong about things we assume we got right and yet be totally unaware of it. It is as though our species were a person with long hair who never uses a mirror to make sure that the hair in back is brushed. The laugh is on that person, and yet she (or he) has no idea. The industrialists who are instinctively wedded to the status quo out of a desire for financial gain may be just the tip of the iceberg; we had better look underneath before it has melted entirely.

[1] Wendy Koch, “EPA Carbon-Cutting Plan Could See Power Shift,” USA Today, June 3, 2014.
[2] Ibid.

President Obama as Chief Executive: Missing the Fraud at the Veterans Administration

Buffeted by a whirlwind of criticism in the wake of revelations of widespread fraud in VA hospital and outpatient-clinic wait times, President Obama somewhat sheepishly admitted during a news conference on the matter that he had heard nothing of the practice on his travels around the country. With at least one instance of false scheduling at 65% of the facilities between September 30, 2013 and March 31, 2014, and with 13% of schedulers having been instructed in how to falsify wait times,[1] it is odd that word had not reached the president’s ear. Then again, maybe it is not so odd after all, for the president’s domestic trips tended to be oriented to campaign fundraisers and to speeches promoting proposed legislation. In other words, the president (and Barack Obama is hardly alone here) put his legislative role above his office as chief executive.

It is worth noting that the legislative role of the American federal president is negative in that the power is exercised by vetoing legislation. To be sure, the president is constitutionally encouraged to make recommendations through the State of the Union report made to Congress. Even so, the extent of time and attention that presidents have directed to pushing favored legislative bills goes beyond making recommendations, and thus the opportunity cost (i.e., foregone attention to other matters, such as managing the executive branch of the U.S. Government) is not justified. Put another way, having two branches focused at the top on legislating is not only redundant, or overkill; the joint focus leaves the executive branch without a chief except on parchment.

This is not to say that proactive rather than veto-based presidential involvement in the legislative process cannot bear fruit. Franklin D. Roosevelt, president for much of the Great Depression in the 1930s, expended tremendous effort in seeing to it that his New Deal programs were legislated into actuality. In an “exit interview” at the conclusion of decades in the U.S. House of Representatives, Rep. John Dingell (D-Michigan) calls FDR “The Giant, one of probably the three greatest” presidents in American history.[2] In dull contrast, Dwight Eisenhower was a “fine chairman of the board, . . . but didn’t do much.”[3] This stinging critique implies that the managerial imprint translates into lethargy, or at least into a lack of accomplishment.

Relatedly, Dingell criticizes Jimmy Carter for not being able to see the forest even as he could see every tree in the woods.[4] While a president as presider should keep his or her eyes on the big picture, protecting society and the systems of business and government as wholes from the actualization of systemic risk, the president as chief executive should focus on the trees relative to society as a whole; that is, relative to the orientation of presiding. To be sure, the focus of the particular agencies is considerably narrower, and no CEO rightfully gets hung up at that level, but neither does a CEO focus on society at the expense of the business itself. Carter took micromanaging to the extreme, personally approving even the White House Christmas cards. As dysfunctional as that is for a chief executive, equally problematic is a president who acts as if he or she were a Congressional leader, or who privileges his own presiding over managing. Yet legislating and presiding have come to swallow up the very notion of the American presidency, as Rep. Dingell’s comments illustrate.

It hardly bears mentioning that, for a politically oriented person, running around the member states making speeches oriented to a vision of society is unquestionably more exciting than exercising executive responsibilities. As a result, it has been all too easy for the campaign-oriented people who have occupied the Oval Office to leave the mammoth executive branch effectively without a CEO or managerial chairperson, a default that tacitly enables the sort of widespread fraud that was found in the Veterans Administration in 2014. It is fanciful to suppose that word of even such a widespread managerial practice would somehow reach a celebrity president on a rope line as he or she passes by. Yet in his news conference on the fraud at the VA, President Obama saw no such disjunction; instead, he sought to appear managerially on top of the intricacies of the VA scheduling process.

I suspect that the encroachment of campaigning on governing has a correlate in the White House, wherein legislating has come to crowd out the executive functions. Perhaps the Electoral College was established in part so that a good executive, rather than merely a good campaigner, would have a chance at the office; the increasing salience of the popular vote has been like a storm’s wave washing over everything else, effectively determining the sort of person who gets the prize. Relatedly, the underlying problem doubtless includes the character flaw that too easily ignores some of a job’s responsibilities in selfishly favoring others. In other words, we can indeed blame Barack Obama and many of his predecessors for slighting their managerial responsibilities across the executive branch in order to have more influence legislatively. Ironically, fewer speeches on pending legislation would carry much more currency and would free up the president to manage the executive branch. As for the quite legitimate presiding role that is literal to the presidency itself, catastrophic threats to the systems of business, government, and society do not arise every day; properly understood, rather than applied to every symptom that pops up on an oversensitive radar screen, the role does not eat up much time on a daily basis. Leaving legislating largely to Congress, a presiding president would likely find that he or she has enough time to manage the executive branch effectively, assuming an optimal mix of direct supervision and delegation. Generally speaking, balance and proper boundaries would do a lot of good; unfortunately, human nature may be more schizogenic than homeostatic, more maximizing (e.g., desire) than oriented to equilibrium.

[1] Meghan Hoyer and Gregg Zoroya, “Fraud Masks VA Wait Times,” USA Today, June 3, 2014.
[2] “John Dingell Rates the Presidents,” USA Today, May 2, 2014.
[3] Ibid.
[4] Ibid.

Dismembering Time Inc: A Critique of the Conglomerate Strategy

Typically, management is assumed to be a skill or practice that enables a person so trained to work in virtually any company, regardless of sector. A manager is presumed fully able to go from managing a bank to managing a restaurant: organizing is the common thread; passion for the particular output is not. Yet surely product-specific knowledge, and indeed fascination, must count for something. This point struck me as I read one journalist’s ideas regarding what should come of Time Inc. once it is separated from the mother ship of Time Warner. If the parts of Time Inc. are indeed worth more as parts of different companies than as a whole, then it is worth asking whether it makes sense to assign each part to a company oriented to the same theme or domain. If so, then the particular theme of a publication, or of a business more broadly, should have some bearing on its manager.

Playing with how a freed-up Time Inc. might be internally dismembered, Michael Wolff suggests that Time’s InStyle magazine might go to Hearst or Condé Nast; Entertainment Weekly to The Hollywood Reporter; Sports Illustrated to ESPN; and Fortune to The Wall Street Journal.[1] Although management dogma has it that the practice can enable even a non-publishing company to effectively take in any of Time’s magazines, synergies in content would doubtless have value if the purchasing company not only publishes already, but publishes in the same subject matter. Beyond that, I submit that collecting together managers who share a particular interest in a certain subject matter, such as sports or finance, and having those managers fully occupied in it alone, has both financial and psychological value. The balance could in fact tip away from the conglomerate enterprise should the concept of a shared, intense passion take hold in a managerial setting.

Accordingly, the salience of the content of what is being managed, the particular subject matter of a product or service, may render managerial skill less of a passport than currently supposed or taught in business schools. Just because a person has an MBA does not mean that he or she can manage a magazine for a few years and then move on to manage a restaurant; the delimiting question would be: what interests you other than management? Perhaps then the world would see more managers of flesh and blood, fully alive even at work, rather than mere stand-ins composed of sanitized skeletal bone. To be viable, management may have to transcend itself into specific, non-transferable content.

[1] Michael Wolff, “The Once and Future Time Inc. Is in Flux,” USA Today, May 27, 2014.