Monday, June 30, 2014

An Ethical Meltdown in Japan: On the Toxicity of Tepco's Nuclear Power

Without doubt, Japan’s largest power provider, Tokyo Electric Power Co. (Tepco), faced the biggest challenge of its 50-year history in "recovering from the damage done to its nuclear facilities and power systems by a devastating earthquake and tsunami" in 2011.[1] The New York Times reported that "foreign nuclear experts, the Japanese press and an increasingly angry and rattled Japanese public are frustrated by government and power company officials’ failure to communicate clearly and promptly about the nuclear crisis. Pointing to conflicting reports, ambiguous language and a constant refusal to confirm the most basic facts, they suspect officials of withholding or fudging crucial information about the risks posed by the ravaged Daiichi plant."[2]

When a spokesperson for Tepco said early in the morning on March 16th "that a fire had broken out at the Daiichi plant’s No. 4 reactor, a reporter naturally asked how the fire had begun, given that just the day before the company had reported putting out a fire at that same reactor. The executive’s answer: ‘We’ll check. . . . We don’t have information here,’ he explained. After about two hours, the Tepco representative had the information: Turned out the smoke was coming not from reactor No. 4, but from reactor No. 3. If Tepco’s information had been delayed and vague, the reporters’ response was quick and direct. ‘You guys have been saying something different each time!’ one shouted. ‘Don’t tell us things from your impression or thoughts, just tell us what’s going on. Your unclear answers are really confusing!’"[3]


The full essay is in Cases of Unethical Business, available in print and as an ebook at Amazon.com.  


Tuesday, June 17, 2014

Legislating Politeness: The Chinese Government Takes on a Dysfunctional Culture

In June, 2014, the “Capital Civilization Office” of the Chinese government began a half-year campaign to “encourage Beijing’s 20 million residents to behave better.”[1] Targets include “people who are noisy, smoke in public, curse at sports events, fail to line up for buses, run red lights, drink while they drive, and drive aggressively.” It seems to me this list could equally apply to Miami and, at least in terms of driving, to the entire Northeast coastline of the United States. Perhaps urban modernity is to blame, or maybe it is simply the old truism about the rise and fall of great empires, and thus of cities as well. Chinese history is no stranger to this cycle in the form of a succession of dynasties. Perhaps we would be wise to view the modern city in such terms too.

The Chinese civilization office informed Beijing residents that they are to “dress properly, show grace in speech and manner and say ‘hello,’ ‘thank you’ and ‘sorry’ more often.” This list may conjure up images of George Orwell’s “Big Brother” figure in the book 1984, yet the gargantuan task can be regarded as futile at best. Although Chinese officials were set to utilize “guidance, education and relevant laws and regulations,” including fines, some of the unseemly conduct was at the time not even against the law. The underlying problem may be whether arresting cultural decadence is even possible once it has come to characterize the culture of a particular geographical area.

Visiting Beijing a number of years ago, I was immediately struck by the lack of lines, or queues, in public places. It was as if the phenomenon of waiting one’s turn had been lost in a society whose culture stretched back thousands of years. Once the Confucian ethic has been lost, how does a society regain it? A government cannot very easily legislate “building civilization,” though Jan Longbin, deputy director of the Capital Civilization Office, cites “immense results” from the Olympic manners campaign six years earlier. “Building civilization is not something that can be done in a single day,” he admits. This may be a tremendous understatement, owing to the self-reinforcing mechanism that is a part of human culture.

Decadence in a society, which is to say, among people who live in the same geographical area, may have a downward, intensifying dynamic that makes any reversal especially arduous. Once a particular assumption, such as that it is acceptable to ignore the people in line and go directly to the front, reaches a critical mass in terms of the proportion of people having adopted it, those people proceed with added confidence. A second assumption then takes hold, namely that so widespread a practice cannot be wrong, and it operates as a kind of protective bubble by giving individuals the mistaken sense that the primary assumption is beyond question. Efforts to force those people to wait in line must contend with this “gravity,” such that when the enforcement ends, the original rudeness is likely to re-emerge virtually unscathed.

I have witnessed this descending cycle of cultural decadence through visits to my hometown in Illinois. Although the city was hit hard by the loss of machine-tool plants in the early 1980s and by the subsequent rise in crime and unemployment, most newcomers and visitors such as myself cite the predominant attitude of a high proportion of the local inhabitants as the particularly problematic feature. The sordid demeanor, rooted in a lack of education—the city being among the least educated in the U.S.—can be characterized as ignorance that cannot be wrong, backed up by whatever authority it thinks it has.

Ironically, the decadence manifests particularly among low-level office workers and in the service sectors. Bus drivers, for example, are said to be particularly boorish and even aggressive, and office workers tend to be rigidly obsessed with their petty policies to the utter detriment of common sense, not to mention plain decency and workability. Neither unemployment nor crime can be blamed for the pretension; rather, the arrogance of ignorance, operating as a sort of self-entitlement as if on stilts during a flood rather than rightfully submerged, seems to be in play as the particularly intractable pathology plaguing the rust-belt city.

Given the rigid defense mechanisms protecting the dysfunctional mentality shared by many there, any civic effort to render the city more livable would almost certainly be fraught with difficulty, if not founder at the outset. As it is, the healthy people—those who recognize how widespread the banality is—tend to leave town if they can, and this self-selection out intensifies the proportion of sickness in the town, which of course pushes any fix even further away. The decadence, or garden-variety crassness, is as though predestined to pursue its own path until the black hole cannot be distended, or bloated, any further.

In short, once a squalid mentality, or set of assumptions regarding interpersonal and organizational behavior, takes hold in a particular geographical area, the decadence is likely to continue downward until it hits rock bottom rather than be staved off by government intervention. The tools available to government are no match for the self-reinforcing defense mechanisms of a dysfunctional culture.

For example, even though my banal hometown consistently ranks near the bottom of livable cities in the U.S., local residents in the thick of the pathology (i.e., in utter denial) try to claim that every city has the mentality, which in terms of the rankings is a mathematical impossibility. The denial feeds the arrogance, and the pathogens go on in their ways with a misplaced confidence, rather like bloated fish walking on stilts during a flood. When rudeness, or even meanness, as newcomers tend to characterize it, is backed up by the sheer presumptuousness of ignorance that cannot be wrong, a hard shell is formed that may succumb only when the rot has reached it from the core, rather than from external efforts at reform.

However, it does seem theoretically possible that an influx of enough healthy souls could put the indigenous weakness on the defensive, essentially knocking it off its perch in workplaces and public spaces. In a city of 20 million, such as Beijing, such an intervention would be too huge to be at all realistic. And so, as with any great civilization, the modern city too is subject to rise and fall: human, all too human, after all.




[1] For this quote and all others in this essay: Calum MacLeod, “Be More Polite, Beijing Residents Told,” USA Today, June 11, 2014. 

Thursday, June 12, 2014

U.S. House Majority Leader Cantor’s Election Loss: On the Corporate Connection

It is not every day that the majority leader in the U.S. House of Representatives loses—and badly at that—to a primary challenger in an intra-party contest. In the wake of Eric Cantor’s defeat in June, 2014, journalists wasted no time in reducing the defeat to one issue: immigration. Such a reductionist ex post facto divination of voter intent—as if an electorate were one monolithic mind writ large—is fraught with difficulties. Beyond the sheer artifice, such an interpretation offers an easy cover for less convenient, subterranean political shifts underway and expressed in the vote.

Leaping to the immigration rationale, whereby arch-conservative or “Tea Party” Republican voters punished Cantor for having been willing to work with Democrats on a compromise involving amnesty in some form, Ali Noorani of an immigration lobbying group opined that the primary’s result would make it tougher to get a bill through the U.S. House in the current session.[1] Frank Sharry of America’s Voice “said Cantor’s loss blows up a last-minute attempt by Republicans to organize support for an immigration bill.”[2] No doubt both Noorani and Sharry were relying on media reports, aired as early as election night, that immigration reform had brought the majority leader down.

To be sure, more elaborate analyses, such as one by The Washington Post, broadened the explanation to include Cantor’s votes in general and his lack of attention to his district.[3] Preoccupied with gaining power over his colleagues by raising money, Cantor missed the warning signs, and this alone could have annoyed Republican voters back home. Additionally, as put by Jamie Radtke, co-founder of the Virginia Tea Party, Cantor had “made an enemy of his friends.”[4] Put another way, pundits wanting to extrapolate the primary results onto the national stage may be overlooking the idiosyncratic elements that do not generalize or go forward.

Of all the explanations needlessly cut off by the reduction to immigration, or even to Cantor’s willingness to compromise with Democrats, the most important received scarcely any media coverage. Specifically, interviews with some voters suggest that Cantor lost in part because he had done the bidding of big business at the expense of the small businesses in Richmond. Stung by no-strings bailouts of the large Wall Street banks that had dangerously over-leveraged themselves on risky mortgage-backed securities, and by a dearth of fraud prosecutions, Republican voters could not have missed the apparent collusion between corporate and campaign coffers; Cantor had raised nearly $5.5 million, giving him a 27-to-1 financial advantage over his challenger.[5] As astonishing as Dave Brat’s win is, given this financial disparity, the real lesson from the primary may be that elected officials willing to cut the financial strings to corporate America may actually do better at home, because their positions and votes would be more closely tailored to their constituents.

Put another way, voters may be hungry for political courage that shakes the conventional “wisdom.” Teddy Roosevelt must have discovered as much when he went after mighty trusts like Standard Oil in the early twentieth century, as did Andrew Jackson nearly a century earlier when he refused to recharter the Second Bank of the United States for fear of encroaching federal power. This is a message that the powers that be in 2014 doubtless would not want getting out, and the media dutifully obliged with the subterfuge that immigration reform was in all probability dead for the time being.



[1] Alan Gomez, “Immigration Bill Looks Doubtful,” USA Today, June 12, 2014.
[2] Ibid.
[3] David Fahrenthold, Rosalind Helderman, and Jenna Portnoy, “What Went Wrong for Eric Cantor?” The Washington Post, June 12, 2014.
[4] Ibid.
[5] Susan Davis and Catalina Camia, “Contests Loom for Top GOP Posts,” USA Today, June 12, 2014.

Friday, June 6, 2014

GDP and Poverty: Is Economic Growth the Answer?

From 1959 to 1973, the American economy grew 82 percent per person. It is easy to assume this is why the poverty rate decreased from 22 percent to 11 percent.[1] From roughly 1985 to 1990, and then again from 1995 to 2000, heady growth rates also correlate positively with declining poverty rates. But correlation is not causation. Indeed, had the correlation of the 1959-1973 period continued, the subsequent per capita growth would have ended poverty in 1986. What then are we to make of the relationship between GDP and poverty?
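To make the extrapolation concrete, here is a minimal back-of-the-envelope sketch in Python. The two anchor figures (82 percent per-capita growth and the fall from 22 to 11 percent poverty) come from the paragraph above; the straight-line reading of the correlation is my simplifying assumption, not the Times’s actual model.

    # Straight-line reading of the 1959-1973 growth/poverty correlation.
    # Anchor figures are from the essay; the linearity is an assumption.
    growth_pct = 82.0                        # cumulative per-capita GDP growth, 1959-1973 (%)
    poverty_1959, poverty_1973 = 22.0, 11.0  # poverty rates (%)

    # Implied sensitivity: points of poverty removed per 1% of per-capita growth
    points_per_pct = (poverty_1959 - poverty_1973) / growth_pct  # ~0.134

    # Further growth needed, on this linear reading, to reach 0% poverty
    growth_needed = poverty_1973 / points_per_pct  # ~82%

    print(f"{points_per_pct:.3f} poverty points per 1% of per-capita growth")
    print(f"About {growth_needed:.0f}% more growth would, on paper, end poverty")

On this straight-line reading, the same 82 percent of growth again would have finished the job, which is presumably roughly how an extrapolation lands on a poverty-free 1986; the point, of course, is that nothing of the sort happened.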

According to Heidi Shierholz, an economist at the Economic Policy Institute, the “very tight relationship between overall growth and fewer and fewer Americans living in poverty” broke apart in the 1970s.[2] In spite of the OPEC oil cartel’s inflationary shocks in 1973 and 1979, the poverty rate remained relatively constant through the decade of “stagflation,” spiking only once Reagan took office—perhaps on account of David Stockman’s domestic budget cuts, which hit the poor especially hard, and Paul Volcker’s high interest rates at the Fed (to decrease the inflation rate), which increased the cost of borrowing money. To be sure, the recessions of the early 1980s and the early 1990s are associated with increases in the poverty rate, which even lagged the subsequent recoveries, and the rate fell as the economy was humming along in the late 1980s and 1990s. Even so, eleven percent seems to be the rate’s floor. Perhaps this floor is why the relationship broke apart in the 1970s.

According to Thomas Piketty, the period from World War I to the 1970s is unusual economically because a series of shocks reduced returns on capital relative to the growth rates of income and GDP. The economist suggests that inequalities in income and even wealth narrow under this rather artificial arrangement; typically, returns on capital outpace the growth of income (Piketty’s famous r > g), thus widening the inequalities. Perhaps the inverse relationship we are looking at holds only when the rate of return on capital is low relative to the GDP growth rate. However, as I indicate above, heady growth periods after the 1970s can be found in which the poverty rate was decreasing, and recessionary periods in which the rate was increasing.

So we are back to the 11 percent floor. When the relationship broke apart in the early 1970s, the poverty rate was indeed at 11 percent. Instead of continuing downward, the rate hovered, went up a bit, then drifted slightly downward to just above 11 percent before heading starkly upward in 1981.[3] What might be behind the floor? It seems doubtful to me that 11 percent of the adults in America are simply unable to work for non-economic reasons. It seems more likely that the market mechanism, which can hold wages at the minimum wage absent any upward pressure (e.g., from labor shortages), functions at an equilibrium short not only of full employment but also of rising wages (relative to profits and stock appreciation). In other words, we cannot look to the market to “grow” us out of poverty. While undoubtedly a help, the market is only part of the solution.

The question is thus how we can fill in the rest of the solution. What, outside of the economy, can make up the difference? As the source and enforcer of societal rules, government is not intrinsically the answer, for to be the “man in charge” does not in itself connote supplying materials (including jobs). Yet the government could direct that materials and/or jobs be provided outside of private industry, such as by non-profit organizations receiving funds from tax revenues and thus directly under government oversight. Even now, the Employment Act of 1946 makes it a task of the federal government to see to it that everyone who wants a job should be able to have one. Lest employment be viewed as an end in itself rather than a means, we could stipulate that everyone has at least a minimum amount of money and wealth. Hence the unemployable veteran, for example, would not have to suffer in poverty. Lest breaking through the 11 percent floor make it more difficult for Walmart or McDonald’s to pay cashiers the minimum wage of just over $7 an hour, we might remember that a part of the solution is not in itself more important than the solution itself.

That is to say, hits taken at the margins by one part of the solution, as the rest of the solution is added, should not outweigh the solution itself, for the end is more important than any one of the means. Too often, I fear, American public discourse obsesses over the marginal downsides to a given means, all too willing to preempt a full solution just so that particular means is not slighted in any way. I suppose this is a sort of tunnel vision, with greed playing a supporting role. Death and taxes may be inevitable, but surely poverty is not. Indeed, poverty may be viewed as an artificial byproduct of human society, and thus as fully within our power and responsibility to eliminate rather than take as a given.




[1] Neil Irwin, “Growth Has Been Good for Decades. So Why Hasn’t Poverty Declined?” The New York Times, June 4, 2014.
[2] Ibid.
[3] Ibid.

Tuesday, June 3, 2014

What We Know about Coal and Natural Gas: The EPA’s Coal Emissions Targets

Coal is the bad guy. At least it is the antagonist in the U.S. Environmental Protection Agency’s 645-page carbon-emissions plan unveiled in early June, 2014. Even though the plan’s 30% reduction in CO2 emissions from 2005 levels would not be required until 2030, critics showed their myopic focus on today by pointing to the likely near-term costs: electric bills increasing $4 or so a month in West Virginia; lost jobs, as if the criteria of capital were also those of labor. In short, short-term inconveniences without a hint of the other side of the ledger. I submit that this is precisely the element in human nature that can be likened to the proverbial “seed of its own destruction” in terms of the future of our species. Yet as menacing as such “reductionism to today” is, the assumption underlying the EPA’s proposal that coal is the definitive obstacle, and that we are not missing some other huge but invisible danger, is just as problematic from the standpoint of the species’s survival.

I have in mind the EPA’s projections by fuel type, going out from 2012 to 2030. From 37% of the electricity generated in 2012, coal’s comparable projected figure is 32% for 2030.[1] While at least the direction is downward relative to the other fuel types, climatologists would doubtless say more of a drop is necessary to stave off more than a 2°C global temperature increase. As damning as this “okay but not good enough” scenario is for coal, the more damning feature of the report is that it may be very wrong about something it takes to be an improvement.

For example, the natural gas category is projected to go from 30% in 2012 to 35% in 2030.[2] That looks like good news, since natural gas is billed as the clean fuel. Not so fast. Independent empirical studies of leaks in Utah, Los Angeles, and Washington, D.C. have shown much higher levels of methane escaping into the atmosphere than the 1% touted by producers and adopted (without independent confirmation) by the EPA; in those observations, the actual leakage rates were in the double digits. The problem is that the break-even point with coal, in terms of the impact on global warming, is about 3 percent, because the methane that escapes unburned is a far more potent greenhouse gas, ton for ton, than the carbon dioxide produced by burning coal.
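To see why a few percent of leakage can undo natural gas’s advantage, consider a stylized warming comparison per kilowatt-hour, sketched below in Python. This is only a rough illustration; every parameter is an assumed, ballpark value, not a figure from the EPA plan or from the leak studies just mentioned.

    # Stylized warming impact of gas vs. coal power as methane leakage rises.
    # All parameters are rough assumptions for illustration only.
    GWP20_CH4 = 86      # methane's approximate 20-year warming potency vs. CO2
    CO2_COAL = 1.0      # kg CO2 per kWh from a coal plant (stylized)
    CO2_GAS = 0.4       # kg CO2 per kWh from a gas plant (stylized)
    CH4_BURNED = 0.15   # kg of methane burned per kWh of gas power (stylized)

    def gas_warming_per_kwh(leak_fraction):
        """CO2-equivalent warming per kWh of gas power, counting leaked methane."""
        leaked_ch4 = CH4_BURNED * leak_fraction / (1.0 - leak_fraction)
        return CO2_GAS + leaked_ch4 * GWP20_CH4

    for leak in (0.01, 0.03, 0.05, 0.10):
        ratio = gas_warming_per_kwh(leak) / CO2_COAL
        print(f"leak rate {leak:.0%}: gas/coal warming ratio = {ratio:.2f}")

Under these made-up but plausible numbers, gas beats coal handily at a 1% leak rate, the crossover arrives in the low single digits (the same order as the 3 percent figure above), and at double-digit leak rates gas comes out markedly worse than coal. The exact break-even depends on plant efficiencies and on the time horizon chosen for methane’s potency.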

My point is that even though by now we are used to the contingents that put today’s convenience above the risk to the future of the species, we don’t know what we don’t know, and this can be even more dangerous. In other words, what we assume to be a good thing may in actual fact be doing a lot of damage under our very eyes. We may not have a clue as to how what we are doing today is impacting the planet’s atmosphere. This may be one reason why scientists have repeatedly had to move up their projections of when the polar ice sheets will melt.

Human nature may be much more problematic from the standpoint of the species’s own survival than we know. Not only have 1.8 million years of natural selection engrained in us a focus on today (e.g., fight or flight) at the expense of tomorrow; we may be very wrong about things we assume we got right and yet be totally unaware of it. It is as though our species were a person with long hair who never bothers to use a mirror to make sure that the hair in back is brushed. The laugh is on that person, and yet she (or he) has no idea. The industrialists who are instinctively wedded to the status quo out of a desire for financial gain may be just the tip of the iceberg; we had better look underneath before it has totally melted.



[1] Wendy Koch, “EPA Carbon-Cutting Plan Could See Power Shift,” USA Today, June 3, 2014.
[2] Ibid.

President Obama as Chief Executive: Missing the Fraud at the Veterans Administration

Buffeted by a whirlwind of criticism in the wake of revelations of widespread fraud in VA hospital and outpatient-clinic wait times, President Obama somewhat sheepishly admitted during a news conference on the matter that he had heard nothing of the practice on his travels around the country. With at least one instance of false scheduling at 65% of the facilities between September 30, 2013 and March 31, 2014, and with 13% of schedulers having been instructed in how to falsify wait times,[1] it is odd that word had not reached the president’s ear. Then again, maybe it is not so odd after all, for the president’s domestic trips tended to be oriented to campaign fundraisers and to speeches on proposed legislation. In other words, the president—and Barack Obama is hardly alone here—put his legislative role above his office as chief executive.

It is worth noting that the legislative role of the American federal president is negative, in that the power is exercised by vetoing legislation. To be sure, the president is constitutionally encouraged to make recommendations through the State of the Union report to Congress. Even so, the extent of time and attention that presidents have directed to pushing favored legislative bills goes beyond making recommendations, and thus the opportunity cost (i.e., foregone attention to other matters, such as managing the executive branch of the U.S. Government) is not justified. Put another way, having two branches focused at the top on legislating is not only redundant, or overkill; the joint focus leaves the executive branch without a chief except on parchment.

This is not to say that proactive rather than veto-based presidential involvement in the legislative process cannot bear fruit. Franklin D. Roosevelt, the president for much of the Great Depression in the 1930s, expended tremendous effort in seeing to it that his New Deal programs were legislated into actuality. In an “exit interview” at the conclusion of decades in the U.S. House of Representatives, Rep. John Dingell (D-Michigan) calls FDR “The Giant, one of probably the three greatest” presidents in American history.[2] In dull contrast, Dwight Eisenhower was a “fine chairman of the board, . . . but didn’t do much.”[3] This stinging critique implies that the managerial imprint translates into lethargy, or at least into a lack of accomplishment.

Relatedly, Dingell criticizes Jimmy Carter for not being able to see the forest even as he could see every tree in the woods.[4] While a president as presider should have his or her eyes on the big picture, protecting society and the systems of business and government as wholes from the actualization of systemic risk, the president as chief executive should focus on trees relative to society as a whole—that is, relative to the orientation taken in presiding. To be sure, the focus of a particular agency is considerably narrower, and no CEO rightfully gets hung up at that level—but neither does a CEO focus on society at the expense of the business itself. Carter took micromanaging to the extreme, personally approving even the White House Christmas cards. As dysfunctional as this is for a chief executive, equally problematic is a president who acts as if he or she were a Congressional leader, or who privileges his or her own presiding over managing. Yet legislating and presiding have come to swallow up the very notion of the American presidency; Rep. Dingell’s comments illustrate this default.

It hardly bears mentioning that, for a politically oriented person, running around the member states making speeches oriented to a vision of society is unquestionably more exciting than exercising executive responsibilities. As a result, it has been all too easy for the campaign-oriented people who have occupied the Oval Office to effectively leave the mammoth executive branch without a CEO or managerial chairperson—a decision that tacitly enables the sort of widespread fraud that was found in the Veterans Administration in 2014. It is fanciful to suppose that word of even such a widespread managerial practice would somehow show up on a rope line as a celebrity president passes by. Yet in his news conference on the fraud at the VA, President Obama saw no such disjunction. Instead, he sought to appear managerially on top of the intricacies of the VA scheduling process.

I suspect that the encroachment of campaigning over governing has a correlate in the White House, wherein legislating has come to crowd out the executive functions. Perhaps the Electoral College was established in part so that a good executive rather than a good campaigner would have a chance at the office; the increasing salience of the popular vote has been like a storm’s wave washing over everything else, effectively determining the sort of person who gets the prize. Relatedly, the underlying problem doubtless includes the character flaw that too easily ignores some of a job’s responsibilities while selfishly favoring others. In other words, we can indeed blame Barack Obama and many of his predecessors for slighting their managerial responsibilities across the executive branch in order to have more influence legislatively. Ironically, fewer speeches on pending legislation would give each much more currency and free up the president to manage the executive branch. As for the quite legitimate presiding role that is literal to the presidency itself, catastrophic threats to the systems of business, government, and society do not arise every day; the role does not “eat” a lot of time on a daily basis if understood correctly rather than applied to every symptom that pops up on an oversensitive radar screen. Leaving legislating largely to Congress, a presiding president would likely find that he or she has enough time to manage the executive branch effectively, assuming an optimal mix of direct supervision and delegation. Generally speaking, balance and proper boundaries would do a lot of good; unfortunately, human nature may be more schizogenic than homeostatic—more maximizing (e.g., desire) than oriented to equilibrium.



[1] Meghan Hoyer and Gregg Zoroya, “Fraud Masks VA Wait Times,” USA Today, June 3, 2014.
[2] “John Dingell Rates the Presidents,” USA Today, May 2, 2014.
[3] Ibid.
[4] Ibid.

Dismembering Time Inc: A Critique of the Conglomerate Strategy

Typically, management is assumed to be a skill or practice that enables a person so trained to work in virtually any company, regardless of the sector. A manager is presumed fully able to go from managing a bank to managing a restaurant. Organizing is the common thread; passion for the particular output is not. Yet surely product-specific knowledge, and indeed fascination, must count for something. This point struck me as I read one journalist’s ideas about what should become of Time Inc. once it is separated from the mother ship of Time Warner. If the parts of Time Inc. are indeed worth more as parts of different companies than as a whole, then it is worth asking whether it makes sense to assign each part to a company oriented to the same theme or domain. If so, then the particular theme of a publication, or of a business generally, should have some bearing on who manages it.

Playing with how a freed-up Time Inc. might be internally dismembered, Michael Wolff suggests that Time’s InStyle magazine might go to Hearst or Condé Nast; Entertainment Weekly to The Hollywood Reporter; Sports Illustrated to ESPN; and Fortune to The Wall Street Journal.[1] Although management dogma has it that the practice can enable a non-publishing company to effectively take in any of Time’s magazines, synergies in terms of content would doubtless have value if the purchasing company not only publishes already, but also in the same subject matter. Adding to the latter, I submit that collecting together managers who have a particular interest in a certain subject-matter, such as sports or finance, and having those managers fully occupied only in it, has both financial and psychological value. The balance could in fact be tipped away from the conglomerate enterprise should the concept of a shared, intense passion take hold in a managerial setting.

Accordingly, the salience of the content being managed—the particular subject matter of a product or service—may render managerial skill less of a passport than currently supposed or taught in business schools. Just because a person has an MBA does not mean that he or she can manage a magazine for a few years and then move on to manage a restaurant; the delimiting question would be: what interests you other than management? Perhaps then the world would see more managers of flesh and blood—fully alive even at work—rather than mere stand-ins composed of sanitized skeletal bone. To be viable, management may have to transcend itself into specific, non-transferable content.




[1] Michael Wolff, “The Once and Future Time Inc. Is in Flux,” USA Today, May 27, 2014.