I
Heroic icons produced by the story of American history have often been put to use in the service of the office of the presidency, typically to harmonize as well as possible the President’s preexisting personality traits or image with a ready-made figure evocative of “the real America” or “what America is all about.”
Upon his inauguration, Barack Obama ensconced himself in the Oval Office as a newer sort of hero, one who had been floating about on the American scene for years, but had not had his iconic moment until the presidential election of 2008. On January 20th, 2009, Barack Obama took control of the executive branch as America’s first consultant president.
II
A brief, but by no means comprehensive, catalog of heroic American icons—all of whom have been made to slab mortar between the discursive bricks of America’s essence—serves to give one some idea of the numerous poses available to a political communications team at any particular moment. Over the course of the nation’s history, and in no particular order, the pilgrim, pioneer, captain of industry, man of the people, cowboy, war hero, self-made man, businessman, man of God, general, yeoman farmer, and so on have served as instances of the heroic, trustworthy man who points back to the archetypical American: hard-working, honest, and self-reliant.
While intellectuals in the traditional sense have generally not been celebrated as heroic figures in American lore, the socially conservative—but for all that, anarchically self-centered—culture of American society leaves plenty of room for smarts of a certain kind: savoir faire. No-nonsense know-how of the common-sense kind is greatly revered—if not in intellectuals or technocrats, then in men who appear out of nowhere with answers garnered through hard-won experience, or, even better, something less tangible and more mystical than that.
Consider the case of the figure of the corporate CEO who, while presently out of favor, traced a remarkable arc between, say, 1980 and 2008. In that time, once faceless organization men became stars in their own right. Lee Iacocca’s “remarkable” turnaround of Chrysler was but a John the Baptist moment to Jack Welch’s miracle work at GE. But even these men were mere spectacular heads of organizations—their successors made organizations secondary to their spectacular heads. In an odd evolution, corporate organizations, which, whatever their hierarchies and asymmetrical power structures, perforce relied upon some form of collective enterprise among participants, and operated with numerous redundancies and contingency plans, became the home of the indispensable man.
This was echoed in politics by the appearance of figures like Jon Corzine, Mitt Romney, and Michael Bloomberg, whose success as businessmen and CEOs supposedly made them uniquely qualified to govern.
The persistence of types like the cowboy—remember 2008 presidential candidate John McCain’s self-identification as a “Maverick,” which evoked both an old-west gunslinger and the sort of fighter pilot made famous by Tom Cruise in “Top Gun”—speaks to a rather complex complex in American culture’s manufacture of heroes: what is most desired is the anti- or asocial, laconic loner instantly recognizable by his self-sufficiency, his know-how, and his willingness to buck the system. But—and herein lies the complexity—the desire to buck the system is not to be confused with a desire to alter the fundamental characteristics of the system, which are ipso facto just and fair. Instead, the uniquely American hero seeks to buck the system in order to make it work as it ought to work. There is no need for a new ideal—it’s just a matter of making the already existing ideal more real. In politics this often takes the form of “Washington is broken, and I’m going to fix it.” During the summer just passed, for example, Ben Quayle, son of former Vice President Dan Quayle, sought to avail himself of this pose with a 30-second television advertisement for his House campaign that equated drug cartels in Mexico with “tax cartels in Washington, D.C.,” and spoke to what is always the most pressing need in the nation’s capital: “Someone,” said Quayle, “has to go to Washington and knock the hell out of the place.”
Of course, what is ignored in these formulations is the structural complex of interests and power that either creates systematic distortions or simply allows already inequitable systems to operate more efficiently. For the political man, this can be beside the point, for actual public welfare need not be his primary end.[1] For the political man, when the chips are down, it’s best to stand before the cameras and plead one’s fundamental suitability for the job. And it’s best to have a well-tailored costume from wardrobe—preferably one that doesn’t take too much trouble to put on.
Consider the postwar presidents. Faced with likely electoral defeat in 1948, Harry Truman, the plain-spoken haberdasher and product of the Tom Pendergast machine, railed against the “do-nothing” Congress and “gave ’em hell.” It worked, and he won. Chalk one up for the perennially loved little guy, against whom the deck is endlessly stacked by the very system over which Truman was running to continue presiding.
Eisenhower was a bona fide war hero; the general who oversaw D-Day and then ran the United States for eight years likely would have been able to run it longer had the recently ratified Twenty-second Amendment not gotten in the way. Kennedy was a war hero on a smaller scale, but also a matinee idol. Smoke, mirrors, electoral shenanigans, and a lot of cash got him elected, but a bizarre combination of media savvy, Olympian eloquence, actual intellect, and crafty patriotic evangelism kept him popular until his untimely death. The role he had actually played was elucidated posthumously, when his widow pointed to the Broadway musical “Camelot” as a way of understanding his administration. The Kennedys became “American Royalty.” For obvious reasons, kings and royalty typically don’t play well as American icons, but an exception was made for Kennedy and his family, perhaps because their pop kingdom took as its template not actual monarchism, but Arthurian legend as interpreted by the American entertainment industry.[2]
Ironically, there was nothing in wardrobe for Kennedy’s successor Johnson, the Texan for whom the cowboy hat would have seemed most fitting. His formidable legislative abilities were eventually overwhelmed by a war spiraling out of control, and instead of going before the cameras in costume, he appeared as himself one last time to announce that he would not seek another term.
Nixon relied more on the surreptitious winks and nods of those who knew better. Earlier in his life, he had pled his case as a put-upon family man in the “Checkers” speech that saved his career. In office, though, Nixon played the role of the club president. Precisely what club he was presiding over (country club, Elks Club, Knights of Columbus, White Citizens’ Council) was beside the point—the point was, most belonged, but many were excluded. The Quaker who’d come from nothing came to stand for the traditional values held by the majority, but held silently—as the ugliness inhering within them was better left unspoken. Ford had too little time to don any costume, and was quickly followed by Carter, a peanut farmer who emerged from nowhere (a plus) to ineffectually reap the unforgiving harvest of the policies sown by Kennedy, Johnson, and Nixon.
Reagan, the former actor, assumed the presidency in the role of a character that was one part Capra-esque little guy and ten parts cowboy. Fully prepared to alchemically bring morning to America by going to war on the social safety net, labor, and communism—which were all of a piece anyway—Reagan left a legacy that remains, for many, a matter of orthodox reverence to this day.
Reagan was followed by Bush, his former number two, whose goofy patrician manner was tolerated but not celebrated, and whose dearth of charm and second-rate technocratic arrogance made it impossible to recover when the economy went south, even after he’d enjoyed impossibly high poll numbers following the successful prosecution of an overseas war. Bush was followed by Clinton, who, like Carter, emerged from nowhere (“the man from Hope”), but who, unlike Carter, was a crafty and untrustworthy shapeshifter and carnival barker. With his opportunism, narcissism, quick wits, and uncanny ability to convincingly hit the right emotional note, Clinton found grace as America’s first afternoon talk-show-host President, which may explain his enduring popularity and the ease with which he is laughed at by the same populace for whom he remains so popular.
Young George Bush followed and, in circumstances highly conducive to any sort of heroic masquerade, got enough people to buy his combination regular guy/cowboy act for long enough to allow him to serve two terms characterized by negligence, recklessness, incompetence, and innumerable allegations of illegality. His act had worn thin by the time he left office with staggeringly low poll numbers, but the damage had been done.
The events that took place during Young George Bush’s administration set the stage for—indeed, almost necessitated that—someone come forth as a transformational leader. Not since 1980 (or even the 1930s) had the country been in such poor shape. Obama himself evoked Reagan during the Democratic primaries—a fact of which his opponent, Senator Hillary Clinton, tried to make hay. But there was no hay to be made. Obama’s point was taken—the opportunity of the moment called for a drastic change of course. Called, indeed, for Change, his campaign’s one-word mantra.
III
The special characteristics of a consultant are objective disinterest combined with expertise. The former is festooned with imputed benevolence by the client, and the latter is the sine qua non of the profession. The consultant’s visage bears a smile reminiscent of the sphinx’s—but unlike the sphinx, the consultant is the enemy of ambiguity. Deftly scratching you behind the ears as you solve the riddle he’s crafted, he guides you towards the answer you (and he) knew all along. No Diogenes provocatively bearing a lantern in daylight, the consultant carries with him a genie’s lamp whose inhabitant, released after proper handling, is you, the client equipped with the tools necessary to find the answer on your own.
Every person a stakeholder, all desiring the same outcome: the identification, pursuit, and realization of the right solution. Employing every tool in his arsenal—bringing everyone to the table, creating buy-in, forging consensus, managing change, crafting recommendations, serving as a facilitator—the consultant works tirelessly, trading in the most banal and necessary of revelations: common sense upon which we can all agree, issued in three-ring binders or spiral-bound reports.
A contributor to this magazine had the privilege a number of years ago of taking part in a three-day seminar introducing participants to the concept of “creative leadership.” The lead-up to the seminar was a luminous carnival of self-discovery that included Myers-Briggs testing, as well as detailed feedback from peers, managers, and direct reports. The anticipation of what was meant by creative leadership was exhilarating. That one finds certain aspects of one’s job frustrating seemed perfectly normal—but was there a key to professional life and conduct that would explain definitively why this was the case and what could be done to change it?
Sadly, the answer was no. The underlying assumption of the seminar was that the status quo both justified itself and required no justification: management would remain in place; roles and responsibilities would not fundamentally change; you would still be working with the same people; no one had to worry about being assessed on his or her level of competence; and cold, hard quantitative data—regarding employee turnover, the history of genuine career progression, the consequences of hiring or promotion decisions, financial results, or company performance—were not in the scope of analysis. The paradigm was to be discovered, accepted, possibly explored, rarely questioned, and never changed.
One’s dissatisfaction or aspirations for a more rewarding experience at work were the unfortunate consequence of having too many of one’s own ideas. This was illustrated to the contributor in a one-on-one session with a seminar facilitator. Using a pen-to-paper diagram of a simple journey between points A and B, the contributor learned that what appeared to be a straightforward endeavor could be fraught with many variables, including the speed at which the journey would take place, opinions that remaining at point A was perfectly acceptable, the introduction of an unforeseen point C into the mix, or the view that point B had already been reached. The contributor pushed the facilitator on, for example, the impossibility of a single entity being in three places at once, as well as the notion that the number of variables could be drastically reduced by a well-communicated directive from management containing a definition of what point B was, what the risks of the journey were and how they would be mitigated, and a fair, sensible, detailed argument as to why completing the journey within a finite time period was necessary, even in the presence of dissent.
This was met with a stare of bemusement, concern, and mild pity. Whether or not the contributor understood it, everyone was excellent in their own special way, and the journey could only be undertaken when all the variables were spoken to and accounted for. “Creative leadership” indeed sprang from one’s ability to suss out and perpetuate what already existed. The contentiousness of what lay outside the paradigm was a tell-tale indicator of its utter lack of merit and its likely damnation to failure.
In a sense, the consultant is always begging the question because, in a sense, he is always operating on the assumption that there is one end towards which everyone in an organization ought to strive—and, in a sense, he is usually right.
But how could he not be? A for-profit company exists to maximize profit and market share. A nonprofit or non-governmental organization exists to fulfill its mission. And in general, given the means available at any historical moment, the codification of organizational and procedural best practices is not merely a lazy reflection of the comfort of the herd mentality. “Thinking outside the box” is celebrated for the successes it spawns, but more often than not such attempts fall short and result in failures that remain cloaked in anonymity. New conditions and practices may well serve to establish new rubrics and constraints, but consultants are rarely actual organizational management revolutionaries—more often they exist to guide your hand towards the light switch that activates the fluorescent bulbs you’ve already installed.
IV
But a nation is not an organization whose ends are easily defined. The activities of even the most repressive nation are the result of pronounced and protracted contestations between disparate blocs and coalitions of power. Barack Obama’s promise to “change the culture of Washington” was based either (one could hope) on cynical rhetorical posturing or (one feared) on a naive denial of the structural principles governing the operation of power. (Ben Quayle’s commitment to “knock the hell out of the place” was merely a less polite, and just as empty, version of this promise uttered two years later.)
As the Obama Administration’s approach to governing began to take shape, it became apparent that the latter was the case—it was time for the consultant to roll up his sleeves and help the enterprise known as America find the answers to its operational woes.
First up was the financial crisis, the province of the specialists Tim Geithner and Larry Summers, who, with others, may have been complicit in the collapse of 2008, but who were also, as representatives of a certain technocratic element within the financial system, stakeholders worthy of a seat at the table—alongside the financial institutions themselves. Together, with taxpayer money, they’d save the system and get banks lending again.
Sadly, their plans were flawed. As James Kenneth Galbraith pointed out in a March 14, 2010 interview with this magazine:
Putting a charitable construction on things, and taking people at their word, I would say that Larry Summers, Tim Geithner, and the President took the view that you could not only stabilize but restart the economy through the banking system in a relatively short period of time. And I think judging from recent articles and profiles I have seen of Geithner in the press, that is still his view. The President was saying early on that the need was to get the banks lending again. Well, in 2009 lending went down by an order of 600 billion dollars from the previous year. I thought at the time that was a futile quest, that what the banks would do would be to go into a kind of hibernation where they would be taking all of the funds they could get from the Fed at zero percent—collect the funds for basically nothing—and they would be parking them as investments in government debt at three percent. And the difference would be sufficient to provide earnings for the banks, but it was completely unrealistic to expect that the banking system would be providing commercial and industrial loans to commercial real estate, which is in vast excess supply, residential housing, which is likewise in terrible shape—completely collapsed in values—and business enterprises, which don’t have any interest or ability to collateralize a loan. Those are problems that couldn’t be addressed by making exhortations to the banks. [3]
But it could have been worse, couldn’t it? There were no runs on banks, and high unemployment and a sluggish economy are far better than panic in the streets, higher unemployment, and a depression.
In any case, it was on to Afghanistan, and the protracted decision over whether or not to increase troop numbers. As described to The Straddler by a government consultant, the administration went about its lengthy decision-making process pursuant to principles not unfamiliar in the field. It convened a focus group of key stakeholders (among whom there were no Afghans) to secure buy-in on the process that they (the stakeholders) would feel empowered to effect as the actors responsible for executing the decision that was ultimately made (by them).
As the New York Times reported on December 9, 2009:
“The president welcomed a full range of opinions and invited contrary points of view,” Secretary of State Hillary Rodham Clinton said in an interview last month. “And I thought it was a very healthy experience because people took him up on it. And one thing we didn’t want—to have a decision made and then have somebody say, ‘Oh, by the way.’ No, come forward now or forever hold your peace.”[4]
In the end, Obama decided to send 30,000 troops, and the decision looked like it might not have been his. In any case, no one could say, “Oh, by the way,” even though they did. He also decided on a timeframe for withdrawal beginning 18 months hence, if certain less-than-clear conditions were met.
The Obama Consultancy’s vision of convening stakeholders was also evident during the multi-month oil leak in the Gulf of Mexico brought about by the explosion of one of BP’s rigs. The Consultancy’s initial response to the spill was noteworthy for its familiar lack of action. This was followed by an odd public debate over whether or not the president should express outrage, with some arguing that he should, and others saying that his doing so would not stop the leak.
On the question of what actual path to pursue to stanch the environmental catastrophe, a tenuous consensus developed that the equipment and technical expertise necessary to stop the leak were the unique property of BP. This, however, did not obviate a question, hardly entertained, about an obvious conflict. While one could reasonably assume that it was in BP’s interest to stop the leak as quickly as possible, one would also be correct to believe that it was also in BP’s interest—indeed, its legal responsibility to its shareholders—to continue to seek ways to increase profits, expand market share, and minimize exposure to liability. The latter was quite simply a matter of fiduciary duty.
As a matter of logic and administration, it is easy to see how these two interests had the potential to come into conflict. Wesley Warren, director of programs for the Natural Resources Defense Council, told CBS News’ “Sunday Morning” in early June: “The administration needs to make sure…that when BP cleans up its mess, that that’s not the same thing as cleaning up the scene of the crime.”[5]
Former Labor Secretary Robert Reich spoke lucidly to this point even earlier. In late May, he wrote:
It’s time for the federal government to put BP under temporary receivership, which gives the government authority to take over BP’s operations in the Gulf of Mexico until the gusher is stopped.
Why?
This is the only way the public [can] know what’s going on, be confident enough resources are being put to stopping the gusher, ensure BP’s strategy is correct, know the government has enough clout to force BP to use a different one if necessary, and be sure the President is ultimately in charge.
If the government can take over giant global insurer AIG and the auto giant General Motors and replace their CEOs, in order to keep them financially solvent, it should be able to put BP’s north American [sic] operations into temporary receivership in order to stop one of the worst environmental disasters in U.S. history.
The Obama administration keeps saying BP is in charge because BP has the equipment and expertise necessary to do what’s necessary. But under temporary receivership, BP would continue to have the equipment and expertise. The only difference: the firm would unambiguously be working in the public’s interest. As it is now, BP continues to be responsible primarily to its shareholders, not to the American public. As a result, the public continues to worry that a private for-profit corporation is responsible for stopping a public tragedy. [6]
But rather than even consider this bold, but perhaps necessary, route—or any other route that might have yielded better results than lackadaisical collaboration with the spill’s perpetrator—Obama chose a tepid path. It earned him grudging praise from the National Review’s Lou Dolinar,[7] who declared the crisis “over” on August 6th, and it left the Administration’s man on the scene, former Coast Guard Admiral Thad Allen (to whom, according to Dolinar, credit belonged on the government side), blaming the public on the 123rd day of the disaster for any difficulties the Administration encountered during the process:
“The current response model assumes the responsible party [i.e., BP] will work with the federal on-scene coordinator and local state entities to achieve unity of effort and effective spill response. It’s been challenging at times to create that unity of effort given sometimes what appears to be the rejection of the notion [by] the general public.”[8]
Odd that the general public should reject the “current response model.” Perhaps this was a failure of the Obama Consultancy’s current crisis communications management model?
It is also worth noting, in the context of Admiral Allen’s remarks, that, more than six weeks into the crisis, seemingly in response to calls for a demonstration of emotion over the events in the Gulf, President Obama said he was trying to determine “whose ass to kick.” Though one might have thought the answer to that query—again, more than six weeks into the leak—would have been rather clear, it appears as though the Administration’s answer, more than 17 weeks into the disaster, was rather traditional: that of the general public. (Pursuant to American tradition, the general public never stops having its ass handed to it.)[9]
V
Obama’s chosen methodology of governance lacks both the stupidity and the effectiveness of that of his immediate predecessor, George W. Bush, who made sure we knew that he was “the decider.” But Bush’s cynical claim to be a “uniter, not a divider,” a so-so bit of political sloganeering which spoke to precisely what he was not, seems to have crystallized into one of Consultant Obama’s operating tenets.
Even if one accepts and laments the premise that Barack Obama is a consultant masquerading as a president, another question remains: is Barack Obama, whatever his failings as a president, a good consultant?
A brief examination of the problem-solving methodology the Obama Consultancy applied to achieve tangible health care reform is instructive. There are more than enough assumptions in the current health care paradigm that deserve to be questioned, yet our consultant finds himself in an unfortunate pickle—hamstrung by the obligation to represent and mirror the thoughts of those at the table while making allowances for those who are not at the table: the public. These silent bystanders are audacious enough to expect a president to represent their interests, an impossibility when the president is a full-time consultant. There is also no questioning of the assumed right of those at the table to be at the table at all. The lack of accountability to the public enjoyed by the private organizations represented in the reform process has never precluded them from participating in the formulation of public policy and would certainly not preclude them from it now. Voters may want to “throw the bums out” every other November, but the CEOs of big pharma or tier-1 HMOs could not realistically be subjected to so brutal a culling as a regular, popular election.
Indeed, the precondition of participation was that everyone would leave the table satisfied. One can only imagine the scene when the appointed stakeholders arrived at the White House to peruse the agenda and achieve their objectives. They were welcomed and reminded of the momentousness of the occasion. The fragmented lobbying and back-handed, piecemeal, “I got mine” nonsense were things of the past. The Consultant-in-Chief sternly reminded the stakeholders, wayward and selfish though some of them may have been, that the inherent right to veto reform still belonged to all of them, but now came in conjunction with a new responsibility to participate in hammering out a solution.
What emerged was a suitably “tweaked” version of what already existed: stakeholders couldn’t be required to make do with a smaller piece of the considerable pie that provided them with taxpayer absorption of their business risk by way of R&D subsidies, tax holidays, barriers to entry, and the absence of any invasive government limitation of the liabilities for healthcare-related goods and services still to be borne by the public. The solution was to increase the magnitude of said pie with additional insurance premiums and the superficially unambiguous discontinuation of many of the uglier characteristics of the system up to that point. The consultant could go to his project sponsors with a list of results. Radical change? No. Change we can believe in? Absolutely.
Stakeholders walked away from the table having achieved the objective of satisfaction, knowing full well that their continuing status as stakeholders—indeed, the only stakeholders—was not in question. Stakeholders got theirs, and the public’s great prize—an unequivocal public mandate for care—could never get on the agenda in the first place, for fear that it was simply asking too much. You could even rely on more sympathetic members of the press to politely chide Obama’s crankier critics:
The 111th Congress returned to Washington this week with a record of legislative achievement that rivals President Lyndon Johnson’s “Great Society.” Voters may show their thanks by throwing lawmakers out of office.
Encouraged by Barack Obama, a new president from their own party, the Democrat-led House of Representatives and Senate provided health-care coverage for another 32 million Americans, offering coverage for 95 percent of U.S. residents.[10]
Magnanimous though this “offering” is, uncomfortable questions needed to remain off the table as well: how would these 32 million Americans fund this coverage, given that their exclusion from the current system most likely resulted from their inability to do so? And what was to be made of the unfortunate and extreme case of the remaining, and obviously most vulnerable, five percent? Now was not the time for a fundamental overhaul of a system that was no doubt in place for a very good reason.
But even more illustrative, in its distilled essence, of Barack Obama’s competence as a consultant was his approach to the controversy surrounding the building of a mosque in downtown Manhattan within blocks of the site of the 2001 terrorist attacks on New York City.
Proponents of the mosque’s construction, including Mayor Bloomberg, contended that allowing it to be built was in harmony with American principles of tolerance. Opponents contended it was disrespectful to the memory of the people who had died in the terrorist attacks; heated claims were made that it glorified the terrorists who had taken so many lives.
On Friday, August 13th, President Obama weighed in on the controversy:
As a citizen and as president, I believe that Muslims have the same right to practice their religion as anyone else in this country. That includes the right to build a place of worship and a community center on private property in lower Manhattan, in accordance with local laws and ordinances.[11]
On Saturday, August 14th, President Obama expanded on this statement after an event touting Gulf Coast recovery in Panama City, Florida:
I was not commenting and I will not comment on the wisdom of making the decision to put a mosque there. I was commenting very specifically on the right people have that dates back to our founding. That’s what our country is about. And I think it’s very important as difficult as some of these issues are that we stay focused on who we are as a people and what our values are all about.[12]
Fair enough, but what profits a man to invoke the principles of the world but lose his point, if he ever had one? What kind of consultant is this?
As a bit of rhetoric ripe for analysis, the statement has an elegance that cannot be denied. As logic, simply notated, it resembles a C major scale. It is the Pied Piper leading those who follow from where they began to where they began.
But as a matter of consultant intervention, it lacks wisdom or effectiveness. By invoking and elucidating a national operating principle protecting certain activities, but then explicitly refusing to offer an opinion on the wisdom of engaging in those activities, Obama leaves out the consultant’s tell, the answer towards which he’s guiding you, even though it’s supposed to be the answer you knew all along.
VI
It is perhaps unfair to suggest that Mr. Obama’s temperament and administrative style would have been better suited to an institution that valued his arrant combination of methodical intelligence, deference to pre-existing formations of power, and caution. But it certainly is not unreasonable to suggest that Mr. Obama’s unique talents could have been more effectively utilized in a more forgiving field—say, as a professor of anything at almost any university, or at one of the investment banks. Or perhaps as a member of a consultancy specializing in change management you can believe in: all on a normalized timetable; all with everyone on the same page; all towards an end upon which everyone has agreed; unburdened of the necessity of administering contingency; freed of the need to advance an agenda with specific winners and losers, or to make decisions in the face of, at the very best, partially mitigated uncertainty.
But, as it is, the nation, which needed so much more, has naught but the Obama Consultancy. It could have been worse. But considered in the context of the dynamic constraints on action that exist at any historical moment, and considered too in light of the fact that dramatic events often rapidly alter the character of these constraints, one could be forgiven for being disappointed. It is a truism that politics, especially reform politics, is the art of the possible; but what is possible does not change according to a predictable timeframe. Financial reform was totally impossible between 1980 and 2008, unless it meant, for instance, repealing the Glass-Steagall Act in 1999 and allowing the largest financial institutions once again to become the sclerotic, unmanageable high-rollers they were prior to 1929. In 2008 it became possible, and Obama chose a timid route of gathering stakeholders (narrowly defined) and seeking a “not fish, not meat” policy that would get everyone working towards a common goal they never shared in the first place. (That the solution could never be “meat” is a perfectly logical assumption when the consultant is tasked with forging agreement on the Thanksgiving dinner menu with a table full of gobbling turkeys.) When Wall Street was bailed out and didn’t show its gratitude by lending to consumers or ceasing its dubious activities, Obama was disappointed. He was angry. He had not run for office to “bail out a bunch of fat cat bankers.”[13] But a paragraph from a New York Magazine article on Obama and Wall Street makes the power calculation very clear:
[O]ne of the city’s most successful hedge-fund hotshots offers a different surmise: “The majority of Wall Street thinks, ‘Hey, you lent us money. We did a trade. We paid you back. When you had me down, you could have crushed me, you could have done whatever you wanted. You didn’t do it! So stop your bitching and stop telling me I owe you, because I already paid you everything! The fact that I’m making money now is because I’m smarter than you!’”[14]
Indeed. Because that’s how power works, Mr. Obama’s models aside.
Whither Obama next? Well, that’s up to the stakeholders, however identified, who are brought to the table to help suss out that decision. In the interests of achieving better outcomes, is it perhaps time for the President, should he decide to persist in his role as consultant, to start defining “stakeholder” down so that it begins to allow at least for the inclusion of the citizenry and its allies—a humble place at the table for those with the most at stake?[15]