Sunday, August 31, 2014

The 1 percent’s long con: Jim Cramer, the Tea Party’s roots, and Wall Street’s demented, decades-long scheme

Wall Street has the most to lose from real democracy. Which is why they posture as rebels and not The Man
Happy Labor Day. A few years ago, Eric Cantor used this holiday as one more occasion to celebrate business owners. To a lot of people, that sounded crazy. But in truth, it came straight out of the bull market ideology of the 1990s, a time when the nation came to believe that trading stocks was something that people in small towns did better than slicksters in New York, and when Wired magazine declared, in one of its many frenzied manifestoes, that “The rich, the former leisure class, are becoming the new overworked” and that “those who used to be considered the working class are becoming the new leisure class.” We were living in a “New Economy,” Americans said back then, and the most fundamental novelty of the age was an idea: that markets were the truest expression of the will of the people. Of course the Beardstown Ladies were better at investing than the Wall Street pros; they were closer to the humble populist essence of markets. Of course the Millionaire Next Door was an average Joe who never showed off; that’s the kind of person on whom markets smile. And of course bosses were the new labor movement, leading us in the march toward a luminous economic democracy. Ugh. I got so sick of the stuff that I wrote a whole book on it: "One Market Under God," which was published by Doubleday just as the whole thing came to a crashing end. Here is an excerpt. 
The Dow Jones Industrial Average crossed the 10,000 mark in March of 1999, a figure so incomprehensibly great that it was anyone’s guess what it signified. The leaders of American opinion reacted as though we had achieved some heroic national goal, as though, through some colossal feat of collective optimism, we had entered at long last into the promised land of riches for all. On television, the rounds of triumphal self-congratulation paused for a nasty rebuke to the very idea of financial authority brought to you by the online brokerage E*Trade, a company that had prospered as magnificently as anyone from the record-breaking run-up: “Your investments helped pay for this dream house,” declared a snide voice-over. “Unfortunately, it belongs to your broker.” And behold: There was the scoundrel himself, dressed in a fine suit and climbing out of a Rolls Royce with a haughty-looking woman on his arm. Go ahead and believe it, this sponsor cajoled: Wall Street is just as corrupt, as elitist, and as contemptuous toward its clients as you’ve always suspected. There should be no intermediaries between you and the national ATM machine in downtown Manhattan. You needed to plug yourself in directly to the source of the millions, invert the hierarchy of financial authority once and for all. “Now the power is in your hands.”
In the rival series of investment fairy tales broadcast by the Discover online brokerage (a curious corporate hybrid of Sears and J. P. Morgan) a cast of rude, dismissive executives, yawning and scowling, were getting well-deserved payback at the hands of an array of humble everymen. Again the tables of traditional workplace authority were rudely overturned by the miracle of online investing: The tow-truck drivers, hippies, grandmas, and bartenders to whom the hateful company men had condescended were revealed to be Midases in disguise who, with a little help from the Discover system, now owned their own countries, sailed yachts, hobnobbed with royalty, and performed corporate buyouts—all while clinging to their humble, unpretentious ways and appearances just for fun. And oh, how the man in the suit would squirm as his social order was turned upside down!

In the commercials for his online brokerage, Charles Schwab appeared in honest black and white, informing viewers in his down-home way how his online brokerage service worked, how it cut through the usual Wall Street song and dance, how you could now look up information from your own home. “It’s the final step in demystification,” he said. “This internet stuff is about freedom. You’re in control.” To illustrate the point other Schwab commercials paraded before viewers a cast of regular people (their names were given as “Howard,” “Rick,” and “Marion”) who shared, in what looked like documentary footage, their matter-of-fact relationship with the market—the ways they used Schwab-dot-com to follow prices, how they bought on the dips, how they now performed all sorts of once-arcane financial operations completely on their own. The stock market was about Rick and Marion, not Charlie Schwab.
In another of the great stock market parables of that golden year, the Ricks and Marions of the world were imagined in a far more insurgent light. Now the common people were shown smashing their way into the stock exchange, breaking down its pretentious doors, pouring through its marble corridors, smashing the glass in the visitors’ gallery windows and sending a rain of shards down on the money changers in the pit—all to an insurgent Worldbeat tune. As it turned out, this glimpse of the People Triumphant in revolution—surely one of the only times, in that century of red-hunting and labor-warring, that Americans had ever been asked by a broadcasting network to understand such imagery as a positive thing—was brought to you by Datek, still another online trading house. What the people were overthrowing was not capitalism itself but merely the senseless “wall” that the voice-over claimed always “stood between you and serious trading.”
Exactly! As the century spun to an end, more and more of the market’s biggest thinkers agreed that “revolution” was precisely what was going on here. Thus it occurred to the owners of Individual Investor magazine to send gangs of costumed guerrillas, dressed in berets and armbands, around Manhattan to pass out copies of an “Investment Manifesto” hailing the “inalienable right” of “every man and woman . . . to make money—and lots of it.”
Meanwhile, the National Association of Real Estate Investment Trusts ran ads in print and on TV in which a casually dressed father and his young son capered around the towering office blocks of a big city downtown. “Do we own all this, Dad?” queried the tot. “In a way we do,” answered his father. This land is their land—not because they have bought it outright, like Al, the country-owning tow-truck driver in the Discover spots, but in a more intangible, populist, Woody Guthrie sort of way: Because they have invested in REITs.
Not to be outdone by such heavy-handed deployment of 1930s-style imagery, J. P. Morgan, the very personification of Wall Street’s former power and arrogance, filled its ads with hyper-realistic black and white close-ups of its employees, many of them visibly non-white or non-male. Literally putting a face on the secretive WASP redoubt of financial legend, the ads reached to establish that Morgan brokers, like Schwab brokers, were people of profound humility. “I will take my clients seriously,” read one. “And myself, less so.” The ads even gave the names and e-mail addresses of the Morgan employees in question, a remarkable move for a firm whose principal had once been so uninterested in serving members of the general public that he boasted to Congress that he didn’t even put the company’s name on its outside door.
Faced with this surprisingly universal embrace of its original populist campaign against Wall Street, E*trade tried to push it even farther: The changes in American investing habits that had brought it such success were in fact nothing less than a social “revolution,” an uprising comparable to the civil rights and feminist movements. In its 1999 annual report, entitled “From One Revolution To the Next,” E*trade used photos of black passengers sitting in the back of a bus (“1964: They Said Equality Was Only For Some of Us”) and pre-emancipated white women sitting in the hilarious hairdryers of the 1960s (“1973: They Said Women Would Never Break Through the Glass Ceiling”) to establish E*Trade itself as the rightful inheritor of the spirit of “revolution.” The brokerage firm made it clear that the enemy to be overthrown on its sector of the front was social class: Next to a photo of a man in a suit and a row of columns, a page of text proclaimed, “They said there are ‘the haves’ and the ‘have-nots.’” But E*trade, that socialist of the stock exchange, was changing all that: “In the 21st century it’s about leveling the playing field and democratizing individual personal financial services.” The company’s CEO concluded this exercise in radicalism with this funky rallying cry: “Bodacious! The revolution continues.”
Whatever mysterious forces were propelling the market in that witheringly hot summer of 1999, the crafters of its public facade seemed to agree that what was really happening was the arrival, at long last, of economic democracy. While the world of finance had once been a stronghold of WASP privilege, an engine of elite enrichment, journalist and PR-man alike agreed that it had been transformed utterly, been opened to all. This bull market was the Götterdämmerung of the ruling class, the final victory of the common people over their former overlords. Sometimes this “democratization” was spoken of as a sort of social uprising, a final victory of the common people over the snobbish, old-guard culture of Wall Street. Sometimes it was said to be the market itself that had worked these great changes, that had humiliated the suits, that handed out whole islands to mechanics, that had permitted little old ladies to cavort with kings. And sometimes “democratization” was described as a demographic phenomenon, a reflection of the large percentage of the nation’s population that was now entrusting their savings to the market.
However they framed the idea, Wall Street had good reason to understand public participation as a form of democracy. As the symbol and the actual center of American capitalism, the financial industry has both the most to lose from a resurgence of anti-business sentiment and the most to gain from the ideological victory of market populism. For a hundred years the financial industry had been the chief villain in the imagination of populist reformers of all kinds; for sixty years now banks, brokers, and exchanges have labored at least partially under the regulations those earlier populists proposed. And Wall Street has never forgotten the melodrama of crash, arrogance, and New Deal anger that gave birth to those regulations. To this day Wall Street leaders see the possibility of a revived New Deal spirit around every corner; they fight not merely to keep the interfering liberals out of power, but to keep order in their own house, to ensure that the public relations cataclysm of 1929-32 is never repeated. This is why so much of the bull market culture of the Nineties reads like a long gloss on the experience of the 1930s, like a running battle with the memory of the Depression.
Take the stagnant-to-declining real wages of American workers, for example. A central principle of “New Economy” thought is that growth and productivity gains have been severed from wage increases and handed over instead to top management and shareholders. Since the redistributionist policies of “big government” are now as impermissible as union organizing, stocks of necessity have become the sole legitimate avenue for the redistribution of wealth. In other eras such an arrangement would have seemed an obvious earmark of a badly malfunctioning economic system, a system designed to funnel everything into the pockets of the already wealthy, since that’s who owns most of the stock. After all, workers can hardly be expected to buy shares if they can’t afford them.
But toss the idea of an ongoing financial “democratization” into the mix, and presto: Now the lopsided transformation of productivity gains into shareholder value is an earmark of fairness—because those shareholders are us! Sure, workers here and there are going down, but others, through the miracle of stocks, are on their way up.
Furthermore, ownership of stock among workers themselves, an ideologue might assert, more than made up for the decade’s stagnant wages. What capital took away with one hand, it was reasoned, it gave back with the other—and with interest.
This idea of stock prices compensating for lost or stagnant wages had long been a favorite ideological hobbyhorse of the corporate right, implying as it did that wealth was created not on the factory floor but on Wall Street and that workers only shared in it by the grace of their options-granting CEO. What was different in the 1990s was that, as the Nasdaq proceeded from triumph to triumph, economists and politicians of both parties came around to this curious notion, imagining that we had somehow wandered into a sort of free-market magic kingdom, where the ever-ascending Dow could be relied upon to solve just about any social problem. Now we could have it all: We could slash away at the welfare state, hobble the unions, downsize the workforce, and send the factories overseas—and no one got hurt!
Naturally the idea was first rolled out for public viewing in the aftermath of a serious public relations crisis for Wall Street. One fine day in January, 1996, AT&T announced it was cutting 40,000 white-collar jobs from its workforce; in response Wall Street turned cartwheels of joy, sending the company’s price north and personally enriching the company’s CEO by some $5 million. The connection of the two events was impossible to overlook, as was its meaning: What’s bad for workers is good for Wall Street. Within days the company was up to its neck in Old Economy-style vituperation from press and politicians alike. Then a golden voice rang through the din, promoting a simple and “purely capitalist” solution to “this heartless cycle”: “Let Them Eat Stocks,” proclaimed one James Cramer from the cover of The New Republic. “Just give the laid-off employees stock options,” advised Cramer, a hedge fund manager by trade who in his spare time dispensed investment advice on TV and in magazines, and “let them participate in the stock appreciation that their firings caused.” There was, of course, no question as to whether AT&T was in the right in what it had done: “the need to be competitive” justified all. It’s just that such brusque doings opened the door to cranks and naysayers who could potentially make things hot for Wall Street. Buttressing his argument with some neat numbers proving that, given enough options, the downsized could soon be—yes—millionaires, Cramer foresaw huge benefits to all in the form of bitterness abatement and government intervention avoidance. He also noted that no company then offered such a “stock option severance plan.” But the principle was the thing, and in principle one could not hold the stock market responsible; in principle the interests of all parties concerned could be fairly met without recourse to such market-hostile tools as government or unions.
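The “neat numbers” behind Cramer’s proposal are easy to reconstruct in spirit, if not in detail. A minimal sketch of the options arithmetic he invoked (the grant size, strike price, and growth rate below are invented for illustration; they are not Cramer’s actual figures):

```python
# Hypothetical illustration of the "Let Them Eat Stocks" arithmetic.
# All figures here are invented for illustration; the article does not
# reproduce Cramer's actual numbers.

def option_payoff(n_options, strike, price_at_exercise):
    """Value of stock options at exercise (ignores taxes and vesting)."""
    return n_options * max(price_at_exercise - strike, 0.0)

n_options = 10_000      # hypothetical severance grant
strike = 50.0           # hypothetical strike = share price at grant
annual_return = 0.30    # a bull-market-era growth assumption
years = 10

price_later = strike * (1 + annual_return) ** years  # roughly $689 a share
payoff = option_payoff(n_options, strike, price_later)

# Under these generous assumptions the laid-off worker is a millionaire...
print(f"Payoff after {years} years: ${payoff:,.0f}")
# ...but if the stock merely stagnates, the grant is worth nothing.
print(f"Payoff if the stock is flat: ${option_payoff(n_options, strike, strike):,.0f}")
```

The catch, of course, is the growth assumption: the same grant is worth exactly nothing if the stock goes sideways, which is part of what made options such a convenient substitute for wages.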
And in ideology all one requires is principle. Thus it turned out to be a short walk indeed from Cramer’s modest proposal to a generalized belief in the possibility of real social redress through stocks. After all, since anyone can buy stocks, we had only ourselves to blame if we didn’t share in the joy. The argument was an extremely flexible one, capable of materializing in nearly any circumstance. In a November, 1999 think-piece addressing the problem of union workers angered by international trade agreements, a  New York Times writer found that they suffered from “confusion” since even as they protested, their 401(k)s were “spiking upward” due to “ever-freer trade.” To Lester Thurow, the answer to massive and growing inequality was not to do some kind of redistribution or reorganization but to “widen the skill base” so that anyone could “work for entrepreneurial companies” and thus have access to stock options. For lesser bull market rhapsodists the difference between “could” and “is” simply disappeared in the blissful haze. Egalitarian options were peeking out of every pocket. The cover of the July, 1999 issue of Money carried a photo of a long line of diverse, smiling workers—a familiar populist archetype—under the caption, “The employees of Actuate all get valuable stock options.” Inside, the magazine enthused about how options “are winding up in the shirt pockets of employees with blue collars, plaid collars and, increasingly, no collars at all.”
By decade’s end the myth of the wage/stock tradeoff was so widely accepted that its truest believers were able to present it as a historical principle, as our final pay-off for enduring all those years of deindustrialization and downsizing. In a January, 2000 Wall Street Journal feature story on how the good times were filtering down to the heartland folks of Akron, Ohio—a rust belt town that had been hit hard by the capital flight of the Seventies and Eighties—the soaring stock market was asserted to have gone “a long way in supplanting the insecurity of the 1980s, when the whole notion of employment for life was shattered, with something else: a sense of well-being.” Yes, their factories had closed—but just look at them now! The Journal found a blue-collar Akron resident who played golf! And an entrepreneur who drove a Mercedes! Who needed government when they had options?
The actual effect of widespread use of stock options in lieu of wages, of course, was the opposite. Options did not bring about some sort of New Economy egalitarianism; they were in fact one of the greatest causes of the ever widening income gap. It was options that inflated the take home pay of CEOs to a staggering 419 times what their average line-worker made; it was options that unleashed the torrent of downsizing, outsourcing, and union busting. When options were given out to employees—a common enough practice in Silicon Valley by decade’s end—they often came in lieu of wages, thus permitting firms to conceal their payroll expenses. In any case, the growth of 401(k)s, even in the best of markets, could hardly be enough to compensate for declining wages, and it was small comfort indeed for those whose downsizing-induced problems came at age 25, or 35, or 45.
Options were a tool of wealth concentration, a bridge straight to the Nineteenth century.
And yet the fans of the bull market found it next to impossible to talk about options in this way. Only one interpretation, one explanatory framework seemed to be permissible when speaking of investing or finance—the onward march of democracy. Anything could be made to fit: The popularity of day trading, the growth of the mutual fund industry, the demise of Barings bank, the destruction of the Thai currency. The bubble being blown on Wall Street was an ideological one as much as it was anything else, with succeeding interpretations constantly heightening the rhetoric of populist glory. It was an “Investing Revolution!” It was all about “empowerment”!
And there were incredible prizes to be won as long as the bubble continued to swell, as long as the fiction of Wall Street as an alternative to democratic government became more and more plausible. Maybe the Glass-Steagall act could finally be repealed; maybe the SEC could finally be grounded; maybe antitrust could finally be halted. And, most enticingly of all, maybe Social Security could finally be “privatized” in accordance with the right-wing fantasy of long standing. True, it would be a staggering historical reversal for Democrats to consider such a scheme, but actually seeing it through would require an even more substantial change of image on Wall Street’s part. The financiers would have to convince the nation that they were worthy of the charge, that they were as public-minded and as considerate of the little fellow as Franklin Roosevelt himself had been. Although one mutual fund company actually attempted this directly—showing footage of FDR signing the Social Security Act in 1935 and proclaiming, “Today, we’re picking up where he left off”—most chose a warmer, vaguer route, showing us heroic tableaux of hardy midwesterners buying and holding amidst the Nebraska corn, of World War II vets day-trading from their suburban rec-rooms, of athletes talking like insiders, of church ladies phoning in their questions for the commentator on CNBC; of mom and pop posting their very own fire-breathing defenses of Microsoft on the boards at Raging Bull. This was a boom driven by democracy itself, a boom of infinite possibilities, a boom that could never end.
Excerpted with permission from “One Market Under God” (Doubleday Books).
Thomas Frank Thomas Frank is a Salon politics and culture columnist. His many books include "What's The Matter With Kansas," "Pity the Billionaire" and "One Market Under God." He is the founding editor of The Baffler magazine.

Friday, August 29, 2014

Naked Capitalism on the Fraud of Bank Regulation


Gillian Tett’s Astonishing Defense of Bank Misconduct

Posted on August 29, 2014

I don’t know what became of the Gillian Tett who provided prescient coverage of the financial markets, and in particular the importance and danger of CDOs, from 2005 through 2008. But since she was promoted to assistant editor, the present incarnation of Gillian Tett bears precious little resemblance to her pre-crisis version. Tett has increasingly used her hard-won brand equity to defend noxious causes, like austerity and the special pleadings of the banking elite.
Today’s column, “Regulatory revenge risks scaring investors away,” is a vivid example of Tett’s professional devolution.
The twofer in the headline represents the article fairly well. First, it takes the position that chronically captured bank regulators, when they show an uncharacteristic bit of spine, are motivated by emotion, namely spite, and thus are being unduly punitive. Second, those meanie regulators are scaring off investors. It goes without saying that that is a bad outcome, since we need to keep our bloated, predatory banking system just the way it is. More costly capital would interfere with its institutionalized looting.
In other words, the construction of the article is to depict banks as victims and the punishments as excessive. Huh? The banks engaged in repeated, institutionalized, large scale frauds. If they had complied with regulations and their own contracts, they would not be in trouble. But Tett would have us believe the regulators are behaving vindictively. In fact, the banks engaged in bad conduct. To the extent that the regulators are at fault, it is for imposing way too little in the way of punishment, way too late.
As anyone who has been following this beat, including Tett, surely knows, adequate penalties for large bank misdeeds would wipe them out. For instance, as many, including your humble blogger, pointed out in 2010 and 2011, bank liability for the failure to transfer mortgages in the contractually-stipulated manner to securitization trusts was alone a huge multiple of bank equity. So not surprisingly, as it became clear that mortgage securitization agreements were rigid (meaning the usual legal remedy of writing waivers wouldn’t fix these problems) and more and more cases were grinding their way through court, the Administration woke up and pushed through the second bank bailout, otherwise known as the National Mortgage Settlement (which included 49 state attorney general settlements) of 2012.
Similarly, Andrew Haldane, then the executive director of financial stability for the Bank of England, pointed out that banks couldn’t begin to pay for the damage they did. In a widely-cited 2010 paper, Haldane compared the banking industry to the auto industry, in that both produce pollutants: for cars, exhaust fumes; for banks, systemic risk. Remember that economic theory treats costs like pollution that are imposed on innocent bystanders to commercial activity as “externalities”. The remedy is to find a way to make the polluter and his customer bear the true costs of their transactions. From Haldane’s quick and dirty calculation of the real cost of the crisis (emphasis ours):
….these losses are multiples of the static costs, lying anywhere between one and five times annual GDP. Put in money terms, that is an output loss equivalent to between $60 trillion and $200 trillion for the world economy and between £1.8 trillion and £7.4 trillion for the UK. As Nobel-prize winning physicist Richard Feynman observed, to call these numbers “astronomical” would be to do astronomy a disservice: there are only hundreds of billions of stars in the galaxy. “Economical” might be a better description.
It is clear that banks would not have deep enough pockets to foot this bill. Assuming that a crisis occurs every 20 years, the systemic levy needed to recoup these crisis costs would be in excess of $1.5 trillion per year. The total market capitalisation of the largest global banks is currently only around $1.2 trillion. Fully internalising the output costs of financial crises would risk putting banks on the same trajectory as the dinosaurs, with the levy playing the role of the meteorite.
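Haldane’s annualization is simple back-of-envelope arithmetic. A minimal sketch using only the dollar figures from the quoted passage (the straight division by a 20-year crisis interval is a simplification, which is why it lands well above his more conservative $1.5 trillion floor):

```python
# Back-of-envelope annualization of the crisis-cost figures Haldane cites.
# All dollar amounts come from the quoted passage; the simple division by
# a 20-year crisis interval is a crude version of his calculation.

TRILLION = 1e12

# Estimated output loss from a systemic crisis (world economy).
loss_low, loss_high = 60 * TRILLION, 200 * TRILLION

# Assumed frequency: one systemic crisis every 20 years.
crisis_interval_years = 20

# Annual levy needed to recoup the loss over one crisis cycle.
levy_low = loss_low / crisis_interval_years    # about $3 trillion per year
levy_high = loss_high / crisis_interval_years  # about $10 trillion per year

# Total market capitalisation of the largest global banks (per Haldane).
bank_market_cap = 1.2 * TRILLION

# Even the low-end levy is a multiple of the banks' entire market value.
print(f"Levy range: ${levy_low/TRILLION:.0f}-${levy_high/TRILLION:.0f} trillion/year")
print(f"Low-end levy vs. bank market cap: {levy_low / bank_market_cap:.1f}x")
```

Even the low-end annualized levy is two and a half times the entire market capitalisation of the largest global banks, which is the whole point of Haldane’s dinosaur metaphor.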
Contrast Haldane’s estimate of what an adequate levy would amount to with how Tett’s article depicts vastly smaller amounts as an outrage:
A couple of years ago Roger McCormick, a law professor at London School of Economics and Political Science, assembled a team of researchers to track the penalties being imposed on the 10 largest western banks, to see how finance was evolving after the 2008 crisis.
He initially thought this might be a minor, one-off project. He was wrong. Last month his project team published its second report on post-crisis penalties, which showed that by late 2013 the top 10 banks had paid an astonishing £100bn in fines since 2008, for misbehaviour such as money laundering, rate-rigging, sanctions-busting and mis-selling subprime mortgages and bonds during the credit bubble. Bank of America headed this league of shame: it had paid £39bn by the end of 2013 for its transgressions.
When the 2014 data are compiled, the total penalties will probably have risen towards £200bn. Just last week Bank of America announced yet another settlement with regulators over the subprime scandals, worth $16.9bn. JPMorgan and Citi respectively have recently settled with different US government bodies for mortgage transgressions to the tune of $13bn and $7bn.
Yves here. Keep in mind that these settlement figures are inflated, since they use the headline value, and fail to back out the non-cash portions (which are generally worth little, or in some cases are rewarding banks for costs imposed on third parties) as well as tax breaks.
In an amusing bit of synchronicity, earlier this week Georgetown Law professor Adam Levitin also looked at mortgage settlements alone and came up with figures similar to the ones that have McCormick running to the banks’ defense. But Levitin deems the totals to be paltry:
There’s actually been quite a lot of settlements covering a fair amount of money. (Not all of it is real money, of course, but the notionals add up).
By my counting, there have been some $94.6 billion in settlements announced or proposed to date dealing with mortgages and MBS….In other words, what I’m trying to cover are settlements for fraud and breach of contract against investors/insurers of MBS and buyers of mortgages.
Settlements aren’t the same as litigation wins, and I don’t know the strength of the parties’ positions in detail in many of these cases, but $94.6 billion strikes me as rather low for a total settlement figure.
And that is the issue that Tett tries to finesse. The comparison that she and McCormick make on behalf of the banks is presumably relative to their ability to pay, when the proper benchmark is whether the punishment is adequate given the harm done.
In fact, despite McCormick’s and the banks’ cavilling, investors understand fully that these supposedly tough settlements continue to be screaming bargains. Virtually every time a recent settlement has been announced, the stock price of the bank in question has risen, including after the supposedly big and nasty $16.6 billion latest Bank of America settlement (which, par for the course, was only $9 billion in real money). The Charlotte bank’s stock traded up 4% after that deal was made public. So if investors are pleased with these pacts, what’s the beef?
The complaint, in so many words, is that these sanctions are capricious. Tett again:

“The numbers are getting bigger and bigger,” observes Prof McCormick, who has been so startled by this trend that last month he decided to turn his penalty-tracking pilot project into a full-blown, independent centre. A former leading European regulator says: “What is happening now is astonishing. If you had asked regulators a few years ago to predict how big the post-crisis penalties might be, our predictions would have been wrong – by digits.”
Now the article does list some abuses, such as the Libor scandal, that were exposed after the crisis. That goes double for chain of title abuses, which suddenly exploded into the media’s, and therefore regulators’, attention in the fall of 2010. That means the reason the penalties have kept clocking up is that, in the absence of large scale systematic investigations in the wake of the crisis, regulators are dealing with abuses that came to their attention after the “rescue the banks at all costs” phase. Those violations are just too visible for the officialdom to give the banks a free pass, particularly since the public is correctly resentful that no one suffered much, if at all, for crisis-related abuses.
So the banks’ unhappiness seems to result from the fact that having been bailed out twice by the authorities (once in the crisis proper, a second time via the “get out of jail almost free” card of the Federal/state mortgage settlements of 2012), the financiers thought they were home free. They are now offended that they are being made to ante up for some crisis misconduct as well as additional misdeeds. Yet Tett tries to depict the regulators as still dealing with the rabbit of 2008 bad deeds moving through the banking anaconda, when a look at JP Morgan’s rap sheet shows a panoply of violations, only some of which relate to the crisis (as in resulting from pre-crisis mortgage lending or related mortgage backed securities and CDOs):
Bank Secrecy Act violations;
Money laundering for drug cartels;
Violations of sanction orders against Cuba, Iran, Sudan, and former Liberian strongman Charles Taylor;
Violations related to the Vatican Bank scandal (get on this, Pope Francis!);
Violations of the Commodities Exchange Act;
Failure to segregate customer funds (including one CFTC case where the bank failed to segregate $725 million of its own money from a $9.6 billion account) in the US and UK;
Knowingly executing fictitious trades where the customer, with full knowledge of the bank, was on both sides of the deal;
Various SEC enforcement actions for misrepresentations of CDOs and mortgage-backed securities;
The AG settlement on foreclosure fraud;
The OCC settlement on foreclosure fraud;
Violations of the Servicemembers Civil Relief Act;
Illegal flood insurance commissions;
Fraudulent sale of unregistered securities;
Auto-finance ripoffs;
Illegal increases of overdraft penalties;
Violations of federal ERISA laws as well as those of the state of New York;
Municipal bond market manipulations and acts of bid-rigging, including violations of the Sherman Anti-Trust Act;
Filing of unverified affidavits for credit card debt collections (“as a result of internal control failures that sound eerily similar to the industry’s mortgage servicing failures and foreclosure abuses”);
Energy market manipulation that triggered FERC lawsuits;
“Artificial market making” at Japanese affiliates;
Shifting trading losses on a currency trade to a customer account;
Fraudulent sales of derivatives to the city of Milan, Italy;
Obstruction of justice (including refusing the release of documents in the Bernie Madoff case as well as the case of Peregrine Financial).
Finally, let’s dispatch the worry about those poor banks having to pay more to get capital from investors. If this were actually true, it would be an extremely desirable outcome, for it would help shrink an oversized, overpaid sector.
However, the Fed and FDIC earlier this month, in an embarrassing about-face, admitted that the “living wills” the banks submitted were a joke, meaning that the major banks can’t be resolved if they start to founder. We’ve said for years that the orderly liquidation authority envisioned by Dodd-Frank is unworkable. And we weren’t alone in saying that; the Bank for International Settlements and the Institute of International Finance agreed.
The implication, which investors understand full well, is that “too big to fail” is far from solved, and taxpayers are still on the hook for any megabank blowups. As Boston College professor Ed Kane pointed out in Congressional testimony last month, and Simon Johnson wrote in Project Syndicate earlier this week, that means that systemically important banks continue to receive substantial subsidies.
Yet Tett would have you believe that banks are suffering because investors see them as bearing too much litigation/regulatory risk. If that were true, Bank of America, the most exposed bank, would have cleaned up its servicing years ago.
It’s clear that banks and investors regard the risk of getting caught as not that great, and correctly recognize the damage, even when they are fined, as a mere cost of doing business. It is a no-brainer that their TBTF status assures that no punishment will ever be allowed to rise to a level that would seriously threaten these institutions. Everyone, including Tett, understands that this is all kabuki, even if the process is a bit untidy. So all this investor complaining is merely an effort to get regulators to fatten their returns a bit.
Bank defenders like Tett would have you believe that the regulators have been inconsistent and unfair. In fact, if they have been unfair to anyone, it is to the silent equity partners of banks, meaning taxpayers. Banks are so heavily subsidized that they cannot properly be regarded as private firms and should be regulated as utilities. Fines for serious abuses that leave banks able to continue operating in their current form are simply another gesture to appease the public. Yet Tett would have you believe that a manageable problem for banks is a bigger cause for concern than the festering problem of too-big-to-fail banks and only intermittently serious regulators.

Crime Scene – New Orleans: An Act of Big Oil.

By Greg Palast
Tuesday, 26 August 2014

[Lower Ninth Ward, New Orleans]  Nine years ago this week, New Orleans drowned.  Don’t you dare blame Mother Nature.  Miss Katrina killed no one in this town.  But it was a homicide, with nearly 2,000 dead victims.  If not Katrina, who done it?  Read on.

The Palast Investigative Fund is making our half-hour investigative report available as a free download – Big Easy to Big Empty: The Untold Story of the Drowning of New Orleans, produced for Democracy Now.  In the course of the filming, Palast was charged with violation of anti-terror laws on a complaint from Exxon Corporation. Charges were dropped, and our digging continued.

Who is to blame for the crushing avalanche of water that buried this city?

It wasn’t an Act of God.  It was an Act of Chevron.  An Act of Exxon. An Act of Big Oil.

Take a look at these numbers dug out of Louisiana state records:
Conoco 3.3 million acres
Exxon Mobil 2.1 million acres
Chevron 2.7 million acres
Shell 1.3 million acres
These are the total acres of wetlands removed by just four oil companies over the past couple of decades. If you’re not a farmer, I’ll translate this into urban-speak:  that’s 14,688 square miles drowned into the Gulf of Mexico.
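The conversion is easy to check: at the standard survey figure of 640 acres to the square mile, the four companies’ combined 9.4 million acres works out to 14,687.5, which rounds to the 14,688 square miles cited. A quick sketch of the arithmetic, using the figures from the list above:

```python
# Acreage figures from the Louisiana state records cited above.
acres = {
    "Conoco": 3_300_000,
    "Exxon Mobil": 2_100_000,
    "Chevron": 2_700_000,
    "Shell": 1_300_000,
}

total_acres = sum(acres.values())        # 9,400,000 acres
square_miles = total_acres / 640         # 640 acres = 1 square mile
print(f"{total_acres:,} acres ≈ {square_miles:,.0f} square miles")
# prints: 9,400,000 acres ≈ 14,688 square miles
```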

Here’s what happened.  New Orleans used to be a long, swampy way from the Gulf of Mexico.  Hurricanes and storm surges had to cross a protective mangrove forest nearly a hundred miles thick.

But then, a century ago, Standard Oil, Exxon’s prior alias, began dragging drilling rigs, channeling pipelines, barge paths and tanker routes through what was once soft delta prairie grass.  Most of those beautiful bayous you see on postcards are just scars, the cuts and wounds of drilling the prairie, once America’s cattle-raising center.  The bayous, filling with ‘gators and shrimp, widened out and sank the coastline.  Each year, oil operations drag the Gulf four miles closer to New Orleans.

Just one channel dug for Exxon’s pleasure, the Mississippi River-Gulf Outlet ("MR-GO"), was dubbed the Hurricane Highway by experts long before Katrina: it invited the storm right up to—and over—the city’s gates, the levees.

Without Big Oil's tree and prairie holocaust, "Katrina would have been a storm of no note," Professor Ivor van Heerden told me.  Van Heerden, once Deputy Director of the Hurricane Center at Louisiana State University, is one of the planet’s leading experts on storm dynamics.

If they’d only left just 10% of the protective collar. They didn’t.

Van Heerden was giving me a tour of the battle zone in the oil war.  It was New Orleans’ Lower Ninth Ward, which once held the largest concentration of African-American owned homes in America.  Now it holds the largest concentration of African-American owned rubble.

We stood in front of a house, now years after Katrina, with an "X" spray-painted on the outside and "1 DEAD DOG," "1 CAT," the number 2 and "9/6" partly covered by a foreclosure notice.

The professor translated:  "9/6" meant rescuers couldn’t get to the house for eight days, so the "2"—the couple that lived there––must have paddled around with their pets until the rising waters pushed them against the ceiling and they suffocated, their gas-bloated corpses floating for a week.

In July 2005, Van Heerden told Channel 4 television of Britain that, "In a month, this city could be underwater." In one month, it was.  Van Heerden had sounded the alarm for at least two years, even speaking to George Bush’s White House about an emergency condition:  with the Gulf closing in, the levees were 18 inches short.  But the Army Corps of Engineers was busy with other rivers, the Tigris and Euphrates.

So, when those levees began to fail, the White House, hoping to avoid Federal responsibility, did not tell Louisiana's Governor Kathleen Blanco that the levees were breaking up.  That Monday night, August 29, with the storm by-passing New Orleans, the Governor had stopped the city’s evacuation.  Van Heerden was with the governor at the State Emergency Center.  He said, "By midnight on Monday the White House knew. But none of us knew."

So, the drownings began in earnest.

Van Heerden was supposed to keep that secret.  He didn't.  He told me, on camera––knowing the floodwater of official slime would break over him. He was told to stay silent, to bury the truth. But he told me more.  A lot more.  
"I wasn't going to listen to those sort of threats, to let them shut me down."
Well, they did shut him down. After he went public about the unending life-and-death threat of continued oil drilling and channelling, LSU closed down its entire Hurricane Center (can you imagine?) and fired Professor van Heerden and fellow experts. This was just after the University received a $300,000 check from Chevron.  The check was passed by a front group called "America’s Wetlands"—which lobbies for more drilling in the wetlands.

In place of Van Heerden and independent experts, LSU’s new "Wetlands Center" has professors picked by a board of petroleum industry hacks.

In 2003, Americans protested, "No Blood for Oil" in Iraq.  It’s about time we said, "No Blood for Oil"—in Louisiana.

The Real Reason Sugar Has No Place in Cornbread



And it should always be made in a cast iron skillet. [Photographs: Vicky Wasik]
I'm about to touch the third rail of Southern food. Well, actually, one of the third rails of Southern food, for when it comes to defining how certain beloved dishes should or should not be made, Southerners can get downright touchy. But, sometimes a truth is so self-evident that you can't present an impartial case for both sides. So I'm just going to say it: sugar has no business in cornbread.
Neither, for that matter, does wheat flour. One might make something quite tasty with well-sweetened wheat flour mixed with cornmeal, but be honest with yourself and call it a dessert. Cornbread is something else.
Now for a less personal perspective.
Much of the sugar/no sugar debate comes down to how one's grandmother made cornbread (and my grandmother didn't let a speck of sugar enter her batter). There are plenty of otherwise perfectly normal Southerners (my wife, for instance) whose grandmothers put sugar in cornbread. And there's a good explanation for why they did it. It all comes down to the nature of modern cornmeal.

Daily Bread

But, first, a word on cornbread and Southernness. A lot of corn is grown in places like Iowa and Illinois, and Americans in all parts of the country have long made breads, cakes, and muffins from cornmeal. But for some reason, cornbread itself is still associated primarily with the South.
"The North thinks it knows how to make cornbread, but this is gross superstition," Mark Twain wrote in his autobiography. When the Southern Foodways Alliance needed a title for their series of books collecting the best Southern food writing, they chose Cornbread Nation.
Cornbread's enduring role in Southern cookery comes from its ubiquity—it was the primary bread eaten in the region from the colonial days until well into the 20th century. Though farmers in the Northeast and Midwest cultivated thriving crops of wheat and rye, corn remained the staple grain of the South, as European wheat withered and died of rust in the region's heat and humidity.
For all but the wealthiest Southerners, the daily bread was cornbread. "In the interior of the country," a New York Times correspondent observed in an 1853 article about Texas, "cornbread forms the staple article of diet—anything composed of wheat flour being about as scarce as ice-cream in Sahara." Biscuits made from wheat flour are very closely associated with the South, but for most Southerners they were rare treats reserved for special occasions like Sunday dinner.

Early Cornbread

The simplest type of cornbread was corn pone, which was made from a basic batter of cornmeal stirred with water and a little salt. It was typically cooked in a greased iron skillet or Dutch oven placed directly on hot coals. An iron lid was put on top and covered with a layer of embers, too, so the bread was heated from both bottom and top and baked within the pan.
Over time, the basic pone recipe was enhanced to become cornbread. Cooks first added buttermilk and a little baking soda to help it rise. Later, eggs and baking powder made their way into many recipes. But there are two ingredients you almost never see in any recipes before the 20th century: wheat flour and sugar.
In 1892, a Times correspondent, after enumerating the many types of corn-based breads eaten in Virginia, noted, "It will be observed that in none of them is sugar used. There are cornmeal puddings served with sweet sauces, but no Southern cook would risk the spoiling of her cornbreads by sweetening them."
In 1937, the Times reported that "cornbread in Kentucky is made with white, coarsely ground cornmeal. Never, never are sugar and wheat flour used in cornbread. Water-ground cornmeal and water-ground whole wheat flour have still a market in Kentucky and are still used with delight."

Changing the Recipe

So why were cooks so unanimous on the subject of sugar and wheat flour up through the 1930s and so divided on it today? That mention of Kentucky's lingering market for "water-ground" meal provides an important clue, for a huge shift occurred in the cornmeal market in the early part of the 20th century, one that changed the very nature of cornmeal and forced cooks to alter their cornbread recipes.
There's no better source for understanding these changes than Glenn Roberts of Anson Mills in Columbia, South Carolina. In the 1990s, Roberts embarked on a single-minded mission to rediscover and revive the rich variety of grains that were all but lost amid the industrialization of agriculture and food production. He's cultivated a network of farmers to grow heirloom corn, rice, and other grains, and he launched Anson Mills to mill them in traditional ways and distribute them to restaurant chefs and home cooks.
During the 19th century, Roberts says, toll milling was the way most farm families got the meal for their cornbread. Farmers took their own corn to the local mill and had it ground into enough cornmeal for their families, leaving some behind as a toll to pay the miller. "With toll milling, it was three bags in, three bags out," Roberts explains. "A person could walk or mule in with three bags, take three bags home, and still get chores done."
The mills were typically water-powered and used large millstones to grind the corn. Starting around 1900, however, new "roller mills" using cylindrical steel rollers began to be introduced in the South. Large milling companies set up roller operations in the towns and cities and began taking business away from the smaller toll mills out in the countryside. "The bottom line is they went off stone milling because the economies didn't make sense," Roberts says, "which is why stone milling collapsed after the Depression."
Unlike stone mills, steel roller mills eliminate much of the corn kernel, including the germ; doing so makes the corn shelf stable but also robs it of much flavor and nutrition. The friction of steel rolling generates a lot of heat, too, which further erodes corn's natural flavor. Perhaps the most significant difference, though, is the size of the resulting meal.
"If you're toll milling," Roberts says, "you're using one screen. It's just like a backdoor screen. If you put the grits onto that screen and shake it, coarse cornmeal is going to fall through. The diverse particle size in that cornmeal is stunning when compared to a [steel] roller mill."
When cornmeal's texture changed, cooks had to adjust their recipes. "There's a certain minimum particle size required to react with chemical leaven," Roberts says. "If you are using [meal from a roller mill] you're not going to get nearly the lift. You get a crumbly texture, and you need to augment the bread with wheat flour, or you're getting cake."
The change from stone to steel milling is likely what prompted cooks to start putting sugar in their cornbread, too. In the old days, Southerners typically ground their meal from varieties known as dent corn, so called because there's a dent in the top of each kernel. The corn was hard and dry when it was milled, since it had been "field ripened" by being left in the field and allowed to dry completely.
High-volume steel millers started using corn harvested unripe and dried with forced air, which had less sweetness and corny flavor than its field-ripened counterpart. "You put sugar in the cornmeal because you are not working with brix corn," Roberts says, using the trade term for sugar content. "There's no reason to add sugar if you have good corn."

Today's Cornmeal

By the end of the Depression, old-fashioned stone-ground cornmeal and grits had all but disappeared from the South, replaced by paper bags of finely ground corn powder. The new cornmeal tended to be yellow, while the meal used for cornbread in much of the coastal South traditionally had been white. (There is a whole complex set of issues associated with the color of cornmeal that will have to wait for a later time.)
Cooks who paid attention knew there was a difference. "A very different product from the yellow cornmeal of the North is this white water-ground meal of the South," wrote Dorothy Robinson in the Richmond Times Dispatch in 1952. "The two are not interchangeable in recipes. Most standard cookbooks, with the exception of a comparative few devoted to Southern cooking, have concerned themselves with yellow cornmeal recipes as if they did not know any other kind! They do not even distinguish between the two. They simply say, naively, 'a cup of cornmeal' when listing ingredients in a recipe."
But even those who knew the difference had trouble finding the old stone-ground stuff. In 1950, a desperate Mrs. Francine J. Parr of Houma, Louisiana, posted a notice in the Times-Picayune with the headline "Who's Got Coarse Grits?" and explained, "the only grits we can get is very fine and no better than mush. In short, I'm advertising for some grocer or other individual selling coarse grits to drop me a line."

Making Proper Cornbread

Cornbread is just one of many traditional Southern foods that are difficult to experience today in their original form for the simple reason that today's ingredients just aren't the same. Buttermilk, rice, benne seeds, watermelons, and even the whole hogs put on barbecue pits: each has changed in fundamental ways over the course of the 20th century.
But, thanks to historically-minded millers like Glenn Roberts and others, it's getting a little bit easier to find real stone-ground cornmeal again. Some are even using heirloom varieties of dent corn to return the old flavor and sweetness to cornmeal and to grits, too.
The key to making good, authentic Southern cornbread is to use the right tools and ingredients. That means cooking it in a black cast iron skillet preheated in the oven so it's smoking hot when the batter hits the pan, causing the edges of the bread to brown. That batter should be made with the best buttermilk possible (real buttermilk if you can find it, which isn't easy).
And you shouldn't use a grain of wheat flour or sugar. If you start with an old-fashioned stone-ground meal like Anson Mills' Antebellum Coarse White Cornmeal, you'll have no need for such adulterations.

this gentleman got it just right...follow this recipe!!
I very much enjoyed this article and I am so glad to see someone tell the real story about cornbread and get it right. Like most Southerners, I agree wholeheartedly that you should never put sugar in cornbread.
My dad and granddad have raised white dent corn for as long as I can remember (I'm 51, my dad is 76). I used to go with them when they would take a sack of shelled corn to the miller to have it ground into cornmeal. After the last old miller died in the area where I grew up (southwest Virginia), my dad located and purchased a small stone mill, and with the help of my granddad they brought it home and set it up. My dad is still raising his own white dent corn and grinding his own cornmeal, which he generously shares with me and many other folks in the community. I can tell you, it is nothing like the so-called cornmeal that you buy from the store. For the last 2 years he has raised a variety called Boone County White. Dad remembers raising it when he was growing up. The corn grows 15 feet tall, and the ears can be as large as 14 inches long and grow about 8 feet high on the stalk. I had never seen anything like it until 2 years ago when he raised the first patch of it. Truly "corn as high as an elephant's eye..." I have seen a number of people requesting a recipe, so here it is, and like most good things it is very simple.
If you like the bread about an inch to an inch and a half thick, use a #5 cast iron skillet; if you like it thinner, use a #7 or #8 cast iron skillet.
(The cast iron skillet needs to be well seasoned or the bread will stick)
Preheat your oven to 475. When the oven is hot, put about a teaspoon or two of bacon grease in the skillet and slide it into the oven on the top rack to heat up. While the skillet is heating, take a medium-size mixing bowl and blend together:
1 1/2 cups of corn meal
1 tsp of salt
1 to 1 1/2 tsp of baking powder
1/4 tsp of baking soda
Add enough whole buttermilk to the dry ingredients to make a thick paste. If you like the cornbread kind of wet and heavy, add a little extra buttermilk; if you like it dry, use a little less.
When the skillet is good and hot (the bacon grease will be starting to smoke), remove the skillet from the oven, pour in the batter, and return it to the top rack of the oven. When the top is golden brown it is ready, about 15 to 20 minutes.

Wednesday, August 27, 2014

Maryland Crab-Cake Sandwich


Melissa Golden; illustrations by Mikey Burton
A Top Chef Masters contestant attempts to teach a writer who can barely make a bowl of cereal how to create food that expresses love and tastes delicious. In four hours. No problem.
Published in the September 2013 issue
The best way to cook is at someone's knee. You stand, you watch, you get out of the way when they need to get something off the stove. Stand there long enough and you start to pick up not only little tips, like how to chop a pepper, but also larger truths, like why we cook at all. Here, four writers with varying levels of experience shadow four great chefs, each at the top of his game. Feel free to stand and watch. Check back here to read more this week!

Lesson 1: Cook Like You Mean It

It took a bit for Bryan Voltaggio, the famous young chef with a pig tattooed on his arm, to decide I really was the tragic miracle I'd said I was. We were in the kitchen of his fourth and newest restaurant, Range, in Washington, D.C., pasta and cherry tomatoes and garlic simmering on the stove. A few minutes before, when I was cutting those same cherry tomatoes in half, I told him he was witnessing my first time putting a knife to a vegetable. Not long after, he wondered aloud whether he was being set up as part of some elaborate prank. That's when I mentioned I'd never cracked an egg. "How is that possible?" Voltaggio said. "How are you alive?"
I agreed that it was ridiculous for a thirty-nine-year-old man never to have cracked an egg, that it says something terrible about me as well as modern society that I can survive and in fact grow quite fat without acquiring even the most basic cooking skills, but nevertheless, I had never cracked an egg. Before entering Voltaggio's kitchen, I had possibly prepared the least food of any fully functioning North American adult: one plate of pasta — dried noodles and jarred sauce — just after I'd graduated from Meal Plan University and one serving of Hamburger Helper, with which I'd attempted to court the very good cook who somehow still became my wife. Other than those two barely digestible meals, whenever I have eaten, someone else has made my food for me, either because they love me or because I paid them. Only after Voltaggio watched me nervously crack that first egg did he finally believe me. "Nobody's that good an actor," he said.
Voltaggio comes from a family of cooks and chefs — he finished second to his brother, Michael, on the sixth season of Top Chef — and to watch him work in a kitchen is to watch witchcraft, years of experience and observation and fever poured into a cauldron. In some ways, that afternoon at Range confirmed my guiding philosophy: We should do only those things at which we are good. Why would I cook when Bryan Voltaggio cooks? If cooking makes him happy, and eating his food makes me happy, why would I upset that happy order of things? It had never made sense to me, and today it would remain nonsensical but for the fact that after we finished making our pasta, Voltaggio and I made the crab-cake sandwich that changed my life. We didn't just make that sandwich. We made every last component of that sandwich from its most basic ingredients. We made the soft, hot rolls, washing them with egg and sprinkling them with salt; we made the crab cakes, giant lumps of fresh crab combined with not much else and carefully levered into a pan of clarified butter; we even made the tartar sauce, from Voltaggio's original recipe, that went on top of the crab cakes like a blanket.

Now, here I must confess: While making that tartar sauce, I was consumed by the cynicism of my former self. It took me maybe an hour of work, not including the time I would need at home to find each of its fourteen ingredients. It required making grape-seed oil shimmer in the pan but not smoke — canola oil would smell like rotting fish, Voltaggio said, the sort of wisdom that seems impossible for me to own — and sweating diced celery, fennel, and onions, but not browning them. Alternatively, I could go out and buy a jar of tartar sauce in about six seconds. But then I finished Voltaggio's recipe, and I tasted it, and I understood. It wasn't some small fraction better than factory-born tartar sauce. It was better by orders of magnitude, turning something incidental into something essential. I can't recall eating any single tartar sauce in my life except for that one. Then we put it on the sandwich, and then we ate the sandwich, and holy sweet Mary mother of baby Jesus, it was the best sandwich I have ever eaten. It was the sandwich I had been dreaming about my whole life put suddenly where it belonged, in my open, groaning mouth.
What Voltaggio taught me, more than anything else, is that there is no particular magic in that trick. He refuses to call food art, or cooking artistry. That makes it sound more precious and inaccessible than it is. All good cooking requires, at its foundation, is generosity. Every decent meal I have eaten I have enjoyed because someone else had a big enough heart to make it.
I always thought of my refusal to cook as a selfless act: I was sparing the world my barbarism. In reality, learning how to make delicious whole food requires a capacity for goodness that I wish I didn't have to work so hard to possess. Yes, at some level, that crab-cake sandwich was just a sandwich, just caloric energy presented in a photogenic shape. But it was also this beautiful expression of care, this tender, charitable agreement that Bryan Voltaggio had made to teach me how to do some tiny fraction of what he does and to help me feel as though I could do more of it. I will make those crab-cake sandwiches again and again, partly because I couldn't live with the idea of never eating another one, but mostly because it will allow me to give something meaningful, my time and my effort, my attention and my education, to the people who remind me not only how I am alive but also why.


Bryan Voltaggio, Range, Washington D.C.
—As told to Francine Maroukian
Serves 6 to 8
  • 7 Tbsp mayonnaise, preferably Duke's
  • 1 Tbsp Old Bay
  • 2 ½ tsp Worcestershire sauce
  • 2 ½ tsp Dijon mustard
  • 3 ¾ tsp lemon juice
  • 2 eggs
  • 4 scallions, minced
  • 6 drops Tabasco sauce
  • ½ tsp fine sea salt
  • 2 lbs jumbo lump crabmeat, picked of shell fragments
  • 1 cup cracker meal for breading
  • 1 cup clarified butter*
  • 8 buns, toasted and buttered
In a medium bowl, combine the mayonnaise, Old Bay, Worcestershire, mustard, lemon juice, eggs, scallions, Tabasco, and sea salt. Using a wire whisk, mix the ingredients together to incorporate evenly. Add the crabmeat by thirds and fold gently with a spatula to be sure the crab does not get broken up.
Evenly coat the bottom of a baking dish with a generous dusting of the cracker meal, about ½ cup. Use an ice-cream scoop or a similar tool to divide crabmeat mixture into six or eight individual cakes. Place each crab cake in the cracker meal and dust with the remaining cracker meal, coating all sides. In a large frying pan, slowly heat the clarified butter. Use a candy thermometer to get it to 325 degrees, or stick the end of a chopstick into the butter — when it gives off a steady stream of bubbles, you're at 325.
Using a slotted metal or other high-heat-resistant spatula and working one at a time, place each cake into the butter, leaving a half inch between them so the crab cakes brown evenly. Cook crab cakes on both sides in the clarified butter, about 6 full minutes per side, until golden brown. (If you need to cook in multiple batches, set your oven at the lowest temperature and insert a cooling rack over a baking sheet, to rest the crab cakes on.) Let cakes sit for a minute, and then transfer them to the buns. Top with tartar sauce.
*Slowly melt three sticks of butter in a pan. When it starts bubbling, remove from heat. Using a spoon, remove white milk solids from the surface and discard. Pour the golden yellow layer of clarified butter into a container — this is what you will cook with. Discard the solids remaining on the bottom.

Tuesday, August 26, 2014

The Kennewick Man: the most important human skeleton ever found in North America

The Kennewick Man Finally Freed to Share His Secrets

He’s the most important human skeleton ever found in North America—and here, for the first time, is his story

Smithsonian Magazine

In the summer of 1996, two college students in Kennewick, Washington, stumbled on a human skull while wading in the shallows along the Columbia River. They called the police. The police brought in the Benton County coroner, Floyd Johnson, who was puzzled by the skull, and he in turn contacted James Chatters, a local archaeologist. Chatters and the coroner returned to the site and, in the dying light of evening, plucked almost an entire skeleton from the mud and sand. They carried the bones back to Chatters’ lab and spread them out on a table.
The skull, while clearly old, did not look Native American. At first glance, Chatters thought it might belong to an early pioneer or trapper. But the teeth were cavity-free (signaling a diet low in sugar and starch) and worn down to the roots—a combination characteristic of prehistoric teeth. Chatters then noted something embedded in the hipbone. It proved to be a stone spearpoint, which seemed to clinch that the remains were prehistoric. He sent a bone sample off for carbon dating. The results: It was more than 9,000 years old.
Thus began the saga of Kennewick Man, one of the oldest skeletons ever found in the Americas and an object of deep fascination from the moment it was discovered. It is among the most contested sets of remains on the continent as well. Now, though, after two decades, the dappled, pale brown bones are at last about to come into sharp focus, thanks to a long-awaited, monumental scientific publication next month co-edited by the physical anthropologist Douglas Owsley, of the Smithsonian Institution. No fewer than 48 authors and another 17 researchers, photographers and editors contributed to the 680-page Kennewick Man: The Scientific Investigation of an Ancient American Skeleton (Texas A&M University Press), the most complete analysis of a Paleo-American skeleton ever done.
The book recounts the history of discovery, presents a complete inventory of the bones and explores every angle of what they may reveal. Three chapters are devoted to the teeth alone, and another to green stains thought to be left by algae. Together, the findings illuminate this mysterious man’s life and support an astounding new theory of the peopling of the Americas. If it weren’t for a harrowing round of panicky last-minute maneuvering worthy of a legal thriller, the remains might have been buried and lost to science forever.

The storm of controversy erupted when the Army Corps of Engineers, which managed the land where the bones had been found, learned of the radiocarbon date. The corps immediately claimed authority—officials there would make all decisions related to handling and access—and demanded that all scientific study cease. Floyd Johnson protested, saying that as county coroner he believed he had legal jurisdiction. The dispute escalated, and the bones were sealed in an evidence locker at the sheriff’s office pending a resolution.
“At that point,” Chatters recalled to me in a recent interview, “I knew trouble was coming.” It was then that he called Owsley, a curator at the National Museum of Natural History and a legend in the community of physical anthropologists. Over his long career Owsley has examined well over 10,000 sets of human remains. He has helped identify human remains for the CIA, the FBI, the State Department and various police departments, and he has worked on mass graves in Croatia and elsewhere. He helped reassemble and identify the dismembered and burned bodies from the Branch Davidian compound in Waco, Texas. Later, he did the same with the Pentagon victims of the 9/11 terrorist attack. Owsley is also a specialist in ancient American remains.
“You can count on your fingers the number of ancient, well-preserved skeletons there are” in North America, he told me, remembering his excitement at first hearing from Chatters. Owsley and Dennis Stanford, at that time chairman of the Smithsonian’s anthropology department, decided to pull together a team to study the bones. But corps attorneys showed that federal law did, in fact, give them jurisdiction over the remains. So the corps seized the bones and locked them up at the Department of Energy’s Pacific Northwest National Laboratory, often called Battelle for the organization that operates the lab.

At the same time, a coalition of Columbia River Basin Indian tribes and bands claimed the skeleton under a 1990 law known as the Native American Graves Protection and Repatriation Act, or NAGPRA. The tribes demanded the bones for reburial. “Scientists have dug up and studied Native Americans for decades,” a spokesman for the Umatilla tribe, Armand Minthorn, wrote in 1996. “We view this practice as desecration of the body and a violation of our most deeply-held religious beliefs.” The remains, the tribe said, were those of a direct tribal ancestor. “From our oral histories, we know that our people have been part of this land since the beginning of time. We do not believe that our people migrated here from another continent, as the scientists do.” The coalition announced that as soon as the corps turned the skeleton over to them, they would bury it in a secret location where it would never be available to science. The corps made it clear that, after a monthlong public comment period, the tribal coalition would receive the bones.
The tribes had good reason to be sensitive. The early history of museum collecting of Native American remains is replete with horror stories. In the 19th century, anthropologists and collectors looted fresh Native American graves and burial platforms, dug up corpses and even decapitated dead Indians lying on the field of battle and shipped the heads to Washington for study. Until NAGPRA, museums were filled with American Indian remains acquired without regard for the feelings and religious beliefs of native people. NAGPRA was passed to redress this history and allow tribes to reclaim their ancestors’ remains and some artifacts. The Smithsonian, under the National Museum of the American Indian Act, and other museums under NAGPRA, have returned (and continue to return) many thousands of remains to tribes. This is being done with the crucial help of anthropologists and archaeologists—including Owsley, who has been instrumental in repatriating remains from the Smithsonian’s collection. But in the case of Kennewick, Owsley argued, there was no evidence of a relationship with any existing tribes. The skeleton lacked physical features characteristic of Native Americans.
In the weeks after the Army engineers announced they would return Kennewick Man to the tribes, Owsley went to work. “I called and others called the corps. They would never return a phone call. I kept expressing an interest in the skeleton to study it—at our expense. All we needed was an afternoon.” Others contacted the corps, including members of Congress, saying the remains should be studied, if only briefly, before reburial. This was what NAGPRA in fact required: The remains had to be studied to determine affiliation. If the bones showed no affiliation with a present-day tribe, NAGPRA didn’t apply.
But the corps indicated it had made up its mind. Owsley began telephoning his colleagues. “I think they’re going to rebury this,” he said, “and if that happens, there’s no going back. It’s gone.”

Photos of the Ainu people of Japan, thought to be among his closest living relatives, were the inspiration for Kennewick Man’s reconstruction. (National Anthropological Archives)

So Owsley and several of his colleagues found an attorney, Alan Schneider. Schneider contacted the corps and was also rebuffed. Owsley suggested they file a lawsuit and get an injunction. Schneider warned him: “If you’re going to sue the government, you better be in it for the long haul.”
Owsley assembled a group of eight plaintiffs, prominent physical anthropologists and archaeologists connected to leading universities and museums. But no institution wanted anything to do with the lawsuit, which promised to attract negative attention and be hugely expensive. They would have to litigate as private citizens. “These were people,” Schneider said to me later, “who had to be strong enough to stand the heat, knowing that efforts might be made to destroy their careers. And efforts were made.”
When Owsley told his wife, Susan, that he was going to sue the government of the United States, her first response was: “Are we going to lose our home?” He said he didn’t know. “I just felt,” Owsley told me in a recent interview, “this was one of those extremely rare and important discoveries that come once in a lifetime. If we lost it”—he paused. “Unthinkable.”
Working like mad, Schneider and litigating partner Paula Barran filed a lawsuit. With literally hours to go, a judge ordered the corps to hold the bones until the case was resolved.
When word got out that the eight scientists had sued the government, criticism poured in, even from colleagues. The head of the Society for American Archaeology tried to get them to drop the lawsuit. Some felt it would interfere with the relationships they had built with Native American tribes. But the biggest threat came from the Justice Department itself. Its lawyers contacted the Smithsonian Institution warning that Owsley and Stanford might be violating “criminal conflict of interest statutes which prohibit employees of the United States” from making claims against the government.
“I operate on a philosophy,” Owsley told me, “that if they don’t like it, I’m sorry: I’m going to do what I believe in.” He had wrestled in high school and, even though he often lost, he earned the nickname “Scrapper” because he never quit. Stanford, a husky man with a full beard and suspenders, had roped in rodeos in New Mexico and put himself through graduate school by farming alfalfa. They were no pushovers. “The Justice Department squeezed us really, really hard,” Owsley recalled. But both anthropologists refused to withdraw, and the director of the National Museum of Natural History at the time, Robert W. Fri, strongly supported them even over the objections of the Smithsonian’s general counsel. The Justice Department backed off.
Owsley and his group were eventually forced to litigate not just against the corps, but also the Department of the Army, the Department of the Interior and a number of individual government officials. As scientists on modest salaries, they could not begin to afford the astronomical legal bills. Schneider and Barran agreed to work for free, with the faint hope that they might, someday, recover their fees. In order to do that they would have to win the case and prove the government had acted in “bad faith”—a nearly impossible hurdle. The lawsuit dragged on for years. “We never expected them to fight so hard,” Owsley says. Schneider says he once counted 93 government attorneys directly involved in the case or cc’ed on documents.
Meanwhile, the skeleton, which was being held in trust by the corps, first at Battelle and later at the Burke Museum of Natural History and Culture at the University of Washington in Seattle, was badly mishandled and stored in “substandard, unsafe conditions,” according to the scientists. In the storage area where the bones were (and are) being kept at the Burke Museum, records show there have been wide swings in temperature and humidity that, the scientists say, have damaged the specimen. When Smithsonian asked about the scientists’ concerns, the corps disputed that the environment is unstable, pointing out that expert conservators and museum personnel say that “gradual changes are to be expected through the seasons and do not adversely affect the collection.”
Somewhere in the move to Battelle, large portions of both femurs disappeared. The FBI launched an investigation, focusing on James Chatters and Floyd Johnson. It even went so far as to give Johnson a lie detector test; after several hours of accusatory questioning, Johnson, disgusted, pulled off the wires and walked out. Years later, the femur bones were found in the county coroner’s office. The mystery of how they got there has never been solved.
The scientists asked the corps for permission to examine the stratigraphy of the site where the skeleton had been found and to look for grave goods. Even as Congress was readying a bill to require the corps to preserve the site, the corps dumped a million pounds of rock and fill over the area for erosion control, ending any chance of research.
I asked Schneider why the corps so adamantly resisted the scientists. He speculated that the corps was involved in tense negotiations with the tribes over a number of thorny issues, including salmon fishing rights along the Columbia River, the tribes’ demand that the corps remove dams and the ongoing, hundred-billion-dollar cleanup of the vastly polluted Hanford nuclear site. Schneider says that a corps archaeologist told him “they weren’t going to let a bag of old bones get in the way of resolving other issues with the tribes.”
Asked about its actions in the Kennewick Man case, the corps told Smithsonian: “The United States acted in accordance with its interpretation of NAGPRA and its concerns about the safety and security of the fragile, ancient human remains.”
Ultimately, the scientists won the lawsuit. The court ruled in 2002 that the bones were not related to any living tribe: thus NAGPRA did not apply. The judge ordered the corps to make the specimen available to the plaintiffs for study. The government appealed to the Court of Appeals for the Ninth Circuit, which in 2004 again ruled resoundingly in favor of the scientists, writing:

because Kennewick Man’s remains are so old and the information about his era is so limited, the record does not permit the Secretary [of the Interior] to conclude reasonably that Kennewick Man shares special and significant genetic or cultural features with presently existing indigenous tribes, people, or cultures.
During the trial, the presiding magistrate judge, John Jelderks, had noted for the record that the corps on multiple occasions misled or deceived the court. He found that the government had indeed acted in “bad faith” and awarded attorney’s fees of $2,379,000 to Schneider and his team.
“At the bare minimum,” Schneider told me, “this lawsuit cost the taxpayers $5 million.”
Owsley and the collaborating scientists presented a plan of study to the corps, which was approved after several years. And so, almost ten years after the skeleton was found, the scientists were given 16 days to examine it. They did so in July of 2005 and February of 2006.
From these studies, presented in superabundant detail in the new book, we now have an idea who Kennewick Man was, how he lived, what he did and where he traveled. We know how he was buried and then came to light. Kennewick Man, Owsley believes, belongs to an ancient population of seafarers who were America’s original settlers. They did not look like Native Americans. The few remains we have of these early people show they had longer, narrower skulls with smaller faces. These mysterious people have long since disappeared.
To get to Owsley’s office at the National Museum of Natural History, you must negotiate a warren of narrow corridors illuminated by fluorescent strip lighting and lined with specimen cases. When his door opens, you are greeted by Kennewick Man. The reconstruction of his head is striking—rugged, handsome and weather-beaten, with long hair and a thick beard. A small scar puckers his left forehead. His determined gaze is powerful enough to stop you as you enter. This is a man with a history.
Kennewick Man is surrounded on all sides by tables laid out with human skeletons. Some are articulated on padded counters, while others rest in metal trays, the bones arranged as precisely as a surgeon’s tools before an operation. These bones represent the forensic cases Owsley is currently working on.
“This is a woman,” he said, pointing to the skeleton to the left of Kennewick Man. “She’s young. She was a suicide, not found for a long time.” He gestured to the right. “And this is a homicide. I know there was physical violence. She has a fractured nose, indicating a blow to the face. The detective working the case thinks that if we can get a positive ID, the guy they have will talk. And we have a positive ID.” A third skeleton belonged to a man killed while riding an ATV, his body not found for six months. Owsley was able to assure the man’s relatives that he died instantly and didn’t suffer. “In doing this work,” he said, “I hope to speak for the person who can no longer speak.”
Owsley is a robust man, of medium height, 63 years old, graying hair, glasses; curiously, he has the same purposeful look in his eyes as Kennewick Man. He is not into chitchat. He grew up in Lusk, Wyoming, and he still radiates a frontier sense of determination; he is the kind of person who will not respond well to being told what he can’t do. He met Susan on the playground when he was 7 years old and remains happily married. He lives in the country, on a farm where he grows berries, has an orchard and raises bees. He freely admits he is “obsessive” and “will work like a dog” until he finishes a project. “I thought this was normal,” he said, “until it was pointed out to me it wasn’t.” I asked if he was stubborn, as evidenced by the lawsuit, but he countered: “I would say I’m driven—by curiosity.” He added, “Sometimes you come to a skeleton that wants to talk to you, that whispers to you, I want to tell my story. And that was Kennewick Man.”
A vast amount of data was collected in the 16 days Owsley and colleagues spent with the bones. Twenty-two scientists scrutinized the almost 300 bones and fragments. Led by Kari Bruwelheide, a forensic anthropologist at the Smithsonian, they first reassembled the fragile skeleton so they could see it as a whole. They built a shallow box, added a layer of fine sand, and covered that with black velvet; then Bruwelheide laid out the skeleton, bone by bone, shaping the sand underneath to cradle each piece. Now the researchers could address such questions as Kennewick Man’s age, height, weight, body build, general health and fitness, and injuries. They could also tell whether he was deliberately buried, and if so, the position of his body in the grave.
Next the skeleton was taken apart, and certain key bones studied intensively. The limb bones and ribs were CT-scanned at the University of Washington Medical Center. These scans used far more radiation than would be safe for living tissue, and as a result they produced detailed, three-dimensional images that allowed the bones to be digitally sliced up any which way. With additional CT scans, the team members built resin models of the skull and other important bones. They made a replica from a scan of the spearpoint in the hip.
As work progressed, a portrait of Kennewick Man emerged. He does not belong to any living human population. Who, then, are his closest living relatives? Judging from the shape of his skull and bones, his closest living relatives appear to be the Moriori people of the Chatham Islands, a remote archipelago 420 miles southeast of New Zealand, as well as the mysterious Ainu people of Japan.
“Just think of Polynesians,” said Owsley.
Rib fragments showing details of the ends. (Chip Clark / NMNH, SI)
Not that Kennewick Man himself was Polynesian. This is not Kon-Tiki in reverse; humans had not reached the Pacific Islands in his time period. Rather, he was descended from the same group of people who would later spread out over the Pacific and give rise to modern-day Polynesians. These people were maritime hunter-gatherers of the north Pacific coast; among them were the ancient Jōmon, the original inhabitants of the Japanese Islands. The present-day Ainu people of Japan are thought to be descendants of the Jōmon. Nineteenth-century photographs of the Ainu show individuals with light skin, heavy beards and sometimes light-colored eyes.
Jōmon culture first arose in Japan at least 12,000 years ago and perhaps as early as 16,000 years ago, when the landmasses were still connected to the mainland. These seafarers built boats out of sewn planks of wood. Outstanding mariners and deep-water fishermen, they were among the first people to make fired pottery.
The discovery of Kennewick Man adds a major piece of evidence to an alternative view of the peopling of North America. It, along with other evidence, suggests that the Jōmon or related peoples were the original settlers of the New World. If correct, the conclusion upends the traditional view that the first Americans came through central Asia and walked across the Bering Land Bridge and down through an ice-free corridor into North America.
Sometime around 15,000 years ago, the new theory goes, coastal Asian groups began working their way along the shoreline of ancient Beringia—the sea was much lower then—from Japan and Kamchatka Peninsula to Alaska and beyond. This is not as crazy a journey as it sounds. As long as the voyagers were hugging the coast, they would have plenty of fresh water and food. Cold-climate coasts furnish a variety of animals, from seals and birds to fish and shellfish, as well as driftwood, to make fires. The thousands of islands and their inlets would have provided security and shelter. To show that such a sea journey was possible, in 1999 and 2000 an American named Jon Turk paddled a kayak from Japan to Alaska following the route of the presumed Jōmon migration. Anthropologists have nicknamed this route the “Kelp Highway.”
“I believe these Asian coastal migrations were the first,” said Owsley. “Then you’ve got a later wave of the people who give rise to Indians as we know them today.”
What became of those pioneers, Kennewick Man’s ancestors and companions? They were genetically swamped by much larger—and later—waves of travelers from Asia and disappeared as a physically distinct people, Owsley says. These later waves may have interbred with the first settlers, diluting their genetic legacy. A trace of their DNA still can be detected in some Native American groups, though the signal is too weak to label the Native Americans “descendants.”
Whether this new account of the peopling of North America will stand up as more evidence comes in is not yet known. The bones of a 13,000-year-old teenage girl recently discovered in an underwater cave in Mexico, for example, are adding to the discussion. James Chatters, the first archaeologist to study Kennewick and a participant in the full analysis, reported earlier this year, along with colleagues, that the girl’s skull appears to have features in common with that of Kennewick Man and other Paleo-Americans, but she also possesses specific DNA signatures suggesting she shares female ancestry with Native Americans.
Kennewick Man may still hold a key. The first effort to extract DNA from fragments of his bone failed, and the corps so far hasn’t allowed a better sample to be taken. A second effort to plumb the old fragments is underway at a laboratory in Denmark.
There’s a wonderful term used by anthropologists: “osteobiography,” the “biography of the bones.” Kennewick Man’s osteobiography tells a tale of an eventful life, which a newer radiocarbon analysis dates to 8,900 to 9,000 years ago. He was a stocky, muscular man about 5 feet 7 inches tall, weighing about 160 pounds. He was right-handed. His age at death was around 40.
Kennewick Man pelvis. (Chip Clark / NMNH, SI)

Anthropologists can tell from looking at bones what muscles a person used most, because muscle attachments leave marks in the bones: The more stressed the muscle, the more pronounced the mark. For example, Kennewick Man’s right arm and shoulder look a lot like a baseball pitcher’s. He spent a lot of time throwing something with his right hand, elbow bent—no doubt a spear. Kennewick Man once threw so hard, Owsley says, he fractured his glenoid rim—the socket of his shoulder joint. This is the kind of injury that puts a baseball pitcher out of action, and it would have made throwing painful. His left leg was stronger than his right, also a characteristic of right-handed pitchers, who arrest their forward momentum with their left leg. His hands and forearms indicate he often pinched his fingers and thumb together while tightly gripping a small object; presumably, then, he knapped his own spearpoints.
Kennewick Man spent a lot of time holding something in front of him while forcibly raising and lowering it; the researchers theorize he was hurling a spear downward into the water, as seal hunters do. His leg bones suggest he often waded in shallow rapids, and he had bone growths consistent with “surfer’s ear,” caused by frequent immersion in cold water. His knee joints suggest he often squatted on his heels. I like to think he might have been a storyteller, enthralling his audience with tales of far-flung travels.
Many years before Kennewick Man’s death, a heavy blow to his chest broke six ribs. Because he kept using his right arm to throw spears, five broken ribs on his right side never knitted together. This man was one tough dude.
The scientists also found two small depression fractures on his cranium, one on his forehead and the other farther back. These dents occur on about half of all ancient American skulls; what caused them is a mystery. They may have come from fights involving rock throwing, or possibly accidents involving the whirling of a bola. This ancient weapon consisted of two or more stones connected by a cord, which were whirled above the head and thrown at birds to entangle them. If you don’t swing a bola just right, the stones can whip around and smack you. Perhaps a youthful Kennewick Man learned how to toss a bola the hard way.
Spear point from Kennewick Man's hip
The most intriguing injury is the spearpoint buried in his hip. He was lucky: The spear, apparently thrown from a distance, barely missed the abdominal cavity, which would have caused a fatal wound. It struck him at a downward arc of 29 degrees. Given the bone growth around the embedded point, the injury occurred when he was between 15 and 20 years old, and he probably would not have survived if he had been left alone; the researchers conclude that Kennewick Man must have been with people who cared about him enough to feed and nurse him back to health. The injury healed well and any limp disappeared over time, as evidenced by the symmetry of his gluteal muscle attachments. There’s undoubtedly a rich story behind that injury. It might have been a hunting accident or a teenage game of chicken gone awry. It might have happened in a fight, attack or murder attempt.
Much to the scientists’ dismay, the corps would not allow the stone to be analyzed, which might reveal where it was quarried. “If we knew where that stone came from,” said Stanford, the Smithsonian anthropologist, “we’d have a pretty good idea of where that guy was when he was a young man.” A CT scan revealed that the point was about two inches long, three-quarters of an inch wide and about a quarter-inch thick, with serrated edges. In his analysis, Stanford wrote that while he thought Kennewick Man had probably received the injury in America, “an Asian origin of the stone is possible.”

The food we eat and the water we drink leave a chemical signature locked into our bones, in the form of different atomic variations of carbon, nitrogen and oxygen. By identifying them, scientists can tell what a person was eating and drinking while the bone was forming. Kennewick Man’s bones were perplexing. Even though his grave lies 300 miles inland from the sea, he ate none of the animals that abounded in the area. On the contrary, for the last 20 or so years of his life he seems to have lived almost exclusively on a diet of marine animals, such as seals, sea lions and fish. Equally baffling was the water he drank: It was cold, glacial meltwater from a high altitude. Nine thousand years ago, the closest marine coastal environment where one could find glacial meltwater of this type was Alaska. The conclusion: Kennewick Man was a traveler from the far north. Perhaps he traded fine knapping stones over hundreds of miles.
Although he came from distant lands, he was not an unwelcome visitor. He appears to have died among people who treated his remains with care and respect. While the researchers say they don’t know how he died—yet—Owsley did determine that he was deliberately buried in an extended, supine position, faceup, the head slightly higher than the feet, with the chin pressed on the chest, in a grave that was about two and a half feet deep. Owsley deduced this information partly by mapping the distribution of carbonate crust on the bones, using a magnifying lens. Such a crust is heavier on the underside of buried bones, betraying which surfaces were down and which up. The bones showed no sign of scavenging or gnawing and were deliberately buried beneath the topsoil zone. From analyzing algae deposits and water-wear marks, the team determined which bones were washed out of the embankment first and which fell out last. Kennewick Man’s body had been buried with his left side toward the river and his head upstream.
The most poignant outcome? The researchers brought Kennewick Man’s features back to life. This process is nothing like the computerized restoration seen in the television show Bones. To turn a skull into a face is a time-consuming, handcrafted procedure, a marriage of science and art. Skeletal anatomists, modelmakers, forensic and figurative sculptors, a photographic researcher and a painter toiled many months to do it.
The first stage involved plotting dozens of points on a cast of the skull and marking the depth of tissue at those points. (Forensic anatomists had collected tissue-depth data over the years, first by pushing pins into the faces of cadavers, and later by using ultrasound and CT scans.) With the points gridded out, a forensic sculptor layered clay on the skull to the proper depths.
The naked clay head was then taken to StudioEIS in Brooklyn, which specializes in reconstructions for museums. There, sculptors aged his face, adding wrinkles and a touch of weathering, and put in the scar from the forehead injury. Using historic photographs of Ainu and Polynesians as a reference, they sculpted the fine, soft-tissue details of the lips, nose and eyes, and gave him a facial expression—a resolute, purposeful gaze consistent with his osteobiography as a hunter, fisherman and long-distance traveler. They added a beard like those commonly found among the Ainu. As for skin tone, a warm brown was chosen, to account for his natural color deepened by the harsh effects of a life lived outdoors. To prevent too much artistic license from creeping into the reconstruction, every stage of the work was reviewed and critiqued by physical anthropologists.
“I look at him every day,” Owsley said to me. “I’ve spent ten years with this man trying to better understand him. He’s an ambassador from that ancient time period. And man, did he have a story.”
Today, the bones remain in storage at the Burke Museum, and the tribes continue to believe that Kennewick Man is their ancestor. They want the remains back for reburial. The corps, which still controls the skeleton, denied Owsley’s request to conduct numerous tests, including a histological examination of thin, stained sections of bone to help fix Kennewick Man’s age. Chemical analyses on a lone tooth would enable the scientists to narrow the search for his homeland by identifying what he ate and drank as a child. A tooth would also be a good source of DNA. Biomolecular science is advancing so rapidly that within five to ten years it may be possible to know what diseases Kennewick Man suffered from and what caused his death.
Today’s scientists still have questions for this skeleton, and future scientists will no doubt have new ones. Kennewick Man has more to tell.
