New Republic
Trump’s Military Family Scandal Reveals the Profound Ugliness of His Character
October 19th, 2017, 07:40 PM

Michelle Obama’s speech at the 2012 Democratic National Convention was a love letter to her husband, for his accomplishments both as a president and a family man. “Today, after so many struggles and triumphs and moments that have tested my husband in ways I never could have imagined, I have seen firsthand that being president doesn’t change who you are—it reveals who you are,” she said.

Four years later, as Hillary Clinton and Donald Trump battled for the presidency, the first lady reprised those lines in a speech in Northern Virginia—but this was no love letter. “At the end of the day, as I’ve said before, the presidency doesn’t change who you are, it reveals who you are,” she said. “And the same thing is true of a presidential campaign.” One did not have to squint to read between those lines, especially after she put a finer point on it:

If a candidate is erratic and threatening, if a candidate traffics in prejudice, fears, and lies on the trail, if a candidate has no clear plans to implement their goals, if they disrespect their fellow citizens, including folks who make extraordinary sacrifices for our country—let me tell you, that is who they are. That is the kind of president they will be. Trust me.

Trump right now is under the harshest, white-hot light of his presidency, and we’re seeing more clearly than ever who he truly is. His handling of the deaths of four Green Berets is a case study in everything that is wrong with the president—all of his pathologies and deficiencies. The scandal has revealed his character in full, and it’s even uglier than Michelle Obama imagined.


This was a slow-moving scandal, until it wasn’t. The U.S. Special Forces soldiers were killed in an ambush in Niger on October 4. Twelve days passed without a single public comment from the president. As we now know, from a Politico report on Wednesday, National Security Council staffers on October 5 drafted a statement for Trump to read, but for unknown reasons he never released it. Instead, he remained silent about the soldiers’ deaths—while continuing to tweet about issues far removed from his executive duties, complaining about “fake news” and criticizing anti-racist athletes who kneel in protest during the national anthem.

Finally, at an impromptu press conference on Monday, a reporter asked Trump, “Why haven’t we heard anything from you so far about the soldiers that were killed in Niger? And what do you have to say about that?” Trump didn’t answer the question, which specifically asked why he had been publicly silent. “I’ve written them personal letters,” he responded. “They’ve been sent, or they’re going out tonight, but they were written during the weekend. I will, at some point during the period of time, call the parents and the families—because I have done that, traditionally.... So, the traditional way—if you look at President Obama and other presidents, most of them didn’t make calls, a lot of them didn’t make calls.” This was immediately proven false.

The following day, Trump called Myeshia Johnson, the widow of Sergeant La David T. Johnson, one of the slain soldiers. According to Democratic Congresswoman Frederica S. Wilson, who was with Myeshia Johnson when the call was made, Trump said, “He knew what he was signing up for, but I guess it hurts anyway.” The call brought Johnson to tears, according to Wilson, whose account was confirmed by Cowanda Jones-Johnson, the soldier’s mother, and by Myeshia Johnson herself. “President Trump did disrespect my son and my daughter and also me and my husband,” Jones-Johnson told The Washington Post.

As is his wont, Trump responded by lashing out at Wilson on Twitter, claiming she had “totally fabricated” what he said to the widow.

The White House has been trying to walk back this tweet: Aides don’t deny Trump said those words to Johnson, but insist he was “misunderstood.” If one wanted to be charitable, one could find extenuating excuses for Trump’s language on the phone call. It’s always difficult to console a stranger who has lost a loved one, even more so when your decisions, as president, indirectly caused that death. Trump doesn’t have much experience yet with calling bereaved military families; perhaps he was flustered, and thus wasn’t as careful with his words as he should have been. By saying that Johnson “knew what he was signing up for,” Trump might have been trying, in a clumsy way, to say that the fallen soldier bravely gave his life to defend American interests.

But Trump, who has long demonstrated a lack of empathy for suffering Americans, hasn’t earned such a charitable interpretation. He could have told the press that he erred in not delivering that October 5 statement, and apologized to Myeshia Johnson for his careless words. Instead, in an attempt to deflect blame, he told bald-faced lies about Obama and Wilson—both of whom, one can’t ignore given Trump’s racial demagoguery, are black. In reacting like a cornered rat, he only made matters worse for himself. While his lies about Obama and Wilson were immediately exposed, the political press rightly didn’t stop there; Trump’s claim to have called all grieving military families was also debunked. “You know when you hear people lying, and you want to fight? That’s the way I feel last night,” said the grieving son of a fallen soldier, who told the Post that Trump never called his family. “He’s a damn liar.” Even worse, as that article revealed, Trump promised to give a grieving father $25,000, but never delivered. A White House spokesperson on Wednesday insisted, “The check has been sent,” then attacked “the media’s biased agenda.” In truth, the check was made out the same day as the Post’s report—almost certainly in response to it.

Some observers have tried to argue that Trump is being ill-served by those around him. “This is a failure of the president’s staff,” Sam Nunberg, a former political adviser to Trump, told The Los Angeles Times. While it’s true that Trump’s White House staff is a mess, these underlings are being scapegoated for Trump’s own faults; they’re incompetent largely because their boss is incompetent. As Michelle Obama said in her 2012 DNC speech, “I’ve seen how the issues that come across a president’s desk are always the hard ones—the problems where no amount of data or numbers will get you to the right answer, the judgment calls where the stakes are so high, and there is no margin for error. And as president, you can get all kinds of advice from all kinds of people. But at the end of the day, when it comes time to make that decision, as president, all you have to guide you are your values, and your vision, and the life experiences that make you who you are.”

As a useful microcosm of Trump’s presidency and his character, the bereavement scandal could not be a more damning indictment of his values, his vision, and the life experiences that led him to the White House. And this is an equally damning indictment of the millions of Americans who put him there, after convincing themselves that Trump was not the sum of his numerous, manifest flaws—that he was not who he seemed to be.


Last year, some of Trump’s most ardent advocates suggested that he was a different man in private—not the raging, petulant bigot witnessed at rallies and on Twitter. This calmer, more reasonable Trump would emerge soon, we were told. “If the public sees the Donald Trump I’ve gotten to know in private, he will not be stopped,” RNC chair Reince Priebus said in July 2016. “It’s just taken longer to pivot and I think he’s pivoting.” Trump never pivoted—not after he won the Republican nomination, not after winning the election, and not after his inauguration.

Michelle Obama had predicted as much in her speech in Virginia. “A candidate is not gonna suddenly change once they get into office,” she said. “Just the opposite, in fact, because the minute that individual takes that oath, they are under the hottest, harshest light there is. And there is no way to hide who they really are. And at that point it’s too late.” It is too late indeed. Obama’s warning ultimately did not persuade enough swing state voters to pick Clinton over Trump, and now we’re living with the consequences of a president whose character is even more corrupt than his administration is.

We should not forget, amid this diagnosis of Trump’s many personal deficiencies, that this all started because Trump failed in his professional duty. He was asked a simple question: Why hadn’t he said anything about the four soldiers killed in Niger? As commander-in-chief, his job is not only to console grieving families, but to explain to the American public what the military is doing—especially when his administration puts soldiers in harm’s way, and something goes wrong. Why, for instance, is the American military in Niger? What were the circumstances that led to these deaths? How will this impact the American military’s mission against Al Qaeda in Africa? These questions remain unanswered amid a controversy of Trump’s own making. He changed the conversation, and now we must change it back. The four grieving families deserve nothing less.

Both Sides of Joni Mitchell
October 19th, 2017, 07:40 PM

Writing about Joni Mitchell is difficult. The first and clearest fact about her is that she recorded eight genius records in a row in the late 1960s and 1970s: Song to a Seagull (1968), Clouds (1969), Ladies of the Canyon (1970), Blue (1971), For the Roses (1972), Court and Spark (1974), The Hissing of Summer Lawns (1975), and Hejira (1976). This is the Joni Mitchell whom David Crosby saw play at the Gaslight South in 1967: “She was singing ‘Michael from Mountains’ or ‘Both Sides, Now’ or some other fucking wonderful song, and she just knocked me on my ass. I did not know there was anybody out there doing that.”

RECKLESS DAUGHTER: A PORTRAIT OF JONI MITCHELL By David Yaffe. Sarah Crichton Books, 448 pp., $28

Those albums were all recorded in quick succession, and Mitchell kept working for decades afterward, but it is this era of her career that defines her legacy. Very few critics are interested in Mitchell’s career through the 1980s and 1990s, as attested by almost every article collected in the new book Joni: An Anthology. Critics expressed ambivalent delight at her return to music with 2007’s Shine, albeit alongside disclaimers that “Joni Mitchell created such a prescient roadmap for life and love in the 1960s and early ’70s that everything since has felt cryptic by comparison.”

Reckless Daughter: A Portrait of Joni Mitchell, a new biography by David Yaffe, attempts a fuller treatment of her legacy. It’s a vivid and dramatic book, focusing more on the music than on the musician’s life. Yaffe tries hard to reckon with Mitchell’s genius in prose and to keep up his interest throughout the ’80s and ’90s, after her career slumped. Reckless Daughter is an exercise in describing one art form with another, and Yaffe strains to match Mitchell’s musical talent with his own writerly flourishes. He calls the song “Woodstock” “a purgation.” The book is deeply researched and heavily invested in tracing the process of a day-to-day creative life. If you have no interest in the mechanics of studio recording or the career of Mitchell’s bassist Jaco Pastorius, you may find it a little tiresome. But Yaffe tries to put his arms around the whole Mitchell phenomenon.


Yaffe dedicates substantial space to what we could call the second Joni Mitchell. This artist is the one who comes after the waifish early years of her creative apex, an ex-folksinger trying to make her way through the ‘80s, getting narcissistic and angry and broke. She makes some bad albums: Don Juan’s Reckless Daughter (1977), Mingus (1979), Wild Things Run Fast (1982), and the synth-soaked Dog Eat Dog (1985). This Mitchell is on a lot of cocaine. Her voice is losing its top end, and she’s bored of playing the guitar in her old style. In 1986, Mitchell played an obscure song at the Amnesty International benefit concert. A dissatisfied audience member threw ice at her face. There follow some better albums in the 1990s, but by this stage, her career is shot and the money is just not coming anymore.

Don Juan’s Reckless Daughter is a specifically upsetting point in this low period because Joni Mitchell appears in blackface on the cover. She’s dressed as a “black pimp” character, which she claims was modeled after a man she met on the street (“this black guy with a beautiful spirit walking with a bop”). If you weren’t looking for it, you might not recognize her. Mitchell says that she met the man while on the hunt for a Halloween costume, and she duly went as him to a party, something she repeated on a few occasions. This character seems to have been a long-running fixation of Mitchell’s. As recently as 2015 she told New York magazine that she has “experienced being a black guy on several occasions.” In that interview she seems to say that she feels she has a black man inside of her. “I nod like I’m a brother,” she said.

Yaffe also records an incident in the ’70s in which Mitchell deliberately shocked Joan Baez by using the n-word in reference to Muhammad Ali and Rubin “Hurricane” Carter. These actions on Mitchell’s part are worth noting, I think, chiefly because they flag what isn’t often said about her.

It isn’t easy to connect this version of Joni Mitchell with the young Mitchell most critics focus on. In her twenties, Joni Mitchell wrote songs that have become girders in the architecture of culture—there are so many covers of “Big Yellow Taxi” that it has hit the U.S. top 50 five times—and she wrote them while being beautiful, white, and female, and playing the guitar in a way that she had personally invented. These facts make her an outlier—contemporaries like Joan Baez and Judy Collins made no equivalent lasting impact—and have also transformed the Joni Mitchell of those early albums into an almost mythic figure. Young Mitchell, the one who spat out “The Circle Game” at 23, is such an icon to music fans that she seems to lie beyond the grasp of acceptable criticism. This Joni Mitchell, not the person but the icon, is a fantasy of the bohemian woman, a goddess, a person no more human or connected to the rest of culture than the Virgin Mary.

This is the Joni Mitchell who embodies a kind of bohemian feminine that no longer has room to exist. AIDS and war and the raging capitalist fetish of America showed all the things that Joni Mitchell symbolizes to have been an illusion; the daydream had dissolved by the 1980s. But she was a genius of bohemian fantasy, and so we get to keep her. We’ve given up on that dream, her fans say, but don’t make me give up Joni Mitchell.

Just as fans and critics alike idealize young Mitchell into a bohemian fantasy dream girl, they decline to scrutinize the complexity, difficulty, and strangeness of her words and behavior from the late ’70s onward. It isn’t just that Mitchell fans refuse to let her cocaine-fueled grandiosity, use of racist language and imagery, or bad ’80s albums damage her artistic legacy. The problem is that her fans, including journalists, simply do not engage with the whole, real, flawed Joni Mitchell. Our culture’s subtle, insidious misogyny lets us pretend that fantasy women are more important than real ones; the real Joni Mitchell has never existed, according to this logic, nor does any fan need to reckon with her racist actions in order to obsess over her dainty early music. This dynamic lets us continue loving Joni Mitchell for the fantasy we have of her, a fantasy that was maybe never real—and certainly not since the day she first looked sideways at a synth, and her wrinkles started to show.


I am a biased reader of this biography, because Joni Mitchell’s earliest albums make my skin crawl. Yes, even Blue. I hate the way she yodels, I hate her soprano, I hate the way she pronounces the word “road.” Once all those cigarettes ground her voice down for The Hissing of Summer Lawns, for me, the sound grew into something like music. But her acoustic music relies too much on suspended chords. These are the chords that give songs like “Both Sides, Now” or “My Old Man” a strong aftertaste—like dishsoap not fully rinsed from a glass—of musical theater.

Once Mitchell began playing with accomplished jazz musicians in her mid-career (Yaffe rightly goes on a long digression on the career of Jaco Pastorius, the genius bass player who transformed her sound), all those unresolved chords and swooping vocal lines began to make sense. Court and Spark is an album filled with true musicianship, and enough funk sensibility to knock Mitchell’s whistling lamentations into songs.

Perhaps I’m guilty of the same idealization that any Joni Mitchell mega-fan indulges in, except that my “good Joni” exists later, in the part of her career that most critics ignore. Being a fan means being specific about the things you love, and why. A fan gets credit for the music that they choose to define their tastes with. A fan and their chosen star reflect something, a sense of themselves, between them. For that reason, however, we need to trouble our vision of Joni Mitchell as the boho-genius dream girl. Idealizing women into smaller versions of themselves is destructive, even, or especially, when the full picture contains details we’d rather ignore. We deserve both sides, now.

Louis Guilloux’s Great, Forgotten War Novel
October 19th, 2017, 07:40 PM

There are no trenches, no German submarines, no gas attacks in Blood Dark, yet Louis Guilloux’s epic novel ranks among the most powerful French depictions of the First World War. By 1935, when it was published, suffering on the front line had already produced a series of classics: Henri Barbusse’s Under Fire and Maurice Genevoix’s ’Neath Verdun (1916); Blaise Cendrars’s I’ve Killed (1918); Roland Dorgelès’s Wooden Crosses (1919). Guilloux’s contribution was different. As an adolescent in provincial Brittany, he had seen war reach behind the front and penetrate civilian populations and institutions; in Blood Dark he set out to create a war literature of the home front, a toxic zone where rumors do battle with the truth and witch hunts are carried out in the name of patriotism.

Blood Dark takes place on a single day in 1917, in a town recognizable as Guilloux’s native Saint-Brieuc, population 24,000, perched on the north coast of Brittany. The war has reached a low point after the debacle at the Chemin des Dames, and the American doughboys are still nowhere in sight. Patriotism has grown hollow; for some young people, revolutionary Russia is becoming a source of hope. Into a classical frame—unity of time and of place—Guilloux sets a riotous cast of some twenty main characters whose destinies combine and reverberate in a series of short episodes. He finds a way, through this form, to explore the effects of the war on an entire community and to delve deeply into the consciousness of one awed individual who is both a spiritual guide and a living symptom of the society in disarray.

This guiding light, or rather guiding shadow, of the novel is an unhinged teacher of high-school philosophy named Charles Merlin, nicknamed “Cripure” by his students—a play on Kant’s CRItique of PURE Reason. He is in charge of teaching ethics to the draft-aged boys, young men condemned to spin the wheel of fortune on the front. Guilloux’s portrait of Cripure was inspired by a teacher and mentor of his own, the eccentric philosopher Georges Palante (1862–1925), though Guilloux once said that Cripure was “derived from Palante”—a starting point for his action, rather than a model. Like Palante, Cripure is a renegade from the Sorbonne, a man of broken friendships and a failed marriage, sharing his bed with an uneducated housekeeper, the affectionate, saintly Maïa, who dispenses level-headed wisdom inflected with Gallo, the local dialect of eastern Brittany. And like Palante, Cripure is disabled in the cruelest way, with huge, deformed feet that make it difficult for him to walk. At one point, the town boot-maker shows off Cripure’s shoes to a visiting circus manager, who wants to hire him: “But when the circus manager had learned that the owner of those astonishing boots was a professor, and of philosophy! He’d simply shrugged and changed the subject.”

The action of the novel revolves around a few signal events: a schoolboys’ plot to unbolt the front wheel of Cripure’s bicycle; a Legion of Honor ceremony at the local school, now partly transformed into a military hospital; rioting soldiers at the train station who don’t want to return to the front; Cripure’s aborted duel with his colleague Nabucet; and the adventures of an even larger cast of characters that includes draftees, antiwar students, amorous spinsters, hypocritical school officials, and pedophiles; slick politicians and young men on the make; a revolutionary leaving for Russia, an amputee, and a couple learning of their son’s execution for mutiny at the front.

Guilloux’s fiction touches on issues that are still matters of great contention among French historians of the Great War: To what extent was there a consensus about the fighting? What was the nature of the mutinies that broke out as the war dragged on? Did they occur at random or were they part of a deep current of antiwar sentiment? Soldiers in transit demonstrating at train stations, individual deserters, fomenters of revolt on the front were all in some sense “mutineers,” and their numbers add up to a few thousand or to tens of thousands—depending on your definition of “mutiny.” What is clear is that antiwar sentiment, moral exhaustion, and episodes of disobedience flourished in the summer of 1917, the summer of Blood Dark.

The best-known scene in the novel is certainly the riot at the Saint-Brieuc train station. Guilloux has a genius for portraying chaos and for letting us see the drama of the individuals inside a crowd. He doesn’t spare his readers a close-up of one of the men disfigured by trench warfare—a gueule cassée, or “broken face.” For his novel to begin in a carnival of cruelty and end in tenderness is one of its great achievements. Cripure, impossible to categorize by any of our literary labels—hero, victim, genius, idiot, madman, muse—is another.


Albert Camus considered Blood Dark one of the few French novels to rival the great Russian epics. “I know of no one today who can make characters come alive the way you do,” he wrote to Guilloux in 1946. Guilloux, Camus said later, was uniquely attuned to the sorrow of others, but he was never a novelist of despair.

Camus was only one of many French writers at the forefront of literary life in the 1930s and ’40s who considered Blood Dark a masterpiece. Louis Aragon said that Cripure was the Don Quixote of bourgeois ruin; André Gide said that the novel had made him lose his footing. On the left, Guilloux’s contemporaries understood Blood Dark as an important political response to Louis-Ferdinand Céline’s nihilistic Journey to the End of the Night, published three years earlier. “The truth of this life is death,” Céline wrote in Journey, and Guilloux responded: “It’s not that we die, it’s that we die cheated.” The publisher used that line on a paper band around the book cover. For French intellectuals in the 1930s, there was a crucial difference between Céline and Guilloux: Both writers denounced the patriotic lies that lead men to their deaths, but for Céline the violence of man to man was inevitable, biological. Guilloux, by contrast, held out hope for fraternity and for collective struggle. In his world, and in his fiction, there were always causes worth fighting for, always zones of tenderness.

When Blood Dark missed winning France’s biggest literary prize, the Goncourt (just as Journey to the End of the Night had missed it in 1932), Guilloux’s fellow writers, among them Gide, Dorgelès, and Aragon, as well as Paul Nizan and André Malraux, protested by organizing a public meeting to laud his vision and underline his blazing critique of war and human hypocrisy.

Literary historians of existentialism have argued that Blood Dark launched the notion of the absurd well in advance of Jean-Paul Sartre’s Nausea, Samuel Beckett’s Molloy, and Camus’s The Stranger. Yet Guilloux is often dismissed as a regionalist. In fact he was a transnational writer at a time when many of his contemporaries were taken up with ingrown literary rivalries. Just before beginning to work in earnest on Blood Dark, he translated Claude McKay’s Home to Harlem, rendering black American English in vibrant Caribbean slang—a Creole of his own making, adapted for French readers. Guilloux’s notebooks make clear that more than a realist, he was a voice writer, testing dialogues and send-ups of bourgeois language, recording conversations, and compiling lists of idioms and ridiculous expressions. The black English in Home to Harlem surely inspired Maïa’s Gallo-speak. Guilloux read well beyond the French canon, translating Steinbeck and McKay, on the one hand, and drawing inspiration from Dostoyevsky and Tolstoy on the other.

Blood Dark is still considered a masterpiece in France, but in English the book remains little known. Part of the problem is its first translation. Samuel Putnam’s version, titled Bitter Victory, appeared in simultaneous American and British editions shortly after the original French publication. A former expatriate, a columnist for The Daily Worker, and a translator of Rabelais and Cervantes, Putnam saw in Guilloux’s novel a condemnation of “the bourgeois culture that had made the war.” Was it he or his editors who chose Bitter Victory, a misleading title for a book set a good year before the war’s end, when no victory was in sight? Putnam translated in the “mid-Atlantic style” then in vogue, neither American nor English, supposedly pleasing to readers in both countries but actually quite lost at sea. As a result of this linguistic compromise, Guilloux’s most remarkable quality as a writer, his sense of each character’s unique voice, is muffled.

Part of what makes this new translation so riveting is the attention that Laura Marris has given to the novel’s distinct voices and places. As a poet, and the translator of the contemporary Breton poet Paol Keineg, Marris has immersed herself in local Saint-Brieuc culture and has studied Guilloux’s papers, attending to the voices and sense of place he captured. From its new haunting title on, she has brought Blood Dark to life for the American reader. In this centenary of the darkest year of the Great War, what truer novel to read? In one respect, Guilloux’s story could not be more contemporary: As violence and terror seep into every aspect of his characters’ lives, they try to hold the chaos of the world at bay.

This article appears as the introduction to Blood Dark by Louis Guilloux, published this month by NYRB Classics.

Ralph Northam Is Taking on Betsy DeVos—by Breaking With Barack Obama
October 19th, 2017, 07:40 PM

On Thursday night in Richmond, Virginia, his first day back on the campaign trail since Donald Trump’s inauguration, Barack Obama will give a stump speech for the state’s Democratic nominee for governor, Ralph Northam. The former president will inject himself into a race The Washington Post called “the country’s marquee statewide election this year,” a critical swing-state test for his party and the Trump resistance. Polls are tightening; Democrats are nervous. But with three weeks until the election, Northam, Virginia’s current lieutenant governor, is clinging to a slim lead. His campaign said last week that “Ralph and President Obama will discuss the need for the next governor to create economic opportunity for all Virginians—no matter who you are or where you’re from.”

One subject the pair likely won’t discuss, however, is their differences on K-12 education, the issue one recent poll found is most important to voters in the race. Northam represents a distinct departure from Obama’s emphasis on charter schools, support for high-stakes standardized tests, and tense relations with teachers unions. In fact, the lieutenant governor has explicitly deemphasized charters and critiqued the testing regime, while unions have sung his praises. His campaign is at once the first big battle against the privatization agenda of Education Secretary Betsy DeVos—whose family gave more than $100,000 to his Republican opponent, Ed Gillespie—and a kind of prototype for left-wing critics of Obama’s education agenda who hope Democrats will chart a new course on public schools.

“Northam is a breath of fresh air,” Diane Ravitch, the education historian and activist, told me this week, lauding him as “what every Democrat should be” on school reform. Julian Vasquez Heilig, an education professor at California State University, Sacramento, said Northam’s campaign is “a bellwether of what you’re going to see in other governors’ races,” including Lieutenant Governor Gavin Newsom’s candidacy in the Golden State next year. “There really is, for the first time in a statewide race, this opportunity to make a decision between the Betsy DeVos vision of privatizing and cutting funding for public education as public good versus someone who has more interest in a local community approach and listening to some of the critiques of the past 15 years of failed top-down education policy,” he said. Northam spokesman David Turner told me, “For obvious reasons, I’m not going to be discussing differences with President Obama right now.” But in Vasquez Heilig’s view, Northam represents “where the new wave of Democratic leaders are going.... He is breaking from Barack Obama and he is charting a different course from Trump and DeVos.”

Talk of a seismic shift in the Democratic Party’s education policy is premature. But with DeVos making “school choice” like charters and private school vouchers increasingly toxic for Democrats, there’s certainly room for a stronger defense of public education on the left. Yet as Northam is learning late in this campaign, pushing back on decades of accepted wisdom can be politically perilous.


Almost exactly a year ago, as the Obama presidency wound down, Washington Post education blogger Valerie Strauss wrote that “the growth of charter schools was a key priority in his administration’s overall school reform program.” The president had incentivized the expansion of these publicly funded but independently operated schools, which proponents say give students an alternative to failing traditional schools. But earlier this year, neither Democrat running in Virginia’s gubernatorial primary campaigned on expanding charters in the state.

In a June “email debate,” Post editorial board member Lee Hockstader asked both Northam and former Congressman Tom Perriello “why you want to continue to keep [charters] out of Virginia when there are schools in many communities that have so consistently failed their students—many of them in predominantly black and low-income areas—and when there is no hope of change or improvement.” Northam replied that “we need to make sure that we fund K-12 first before we move on to other things like charter schools.” He stressed that any charter authorization decisions should be “left to our local leaders and those closest to the communities,” and that “the charter proposals seen in Virginia would ultimately divert much-needed funding from school divisions, often those that are in the most need.”

Perriello’s rhetoric was even more cautious. “The performance of charter schools has simply not exceeded performance within the system, despite years of investments,” he told the Post. “The evidence does, however, show one clear trend, which is that schools in areas of concentrated poverty are far more likely to be underperforming.” He then went on to say, “Instead of blaming the teachers and principals, we should ask why we have not done more to reduce poverty.... Some of the solutions to our education performance must be found outside the classroom, in restoring the broken promise of social mobility and economic security for all Virginians.”

This kind of rhetoric might have endeared Perriello to the Virginia Education Association, which represents 50,000 educators in the state. But in April, the group swung its weight behind Northam, saying, “He’s the best candidate for our students, schools and educators, and he has an excellent track record of working to meet their needs.” The reference to his track record was telling. Though Perriello ran on skepticism of charters, even his past ties to “school choice” advocates like the Democrats for Education Reform were enough to turn the union off. “There was some extreme concern with regard to that issue,” the union’s president, Jim Livingston, told me. “That issue did play a significant role in our decision to embrace Ralph Northam.”

Livingston says Northam will be a better partner for teachers than Obama’s education secretary, Arne Duncan, was. “Northam understands that in order for us to move the needle on improving public education we have to include the practitioners,” he said, “and that’s something we did not see under the Arne Duncan years, and it’s something we certainly will not see under the Betsy DeVos administration.” Livingston added that Northam “provides us with the opportunity to turn away from that failed experience and really move in a new direction.”

Northam also provides a stark contrast with Ed Gillespie, who has fully embraced DeVos’s privatization agenda. “Gillespie wants to expand the state’s charter schools beyond the eight in operation,” the Post reported. “As a state senator Northam voted against loosening restrictions that govern the establishment of charter schools, and as a candidate for governor he has advocated investing in traditional public schools.” Meanwhile, Gillespie supports education savings accounts, which are basically a backdoor private school voucher scheme diverting money from public schools. According to Turner, Northam’s spokesman, DeVos looms large in the minds of many voters in this race. “I have never seen a cabinet member with the name ID of Betsy DeVos,” he told me. “Honestly, I don’t know how it happened.”

Which isn’t to say Northam has completely avoided political minefields with all this talk. In a recent interview with the Post editorial board, which has long been a champion of “school choice” and other market-driven education policies, he critiqued Virginia’s standards for student accountability under the federal No Child Left Behind law. “What would replace them?” the Post asked. “Astonishingly, after almost four years as lieutenant governor and a month away from the election, Mr. Northam had no answer.”

Particularly concerning was Mr. Northam’s view that because children are diverse, “coming from different backgrounds and different regions,” he’s “not sure that it’s fair” to give them all the same test; they shouldn’t be penalized, he said, for the environment they come from. The suggestion that some students should be required to pass one type of assessment, while others are given a different (presumably more rigorous) one, is disconcerting. There is no question that some children come to school handicapped by circumstances not experienced by their better-advantaged peers, but children do better when there are high expectations. Creating different expectations for children does them no favors; it just allows adults to escape responsibility.

The Gillespie campaign and Republican Governors Association both hyped the editorial. Samuel Abrams, director of the National Center for the Study of Privatization in Education at the Columbia University Teachers College, told me he was surprised Northam didn’t have a response about an alternative accountability system. “This is kind of baffling to me,” he said, even if Northam is “on the right track” with his statements. “I think what Northam is cognizant of, clearly, is there are perverse consequences to high-stakes testing. It crowds out time for subjects that aren’t being tested and generates a lot of undue stress for parents and teachers and students.” Turner said that Northam “feels like the system went too far and put too much emphasis on standardized tests,” and that he plans to work with teachers to find a “balanced approach.”

Ravitch defended Northam’s statements to the Post in a blog post. “What’s the ideal accountability system?” she wrote. “Northam admitted to the editorial board that he doesn’t know.... What doesn’t work is one-size-fits-all standards like Common Core. What doesn’t work is promising rewards or threatening punishment to teachers and principals, tied to test scores.” This may be true, but in a tight race where education is at the forefront of voters’ minds, it’s political malpractice for a candidate not to have a solution in mind. Northam’s slip could prove to be a cautionary lesson, especially if he ends up losing to Gillespie: As Democrats energize voters by railing against DeVos, while also breaking with their party’s past education policies, they had better have an answer to what the future should look like.

Mom, Interrupted
October 19th, 2017, 07:40 PM

Sam Fox, Pamela Adlon’s television alter ego on FX’s Better Things, is not like a regular mom. She’s a cool mom. You know this because she dresses like Patti Smith, all steel-toed ranch boots and distressed canvas blazers, with a smudge of kohl around her eyes. She sometimes calls her three daughters “dude.” One night, after a Joe Walsh concert, she comforts her 16-year-old daughter’s friend, who has had a run-in with an ex-boyfriend. “If it is any consolation,” she counsels, “I see people I blew all the time.... You can either live with it, or not go out, or blow less people.” Sam’s daughter Max (Mikey Madison) is mortified. To be the progeny of the cool mom is both a blessing and a curse. It brings the freedom to wear a crop top to a church service (as Max does), but it can prevent your adolescent transgressions from playing out with the operatic drama you desire.

In the first episode of Better Things’ new season, Max has tried to shock her mother by dating a 36-year-old man named Arturo. They meet when he is dating Sam’s friend Macy, but he soon switches his attentions, promising Max he will take her to the running of the bulls. (He doesn’t.) The pair show up to a party at Sam’s large, Spanish-style bungalow. Mother and daughter scowl at each other for several minutes, until Max pulls Sam into the laundry room and admits that she’s in way over her head. Sam duly sees off the unwanted suitor, threatening to call the police. For Sam, who is a single mother of three daughters, navigating the balance between control and protection is both a daily practice and a survival tactic: Push them and they pull away, but fail to protect them and everyone loses. Better Things is, above all, a very funny show about motherhood and the mundane, the snarl of tedium and tenderness that fills the waking hours of a parent’s life.

If this mood-board displays shades of Louie, another half-hour comedy on FX, it’s because Louis C.K. is one of the show’s executive producers and co-creators. Adlon and C.K., longtime friends, began collaborating on HBO’s Lucky Louie, where she played his wife, and continued to work together on Louie, where she played Louis’s friend and unrequited love interest Pamela. When Adlon decided to make a show of her own, she chose, like C.K., to begin at home, mining her own life for material and then building out a fictional world on top of her own experience. Like Sam, Adlon is a single mother raising three daughters alone while carving out a life in show business (her most notable role, prior to Louie, was the voice of 12-year-old Bobby on King of the Hill). Better Things isn’t completely autobiographical: Adlon’s real family serves as more of a writing prompt than a documentary subject. The show falls more into the category of autofiction: an auteur playing a version of herself run through a fun-house mirror.

Autofictional sitcoms have been floating around since The Dick Van Dyke Show, but the genre has become truly dominant in prestige comedy over the past decade. There are the shows starring men who play bizarro versions of themselves: Louie, Aziz Ansari’s Master of None, and Larry David’s Curb Your Enthusiasm. And then there is Insecure, Issa Rae’s crackling HBO comedy, which grew out of her semi-autobiographical web series, Awkward Black Girl; British playwright Michaela Coel’s Chewing Gum; Tig Notaro’s One Mississippi; and comedian Rachel Bloom’s Crazy Ex-Girlfriend. Families and love interests drift in and out of these shows, but ultimately they aim to say something hyperspecific about their creators: what it is like to be a comedian, a young black woman in Los Angeles, a child of immigrants, or in Larry David’s case, a hapless narcissist.

Adlon’s show traffics in this same specificity, but because it puts five women—Sam, her three daughters, and her mother, Phyllis—at the heart of the narrative, its universe feels both more complex and far less claustrophobic. It is less Louie and more like the work of Jenji Kohan or Jill Soloway, who with Orange Is the New Black, Transparent, and I Love Dick have embraced a broad spectrum of female experiences—and ages. Better Things is grounded in the minutiae of family life. It may be the most hyperrealist, purposefully casual portrait of teenage girls and aging women on television right now.


The multigenerational, all-female household on Better Things looks like Lauren Greenfield’s iconic “Girl Culture” photographs come to life. We see Max lazing around the den of the bungalow like a pampered cat, scrolling through Snapchat when asked to help carry a suitcase. We see her, with a pert hair flip, tell her mother to stay out of her life, only to turn into a puddle of anguish when she is gripped by insecurities. We see Phyllis (Celia Imrie), whose entitlement runs from charmingly batty to totally exasperating, pestering Sam to buy her a lavender suit at the hardware store—even though both women know that the hardware store will carry no such thing.

Phil, as Sam calls her, gets her own dedicated episode in the second season. After she is found trying to steal a priceless ancient artifact from the museum where she volunteers as a docent, Phil flees the property but cannot locate her car. Feeling embarrassed and out of control, she decides to leap into a manhole to cause an injury. Laid up in a hospital for five days, Phil finally has the undivided attention of her daughter, whose care is usually laced with distraction and cruelty. (A devastating moment from the first season involves Sam bailing on a road trip with her mother after five minutes.)

In another scene, Phil taunts a young child in a bookstore for wearing a fake mustache and then proceeds to lose control of her bladder in the aisle. Her aging is not pathologized, but it’s still played for tragicomedy; as Phil is losing her grip on her body and her mind, Sam doesn’t suddenly soften to her mother’s whims. Instead, she plays the relationship with as much realism as I’ve seen. Having a parent in decline is often as frustrating as it is mournful: It’s logistics and patience and effort. There is, of course, love at the base of all this work, but Adlon never sugarcoats aggravating family moments with a group hug. That Sam loves her kids, and loves her mother, is widely apparent: The entire show is about how she grapples with being the center of gravity for four separate, insatiable maws of need.

Adlon likewise gives the adolescent dramas space to breathe without taking them so seriously that they feel maudlin. Sam’s middle child, Frankie (Hannah Alligood), is a sardonic, self-righteous teen who wears baggy clothes and goads her sisters ruthlessly. When Frankie is sent home from school in the first season for using the boys’ bathroom, she tells Sam that it’s not because she wants to be a boy, but because she was disgusted by the behavior in the girls’ bathroom, where one classmate “stuck her finger up her pussy.” In season two, Adlon doesn’t dwell on Frankie’s gender identity as a major plot point—just as a teenager living in a bohemian Los Angeles enclave in 2017 likely wouldn’t. Frankie’s life doesn’t turn into a Very Special Episode due to one unjust school punishment, but rather continues apace, with microdramas that have nothing to do with how she wears her hair.

In one episode, Sam feels underappreciated and lashes out at her two oldest daughters, demanding that they stage a faux-funeral for her so she can hear their eulogies while she’s still alive. Frankie, in her tribute, hints at how she has relied on Sam to help her through emotional tumult, admitting that “I would unload on her—Mom, where are my socks, or whatever—because I needed to give her some of my pain, because I knew she could carry it when I couldn’t.” We know Frankie is in pain, as any teenager moving through the world is in pain, and Adlon—who directed every episode of this season—finds a way to show it without resorting to either pathos or overanalysis.


If the first season of Better Things had too much of anything, it was gravity. It reveled in the prosaic heartbreaks of day-to-day life, without much surrealism or fanciful reprieve. And while it was always propelled by Adlon’s charisma, which is radiant, the show felt moored to Sam’s maternal struggles in a way that didn’t allow viewers to appreciate her prismatic qualities. We seldom saw her doing her job, or out on a date, or with friends, without her daughters around. The second season provides a bit of a corrective: The majority of one episode is devoted to watching Sam in her work as an actor, teaching a scene study class and filming a car commercial. She also takes more romantic risks this season. The second episode features a caustic breakup scene in a parking lot that, on its own, should earn Adlon another Emmy nomination.

It’s in showing Sam in love, with all the complication it brings, that Better Things really does something new with motherhood on television. In a scene that begins with Sam in bed in the morning, we see flashbacks from the night before, when Sam met a man named Robin, a weathered but handsome divorced dad, at a depressing book reading. As Sam stares at the ceiling, we see a replay of their night together, including a walk under street lamps and getting a drink. This all appears like a montage from a rom-com, even as—back in reality—Sam’s two oldest daughters start to paw at her feet for attention. Their squabbling and silly jokes play underneath Sam’s daydreams like a soundtrack. Even as the mother allows herself to fantasize, her family is always there, clinging to her legs.

The scene is beautiful, almost meditative. Adlon allows us to conceive of a world in which a single mom isn’t torn between her Romantic Desires and her Mothering Responsibilities. She finds a way to reconcile these parts of herself, conveying all the ways in which most of the people she meets are trying (and often failing) to reach the same balance. Everyone in Better Things is a little bit damaged, which means that the mother is not the only frazzled mess on screen. In granting all her characters a brokenness, Adlon breaks open the stereotype of a working mother and shows her in full relief.

A Revered Biographer Looks at His Own Life
October 19th, 2017, 07:40 PM

In 1987, Saul Bellow gave an interview about the controversial book The Closing of the American Mind by his friend Allan Bloom. During the conversation, Bellow delivered a now-infamous line that was widely read as a flippant profession of cultural imperialism: “Who is the Tolstoy of the Zulus, the Proust of the Papuans?” It became exhibit A in the case against Bellow, who had seemed to disregard the cultural output of all nonwhite peoples with a flip of his hand. But according to him, he never said it. In a New York Times op-ed, he wrote: “Nowhere in print, under my name, is there a single reference to Papuans or Zulus. The scandal is entirely journalistic in origin, the result of a misunderstanding that occurred (they always do occur) during an interview. I can’t remember who the interviewer was.”

THE SHADOW IN THE GARDEN: A BIOGRAPHER’S TALE By James Atlas. Pantheon, 400 pp., $28.95

The interviewer was James Atlas. At the time of the interview, he was the author of a celebrated biography of Delmore Schwartz, and he would go on to write a notorious biography of Bellow, as well as to preside over the Penguin Lives biography series as founding editor, exercising an unparalleled influence upon the recent course of American literary biography.

Now, Atlas has written about himself. In his memoir The Shadow in the Garden, he writes: “I have come across the notebook in which I scrawled this now-famous (or infamous) line, and it turns out that the word Bellow used was ‘Polynesians.’” Atlas’s confession falls short of Bellow’s protestation, and everything that makes the memoir intriguing and uncomfortable is present in this spat. The Shadow in the Garden — a memoir of Atlas’s work as a biographer, interspersed with reflections on the history of the art — explores the dark side of life writing: the tradeoffs between vividness and accuracy, the struggle with feelings of invisibility and inferiority, the difficulty of retaining sympathy for a person about whom you know too much, and the rancor that can arise between a biographer and his subject.


The first book Atlas wrote, Delmore Schwartz: The Life of an American Poet, was a critical triumph. Contracted to write it when he was just 25, he used techniques learned from Richard Holmes and Richard Ellmann to produce a biography that read like a novel. The method that produces such vivid life writing is something Atlas calls “empathic observation.” It involves a bit of imagination and a great deal of meticulous research to characterize life events as if from within. Voice, dramatized dialogue, atmospheric scene setting—these are techniques that can make a biography vivid and memorable. But getting them right depends upon prodigious feats of detail-mongering. To begin a chapter by evoking the specific weather of a historical day, for example, means sleuthing “through almanacs, old newspapers, and nautical records.” Atlas writes:

You can’t know what your subject felt. But you can get close. Going deeper into his subject through the artful deployment of his materials, the biographer inhabits his subject’s life as if he had been there — as if the events he’s relating hadn’t occurred in the distant past but had just happened.

Atlas thinks he came close to knowing what his subject felt in the years he spent investigating the poet Delmore Schwartz. It may have worked out because of a great personal affinity that Atlas discovered between himself and his subject. Schwartz suffered from bipolar disorder; his life was a tragic story of immense promise unfulfilled. Later, Atlas was diagnosed with the same illness. Perhaps the preconditions for successful empathic observation are not merely to be struck by interest, but to see oneself in the subject.

Atlas’s research led him to make the acquaintance of an aging generation of New York intellectuals, many of whom had centered their careers around the now-defunct Partisan Review. One of them, Dwight Macdonald, became Atlas’s mentor and informal editor. Through Macdonald, Atlas got a taste of the combative discourse of the 1940s that had shaped Schwartz’s intellectual life. “The manuscript had a battle-scarred look,” he wrote of receiving his draft back from Macdonald. “There were singed holes where smoldering cigarette ash had been scattered over the page, and one chapter, edited from the hospital bed where Dwight was recovering from an operation, arrived in the mail wrapped in gauze, the pages smeared with blood — visible evidence of the surgery he was performing on my sickly prose.” Atlas also became friends with the critics Alfred Kazin and Philip Rahv, and eventually he made the acquaintance of the most famous figure associated with the New York intellectuals, Saul Bellow.

There was something fateful about meeting Bellow. The novelist had known Delmore Schwartz, and was, in fact, writing a fictionalization of the poet’s life when Atlas first came to interview him. The resulting book, Humboldt’s Gift, probably contributed to the success of Atlas’s biography of Schwartz, and Atlas believes that one passage is actually about him:

I was alarmed when I encountered a passage [in Humboldt’s Gift] about young scholars “fabricating cultural rainbow textiles” out of the 1940s. “Young people, what do you aim to do with the facts about Humboldt,” asks Charlie Citrine. “Publish articles and further your careers?” It was a relief when he moved on to a rumination about Humboldt. A satirical aside was one thing; a whole portrait I didn’t need.

Perhaps this encounter should have warned Atlas off, but he grew increasingly fascinated with Bellow. He decided to become Bellow’s authorized biographer, if he could. He was never authorized, but he wrote a biography anyway. It was Atlas’s second biography, and it made as dramatic a splash as his first, but this time to a chorus of critical condemnation.

Over the years it took Atlas to research the book, his relationship with Bellow grew increasingly rancorous. Bellow was distrustful of Atlas in part because the idea of a biography reminded him distressingly of his own death—it was, he said, “the shadow of the tombstone in the garden.” Bellow insisted upon signing off on each quotation. This struggle was complicated by how much Atlas admired Bellow. “I had wanted to be a son to him,” writes Atlas, “only to discover the spot was taken by several others.”

Writing a biography, he found, could be emotionally complicated. Immersing himself in the details of another person’s life led inevitably to self-comparison. But comparing himself to a celebrated novelist like Bellow was much less flattering than comparing himself to a forgotten poet like Schwartz. The process engendered envy and self-doubt:

Most of the time I didn’t mind our unequal stature and talents: Go, you be the genius. But sometimes I felt: What about my life? Doesn’t it count, too? There comes, inevitably, a moment of rebellion, when the inequality begins to chafe. Biographers are people, too, even if we’re condemned to huddle in the shadow of our subjects’ monumentality. ... A thousand pages along, a decade in, the biographer cries out: What am I? Chopped liver? Yes. That’s what you signed on to be, and that’s what you are. Deal with it.

The envy may have turned into resentment, and the resentment may have infected his prose. When his book was finally published in 2000, critics almost universally perceived it as an act of enmity against its subject. As late as 2015, when a new biography of Bellow appeared, this perception continued to cloud the reputation of Atlas’s book. In a review of that new biography for the New York Times, Dwight Garner wrote that Bellow had “lost faith in Mr. Atlas and stopped co-operating with him before the book was published.” But, Atlas protests, “whether he’d lost faith in me or not, he co-operated right up to the end.” It’s not the only one of these corrections that he issues. The Shadow in the Garden is at least partially a defense of Atlas’s honor in response to the harsh reception of this earlier book.

What saves the memoir is the self-awareness with which Atlas presents his personal experience. At times he is less defensive than apologetic, eager to get it right. He describes how the reception of his biography of Bellow caused him to worry obsessively about whether he had been unfair or malicious:

One day I sat down with yellow Post-its and went through my book page by page, marking the places where I felt I had gotten it wrong — not in fact but in tone, a thing much harder to get right. I referred to these as the Twelve Errors. ... Some were ungenerous assertions: ‘Bellow wasn’t a nurturing person — to students, children, wives, or parents. He wanted the nurturing.’ Others were snotty: ‘Writers who posed a threat to Bellow’s hegemony got the cold shoulder.’ A few were judgmental, the worst sin of all: ‘In Bellow’s self-serving and sanctimonious account…’ How I longed to edit these sentences.

Over the years, Atlas would take down his book again and consider each Post-it, occasionally discarding one that, on reflection, did not seem to embody an error. Eventually he had an epiphany:

The key to writing biography is the capacity to be empathic; Holmes’s image of the biographer extending “a handshake” toward his subject stayed with me. At some point, without realizing it, I had withdrawn my hand.

Atlas chooses to join the chorus of condemnation against his own book, not on grounds of inaccuracy but on grounds of unkindness. He claims to have fallen short of his own standards of empathic observation. The reader, however, is hard-pressed to sympathize with the idea that he did not understand Bellow. Instead, it seems that Atlas got far too close.

The memoir is divided between the light and the darkness of biography, illustrated by the empathic triumph of his first book and the empathic failure of his second. He contextualizes both experiences by interweaving discussions of the history of literary biography. Atlas discusses J.A. Froude, whose biography of Thomas Carlyle also acquired a reputation for hostility, and Elizabeth Gaskell, whose biography of Charlotte Brontë was attacked on every side by figures from Brontë's life who disliked how they were portrayed in it. The suggestion is that The Shadow in the Garden is written on behalf of all the biographers whose honesty about their subjects was interpreted as gossip, or whose readability was maligned as salaciousness. Such are the pitfalls of the genre. But is writing biography worth it? Atlas thinks it is:

The challenge of reconstructing someone else’s world; the opportunity to educate yourself; the serendipitous encounters and unlikely finds, I found this invigorating. You could never hope to get it right. ... But if, over many years, you worked very hard and extended your hand, you could write a book that earned a high degree of permanence.

Does Trump Live in an Alternate Reality?
October 19th, 2017, 07:40 PM

It’s hard to accurately characterize President Donald Trump’s habit of making consistently false, frequently self-contradictory, often hypocritical, and always flamboyant statements. No word quite captures their all-encompassing magnitude, their frequency, and, often, their sheer pointlessness. “Lies” is always a good place to start, but in Trump’s case it only begins to cover the problem. “Bullshitting” is too cute for the rolling crisis we find ourselves in. “Gaslighting” implies that something strategic is happening, and Trump appears to be working on pure intuition. We don’t have the language to convey how serious the president’s lies—or obfuscations or exaggerations or feints or whatever else you want to call them—are.

Furthermore, we don’t know with any certainty what effect his public statements are having on policy. What does it mean when the president, say, offers a grieving Gold Star family $25,000 but doesn’t pay up? Does it imply that he thinks the military’s survivor benefits program is inadequate, or did he just want to weasel out of an awkward situation? Are his threats to start a nuclear war with North Korea serious, or a bluff? Does he really want to blow up Obamacare, or was that a poor attempt to move complicated legislative processes forward?

Axios’s Mike Allen has taken a stab at diagnosing the problem. Earlier this week, Allen argued that Trump’s inability to tell the truth represented a kind of ontological problem that the media was ill-equipped to handle. In a follow-up post, Allen took this argument in a more interesting direction, arguing that Trump’s brazenly untrue statements represent an “alternate reality” that is separate from the “reality” of policy-making cabinet officials like Secretary of State Rex Tillerson and Secretary of Defense James Mattis.

“It’s a feature, not a bug, of this White House for Trump to say one thing about policy, and for his cabinet or hand-picked officials to say or do the exact opposite,” Allen writes. “This dynamic—like the spreading of fake news or false statements—makes it hard for the media, Republicans, and his cabinet to determine when to take the leader of the free world seriously. ... This is not a plot of evil genius to keep friends and foes guessing. It’s the inevitable output of an improvisational president who often says whatever pops into his head.”

This is fairly banal stuff. Because everyone knows the president is full of it, it's hard to hold him accountable and even harder to push policies through, since no one knows where the president stands on any one issue at any one time. (There is an element of Schrödinger's cat to Trump's presidency; he simultaneously stands for a very specific agenda and nothing at all.) This is basically the position of the rest of the administration, with Tillerson famously saying that the president "speaks for himself."

But Allen’s analysis doesn’t go far enough. Here are a few of the examples he cites:

    • “SecState Rex Tillerson says North Korean diplomacy ‘will continue until the first bomb drops’; Trump tweets that he’s ‘wasting his time.’”
    • “SecDef Jim Mattis tells Congress that holding onto the Iran nuclear pact is in the interest of the national security of the United States; 10 days later, Trump threatens cancellation.”
    • “Trump threatens extreme action on immigrants, Muslims, ‘Dreamers,’ trade, NATO, and more, but aides and advisers wind up softening or delaying most—with the notable exception of the Paris climate deal.”

The problem here isn’t just that Trump is defying and ignoring the advice of his cabinet. The problem is that Trump’s lies are themselves becoming the main drivers of American policy. In certain instances, we may have ended up in a less disastrous place than what Trump had originally envisioned. But in the case of the Dreamers, Iran, and Obamacare, he kicked the issues to a flighty and inept Congress, which is now tasked with resolving these problems quickly. Trump may be bluffing or bullshitting or just plain lying, but these bluffs, bullshit, and lies are being woven into every major American policy.

Take the Iran deal, which Trump “decertified” against the wishes of most top cabinet officials. It is something of a cop-out, since Congress is likely to uphold the deal, but decertifying it carries considerable risk. In response, Iran could restart its suspected nuclear weapons program. The move also undermines the already questionable reliability of the United States, at a time of heightened tensions with North Korea. And both these risks were taken despite the fact that there were no real policy benefits to speak of.

Trump is trying to find a middle ground between a campaign promise to rip up the Iran deal and the reality that, though imperfect, the status quo is far preferable to any attainable alternative. Without a commitment to go all the way, the reasons for “decertifying” are entirely optical. And yet Tillerson and National Security Advisor H.R. McMaster both hit the Sunday show circuit to defend the decision and the integrity of the United States.

This uncertainty has bled into the attempt to address the dangerous situation in North Korea. There, the president’s low-rent madman theory—that he is so unpredictable that it keeps North Korea constantly guessing—has also crept into the more serious world of actual policymaking. Tillerson’s claim that diplomacy will continue until “the first bomb drops” is itself a variant of the kind of rhetoric that Trump uses with regard to North Korea.

While in normal administrations there is an attempt to minimize the daylight between the president and his cabinet, in the Trump administration there isn’t enough daylight between the president and a coterie of diplomatic and/or military officials who have the country’s best interests at heart—a group that Allen has referred to as the “Committee to Save America.” Trump’s approach to these issues is wildly irresponsible, and yet more often than not it ends up being the approach his administration ultimately undertakes. Whatever the Committee to Save America says in private to journalists and other policymakers, their public comments make it clear they’re living in the president’s reality.

This is especially clear in the “extreme action” Trump has taken with regard to the Dreamers (those young immigrants whose permits to stay in America have been tentatively revoked) and Obamacare (whose subsidies for low-income consumers have been tentatively rescinded). By foisting these problems on a reluctant Congress, Trump has theoretically stopped short of doing actual damage; after all, it’s possible for legislators to come up with solutions (even if it’s also possible that Trump may not ultimately sign them into law). But these actions have a real chance at being codified, and the fact that action was taken at all shows that it effectively doesn’t matter whether Trump’s more responsible cabinet disagrees with him. In both these cases, the preferred outcome for policy-makers would have been if Trump had done nothing.

There's a temptation to isolate Trump from those around him and from the U.S. government writ large. It's comforting to believe that someone, somewhere in the Trump administration, whether it be John Kelly or Rex Tillerson or Jared and Ivanka, is able to keep a handle on reality. The fact that some advisers are telling Trump not to do damaging things suggests there's a degree of truth to this. But the bigger story is that Trump's alternate reality is invading every aspect of American foreign and domestic policy, very much including those areas overseen by the officials who have been deemed the sane ones.

Ron Chernow's Grant Is Popular History at Its Best
October 18th, 2017, 07:40 PM

This is a strange time for a genre of American history that is sometimes derisively referred to as “dad history.” These approachable tomes about the great men of the past are a fixture of airport bookstores and Barnes & Noble wishlists, and they have never been more popular or lucrative. David McCullough’s John Adams became an HBO miniseries. Ron Chernow’s Alexander Hamilton became Lin-Manuel Miranda’s hit Broadway musical. Many of these titles—particularly Doris Kearns Goodwin’s Team of Rivals—have transcended their genre, becoming fodder for entrepreneurs and the managerial class. After all, who better to teach us the lessons of leadership than the greatest leaders ever?

But even as dad history has reached its cultural zenith, it has never seemed more out of step with the times. Advances in historical scholarship have brought more attention to the plight of Native Americans, African Americans, women, and other marginalized groups. Greater attention is being paid to the original sins of the United States—slavery and genocide—and the way they have reverberated across the centuries. The best works of recent narrative history, such as Edward Ayers's The Thin Light of Freedom and Heather Ann Thompson's Blood in the Water, have focused on ordinary Americans' contributions to seismic events, a ground-level viewpoint that is at odds with the Great Man perspective.

GRANT by Ron Chernow (Penguin Press, 1,104 pp., $40)

And then there is the current occupant of the White House, whose “I alone can fix it” approach to governance should give Great Man historians pause. If there was ever evidence that history is moved by broader forces, as opposed to the talents of any one individual, Donald Trump is it. And the emergence of Trump as the country’s preeminent political figure is a sure sign that those original sins are very much still with us.

Chernow’s new book, Grant, is deft at navigating these tricky currents. Coming in at a well-paced but absolutely gargantuan 959 pages, Grant is a stirring defense of an underrated general and unfairly maligned president. Its great contribution to the popular understanding of the Civil War and its aftermath is to expose the roots of the longstanding bias against Grant: White southerners and their allies wanted to portray Reconstruction as a tragic folly, rather than a radical and unfinished revolution.

To be sure, a sympathetic treatment was to be expected: Chernow is enormously protective of his subjects. Alexander Hamilton and Washington: A Life are both handicapped by his refusal to acknowledge any flaw (elitism and slaveholding, respectively) without heavy qualifications. He has often been praised for humanizing marble men, but in fact Chernow is uncomfortable with their shortcomings, which leads him to create marble men who are merely a little more lifelike. Secondary characters often suffer, because his subjects' enemies often become his enemies. Chernow's work is best understood as a kind of modern myth-making, updating the stories of America's past in ways that make them come alive in the present.

In Grant he is aided enormously by his subject's extraordinary life. The book tracks Grant's struggles as a civilian; his incredible and rapid rise during the Civil War; his unexpected fall at the hands of the con man Ferdinand Ward, the 19th century's Bernie Madoff; and, eventually, the writing of his memoirs, one of the greatest works of American literature. (Ironically, the strength of The Personal Memoirs of Ulysses S. Grant, which have just been reissued in a very helpful annotated edition, may have delayed the recognition of Grant's genius by discouraging potential biographers.)

For a book that seeks to set the record straight, it is also aided by the flimsy but strongly held myths about Grant, nearly all of which can be sourced to critics of Reconstruction and Lost Causers: that Grant was a butcher who only won victories because of the North’s technological superiority and numerical strength; that his presidency was defined solely by corruption; that his lifelong struggle with alcoholism was deeply debilitating.

Despite writing very long books, Chernow is not one to get bogged down in anything that isn’t the main plot. Fort Sumter is fired upon on page 123, while the next 15 years constitute 735 pages, meaning that Grant’s first 40 years and final decade get about 100 pages each. This makes for a propulsive and focused book, but it does have its drawbacks, particularly when it comes to filling out Grant’s elusive character. This is most evident when the discussion turns to his flaws, about which Chernow is characteristically apologetic.

Grant has been universally praised for his composure under pressure and in the heat of battle, but he was a sucker who was repeatedly betrayed by people close to him and easily connived out of money. Chernow is as mystified by this trait as Grant's contemporaries were. Grant's odious General Order No. 11, which expelled Jews from the military district under his control, is tortuously explained as being aimed more at Grant's opportunistic father, who had entered into business with Jewish merchants in an attempt to profit from his son's success.

Then there's the vexing issue of Grant's alcoholism. Here, Chernow is, to his credit, remarkably sympathetic. He treats alcoholism as a disease that Grant largely overcame, rather than a personal failing. Once again he is defensive about his hero's weaknesses, but in this case that defensiveness is channeled in a useful way. Grant's alcoholism was frequently used by his foes, many of whom opposed Reconstruction, to denigrate the man who was championing black labor and voting participation.

Chernow is most comfortable describing battles and landmark achievements, so it's no surprise that his treatment of Grant's role in the Civil War is the book's most fully realized section. Other generals—Grant's friend and sometime frenemy William Tecumseh Sherman, Robert E. Lee, the despicable Nathan Bedford Forrest—are typically held up as the "geniuses" of the war. But Chernow persuasively makes the case that Grant was its most forward-thinking and innovative general and that, while he had equals as a tactician, his ability to manage, mobilize, and deploy enormous armies was unsurpassed. Chernow is also in fine form depicting Grant's commitment to abolition, which grew dramatically during the course of the war and has not been adequately explained by other popular biographies. In Grant, Chernow patiently unspools Grant's realization that the Civil War was about slavery and, ultimately, equal protection.

This leads to Grant’s real strength: its treatment of Reconstruction. It is portrayed as a continuation of the divisions that led to the Civil War, rather than a grace note, a national embarrassment, or a well-intentioned failure. Ken Burns’s 1990 documentary The Civil War, which made a maudlin case that the war ultimately brought the country closer together and largely ignored its aftermath, emerged right at a moment when the so-called revisionist interpretation of Reconstruction was ascendant. Grant breaks little new ground historically, but the fact that it treats Reconstruction as the beginning of a long, troubled fight for equality in America is nevertheless important.

Chernow can be too fatalistic at times. He excuses the failure to redistribute lands as politically impossible, and treats Grant's retreat from Reconstruction in 1875 as a tragic inevitability, given that many in the North had tired of the military occupation of the South. The treatment of Native Americans during Grant's presidency, meanwhile, is muddled: Chernow is torn between Grant's assimilation-oriented Peace Policy and the reality that the president's old friends Sherman and Phil Sheridan oversaw numerous massacres while under his authority. Chernow also tends to insulate Grant from his administration's corruption, which, he insists, Grant knew little about. This helps keep the focus on Grant's devotion to fighting for progress and against the Ku Klux Klan in the South, but it also feels disingenuous—continuous scandal, in fact, damaged the administration's credibility and helped undercut Reconstruction efforts.

But to his credit, Chernow is no sentimentalist. In past accounts—by biographers, but also in the kinds of folktales that usually pass for American history—Grant's funeral symbolized a nation finally coming together after the Civil War: Grant's coffin was carried by three Southern generals and three Northern generals, all of them united in grief. Chernow resists this familiar cliché, offering an account of the funeral that is affecting but grounded in ambivalence about what Grant was able to accomplish during his life.

Chernow underscores Grant’s tremendous efforts on behalf of the four million slaves freed during and after the Civil War, while acknowledging the fact that Reconstruction remains unfinished even today. Popular history emphasizes neatness and inevitable triumph in the face of adversity, and Grant’s presidency was neither neat nor triumphant. He was a complicated hero who won the Civil War but was never quite able to win the peace—or the ultimate promise of emancipation. By placing this imperfect hero at the center of a book sure to be found under the Christmas tree, Chernow has given us a rare kind of popular history: one that forces readers to confront hard truths, not just revel in America’s all too fleeting triumphs.

Sibling Rivalry
October 18th, 2017, 07:40 PM

At a town hall meeting in New York City early this year, Nancy Pelosi fielded a thorny question about the direction of her party. Trevor Hill, a dapper New York University sophomore sporting a light purple shirt and suspenders, wanted to know where the House minority leader stood on the question of socialism. A recent poll had shown that more than half of all American voters younger than 30—not just Democrats—no longer support capitalism. This statistic felt true to Hill's own experience, not just among his NYU classmates but also in the polls and television coverage he'd seen. He was glad that Democrats had moved to the left on social issues, like gay marriage. So why, he asked Pelosi, couldn't they move left on economic issues? Could she see Democrats embracing a "more populist message—the way the alt-right has sort of captured this populist strain on the right wing?"

After politely thanking Hill for his question, Pelosi was quick to shoot down any talk of left-wing populism. “We’re capitalist,” she told him firmly, “and that’s just the way it is.” To be sure, Pelosi acknowledged, there are serious flaws in the system: CEOs are making too much money, and the social safety net has worn thin. But Pelosi assured Hill that Democrats, aided by enlightened capitalists, can solve such problems. The alternative—introducing socialist-oriented policies such as universal health care or free college education for all—is unthinkable. “I don’t think we have to change from capitalism,” Pelosi concluded. “We’re a capitalist system.”

The contrast between Pelosi, a centrist liberal, and Hill, a young leftist, is emblematic of deep fissures within the Democratic Party. These divisions—which flared up during last year’s brutal primary race between Hillary Clinton and Bernie Sanders and have only intensified since Donald Trump’s victory—are often seen in narrowly partisan terms, as a lingering quarrel between rival Democratic factions. But it’s a grave mistake to dismiss this dispute as nothing but postelection infighting. In truth, Clinton and Sanders are proxies in a long-standing ideological battle between the two major camps within the Democratic Party: liberals and socialists.

If the battle seems intense, it’s because the two camps are so closely related. Liberalism and socialism are best understood as sibling rivals. Both were born of the common inheritance of the Enlightenment and the democratic revolutions of the nineteenth century. Both are committed to secular amelioration of the human condition. Their family feud is waged over the central issue of the nature of capitalism. Liberals see it as a flawed but worthy system that needs reform, while socialists push for its ultimate (if distant) transformation into a system where major economic decisions are brought under democratic control.

As with all sibling rivalries, competition brings out the best and the worst in both sides. Over the past century, liberals and socialists have engaged in rancorous debate, bitter recrimination, and even political repression. Yet Democratic presidents from Woodrow Wilson to Lyndon B. Johnson won their most consequential victories when they faced strong left-wing challenges. Liberals’ greatest achievements—including child labor laws, Social Security, and Medicare—were all based on ideas that socialists agitated for. The most radical phase of Franklin D. Roosevelt’s presidency—the Second New Deal period from 1935 to 1936, when the federal government guaranteed workers the right to organize and enacted a large-scale public works program—took place against the backdrop of intense organizing by socialists and communists. It was widespread fear of this working-class militancy that allowed FDR to push through a far-reaching agenda.

Going forward, the crucial question for Democrats is whether socialists and liberals can overcome their rivalry and find ways to work together. In an age of nationalist fervor and populist unrest, neither can succeed alone. As any child knows, there’s nobody harder to get along with than a member of your own family. But that is what it will take for liberals and socialists to reclaim their common inheritance, defeat Trumpism, and forge a more egalitarian and representative democracy.


Socialism has been in eclipse in America for so long that we don't normally think of it as a major ideological force. While Eugene Debs presided over a vibrant third party in the early twentieth century, becoming the most popular left-wing insurgent ever to run for president, his successor, Norman Thomas, won only 0.29 percent of the vote as the Socialist Party candidate in 1948. Over the course of a century, socialism in America dwindled from mass movement to die-hard sect to eccentric hobby. In his book Blood of the Liberals, the journalist George Packer provided a dispiriting account of what it was like to be a member of the Democratic Socialists of America, the major organ for socialist politics after Debs's party dissolved, during the 1990s. "The organization acquainted me with the pathos of left-wing activism in twilight," Packer recalled:

It was marginal, pedestrian work, based on the eternal postponement of gratification: three-hour board meetings in a narrow room in a church basement; a two-year fund drive to buy a used computer; a snowbound forum on the Canadian left, whose announcement reached most of the membership too late because the nonprofit mailing wasn’t sorted properly. The word “Sisyphean” is misleading, since we never pushed our rock anywhere close to the top.

Given this disarray, the most enduring legacy of 2016 might not be Donald Trump’s presidency, but the rebirth of American socialism. Sanders, a democratic socialist, won 43 percent of the vote in the Democratic Party primaries and has, polls show, become the most popular active politician in America since Trump’s election.

The Sanders campaign was the crest of a larger socialist wave that began with the global financial crisis. The ensuing recession, coupled with the manifest failure of either Democrats or Republicans to address the crisis adequately, sent more and more Americans in search of explanations and solutions that establishment politics did not offer. On the right, this contributed to the rise of the Tea Party, Breitbart News, and Donald Trump. On the left, Occupy Wall Street critiqued the predatory one percent in a new vocabulary, while Rolling Jubilee attempted to salve the crushing burdens of student and medical debt. DSA membership has surged to record levels, and socialist magazines both new and old (Dissent, n+1, Jacobin) hum with lively debates over single-payer health care and universal basic income. For the first time since the 1960s, the left wing of the American political spectrum doesn’t end at left-liberalism, but extends to socialism.

In the realm of political theory, if not in the messier realities of practice, it has long seemed logical and necessary that liberalism and socialism should converge. In the nineteenth century, John Stuart Mill, the foremost heir to classical liberalism, concluded that liberalism’s commitment to private property and limited government, born of an age when suffrage was still restricted to propertied men, would have to be modified in the age of mass democracy, when both women and working-class men had won the vote. The newly enfranchised masses would make material demands that couldn’t be met by the minimal state endorsed by classical liberalism. The liberal ideal of individual freedom could only become real to most people if it were bolstered by socialist economic policies.

“The social problem of the future,” Mill explained in 1873, was “how to unite the greatest individual liberty of action, with a common ownership in the raw materials of the globe, and an equal participation of all in the benefits of combined labor.” Mill’s project of creating a liberal socialism (or a socialist liberalism) was taken up by such distinguished successors as Bertrand Russell and John Dewey. “The cause of liberalism will be lost,” Dewey observed in 1935, “if it is not prepared to go further and socialize the forces of production now at hand.” Dewey was writing at the height of the Great Depression, when European business elites were openly supporting fascism as a means to counter working-class militancy. Bourgeois liberal society, it was clear, could not defend its own achievements unless it accepted social democratic reform.

In practice, however, relations between liberals and socialists have been tense and sometimes bloody. Consider the fate of Debs, who received nearly 6 percent of the vote as the Socialist Party candidate for president in 1912. Under his leadership, the party won over a thousand local races across the country, electing socialist mayors in 24 states. Debsian socialism was broad and inclusive, attracting immigrants in New York and farmers in Oklahoma. His party also stood at the forefront of the feminism of the time, supporting female suffrage and the legalization of birth control.

Yet the Socialist Party was ultimately undermined by Woodrow Wilson, who both stole its ideas and suppressed its leaders. As Irving Howe observes in Socialism in America, many of Wilson's major reforms were drawn directly from the "traditional socialist legislative program," including "a graduated income tax, the Clayton Act to limit labor injunctions, a child labor law, several laws helping farmers, the direct election of senators." Having stolen socialism's thunder, Wilson used the powers of the state to quash the Socialist Party. Declaring Debs a "traitor to his country" for his opposition to America's entry into World War I, the Wilson administration banned socialist publications and jailed party activists—including Debs himself, who was convicted of sedition in 1918 and imprisoned the following spring. In 1920, when Debs ran for president from his prison cell in the Atlanta Federal Penitentiary, he received nearly a million votes.


While relations between liberals and socialists would never get so low again, certain patterns would recur. Socialist energy and activism continued to push liberals toward their most far-reaching reforms, ranging from Social Security under FDR to the War on Poverty under LBJ (which took inspiration from Michael Harrington’s The Other America). Yet time and again, socialists came away disappointed by the way such reforms were weakened or derailed by the persistent power of big business. This sense of betrayal contributed to the marginalization and isolation of the party’s left, steadily reducing the influence that socialists were able to exert on the liberal establishment.

Norman Thomas, who had been active in organizing sharecroppers during the 1920s and 1930s, was furious that farm relief under FDR funneled taxpayer money to big planters, a key constituency of the Democratic Party. Thomas became an implacable critic of the New Deal, which he felt did not go far enough in reforming capitalism. In the ’60s, LBJ’s escalation of the Vietnam War similarly derailed the possibility of a socialist-liberal alliance, driving many young idealists away from mainstream politics into political extremism or apolitical self-indulgence. Politically engaged young people who might have become Democratic Party activists instead joined the Weather Underground or retreated into communes where they cultivated their organic gardens.

Socialist Party leader Eugene Debs was declared a traitor for his opposition to World War I. (Bettmann/Getty)

In his survey of American socialism, Irving Howe came to the surprising conclusion that the most promising model for liberal-socialist cooperation came not from the party of Debs but from a movement even further to the left: the Popular Front forged by the American Communist Party between 1935 and 1939. During this period, the party demonstrated a skill for realpolitik that Debs and Thomas were incapable of. Instead of opposing the Democrats, the Communists became their junior partners, forming anti-fascist front groups that pitched communism as a patriotic movement. Repackaging themselves as “liberals in a hurry,” the Communists worked hard to emphasize the shared goals of a broad left.

To be sure, the Popular Front’s attempts to make socialism as American as apple pie sometimes sounded absurd. In 1938, the Young Communist League at the University of Wisconsin put out a pamphlet that read:

Some people have the idea that a YCLer is politically minded, that nothing outside of politics means anything. Gosh no. They have a few simple problems. There is the problem of getting good men on the baseball team this spring, of opposition from ping-pong teams, of dating girls, etc. We go to shows, parties, dances and all of that. In short, the YCL and its members are no different from other people except that we believe in dialectical materialism as a solution to all problems.

But however strained its sales pitch may have been, the Popular Front embraced a key insight that had eluded socialists. In The Heyday of American Communism, published in 1984, historian Harvey Klehr argued that the Communists “discovered just how open and permeable American political parties were.” As Communist official Terry Pettus told Klehr, “You are a member of whatever you say you are.” Just by working within the system, Pettus and his comrades discovered, a small group of activists “could win control of a large segment of the Democratic Party.” By forging a successful alliance with liberals, the Popular Front enabled the Communist Party to become a significant political force in states like New York, Minnesota, and Washington.

For Howe, the lesson was that socialists like Debs and Thomas might have profited from creating their own version of the Popular Front. Rather than establishing a third party—a strategy doomed to failure in America’s winner-take-all electoral system—socialists should try to become an influential faction within the Democratic Party, winning elections on the party ticket and shaping policy from Washington State to Washington, D.C. That’s what Jesse Jackson attempted with the Rainbow Coalition, expanding the range of ideas and constituencies that were welcomed within the party. That’s how the Tea Party transformed itself into the dominant force within the Republican Party in the space of a few years. And that’s the strategy Bernie Sanders pursued when he ran as a Democrat in 2016, injecting socialist ideas directly into the liberal mainstream and forcing the party further to the left.


Socialist strategy isn't the only thing that has prevented the emergence of a new Popular Front. While Democrats under Wilson and FDR embraced and even implemented ideas from the left, today's liberals have worked hard to distance themselves not only from socialism, but also from liberalism. The historic cooperation between the two broke down with the rise of Bill Clinton and the New Democrats, who championed Third Way politics in the '90s. In an earlier era, Arthur Schlesinger Jr. had envisioned liberalism as inhabiting the "vital center" between "fascism to the right, communism to the left." Clinton adopted Schlesinger's term—speaking of the "vital American center"—but he located his midpoint on a very different spectrum. Clinton positioned himself as a centrist standing between the conservative extremism of Ronald Reagan and the older liberalism of FDR and LBJ.

Third Way politics moved the center rightward. Clintonism wasn't just to the right of New Deal liberalism on specific policies, like welfare reform and balanced budgets—Clinton urged Democrats to embrace the right's underlying ideological framework. "The era of big government is over," he declared in his State of the Union address in 1996, abandoning any commitment to bringing the economy under even a modest degree of democratic control. In place of government regulation, Third Way liberals extolled the market as the solution to all of society's problems. Government itself had to be brought under the discipline of market economics, made leaner and more entrepreneurial. Instead of large, universal programs like Social Security and Medicare, Third Way liberals supported targeted, market-friendly ideas like privatization and welfare reform. If there had to be more government spending, it would be means-tested.

Third Way politics returned Democrats to the White House—but it also laid the groundwork for the rise of both Bernie Sanders and Donald Trump. Barack Obama won by promising hope and change, but he governed by hewing to the market-friendly version of liberalism staked out by Bill Clinton. He was wary of implementing public-works programs even in response to the worst economic catastrophe since the Great Depression, refused to crack down on the big banks responsible for the crisis, and tinkered with health care markets rather than pushing for a single-payer system. Whatever success Obama achieved, his brand of liberalism remained well to the right of that of FDR.

But while Obama worked to protect liberalism from any whiff of socialism, extremists on the right were forging a powerful alliance of their own. With Obama's election, the Tea Party—a movement that cloaked its fanaticism in the language of patriotism—staged a takeover of the Republican Party. At the same time, the legacy of a quarter-century of Third Way policies, from free trade to financial deregulation, supercharged the wealth of the one percent and drove millions of Americans into joblessness, insecurity, and debt. To respond to these crises, liberals need the energy and ideas of socialism, while the left needs the access to power that only the Democrats can provide in our two-party system. As a result, the possibility of a successful alliance between liberals and socialists is greater today than at any point since LBJ's creation of the Great Society.


To forge a modern-day Popular Front, both liberals and socialists will have to give a little, making a variety of ideological compromises in the interests of political unity. Take foreign policy: Democrats like Hillary Clinton have supported the American wars in Iraq and Afghanistan, and don’t treat Israel’s occupation of the West Bank as a pressing problem. Socialists, by contrast, tend to oppose U.S. intervention overseas, and DSA has backed the campaign of Boycott, Divestment, and Sanctions against Israel. Daniel Biss, a Democratic state senator running for governor of Illinois, recently demonstrated the limits of liberal tolerance in foreign policy when he dropped his running mate, DSA member Carlos Ramirez-Rosa, over his involvement in BDS.

Another hurdle is money. Socialists hold that a party reliant on big donors isn’t free to fight for economic equality. You can’t run on lowering drug prices, say, when pharmaceutical companies are funding your campaign. Fortunately, socialists appear to have come up with an alternative funding model that could help overcome this obstacle: Last year, Sanders raised $220 million mostly from a broad base of individual donors, who contributed an average of $27 each to his campaign. If the Sanders model can be replicated in congressional races and elect a Democratic president, then liberals need no longer be beholden to Wall Street and big corporate interests.

A final obstacle to an alliance of socialists and liberals is identity politics. For most of the past century, socialists have been well ahead of most liberals in opposing racism and sexism. But socialists have done themselves no favors by insisting that class must take precedence over issues like race and gender—a position that comes across as dismissive of the concerns of women and people of color. Certainly Sanders made himself vulnerable on this front by treating identity politics as little more than a distraction from the goal of economic equality. In the wake of the 2016 election, Sanders continued to emphasize class above all else: “It’s not good enough to say, ‘Hey, I’m a Latina, vote for me.’ I have to know whether that Latina is going to stand up with the working class of this country and is going to take on big money interests.”

To overcome the sharp divide over class and identity, socialists must develop a new generation of politicians—one that can speak to such issues in a more sophisticated way. Since economic inequality is deeply intertwined with both race and gender, it’s reasonable for socialists to argue that focusing on income redistribution will disproportionately help women and people of color. But politics isn’t just a matter of pursuing the right policy—it’s about voice and representation. For socialism to succeed, it can’t come from a white male demanding that women and people of color choose between their class interests and their racial and cultural identity. The political face of socialism—the leaders who represent it—must resemble the actual working class, which is increasingly nonwhite and female.

Newt Gingrich and Bill Clinton, 1995: Third Way liberals like Clinton moved the center rightward. (Paul J. Richards/AFP/Getty)

Despite such hurdles, though, there is good reason to think that socialists and liberals can forge a Popular Front, one motivated by the need to confront a common enemy: a capitalist system that has returned America to the stark inequality of the Gilded Age. In recent years, there’s been a dramatic shift in liberal thinking about the nature of capitalism itself. As tech giants like Google, Facebook, and Amazon increasingly exert their influence on all aspects of public life, activists like Barry Lynn have played a major role in reviving anti-monopoly politics among liberals. Documenting the concentration of corporate power, the Open Markets Initiative has pushed for a revival of antitrust laws, a powerful tool and central tenet of the Democratic Party until the rise of Third Way liberalism. In the current political landscape, liberals are being forced to reexamine the systemic nature of the crisis, which means finding blame not just in a baleful individual (Trump) or even his party (the Republicans) but also in the larger social forces that made the crisis possible.

For the first time in decades, liberals are starting to see entrenched capital as the primary foe. What were once heretical positions are now becoming litmus tests for politicians on the national stage. When Sanders introduced his Medicare for All proposal in early September, almost a third of all Democrats in the Senate signed on as co-sponsors. Notably, the list included many of the politicians considered most likely to run for president in 2020, including Cory Booker, Elizabeth Warren, and Kirsten Gillibrand.

Whether or not Sanders himself runs again, he’s clearly setting the agenda for policy debates that have long been the exclusive domain of liberals. Democrats are once again listening to the ideas and vision of the socialist left. The limits of the possible—the best working measure of politics—have expanded enormously. Both sides recognize that democracy cannot function as long as moneyed interests go unchallenged. Which means that Nancy Pelosi’s response to the hopeful NYU student—“We’re capitalist”—is no longer the final answer.

Art Laffer and the Intellectual Rot of the Republican Party
October 18th, 2017, 07:40 PM

The economist Arthur Laffer appeared early Monday on the Fox Business Network to defend the current president's proposed tax cuts. "I'm hoping the Democrats vote with it. They should vote with it, they believe in it, they want it," he said, wrongly. "To let this partisanship go to that extreme that they vote against America is, to me, shocking. I can't imagine a lot of them not voting in favor of the president's bill." Minutes later, he added, "I just don't know how a Democrat can honestly vote against this bill and hold his face up high to the electorate. It just doesn't make sense to me." Donald Trump, a known cable-news addict, tweeted later that morning.

Laffer might not be a familiar figure to many of Trump's 40 million Twitter followers, but the former economic adviser to President Ronald Reagan is a giant within the Republican Party as the "godfather" of supply-side economics, the theory that cutting taxes and regulations creates economic growth (a cousin to trickle-down economics, which argues more specifically for tax cuts for the rich). In reality, though, Laffer is a Svengali, a huckster who holds unconscionable influence over one of America's two major parties, despite his ideas having been discredited—and not only by critics on the left—from the very beginning. That Trump and other Republican policymakers are promoting Laffer's disproven theory yet again is the most damning evidence of the party's intellectual rot.

The legend of Laffer begins with a bar napkin, on which he drew the famous "Laffer Curve" during a meeting in 1974 with Donald Rumsfeld, then President Gerald Ford's chief of staff, and Dick Cheney, Rumsfeld's deputy. The Smithsonian is currently displaying what it claims to be the original napkin, but Laffer himself disputes its authenticity, telling The New York Times that he thinks it's a keepsake he created later. Still, the Smithsonian is right to honor this most famous of American serviettes, the Magna Carta of modern Republican economics.

The Laffer Curve napkin displayed by the Smithsonian, dated 9/13/74, reads, "If you tax a product less results. If you subsidize a product more results. We've been taxing work, output and income and subsidizing non-work, leisure and unemployment. The consequences are obvious!" (National Museum of American History)

The bell-shaped curve illustrates Laffer's argument that there are two points where a tax rate generates no revenue: 0 percent, for obvious reasons, and 100 percent, because no one would work if their entire income were taxed. Between those two points, if we follow the trajectory of Laffer's semicircle, there's a stretch where raising taxes yields more revenue, but also a downhill slide where a higher tax rate becomes counterproductive. The curve is more theoretical than practical, since it doesn't answer the question of when a tax rate is too high. As a matter of politics, though, it offers a splendid rhetorical tool to Republicans who want to slash taxes for the wealthy without having to worry about the deficit: Tax cuts, they claim, spur enough economic growth to offset the lost revenue.
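To see the argument in miniature, here is a toy sketch in Python (my own illustration; the napkin specified no actual formula), assuming that reported income shrinks linearly to zero as the rate approaches 100 percent, so that revenue vanishes at both endpoints and peaks somewhere in between:

    # Toy Laffer curve -- an illustrative assumption, not Laffer's model.
    # Revenue = rate * taxable income, where the tax base is assumed to
    # shrink linearly to zero at a 100 percent rate.
    def revenue(rate, base_income=100.0):
        taxable_income = base_income * (1.0 - rate)  # assumed behavioral response
        return rate * taxable_income

    rates = [i / 100 for i in range(101)]
    peak = max(rates, key=revenue)
    print(revenue(0.0), revenue(1.0))                          # 0.0 0.0 at the endpoints
    print(f"peak at {peak:.0%}: revenue {revenue(peak):.1f}")  # peak at 50%: revenue 25.0

The toy model's peak at 50 percent is purely an artifact of the assumed response; where the real-world peak sits is precisely the question the curve leaves open.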

But Laffer's insistence that tax cuts are self-financing has been disproven time and again over the past 40 years. Reagan's tax cuts were followed by ballooning debt. Clinton raised taxes on the wealthiest Americans, and the economy expanded while the government recorded its first budget surplus since the 1960s. After George W. Bush's tax cuts, America fell back into the red and the economy sputtered.

[Chart via Marketwatch.com]

Of course, as some Republicans will note, the tax rate is one of many factors that affect economic growth or contraction; the late '90s tech boom helped Clinton, then hurt Bush when it burst. But ardent supply-siders like Laffer claim that the tax rate is the determining factor, which, considering the historical graph above, makes their case even weaker. As Jonathan Chait demonstrated in his 2007 book The Big Con, "supply-siders have taken the germ of a decent point—that marginal tax rates matter—and stretched it, beyond all plausibility, into a monocausal explanation of the world." The supply-siders "think booms and busts result from changes in tax policy—and only from changes in tax policy." In reality, the Laffer Curve is a solution to an imaginary problem. While there might be a point where higher taxes yield less revenue, America is nowhere near that juncture. According to the Tax Policy Center, the nation's tax revenue in 2015 equaled 26 percent of gross domestic product, compared to an average of 34 percent among developed economies.

For these and other empirical reasons, economists across the political spectrum are dismissive of the Laffer Curve. Harvard economist Greg Mankiw, an outspoken conservative who led George W. Bush’s Council of Economic Advisors, has called supply-siders “Charlatans and Cranks” and compared them to a “snake-oil salesman.” Paul Krugman, the liberal New York Times columnist and Nobel Prize–winning economist, often argues with Mankiw on economic policy, but also thinks “the supply-siders are cranks” who adhere to a doctrine “without a shred of logic or evidence in its favor.” This widespread criticism is almost as old as the Laffer Curve itself. As The Washington Post reported in 1989, at the dawn of George H. W. Bush’s presidency:

Laffer's mathematical "curve" gained folklore status—if limited acceptance by other analysts. ... Critics seize on the $155 billion federal budget deficit as proof that the tax cuts he advocated were excessive. They also dismiss the supply-side claim that a tax cut could pay for itself by stimulating new business activity, a claim Laffer says that others exaggerated. "History has contradicted, not confirmed, the Laffer hypothesis," said Robert Eisner, an economics professor at Northwestern University and past president of the American Economic Association. "Certainly, I think it made no contribution to professional economics." Lawrence Chimerine, chairman of the WEFA Group, an economic consulting firm in Bala-Cynwyd, Pa., said the tax cuts aggravated America's financial ills, including the trade deficit. He described the benefits promised by Laffer and others as "way, way overblown."

Laffer’s theory originally took root, though, because he “voiced a message that the president liked to hear.” The same is true today, as Trump’s tweets on Monday attest, but it’s also a message that Republican leaders in Congress like to hear; they’re all behind the tax reform plan unveiled last month, which is nothing if not a paean to supply-side economics. As Paul Krugman notes, “The main elements of the plan are a cut in top individual tax rates; a cut in corporate taxes; an end to the estate tax; and the creation of a big new loophole that will allow wealthy individuals to pretend that they are small businesses, and get a preferential tax rate. All of these overwhelmingly benefit the wealthy, mainly the top 1 percent.” Laffer is enthusiastic about the plan, since it would lower marginal rates for the economic elite—though Laffer, like most Republicans, denies that the cuts would only benefit the rich. “I think it’s trying to make everyone millionaires, billionaires, and trillionaires,” he told Al Jazeera last month. “This is a plan to get everyone to economic growth and give everyone a shot at the apple.”

Trump’s tax plan is likely to increase public debt and widen inequality instead. History suggests as much, as do distinguished economists. But such evidence means little to Laffer and the Republican Party. As the late Robert Bartley, a Wall Street Journal editor who was one of Laffer’s biggest promoters, once crowed, “Economists still ridicule the Laffer Curve, but policymakers pay it careful heed.” Whether they do so because they believe Laffer’s theory, or just use it as intellectual cover to cut taxes on the rich, is an open question. But Trump could have an entirely different goal in mind: bringing peace to a fractured party. While Steve Bannon’s insurgent populists challenge Senate Majority Leader Mitch McConnell and the GOP establishment, the entire party agrees on cutting taxes for the rich. Supply-side economics is the glue holding Republicans together, for now.

The rise of supply-side economics on the right is perhaps the first major example of the modern Republican Party’s abandonment of policy expertise and empiricism. As such, it’s a forerunner to climate change denial, birtherism, and other noxious forms of anti-intellectualism in the GOP. The resiliency of Laffer and his ideas shows that the Republican Party’s decay goes much deeper than Trump. It’s easy enough to ridicule the president as an ignoramus, but it’s hardly an accident that he leads a party that has elevated a crank into an intellectual luminary. If the GOP is cynical enough to accept the Laffer Curve for purely political reasons, it can convince itself to accept any policy idea, no matter how ruinous. Laffer’s legacy, then, extends well beyond bad economic policy: He made the Republican Party safe for snake-oil salesmen, not least Trump himself.

Trump’s Dangerous Spin on Puerto Rico’s Suffering
October 18th, 2017, 07:40 PM

Last week, the public got a rare glimpse of the art of spinning a federal natural disaster response. On Friday, Bloomberg reporter Christopher Flavelle published excerpts from internal Pentagon emails he accidentally received, in which officials at the Defense Department and Federal Emergency Management Agency discussed their press strategy for the government's flailing relief effort in Puerto Rico. Flavelle was included on these messages for five days, despite repeatedly alerting Pentagon officials to their error. Ultimately, he decided to publish them, noting that they offered "a glimpse into the federal government's struggle to convince the public that the response effort was going well."

The emails also showed how public affairs officers had to adapt each day to deal with President Donald Trump's blustery, unpredictable rhetoric. "Much of recent media coverage is focused on the dialogue between POTUS and San Juan's mayor," read one advisory dated September 30, after Trump publicly attacked San Juan's mayor on Twitter. "Many are criticizing POTUS' lack of empathy for PR's hardship. The public's perception of U.S. Government continues to be negative as the response in PR is seen as too slow." One day later, after Trump called critics of the disaster response "politically motivated ingrates," an advisory went out noting that "the perception of [government] response continues to be negative." Spokespeople were advised to say, "I am very proud of our DOD forces," while admitting that "there are some challenges to work through."

I know this not only from Flavelle's story, but because I also received the emails (as did Miami Herald journalist Jim Wyss, who didn't return my request for comment). The first email arrived on September 28, after I spoke with a DOD communications officer to ask exactly when the department received a formal request for aid, how long it took for the military to deploy to Puerto Rico, and why helicopters weren't being deployed to certain unreachable areas. I didn't get concrete answers to all of those questions, but I did get included on the private Pentagon listserv.

John Cornielo, deputy director of public affairs for the U.S. Northern Command, later told me this was an honest mistake; I had asked to be added to the press list, and was added to the wrong one. He also disagreed that the agency was engaging in spin. “I wouldn’t say we’re putting on a positive response, I would say we’re telling it as accurately as we can—positive, negative, and neutral,” he said. “Clearly, things down there are not going well. But there are also things that are improving on an everyday basis, and those are important stories to tell as well.”

There's an ongoing debate about who's responsible for the humanitarian crisis in Puerto Rico. Some say the military has been too slow, while others blame the Puerto Rican government. But in at least one respect, there's no dispute: The Trump administration's public relations have been a debacle, as the president and his political appointees issue dishonest statements and career public servants struggle to keep the focus on the federal response. This communications catastrophe is much more than a professional annoyance for journalists; it's having a tangible impact on mainland politics and, even worse, on the welfare of Puerto Ricans.


In Puerto Rico right now, 86 percent of the 3.4 million residents don’t have electricity. There are shortages of food and medicine. Water is so scarce that some people are drinking from toxic Superfund sites. And there’s an environmental crisis unfolding, too, as raw sewage pours into rivers and reservoirs. But on Friday, Trump insisted otherwise. “They’re all healing,” he said. “Their states and territories are healing and they are healing rapidly.”

Senior officials have been in denial, too. Acting Homeland Security Secretary Elaine Duke called the federal response "a good news story." ("When you don't have food for a baby, it's not a good news story," said an outraged San Juan Mayor Carmen Yulín Cruz. "This is a people-are-dying story.") In an email to all White House staff earlier this month, White House homeland security adviser Tom Bossert reportedly pledged "to turn the corner on our public communications," insisting, "The storm caused these problems, not our response to it." And some officials, rather than deny the truth, would simply erase it: FEMA "removed information from its website documenting how much of the island of Puerto Rico still lacked power or access to drinking water," the Washington Post reported.

[Chart via kff.org]

This strategy hasn’t proven very convincing to the press or the broader public. A Kaiser Family Foundation survey of 1,000 adults from October 4-8 found that nearly two-thirds of Americans believe that Puerto Ricans “are not yet getting the help they need.” But to Republicans, Trump is doing a swell job: 74 percent believe the federal government is doing enough to help, and 75 percent say the response speed has been just right. That’s the real danger of the administration’s denial. Republicans have unified control of the government; the relief effort in Puerto Rico, from the emergency responders on the ground to the additional funds that FEMA needs, is largely dependent on the White House and Congress. If Republican constituents truly believe that Puerto Ricans are “all healing,” their elected officials face no pressure to improve what is widely considered a poor federal response.

Until that response is improved, millions of Americans in Puerto Rico will suffer needlessly. For some, simply surviving is the main challenge—finding enough food and potable water, and in some cases maintaining a steady supply of critical medications. In other cases, the struggle is mental. “We know how to eat from the earth,” Neil Delgado, a Puerto Rican who fled to Florida, told the Washington Post. “The problem now is psychological. Everyone is stressed. They’re in kind of a depression. And it’s going to get worse.” Some feel that they cannot express their full outrage at Trump because of their dependence on him. “The majority of people here feel [angry],” San Juan resident Rachel Cruz told the Associated Press, “but we have to be more balanced because we need help.” Surely, it does not help that their president denies that they’re even suffering.

The only silver lining here is that Puerto Ricans, faced with the Trump administration’s untruths, are determined to do something about it politically. Those living on the island cannot vote in presidential elections, but Puerto Ricans living in Florida can. One million residents of Florida are of Puerto Rican descent, and according to the Post, more than 100,000 living on the island may relocate to Florida—the most populous swing state in America—because of the impacts of the storm. “All politics is about motivation,” Anthony Suarez, the first Puerto Rican member of the Florida House, told the Post. “And at this point, the Hispanic community here is extremely motivated against Trump.”

Trump’s Air War
October 17th, 2017, 07:40 PM

When President Trump decided to commit additional troops to the war in Afghanistan—now entering its seventeenth year—he contradicted his own position on the conflict. After years of deriding the war as a “total disaster” and a “complete waste,” and insisting that it was high time to get out, Trump announced in August that he would instead be deepening America’s involvement. “My original instinct was to pull out, and historically I like following my instincts,” the president told U.S. troops in his address on Afghanistan. “But all my life, I’ve heard that decisions are much different when you sit behind the desk in the Oval Office.”

Despite Trump’s seeming reluctance to engage in foreign conflicts, however, he has zealously embraced his role as a wartime president. Since taking office, Trump has dramatically ramped up the use of U.S. military force in a wide range of international hot spots, from Syria and Iraq to Somalia and Pakistan. Far from being an America First isolationist, Trump has already established himself as one of the most hawkish presidents in modern history.

Consider Trump’s dramatic increase in the use of air strikes. Through August, the United States dropped 2,487 bombs in Afghanistan—more than Barack Obama dropped in his last two years as president combined. In August, more bombs fell there than in any month since 2012. Trump also dropped the so-called “mother of all bombs,” the largest nonnuclear weapon in the U.S. arsenal, on an Islamic State cave complex—the first time the bomb had ever been used in combat.

Trump has accelerated the pace of air strikes in other conflicts as well. In Iraq and Syria, the American-led coalition has unleashed more bombs each month under Trump than Obama did in any month throughout the entire campaign against ISIS, which began in 2014. In Yemen, Trump has carried out 92 strikes or raids against Al Qaeda in the Arabian Peninsula—just shy of the number of attacks that Obama oversaw in his entire second term. In Somalia, the United States is carrying out an average of one strike against the jihadist group Al Shabaab every 15 days—a sharp escalation compared to Obama. And in Pakistan, Trump ended a nine-month pause in drone strikes with four unmanned bombings—more than Obama conducted during his final year in office.

Trump is also putting more boots on the ground. In April, 300 Marines returned to Afghanistan’s Helmand Province to assist in the fight against the Taliban—their first deployment there since 2014. The following month, Trump championed a $110 billion arms deal with Saudi Arabia—reversing the Obama administration’s decision to curb the sale of precision-guided munitions to Riyadh out of concern over civilian casualties resulting from the Saudi-led campaign in Yemen. And hot on the heels of Trump’s vow to unleash “fire and fury” against North Korea over its nuclear provocations, the Air Force is working to replace America’s aging stockpile of 400 Minuteman missiles and develop a new nuclear cruise missile—a project estimated to cost more than $1 trillion.

Even if Hillary Clinton—or nearly anyone else, for that matter—were commander-in-chief, America would likely be ramping up many of its military operations overseas. Expelling ISIS from its stronghold in Mosul, for example, would have required a significant commitment of airpower from any president. Likewise, by renewing America’s commitment in Afghanistan and overhauling the U.S. nuclear arsenal, Trump is following through on plans that his predecessors set in motion.

Yet what sets Trump apart, and what is most worrying about his hawkish foreign policy, is that he has put forward no coherent plan to guide his unprecedented use of military force. Far from looking to wind down conflicts, Trump seems to be acting on whatever tough-sounding phrase pops into his head, regardless of its effect in the real world. In Syria and Iraq, Trump has made good on his vow to “bomb the hell out of ISIS.” But with no plan to create security and stability in the region once ISIS is defeated, his military aggression could wind up rekindling sectarian conflict and opening up the United States to armed confrontation with Iran. Similarly, Trump has failed to spell out how many additional troops he will send to Afghanistan, or how long they will stay there. “In the end, we will win,” he declared in August—without offering any indication of what winning actually looks like. To make matters worse, Trump has failed to fill a raft of senior posts in the State Department, hamstringing America’s ability to exercise the diplomatic power necessary to broker peace and foster stability.

The consequences of Trump’s military aggression can be seen in the number of civilians killed by U.S. air strikes. Nearly 60 percent of all civilian casualties from air strikes in Iraq and Syria since the air war began have occurred on Trump’s watch. In Afghanistan, civilian casualties skyrocketed by 70 percent during the first six months of this year, compared to the same period last year. And in Yemen, Trump’s support for the Saudi-led bombing campaign has exacerbated what has become a staggering humanitarian crisis.

Trump has repeatedly made clear that his foreign policy is not motivated by a desire to enhance global security—it’s driven by his need to burnish his own image as a winner. “We aren’t winning. We are losing,” Trump reportedly complained to his generals in the weeks leading up to his Afghanistan announcement. It’s telling that his national security adviser, H.R. McMaster, apparently helped persuade Trump to remain in Afghanistan by showing him a 1972 photograph of Afghan women in miniskirts—ridiculously suggesting that by sending more troops to fight the Taliban, Trump would be viewed as the hero who ushered in the return of Western norms to Kabul.

Trump’s emphasis on “winning” on the military front is especially dangerous for a president who has proved to be such a loser in Congress and the courts. With his legislative agenda in disarray and his administration mired in scandal, military action is the only arena where Trump can portray himself as a decisive, powerful leader. Presidents have often resorted to using military strikes overseas as a way of distracting the public from their failures at home: think Ronald Reagan in Grenada or Bill Clinton in the Balkans. But under Trump, these are not discrete, one-off attacks; instead, he has made bombing other countries a routine and defining element of his administration. The more he fails as president, the more Trump will deepen U.S. military involvement around the world—deploying more troops, wasting more tax dollars, and killing more civilians in the process.

How Trump Stole the Soul of the Values Voter Summit
October 17th, 2017, 07:40 PM

There is a point during every Values Voter Summit when the unconverted observer fights stupor. The old white men are interchangeable, distinguished only by the different colors of their ties. Even the yearly cameo by Duck Dynasty’s Phil Robertson makes little impression. The speakers’ cadences are those of the country preacher, and they ramble and gesticulate until the air feels gelatinous and every second holds a century. It can be easy to forget what you are really watching: a gathering that allows the nation’s most dedicated Christian activists to network and strategize. It is an event for the purest of the pure.

So this year’s spectacle at the Omni Shoreham Hotel in Washington, D.C., was significant not only because it shook up the usual formula, but also because its most celebrated speakers fell well outside that God-fearing realm. When Donald Trump appeared at the summit on Friday, the first sitting president to do so, the audience howled. They leapt to their feet and applauded with an enthusiasm that was only matched on Saturday, when former White House aides Sebastian Gorka and Steve Bannon received multiple standing ovations for their own speeches. But these Trumpian additions to the customary lineup of Duck Dynasty stars and pro-life leaders and Michele Bachmann indicated no discrepancy or compromise. To the contrary, they revealed the ways in which Christian politics and far-right populism have long overlapped.

This year’s Values Voter Summit possessed specific import: A politically powerful base gathered to redefine itself after a major victory. Senator Ted Cruz may have won the summit’s straw poll for the third time, but on Friday and Saturday Trumpists dominated the proceedings. Thanks to Trump, conservatives have reconquered the federal government. Via Trump, they have put Neil Gorsuch on the Supreme Court, rolled back Obama-era health care regulations requiring contraception coverage, and banned transgender people from serving in the military. And all they had to do was make an alliance with the most un-Christian of leaders, who has gleefully mocked those of devout faith, including his own vice president. “America is a nation of believers. And together we are strengthened and sustained by the power of prayer,” Trump declared at the summit, without a trace of irony. “We’re saying Merry Christmas again,” he said, and the crowd yelled “Yes! Yes!” as if it were the first time they had heard that promise.

Eighty percent of white evangelicals voted for Trump, a fact that bewildered some secular onlookers and agitated some Christian dissenters, like the Southern Baptist Convention’s Russell Moore. There are differences, certainly, between the Christian piety on display at the summit and the alt-right sympathies of Trump, Bannon, and their cadre. Despite Bannon and Gorka urging the crowd to defend the rights of all Americans regardless of sexual preference, the Values Voter Summit will never welcome gay figures like Milo Yiannopoulos or Peter Thiel. Richard Spencer’s atheism would win him no friends in this audience.


But where the two groups find common ground is in the idea of a pure Western civilization that has come under assault by liberals, foreigners, and the elite. Bannon’s narrative of a clash of civilizations is a reflection of this idea, pitting the Christian West against the Muslim East. Evangelicals, for their part, are perpetually at war with darkness, defending a Judeo-Christian polity from the forces of secularization and the encroachments of other religions, like Islam. In this story, evangelicals are holy warriors, while nefarious agents target beleaguered cake-makers and county clerks to force them to bow to the LGBT Agenda. They are obsessed with the idea of America being corrupted—if Jesus cleansed the temple, then they will drain the swamp.

All of this fuels a politics of grievance, whose greatest nemesis is a governing class composed of sell-outs and sinners. Conservative Christians have long understood politics to be a battle between the pure of heart and a malicious establishment. Their pleasure with the Trump administration is not pleasure with the Republican Party at large. For evangelicals, Bannon’s populism only reinforces the villainous nature of their traditional enemies in the Republican Party, the left, and the press.

When Gorka announced that he and Bannon have “declared war on the RINO class,” that they are “going to take on every swamp dweller” and “celebrate the Judeo-Christian values that America was founded on,” evangelicals in attendance understood him to be a fellow traveler. Bannon engendered the same understanding. “There’s a time and a season for everything,” he declared, paraphrasing Ecclesiastes. “And right now it’s a season of war against the GOP establishment.” Mitch McConnell, Karl Rove, Bob Corker—Bannon railed against them all, to the crowd’s adulation.


Later, inside the exhibition hall, values voters swarmed Gorka for photos, while right-wing organizations courted supporters. Tradition, Family and Property, a far-right Catholic group, offered a magazine called Crusade; Prince William and Kate Middleton beamed from its cover, unaware that a group promising Christian “counter-revolution” had co-opted their image. Around the corner, Reactionary Times Radio interviewed attendees. Women dressed like modest Rosie the Riveters, in long red skirts, hawked a support group for ex-abortion workers. The mood was upbeat. Michael Jackson blared from the speakers. And attendees told me they were mostly satisfied with Donald Trump.

“I would not necessarily characterize him as a Christian, but I do think that he has surrounded himself with some good role models,” said Tessa Longbons, age 21. “I think he’s made steps in the right direction. If you talk to a strong Christian like Mike Pence, sometimes that can rub off a little bit.” Most Christians she knows, she added, did not support Trump in the 2016 Republican primary, but this reticence is fading. “They’ve been pleasantly surprised that he has stuck to some of his promises,” she explained.

Millie March volunteered 500 hours to Trump’s campaign. (Photo: Sarah Jones)

Travis Weber, director of the Center for Religious Liberty at the Family Research Council, praised the selection of Neil Gorsuch and said of the administration, “They’ve done a lot of good things on religious liberty. We’re still looking for more to be done in some areas, but for instance, protecting people in terms of conscience objections to abortion-causing drugs, abortion services, health care—that’s something we see as positive development.”

Twelve-year-old Millie March was more enthusiastic. “It was Trump’s best speech ever,” she announced, and ran off. A Trump doll wagged from her open backpack.

But for the most part, assessments of Trump’s young administration were superseded by attacks on the religious right’s numerous enemies, which include Planned Parenthood, the Southern Poverty Law Center, George Soros, and, of course, the swamp. In a breakout session hosted by the legislative arm of the American Family Association, AFA Action, the group’s vice president, Rob Chambers, collected my name before the panel began. He then pulled up my New Republic review of Bible Nation—about the pernicious influence of Christian-owned businesses like Hobby Lobby—on his phone and read a portion out loud to the audience. “You may want to go back and read your Constitution,” he told me, before launching into a rant against the swamp that I presumably represent.

After him, Debbie Wuthnow of iVoterGuide.com set out a methodical plan for victory. “A lot of the blame for the swamp lies with those who have left it there,” she told the crowd. “God worked in this last election because Christians turned out to vote. If they turn out to vote in the primaries they can drain the swamp.” To help voters weed out swamp creatures, Wuthnow’s guide asks candidates detailed questions about their doctrinal positions—is man naturally sinful, for example. “They’re coming to you interviewing for a job,” she said of candidates. Her audience listened, rapt. Emmett McGroarty of the American Principles Project followed her, and urged “the deconstruction of the administrative state.” He took the phrase from Steve Bannon.

At the heart of this drive for purity, and the overweening resentment it produces, is not Christian spirituality, but a fear of being eclipsed by people who are not like them—a fear of replacement. The underlying similarities with the white supremacists who marched in Charlottesville—those who chanted, “You will not replace us”—are hard to ignore.

The old disagreements feel distant now. Trump belongs to the religious right, and it belongs to him. And what God has joined together, let no man put asunder.

The Alt-Right Doesn’t Know What to Do With White Women
October 17th, 2017, 07:40 PM

In a basement conference room in the Ronald Reagan Center in Washington, D.C., on November 19, 2016, following one of the first alt-right press conferences after the 2016 election, F. Roger Devlin (the “F” stands for Francis) took to the stage with a curious tale about Norwegian women. Their “lizard brains,” he said, were causing them to abandon Norwegian men for Pakistani immigrants, creating a crisis in heterosexual relations among Norwegian whites.

It was familiar territory for Devlin. He is not a household name the way other white nationalist leaders like Richard Spencer are, and he certainly lacks the pomp of some of the far right’s bombastic hangers-on, such as Milo Yiannopoulos. Yet the self-described “independent scholar” is a major voice on the alt-right, and has been critical in shaping the movement’s understanding of gender, marriage, and the place of white women among its ranks. He is a longtime contributor to far-right publications including The Occidental Quarterly, VDARE, and Counter-Currents. Like many on the intellectual wing of the alt-right, he has mainstream credentials, namely a doctorate in political philosophy from Tulane. But as his work veered right, he began to fancy himself a dissident. Given the rise in mainstream media attention to the alt-right and to women’s role in it, Devlin’s work sheds a great deal of light on what today’s white nationalists think of women’s place in society.

As Seyward Darby explained in her September cover story for Harper’s on women in the alt-right, there are those within the movement who are consciously pushing for increased female involvement—through YouTube channels, blogging, social media, and even presentations at movement conferences. It’s a big shift, given the oft-noted lack of female leadership in the alt-right. Some white nationalist authors have been quick to acknowledge the need for change. “There aren’t enough women who publicly align with the Alt Right at present in order for the newly awoken to find that new community,” the female white nationalist writer Wolfie James wrote in Counter-Currents. “A man can thrive as a lone wolf, but a woman will wither from loneliness.” Women’s inclusion, in other words, is crucial to the movement’s recruitment strategy going forward. But the far right’s gender politics are being shaped by Devlin and his ilk, and their vision for an ethnically cleansed future is hostile to female power. The result is that a conflict is brewing between what the alt-right knows it needs to do—recruit more white women—and the way it envisions the world it wants to build.

Although Devlin has been a presence in white nationalist circles for some time, it was his 2006 essay for The Occidental Quarterly, “Sexual Utopia in Power,” that garnered him fame in both the “manosphere” (the loose network of blogs, chatrooms, and forums run by MRAs) and the white nationalist community. He is credited with popularizing the term “hypergamy”—the practice of women “marrying up” in terms of class, sexual prowess, or societal status, which Devlin asserts is a foundational part of the feminist vision of sexual liberation. Charlotte Allen, a writer and fellow at the conservative Independent Women’s Forum, wrote in a 2010 Weekly Standard article that Devlin “deftly uses theories of evolutionary psychology” in his work, but that his “writing about the feminist and sexual revolutions frequently shades from the refreshingly politically incorrect into the disturbingly punitive.”

Devlin may have been “deft” and “refreshing” to Allen, but his theories are mostly regurgitations of the claims men’s rights advocates have made for decades. Like those who decry “gynocentrism”—a world in which women’s rights are at the fore—Devlin argues that the sexual revolution has transformed women into the dominant sex. The death of monogamy and the modern woman’s freedom to have multiple partners throughout her lifetime has left the majority of men behind, he says, creating a situation of “loneliness for the majority” of men while ensuring a “double standard [that] favors women.” Indeed, “in the feminist formulation, [it’s] freedom for women, responsibility for men.” These responsibilities, as Devlin outlines them, are manifold: It’s men, not women, who have traditionally died in wars. It’s men, not women, Devlin claims, who are the economic benefactors for their children—even if those men aren’t granted custody rights, as Devlin is quick to decry. And it’s men, not women, he says, who “have been the builders, sustainers, and defenders of civilization.”  

To Devlin, the questions of women’s role in the world and their role in sexual relationships are inseparable from what he refers to as the feminist ideology. As scholar Christa Hodapp observes in her book Men’s Rights, Gender, and Social Media, he views feminism as an “unhinged bratty sister” and a “threat motivated and furthered . . . by a desire for power.” Men shouldn’t do a woman’s bidding, Devlin says. In fact, “it is a woman’s responsibility to prove she is worthy of the privilege” of submitting to a man and bearing his children.


The question now is what will become of the tension between a need for female involvement in the far right, as asserted by James, and the ideas pushed by the alt-right’s more aggressively sexist figures such as Devlin. The answer rests largely on what becomes of the ties between the alt-right and the men’s rights movement, two amorphous and symbiotic sectors that have enjoyed increasing power and notoriety in the age of Trump. The MRA-to-white nationalist pipeline is well documented—people who become intrigued by one ideology frequently become devotees of the other. But different strains of the far right have different visions, and their ambitions can clash.

As Angela Nagle noted in Jacobin earlier this year, there has already been tension on the right between the moralist conservatism that “aims for a return to traditional marriage while disapproving of porn and promiscuity,” and the “libertine internet culture from which all the real energy has emerged.” Devlin, like other MRAs, takes the notion of gender separation based on “biological” difference even further than the traditionalists, advocating that men look abroad (to Eastern Europe, for example) for more subservient women and push for a complete upheaval of the traditionalist understanding of marriage as entailing male obligations. In his view, traditionalist visions of heterosexual marriage did not oppress women enough. He opts instead for a vision of absolute female servility. Part of this argument is economic: He claims in his essay “Home Economics” that most marriages today are premised on the idea that “a woman marries a meal ticket; a man marries trouble and expense.” He calls on men to abandon the “chivalrous pretense” underlying marriage and shake off their traditional economic responsibilities. Even in the company of reactionaries, this exhortation can come with its own baggage, especially given the urgency with which the alt-right speaks of “white extinction.” The possibility of abandoning financial support for white women and their white babies comes with its own wealth of contradictions in an ideology dedicated to preserving whiteness and white supremacy.

The growing female presence in white nationalist and broader alt-right circles has encouraged some to relinquish these Devlinesque appeals. But as Darby explained in Harper’s, while white nationalist movements that explicitly exclude women from their ranks have been renounced—however feebly—the efforts to nurture women’s leadership roles in the alt-right have met resistance. “The movement can consider incorporating women without being feminized or co-opted,” pleaded Wolfie James in Counter-Currents. For one, she says, women are crucial in a “Fourteen Words context”—a reference to the white supremacist slogan “We must secure the existence of our people and a future for white children.” No white women means no white children, and by extension no more white men. Even Devlin concedes this point, explaining that “without children, the race has no future, and without women men cannot have children.” Women have to be a part of the movement, the alt-right concedes. What is up for debate is the extent of their involvement.

For ideologues like Devlin who have based their careers in the alt-right on the demonization of women, this question poses a challenge. As he whined in “Sexual Utopia in Power,” “Western women [have] become the new ‘white man’s burden.’” At the 2016 NPI conference where he fretted over the fate of straight white men in Norway, he suggested that white men adopt “a policy of writing off women who take up with their swarthy oppressors.” In other words, white men should decline sex with women who have slept with non-whites. Such a move would be “eugenic,” he says, as it “[flushes] disloyalty from our gene pool.” But Devlin doesn’t have a high opinion of any women, whether they are white or not, whether they’ve slept with men of color or not. Who, then, are the women suitable for right-wing leadership roles in his eyes? If the alt-right’s recent history is any indication, that question is unlikely to go away quietly.

The Democrats’ Dianne Feinstein Problem
October 17th, 2017, 07:40 PM

In 2006, anti-war progressive Ned Lamont defeated Joe Lieberman in Connecticut’s Democratic primary for U.S. Senate. Lieberman promptly started a “Connecticut for Lieberman” party and ran as its nominee in the general election, creating a headache for Democrats: Should they support a colleague, or a challenger who best represents the political moment? Senators including Barack Obama and Barbara Boxer sided with Lieberman during the primary, then endorsed Lamont in the general election; others stuck with the third-party Lieberman throughout. After defeating Lamont by 114,000 votes and earning a post-election standing ovation from the Democratic caucus, Lieberman became a thorn in the side of liberals for the rest of his tenure.

Ned Lamont, meet Kevin de León.

The 50-year-old president of the California state Senate last week announced his candidacy for U.S. Senate against longtime Senator Dianne Feinstein, who is 84. Like Lieberman, Feinstein occupies the right flank of the Democratic Party, even more so in an era of resistance and progressive resurgence. California’s kooky electoral rules make it likely that Feinstein will face de León in a general election matchup, with similar dynamics to Lieberman vs. Lamont. And every Democrat in the Senate, at a time when they are striving to win back the chamber, will have to answer: Do you support a colleague, or the challenger who best represents the political moment?


De León led the California Senate in a year when practically every political move in the deep-blue state was an act of resistance against Donald Trump. The legislature increased the gas tax to fund infrastructure improvements. They renewed the cap-and-trade system to fight climate change. They made California a sanctuary state, blocking state law enforcement from cooperating with federal authorities on deportation. The Senate passed a single-payer health care bill, which the Assembly then shelved. (A cynic would say de León engineered the passage of a thin bill that he knew the Assembly wouldn’t accept, to burnish his credentials among the progressive base.)

De León isn’t exactly the Bernie Sanders of the West Coast. He endorsed Hillary Clinton in 2008 and 2016, and his candidacy may stem as much from being termed out of the legislature and not having another seat to fill as from any burning desire to oust Feinstein. However, de León has a compelling personal story—the child of a low-income, single, immigrant mother, he’s a former student and labor organizer, and would be the state’s first Latino senator in history. Compared to an incumbent who has spent 25 years alternately disappointing and antagonizing liberals, he represents a major step forward.

Feinstein’s political instincts were apparent when she loudly supported the death penalty at the 1990 state party convention, drawing a chorus of boos—which she subsequently used in campaign ads to prove her distance from the liberal base. Perhaps no Democrat in the past two decades has been as committed to expanding the national security state as Feinstein (again, like Lieberman). On domestic policy, she supported the Bush tax cuts, permanent normal trade relations with China, and the bill that repealed Glass-Steagall’s financial reforms. While strong on gun safety, women’s rights, and the environment, Feinstein has openly courted the center and rejected the left since coming to Washington. Just this year, she told the Commonwealth Club of San Francisco that Donald Trump could mature into a “good president.”

De León appears to be Feinstein’s first true primary threat since she became a senator. But in California’s election system, all primary candidates appear on the same ballot, regardless of party, and the top two vote-getters advance to the general election. Because Republicans barely have a pulse in California, Feinstein and de León may well face off in November—just as two Democrats did in the state’s 2016 Senate race. Billionaire Tom Steyer is considering running, but de León’s experience and base in vote-rich Southern California make him more likely to get into the top two. (Also, Steyer and de León are very close, and I don’t see how both run.)

For Feinstein’s Senate colleagues, who include several potential 2020 presidential candidates, that turns this race from an easily parried question (“I’ll support the will of California Democrats”) into a year-long nuisance, if not longer. De León is clearly closer to where the party has been moving, but Feinstein works with Senate Democrats every day and has probably given money or advice to all of them in the past. Potential 2020 candidates don’t want to anger a chunk of young, energetic supporters by backing Feinstein, but they also don’t want to anger the state party establishment (or its rich donors).

Senator Kamala Harris sent out a fundraising email on Feinstein’s behalf last week. Los Angeles Mayor Eric Garcetti, a dark-horse 2020 hopeful, held a re-election event for Feinstein. Garcetti and Feinstein share a consultant, Bill Carrick, a major fixer in California politics. Carrick will surely try to lock up as many endorsements and big donors as possible early, denying critical oxygen to de León, who’s relatively unknown in a state that doesn’t pay much attention to its legislature. You might even see open threats to potential de León supporters to line up with Feinstein or get effectively blackballed from state politics. And whereas Ned Lamont could meet a large chunk of Connecticut voters in person, that’s impossible in California with its 39 million residents.

Feinstein will also have a formidable war chest and the ability to self-fund. Like Lieberman, she’ll have a built-in advantage in a statewide matchup against a challenger to her left; right-leaning independents and Republicans could become a rare swing vote in a general election. If things get close, I wouldn’t be surprised to see Feinstein slam de León for raising the gas tax.


Party stalwarts I’ve spoken to in recent days worry that the attention and money spent on this intraparty fight will crowd out seven winnable House races that are critical to Democrats’ attempt to take back Congress. De León supporters respond that a young Latino challenger at the top of the ticket will draw out voters who don’t normally participate in midterm elections, swamping Republicans statewide.

Neither point has much data behind it. The Lamont-Lieberman fight didn’t stop Democrats from taking back Congress in 2006, including picking up two House seats in Connecticut. Loretta Sanchez ran for Senate in California in 2016 and didn’t even beat Kamala Harris among Latinos; any boost in turnout would be hard to separate from the wave of anti-Trump sentiment in the state.

The more interesting question is how national Democratic leaders respond. Progressives are likely to make the Feinstein–de León race into a litmus test, as well they should. Feinstein is clearly too conservative to represent one of the nation’s most liberal states. If Democrats who want to lead the party end up siding with her, they do so at their peril.

The Impeachment Litmus Test Is Dividing Democrats
October 17th, 2017, 07:40 PM

Tom Steyer has his own litmus test for Democrats. The billionaire environmentalist and major donor from California called last week for his party’s lawmakers and candidates to pledge to impeach Donald Trump. A “clear and present danger to the republic,” Trump should be removed from office over his “relationship with Vladimir Putin and Russia,” allegations that Trump used the presidency to “promote his own business interests,” and his “seeming determination to go to war,” Steyer wrote in a letter to Democratic campaign committees and members of Congress, according to The New York Times. He left no ambiguity: “I am asking you today to make public your position on the impeachment of Donald Trump and call for his removal from office.”

Steyer’s move is noteworthy not just for its audaciousness, but for its timing. Earlier this year, it seemed every day brought a new bombshell about the Trump campaign’s potential collusion with Russia and the president’s efforts to obstruct the investigation. But it’s now been five months since the president fired FBI Director James Comey, over which time special counsel Robert Mueller has been working the Russia investigation behind closed doors. As a result, the impeachment drumbeat has slowed.

Yes, Representative Maxine Waters is still pushing the issue. “Republicans should step up to the plate and confront the fact that this president appears to be unstable,” Waters said last Thursday. “I believe that really has been collusion [with the Russians], and I do think that Special Counsel Mueller is going to connect those dots. But I think there’s enough now that we all know ... that we should be moving on impeachment.” And a day prior, Representative Al Green of Texas read articles of impeachment on the House floor. But consider the response from Green’s colleagues: The Washington Post’s Mike DeBonis reports that “party leaders had prevailed upon Green not to offer the resolution and thus force his colleagues to cast a potentially troublesome vote.” House Minority Leader Nancy Pelosi has sought to tamp down talk of impeachment all year, saying Mueller’s investigation should run its course. Asked in May about Democrats’ calls for impeachment, she said, “Again, you’re talking about impeachment, you’re talking about what are the facts?.... What are the rules he may have violated? If you don’t have that case, you’re just participating in more hearsay.”

Instead of getting ahead of Mueller, Democrats like Pelosi have focused on more immediate threats to progressive achievements, like Trump’s sabotage of Obamacare and Deferred Action for Childhood Arrivals (DACA). These party leaders believe focusing on impeachment while Republicans control Congress is impractical or even counterproductive. Advocates of an impeachment push, meanwhile, see it as a moral imperative and wise strategy, providing the starkest possible contrast between the Democratic opposition and the Republicans in power. “It’s time to step up and do the right thing, not do political polling and figure out what’s the smart thing to do for the next election cycle,” Steyer told MSNBC’s Chris Hayes last week.

Yet Steyer’s impeachment litmus test isn’t being received well by some Democrats on the campaign trail. “If you say all Democrats have to do this, I just don’t think that’s right for our country,” said Amy McGrath, a retired Marine fighter pilot running in Kentucky’s Sixth Congressional District. “We’re not in power,” she added. “It ain’t going to happen. So it doesn’t do anything other than polarize us even more.”


“An impeachable offense is whatever a majority of the House of Representatives considers it to be at a given moment in history,” said Gerald Ford, then the House minority leader, in 1970. The Republican Party almost certainly won’t remove Trump from power before the midterm elections next fall, but Democrats are on firm ground calling for the GOP to do so. Scholars are building a case against Trump based on obstruction of justice, conflicts of interest, and corruption, but as Slate’s Jacob Weisberg wrote back in May, the constitutional phrase “high crimes and misdemeanors” may well cover “a much wider range of presidential abuses.” Veteran Washington journalist Elizabeth Drew, author of a book on Watergate, made a similar point last week. “A president can be held accountable for actions that aren’t necessarily crimes. A crime might be an impeachable offense—but not all impeachable offenses are crimes,” she wrote at The Daily Beast. “Impeachment isn’t a process by which an established set of principles is enforced. There’s no tablet to be taken down from on high and followed; there’s no code of offenses for which a president can be charged. There are precedents, but they’re not binding, which is a good thing.”

This is why Waters and others believe Democrats can start “moving on impeachment” right now. A majority of Democrats agree: A Public Religion Research Institute poll in August found 72 percent of them favor impeachment. Even some Republicans on Capitol Hill are speaking out about Trump’s unfitness for office. But campaigning on impeachment is not without pitfalls. While energizing the Democratic base, it might be seen as premature or impractical by more moderate voters, especially independents whom Democratic candidates will need in order to win in swing or conservative-leaning districts. That doesn’t worry Randy Bryce, the Wisconsin union ironworker running to unseat Speaker Paul Ryan. “I think it’s great,” he said of Steyer’s position. “It’s time for Democrats to be Democrats.” Asked about party leaders who’ve shied away from impeachment talk, he said, “I’d love to see every Democrat call for it.” (Bryce’s primary opponent, Cathy Myers, is also backing impeachment, saying in a statement on Monday, “Trump is clearly unfit for the office he holds and poses an immediate threat to the safety and security of this country and the world.”)

McGrath takes a different view, telling me that Democrats should wait for Mueller to finish his probe. “If what comes out is an impeachable offense, then I have no problem calling for impeachment,” she said. McGrath acknowledged that backing Trump’s ouster now “would probably please many people in my own party, and given that I have a primary, maybe that would help me. But I personally don’t think it’s the right thing to do.” Democratic strategist Stan Greenberg called a litmus test “a terrible idea,” adding, “I think ultimately Mueller will find obstruction of justice, but I think there’s a process for that.” Markos Moulitsas, founder and publisher of the progressive blog Daily Kos, agreed. “It would be a disaster,” he said. “Republican intensity is down. They hate Paul Ryan and Mitch McConnell and love Donald Trump. If we make 2018 about the Republican Congress, we win. If we make it about Donald Trump in anything but the most targeted communications, we turn out Republican voters who right now really would rather stay home.”

“As we learned in 2016, running around with our hair on fire doesn’t do the Democratic Party any favors,” said strategist Lis Smith. “In fact, you can make an argument that it actually helped Donald Trump. The louder we yelled about how outrageous he was, the more the interests and voices of regular people were drowned out. While impeachment is a tantalizing fantasy for many Democrats, it remains a fantasy. Every second we spend talking about impeachment is a second we’re not talking about jobs or how we can lift people’s wages.” For now, Smith said, impeachment “seems like more of a donor issue than voter issue. While I have no doubt you can reel in some big donors and even some grassroots money with impeachment.... It just doesn’t touch people’s lives.”

Bryce, perhaps sensitive to this critique, said the issue is “not going to be on the front-burner” of his campaign. “It’s something I believe in, but I’m not running to cast a vote in order to impeach Trump,” he said. He’s conscious that Republicans could try to tie Democrats to their impeachment stances and distract from issues like health care, yet he ultimately shrugged off that concern. “They don’t want to talk about how horrible their policies are, so they’re going to try to throw smoke over what they’re not doing,” he said. Asked whether advocating impeachment was risky in a swing district like his, Bryce was especially adamant: “People have had it with Donald Trump in this district. There is complete buyer’s remorse.”

Nationally, support for Trump’s impeachment is growing, even if it’s still a minority view. “The polling supporting impeachment proceedings moving forward is far higher than it ever was during the Watergate period,” said John Bonifaz, co-founder and president of Free Speech For People, which is leading the “Impeach Trump Now” campaign. “We agree fully that all members of Congress who have taken an oath to the Constitution should be on the record.... I do not believe that those who claim to be engaged in defending the Constitution can consistently take that oath and not engage in seeking impeachment.”

Steyer doesn’t just consider it his moral duty to push for impeachment; he believes our collective security might depend on removing Trump. When Hayes asked if Steyer could imagine Trump being impeached before the midterms, Steyer replied, “I don’t believe we can stand pat for 14 months, cross our fingers, cross our toes, and hope like heck that the world’s going to be okay. The fact of the matter is, we are in grave danger, and it is time for us to act, not to sit there and hope it turns out okay because that works for us politically.” On this point—the “grave danger” Trump poses to the world—McGrath is in agreement with Steyer. “The nuclear issue is my biggest concern with the current president,” she said. “We don’t have any mechanism in our country to stop any kind of rogue president, and I wish we did.”

While there’s no mechanism to stop Trump from ordering a nuclear attack, there is one for removing him from office. If Democrats win back the House of Representatives, they can begin impeachment proceedings. But that doesn’t necessarily mean they should campaign on impeachment. In the end, candidates should employ whatever politics best helps the party get there.

Is Donald Trump Installing a Mole in the Mueller Probe?
October 16th, 2017, 07:40 PM

On September 28, the Senate Judiciary Committee approved on a party line vote the nomination of Brian Benczkowski to be the head of the Justice Department’s criminal division. The vote put President Donald Trump one step closer to installing a potential mole at the department, with the ability to inform him of any wiretaps or significant developments in special counsel Robert Mueller’s grand jury investigation into the possible ties between Russia and the Trump campaign.

During the committee hearing, Democrats cited a number of reasons to oppose Benczkowski, who was a top aide to Attorney General Jeff Sessions when the former senator was the top Republican on the Judiciary Committee. Senator Dianne Feinstein noted that Benczkowski, a partner at the firm Kirkland & Ellis, has no prosecutorial experience and almost no experience in a courtroom. Senator Dick Durbin and others argued that he showed “really poor judgment” when he chose to represent Alfa Bank, which has been implicated in the Russia scandal, between his stint on Trump’s transition team and his June nomination to be assistant attorney general. (Alfa Bank came under suspicion after it was discovered that one of its servers had communicated with a server tied to the Trump Organization.)

Senator Sheldon Whitehouse raised a more specific concern: that Benczkowski might serve as “a back channel source of information” from Mueller’s special counsel investigation to Sessions, who has recused himself from the case. Whitehouse and others fear that having Benczkowski as the head of the criminal division could effectively breach the recusal. But there may be a still bigger risk: Benczkowski could share information about wiretaps and proceedings from the grand jury directly with the president.

The cause for concern comes from an old Department of Justice interpretation of the PATRIOT Act. Along with expanding surveillance authorities, the PATRIOT Act permitted any government lawyer to share national security-related grand jury or wiretap information with any government official as long as it would help them perform their job better. The measure was passed in response to the September 11 attacks, with an eye to sharing counterterrorism information more broadly. But the authorization of such sharing explicitly extended to “clandestine intelligence activities by an intelligence service or network of a foreign power or by an agent of a foreign power”—precisely the kind of nation-state spying at the heart of the Russian investigation.

A July 22, 2002, memo from the Justice Department’s Office of Legal Counsel, written by Jay Bybee, the author of the infamous torture memos, held that, under the statute, the president could get grand jury information without the usual notice to the district court. It also found that the president could delegate such sharing without requiring a written order that would memorialize the delegation.


Bybee’s memo relies on and reaffirms several earlier memos. It specifically approves two rationales for sharing grand jury information with the president that would be applicable to the Russian investigation. A 1997 memo imagined that the president might get grand jury information “in a case where the integrity or loyalty of a presidential appointee holding an important and sensitive post was implicated by the grand jury investigation.” And a 2000 memo imagined that the president might need to “obtain grand jury information relevant to the exercise of his pardon authority.”

If you set aside Trump’s own role in obstructing the investigation—including the firing of former FBI Director James Comey—these rationales are defensible in certain cases. In fact, the Justice Department has already shared information (though not from a grand jury) with the White House for one of these very reasons. In January, acting Attorney General Sally Yates warned White House Counsel Don McGahn that Russians might be able to blackmail then-National Security Advisor Mike Flynn. As Yates explained in her congressional testimony in May, after Flynn’s interview with the FBI, “We felt that it was important to get this information to the White House as quickly as possible.” She shared it so the White House could consider firing Flynn: “I remember that Mr. McGahn asked me whether or not General Flynn should be fired, and I told him that that really wasn’t our call, that was up to them, but that we were giving them this information so that they could take action.”

A similar situation might occur now that the investigation has moved to a grand jury investigation, if someone remaining in the White House—the most likely candidate is the president’s son-in-law, Jared Kushner—were found to be compromised by Russian intelligence. In Kushner’s case, there are clear hints that he has been compromised, such as when he asked to set up a back channel with the Russians during the transition.

If Trump were to rely on the memo, he might order a Justice Department lawyer to tell him what evidence Mueller had against Kushner, or whether Mike Flynn or former campaign manager Paul Manafort were preparing to cooperate with Mueller’s prosecutors if they didn’t get an immediate pardon. Unlike Yates, Trump would have an incentive to use such information to undercut the investigation into Russia’s meddling.

The special counsel’s office declined to comment for this article.

The 2002 memo generally supports the notion that the attorney general should decide whether the president needs to see a particular piece of information. But it also envisions the possibility that lawyers further down the hierarchy might make that decision—including the position Benczkowski has been nominated to fill. “An attorney for the government may include [in addition to the people actually prosecuting a case] other federal officials within the Department of Justice who gain access to the information under Rule 6(e)(3)(A)(i) in the performance of their law enforcement duties (e.g., the Assistant Attorney General of the Criminal Division …)”

Furthermore, it asserts that the president may delegate the authority for deciding what grand jury information he might need to know to someone besides the attorney general. And the memo cites an old opinion from the Iran-Contra scandal to argue that the president doesn’t have to memorialize any such delegations in writing. “Such a directive may be set forth in a formal executive order, in a less formal presidential memorandum, … or pursuant to an oral instruction from the President to the Attorney General or other appropriate officials.” So Trump could order someone to share information without leaving a paper trail.

The risk that Trump might use this memo to spy on Mueller’s investigation has always existed. But it was limited because the Justice Department lawyers with visibility into the investigation—notably Deputy Attorney General Rod Rosenstein and acting National Security Division head Dana Boente—are career prosecutors unlikely to share such information. While Benczkowski did hold a series of senior positions at Justice during the George W. Bush administration, he is fundamentally a political appointee who even worked on the Trump transition team.


The Democrats on the Senate Judiciary Committee all seemed to expect that Benczkowski, in the normal course of affairs, would be consulted in matters affecting Mueller’s investigation. Senator Patrick Leahy scoffed at the September hearing, “There is no way in God’s green earth that if Mr. Mueller is coming up with prosecutions into the Russian hacking, or into even the banks or anything else, that the Justice Department’s not also gonna look at that to make sure, ‘Are there other things we need to be prosecuting?’”

Benczkowski’s close ties to Sessions and his involvement in the transition should be enough to require his recusal from all matters involved in this investigation. That’s before you factor in the possibility that Alfa Bank may play a role in the probe. Still, as Feinstein noted in the hearing, Benczkowski has refused to pledge to recuse himself from the Russian investigation on three different occasions during his confirmation process. Beyond his work for Alfa Bank, Benczkowski said, “the ethics rules do not require a recusal in those circumstances.”

Benczkowski has promised an ethical approach. “If I am confirmed and a matter comes before me in the criminal division where I believe recusal might be warranted,” Benczkowski explained, “I will review the law and the specific facts, consult with career ethics officials at the department, and recuse myself from any matter where such a recusal is appropriate.”

Benczkowski also promised to adhere to rules, written while he worked with the Bush-era Attorney General Michael Mukasey, limiting contact between the White House and the Department of Justice. “I understand those rules, and I will abide by them,” he said in his July confirmation hearing, “largely because I helped write them and I’ve helped enforce them.”

However, despite these statements, Benczkowski’s nomination remains a risk, partly because there is a lot of murkiness surrounding recusals in the first place. Despite three requests from Whitehouse and Senator Lindsey Graham, the Justice Department still refuses to explain the details of Sessions’s own recusal and to specify who else has access to the investigation. “I have asked Deputy Attorney General Rosenstein repeatedly to explain who at the Department of Justice has access to special counsel Mueller’s investigation,” Whitehouse said in a statement. “The only response I’ve received is that the special counsel ‘communicates directly with relevant Department components in the exercise of its responsibilities.’”

None of these concerns have slowed Benczkowski’s path to the Justice Department, where he could very well serve as Trump’s grand jury mole if he is confirmed by the full Senate. Even the Republicans on the Judiciary Committee most sincerely concerned about the integrity of the Russian investigation, like Graham, voted to support Benczkowski’s confirmation.

Which is why, at the very least, the Justice Department has to finally explain how it is ensuring the independence of Mueller’s investigation. “We must understand what boundaries protect Mr. Benczkowski from becoming a backchannel source on the special counsel investigation to the recused attorney general or the White House,” Whitehouse insisted. “So far, we have not, and that is very troubling.”

Should We Rebuild Homes in Wildfire Zones?
October 16th, 2017, 07:40 PM

If Northern California’s famous wineries could survive Prohibition, there’s no doubt they’ll make it through this year’s wildfires. That’s the attitude of Michael Honig, chairman of the Napa Valley Vintners trade association, who told New England Cable News this week that the 14 wineries damaged or destroyed by the ongoing blazes would not only rebuild, but come back stronger. “This is a short-term setback,” he said.

It may take some time, but other Californians will eventually adopt Honig’s determination to reconstruct the thousands of structures lost to the ongoing blazes, which are the deadliest and most destructive in state history. Real estate journalist and investor Brad Inman is sure of it. “As a journalist, [I’ve written] stories about post-disaster rebuilding in places like Oakland and San Francisco after the 1989 Loma Prieta earthquake; Los Angeles after the 1994 Northridge earthquake; and visited Phuket after the 2004 Tsunami in Thailand,” he wrote this week. “I was always struck by people’s fierce will to rebuild.” Not long after the historic Oakland hills firestorm in 1991 destroyed 3,500 homes, families started to put their neighborhood back together, Inman recalled. Eventually, they created an even better Oakland hills—one with higher property values and better public works.

This is an understandable response. People displaced by fires need new homes, of course, and naturally they would rebuild on the land they still own. And insurance policies usually provide the means to do so. But our planet is different today, as is our understanding of what constitutes smart, sustainable development. Wildfires are wreaking more havoc; California’s ten most destructive wildfires on record have all occurred in the last 30 years. That’s partly just because there’s more stuff to burn: more structures, more cars, more people. But it’s also because of human-caused climate change, which makes it more likely that wildfires burn longer and stronger.

So just as we grapple with questions about rebuilding in flood zones as the sea level rises, we must grapple with rebuilding in wildfire-prone zones. Does it make sense to reconstruct an entire community that could just burn down again? Is it worth the public cost, and the risk to more firefighters’ lives? If so, can we build in a way that decreases the risk of catastrophic damage?

These questions extend not just to those homes destroyed by the Northern California wildfires, but to developers across the country who are choosing to build in high-risk zones. Sixty percent of new homes built in the U.S. since 1990 have been constructed in areas adjacent to fire-prone public lands, and this is forecast to continue, according to an analysis by Montana-based Headwaters Economics. Kelly Pohl, a researcher at Headwaters, says now’s the time to pressure developers to halt this trend—or, at least, to start using smart land-use strategies to reduce risk.

“We have to honor and recognize that this has been a really tragic fire season for lots of communities across the country. We’re talking about real people with memories, communities, and experiences that have been severely impacted by these fires,” she said. “But people’s memories are short, and we have an opportunity to learn from this.”


Americans in general, and Republicans in particular, hold almost as much reverence for private property as they do for the inalienable rights granted by the Constitution. Dictating where and how people build their homes—or even questioning it—invites immediate backlash. But some conservatives feel that there should be a price, or standard, for people whose personal decisions are more likely to cost taxpayers. “Anybody ought to be able to live wherever they want to live,” former FEMA Administrator Michael D. Brown, now a prominent conservative radio host, told me after Hurricane Irma destroyed seawalls and flooded homes in Florida. “I’m just saying taxpayers shouldn’t have to subsidize your choice.”

And taxpayers across the country do subsidize decisions to build in wildfire-prone areas. The U.S. Forest Service spends an ever-larger share of its budget on firefighting—from 13 percent in 1995 to 50 percent in 2015, according to Curbed. Wildfire protection and suppression cost the government approximately $3 billion annually, according to a Headwaters analysis—equal to half of President Donald Trump’s proposed budget for the entire Environmental Protection Agency.

Those costs are expected to rise with climate change, and not just in places like California, Oregon, and Montana. Wildfires destroyed more than 100 homes in Tennessee last year, and this year in Florida, wildfires forced thousands to evacuate. More than 4,000 communities across the country are prone to 100-plus acre wildfires, according to Headwaters.

Each dot represents a community threatened by one or more 100-acre wildfires between 2000 and 2014. Courtesy of Headwaters Economics

Property owners and communities also have to consider the monetary risk that wildfires pose to themselves, and not just the immediate damage caused by fire. “Communities bear long-term costs from wildfires over years because of lost business revenue, depreciating property value, and the long-term mental health consequences of living through a disaster like that,” Pohl said. The mental health impacts should not be underestimated. “Have you been around a wildfire before?” Pohl asked. “It’s dark as night. The sun is an orange ball. There’s ash and debris and embers raining down. It feels like the end of the world. It’s a life-altering experience.”

Still, the U.S. Forest Service expects population growth in wildfire-prone areas to continue. It estimates that there are approximately 45 million homes in the so-called “Wildland-Urban Interface”—the technical term for those in particularly vulnerable areas—and that the number will rise another 40 percent by 2030. The WUI makes up 9 percent of the contiguous U.S., from Colorado’s Front Range to Southeast Texas to the Great Lakes states, and according to Headwaters, 84 percent of it remains undeveloped. So it’s unlikely, and perhaps unreasonable, that local governments would outright forbid development in all of these areas.

But there’s a lot that federal, state, and local governments can do to mitigate risk. “One of the strategies that we’re working with communities all over the country right now is developing land use regulations on how to reduce risk,” Pohl said, noting that a lot of the responsibility lies with state and local governments, which control zoning regulations. Communities at risk could require fire-resistant building materials like concrete, stone, glass, and brick; prohibit buildings on steep slopes, where fires move faster than on flat land; enforce a minimum distance between homes, since dense housing serves as better fuel for flames; and require homeowners to plant only vegetation that doesn’t easily dry out and catch fire.

Federally, Trump could increase the Forest Service’s budget for wildfire prevention measures, like clearing dry brush and doing controlled burns. Trump could also create national codes and standards for building in the wildland-urban interface, as climate reporter Andy Revkin recommended President Barack Obama do in 2013. Those standards could be looser or tighter depending on the level of hazard.

These questions, understandably, are not at the forefront of people’s minds in Northern California right now. Many residents are grieving, and many more are just beginning a difficult, years-long effort to rebuild what they lost. “This is my home. I’m going to come back without question,” Howard Lasker, a Santa Rosa resident who lost his home, told the Associated Press. “I have to rebuild. I want to rebuild.” That’s exactly why local and federal officials must consider these questions right now—to enact smart policy before the next massive wildfire strikes.

How GOP Lawmakers Ignore the Will of the People
October 16th, 2017, 07:40 PM

During Barack Obama’s two terms in office, Democrats lost ground at the state level—a lot of ground. Republicans now dominate state legislatures to a greater degree than at any time since the Civil War, making it nearly impossible for Democrats to enact any meaningful policies in large swaths of the country. But in the midst of last year’s electoral wipeout, there was one bright spot: Citizens took lawmaking into their own hands, introducing 71 ballot initiatives in 16 states—the most in a decade.

In Maine, citizens used ballot initiatives to increase the minimum wage and tax the rich to fund public schools. In South Dakota, they pushed through campaign finance reforms and restrictions on lobbying, as well as a system to create public funding for political campaigns. In Oklahoma, they reined in the War on Drugs, reclassifying nearly all felony drug possession charges as misdemeanors. And in Nevada, they mandated background checks for gun sales. In each case, citizens used ballot initiatives as a tool for direct democracy, using majority rule to push through policies that lawmakers are unable or unwilling to enact themselves.

But such victories have proved short-lived. Republican legislatures responded to the surge in civic participation by using their power to effectively overrule the will of the people—and to make it harder to enact citizen-backed reforms in the future. In South Dakota, state lawmakers simply repealed the voter-approved limits on campaign contributions and lobbying. In Maine, the state legislature threw out the voter-approved tax on the rich, and amended the minimum wage increase to exclude workers who receive tips. The state’s GOP governor, Paul LePage, boasted that there is nothing to prevent lawmakers from tossing out any ballot initiative they dislike. “If you read the constitution,” he crowed, “the legislature can just ignore it.”

It’s true that in twelve states, including Maine, there are no restrictions against such “legislative tampering” with citizen initiatives. And there are sometimes good reasons for lawmakers to make adjustments to ballot measures, especially if voters use them to infringe upon the constitutional rights of minorities. Ballot initiatives aren’t exclusively progressive tools, after all. In the early 2000s, conservatives used them to ban same-sex marriage in a number of states.

But the principle of direct democracy is a hallmark of the U.S. political system, stemming from the earliest days of the country. Ballot initiatives picked up steam during the Progressive Era, as a means for voters to push through reforms in the face of inept and intransigent politicians. South Dakota became the first state to adopt a statewide initiative process in 1898, and many others soon followed.

Veteran political observers say that the current conservative backlash against ballot initiatives is particularly extreme. “This stands out in recent history as one of the most brash years we’ve seen in the response by legislatures,” says Josh Altic, a project director at Ballotpedia, an organization that tracks ballot initiatives. And while state lawmakers frequently amend ballot initiatives, it violates the spirit of political participation to repeal them outright. “You should expect state legislatures to push back on these things, at least somewhat,” says Craig Burnett, a political scientist at Hofstra University. “But ignoring them altogether is different. It defeats the purpose of having direct democracy.”

In a preemptive strike, Republican legislatures are also making it harder for citizens to place initiatives on the ballot in the first place. Following the election, according to a report by Ballotpedia, lawmakers in 33 states introduced 186 bills to adjust the ballot-initiative process—often making it more restrictive. Both South Dakota and North Dakota established task forces designed to “reform” the initiative process, and Arizona banned advocacy groups from paying people for each signature they collect to place an initiative on the ballot—creating yet another hurdle in an already difficult process.

There are ways to protect ballot initiatives from legislative tampering. In California, voters must approve any move that legislators make to repeal or substantively alter initiatives. Other states permit repeal or amendment only by a supermajority vote of the legislature. And some states, like Nevada, prevent lawmakers from tampering with voter-approved initiatives until the measure has been enshrined in state law for a set number of years. Voters in South Dakota and Missouri are working to place similar protections on the ballot next year, in the form of constitutional amendments—which can’t be overturned by the legislature. “If those succeed,” says Altic of Ballotpedia, “you’ll see other states doing it, for sure.”

In an age of partisan gerrymandering and voter suppression, robust forms of direct democracy are more important than ever. Ballot initiatives give voters an essential means for passing laws that reflect the will of the majority, and Democrats should do whatever they can to protect the initiative process. But by definition, ballot initiatives are a tool of the powerless. If you have to try to pass a law yourself, it means your political party has failed. In the end, ballot initiatives are no substitute for Democrats finding a way to win control of state legislatures.

The Power of Negative Thinking
October 16th, 2017, 07:40 PM

President Donald Trump is a terrific leader, if he does say so himself. There’s never been a commander-in-chief so prone to extravagant self-praise, which is all the more striking given the paucity of his achievements to date. “We’ve done a great job,” he told reporters on Friday. “We’ve done a great job in Puerto Rico.” Later that day, he tweeted about what he called “a wonderful statement” from “the great” Lou Dobbs, a host on the Fox Business Network: “We take up what may be the most accomplished presidency in modern American history.” In interviews, Trump is eager to tout accomplishments that, quite frankly, don’t even make any sense, as when he claimed on Wednesday to Fox News’ Sean Hannity that the rise in the stock market can be seen as offsetting the national debt.

Trump’s relentless self-promotion is one of his most consistent character traits, which can be traced back to his earliest days as a real estate mogul. In always tooting his own horn, Trump is a familiar American type: the eternal salesman, a hustler who won’t take no for an answer and will say anything to close a deal. Being relentlessly on the make, for someone like Trump, isn’t just a job; it’s a vocation. And that vocation is fueled by a theology of positive thinking.

As a number of observers have persuasively argued, Trump is guided by a particular gospel. Though he’s more secular than any of his predecessors, he has genuine roots in one particular strand of Protestantism. He grew up attending the Marble Collegiate Church in Manhattan, which was presided over by the Reverend Norman Vincent Peale, the author of one of the all-time bestselling self-help books in American history: The Power of Positive Thinking. In 1977, Peale would marry Trump to his first wife, Ivana.

While Trump has only the most rudimentary knowledge of the Bible, he often echoes Peale’s core lesson: that happy thoughts and cheerful chatter are the key to success. “Formulate and stamp indelibly on your mind a mental picture of yourself as succeeding,” Peale wrote in his best-seller. “Hold this picture tenaciously. Never permit it to fade. Your mind will seek to develop this picture. Never think of yourself as failing.” In a 1983 interview with The New York Times, Trump echoed Peale’s dogmas. “The mind can overcome any obstacle,” the young Trump said. “I never think of the negative.” In his campaign book Crippled America, Trump wrote, “Reverend Peale was the type of minister that I liked, and I liked him personally as well. I especially loved his sermons. He would instill a very positive feeling about God that also made me feel positive about myself.”

Trump might feel positive about himself, but not about the world around him. As a candidate, and even as a president, he has often used dark, frightening rhetoric to portray America as a land where ordinary people are betrayed by a globalist elite and exploited by cunning foreigners and vicious immigrants—the most memorable example being his “American carnage” inaugural address. He also concocts derisive nicknames for his political enemies, most recently going after “Liddle” Bob Corker. But this seeming contradiction between the mantra of “positive thinking” and Trump’s nasty, apocalyptic rhetoric is best understood as two sides of the same sales pitch: The world is a mess, and “I alone can fix it.” Trump’s portrait of an America in deep decline was a necessary predicate to winning votes and now, less successfully, to maintaining support, the logic being that the U.S. was in such dire straits that it’s worth the risk to trust Trump.

The theological roots of “positive thinking” show how the seeming polar extremes of pessimism and optimism work hand in hand. Peale’s “positive thinking” is part of one of the great revolutions in American history: the overthrow of the Calvinist conscience. Historically, Calvinism, the version of Protestantism popular among early British settlers, promoted an almost morbid self-reflection on personal sin. This often debilitating focus on remorse was challenged in the nineteenth century by religious reformers, philosophers, and self-help gurus who were collectively labelled New Thought. Against the Calvinist injunction to examine internal vice, New Thought argued that focusing on wholesome, productive ideas was the path to virtue.

As Barbara Ehrenreich noted in her 2009 book Bright-Sided: How the Relentless Promotion of Positive Thinking Has Undermined America, there was considerable continuity between the older Calvinism and the New Thought, despite their superficial differences: “The Calvinist monitored his or her thoughts and feelings for signs of laxness, sin, and self-indulgence, while the positive thinker is ever on the lookout for ‘negative thoughts’ charged with anxiety or doubt.” To put it another way, Positive Thinking doesn’t so much displace Calvinist theology as shift the locus of evil, now seen as an external enemy to be fought rather than an internal sin to be overcome.

In political terms, it is precisely because Trump can’t abide any negative thoughts about his abilities that any problem is externalized. As a megalomaniac, Trump can’t acknowledge fault or the need to improve himself, so he lays blame for his failures on foes instead: the Fake News media, disloyal Republicans, or whiny Democrats. In short, Trump’s self-regard is built on his contempt for “losers and haters”—and vice versa. Yet Trump’s polarized view of the world and stark combination of pessimism and optimism offer a lesson for his opponents: If you want to build a politics that persuades people that major change is necessary, you have to paint with the same bright colors of despair and hope that Trump uses.


In the 2016 campaign, Hillary Clinton pointedly refused to match Trump’s dark vision of our times. She echoed the uninspiring words of Barack Obama: “America is already great.” Clinton was, to be sure, hobbled by the fact that she was running to continue Obama’s legacy. Still, by accepting the blinders of “America is already great,” she foreclosed the possibility of running as a transformative candidate. Her campaign message could be summed up thus: “Preserve the status quo.”

Fortunately, Democrats running in 2018 and 2020 won’t be similarly bound by a popular incumbent president. With Republicans controlling all three branches of government, presidential hopefuls like Senator Elizabeth Warren, not to mention many down-ballot candidates, should take a page from Trump’s playbook.

From a liberal point of view, it’s easy enough to tell a dystopian story about America today. It’s not just that the president is a reckless, incompetent racist who has no grasp on policy. It’s also that the political system is in the grip of plutocrats who limit reform, the archaic electoral and representative system effectively allows for minority rule, and the Republican Party’s embrace of white nationalism and voter suppression threatens to erode democracy. This doesn’t even begin to touch on the intractable problems of climate change, extreme economic inequality, and systemic racism.

Positive thinking is popular not just among hucksters like Trump, but also on the political left, attached as it is to a view of history as a story of progress, however halting. Obama constantly invoked Martin Luther King Jr.’s statement that “the arc of the moral universe is long, but it bends toward justice.” The notion that it might not so bend remains controversial on the left, hence the unending debate about the pessimism of Atlantic writer Ta-Nehisi Coates, whose work increasingly emphasizes the profound influence of white supremacy in America. In an extended back-and-forth with Coates in 2014, Jonathan Chait of New York magazine argued, “It is hard to explain how the United States has progressed from chattel slavery to emancipation to the end of lynching to the end of legal segregation to electing an African-American president if America has ‘rarely’ been the ally of African-Americans and ‘often’ its nemesis. It is one thing to notice the persistence of racism, quite another to interpret the history of black America as mainly one of continuity rather than mainly one of progress.” In response, Coates made a compelling case that an optimistic narrative of American history sweeps too much ugliness under the rug by ignoring how the very basis of progress was often racism itself:

The notion that black America’s long bloody journey was accomplished through frequent alliance with the United States is an assailant’s-eye view of history. It takes no note of the fact that in 1860, most of this country’s exports were derived from the forced labor of the people it was “allied” with. It takes no note of this country electing senators who, on the Senate floor, openly advocated domestic terrorism. It takes no note of what it means for a country to tolerate the majority of the people living in a state like Mississippi being denied the right to vote. It takes no note of what it means to exclude black people from the housing programs, from the GI Bills, that built the American middle class. Effectively it takes no serious note of African-American history, and thus no serious note of American history.

You see this in Chait’s belief that he lives in a country “whose soaring ideals sat uncomfortably aside an often cruel reality.” No. Those soaring ideals don’t sit uncomfortably aside the reality but comfortably on top of it. The “cruel reality” made the “soaring ideals” possible.

Three years later, Donald Trump is president, confirming Coates’s grim assessment of American history up to the present day. Trump didn’t just run a virulently racist campaign; his entire political identity, as Coates wrote in a recent essay, is founded on negating the achievements of the first black president. “The first white president in American history,” he concluded, “is also the most dangerous president—and he is made more dangerous still by the fact that those charged with analyzing him cannot name his essential nature, because they too are implicated in it.”

Coates’s fatalistic view that racism is intrinsic to American culture, including on the left, is tough for many progressives to accept. Stephen Colbert recently asked him, “Do you have any hope tonight for the people out there, about how we could be a better country, we could have better race relations, we could have better politics?” Coates responded in the negative.

One senses a fear, among Colbert and so many others, that pessimism is paralyzing; that it discourages the fight for change. But optimism has its own pitfalls. It can cause political miscalculations, as when Obama assured supporters in 2012 that if he were reelected, the Republican “fever” would break and the GOP would become more cooperative. In fact, Republicans became even more fevered, not only obstructing Obama but becoming more extreme in their racial politics, paving the way for Trump.

The Italian Marxist Antonio Gramsci, during his long imprisonment under Benito Mussolini’s regime, famously wrote, “I’m a pessimist because of intelligence, but an optimist because of will.” In an American context, this combination can be found most potently in Abraham Lincoln, whose very awareness of the enormity of the problem of slavery pushed him toward the radical solution of abolition. There are few more negative national appraisals than Lincoln’s Second Inaugural Address, where he said, “Fondly do we hope, fervently do we pray, that this mighty scourge of war may speedily pass away. Yet, if God wills that it continue until all the wealth piled by the bondsman’s two hundred and fifty years of unrequited toil shall be sunk, and until every drop of blood drawn with the lash shall be paid by another drawn with the sword, as was said three thousand years ago, so still it must be said ‘the judgments of the Lord are true and righteous altogether.’”

Modern politicians traditionally shy away from such a dismal portrait of their own country, for fear of further polarizing the nation and thereby making governance more difficult. Yet as both Trump and Bernie Sanders proved in 2016, pessimism is an effective mobilizing tool because it raises the stakes of an election, bolstering the case for risk-taking change. If such a case proved convincing for Trump in the waning days of a popular presidency and a steadily improving economy, then surely it would be even more convincing under a historically unpopular president who is undoing efforts to fight climate change, proposing tax cuts for the rich, sabotaging health care for the poor, demonizing non-white people, monetizing his presidency, and posing an existential threat to American democracy itself.

Trump’s curious mixture of pessimism and optimism might be rooted in the flimsy self-help gospel of Positive Thinking, but it would be a mistake to confuse the message with the messenger. There is carnage in America indeed, even if it’s largely not the carnage that Trump claimed. The problem is that the solution he offered—his supposed skills as a deal maker—was quack medicine. But an accomplished politician could, as Trump did, appeal to suffering Americans while also selling a remedy that would, unlike Trump’s, actually address their troubles. In other words, the risk for Democrats lies not in preaching such a self-serving gospel. The real risk would be to dismiss Trump’s effective rhetoric simply because he failed to deliver on it.

Could There Be a Canadian Trump?
October 16th, 2017, 07:40 PM

Canadians have a complex relationship with the United States. On the one hand, Canadians share the world’s longest undefended border with the U.S., voraciously consume its pop culture, and speak with (almost) the same accent. On the other, despite living right on top of the most powerful economic, military, and cultural force in the world, Canadians are growing further apart from Americans, not closer. How has a country of 35 million resisted being culturally, economically, and politically subsumed by a nation with 10 times the population and infinitely greater cultural output? And what happens when that powerful neighbor elects a leader who disregards the norms and values of the liberal world order?

That’s what Michael Adams seeks to answer in his new book, Could It Happen Here?: Canada in the Age of Trump and Brexit. Adams has been monitoring Canadian political, economic, and social attitudes since the 1990s, back when NAFTA was newly in force and Canadians feared that their country would become indistinguishable from its southern neighbor. At the time, it seemed inevitable. In 2002, a poll of Canadians found that 58 percent of respondents said that they were becoming more like Americans. But when Adams asked the same question in the spring of 2017, only 27 percent of Canadians felt that way.

COULD IT HAPPEN HERE?: CANADA IN THE AGE OF TRUMP AND BREXIT by Michael Adams. Simon & Schuster, 192 pp., $24

Much of what he sees taking place between Canada and the U.S. occurs in the context of history: Where Americans are now, Canadians have been before. Canada has already experienced a decade of media bashing, reactionary policy, attacks on science, and self-serving obfuscation by government officials. Prior to the handsome, selfie-taking Justin Trudeau, Canadians had the Conservative Party and the aggressive Prime Minister Stephen Harper running the country, from 2006 to 2015.

Elected by a mix of centrist voters upset with Liberal Party scandals and disenchanted, conservative-leaning suburbanites, Harper was a radical change for Canadians, and as his tenure dragged on, they didn’t like what they saw. What had once been a comparatively staid parliamentary process became an ideological battlefield driven by cultural resentments. In justifying new tough-on-crime legislation, justice minister Rob Nicholson said the government’s policies would be decided not by the best available data, but by “common sense,” and a rejection of the “educated urban elites who had the ear of government for decades.” One can almost hear the muffled screams from behind the mask of politeness that Canadians always wear.

This anti-elite, anti-information statement set the tone. The conservatives went on to eliminate the long-form census, a data collection tool that was essential to administering Canada’s historically large and interventionist government. The head of Statistics Canada, the government agency that oversees the census, resigned in protest. The census was the bedrock of evidence-based decision making in the country, informing where to set electoral boundaries, how to allocate healthcare funding, and countless other matters of governance. Later, the conservatives pulled out of the Kyoto Protocol. The idea was to force the Canadian government to rapidly shrink.

Much like the Trump administration, the Harper government erred toward ideologically driven unilateralism. The era of conservative rule was meant to make Canada more American: more nationalistic in its tone, more austere in its social programs, and more conservative in its economic policy. But the public didn’t fall for the conservatives’ anti-elitist sentiment: The more experts decried Harper’s policies, the more unpopular those policies became. As Adams notes, in Canada, “public opinion in recent decades seems to have steadily followed elite consensus.” Climate change and environmental activism became more prominent in Canada the more the Harper government opposed them.

Adams’s strength is his ability to masterfully explain the intricacies of the American political psyche, and to deftly contrast them with Canadians’ own tendencies. Americans are “more inclined to be religious, risk-taking, aspirational, and money-focused.” Canadians, on the other hand, tend in the opposite direction, being “more secular, risk averse, and self-effacing.” In politics, 58 percent of Canadians said they liked elected officials who made compromises, as opposed to just 40 percent of Americans. Adams makes a convincing case that these differences in attitudes are real, and that they lead to very different political realities.

Part of this difference is that Canadians trust government institutions in ways that Americans do not. For example, a recent statement by Donald Trump that he may not renew temporary protected status for some immigrants has sent a wave of predominantly Haitian refugees over the Canadian border into Quebec and Manitoba. While there has been grumbling over this influx—and some of it has been racist—most Canadians continue to believe that the government has handled immigration in a responsible way. The country continues to welcome over 300,000 immigrants a year, around 1 percent of its population, and none of the major political parties advocate for a cut in admissions. This kind of openness cannot happen in a country where the people do not trust state institutions to protect their best interests.

Canadian governments have generally behaved in the same way when it comes to refugee crises, irrespective of the party in power. In the late 1970s, it was Joe Clark’s Progressive Conservative government that admitted 60,000 refugees from Southeast Asia, the so-called Boat People, to Canada. In 2015, when the Harper government tried to avoid admitting any Syrian refugees, the public backlash was substantial and many Canadians began to doubt the moral integrity of the conservatives’ position. Adams writes that “Clark, and others from his 1979 cabinet, reminded Canadians of our previous generosity to the South-east Asians.”

It’s not that Canada is a postracial paradise with no history of racist oppression. Today, conditions in many indigenous communities have been described as “third-world”: Drinking water advisories have been issued in most of those communities in the past decade, and many now grapple with a youth suicide crisis. Black Canadians have suffered too, barred from movie theaters in the early twentieth century and enduring the demolition of black communities like Africville, Nova Scotia. But unlike the U.S., the government has acknowledged and apologized for these and other injustices of the past; recent evidence shows that, at least in Canadian electoral politics, it does not pay to espouse racism publicly.

In fact, the Canadian right sees immigration as an electoral advantage. Many Canadian immigrants, particularly from China and South Asia, lean socially conservative and make for natural Tory voters. The Conservative Party’s plan to win office in 2006 “ran directly through the heart of those urban and suburban neighborhoods that exemplify the new Canada, a country more positively invested in immigration than almost any other developed economy in the world,” Adams says. But by 2015, the party had taken an anti-immigrant tone. Sensing they had little to show for their decade in power, the conservatives decided upon fearmongering and anti-Muslim wedge politics. It didn’t work; in fact, Adams argues, this choice was what sealed their fate. They lost a staggering 60 seats in the 2015 election, dropping from 159 MPs to 99. In Canada, race resentment just doesn’t sell the way it does in the U.S.

After the Liberal victory in 2015, the conservatives found themselves at a crossroads. Sensing they had gone too far right, most politicians avoided sounding like Harper on social issues. The exception was Kellie Leitch, a candidate for the Conservative Party leadership who saw a path to victory in invoking an anti-elitist message and calling for the creation of a “Canadian values” test for new citizens. That didn’t work, either; just 8 percent of eligible party voters supported her. Even among the conservatives, at least for the time being, race-baiting is no longer popular.


Could a Trump-style figure arise in Canada? The recent trends suggest no. But that doesn’t mean that Canada couldn’t field a politician who uses the same divisive, bombastic language to mobilize disgruntled voters. Some would say that with Harper, Canada has already proved its ability to be seduced by the right. And like the U.S., Canada faces deindustrialization, increasing inequality, and a growing presence of far-right groups. All the ingredients are there.

But it’s unlikely that Canadian voters would be duped by a charismatic leader the way that Trump supporters have been. That’s because in Canada, no one who pays attention to politics can believe that if they simply elect the right person, all the changes that need to happen can be made. In Canada, a slow-by-design federal government means that all policy changes must be negotiated between the provinces and even among the cities. The process creates a lot of bureaucratic red tape, and it means that even desperately needed policy changes can happen at a glacial pace. But it also immunizes Canadians from thinking that any leader, alone, can fix their problems. 

Yanis Varoufakis’s Doomed Fight Against Austerity
October 16th, 2017, 07:40 PM

If you are not the sort of person who is already likely to read a 500-page book on the former Greek finance minister’s efforts to save his country from the machinations of the International Monetary Fund, then you aren’t going to become one just because I leave you in suspense. So let me spoil Yanis Varoufakis’s Adults in the Room for you now. He failed.

ADULTS IN THE ROOM: MY BATTLE WITH EUROPE’S DEEP ESTABLISHMENT by Yanis Varoufakis. Farrar, Straus and Giroux, 560 pp., $28.00

In early 2015, it looked as if Varoufakis’s party, the left-wing Syriza, had a chance to roll back some of the harsh austerity measures the EU and IMF had imposed on Greece in the wake of the global financial crisis. Syriza came to power in January 2015, promising a better deal, and in July that year Greeks voted to reject the latest bailout proposal from the European Commission, European Central Bank, and IMF. Despite all this, Greece ultimately agreed to deep austerity measures. The country would service its creditors at the expense of its citizens, who suffered swingeing cuts to wages, pensions, and public sector jobs at the same time as hikes in VAT and property taxes. Hell hath no fury like an international credit consortium scorned. This book is Varoufakis’s best effort to explain why his best effort was not enough.

Other writers have already explored the relative success or failure of Varoufakis’s accounting. On the superlative end, Paul Mason writes in the Guardian that Adults in the Room is “one of the greatest political memoirs of all time,” “one of the most accurate and detailed descriptions of modern power ever written,” that it will “gain the same stature as Robert Caro’s biography of Lyndon B. Johnson” as “a manual for exploring the perils of statecraft.”

While more grandiose in his praise than most reviewers, Mason follows the broad consensus. Varoufakis’s book lays bare the way Europe’s leading powers disguised bailouts of their own financial institutions as “assistance” to Greece, and confirms what many on the left have long suspected: that we live in an era in which the interests of supranational institutions—particularly the “troika” of the European Commission, ECB, and IMF—outweigh those of sovereign nations. Fail to take on debts to these organizations, and your national economy will be fatally excluded from currency and trade arrangements. Take them on, and your economy will become a vehicle for servicing your debt. If hospitals and schools and social services must be plundered in order to accomplish this goal, so be it.

The “adults in the room” of Varoufakis’s title are the serious and sober decision makers who guard the global financial order. They act, as Varoufakis says, not out of greed or cruelty, but because their best intentions have been captured by the untenable and mystifying financial arrangements that keep capital alive in this century. They are “adults” by virtue of their willingness to resolve the terrible disparity between the rich countries’ self-proclaimed moral authority and the damage they wreak on smaller economies. The real grown-ups don’t let the perfect become the enemy of the convenient. The adults are a network of mutually dependent “insiders,” Varoufakis says, and for all their power they don’t have much leeway within their hierarchies to defy the status quo. If they did, they’d simply lose their positions and end up replaced by someone more willing. This was, ultimately, what happened to Varoufakis.


“Austerity,” Varoufakis writes, “is a morality play pressed into the service of legitimizing cynical wealth transfers from the have-nots to the haves during times of crisis.” That is the essential case contained in these 500 pages, and the one that nearly every critic, myself included, finds persuasive.

But making that case is not all that Adults in the Room is after. Some critics—most notably Helena Sheehan in the leftwing magazine Jacobin—have attacked Varoufakis for failing to apply systemic analysis to what he correctly identifies as a systemic problem, and for discounting the role of the broader Greek left in the process. Sheehan agrees, along with Mason, that Adults in the Room “lucidly explains” how “Greece was never ‘bailed out’ but forced, through many layers of subterfuge, to bear the burden of bailing out German and French banks.” She also concurs that by recounting the many tangled private conversations that led to this arrangement, Varoufakis performs a “service in bearing witness” to an “essential dynamic.” But, Sheehan argues, the book is ultimately selective with the facts “in a way that is blatantly self-justifying.” Among the book’s chief sins, she writes, is the way in which only the technocratic “adults” are held to account for their actions:

Everyone else’s role is blurred, distorted, or even invisible. Syriza barely exists. The Greek Left are nearly absent. The Greek people fade into the background. It is a landscape of elite players and anonymous masses.

Adults in the Room is a “Lone Ranger narrative,” “blind” to its own elitist tendencies, focusing on a battle of personalities rather than setting the crisis in historical context. “His hubris blinds him to the bigger picture,” Sheehan concludes. “Varoufakis projected himself as Prometheus, but revealed himself to be more of a Narcissus.”

There is no doubt that Varoufakis is grandiose. Adults in the Room is the work of a man who spent his time in office being photographed on his motorcycle, leather jacket never failing to expose his rippling muscles. He devotes plenty of room in the book to settling scores and talking up his own “solitary struggle,” and his account is filled with pretension masquerading as style. His very first sentence describes “amber liquid flickering in the glass” in a hotel bar. He has come to the bar, he explains, because he is one of the “few” for whom “burdens trump sleep.” He tells us that “watching Greece’s creditors at work was like watching a version of Macbeth unfold in the land of Oedipus,” as if the real trouble with the European Union is its inability to stick to one metaphor. Varoufakis is smiling, always smiling, as he “dismisses” threats, gives difficult, honest answers, and outwits dour German bureaucrats.

But I am not interested in suggesting that Adults in the Room would have been a better book if it had not been so exhaustingly teenaged in its cool. Rather, I want to ask whether it is really possible to “expose” the intrigues of financial power in a book like this. While Varoufakis’s routine may have been among the more grating of bad choices, it is possible that there were never any good choices on offer here.


The problem, at bottom, is that having lost the material battle for the future of Greece, Varoufakis can only tell a story. For all his “revelations” about the workings of the IMF here, his book reveals very little that was not already known. Showing how a system fails people is not the same as acquiring the power to change that system and bring about a better world. The whole magic of the “adults” this book goes after is that “insiders” do not need to operate in secret. They only need to operate with irresistible force. That is the mundane tragedy of the world. Because a book cannot alter these forces, it can only do what all books with unhappy endings do: attempt to squeeze meaning from catastrophe.

It is worth examining the two plays Varoufakis invokes in his introduction and to which he returns throughout the text. The first is Shakespeare’s Macbeth, which Varoufakis calls on in order to argue that the players in his story, even the really wretched ones, all operated with the best intentions—their tragedy was that ambition necessarily leads those who act on it to bad ends. This is useful enough to a certain extent, but despite the critique in Jacobin, Varoufakis is at least superficially aware of the fact that it is not individuals and their flaws that drive history but rather the impersonal forces of money and power. Thus the second play he brings in, Oedipus. This ushers our well-intentioned Shakespeareans into a rigged world. Even if they could see past their own character defects, it wouldn’t matter: Try to avert catastrophe, Sophocles tells us, and you invariably bring it about.

But it’s an odd comparison. While Adults in the Room surely ends with Greece’s ruin, the international conspirators Varoufakis wants to indict were never trying to avert that. For them, ruin would mean insolvency for their central banks. In order to prevent this disaster, they must destroy the welfare of the Greek people. They succeed. The world does not spring its trap, at least not on the central actors here. Maybe Varoufakis would have had better luck with different literary parallels, but I don’t think he would. The trouble is that stories require twists of fate and just deserts and motivated action. The world does not. In his story, a million small incentives allow the accidental heirs of global power to get precisely what they want. They begin on top. They end on top. They may have flaws, but it doesn’t matter. There’s no plot here but the slow, inevitable grind of deck chairs shuffling on an aggravated, unfair planet.

Perhaps he might have been less melodramatic about it all, but it’s hard enough to make anyone want to read so much about a foregone conclusion. You’ve got to attract attention. You’ve got to offer something that even those readers not ordinarily inclined to buy a 500-page book by a former Greek finance minister can grab onto—but what does that look like? Perhaps, in the best case, something more subtle than this. But in every case it will be howling protest awkwardly shoehorned into the structure of high epic; that is to say, it will be something that power can roll its eyes at, while the major financial institutions carry on finding the next poor, weak nation to take for all it’s worth.

Adults in the Room is an essential book and it is vital that books like it be written about each one of the injustices inflicted on the vulnerable by the present masters of the world. But this effort, in particular, is clumsy and these efforts, in general, are a stopgap at best. In 500 pages, you don’t learn much from Varoufakis about what he might have done differently, or how the Greek people could have won. I suspect that’s because he doesn’t know. That’s why he’s left banging on the ramparts with an uneven book. But more than a better book, I’d like a better leader, one who wins.

How to Break Up a Country Without Creating Havoc
October 16th, 2017, 07:40 PM

Earlier this month, voters in the autonomous Spanish region of Catalonia overwhelmingly approved a controversial referendum to secede from the country, though the consequence of the vote remains unclear. In an address to the Catalan Parliament, the region’s president and separatist leader, Carles Puigdemont, simultaneously declared independence and suspended that declaration to pursue negotiations with Spain. Rightly confused, Prime Minister Mariano Rajoy gave Puigdemont eight days to clarify his position, threatening to suspend the Catalan Parliament and impose central government authority over the region. That deadline arrives on Monday.

Meanwhile, in Iraq last month, Kurds voted overwhelmingly to pursue an independent state of Kurdistan. Massoud Barzani, the president of the region, warned the Iraqi government against taking measures to block Kurdish independence. The authorities in Baghdad rebuffed these demands and took diplomatic steps to isolate the Kurds. Iraq on Monday seized oil fields near Kirkuk as part of a broader offensive to reclaim the Kurdish-controlled city, The New York Times reported.

These campaigns have raised the prospect of partition in their respective nations. Although the Kurds may have more compelling claims to independence than the Catalans—Kurds already exercise effective political and military control over territory in the region—they both present credible challenges to central governments and promise to transform political landscapes in these countries and across Western Europe and the Middle East. Last-ditch negotiations may still resolve their differences, but if they do not, there will likely be a nasty political—and perhaps even armed—fight for independence. At this crucial moment, it is important for parties on all sides to consider the risks associated with partition, which have been largely dismissed or ignored.

During the last 100 years, great powers and nationalist movements have partitioned states around the world, first in Ireland and then in Korea, China, Vietnam, India, Pakistan, Palestine, Cyprus, Germany, the Soviet Union, Czechoslovakia, Yugoslavia, Ethiopia, and most recently, Sudan. Diplomats advanced partition as a way to solve emerging conflicts and award state power to deserving movements, which were often willing to use force to achieve their independence. But partition, which the diplomat Conor Cruise O’Brien once described as “the expedient of tired statesmen,” rarely settled disputes between different ethnic groups or political adversaries. In fact, it usually created more problems than it solved.


In modern history, the partition of nations has almost always created three problems that kept conflicts boiling. First, it triggered massive migrations across newly drawn borders. The outbreak or threat of violence forced people to leave, particularly if they were anxious about their status as minorities in newly created states. Others were drawn by the opportunities they hoped to find among their own people, in states where they might exercise their rights as a majority. Hundreds of thousands fled across borders in Korea, Vietnam, and Palestine; millions in Germany and the Soviet Union. In India, 17 million refugees crossed frontiers during the first six months after partition. The violence that drove them claimed one million lives.

Importantly, many people also decided to stay. They clung to their homes and hoped for the best. Millions of Muslims made this choice and remained in India. Today, India counts a larger Muslim population than Pakistan. But officials in many divided states adopted policies that discriminated against resident minorities and privileged members of ethnic-political majorities. They gerrymandered electoral districts, suppressed minority voters, or deprived them of citizenship. They denied minorities access to higher education, public service, and the armed forces. They adopted official languages that disadvantaged non-native speakers. They passed laws of return (Germany, Israel, Eritrea) that privileged ethnic nationals from diaspora communities over resident minorities who lived in their homelands.

In response to discriminatory practices, resident minorities fought for redress and solicited support from their brethren in neighboring states, as they did in Kashmir and in Ireland. This led to a second problem: conflict within and between divided states. “The Troubles” in Northern Ireland and intifadas in Palestine are symptoms of this problem. In recent years, movements in some divided states have decided to prevent the onset of these problems by forcibly removing all resident minorities from their midst, which in Yugoslavia and Ethiopia/Eritrea led to ethnic cleansing.

Third, officials in divided states quarreled over the division of assets and the borders drawn between them. They claimed territories and people not assigned to them. They challenged the sovereignty, the very existence, of their neighbors. And they waged wars to assert or defend their sovereign claims. This led to the outbreak of major wars in Korea, Vietnam, and Yugoslavia; recurring wars in Ethiopia/Eritrea, India/Pakistan, and Israel/Arab states; and uncivil wars in Ireland, Israel, Pakistan, India, Ukraine, and numerous post-Soviet states. These conflicts in turn triggered superpower interventions that widened wars and, in some cases, resulted in nuclear threats that led directly to the proliferation of nuclear weapons in North Korea, China, India, Pakistan, and Israel. The current crisis of proliferation in North Korea is a contemporary byproduct of partition.

But partition has not always led to migration, discrimination, conflict and war. If the parties in Spain and Iraq are determined to take steps that might result, willingly or reluctantly, in partition, they might consider how one country managed to get partition right.


In 1992, Václav Klaus and Vladimír Mečiar, the respective leaders of the Czech and Slovak political parties, partitioned Czechoslovakia, but avoided the problems commonly associated with divided states. The “Velvet Divorce” had several important, unique features. First, Klaus and Mečiar agreed that partition was in their mutual interest, though for different reasons. The Czechs wanted to create a free-market economy; the Slovaks wanted to retain a social-welfare state. They were able to negotiate an amicable separation because there was no real constitutional impediment to partition (Czechoslovakia was a federal state made up of constituent Czech and Slovak “republics”); Czechoslovakian President Václav Havel lacked the political power to deny it; they did not test the agreement with a popular referendum; and legal and military bureaucracies did not obstruct it.

Second, Klaus and Mečiar did not fortify or close the border, restrict trade, or demand that minority ethnic populations decamp for their newly assigned “homelands.” Instead, they kept the frontier open, encouraged business as usual, allowed citizens to move freely across the border, and gave them two years to choose which country they wanted to claim as their own.

Third, they both applied for membership in the European Union and NATO, which both countries eventually joined. These shared institutions promoted economic development in both countries and provided mutual security as allies in a shared military alliance. EU membership also allowed people to migrate freely as workers, tourists, and households across the Czech-Slovak border and around Europe, in effect creating dual citizenship for all. As a result, ethnic-national identities did not harden or diverge after the Velvet Divorce, and residual conflicts were resolved without acrimony.

Many of these conditions may not apply in Spain and Iraq. For instance, Spain does not have a constitution that easily accommodates separation, and its leaders see little benefit from partition, making it difficult for them to agree to an uncontested divorce. In Iraq, disagreements over oil wealth and territory (such as Kirkuk) make a peaceful divorce unlikely. For these reasons, the leaders of the Kurdish and Catalonian separatist movements might do well to take a step back from the precipice of partition and seek a political solution that addresses their grievances while remaining part of the countries from which they seek to secede. If nothing else, the recent declarations of independence in these two regions might provide the necessary leverage to do so.

Trump Just Started a Nuclear Crisis With Iran
October 13th, 2017, 07:40 PM

President Donald Trump on Friday did what he does best: He threw a temper tantrum on national TV. To address what he called “the increasing menace posed by Iran,” he announced his refusal to send Congress a routine certification required by law that Iran is keeping its commitments under the 2015 nuclear deal. He also announced “additional sanctions on the regime to block their financing of terror,” and promised, “We will deny the regime all paths to a nuclear weapon.” In doing so, Trump will start a new nuclear crisis, one that he has no idea how to solve—and which may end with Tehran following Pyongyang’s lead, testing thermonuclear weapons and long-range missiles that can strike the United States.

Trump claims he has good cause for not providing certification. “Importantly, Iran is not living up to the spirit of the deal,” he said on Friday. But Yukiya Amano, the director-general of the International Atomic Energy Agency, which is charged with monitoring and verifying the deal, was unequivocal last month: “The nuclear-related commitments undertaken by Iran under the [Joint Comprehensive Plan of Action] are being implemented.” Even one of Trump’s top generals, Joseph Dunford, the chairman of the Joint Chiefs of Staff, says so.

The real crisis here is Trump’s fragile ego. As a candidate, he called the Iran nuclear deal an “embarrassment” and the “worst deal ever.” So nothing has been more humiliating to him than being required, under the Iran Nuclear Agreement Review Act, to certify that Iran is, in fact, keeping its promises. This amounts to requiring the president to admit, in writing, that he’s a windbag.

Trump has provided this certification twice—although only after a knock-down, drag-out fight with his advisers both times. A senior administration official told CNN last week that Secretary of State Rex Tillerson “has said the problem with the JCPOA is not the JCPOA,” adding, “It’s the legislation. Every 90 days the president must certify and it creates a political crisis. If the administration could put the nuclear deal in a corner, everyone could happily get back to work on dealing with everything else that is a problem with Iran.”

How this cumbersome requirement even came to be is itself an enormous act of political cowardice.

In 2015, the GOP-controlled Congress wanted a role in the process that led to the Iran deal. But members of Congress by and large loathe taking the slightest bit of responsibility for their actions. Their solution: the Iran Nuclear Agreement Review Act, which allowed Congress to review any agreement reached during the then-ongoing talks with Iran about its nuclear program. The bill, which sailed through both chambers by wide, bipartisan margins, ensured that Congress could vote to approve or disapprove any deal reached with Iran. It created the classic free vote—since Obama could veto any resolution of disapproval, members were free to vote against the agreement without actually putting the deal at risk. (A presidential veto proved unnecessary in the end.) Thus, political giants like senators Bob Corker, Chuck Schumer, and Ben Cardin had their cake and ate it, too.

Congress also required the president to certify, every 90 days, that Iran is implementing its commitments. This was a seemingly cost-free way for Congress to appear to maintain a keen interest in the issue, while actually shifting the burden to the White House. Obama, playing the role of responsible adult, could be counted on to take the political blame and submit the certification. But no one would accuse Trump of being a grown-up. So he’s tossing the issue back to Congress.

The Iran nuclear agreement will not automatically collapse without certification. Trump’s decision merely triggers a process by which Congress may, if it chooses, re-impose some or all of the U.S. sanctions that preceded the 2015 deal; it could also simply repeal the troublesome certification requirement. But members of Congress aren’t pleased about being put back on the spot. Corker seems unusually incensed, and why wouldn’t he be? After all, he’s managed to dupe the fact checkers into thinking he opposed the Iran nuclear deal just because he voted against it.

It might seem cynical to paint administration officials as toadies enabling Trump’s bad behavior, and members of Congress as cowards unwilling to take political risks for an agreement they know is our best chance to head off a nuclear-armed Iran. And yet, enablement and cowardice are precisely what we have seen over and over again, as our leaders have watched Trump seize control of the Republican Party and now our republic. Time and again, we have seen public officials take the easy way out.

Why should that change now? Trump got to throw his temper tantrum, while White House staff and members of Congress imagine that they are clever enough to placate him without quite killing the deal. There are already reports that Corker, along with Senator Tom Cotton, is drafting legislation that will alter the terms of the agreement with Iran. Once again, they want to have their cake and eat it, too: looking tough on Iran, without blowing up the deal. Maybe it will work. Then again, this is how we got Trump in the first place. Politicians thought they could avoid the political cost of stopping him from winning the nomination and then the election, while counting on others to do the dirty work for them.

In Defense of Rotten Tomatoes
October 13th, 2017, 07:40 PM

On Tuesday, Martin Scorsese graced the online pages of The Hollywood Reporter with a guest column on the way that America judges films now. He’s not happy with the rise of box office sales as the arbiter of a film’s worth. This process has been on the rise since the 1980s, as he concedes, but he links the instant production of success/failure data to other types of rating, too. Market research outlet CinemaScore and online aggregator Rotten Tomatoes are rating movies “the way you’d rate a horse at the racetrack,” he writes. Meanwhile, real movie criticism is dying out. The result is an ungenerous viewership that is not seeing movies fairly.

According to Scorsese, Rotten Tomatoes peddles “a tone that is hostile to serious filmmakers.” The people contributing there are “engaged in pure judgmentalism” rather than considered argument. They are “people who seem to take pleasure in seeing films and filmmakers rejected, dismissed and in some cases ripped to shreds.” Scorsese broadens his point with reference to Mother!, the Darren Aronofsky movie that he loved but that was panned by Rotten Tomato-ers and professional movie critics alike, including me.

This style of fogeyism feels a little rich coming from a filmmaker who owes his own career to the collapse of the old Hollywood studio system. Scorsese is one of the key beneficiaries of the director-centered filmmaking that emerged after the five major movie studios of the “golden age” ceded control of production. Hollywood is a place in constant flux. But to take his column in good faith, the crux of his argument seems to be that Rotten Tomatoes is promoting the worst of democratized criticism, rather than the best. The chief problem, he thinks, is the pace of this reviewing.

“Good films by real filmmakers aren’t made to be decoded, consumed or instantly comprehended,” Scorsese writes. “They’re not even made to be instantly liked.” This is a fair point. When a new movie comes out, the instantaneous conversion of viewer reception into data via CinemaScore and online takes is bound to be warped in favor of whatever idea comes first, rather than whatever idea is best. Film critics are, I suppose, a bit more practiced at slowing down their reactions. Box office ratings are certainly not a good measure of how well-made a movie is, and there’s merit to the argument that insta-criticism functions more like box office data than like measured appreciation.

But the problem with Scorsese’s point is that Rotten Tomatoes, well, mostly gets it right. Admittedly I’m biased—I use Rotten Tomatoes and love it. For example, in his column Scorsese cites “The Wizard of Oz, It’s a Wonderful Life, Vertigo and Point Blank” as movies that were “rejected on first release and went on to become classics.” The idea is that Hollywood’s hostile viewership is not giving movies a fair chance, and sometimes the best movies need multiple chances to catch on. But these four movies are rated 99%, 94%, 97%, and 97% “fresh” at Rotten Tomatoes, respectively. Those are pitch-perfect ratings! I couldn’t have done better myself.

The question then becomes, are the scores on Rotten Tomatoes just reflective of the history of criticism, and thus of conventional opinion? Or are the Rotten Tomatoes contributors themselves the critics who prop up these reputations, and thus pave the way for smarter conversations about film online? Scorsese sees them as mere responders-to and parroters-of film reviewing. I think they’re doing their own real criticism. There’s an element to Scorsese’s column that feels simply anti-reviewer. Movies are “just made,” he says, “because the person behind the camera had to make them.” This might be true, but it’s a truth seen from the filmmaker’s side. Critics feel similar promptings, I think. We write reviews because we have to, because they’re things that we feel must be said. That goes for staff critics and online writers alike.

Rotten Tomatoes is a repository of some of the best nonprofessional film criticism on the internet. Take this observation about Mean Streets (1973, 85% fresh), by Scott M.: “Takes the gangster movies from the golden age and kicks it up a notch.” Spot on. Ian B. continues the exegesis, explaining the plot issues that hamper this movie: “De Niro keeps screwing up, Keitel keeps trying to bail him out. That’s about it.” It’s not the highflown prose of the New York Times, but these are accurate assessments.

There’s also a particularly Rotten Tomatoes-ish vein of criticism that mainstream reviewing lacks, which is distance. On the Mean Streets page, for example, “super user” Daniel P. describes how it feels to watch the movie at a distance of “this many years.” Scorsese’s subsequent career made the casting into something of a red herring: “if you’re a De Niro fan, don’t go into this thinking he’s the star! I only knew a little about the movie before watching, and it took a while (maybe too long) for me to really invest in Harvey Keitel’s character, Charlie, whose movie it actually is—I don’t blame the filmmaker or Keitel for that though, that’s my fault.”

But it isn’t Daniel P.’s fault. He is actually offering up an interesting statement: a perspective on the early films of Martin Scorsese by a viewer whose reception of those films has been overwhelmed by the later features of his career—features that include Robert De Niro’s face in every frame. And it speaks to a certain weakness in the Scorsese oeuvre, notably repetitiveness. I don’t know about you, but I’ve had it with soundtracks by The Rolling Stones. I’ve had it with shots of a smirking Leonardo DiCaprio. And I’ve specifically had it because the repetitions of Scorsese’s later career empty his early innovations of their transgressive power. It’s difficult to find space to talk about Mean Streets in a magazine review, because we have to think about new things in light of the old, not old things in light of the new. Rotten Tomatoes is a repository for analyses like these.

Scorsese’s column is all the more confusing because his movies are pretty easy to love. Not all of them have been well-reviewed, but movies like Taxi Driver and Raging Bull bathe in the kind of edgy machismo that gets posters pinned to the walls of college students. But again, that speaks to the innovation-turned-convention that the Rotten Tomatoes critic of Mean Streets was getting at. It makes sense that a famous auteur would loathe “crowdsourced” movie criticism on the internet, because it upends the settled relationship between director, studio, distributor, and critic. It’s unpredictable, anarchic, and informal. But that doesn’t make Martin Scorsese right, or the anonymous online critics wrong.

Bernie Sanders Isn’t Winning Local Elections for the Left
October 13th, 2017, 07:40 PM

One of the few bright spots of the Trump era thus far has been a new wave of electoral wins for candidates with decidedly left-of-center views. The victories have come in municipal and state-legislative races—most notably in places like Alabama, Mississippi, and Long Island, where the left isn’t “supposed” to have a chance to win anything. In some cases, like last week’s mayoral victory of Randall Woodfin in Birmingham, left-wing Democrats are unseating centrist, Chamber-of-Commerce-style Democrats. In others, longtime left-wing activists are successfully challenging Republicans in places where centrist Democrats have long failed.

These breakthroughs are bringing fresh ideas and new faces into the foundational layers of the political system, where conservatives have been ascendant for years. But the national media, which actively misunderstands both the South and the rest of “red” America, has decided to cover these stories only as triumphs of the “Bernie Sanders left,” as though all politics were not (in the famous phrase) “local” anymore; instead, national reporters and pundits increasingly, misleadingly, see all local politics as national.

“Bernie Wins Birmingham” is convenient shorthand for those who have no idea what actually goes on in Birmingham. But Bernie Sanders and the group his 2016 campaign inspired, Our Revolution, are not winning elections in places like Birmingham or Jackson, Mississippi, which in June elected a mayor who’s promised, “I’ll make Jackson the most radical city on the planet.” Activists in Birmingham and Jackson and Albuquerque and Long Island are winning them—left-wing activists who’ve toiled for years in the trenches, working with a new wave of organizers from Black Lives Matter and other insurgent groups, who bring social-media savvy and fired-up young voters into the mix.

Of course, it’s a great thing that groups like Our Revolution, which sprang out of Sanders’s 2016 presidential campaign, are bringing money, volunteers, and national attention to candidates like Woodfin. But the top-down narrative misses a lot about what is happening on the ground around the country. For starters, it misses the movements that shifted politics to the point where someone like Sanders could run for president and win state after state in the first place. More important, it misses the specifics—the ideas, the tactics, the challenges to existing political hegemonies—that have made these campaigns successful. And telling the story wrong lessens the chances that these unlikely wins can be replicated elsewhere.


In the wake of Woodfin’s victory in Birmingham, political scientist Vince Gawronski of Birmingham-Southern College commented that while Sanders’s endorsement helped with young people, “No one I talked to said they were voting for Woodfin because Bernie Sanders told them.”  

Woodfin was given a boost by Our Revolution, which provided phone-banking and texting support, and had on-the-ground help from the Working Families Party as well. But it was an army of Birmingham volunteers running a door-to-door canvass that made the difference. And it was Woodfin’s proposals—some of them straight from community movements—that spoke to the city’s frustration with unequal “revitalization” efforts that leave too many neighborhoods behind. The mayor-elect proposed, among other things, a youth jobs program inspired by a black mayor whose community programs are largely forgotten by the national media—Marion Barry of Washington, D.C.

The “Bernie left” stories also obscure the fact that Birmingham, despite the stereotypes pinned on it and other Southern cities by the northern media, has a recent history of electing progressives—including the first openly gay state legislator in Alabama, Patricia Todd. Woodfin’s win wasn’t a “red-to-blue” shift; it was a center-to-left shift—and there’s a whole different kind of moral to that story. 

This is, perhaps, where the “Sanders-left” frame does apply—in cases where activists are knocking off more moderate Democrats and nudging the party leftward, the way Sanders did nationally in 2016. Yet the wave of left-leaning victories has also included the likes of Christine Pellegrino, a teacher and opt-out movement leader who swung a heavily Republican district on Long Island nearly 40 points from Trump’s total just a few months earlier.

These eye-opening wins are bringing attention to social-democratic (and Sanders-backed) candidates on ballots this November. Vincent Fort, a 20-plus-year state senator in Georgia running for mayor of Atlanta, is hoping to succeed his bitter political enemy, business-friendly Democrat Kasim Reed. Fort has ties to labor and local organizing that stretch back throughout his tenure in the legislature. Eric Robertson, political director of Teamsters Local 728 in Atlanta, says Fort “has always been there as part of the movement and has always been willing to lend his voice and at times his body to different struggles going on in and around Atlanta.” He’s mobilized alongside anti-eviction activists and fought predatory lending years before the financial crisis. “Now that he’s running for office,” Robertson says, “people see him as the authentic voice of the movement,” rather than “just some person who has never spoken to these groups before, but all of a sudden is an advocate.”

It’s that authentic movement connection, more than a simple national endorsement, that helps candidates galvanize people and win. “All these candidates got elected mainly because they had established a base from the work they have been doing in their communities, or as elected officials,” Robertson says. “What they’re getting from the Bernie campaign is to be able to form a coalition with populist-leaning progressive white folks on a scale that has not been seen for quite a long time.”

Larry Krasner, a civil-rights attorney who won a shocking victory in May’s Democratic primary to be Philadelphia’s district attorney, told The Dig podcast that his work as a lawyer defending movements gave him a campaign army when he decided to run. “I think activists and organizers do politics better than politicians,” he said. “And that means that those of us who have been down with their causes and have supported them for a long time have credibility.”

In some places, newer faces on the scene have established credibility by leading newer movements. Atlanta’s khalid kamau (a Yoruba name, and thus lowercased)—a DSA member, a co-founder of Atlanta Black Lives Matter, and a “Fight for $15” stalwart—stunned the local Democrats by winning a city council seat in April.

The newer movements’ more radical demands have shifted the left’s political horizons, and their organizing skills and social-media savvy laid a path for activists like Krasner and kamau to move from relative obscurity to national name recognition. “Social movements expand the range of the possible and transform public opinion,” says Joe Dinkin of the WFP. “Larry never could have won had the Black Lives Matter movement not existed these last several years. The Black Lives Matter movement transformed how Americans thought about policing and about mass incarceration.”

Some of the recent wins attributed to the “Bernie left” are the product of decades of planning through what were very dark times for social movements. The victory of Chokwe Antar Lumumba in this year’s Jackson, Mississippi, mayor’s race was the product of decades of work that had first borne fruit in 2013 with the election of Lumumba’s father, also Chokwe Lumumba, to that same office. The first Lumumba’s untimely death in office put the Jackson organizers’ plans on hold for a while, but they had been building a movement from the ground up since 9/11. That’s when the Malcolm X Grassroots Movement, known nationally for its work calculating how many black people in America are killed by police each year, decided to focus its efforts on transforming Jackson’s economy and governance. They began to build people’s assemblies, and that grassroots work formed the basis for runs for office. “Despite many of our years, if not decades, of studying radical theory and process, we didn’t encounter many who had any serious analysis on how to actually govern,” says Kali Akuno, a leader of what became Cooperation Jackson.

Their focus now is building a “solidarity economy” on a local level that can be a model for transforming cities and empowering underserved communities, Lumumba told me recently. Part of that plan is to build cooperatives—with the city’s backing—to create new businesses. It’s a direct contrast to the normal development strategy of city and state governments, which throw tax breaks at big corporations in hopes that they’ll create a few jobs with the handout. “Where we see a void, where we see a need, we can create something for ourselves,” Lumumba says, “so the community can fill its own gaps and at the same time give the people who work the opportunity to dictate what their labor will be and what the fruits of their labor will be.”


There’s something about the fast-moving, ground-shaking political moment we’re in that defies the nomenclature we have, which perhaps explains journalists’ need to reach for a personality—a Bernie—to define what’s happening. Things are changing too quickly for our language to have caught up. “Progressive” feels too vague, too reminiscent, perhaps, of the 2000s-era anti-Bush “netroots” moment, though plenty of the people running today’s left-leaning campaigns cut their teeth in the netroots. For some, “populist” is too easy to confuse with the right-wing, Donald Trump brand. And “social democratic” brings the “S-word” into the equation, and risks alienating some of the less-radical left-wingers who’ve just gotten turned on to activism.

The movement that supported Sanders in 2016 was simply too broad to lend itself to easy labeling, ranging as it did from the socialists of DSA to left-leaning Democrats who hadn’t been moved to hit the streets under President Obama. “There’s a much larger scale of people who are open to a left politics that’s a bit more moderate than your average DSA member but to the left of the Democratic party mainstream,” says Robertson. There are also those—like Randall Woodfin himself—who backed Clinton in the primary, but are to the left of the Democratic mainstream and have fought since the inauguration against Trump’s policies. 

“In the age of Trump, most Democrats are in no mood to wait around and make slow progress when so much is under attack—voters want what they believe in and they want it now,” says Dinkin of the Working Families Party. “Trump has been part of awakening a new fervor and even militancy in voters.”

But the national media hates nothing more than “it’s complicated.” And unlike in the past, because of the decline of local newspapers, they don’t have sharp local political reporting to lean on for making sense of particular elections. Instead, narratives get picked up and run with because they are already out there. 

The tendency to reduce the story to the “Sanders left” also exemplifies an ongoing problem of horse-race journalism. Really, though, boxing is a more apt sports metaphor than a horse race, since journalists tend to count every action or sentence from a politician as a jab, cross, or knockdown blow with the relish of a ringside announcer. Sanders, in this framework, is important because he is a once and (presumed) future presidential candidate, and therefore everything that involves him in some way or other is seen through the lens of his rising or falling power within left-of-center politics. 

Take this recent story from Business Insider: “There’s a quiet battle between Bernie Sanders and Kamala Harris.” Ostensibly a news piece about the Atlanta mayor’s race, what it actually does is take the opportunity of a (perceived) 2020 presidential candidate giving a speech in Atlanta to set up a mostly nonexistent “battle”—not with another wing of the same party or even against a competing ideology, but against another (presumed) 2020 candidate. Sanders has backed Vincent Fort in Atlanta; Kamala Harris said some generic words of praise for the sitting mayor of the city, Reed, when she spoke there. Hardly a “battle.” But there we are.

It is our inability to conceive of politics beyond personalities, in part, that makes such farcical articles tick. But the problem goes beyond a couple of laughably bad takes. To really understand our shifting politics, we need to understand Sanders himself as the symptom of a phenomenon that is in fact global—the failure of the neoliberal center in the wake of the 2008 financial crisis, which has resulted in the collapse in country after country of the consensus between a couple of centrist ruling parties, and the rise of popular social movements to fill the vacuum left by politics that is so clearly not up to the task of finding solutions to our problems.

In that time we have seen the rise of not just Sanders but Jeremy Corbyn in Britain, Jean-Luc Mélenchon and La France Insoumise in France, Podemos in Spain, and Syriza in Greece, along with right-wing counterparts whose politics resemble Donald Trump’s. In some cases, like Sanders, Corbyn, and Mélenchon, or indeed like Vincent Fort, they are people who have labored below most people’s radar for decades, building a reputation as the elected official you could call to come walk a picket line, or to craft a bit of legislation that would sneak through progressive policies or might fail but still move the dial a little leftward. In the post-2008 moment of more aggressive social movements and more dramatic shifts, a new generation looking for leadership often turns to those people, while younger leaders also rise from the streets to elected office they perhaps never imagined occupying.

The left-wing surge in local politics, in so many different cities, bears watching in part because these new mayors and council members will be tomorrow’s state-legislative leaders and gubernatorial and senatorial candidates. (Imagine Woodfin occupying Jeff Sessions’s old U.S. Senate seat some day.) But it’s also a sign of a rising tide every bit as important as the Trump movement, which has garnered so much more careful attention. Reducing these wins to victories for one “side” of the Democratic Party certainly does social movements no favors, either. As Chokwe Antar Lumumba said, “I do not believe electoral politics is the end; it is the means to an end.”

We’re All Living in Hobby Lobby’s Bible Nation
October 13th, 2017, 07:40 PM

Hobby Lobby stores feel like a second home if you were raised evangelical. It’s Vacation Bible School and Sunday School and girls’ Bible study, located inside one shabby-chic warren. The stores’ owners, David Green and his family—like Chick-fil-A’s Dan Cathy and many before them—have layered a Christ-like veneer onto the pursuit of profit. At Hobby Lobby you can buy a Jesus cross-stitch kit or a poster that reads “This Girl Runs On Cupcakes and Jesus” alongside beads and quilting fabric; it is the store of choice for America’s church ladies, and it has, in turn, made its owners billionaires. The Lord is good—to the Greens. 

BIBLE NATION: THE UNITED STATES OF HOBBY LOBBY By Candida R. Moss and Joel S. Baden Princeton University Press, 240 pp., $29.95

David Green is an exemplar among Christian businessmen. He stands out not just for his improbably large craft empire, now worth $4.3 billion, or for his religiosity, but for the scale of his ambitions. Green and his family have a clear set of beliefs that, they hold, should shape American life: The U.S. is a Christian nation, the archaeological record supports an evangelical Protestant view of the Bible, and Americans should be compelled by law to live according to this interpretation. And they are determined to prove it. In Hobby Lobby v. Burwell, Green’s company argued before the Supreme Court that employers with religious objections should not have to provide insurance coverage for contraception, and he won.

The Supreme Court ruling was far from the pinnacle of the Greens’ ambition, as theology professors Candida Moss and Joel Baden show in their new book Bible Nation: The United States of Hobby Lobby. The Greens “didn’t just want to turn their mom-and-pop homecrafts store into a billion-dollar empire,” Moss and Baden write, “or even merely give back to society once they had made it big. They wanted to play a role in the course of human history. As Mark Rutland, the former president of Oral Roberts University, put it, ‘the Greens are Kingdom-givers.’”

Almost since they opened their first store in 1972, the Greens have mounted a multi-pronged and well-financed campaign to give credibility to and spread their version of human history. Moss and Baden have written the first comprehensive account of that campaign, focusing mostly on the Green Collection, which has purchased a still-unknown quantity of Biblical artifacts; the Green Scholars Initiative, which analyzes those artifacts and produces its own academic curriculum; and the Museum of the Bible, which will open in Washington, D.C., this fall to display the Green Collection’s artifacts. These projects provide a crucial foundation for the family’s theology. For the rest of their evangelical beliefs to make sense—their stance on the role of religion in American public life, for example—they must first defend their understanding of the Bible as a consistent, and literally true, historical record. This is not scholarship for the sake of learning or inquiry, but rather an attempt to prove definitively the truth of conservative evangelicalism.

Exhaustively reported and scrupulously fair, Bible Nation doubles as a portrait of conviction: The Greens may well be the most sincere and most frequently misguided activists in America. “If they are culture warriors,” say Moss and Baden, “that has been the case for many years.” They’ve successfully shaped the culture they want through the labor practices in their stores, through their philanthropic choices, and through their proselytizing mission. Real piety and strategic canniness: it’s a familiar blend. The Green family thinks it needs to build its America, but we’re already in it, and we have been for a long time.


David Green is a preacher’s son, born in Kansas in 1941. He has never deviated from the faith of his youth, and credits God for orchestrating his rise from poverty-stricken child to billionaire adult. According to one detailed profile in Forbes, Green started his working life as a stock boy, then worked for a five-and-dime store before opening the first Hobby Lobby. Rough patches notwithstanding, Hobby Lobby evolved into a corporate behemoth. But Green’s ambitions were never purely financial. A devout Pentecostal Christian, Green believes that capitalism and Christianity not only work together but require each other. 

Moss and Baden correctly connect this belief to the prosperity gospel, which frames financial success as proof of divine blessing. When the Greens make money, they can say God is responsible. This also gives the Greens a perfect justification for almost any restriction they enforce in the workplace. It certainly worked in Hobby Lobby v. Burwell. With its ruling, the Supreme Court delivered the Greens further proof that God blessed their endeavors; the favorable justices took on the form of Moses, handing down divine law. The Greens already believed that moral justifications underpinned their practices; the high court added legal justification too. 

When dissent has risen up within the Hobby Lobby empire, the Greens have proved adept at quashing it. Moss and Baden point to a number of lawsuits filed by former employees, alleging gender discrimination, discrimination against employees with disabilities, and illegally long hours. We don’t know the outcomes of these complaints because they disappeared into mediation: Hobby Lobby employees reportedly sign an agreement that they will take complaints to a religious or secular arbitration process. Moss and Baden call this unusual, but it isn’t really; many corporations force disgruntled staff into arbitration to keep complaints out of court, and they do it because the arbitration process favors them. The results are private—something the authors do note—and so the Greens’ reputation remains untarnished.

David Green is reportedly worth $6.3 billion now, thanks to this management style and to the American thirst for tchotchkes. But Moss and Baden say that rather than donate the familial fortune to causes that broadly benefit society, the Greens restrict their largesse to charities that meet strict doctrinal standards. Indeed, David Green aspires to cosmic impact. “I want to know that I have affected people for eternity,” he has said. “I believe I am. I believe once someone knows Christ as their personal savior, I’ve affected eternity. I matter 10 billion years from now.” David’s son Steve Green, who is the current president of Hobby Lobby and who emerged from Burwell as the family’s most public face, speaks often of a Christian “worldview” and his desire to spread it. “The Greens focus exclusively on spreading the Good News,” the authors explain. The OneHope Foundation, Wycliffe Bible Translators, Every Home for Christ: The Greens are committed to the Bible, and want everyone else to be as well.


In July, the Green family made headlines for the other ways they spend their money. After a federal investigation, the Greens had to return $1.6 million worth of clay cuneiform tablets they had bought from Iraq. The money did not, as some initially reported, fund ISIS; but it did prop up a thriving market in looted antiquities and hasten the cultural predation of Iraq. Once these details emerged, the Greens were ordered to pay a $3 million fine to the federal government for smuggling items out of Iraq and mislabeling the shipments; the Greens had called the artifacts “tile samples” and claimed they were purchased from Turkey, not Iraq. “We should have exercised more oversight and carefully questioned how the acquisitions were handled,” Steve Green told NBC News at the time. 

Moss and Baden say the Greens started collecting artifacts in 2009, after repeated pitches from Jonathan David Shipman and former Cornerstone University professor David Carroll. According to the authors, Shipman wanted to keep his own Bible museum project afloat; Carroll tells them that Shipman managed to bring the Greens on board by pointing out that there would be financial benefits to purchasing artifacts. As the authors note, antiquities constantly increase in value, meaning that items in the Greens’ collection are worth more over time, and there are other benefits. “The rationale, again, is a financial one,” they write. “These are objects that can be bought relatively cheaply, but can be valued quite highly for tax purposes.” The project has since expanded beyond Carroll and Shipman, neither of whom is currently associated with either the Green Collection or the Museum of the Bible. But the Greens adopted their ideas with force. By 2010, the Greens had already collected 30,000 pieces—a significant investment for a family that, barely a year before, possessed no real collection at all.

The prospect of Steve Green as a Bible-thumping Indiana Jones is a tantalizing vision. But in Moss and Baden’s telling, the family has long been a victim of bad actors and its own incompetence. This is where Bible Nation delves deepest, revealing the extent to which the idea that some things must be verified, rather than taken on faith, contradicts the family’s beliefs. The Greens admit that the world of archaeology and papyrology is not their world; that they rely on “experts” to inform them of an object’s authenticity and importance. Yet this has repeatedly led them into trouble, as the claims of their own experts have been disputed and the authenticity of their collections questioned.

Many of the Greens’ prize artifacts are of questionable provenance, Moss and Baden report. Reputable collectors typically establish an artifact’s provenance, or its history of ownership, to confirm that it is authentic and that it hasn’t been looted from its country of origin. Considering the Greens’ intention to display these items in a museum setting, one would expect them to place particular emphasis on the provenance of their collection. Unfortunately, this is not the case. Moss and Baden thoroughly document a disturbing catalogue of errors committed by the Greens and their hand-picked scholars: items acquired from eBay sellers, then analyzed by students and academics affiliated with the Green Scholars Initiative despite the fact that many, if not most, have little experience in papyrology or related fields.

This means the Greens now possess a fair number of items that lack verified authenticity. Among them: A selection of purported Dead Sea Scrolls. Moss and Baden write that the Green scrolls belong to a “wave” of Dead Sea Scrolls that came on the market in 2002—and that “none” have “any reasonable provenance.” One scholar tells the authors that some of these newly available scrolls are likely forgeries. And there’s more trouble. Green scholars repeatedly destroyed mummy masks in hopes they would find Christian papyri, a practice they assure the authors they’ve stopped. One individual linked to the Green Collection claimed he smuggled a tenth-century psalter across international borders in his luggage. 

Many of these artifacts will be on display at the Museum of the Bible, which is set to open this fall despite ongoing controversy over both the validity of the items and the way the Museum intends to display them. The Greens claim the Museum will be non-sectarian, but the act of presenting the Bible on its own, without commentary, is specifically Protestant. Though the Greens have worked with Jewish and Catholic leaders not only for the Museum of the Bible but for previous exhibits, Moss and Baden conclude that they’ve unwittingly facilitated a fundamentally Protestant endeavor—a version of “Protestant triumphalism,” they call it.


Bible Nation is, in part, a family saga. The Greens possess a multi-generational vision not just of a Christian America but of a Christian world, and they suffer from a striking inability to see its contradictions and flaws. The same trait eventually doomed the Greens’ attempt to create a Bible curriculum for public schools, which they first tried to place in Mustang, Oklahoma. As I and others reported at the time, the curriculum violated basic First Amendment requirements to teach the Bible as literature or history, not as true doctrine; the Mustang school district eventually dropped the plan. The Greens have since revamped the class, Moss and Baden say, and though it’s more secular than its predecessor it still presents an essentially Protestant view of the Bible. Judging from Moss and Baden’s account, the new curriculum is unlikely to survive a First Amendment challenge if a public school ever tries to take it up.

Steve Green does not seem to understand that the new curriculum has many of the problems of the old one. “My family has no problem supporting those that are out there evangelizing, we do that. But this one is a different role, it has a different purpose. A public school is not the place to evangelize, that’s what the church’s role is; a public school is for education to teach the facts,” he says. In Green’s view, the curriculum he has created does present the facts. He is similarly convinced that the Museum of the Bible is non-sectarian, just as he is convinced that the Constitution does not stipulate a strong wall between church and state. Green genuinely believes he is telling the truth. “We’re buyers of items to tell the story. We pass on more than we buy because it doesn’t fit what we are trying to tell,” he says of his artifacts. It’s not clear that he understands this to be the work of a propagandist. 

The Greens are not the sole creators of the fun-house mirror world we inhabit, but they help sustain it. American evangelicals embrace them; the Supreme Court takes them seriously; Donald Trump, a man David Green enthusiastically endorsed, is president and panders to the family’s political allies. The Greens can say whatever they want. This is their nation, and we are their subjects. 

Hollywood’s Inequality Enabled Harvey Weinstein
October 13th, 2017, 07:40 PM

The revelations about Hollywood producer Harvey Weinstein’s long history of alleged sexual assaults surfaced just days before the first anniversary of the news about Donald Trump’s Access Hollywood tape—an apt coincidence, since Trump’s boasts also get to the root of the Weinstein scandal. “And when you’re a star, they let you do it,” Trump said off-camera to interviewer Billy Bush. “You can do anything.... Grab ’em by the pussy. You can do anything.” Harvey Weinstein is far less famous than Trump, indeed less famous than many of the women he allegedly harassed, like Gwyneth Paltrow and Angelina Jolie. Yet within the unique milieu of Hollywood, where powerful producers enjoy a special influence and professional deference, he was granted an impunity much like Trump was.

In a world of cinematic stars, the most bankable clout belongs to those who can make or break movie careers. Weinstein is legendary for his prowess in this regard. He has helped obscure actors, actresses, and directors win plaudits and Academy Awards. Even Mira Sorvino, who has a harrowing story of how Weinstein stalked her in her apartment, acknowledges his special gifts, telling The New Yorker, “I have great respect for Harvey as an artist, and owe him and his brother a debt of gratitude for the early success in my career, including the Oscar.” To the general public, Weinstein was just an obscure name that flashed on the credits at the end of movies, but to workers in Hollywood, including many A-list celebrities, Weinstein was a star among stars—the wizard who had the power to make them rich and famous. 

In this sense, the great hidden promoter of patriarchal sexual impunity in Weinstein’s case, as in Trump’s, is money. The Weinstein scandal is a story about sexual assault, but also a story about institutions that bolster systematic inequality. Weinstein couldn’t have flourished as an abuser for decades if there weren’t institutions in place that enabled him to act on his desire to humiliate and assault women without repercussion. And the only way to address the type of abuse Weinstein inflicted is to build counter-institutions that weaken the power that abusers have.


Some observers of the Weinstein scandal have employed the soft language of “culture” to explain the producer’s behavior. Blaming “culture” for sexual assaults is an easy excuse, with bipartisan appeal. In their statement on the Weinstein scandal, Barack and Michelle Obama wrote, “And we all need to build a culture—including by empowering our girls and teaching our boys decency and respect—so we can make such behavior less prevalent in the future.” New York Times columnist Bret Stephens gave this line of thinking a conservative twist by suggesting that the “libertine” culture that emerged in the 1960s and 1970s allowed Weinstein to run wild. 

In point of fact, as Times film critic Manohla Dargis and others have noted, the basic structure of abuse in the Weinstein case (the powerful producer preying on young actresses) has existed since the beginning of Hollywood. The first known use of the phrase “casting couch” was in 1931, and the reality it describes was already familiar by then. Structurally, the casting couch reflects a core inequality within the film industry: There are many young women who want to work in Hollywood and only a few powerful producers who have access to the money needed to make movies.

These young women also exist in an attention economy, which often ensures that the only chips they have to bargain with are their attractiveness and sexual availability. The studio bosses, meanwhile, have the power of capital behind them. Movies are extremely expensive, and only the studio producers can green-light a project or build up a career. Of course, once an actress becomes famous, she has a little bit more leverage, but she can only gain that power by going through the producer. And as the careers of stars like Judy Garland and Marilyn Monroe show, even famous female stars can be abused.

Beyond economic power, producers have sway over the vast publicity industries that flourish around Hollywood. Weinstein in particular was a master of the publicity machine, thanks to his relationships with figures like Tina Brown, with the Condé Nast magazines, and with the gossip columns of the New York Post. Indeed, it could be argued that Weinstein’s true talent lay not so much in filmmaking (where his taste was frankly middlebrow) as in publicity. He knew how to create buzz around even mediocre movies like Shakespeare in Love, winning them countless Academy Awards.

Of course, Weinstein and his professional cohort of super-producers could destroy careers just as easily. As Ronan Farrow reports in The New Yorker, “Multiple sources said that Weinstein frequently bragged about planting items in media outlets about those who spoke against him; these sources feared that they might be similarly targeted. Several pointed to [actress Ambra Battilana Gutierrez’s] case, in 2015: after she went to the police, negative items discussing her sexual history and impugning her credibility began rapidly appearing in New York gossip pages.” Weinstein was able to get away with his alleged abuse because of the massive inequality between him and his victims. He had star power, wealth, and a compliant media network. None of the normal institutional checks that might restrain a powerful person applied to him.

For all of Hollywood’s well-documented reputation as an enclave of liberal social attitudes, firms like the Weinstein Company are really unregulated fiefdoms, run with the lawless spirit of Robber Baron capitalism. There’s a stark contrast between entrenched captains of industry like Weinstein and the vast army of precarious labor, ranging from personal assistants to filmmaking talent, whose existence is governed by the whim of the boss. Describing the case of a Weinstein Company employee named Emily Nestor, whom Weinstein harassed, Farrow writes:

Nestor had a conversation with company officials about the matter but didn’t pursue it further: the officials said that Weinstein would be informed of anything she told them, a practice not uncommon in smaller businesses. Several former Weinstein employees told me that the company’s human-resources department was utterly ineffective; one female executive described it as “a place where you went to when you didn’t want anything to get done. That was common knowledge across the board. Because everything funneled back to Harvey.” She described the department’s typical response to allegations of misconduct as “This is his company. If you don’t like it, you can leave.”

Weinstein’s power to abuse came from his position at the top of a hierarchy. And as New York’s Rebecca Traister noted, Weinstein’s downfall might have been triggered by his slow but steady decline. She saw him earlier this year at a Planned Parenthood celebration and was “struck by his physical diminishment; he seemed small and frail, and, when I caught sight of him in May, he appeared to be walking with a cane. He has also lost power in the movie industry, is no longer the titan of independent film, the indie mogul who could make or break an actor’s Oscar chances.”


Is there any way to challenge figures like Weinstein while they are still in power? This is a question that has implications far beyond Hollywood. Star power, bolstered by economic inequality, is pervasive in American society. The United States now has a TV star president who is also a serial sexual abuser. CEOs are also treated like stars, even when they allow abuse to flourish, as with Uber. And abusive stars have long been tolerated in the media (Bill O’Reilly) and sports (Floyd Mayweather Jr.).

The solution to elite impunity is counter-institutions that challenge the powerful. In other industries, unions can be a powerful tool for checking the power of abusive bosses. In Hollywood, guilds and unions have traditionally not been strong enough to introduce systematic, industry-wide changes, especially since, as actress Glenn Close noted in her statement on the Weinstein case, “Ours is an industry in which very few actors are indispensable and women are cast in far fewer roles than men, so the stakes are higher for women and make them more vulnerable to the manipulations of a predator.” The Screen Actors Guild has a hotline for abuse but could do much more, using unions in other industries as examples. For instance, the New York Hotel Workers’ Union has a clause in contracts that ensures workers who complain about abuse cannot be fired for doing so. This provision played a key role in allowing a hotel maid to bring an accusation against the powerful French politician Dominique Strauss-Kahn in 2011. The Screen Actors Guild could similarly push for greater legal liability and punishment, including immediate firing, in abuse cases. The Weinstein case surely points to the need for making labor issues central to liberal politics, not just for combatting economic inequality but also for combatting the personal abuse that comes with it.

As the independent writer Alex Press notes, there is an already existing informal type of collective action: whisper networks in which women inform each other about abusers. The problem with these networks is that their very informality means only a few women have access to them. Press argues that they should be transformed into more public and accessible institutions. In lieu of whisper campaigns, Press proposes “a coordinated effort to centralize the information currently floating around our networks, in an attempt to better disperse what we already know about abusers.” She writes that this clearing-house initiative “could be a hotline for women to report abuse, one that guarantees anonymity and connects the victim with a woman in her field who is willing to guide her through the possible steps she can pursue to take action against her abuser—this would be a model very similar to that employed by unions, albeit in this case, we’d be using it across workplaces and industries, a rational response to an economy where workers hop from job to job on an increasingly frequent basis.”

Consumer boycotts are another form of collective action that, as the Bill O’Reilly case proves, can directly challenge the elite privilege of powerful abusers at its source—namely, ad revenues. But such boycotts can only work if the public is informed about stories of abuse. That the Weinstein story stayed hidden for so long is in large part the result of a complicit press. To be sure, Weinstein never had the public visibility of a Bill O’Reilly. The general population pays little attention to film production companies. Still, the vast effort Weinstein put into manipulating the press and suppressing reports about his behavior is proof that bad publicity would have harmed him. The damage would likely have been indirect (some famous actors would’ve avoided being associated with someone with such a sordid reputation) but it would have been real. As an alternative to a generally mogul-compliant press, it might be possible to create a greater array of non-profit investigative reporting outlets, along the lines of ProPublica, that are tasked with gathering and disseminating news about abusive men.

Weinstein also abused the legal system, using settlements and non-disclosure agreements to help cover up his alleged crimes. And he appears to have benefited from an inexplicable decision by New York County District Attorney Cy Vance—the recipient of some $10,000 in campaign contributions tendered by Weinstein’s attorney David Boies on the producer’s behalf—not to pursue a case against the producer, even though it featured him confessing to his actions on tape. Legal reform—loosening NDAs in such cases to make it easier for abused women to talk, and removing DAs from cases where they took money—would weaken the power of future Weinsteins.

Going forward, the key is to see the Weinstein case not as an isolated set of alleged crimes by a wicked man, or even as the fault of a corrupt industry. Rather, the Weinstein story is emblematic of twenty-first-century America, where wealthy figures are granted inordinate power—and they consider it a license to grab whatever they please.

The Paradoxical Politics of Literary Criticism
October 12th, 2017, 07:40 PM
LITERARY CRITICISM: A CONCISE POLITICAL HISTORY By Joseph North Harvard University Press, 272 pp., $39.95

For all the debates that have roiled literature departments over the past 60 years, the history of the discipline itself is a source of surprising consensus. According to the standard narrative, mid-twentieth-century literary studies served a conservative agenda, fostering traditional values and upholding a canon of dead white men. The dominant school of interpretation was New Criticism, whose defining method—close reading—consisted of scrutinizing short passages of literary works detached from their political context. A theory underwrote this method: that literature could be understood apart from politics; that its meaning and power transcended the social conditions within which it was produced. From roughly the 1940s through the early 1960s, this was the prevailing approach. But new schools of interpretation, energized by the anti-establishment political movements of the late 1960s and early 1970s—poststructuralist, feminist, anti-racist, Marxist, postcolonial, new historicist, queer—rejected New Criticism’s conservatism and usurped its central position within literature departments. These new methodologies are committed to the notion that a work of literature should be understood as responsive to its time.

But now, many scholars are saying that the discipline should take another new direction. Some have called for a return to the formalist concerns championed by the New Critics. Others have questioned what they regard as an attitude of suspicion adopted in political criticism, favoring the more affirmative, emotional responses to literature of readers outside the academy. Still others have advocated for a quantitative, data-driven approach enabled by new digital technology. What distinguishes Joseph North’s shrewd new polemic Literary Criticism: A Concise Political History from these other efforts is his refusal to accept the conventional narrative of the discipline’s history. To recognize where literary studies should go, North says, we need to rethink where it has been.


A troubling question propels North’s account: How, he asks, did literary scholarship take a leftward turn during the 1970s, when neoliberalism and austerity were ascendant? “How did literary studies manage, not merely to hold firm against the tide, but to move strongly against it?  Everywhere else, the left in retreat; but within literary studies, a historic advance.” The discipline regards itself as a righteous defender of progressive ideals within a hostile political climate, but North is unconvinced. Tracing the development of literary studies through the turbulent years between the two world wars, the mid-century “welfare-statist compromise,” the rise of neoliberalism in the 1970s, and finally the 2008 financial meltdown, North argues that literary studies, far from coming to embrace political activism, has gradually retreated from the interventionist mission that it seemed ready to adopt during its earlier phases.

The first period North considers, the 1920s and 1930s, witnesses a struggle between traditional scholars, caught up in obscure debates over etymology, and amateur belletristic critics concerned with shaping the sensibility of the general public. The hero in this drama is the British thinker I. A. Richards, who embraces the goal of using literature to educate readers outside the academy, but simultaneously introduces more rigorous critical methods which literature departments end up adopting. Richards’s most important contribution, says North, is his rejection of theories that isolate the experience of art from the practical concerns of everyday life. For Richards, reading poetry is a way of reorganizing people’s minds, enhancing their cognitive powers, and cultivating their “practical faculties.”  A great poem imparts a greater psychic balance to readers, training their minds to accommodate and harmonize a multitude of competing urges, making them at once more sensitive and more self-possessed. By emphasizing the usefulness of aesthetic cultivation for non-scholarly lives, Richards pinpoints a means by which literary criticism can contribute to the transformation of society. Richards doesn’t imagine an explicit political function for literature, but according to North, his is the most feasible blueprint for turning criticism into an engine of political change.

 But later, the New Critics and others hijack the method introduced by Richards—close reading—and make it serve precisely the conception of aesthetic value that he had sought to invalidate: that of beauty for beauty’s sake.  It’s an understanding of aesthetics that places politics or social betterment beneath literature, as something that critics shouldn’t sully their hands with. In defending this view, the New Critics turn away from the project of cultivating minds, focusing instead on “objectively” ranking literary works, thereby propagating what North calls the “sterile concern with hierarchy and canonicity that will occupy much of Anglophone literary studies throughout the Cold War period.” The New Critics, in North’s account, are more interested in making absolute claims about the greatness of literary works than in using these works to improve readers’ lives.

This rewriting of the goals of literary criticism has had far-reaching consequences. Decades later, leftist critics—including the likes of Raymond Williams, Terry Eagleton, and Fredric Jameson—recoil from New Criticism’s commitment to traditional cultural hierarchies, and reject aesthetic cultivation entirely. Embracing a wholly political approach to reading, these critics abandon what North believes is the one means by which literature might be made to reshape the world. Though they believe they are pushing literary studies to become more politically engaged, they are, North contends, doing the opposite. In their view, literature is primarily a vehicle of ideology, a way of masking the painful contradictions of capitalism. Their criticism treats literature as a symptom, valuable only as an indirect expression of the political forces that created it.

According to North, all subsequent schools of political criticism employ the same “historicist/contextualist” approach, treating literature as a way of understanding rather than influencing society. Moreover, they tend to couch their readings in esoteric jargon, effectively shielding their work from any broader social relevance. North asserts that what he calls their “specialized knowledge production” serves the needs of neoliberalism—though he never quite explains how. But his account does clarify why the literary academic world is allowed to exist despite its radical posture: A remote and sparsely inhabited island, it exerts zero influence on the world around it.

Now, however, the neoliberal order is in crisis, throwing global politics into disarray and creating an opening for a new critical paradigm. In recent trends—including the renewed focus on form, the recognition of the importance of readers’ moods and affective states, the reaffirmation of the global reach of literature as opposed to an emphasis on national readerships—North discerns hints of a collective revolt against the contextualist-historicist paradigm in favor of a renewal of the public-facing criticism espoused by Richards. 

If such a renewal happens, North will deserve some credit: His effort to disentangle the progressive possibilities of aesthetic cultivation from the reactionary forms it has assumed may well help to rejuvenate the discipline. After all, it is entirely possible to expose students to complexity and nuance without reaffirming the old-school canon of dead white men or the reactionary politics that the canon was made to serve. North’s style is disarmingly lucid and self-assured; it reminds me of the work produced by an earlier kind of scholar, the sort who imagined a general audience. As devastating as it is meticulous, North’s analysis is a tour de force demonstration of what close reading can bring to light and why it would be a tragedy if the discipline ever gave it up.


North is such a smart and articulate thinker that it seems foolish to argue with him. But his strident tone invites debate. A good place to begin is his valorization of I. A. Richards and corresponding demonization of the New Critics. As a result of the New Critics’ ascendance, according to North, “the goal of so much critical work in the discipline became … not to educate the reader, but to adulate the text.” The New Critics are vulnerable to almost every accusation the left has thrown at them, but the one thing they did not do was abdicate the responsibility to educate readers. While Richards’s interpretations of poems tend to be cranky and elliptical, the models of careful exegesis supplied by the New Critics were designed to show newcomers to literature how it might be done. To this day, any student who scours a passage from a literary work in search of ambiguities and ironies is following the example they set. The New Critics did defend traditional hierarchies segregating highbrow and lowbrow literature. But so did Richards, who claimed in Principles of Literary Criticism that “the gulf between what is preferred by the majority and what is accepted as excellent by the most qualified opinion has become infinitely more serious.”

This would be just a trivial objection, except that it reveals a blind spot in North’s framework that has significant consequences. Strangely, for someone so committed to the pragmatic functions of criticism, North systematically privileges critics’ claims about their projects over their actual practices, and judges them on the basis of their mission statements rather than their work. Thus Richards’s abstract statements about what literature should accomplish are more persuasive to him than the New Critics’ engagement with particular literary texts. This tendency to treat statements of belief and intention as the key to understanding a given style of reading yields an impoverished picture of the various historicist schools that North finds wanting. He claims that these schools aim to produce knowledge and therefore don’t influence society, assuming they can do only one or the other. Postcolonialism, critical race theory, gender studies, and New Historicism may not have ushered in the seismic personal and social liberations that they aimed for, but can anyone who pays attention claim that they have done nothing to cultivate “new modes of subjectivity”?

Then again, North’s main concern may not be whether academic scholarship serves a practical function, but for whom. Over and over, he laments the isolation of the academy and yearns for an approach capable of reaching a wider public. This is a common desire among academics, but it raises the question of whether criticism aimed at such an audience, beyond the university and thus more responsive to market demands, would be better equipped to resist neoliberalism than traditional scholarship. 

It’s worth noting that North is himself producing historicist/contextualist scholarship: His book is focused on a narrowly defined historical context, the Anglo-American twentieth century, and he treats texts as symptoms of broader ideological forces. His book is polemical, but he is calling for, not producing, a new kind of criticism. This, too, is a widespread tendency. So many literary scholars these days seem to hope for a criticism that can crash through the walls of the academy, shape minds, melt hearts, and change the world. It is tempting to debunk their fantasies. But they may serve a necessary function. As North’s book demonstrates, continuing the modest but important work that literary studies does do may depend on its ability to imagine the work that it might do.

The Exodus
October 12th, 2017, 07:40 PM

The boat arrived on September 7, charging onto the beach. That morning, passengers plunged into the surf, then disappeared onto the shore, seeking shelter and a chance to sleep. In the distance, plumes of smoke drifted up from the horizon: Myanmar was burning.

Seven years after Aung San Suu Kyi was released from house arrest, heralding a tolerant new age for Myanmar, the former British colony has once again been plunged into ethnic conflict. Fires have raged for weeks across Rakhine State, home to roughly a million Rohingya Muslims, razing entire villages. Government officials have suggested that the Rohingya—a long-persecuted minority who are treated as illegal immigrants, despite having lived in Burma for generations—are setting their own homes ablaze. But humanitarian groups tell a different story: Burmese soldiers are roving the countryside, carrying out an orchestrated campaign of murder, rape, and arson. The United Nations human rights chief calls the conflict “a textbook example of ethnic cleansing.”

By September 28, three weeks after this boat arrived in Bangladesh, more than 500,000 Rohingya had fled the violence. Every day, thousands more sought refuge. Most had trekked for days to the Burmese coast, where fishing boats lurk in the darkness each morning, waiting for passengers. The final leg of the journey takes five hours over choppy waters. Australian photographer Patrick Brown was on the beach when the boats arrived in Bangladesh. The last time he saw such suffering, he says, was in Indonesia, following the tsunami in 2004. “It’s hard to put it into a photograph,” Brown says. “In their faces, I see exhaustion, terror, and a lot of anxiety about their future.”

September 6, 2017: Rohingya refugees walk through paddy fields and flooded land shortly after crossing the Burmese border and arriving in Cox’s Bazar District, in Bangladesh’s Chittagong Division.

October 4, 2017: Rohingya refugees crammed into an open-back truck on a highway leading toward the Leda refugee camp in Bangladesh. Children make up 60 percent of the new arrivals.

September 7, 2017: Noor Haba, 11, looks out over the side of a fishing boat as she, her family, and 25 other Rohingya refugees arrive on Shamlapur Beach in Bangladesh. It took five hours for them to sail from Rakhine State across the Bay of Bengal.

September 28, 2017: Corpses of several Rohingya children and adults lie on the ground prior to being moved to a morgue. They drowned when their boat capsized eight kilometers off Inani Beach in Cox’s Bazar District. More than 100 people were on board. According to the police, only 17 survived.

September 10, 2017: Rohingya refugees reach out for provisions being distributed by private aid initiatives in the Kutupalong refugee camp in Cox’s Bazar, Bangladesh. A week later, just outside this camp, three refugees were killed in a stampede as supplies were being thrown from relief trucks.

September 30, 2017: Rohingya refugees collect water from a muddy pool at the Unchiprang refugee camp. Without any access to ground water, the camp needs an estimated 745,550 litres of water to be trucked in each day.

September 30, 2017: Momera, 25, and her children chose to sleep on the side of one of Bangladesh’s main roads until they could find shelter. The family had spent nine days trying to reach the border in Myanmar. Midway through their journey, Momera gave birth to a baby boy. He died the day before this photo was taken, on September 29.

Patrick Brown/Unicef/Panos Pictures

The Mountain Between Us is a Lesson in Bad Casting
October 12th, 2017, 07:40 PM

The best moment in the new survival drama The Mountain Between Us is when a cellphone rings. Ben (Idris Elba) and Alex (Kate Winslet) are stranded up a mountain after a plane crash, her phone smashed and his without signal. As the signature iPhone ringtone warbled around the movie theater, the audience drew in their breath: They’re saved! But no. It was just an elderly attendee who picked up his phone to stage-whisper, “I’m in a movie.”

Ben and Alex are strangers who banded together to charter a tiny plane to take them to Denver. Alex is getting married shortly, while Ben is due to perform emergency surgery (the different stakes of these appointments foreshadow a lot). Their pilot is a bit old (stroke-prone) and a storm is on the way. They crash. In the broad strokes of its plot, The Mountain Between Us fits well within the boundaries of the plane crash survival movie genre.

The most famous plane crash survival movies all take place, oddly enough, on snowy mountaintops. Alive (1993) told the story of a Uruguayan rugby team stranded in the Andes, forced to eat their dead and battling the endless white around them (key line: “If you eat me, do you promise to clean your plates?”). Likewise, 2011’s The Grey saw Liam Neeson adrift in the Alaskan wilderness after his little plane crashed. In that movie, the snow is coupled with another threat: wolves. There are many more: The Snow Walker (2003) depicts plane crash survivors in the Arctic. The Edge (1997) is exactly the same thing, but with Alec Baldwin. In pleasant contrast, Cast Away (2000) gave the genre an island holiday.

There are things to admire about The Mountain Between Us, but not many. The dead pilot’s golden retriever costars, providing much-needed companionship for a viewer embarking on this frozen journey. A marauding wild cougar also deserves praise for trying to eat Kate Winslet early on. These two are just about outshone, however, by an enormously romantic performance from Idris Elba as a heartbroken neurosurgeon who tends to Alex’s wounds, pulls her trousers down when she has to pee, and generally hauls her dead weight down a mountain.

The theme of Alex and Ben’s relationship—they fall in love on their long march to rescue, of course—is of the heart versus the brain. This is spelled out very clearly. He is a neurosurgeon who says “the heart is just a muscle.” She is a photojournalist who values the heart over the head. The conceit is that she provides the heart and courage for them to keep going and take risks. He has the medical knowledge, physical strength, and resourcefulness that keeps her alive. It’s a bit of an imbalance of talents really, one that reflects an imbalance in magnetism between these two actors.

Idris Elba is a very, very watchable man. A role as a leading romantic hero in an expensive movie is long overdue for him, and he rises to the occasion. Elba hasn’t been cast well in Hollywood. It was a shame to see how badly The Dark Tower (2017) turned out. His best movies may well still be 28 Weeks Later (2007) and Prometheus (2012). It’s really good to see him act with his normal voice—the one fans know and love from Luther—and using normal British phrases like, “We’ve got to get on with it!”

Ben falls for Alex first, and we know because of his eyes. She’s irritating but funny: “Want coffee?” she quips as she heads out of an icy cave, grinning as if it’s their own picket-fenced house. We see by the movements of his body and the melting sternness on his face that he is changing. The character of Ben combines macho capability with caretaking in a fatally charming concoction.

But surrounding Idris Elba’s warm performance is a metallic and cold movie, and an air of falseness that comes down, in the end, to simple lack of chemistry between the costars. Kate Winslet is a good actress, but she doesn’t work in this scenario in part because her character has no real emotional vulnerability. It’s painful to compare this performance to the rawness of her characters in Eternal Sunshine of the Spotless Mind (2004), Sense and Sensibility (1995), Kenneth Branagh’s Hamlet (1996), or even the medical horror Contagion (2011). In each of these great Winslet turns, she had the opportunity to get hysterical. She’s so very good when she’s allowed to turn it up to 11, and so very unconvincing as a cheerful American photojournalist here. Elba and Winslet’s careers teach different but equally telling lessons about the dangers of miscasting.


There are some interesting structural features to The Mountain Between Us, which was directed by Hany Abu-Assad (Paradise Now, Omar). There’s a strong motif of mediation and transparency. When the pair first show up to the hangar where the little charter plane sits, they eye each other from either side of the plane’s cockpit, through the glass. As the emotional distance between them melts, the iciness of that glass barrier transforms into a water theme. Alex plummets through some ice into a lake. Ben bathes her wounds with water. Later in the movie we see him walking in the rain, her in a swimming pool.

Alex and Ben are brought together in the end by photography. Her camera survived the crash. As rescue starts to look more and more unlikely, Ben says to her, “I want you to take a picture of me.” There are many parallels to Titanic in The Mountain Between Us, but this callback to Winslet’s “Draw me like one of your French girls” line was the only one that made me laugh out loud. Anyway, Alex refuses, only acquiescing to photograph him later, once hope has returned.

The interplay between the photography theme and the melting barrier theme is a rewarding one. Before their ordeal, Ben and Alex see neither themselves nor one another clearly. On their journey down the mountain, toward rescue and also toward love, they come to a point of recognition. This point is marked by Alex’s loving portrait of Ben asleep, a photograph that pays tribute to all the pain and beauty of his being.

The mountains are pretty too. The dog, hopping along amicably beside them, is unfailingly adorable. The whole thing is perfectly shot. It feels important to emphasize Abu-Assad’s achievements in putting this movie together because, I’m sad to say, The Mountain Between Us fails almost exclusively because of Kate Winslet. It’s a case of a good actress in the wrong film, playing across a good leading man who just doesn’t resonate on her frequency. There’s some warmth in this movie—canine though it may mostly be—but the dynamic between Winslet and Elba is disappointingly cold.

How Elizabeth Warren Became the Soul of the Democratic Party
October 12th, 2017, 07:40 PM

The Democratic establishment was shocked—and in some cases appalled—by Bernie Sanders’s insurgent bid for president last year. How could 12 million primary voters cast ballots not for market-friendly progressivism or New Deal liberalism, but for democratic socialism? And against a Clinton, no less? But astonishment eventually gave way to acceptance, even implicitly from the nominee herself: the success of Sanders’s campaign was no fluke, proving that the Democratic Party had moved decisively to the left.

But 2016 was not the year the party became more progressive. It was merely when the establishment Democrats realized it had moved. Many observers recognized a shift underway years earlier. In 2013, in a New Republic article titled “Hillary’s Nightmare? A Democratic Party That Realizes Its Soul Lies With Elizabeth Warren,” Noam Scheiber accurately predicted the rift exposed by last year’s primary, arguing that it would “cut to the very core of the party” in the next race for the White House:

On one side is a majority of Democratic voters, who are angrier, more disaffected, and altogether more populist than they’ve been in years. They are more attuned to income inequality than before the Obama presidency and more supportive of Social Security and Medicare. They’ve grown fonder of regulation and more skeptical of big business. A recent Pew poll showed that voters under 30—who skew overwhelmingly Democratic—view socialism more favorably than capitalism. Above all, Democrats are increasingly hostile to Wall Street and believe the government should rein it in.

On the other side is a group of Democratic elites associated with the Clinton era who, though they may have moved somewhat leftward in response to the recession—happily supporting economic stimulus and generous unemployment benefits—still fundamentally believe the economy functions best with a large, powerful, highly complex financial sector. Many members of this group have either made or raised enormous amounts of cash on Wall Street.

Sanders won the support of the first, ascendant side of that divide. That would seem to put the Vermont senator in a position to be the party’s new standard-bearer—especially given that, according to recent polling, he’s the most popular politician in the country. But he has returned to the Senate as an independent rather than a Democrat, aiming to “transform the Democratic Party” from the outside; what’s more, he’s 76 years old. This power vacuum has provided a clear opening for a new, progressive leader of the party—and she’s primed to occupy that role just as Scheiber foresaw, albeit a little later than he suggested. “We are not the gatecrashers of today’s Democratic Party,” Warren told Netroots Nation this year. “We are not a wing of today’s Democratic Party. We are the heart and soul of today’s Democratic Party.”

The 68-year-old Massachusetts senator is right—and she’s not the only one who says so. Amy Walter, the national editor for the Cook Political Report, wrote last week that “it’s going to be very difficult for a Democrat to win the nomination of his/her party on anything but the Warren platform.” Walter cited Pew Research data showing that “Democrats have moved dramatically leftward since the 1990s on issues like the social safety net, immigration, and race relations. On those issues, the so-called Warren wing represents the mainstream of Democratic opinion.” Congressman Jamie Raskin, vice chair of the Congressional Progressive Caucus, told me Warren “does define the center of gravity within the Democratic Party.” Even the center-left Brookings Institution scholar Bill Galston, who disagrees with Warren on many policies, acknowledged her ascendancy in this moment. “If you forced me to lay down a bet today on the most likely nominee of the Democratic Party, it would be Senator Warren,” he said—while emphasizing that this was “a simple assessment of current realities,” not a prediction. “If you asked me to define the center of gravity, it would be pretty close to where she is right now.”

This reality has major implications for the future of the party—in the 2018 midterms, the 2020 presidential election, and beyond. If Warren is now the soul of the Democratic Party, what will this transformed party look like and what will it fight for?


It’s hard to overstate Sanders’s role in shaping the American left today. Raskin rightly noted that Sanders “is in the center of progressive politics.” “It’s his bill that’s defining the Democratic agenda on healthcare, not Elizabeth Warren’s,” Walter told me. “There’s no doubt that on certain issues he’s the one setting the agenda.” But Warren, unlike Sanders, is a loyal partisan who represents a consensus between her party’s left-wing economic populists and groups aligned with the establishment like the Center for American Progress, where she keynoted an Ideas Conference earlier this year. The establishment feels more comfortable with Warren’s mission of reforming and “unrigging” existing economic and political systems, compared to Sanders’s approach of indicting and supplanting these systems altogether. “Warren is a party person,” Howard Dean, the former Vermont governor and ex-chair of the Democratic National Committee, told me. “Bernie is an iconoclast.”

From a historical perspective, Warren’s politics are firmly rooted in an American tradition. Raskin describes them as “kind of a return to the progressivism of the early 20th century.” He sees Sanders’s ideology as closer to “European-style social democracy,” observing that the Vermont senator started off his campaign last year with “kind of an old-school Marxian approach with the class struggle explaining everything.” Galston said Sanders has “a very statist agenda,” and he sees Warren as “more of what I’d call a Naderite reformer,” in the mold of consumer protection crusader Ralph Nader.

Warren, a former Harvard professor who had conceived and set up the Consumer Financial Protection Bureau under the Obama administration, rose to prominence by effectively positioning herself as a champion of working families and the middle class. “That was not an idea that came out of Clinton-esque triangulation,” Raskin said of the CFPB. “That’s both a great victory for her but also a demonstration of how ideas can change and shift the notion of what the political center is.”

“We are living in an economic populist moment,” said Adam Green, co-founder of the Warren-aligned Progressive Change Campaign Committee, “and Elizabeth Warren is very much the north star of the Democratic Party in this moment, providing direction for where the party can go to get out of the darkness.” He described her as a “unique figure who has gotten both more popular with her base and more credible with the political establishment” over the years, channeling the public’s economic frustration and picking strategic battles she won with a specific set of political skills.

“I would say she’s the best progressive Democratic politician I’ve seen since Bobby Kennedy,” said Bob Kuttner, the co-founder and co-editor of The American Prospect, lauding her for “making pocketbook populism feel mainstream.” Kuttner added, “She managed to block Barack Obama from appointing Larry Summers to chair the Fed. She’ll go to heaven for that by itself.” Warren became famous for viral confrontations with bank regulators and the likes of Obama Treasury Secretary Timothy Geithner, but Kuttner said she’s also “a great inside player” on Capitol Hill.

“She’s been very intentional about building coalitions across the Democratic Party—and occasionally with Republican senators—for her economic populist position,” Green said, “and has made it increasingly safe for Democratic politicians to follow her lead.” To demonstrate how Warren “proves the credibility of her ideas,” he pointed to Senator Jon Tester of Montana. “People like Jon Tester, who is a prairie populist but not seen as a left-wing ideologue, felt safe not just following her into battle but taking a leadership role” in opposing Summers.

Moderate Democrats won’t all agree that Warren has become the center of the party. But Warren elicits respect from unusual sources, including the man Bloomberg Businessweek once called “Wall Street’s Favorite Democrat”: Congressman Jim Himes, chair of the centrist New Democrat Coalition. Asked about Warren’s presidential prospects—and Walter’s contention that “the Warren platform” could end up as a litmus test for 2020—he said, “I think it’s possible. There’s a lot of energy on the left wing of the Democratic Party.” Though he hails from a district with “a huge amount of financial services,” the congressman offers plenty of praise for one of Wall Street’s harshest critics. “I’ve never sort of tallied it, but I agree with Elizabeth Warren on much of what she says,” he said. “I agree with a lot of what she puts out there.” He added, “The press desperately wants to foment or preserve the notion that there’s this massive split between the Bernie Sanders and Elizabeth Warren gang and the Clinton gang—between the progressives and the moderates—and it’s just not true.”

Himes is quick to draw distinctions between Warren, who is “fundamentally a free markets person,” and Sanders, a democratic socialist. “In as much as we both respect the power of markets in a way that Bernie Sanders may not, I think she’s more mainstream,” he said. “I think she has a perfectly supportable thesis, which is that markets work well when they’re well regulated and generate bad outcomes when they’re not. That’s a perfectly acceptable premise to me. You know, I see her using the word ‘free’ a lot less than I see Bernie Sanders using the word ‘free.’ It may just be that she acknowledges that nothing is free. It’s simply a question of who pays for what.” Himes also sees a difference between how the two senators talk about the “rigged” economy, such as Warren’s reference to “tricks and traps.” “When you talk about ‘tricks and traps,’” he said, “you’re not saying the system is fundamentally corrupt or evil, you’re saying it can be corrupted.”

That’s why Himes sees an “establishmentarian distinction” between them. “She might take offense at this,” he said of Warren, “but I don’t regard her as a non-establishment figure in the way I regard Sanders as a non-establishment figure.”


Josh Barro thinks Warrenism is just what Democrats need. “I think there is a convincing way forward for the party,” the Business Insider senior editor wrote in June. “Roughly, it involves being less like Hillary Clinton and less like Bernie Sanders, and instead being more like Elizabeth Warren.” The party, he added, “should make a list of corporate practices that grind people’s gears and ask whether there’s a compelling economic rationale for them.... And they should explain how doing so will make it easier for people to buy the things they need to live the way they want, with their own earnings. This approach isn’t neoliberal, and it isn’t socialist, either. It’s about treating markets as a means to an end and using the government to ensure those markets serve the interests of regular people. And it doesn’t have to involve growing the government or asking people to trust it with more of their money.”

Jim Kessler, vice president of the centrist Democratic think tank Third Way, opposes this approach—and the idea that Warren represents the new party mainstream. “She represents a wing of the party, and she represents it well,” he told me. The Democrat he thinks best represents the party consensus is the one who’s been preaching bipartisanship and moderation in recent weeks: former Vice President Joe Biden. But Kessler may end up in the minority. Walter told me “there is still a deep love and appreciation for Joe Biden, but his policies, especially if he runs basically where he’s been in the past and where he was with the Obama administration—I don’t know that it would be as popular as where Warren is coming from.”

That’s why her politics are intriguing to Democrats of many stripes. “The contest between Sanders and Clinton reflected progressive populism and liberal feminism,” Raskin said. “Elizabeth Warren is someone who merges them both. You could view her as the synthesis of the divides in the party we had in the 2016 election—a candidate who would leave nothing out and leave nobody behind.” Polling shows her agenda, which overlaps significantly with Sanders’s, isn’t just popular with Democrats. Most voters supported a $15-an-hour minimum wage, according to a Pew survey last year. “Broad, bipartisan majorities support debt-free higher education,” a Demos poll found last October. The notion that the system is rigged in favor of big corporations certainly isn’t out of step with public opinion. Like all progressives, Warren has work to do selling single-payer healthcare, which doesn’t yet have clear majority support. But enthusiasm for Medicare for All is growing among Democrats in Washington and across the country.

Himes has reservations about Warren’s broader appeal. “How would Elizabeth Warren play in Ohio?” he mused on Tuesday. “It’s a huge question, and I’m not sure I have a preconceived notion. On the one hand, I think she has an authenticity and a clear passion that is going to be appealing to a lot of people. How she would manage gun issues that are pretty important in rural Wisconsin, other social and cultural issues, I think is an interesting question.” The question is whether Democrats should even be tailoring their message for places like rural Wisconsin, versus trying to energize a diverse swath of voters across the country. Raskin is fond of saying he has no ambition to be in the political center; he aims to occupy the moral center, and bring the politics to him. That’s a safer stance for a liberal congressman than a presidential candidate, and centrists rightly observe that public opinion hasn’t caught up to a whole host of progressive priorities. But as Kuttner said, mainstreaming “pocketbook populism” is Warren’s great gift, and it’s notable that even moderates like Himes—an ally of Wall Street and leader of an overtly moderate congressional caucus—won’t count her out. “If she can make the leap to being a candidate that played in the rural midwest,” he told me, “it could be really interesting to watch.”

The Toxic Air in California Is a Public Health Crisis
October 12th, 2017, 07:40 PM

“It is completely unsafe to be here at this moment,” said Jennifer Franco, a resident of Fairfield, California, on Wednesday afternoon, as massive wildfires ripped through Santa Rosa and Napa a few miles west. But she wasn’t talking about the flames—she was talking about the smoke. Accelerated by high-speed seasonal winds, ash-laden air was blowing eastward, directly into her neighborhood. “Since Tuesday morning, air quality is beyond terrible,” she said. “I’ve been having chest pain, and now I’m using a respirator.”

In Sebastopol, just west of Santa Rosa, winds were blowing in a more favorable direction. But retired social worker Vaughn Whalen said gray haze still obscured the blue sky there, making the sun look eerie, dull, and orange. “I’m a tougher old fella,” he said, when asked about how he was dealing with the smoke. “But a friend of mine who lives nearby has asthma. She was telling me that the smoke is in the house. Her eyes are burning. Her chest hurts. She has difficulty breathing.”

The most immediate threat from the 22 devastating wildfires currently roaring through California is in the fire zones themselves. At least 21 people have died there; more than 600 people have gone missing; and thousands of buildings have been destroyed. But beyond the fire zones, millions of Californians are facing a secondary, more insidious threat: polluted air, rife with tiny particles small enough to penetrate deep into the circulatory system. Those potentially deadly particles are creating unhealthy air as far as 70 miles away from fire zones, according to Bay Area Air Quality Management District spokesman Tom Flannigan. But people closer to the fire zones are even more at risk, since the air in those regions could also be tinged with toxic heavy metals like arsenic, cadmium, copper, and lead, as the smoke picks up chemicals from burned-up plastic, cars, and building materials.

“These are unprecedented conditions,” Flannigan said, estimating that four to five million people living in the Bay Area are breathing toxic air outdoors. “We’ve measured some of the highest air pollution ratings in the Bay Area ever recorded.” For context, Flannigan pointed to Beijing, China, perhaps the poster city for air pollution. “When they measure in Beijing on their worst days, they’re around 500 on the Air Quality Index,” he said. “And those are the kind of readings we’ve been seeing here.”  

The particles within this smoke pose the biggest short-term risk to human health. It’s been extensively proven that high-dose exposure to so-called “fine particulate” pollution, or PM2.5, can trigger death, particularly in people with pre-existing conditions like asthma or heart disease. And breathing in smoke can make anyone, even healthy people, experience chest pain, dizziness, or shortness of breath. In California, those effects are already turning up: At least 20 people from outside immediate fire zones had visited UC San Francisco’s hospital facilities due to symptoms from smoke inhalation as of Tuesday evening, according to UCSF spokesperson Elizabeth Fernandez.

There’s also a risk that some Californians could suffer longer-term health consequences from this wildfire smoke. “Right now we’re most worried about exacerbations of preexisting diseases,” said John Balmes, a physician and professor of medicine at UCSF. “But with the heavy exposure to air pollutants in the fire zone, people could actually develop asthma from smoke exposure.” Breathing in carcinogens like arsenic from wildfire smoke could also cause cancer, but “that hasn’t been well-studied,” Balmes said.


It may seem like these situations are inevitable or happenstance, that there’s nothing people or the federal government could do to mitigate disaster-related conditions that threaten the health of so many Americans. But that’s just not the case. Indeed, just last week, the House Subcommittee on Environment held a hearing about fire-related air pollution—and in striking contrast to virtually all other environmental concerns, Republican lawmakers are leading the charge to figure out how to stop air pollution from wildfires from getting so bad.

Republican Congressman John Shimkus from Illinois, the chairman of the environment subcommittee, expressed outrage over the unprecedented pollution from wildfires now enveloping the West. “Nearly every other significant source of combustion—from vehicles to power plants to factories—are subject to very stringent controls. But the emissions from wildfires are completely uncontrolled,” he said. “Congress should be looking at any and all ways to address wildfires and their air emissions, and most important of all, the policy measures that can help prevent or minimize wildfires in the first place.”

Obviously, the federal government cannot control wildfire smoke pollution the same way it controls pollution from coal plants or cars—forests do not choose to release smoke into the atmosphere. But serious discussions can be entertained about how to reduce the risk and intensity of fires through better management of forests, particularly forests on public lands. “Often, the largest and most polluting fires originate on or involve federal lands,” Shimkus said during last week’s hearing.

UCSF physician John Balmes agrees. That’s why he argues for better funding for the U.S. Forest Service, which oversees national forest land. Oftentimes, he said, the agency does not have the money to do its job. “Their budget has to be supplemented every year for the funds to fight fires, and they usually don’t have money to do preventative maintenance tasks like clear out underbrush and get rid of dead trees,” Balmes said. Insufficient funding also means that Forest Service officials are short of resources to carry out prescribed or controlled burns, which reduce the risk of extreme wildfires in the future.

Predictably, however, enhanced funding for the Forest Service is not a Trump administration priority. Quite the contrary: The White House’s proposed budget would cut $300 million from the Forest Service’s wildfire fighting initiatives, together with $50 million from its wildfire prevention efforts. The proposed budget would also reduce funding for volunteer fire departments by 23 percent. And even with the California GOP delegation in Congress determined to shield their constituents from intense wildfire air pollution, most Republican lawmakers still refuse to acknowledge that man-made climate change is making these wildfire seasons worse. Even Shimkus, for all his eagerness to regulate the adverse effects of wildfires on air quality, is a climate denier: He has said God would never let climate change ruin the earth. 

What’s certain is that climate change will ruin humans’ ability to inhabit the Earth. And the air pollution from wildfires is just one among a growing roster of weather-related disasters wreaking havoc on humanity. (See also: Puerto Rico, Texas, and Florida, over just the past month-plus.) “We’re already seeing an increase in catastrophic wildfires in California, and it’s only going to get worse as the climate gets warmer and drier,” Balmes said. So long as Trump and congressional Republicans deny climate change—and therefore do nothing to slow its impacts—the health hazards from those wildfires only stand to get worse as well.

Why Saudi Arabia Lifted Its Ban on Women Drivers
October 11th, 2017, 07:40 PM

I learned to drive on the streets of Karachi. My instructor, per my father’s insistence, was a woman. Every day she showed up in the special instructor car fitted with a brake and a clutch (most cars in Pakistan had standard transmission) on her side. She was not very much older than me but she was, I soon learned, unafraid to ply the streets of the city with smarts and gumption I did not have. Once, when the car got stuck in standing water after a Karachi rainstorm, she told me to open the driver’s side door and look outside. If the water level was a couple of inches below the door, it was okay to drive. It was, and I drove.

Reading Manal Al-Sharif’s memoir Daring to Drive: A Saudi Woman’s Awakening took me back to those days and reminded me of how contagious courage can be. In 2011, Al-Sharif got behind the wheel of her brother’s car and dared to drive on a Saudi street. She was not the first to protest the Kingdom’s driving ban: Forty-seven women were arrested for doing the same in 1990. But unlike her forerunners, Al-Sharif recorded it all and uploaded it to YouTube. Within hours, a horde of Saudi secret police were deployed to her apartment. There they stayed until the early hours of the next morning, when they carted her off. Al-Sharif would spend the next nine days inside a Saudi women’s prison, a fate usually reserved for the many migrant women who serve Saudis rather than for Saudis themselves. She would emerge from the ordeal a transformed woman, determined to coax the freedom to drive from the oppressive Saudi state.

DARING TO DRIVE: A SAUDI WOMAN’S AWAKENING by Manal Al-Sharif. Simon & Schuster, 304 pp., $26.00

Last Tuesday, she won. In a Royal decree issued by the King, the ban on driving was lifted; Saudi women would be able to apply for a driver’s license and drive without the presence of male guardians in the car. The qualms of clerics, who had tried for decades to deny women the freedom to drive, had been set aside by the only man who could: the King himself. In the op-ed she wrote following the King’s decree, Al-Sharif, who now lives in Brazil, declared that she could not wait to drive in Saudi Arabia.

Al-Sharif did not set out to be a rebel. As she confesses, the most rebellious thing she did in her youth was to get a job. But even that was a big step. No other woman in her family had ever done so, and her parents, unsure how to handle it, kept the fact a secret from other family members. It is a telling omission, one that reveals just how unusual Al-Sharif’s family was. Her father was a taxi driver who ferried pilgrims between the holy sites of Mecca, where the family lived in an apartment on the edge of a slum. The smell of sewage wafted in the air and relatives mostly stayed away. It was not simply because of their poverty; not only was her father a less than abundant provider, he had a foreign wife. Al-Sharif’s mother was Libyan, a fact that neither she nor her children were ever permitted to forget—cousins referred to her disparagingly (and inaccurately) as “the Egyptian.” This experience, at the margins of belonging, gave her reason to be less than satisfied with the status quo. As she notes frequently in interviews, “not all of us live luxurious lives… spoiled like queens.” In her case, this was definitely true.

But while being an outsider may have left Al-Sharif sensitive to the impact of the Kingdom’s subjugation of Saudi women, she gives little thought to the foreign women who suffer under the same system. When Al-Sharif is packed away to prison, the women inside scream, “You’re Saudi? You’re Saudi?” These women are largely maids and other domestics, who’ve been swept from their homes—in Sri Lanka, the Philippines, Indonesia, Somalia, and India—by global inequality to work in the homes of wealthy Saudis. They are shocked to see a Saudi woman there. Only seven of the 168 women in the prison are Saudi, and of those, four were held in “temporary detention,” not serving out a sentence. Al-Sharif doesn’t comment on this statistic, and during her agonizing nine days she consorted mostly with another Saudi woman.

It is a small quibble but an important one. The designation “Saudi” is, in the Kingdom, a nativist one, available generally only to those whose fathers and grandfathers have been Saudi “citizens.” Obviously this means that the Kingdom’s many migrant workers, like the women Al-Sharif encounters in prison, have no citizenship and hence few rights. Several have been executed in recent years, some of them on mere accusations of having killed a child in their charge, without trials or investigations or lawyers.


Al-Sharif makes one of her most astute observations when she constructs a genealogy of the Kingdom’s crackdown on women’s rights. She traces the country’s stringent laws to the siege of Mecca in 1979, when a group of rebels took over the Grand Mosque just before the Grand Mufti was about to lead a congregation of 50,000 pilgrims in prayer. The rebels, many of whom came from the clerical establishment, were led by Juhayman al-Otaybi, who alleged that the ruling family had strayed too far from Islamic teachings. A bloody siege followed, and when the mosque was stormed, over 250 people were already dead, with over 500 injured. To get the clerics back on their side, the Saudi royal family agreed to implement an even more austere Islam, the Salafist ideology that the Kingdom would impose on its own people and export far and wide. Women bore the brunt of this. Their images were censored from every publication and public space. Baton-wielding religious police on patrol became a familiar sight. And of course women were completely banned from driving cars.

If, as Al-Sharif alleges, the elimination of women from the public sphere was the result of strategic imperatives—an attempt to stanch the Pan-Arabism that was spreading across the Middle East, by getting the country’s powerful clerics on side—one cannot help but wonder if recent freedoms are born of similar considerations. The drive toward self-expression and the desire for democracy have plunged so many of the Kingdom’s neighbors into wars, a fate the monarchy has every reason to want to avoid by handing out small freedoms. In another attempt to appease the populace and institute a kind of protectionism, the Saudi Vision 2030 plan seeks to remove all non-Saudis from government jobs by 2020 and deport thousands of illegal workers. In this new Saudi Arabia, free of expatriates and foreign workers, women will need to work—and to drive. Just one day after the announcement of the repeal of the driving ban, a woman was appointed Deputy Mayor of Khobar, the same city where Al-Sharif was detained before being carted off to prison.

Amid the childhood stories and memories Al-Sharif recounts in the early pages of Daring to Drive is an account of her favorite fable. It is the story of a young prince under the tutelage of his older teacher. One day the teacher, suddenly and unexpectedly, strikes the prince. The prince is shocked, and silently vows to avenge this wrong when he is king. When the day finally arrives, he asks the teacher why he had slapped him. The teacher replies that he did it because he wished the prince to have the experience of injustice when he was young, so he would understand how his subjects felt when he became king.

There can be little doubt that Al-Sharif’s early experience of exclusion made her a valiant activist. But a feminist awakening requires revolt not only against the wrongs done to oneself or to one’s own kind, but also against those done to people whose humanity is scarcely recognized within Saudi Arabia and the wider global economy. When Manal Al-Sharif finally, victoriously, drives her car on Saudi soil for the very first time, perhaps she will consider driving to the prison where she was taken, where hundreds of women remain, without any hope of freedom.

Long Divisions
October 11th, 2017, 07:40 PM

A generation ago, as the culture wars raged, Toni Morrison often stood at the front lines, demanding the desegregation of the American literary canon. In her Tanner Lectures in 1988, and later in her book Playing in the Dark, she argued against a monochromatic literary canon that had seemed forever to be naturally and inevitably all-white but was, in fact, “studiously” so. She accused scholars of “lobotomizing” literary history and criticism in order to free them of black presence. Broadening our conception of American literature beyond the cast of lily-white men would not simply benefit nonwhite readers. Opening up would serve the interests of American mental as well as intellectual health, since the white racial ideology that purged literature of blackness was, Morrison said, “savage.” She called the very concept of whiteness “an inhuman idea.”

THE ORIGIN OF OTHERS by Toni Morrison. Harvard University Press, 136 pp., $22.95

In her new book, The Origin of Others, Morrison extends and sharpens these themes as she traces through American literature patterns of thought and behavior that subtly code who belongs and who doesn’t, who is accepted in and who is cast out as “Other.” She has previously written of how modernist novelists like William Faulkner (who saw race) and Ernest Hemingway (who did not) respected the codes of Jim Crow by dehumanizing black figures or ignoring the connotations of blackness in their nonblack figures. But the process of exiling some people from humanity, she observes here, also ranges beyond American habits of race: One need only look at the treatment of millions now in flight from war and economic desperation. Othering as a means of control is not just the practice of white people in the United States, for every group perfects its self-regard through exclusion.

Morrison anchors her discussion of these complexities in her personal experience, recounting a memory from her childhood in the 1930s: a visit from her great-grandmother, Millicent MacTeer, a figure of enormous power whose skin was very black. On her arrival, MacTeer looked at Toni and her sister, two girls with light skin, and pronounced them “tampered with.” Colorism ordinarily refers to black people’s denigration of dark skin and preference for people who are light, but in this case it meant, more broadly, a judgment based on skin color. “It became clear,” Morrison writes, “that ‘tampered with’ meant lesser, if not completely Other.” Deemed “sullied, not pure” as a child, Morrison finds that Othering, as well as the racial self-loathing of colorism, begin in the family and connect to race, class, gender, and power.

Morrison’s history of Othering represents an intervention in history on several fronts. Although the theme of desegregating the literary canon reappears in The Origin of Others, times have changed since Playing in the Dark. Surely thanks to the more multicultural, multiracial canon that Morrison helped foster, no respectable version of American literature today omits writers of color. Morrison herself has received nearly all the honors a novelist can win: the Pulitzer Prize for fiction, the Nobel Prize in Literature, the Presidential Medal of Freedom, and the French Legion of Honor, among many more. The Origin of Others is the result of her lectures in the prestigious Charles Eliot Norton series at Harvard University, where she is only the fourth woman and the second black lecturer in the 92-year history of the series.

Within the Norton Lectures’ tradition of wisdom, and among its tellers, Morrison represents a novelty by virtue of her gender, her race, and her American subject matter. Historically, the series has shown a preference for European topics and for British scholars as avatars of learning. Not until 2014, when Herbie Hancock addressed “The Ethics of Jazz,” did the Norton recognize wisdom in the humanities as both pertaining to American culture and emanating from a black body. Morrison’s lectures and book are a historic achievement, as they confirm the impact of her intellectual tradition in American thought—a tradition that links her to James Baldwin, and, in a younger generation, Ta-Nehisi Coates, in the critique of whiteness.


Morrison’s earliest witnesses of Othering are two women who had been enslaved, Mary Prince and Harriet Jacobs, both of whom later recorded their physical and mental torture at the hands of their owners. In her 1831 memoir, Prince described her owner’s reinforcement of hierarchy through beating; her master “would stand by and give orders for a slave to be cruelly whipped…walking about and taking snuff with the greatest composure.” Thirty years later, Jacobs wrote of how slavery made “the white fathers cruel and sensual; the sons violent and licentious.” Within slavery, the process of Othering is physical, and is meant to work in only one direction, from the slaver to the slave.

Morrison asks instead, “Who are these people?”—focusing not on the victimized enslaved, but on the victimizing owners. “The definition of the inhuman describes overwhelmingly the punisher...the pleasure of the one with the lash.” Rendering the slave “a foreign species,” Morrison concludes, “appears to be a desperate attempt to confirm one’s own self as normal.” Humanity links the enslaved and the enslaver, no matter how viciously owners seek to deny the connection. Torture, the crucial ingredient of slave ownership, dehumanizes not the slave but the owner. “It’s as though they are shouting, ‘I am not a beast! I’m not a beast!’ ” Neither side escapes unscathed.

Even when physical force is used, the people doing the Othering can also bolster their self-definition through words. Thomas Thistlewood, an English planter and rapist who moved to Jamaica in 1750, documented his assaults on the women he owned, categorizing those that took place on the ground, in the fields, and in large and small rooms, whenever, wherever he wished. He noted the rapes in his journal in Latin. Harriet Beecher Stowe’s novel Uncle Tom’s Cabin takes a very different tone, defining the Other by making a romance of slave life. Stowe presents a slave’s cabin through dulcet description that Morrison calls “outrageously inviting,” “cultivated,” “seductive,” and “excessive.” Here, a white child can enter black space without fear of the dark, the very sweetness of the language reinforcing the Otherness of places where black people live.

Othering is expressed through codes of belonging as well as difference. Most commonly, pronouns convey the boundaries between “we” and “them” through the use of first- and third-person plurals. “We” belong; “they” are Other and cannot belong. Those who are “them” can be described in the negative language of disgust: black as ugly, black as polluting. Definitions of color, Morrison says, define what it means to be an American, for belonging adheres to whiteness. The possession of whiteness makes belonging possible, and to lack that possession is not to belong, to be defined as something lesser, even something not fully human. Neither possession nor lack is natural or biological. Something has to happen; a process needs to get underway.

Flannery O’Connor’s story “The Artificial Nigger,” set in 1950s Georgia, well after the end of the slavery that kept people in place, exemplifies how Othering and belonging work in tandem. A white man, Mr. Head, and his grandson Nelson visit Atlanta for the day. Mr. Head, a poor and sad old man, undertakes to tutor Nelson in racial hierarchy. On the train to the city, a prosperous black man passes by. At first, Nelson sees “a man.” Then, under Mr. Head’s questioning, “a fat man…an old man.” These are wrong answers. Nelson must be educated. Mr. Head corrects him: “That was a nigger.” Nelson must undergo the process of unseeing a well-dressed man and reseeing a “nigger,” to understand the man as Other and himself and his grandfather as people who belong to society.


Blackness remains the great challenge to writers of fiction on all sides of the color line, for the central role of race in American Othering affects us all, white and nonwhite, black and nonblack, not just writers who are white. Morrison describes her own struggles with color codes in her work, notably in her novels Paradise (1997) and Home (2012), and her story and play Recitatif (1983). “Writing non-colorist literature about black people,” she writes, “is a task I have found both liberating and hard.” Non-colorist literature does not make racial identity do the work of character creation. Characters may have racial identities—in the USA, race is too salient a part of experience to overlook. But race should not decide how a character acts or thinks or speaks or looks.

Morrison articulates her determination “to de-fang cheap racism, annihilate and discredit the routine, easy, available color fetish, which is reminiscent of slavery itself.” But it is far from easy. The actors in Recitatif, like editors and many readers, want to identify characters by race—a crucial ingredient of American identity, but one defined by generalizations rooted in the history of slavery and too facilely evoked through recognizable stereotypes. Racial identification, invented to serve needs of subjugation, can diminish a character’s individual specificity, that hallmark of Morrison’s brilliance.

Where Morrison identifies race, she struggles against the expectations of race. Paradise begins with color—“They shoot the white girl first.” But she never says which of the women in the group under attack is white, and offers almost no clues. (“Some readers have told me of their guess,” Morrison reveals in The Origin of Others, “but only one of them was ever correct.”) Paradise turns to themes of black colorism’s purity requirements and misogyny, the deadly means of Othering that Morrison’s characters employ. Colorism appears early in the novel, entwined with wealth: in 1890, members of an established black community turn away a group of freedmen deemed too poor and too dark. The freedmen go on to found the town of Haven and its successor, Ruby, and from that moment up to the novel’s present in the 1970s, they pride themselves on their unadulterated blackness. Nearby, a group of women, seeking refuge from unhappy pasts, move into an old convent. One source of the Ruby men’s murderous hatred of the women is their racial heterogeneity—their utter lack of racial purity. But that is not the only source: In Paradise, misogyny fuels the hatred that kills.

Looking back on Home, Morrison admits to misgivings. It was a mistake, she concludes, to accede to her editor’s request for color-coding the main character, Frank Money. A minor mistake, for Money’s race only appears obliquely, after a two-page description of the hospital he is leaving. A reader would have to know that in the tiny AME Zion church that succors Money, AME means African Methodist Episcopal. A few pages later, the reader would need to grasp the meaning of the fact that he “won’t be able to sit down at any bus stop counter.” If Morrison lost the struggle between individual characterization and racial identification, which not only flattens out characters but also furthers racist habits of thought, it was just barely. Throughout her career, Morrison has confronted those habits and broken them down, not just in her own writing but also in her work as an editor.

In her 19 years at Random House, Morrison made known the stories of a variety of specific lives and their individual identities. She published biographies of the writer Toni Cade Bambara, the activist-scholar Angela Davis, and the athlete Muhammad Ali. In 1974, she published a nonfiction anthology: The Black Book, a scrapbook of black history drawn from the collection of Middleton A. Harris, who also served as its editor. There readers discovered photographs of black soldiers in impeccable uniforms, black families in their Sunday best, patents for typewriters and laundry machines, and early black movie stars, along with postcards of smiling white people at a lynching. The abundance and variety of material relating to the history of people of African descent in The Black Book opened millions of eyes to diversity within blackness, a crucial step in loosening the grip of American apartheid.

One of Morrison’s major novels was inspired by an 1856 article she found in The Black Book. Titled “A Visit to the Slave Mother Who Killed Her Child,” the article presented an interview with the fugitive slave Margaret Garner, who had murdered her youngest child after she and her family were captured in Ohio. Garner’s mother-in-law did not condemn the infanticide. Rather she condoned an act that saved a child from enslavement. The figure of the supportive mother-in-law fascinated Morrison and formed the basis for the character Baby Suggs, the un-churched folk preacher of black self-love, in her 1987 novel Beloved. Beloved won the Pulitzer Prize for fiction and the American Book Award; Oprah Winfrey made the novel into a movie. Embedding the emotional costs of enslavement in Morrison’s powerful language, Beloved spoke American history at the level of heart and gut, transforming the institution of slavery into tragedy with resonance for every reader and moviegoer. The novel and the movie communicated to everyone who loved their family the anguish of enslavement, of knowing your children were not yours at all.


What places The Origin of Others in this very moment of twenty-first-century American history—a moment that, sadly, has much in common with earlier awful times—are two texts Morrison quotes at length. One is a testimony of lynchings committed in America in the early twentieth century. The other comes from Baby Suggs’s sermon to her people in Beloved.

The testimony of lynchings continues for the better part of two pages. This is only a small portion of it:

Ed Johnson, 1906 (lynched on the Walnut Street Bridge, in Chattanooga, Tennessee, by a mob that broke into jail after a stay of execution had been issued).

Laura and L.D. Nelson, 1911 (mother and son, accused of murder, kidnapped from their cell, hanged from a railroad bridge near Okemah, Oklahoma).

Elias Clayton, Elmer Jackson, and Isaac McGhie, 1920 (three circus workers accused of rape without any evidence, lynched in Duluth, Minnesota; no punishment for their murders).

Raymond Gunn, 1931 (accused of rape and murder, doused with gasoline and burned to death by a mob in Maryville, Missouri).

Here is Baby Suggs, the mother-in-law figure in Beloved, as quoted in The Origin of Others:

“Here,” she said, “in this here place, we flesh; flesh that weeps, laughs; flesh that dances on bare feet in grass. Love it. Love it hard. Yonder they do not love your flesh. They despise it. They don’t love your eyes; they’d just as soon pick em out. No more do they love the skin on your back. Yonder they flay it. And O my people they do not love your hands. Those they only use, tie, bind, chop off and leave empty. Love your hands! Love them.”

To these, I would add a list currently circulating on Facebook of police shootings for which no one has been convicted of murder. As of late summer, the list looked like this, but, as we know, it is tragically subject to additions at any time:

#PhilandoCastile = No Conviction

#TerenceCrutcher = No Conviction

#SandraBland = No Conviction

#EricGarner = No Conviction

#MikeBrown = No Conviction

#RekiaBoyd = No Conviction

#SeanBell = No Conviction

#TamirRice = No Conviction

#FreddieGray = No Conviction

#DanroyHenry = No Conviction

#OscarGrantIII = No Conviction

#KendrecMcDade = No Conviction

#AiyanaJones = No Conviction

#RamarleyGraham = No Conviction

#AmadouDiallo = No Conviction

#TrayvonMartin = No Conviction

#JohnCrawfordIII = No Conviction

#JonathanFerrell = No Conviction

#TimothyStansburyJr = No Conviction

These lists, and Baby Suggs’s sermon, capture the physical peril of existing in the United States in a body that is black, and the deep and long tradition of black hating and black murder. And in doing so, they address a persistent theme in the writing of two other authors who play a part in The Origin of Others, one by name, one as a presence.

Ta-Nehisi Coates, author of Between the World and Me, the phenomenally best-selling personal statement in the guise of a letter to his teenage son, provides the foreword to The Origin of Others. Toni Morrison provided a blurb for Coates’s book: “I’ve been wondering who might fill the intellectual void that plagued me after James Baldwin died. Clearly it is Ta-Nehisi Coates.” Coates says Morrison’s endorsement was the only one he craved. Morrison recognized in Coates, the cultural critic Michael Eric Dyson has written, a quality that she prized in Baldwin, and that we can see in her own work: “a forensic, analytical, cold-eyed stare down of white moral innocence.” Coates cites Baldwin’s 1963 essay The Fire Next Time as a crucial inspiration, in form and in tone, to Between the World and Me.

Born in 1924, Baldwin serves as the intellectual ancestor to both Morrison and Coates, as tribune of the themes of violence against black people and of the process by which European immigrants came to see themselves as white people in America. Baldwin began The Fire Next Time with a letter to his 15-year-old nephew, James, accusing his fellow Americans of the unforgivable crime of having destroyed and continuing to destroy thousands of black lives without knowing and without wanting to know. Coates in 2015 writes to his then 15-year-old son that “in America, it is traditional to destroy the black body—it is heritage.” Morrison in 2017 adds, as I quoted above: “The necessity of rendering the slave a foreign species appears to be a desperate attempt to confirm one’s own self as normal.” Both echo Baldwin’s 1984 short essay, “On Being White…and Other Lies,” first published in Essence, a magazine for black women. Refocusing black discourse from black subjects to whites, Baldwin made an early contribution to what would become whiteness studies at the end of the twentieth century.

Among the first to critique whiteness, James Baldwin serves as an intellectual ancestor to Morrison. (Ralph Gatti/AFP/Getty)

It’s perhaps inevitable that such prominent authors would come to be seen as representatives of the entire community of black Americans. Coates says he speaks only for himself. Still, his vast audience demands spokesmanship from him. Morrison has embraced the responsibility, welcoming the message that her Nobel Prize belonged not only to her, but to black women writers generally. Certainly Baldwin embraced the role of black spokesman in the 1960s with a passion that sometimes moved his audiences profoundly—as in a Cambridge University debate with the conservative William F. Buckley in 1965. When Baldwin appeared, impassioned, on The Dick Cavett Show in 1968, the host and other guests remained stolid and inert, even looking away in discomfort. The video is painful to watch, but instructive in the history of white American willful unknowing.

American culture has changed: Whether writing as oneself alone—Coates—or speaking for a people—Morrison—these two black writers have reaped the named and remunerated honors that are their due. Baldwin, who died in 1987, did not share their good fortune, despite informal recognition of his work’s fundamental importance and utter necessity. Between Baldwin and Coates, Morrison forms the keystone in an arch from neglect to celebration. This is not by accident or automatic recognition of genius. Historical agency, the action of protest, disrupted the withholding that was Baldwin’s fate. Activism hoisted Morrison’s reputation into its rightful place.

In the aftermath of Baldwin’s death in 1987, 48 prominent black poets, novelists, and scholars took note of his fate and demanded redress. In a letter published in The New York Times, they protested Baldwin’s neglect and insisted it not be repeated. They focused attention on the literary establishment’s ongoing habit of ignoring black writers, and pointed to the need to support another distinguished black author who had been denied commensurate honors: Toni Morrison, whose Beloved had recently lost the National Book Award. After the letter appeared in the Times, things did start to change. Beloved received the Pulitzer Prize, and Morrison’s work was never again disregarded. Morrison, in that moment, became a historical event. With the recognition of her writing and her whole tradition, America was opening up and offering black Americans and black authors belonging.


In the history that connects Baldwin to Morrison and Morrison to Coates, much has been gained in terms of literary reception. At the same time, however, something that distinguishes Morrison’s fiction has been diminished: women and gender. Not entirely lost, for in The Origin of Others, Morrison discusses her novels of women—notably Paradise and A Mercy. (She might well have added another of her major women-centered works, Sula.) She also cites the woman-to-woman relationship of motherhood that binds Sethe and Baby Suggs in Beloved. But to the extent that they complicate the racial Othering that the two male writers also treat, these themes lose sharpness.

In Paradise, Morrison touches on the scapegoating of the women who live in the Convent. But the murder described in the opening pages is a murder of women by men, a brutal act of woman-hating that cannot be explained purely by race or by the line—“They shoot the white girl first”—that opens the book so sensationally. A Mercy begins and ends with a mother’s relinquishment of her daughter to the American domestic slave trade that tore more than a million people, many of them children, from their families. The mother’s act can be partially explained by the history of the Atlantic slave trade, for the mother, an African captive, had been raped on her arrival in Barbados. Historical explanation, however, neglects the child’s emotional meaning and the centrality of women in Morrison’s work. In the novel, the child becomes the protagonist. Only at the end does she seem to understand the circumstances of her abandonment and drop the bitter thread running through the narrative.

Beloved, adapted for film in 1998, helped open up the literary canon. (Touchstone Pictures/Photofest)

Morrison’s depiction of women, of motherhood, of misogyny, of hatred and self-hatred within and around race constitutes the foundation of her genius as a writer and thinker. Nearly all Morrison’s protagonists are women whose identities and narrative trajectories fill entire fictional universes. A universe of women emerges most clearly in Paradise, in the community of lost and broken women who come together in the Convent and heal themselves, free of men’s oversight. The women in the Convent are Othered through race, but as women, they create their own belonging, which proves their undoing. Free women enrage the men of Ruby, whose “pure oil of hatred” clarifies the “venom” they feel toward the women. Bent on murder, the men attack with “rope, a palm leaf cross, handcuffs, Mace and sunglasses, along with clean, handsome guns.”

The Origin of Others combines Toni Morrison’s accustomed eloquence with meaning for our times as citizens of the world. But the breadth of her humanist imagination emerges most gloriously from her magnificent fiction, in which women play leading roles, in which social and racial identities influence but never determine individual character; her novels guide our understanding of how both race and gender inflect experience without diminishing psychological uniqueness. Although her lectures and the race-centered tradition of James Baldwin and Ta-Nehisi Coates are crucial to understanding her thought, they cannot contain her extraordinary vision of human Othering and belonging.

Clintonian Democrats Are Peddling Myths to Cling to Power
October 11th, 2017, 07:40 PM

It was every bit as predictable as death, taxes, and alarming tweets from @realDonaldTrump. In August, the small-but-powerful remnant of big-money moderation in the Democratic Party, shaken but not stirred by Hillary Clinton’s defeat in 2016, officially declared war on the party’s left-leaning rank-and-file. Rising from the ashes of the defunct Democratic Leadership Council, the “centrist” movement that took over the Democratic Party after its three straight presidential defeats in the 1980s—and erased the last vestiges of New Deal liberalism from American political discourse in the name of winning elections—came a “new” effort called New Democracy, touted as a vehicle for “rethinking” the party’s message after its history-making loss to Donald Trump. But this grand reassessment, led by DLC co-founder Will Marshall and his K Street band of brothers, was merely a reassertion of the wealth-first economics, go-slow social progressivism, and hawkish foreign policy peddled by white Democratic power-brokers and Clintonian neoliberals for three decades now.

In a political age defined by two strains of populism—Trump’s on the right, and Senator Bernie Sanders’s on the left—New Democracy should be viewed by any sentient political observer as little more than a risible relic with a fancy budget. The most prominent Democratic politicians who’ve jumped on board are anything but prominent: John Hickenlooper, Colorado’s business-first governor; Tom Vilsack, the former Iowa governor and Clinton cheerleader; and New Orleans Mayor Mitch Landrieu. But the old organs of the Washington establishment still take these people seriously, and otherwise intelligent Democrats still have a strange Pavlovian response to the dire warnings they issue, like clockwork, every four years: Embracing liberalism will always and forever end in defeat (even if Barack Obama disproved that theory not once but twice).

And so, last week, the Washington Post published an op-ed that disarranged the nerve endings of timorous liberals across the land: “Trump Is on Track to Win Reelection,” by professional Clintonite Doug Sosnik. (The last time Sosnik ventured such a bold prediction was in June 2016, when, even before the party conventions, he declared the election “already decided”—in Clinton’s favor.) Matt Yglesias at Vox, among others, hopped aboard this memetic bandwagon in subsequent days, offering his own reasons why Trump is “on track to win in 2020.”

Sosnik made one valid point: Trump’s “dismal” poll numbers nationally don’t reflect his standing in the battleground states that lifted him to victory in 2016. And indeed, the president’s numbers in Ohio and Michigan and Pennsylvania and Wisconsin are a bit less dismal than elsewhere. But Sosnik’s argument has little to do with facts. Toward the end of an article built on tortured logic and tendentious claims (“Trump enters the contest with a job approval rating that is certainly at least marginally better than what the national polls would suggest”), he finally comes around to his money shot: “So for Democrats and others who want to beat Trump, unifying behind one candidate will be essential.” Translation: Let the old, white, Democratic establishment pick its favorite for 2020, and everybody else get in line. Or else.

Or else what? Alan Greenblatt, a staff writer for Governing magazine and former NPR correspondent, provided an answer in Politico Magazine on Sunday with one of the most ludicrous pieces of political analysis you’ll find this side of Breitbart: “Are Democrats Headed for a McGovern Redux?” If that question sounds awfully familiar, that’s because it is. The “no more McGoverns” argument has been recycled and appropriated by anti-liberal Democrats—with nips and tucks to suit the needs of the moment—in practically every presidential election since 1972. They wielded it like a tiki torch against Jesse Jackson’s populist insurgency in 1988, and invoked it to torpedo Howard Dean in 2004. And after its ironclad logic failed to derail Barack Obama in 2008, the “McGovern threat” was revived with a vengeance against Sanders in 2016.

The goal of these disinformation campaigns has always been the same: to frighten the left into falling in line with the moneyed masters of the party. And at a moment when the party is finally abandoning the New Democratic formula—suck up to big business and the military-industrial complex, pander to white supremacy, and win!—fear-mongering is the only thin reed of hope the “moderates” have to retain their supremacy in the party.


For any Democrat of any stripe, the threat of a Trump reelection—and a second term free of any need to retain even his current 34 percent approval rating—is genuinely terrifying. By reviving the hoary old arguments about why McGovern lost to Nixon in one of the biggest landslides in American history, the old New Democrats aim to once again scarify a majority of Democrats into reluctantly backing a neoliberal championing wealth-first (sorry: “middle class”) economics and a bloodthirsty view of American power on the international stage.

But the notion that 2020 will bear any resemblance to 1972 is built on a foundation of counterfactual history and willful misreading of contemporary politics. The logic runs thus: Because Trump is the second coming of Nixon—“the avatar of white cultural-grievance politics,” as Greenblatt puts it—we must learn the lessons of how Tricky Dick managed to win reelection. The most important lesson, of course, is that the Democrats veered too far left after the narrow defeat of an establishment candidate (Hubert Humphrey, now played in this movie by Hillary Clinton) in 1968. Fueled by a “raging enthusiasm among younger voters”—yesterday’s Bernie Bros—liberals “made strategic errors that Democrats today appear hellbent on repeating,” Greenblatt asserts. They doomed themselves by turning to “the ultra-liberal Senator George McGovern”—now played by Bernie Sanders—thereby steering “a course too far from the country’s center of political gravity.”

There is barely a smidgen of truth to any of this—a fact that Greenblatt, who seems to possess some sense of fairness (or shame), highlights throughout his piece with one “caveat” after another. “Politics today are much different than they were then,” he admits early on, “as is the shape of the electorate. But there are parallels that Democrats should bear in mind as they nurse their hopes of driving Trump from the White House.” (Dear God: If all we can do is “nurse” a “hope” of defeating the least-popular and most inept president in American history, maybe we really are doomed.)

In the world of reality, President Trump bears about as much resemblance to President Nixon as he does to President Lincoln. It’s understandable, given the completely unprecedented nature of Trump’s political rise, that Americans were left grasping for the nearest analogy they could find. And it’s true that Nixon was fatally corrupt, paranoid, and socially awkward; he used racial code to exacerbate white people’s resentments and win their votes. Also, he probably committed treason, secretly torpedoing a peace agreement with North Vietnam, to get himself elected in 1968—just as Trump may have done by colluding with Russia in 2016.

The similarities end there. Nixon was an intellectually gifted, up-from-the-bootstraps product of a hardscrabble childhood, not a spoiled and ignorant child of privilege. In his long political career—congressman, senator, vice president, three-time presidential nominee, failed candidate for governor of California—he outworked, out-plodded, and out-strategized nearly everyone in his path to the White House (except for John F. Kennedy). His politics were in most respects the polar opposite of Trump’s: Nixon despised the “damned far right” of his party (though he certainly didn’t hesitate to pander to it) just as much as he loathed “the far left.”

And once he reached the White House, Nixon governed as what National Review’s John Fund called “the last liberal.” He signed the Clean Air Act, created the EPA and OSHA, imposed the “alternative minimum tax” the wealthy hate so passionately, and (smelling salts for Ron Paul, please!) took America off the gold standard. His first term produced more landmark progressive legislation than the 16 years of Bill Clinton and Barack Obama combined. And Nixon wanted to go further: He called for a minimum guaranteed income for all Americans—an idea too liberal for most liberals—and in 1972 ran on a “comprehensive health insurance plan” for all, including government subsidies for those who needed them. In the international realm, which Nixon knew and cared about the most, he barbarically and cynically ramped up the war in Vietnam—but he also negotiated a landmark arms-reduction treaty with the Soviet Union and opened relations with “Red China.”

He was nothing like Trump, in other words. As Nixon historian Rick Perlstein writes, “People want to grasp for the familiar in confusing times, but it’s often just an evasion of the evidence in front of them.” Elsewhere, he’s spelled it out more plainly: “Trump is Trump, people! TRUMP!”

Similarly, McGovern was McGovern—a “prairie populist” and war hero who ran to the left on a peace platform to secure his unlikely nomination. He conducted a general-election campaign that was a righteous mess from the get-go, overseeing a chaotic Democratic Convention that looked terrible on television and hastily tapping a moderate Democrat from the heartland, Thomas Eagleton, as his running mate, without the least bit of vetting. When Eagleton was forced to admit he’d undergone electroshock therapy twice in the 1960s, McGovern made matters worse—and wrecked his reputation as the one honest politician in Washington—by first backing him “one thousand percent,” then coldly dumping him from the ticket.

The Democrats of 1972 were a party with far deeper fissures than the current edition, with its intramural squabbles between the center-left and the actual left. The Democratic power-brokers of the day—authoritarian bosses like the AFL-CIO’s George Meany and Chicago Mayor Richard Daley, along with foreign-policy hawks still gung-ho for the Vietnam War—were hardly “centrists.” (“They’ve got six open fags,” Meany famously complained about the New York delegation at the DNC.) This “establishment” sat out the election once McGovern became the Democratic standard-bearer, denying their own party’s candidate the funding and get-out-the-vote machinery that he had to have.

But what about Nixon’s Machiavellian recasting of McGovern as the candidate of “acid, amnesty, and abortion”? Doesn’t that sound like a Trumpian trick, just the kind of thing he might do to a Sanders or a Warren in 2020? Sure, except for the fact that it was moderate Democrats—namely, Eagleton himself—who actually originated that slander and bequeathed it to Nixon.

Nixon was the avatar of the “silent majority” of resentful whites, but he didn’t make it a pet phrase until he was already in the White House, and it wasn’t the main thing that got him elected or reelected. He won in 1968 by making a convincing (though dishonest) case that he was more likely than Humphrey to bring the Vietnam War to a speedy halt. And in 1972, he ran on an impressive record of progressive domestic policies, a landmark arms-reduction treaty with the Soviet Union, and the historic thawing of relations with China. Again, emphatically: not Trump.


The old New Democrats know perfectly well that the chances of Trump winning reelection in 2020 are approximately as good as the Democratic nomination going to Kanye West, with Kim Kardashian as his running mate. They know there’s no valid analogy to be drawn between Nixon and Trump, or between McGovern and the leading lights of the contemporary left, or between 1972 and the likely political climate of 2020. That’s why they’re scare-mongering again—because they are scared. Not of the depredations of the Trump presidency so much as the near-certainty that the next Democratic nominee will run on single-payer health care, progressive taxation, queer rights and abortion rights, and an anti-imperialist foreign policy. And they know the next Democratic nominee will almost certainly be the next president—and will chase them out of the party leadership once and for all.

The only thing that can prevent the Democratic left from ascending to power in 2020 is paying heed to the myth of McGovernism. Far from calling for timidity and caution, the rise of Trump and left-wing populism, combined with the repeated failures of the Clintonites, has created a historical moment that’s more likely to resemble 1980 than 1972—the year when conservative Republicans, ignoring the voices of “moderation” in their own ranks, went with their hearts and minds and backed the radically right-wing (for his time) Ronald Reagan. Their politics then dominated the next 30 years of American history.

If Democrats fail to heed the real lessons of their own history—if they vote once again on the basis of their irrational fears rather than their noblest aspirations—they’ll have only themselves to blame. But it’s happened before. And hope, like falsehood, springs eternal in the New Democratic breast.

Dan Brown’s New Mystery: the Internet
October 11th, 2017, 07:40 PM

“‘This getaway car was hired,’ Langdon said, pointing to the stylized U on the windshield. ‘It’s an Uber.’” Such a virtuoso act of interpretation could only be the work of Robert Langdon, Harvard professor of “symbology” and the hero of Dan Brown’s novels Angels and Demons, The Da Vinci Code, The Lost Symbol, Inferno, and now Origin. This new novel features many of Brown’s signature themes. An evil, Catholic-adjacent cult, in this case the Palmarian Church, is behind some murders. Gems from art history are the key to solving the mystery. And a predictably hot woman, in this case the future Queen of Spain, trots along at Langdon’s side.

ORIGIN by Dan Brown (Doubleday, 480 pp., $29.95)

The Uber detail is one of many marking Origin as a techno-thriller in the tradition of Brown’s standalone book Digital Fortress (1998), though it shares much with Brown’s best-known novel about Da Vinci and the Holy Grail. In this new book, a techie genius named Edmond Kirsch (I looked for a secret anagram but found only Dim Shock Nerd) is killed on the brink of revealing a huge scientific breakthrough, one which he promises will change religious views of Creation forever. Teaming up with Kirsch’s A.I. creation “Winston,” Langdon must run around Barcelona to recover Kirsch’s presentation so that the world can share in his discoveries.

The faithful will be glad to hear that there’s a Da Vinci Code-esque background to Robert Langdon’s mission. The woman hosting Kirsch’s presentation at the Bilbao Guggenheim, where he was shot, was Ambra Vidal. This gorgeous woman in a white dress is both the director of the Guggenheim and the fiancée of Julian, the Prince and future King of Spain. Can Julian be trusted? The background conspiracy web of Origin is a bizarre fantasy in which the clergy and royal family of Spain are shadowy entities holding huge power. They may, Brown writes, be so afraid of science’s undermining Catholic orthodoxy that they are capable of nefarious deeds.

Brown has a PR person for the royal family portray the king as a “beloved symbol who held no real power.” But “it was a tough sell,” Brown writes, “when the sovereign was commander in chief of the armed forces as well as head of state.” Yet this isn’t true. In Spain, as in the United Kingdom and Denmark, the monarchy is a faintly symbolic collection of patrons of nonprofits, retained largely for tourism reasons and because it would be expensive and pointless to dismantle it.

In this and a few other things Brown is surreally wrong. As he begins his mission to recover Kirsch’s scientific discovery, Langdon does something very odd. He lifts Kirsch’s hand and brings it to his cellphone: “Langdon carefully pressed Edmond’s index finger to the fingerprint recognition device. The phone clicked and unlocked.” It’s only a small detail, but dead fingers can’t unlock iPhones—the capacitive sensor requires the conductivity of living skin.

Does it matter that Brown makes mistakes? Probably not, if the reader is in it for the thrill and the twist, as most are. And there are other things to love about Dan Brown’s prose. Origin irritates when it talks about science’s war with religion in terms you’d expect from an eighth grader, attributing wars to the difference in creation myths rather than, you know, politics. But when Brown gets corny, he does it with an earnestness that borders on the joyfully surreal. Origin is peppered with little technological details which foreshadow the novel’s final twist. We see an assassin commissioned via the “dark web.” He kills with a 3-D printed gun, then takes the aforementioned Uber away from the scene of the crime.

For Brown and for most of his readers, I’d guess, the new world of technology emits the same aura of mystery and darkness as the Catholic Church. In the Robert Langdon novels, Brown uses the church as a repository for mysteries and untold, frightening powers. Langdon can solve its mysteries, however, because he and his “eidetic” memory understand arcane symbols from every ancient culture. The internet is also a place of codes. When Langdon deciphers the Uber logo, he at first interprets it as a symbol for alchemy. It turns out to be two stickers layered on top of each other, one supporting the pope and the other indicating that the car is for hire via an app. In this literal layering of codes, Brown positions technology as religion’s new mirror—a twin source of arcane mystery.


Origin has been well-reviewed in places. In the UK Observer, Peter Conrad wondered if Brown “might be a prophet.” In the Hindustan Times, Prerna Madan called it “magnetic.” But the techno-dystopian core of Origin’s plot, which focuses on the biological origins of mankind and its evolutionary future, is a little thin, not really credible. It reads like a book by a slightly tired writer. Origin runs on about three-quarters of a plot, taping over big empty spaces with jolly filler about what the familiar heroes have been up to lately. Pitch-imperfect details about technology are liberally sprinkled over these concoctions, tied up in a bow, and shipped out to the bestseller lists. The result is a strange alchemy of imagined past, misunderstood present, and weirdly conjectured future. It isn’t a world I recognize.

Trump’s EPA Can’t Even Cook the Books Right
October 10th, 2017, 07:40 PM

“The EPA’s mission is to ‘protect and enhance the quality of the Nation’s air resources,’ but...”

That’s the most telling half a sentence in the Environmental Protection Agency’s plan to repeal President Barack Obama’s signature climate change regulation, the Clean Power Plan. Expected to be announced officially on Tuesday, the plan will allow coal-fired power plants to emit unlimited amounts of greenhouse gases into the atmosphere—and along with those greenhouse gases, harmful air pollutants like particulate matter, ozone, and sulfur dioxide. So after the ominous “but” that disrupts the EPA’s boilerplate explanation of its mission, the document lowers the boom: “the Agency must do so within the authority delegated to it by Congress.” And Congress apparently never gave the EPA the authority to put climate regulations on the dirtiest source of electricity in the country.

It’s been known for some time that EPA administrator Scott Pruitt has never intended to follow his agency’s core mission. Pruitt’s own agenda, as his meeting records and his own statements have shown, is to serve the polluting interests that have bolstered his political career for so long. But it’s far from clear how he’s going to manage this particular industry favor. Repealing the Clean Power Plan means providing a legal argument that will hold up when environmentalists inevitably challenge Pruitt’s plan in court. White House lawyers will argue that the Obama administration acted beyond the scope of the Clean Air Act when it created this regulation—and that the regulation is too financially burdensome for the coal industry while providing little benefit to the climate or public health.

The latter case will be an especially tough sell in court. If the draft proposal released on Monday is any indication, the agency is setting itself up to fudge those numbers. EPA officials will exaggerate the costs that Clean Power Plan enforcement will impose on coal operators, and dramatically underestimate the benefits—which means that they’ll ignore both mainstream science on air pollution’s health impacts and the global economic costs of climate change. And even when it embarks on this cherry-picking expedition, the EPA still winds up showing that the benefits of cutting carbon outweigh the costs. Experts versed in the issue contend that only an accounting scenario based in scientific malpractice can show otherwise.


Make no mistake: The Clean Power Plan was always going to be costly to coal. That was kind of the point—and the Obama administration admitted as much. But Obama EPA administrators also argued that the health and climate benefits would more than make up for the industry’s losses. Here’s the gist of their math: Cutting U.S. carbon dioxide emissions 32 percent from 2005 levels by the year 2030 would cost the coal industry $8.4 billion a year—but it would also save $34 billion to $54 billion per year. That was the range they fixed for costs that the United States would avoid in mitigating both climate change and health impacts arising from air pollution. Nor were the relevant gains only financial: Each year, the Obama administration said, the Clean Power Plan would prevent 3,600 premature deaths, 1,700 heart attacks, 90,000 asthma attacks, and 300,000 missed work and school days.
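For readers who want to check that arithmetic, here is a minimal sketch in Python, using only the round annual figures cited above (the EPA’s actual regulatory models are, of course, far more elaborate):

```python
# Back-of-the-envelope check of the Obama EPA's Clean Power Plan math,
# using only the round annual figures cited above, in billions of dollars.
annual_cost_to_coal = 8.4                # projected annual cost to the coal industry
benefit_low, benefit_high = 34.0, 54.0   # avoided climate and health costs per year

net_low = benefit_low - annual_cost_to_coal
net_high = benefit_high - annual_cost_to_coal
print(f"Net annual benefit: ${net_low:.1f} billion to ${net_high:.1f} billion")
# Net annual benefit: $25.6 billion to $45.6 billion
```

By the Obama administration’s own accounting, in other words, the plan’s projected benefits exceeded its costs several times over.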

President Donald Trump’s EPA is gearing up to dispute those numbers. Pruitt’s agency contends that the regulation could cost up to $33 billion a year by 2030, instead of $8.4 billion. But this is fuzzy, self-interested math, critics say: The agency is strategically leaving out the huge savings that power companies would achieve from energy efficiency investments, according to a New York Times op-ed by New York University law professor Richard L. Revesz and Jack Lienke of the Institute for Policy Integrity. “In most of its new analyses, the Pruitt-led EPA ignores these savings when calculating the costs of the plan,” they wrote. “As a result, the EPA’s cost projections now include almost $20 billion of generating expenses for electricity that the agency’s own analysis shows would not be produced with the plan in place.”

The EPA is also significantly writing down the health and climate benefits that could be achieved from the Clean Power Plan. Contrary to the Obama estimates running as high as $54 billion, Pruitt’s team suggests the relevant savings would work out to just $20 billion to $24 billion per year. But the low end of that range holds only in a scenario in which the agency wildly misrepresents air quality science, according to Dan Cohan, an environmental scientist at Rice University who specializes in air quality management.

To get to the lower $20 billion number, Cohan said, the agency disregards the health benefits of reducing pollution to a level below our national standards. “The EPA is ignoring the fact that people would be even healthier if they breathed air cleaner than what we require,” Cohan said. “And that goes against everything we know from hundreds of papers and the epidemiology literature. We know there are health benefits to making the air even cleaner. This has no basis in scientific understanding.”

In order to get this tiny number, Cohan said, the EPA also has to ignore all global social and economic benefits from reducing climate change, and count only benefits created in the United States. Critics say that’s ridiculous: “In our globally interconnected economy, major climatic (and economic) disruption in other countries will inevitably affect American pocketbooks,” Revesz and Lienke wrote. But others, like industry lawyer Jeff Holmstead, say it’s “reasonable to count only the rule’s U.S. benefits since Americans would be paying the costs,” according to a report in Politico.

But even as the EPA downplays the global climate benefits of the Obama plan, the majority of its scenarios still show that the Clean Power Plan saves more than it costs. When the EPA sides with the scientific mainstream—which holds that human health improves with cleaner-than-required air—it shows that health benefits will outweigh the rule’s cost to the coal industry by $17 billion to $28 billion in 2030. And when the EPA ventures beyond the mainstream consensus to endorse the findings of some scientists who say there are no health benefits to reducing particle pollution below 5 to 8 micrograms per cubic meter, the resulting calculations still show up to $13 billion in net health benefits. To arrive at any scenario where the Clean Power Plan costs more than it saves, the EPA has to overlook global climate benefits and the methodology of nearly all settled scientific research on air quality. And if mainstream air quality scientists don’t agree with that method, it’s unlikely to hold up in court.

So when it comes to repealing the Clean Power Plan, the EPA can cook the books all it wants. It still can’t prove we’d be better off if we cook the planet.

Made in America
October 10th, 2017, 07:40 PM

The New Hope Fertility Center occupies two floors in a sleek office building across from Central Park, a block from the Trump International Hotel. Upstairs, couples who want children but are struggling to conceive sit in a well-appointed waiting room. On the wall, an LCD monitor cycles through a presentation of the clinic’s specialties: in vitro fertilization, egg freezing, genetic testing. Some of the couples, though, are pursuing a different and controversial reproductive service: surrogate pregnancy.

One flight down is a separate waiting room, both smaller and louder. “Carriers”—the women who will carry the babies for the would-be parents upstairs—watch as their own children frolic in a small, glass-walled playroom. Toddlers climb over soft blocks, roll around on rubber mats, and clutch stuffed animals. Children are not allowed on the top floor: Their presence might upset clients who are unable to conceive. But downstairs, children are a credential. They offer proof that a carrier can do her job.

Because paying someone to work as a surrogate is illegal in New York, many of the carriers come from poorer parts of the United States: Kentucky, Ohio, Tennessee, Pennsylvania, Alabama, North Carolina. Reproductive labor is a growth industry, and the workers downstairs are lining up to apply for the job. Upstairs, by contrast, the wealthy couples who will employ them often come from China, one of the fastest-growing markets for surrogate pregnancies. No one knows precisely how many Chinese citizens travel to the United States each year for surrogacy, but fertility specialists say the demand is skyrocketing. “I’ve never seen anything like it,” John Weltman, the founder of Circle Surrogacy, told CNN. “It’s like an explosion.” Fertility Source, a California agency, has hired a full-time employee to handle Chinese travel and translation. “Everyone is doing that,” says Gail Sexton Anderson, who runs Donor Concierge, an organization in California that connects carriers and families.

The influx of Chinese citizens seeking surrogates in the United States reflects, in part, the growing wealth and mobility of urban professionals in China. It has also been spurred by China’s decision to rescind its One Family One Child policy last year, allowing straight, married couples to pursue larger families. But it can take months to get into a fertility clinic in China, and both egg donation and surrogacy remain illegal. At the same time, countries like Thailand and India began to prohibit foreigners from hiring surrogates or outlawed the practice altogether. So Chinese couples in search of a surrogate followed the inexorable logic of globalization: They went looking for a ready supply of labor overseas.

With surrogacy banned throughout the EU and heavily regulated in the U.K., Canada, and Australia, the best source of “carriers” became the United States. Americans are accustomed to the idea of paying for cheap goods made in China. But in this case, money flowed out of China and into America. Agencies sprang up across China promising to help Chinese parents find blonde, blue-eyed American women to bear their children. “They basically do everything,” says Gloria Li, the Asia specialist for Donor Concierge. “Transportation, translation, help you to find a clinic, make the appointment for you. When you have the baby, they even provide a house in the United States where you can stay—all in one package.” At the very moment when Donald Trump is vowing to take a hard line with China and to bring working-class jobs back to America, Chinese couples are employing American women to perform the world’s original form of labor.


Andrea is five-foot-two and weighs 135 pounds. She is not religious but is “into signs.” She grew up in a suburb of Cleveland, where she still lives. She is 32 and has two sons, ages ten and twelve. They like video games. (“Dude, I don’t know if I can play!” she tells them when they interrupt a phone call to ask her to join them.) When her dachshund barks, she treats him with the same amiable patience. (“One minute, boy!”) Andrea got married last year to a man whom she had been dating for five years. He is older and has a 17-year-old daughter of his own.

Andrea has carried twins for a Chinese couple before; when we meet, she is pregnant with her third surrogate child. The money she earns is “not a survival thing,” she says, but it has improved her life. She got the idea after seeing the movie Baby Mama, with Tina Fey and Amy Poehler. At the time, she was working as the general manager of a men’s hair salon. She found the job exhausting; some weeks, she was working every day. When Andrea’s brother and his wife were struggling to get pregnant, she offered to carry a child for them; they declined. That’s when Andrea realized that she wanted to become a surrogate. She went online and found an agency in New Jersey called New Beginnings, which refers clients and surrogates to New Hope. She filled out a questionnaire, and they contacted her the next day.

Andrea, a surrogate mother from Cleveland who carried twins for a Chinese couple. The stuffed animal was a gift from the parents. Photograph by Maddie McGarvey for the New Republic

It took five months for New Beginnings to place Andrea with a family. In July 2013, she flew to New York to meet with the agency’s surrogacy coordinator and be evaluated by a psychologist. “They give you a really long questionnaire and the answers are: Never, Sometimes, or All the time,” Andrea tells me. “Would I want to keep the baby? Do I drink or do drugs? How much? How often? If I get mad, what do I do? They ask 20 different ways if you are thinking of killing yourself.” Andrea was proud to hear that she was “one of the best surrogates that the psychiatrist had ever met.”

Two months later, Andrea returned to New Hope to have an embryo from the “intended parents”—known as I.P.s—transferred into her uterus. For weeks after the procedure, she gave herself progesterone injections. (“My husband can’t handle needles, so I did it all myself.”) Still, the transfer failed. So did the next four. The process was “exhausting,” she says. “It took a whole year.” Finally, the I.P.s switched to using an egg donor. On September 11, 2014, the doctors at New Hope transferred two male embryos into Andrea. She knew that the twins took when she started feeling “emotional.” (“A Whitney Houston song would come on the radio and I’d cry.”) After her doctor confirmed that the twins had heartbeats, she received her first stipend. The same day, she gave the hair salon two weeks’ notice.

Gestational surrogacy costs around $100,000. Of that, surrogates take home an average of $30,000 to $35,000, with a bonus if they carry multiples. The remainder of the money goes to the middlemen involved in the transaction, covering agency fees, legal fees, counseling services, and health insurance. If you do the math, the standard surrogacy fee works out to around $5 per hour for the duration of the pregnancy.
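That $5 figure is easy to verify. A minimal sketch, assuming a roughly 40-week pregnancy counted around the clock (the flat hourly framing is a simplification for illustration, not the industry’s own accounting):

```python
# Rough check of the ~$5/hour figure cited above. The 40-week term and
# around-the-clock hourly framing are simplifying assumptions.
fee_low, fee_high = 30_000, 35_000   # typical surrogate take-home pay, in dollars
hours = 40 * 7 * 24                  # ~40 weeks at 24 hours a day = 6,720 hours

print(f"${fee_low / hours:.2f} to ${fee_high / hours:.2f} per hour")
# $4.46 to $5.21 per hour
```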

Surrogates are often paid higher fees to work with foreign I.P.s—particularly from China, where the language barrier and distance make it difficult to stay in touch after the child is born. “Surrogates typically don’t want to carry for Chinese couples, because they want a relationship with the parents,” explains Janae Krell, the founder of the advocacy group All Things Surrogacy.

Around twelve weeks in, Andrea began chatting with her I.P.s over WeChat, a popular Chinese messaging app that includes a simultaneous translation function. Andrea “would send them pics of the baby bump as it grew and keep them updated.” Around week 18, the couple flew in for a visit. They brought Andrea a pearl necklace, and they brought her sons toys. While they were in Cleveland, Andrea took them to a keepsake ultrasound shop to make DVDs and a recording of the heartbeat. She bought them a teddy bear that you could put the heartbeat recording in; when you squeezed the bear, the recording thump-thumped. “I helped them pick out strollers and car seats and clothes,” she recalls. “They needed help—they didn’t know what they needed.”

The families got along so well that Andrea invited them to stay with her when they returned for the delivery. They arrived at 34 weeks. Three weeks later, Andrea’s doctor induced her. She gave birth to both boys vaginally, and the second was born breech. (“That was the worst nine minutes of my life,” she says. “I think I saw the light.”) Afterwards, the I.P.s wanted Andrea to zuo yuezi—observe the month of rest that Chinese tradition prescribes after childbirth. But before long, she was up and about again. “They didn’t know how to do anything,” she laughs. They were “new to the newborn thing,” and with two babies it was doubly hard. Andrea’s mother came over. “We would take shifts sleeping and caring for them and stuff. I was kind of like their nanny.” She pauses for a moment. “I literally was a nanny,” she says.

Andrea returned to New Hope to be implanted with another embryo last October, at the height of the presidential campaign. I call her in June, just days before she is scheduled to give birth, and the conversation turns to politics. Andrea says she “wasn’t a Trump voter,” though she didn’t like Hillary Clinton much, either. When I ask her about Trump’s get-tough stance on China and the upsurge in anti-foreign sentiment in the United States, she ruefully acknowledges that both are affecting her work. She says the hate she sees in America makes her proud to be carrying a nonwhite baby. “It proves that we are all the same,” she explains. “When I’m pregnant, nobody knows I’m pregnant with a Chinese baby, so it doesn’t matter. Blood is blood and DNA is DNA and cells are cells.”


Min and her husband, Kai, are professionals in their forties who work in Beijing: Min in finance, Kai in cybersecurity. In the winter of 2014, they told their families that they were taking a vacation in New York. They visited friends, and Min raved about the Metropolitan Museum of Art. But they kept the real reason for their trip a secret: They had come to do a round of IVF, in the hope of making embryos that an American woman could carry for them.

Min and Kai had been trying to have a child “the natural way” for several years, but they had encountered problem after problem. Eventually their doctor encouraged them to try surrogacy in the United States. They began the IVF cycle in China and then timed their trip to New Hope to coincide with Min’s ovulation. The trip was a success: They created several viable embryos. Then they went home and waited to be matched with a surrogate.

From the profiles they received from the referral agency, New Beginnings, they picked Emily, a woman from Kentucky. In the photograph, Emily was posing with her two children. When Min and Kai spoke with her on the phone, they peppered her with questions: What’s your job? Why do you want to help us through this process? What’s your family’s attitude about it? Emily accepted, and became pregnant with their daughter in the summer of 2015.

That fall, Min and Kai traveled to Kentucky to attend the 22-week ultrasound and to meet Emily and her children in person. “Min was incredible with those kids,” Emily told Jane Groenendaal, the surrogacy director at New Beginnings. “She was like a new aunt to them.” Min and Kai returned in March 2016 to be with Emily during her labor. Afterwards, Min stayed on for three months, to arrange an American passport and a Social Security number for her daughter, Sarah. Then mother and child flew home to Beijing. Min and Kai had told their parents what they’d been through. But they have kept the surrogacy a secret from their friends.

When Sarah turns five or six, Min and Kai plan to emigrate to the United States. As the parents of a surrogate “anchor baby,” they harbor the kind of immigration hopes that Donald Trump has used to whip his base into a frenzy. They want Sarah to attend school here, to improve her chances of getting into an American university. They want a second child and hope to convince Emily to carry it. When I ask Min why she wants another child, she answers in terms any parent can understand. “I think one child is too lonely,” she says. “If she has a sister or brother, it will be better for her growth.”


The founder of the New Hope clinic, Dr. John Zhang, emigrated to America in 1991. Born in Zhejiang Province to a family of doctors—his mother was a renowned obstetrician, and his two sisters both practice medicine in China—he earned his medical degree in 1984 and went on to earn a Ph.D. at Cambridge University before accepting a residency at New York University. “I have a very good education,” Zhang tells me. “Basic training in China. Special training in Cambridge. Clinical training in New York. The best combination you can have.”

New Hope, which Zhang founded in 2004, is now one of America’s busiest fertility treatment centers. Its five doctors perform more than 4,000 cycles of IVF each year, creating several times that many embryos. They also coordinate a growing number of surrogate pregnancies. Business is booming: According to the American Society for Reproductive Medicine, births through surrogacy have more than doubled since 2004.

I speak with Zhang in his corner office at New Hope, overlooking Central Park. It is October of last year, a few weeks before the election, and the leaves on the trees are turning red and yellow. Zhang himself looks slightly green, due to a cold, and wears a surgical mask throughout our interview. He’s determined to get better before hosting a medical conference the following week.

Dr. John Zhang, founder of New Hope Fertility Center, says he isn’t worried about Trump’s anti-China rhetoric. (Courtesy of New Hope Fertility Center)

On the wall of his office are several photographs. One shows Zhang with his dissertation supervisor, Dr. Twink Allen, and the bloody bodies of several large rhinos and elephants on which they performed experimental IVF. The ones on his desk show his wife and son. The wife looks much younger than Zhang, who is 55; he tells me she comes from Ukraine. They were married a few years ago. When I ask if he had been married before, he guffaws: “I needed to work!” In these ways—the Central Park view, the wife from Eastern Europe, the boastful confidence—he is not unlike Trump: a highly successful salesman of himself. On my way out, Zhang asks my age. When I tell him, he tries to convince me to let him freeze my eggs. When I demur, he offers a discount.

When we speak again in June, Zhang is as confident as ever. He is betting that Trump, for all his protectionist rhetoric, won’t stop the flow of capital—financial or genetic—across borders. “Based on my very conservative estimation,” he says, “China has three million couples that could come to the United States for treatment.” He does not think the election will hurt his business. “Donald Trump was hostile to China during his campaign,” he notes, “but afterwards he was more friendly. Very few countries can get along with our president, but China is one of them.”

Zhang admits to a brief moment of trepidation after the election, about Trump’s proposed travel ban. “I thought that many patients wouldn’t be able to come any more,” he says. “But most of our Chinese patients who can afford to pay for all the treatments get a ten-year visa to the United States, no problem. Why would you want to be difficult on these people if they come here and spend money?” Zhang seems to have grasped the fundamental truth about Trump’s America. “In the end,” he says, “money talks.”

Trump’s Wall Street Watchdog Undercuts a Key Consumer Protection
October 10th, 2017, 07:40 PM

In what will likely be Richard Cordray’s final major act as director of the Consumer Financial Protection Bureau, the agency last week issued the first federal rule on payday lending, an umbrella term for short-term, high-interest loans to cash-strapped Americans. Critics argue that payday lenders take advantage of vulnerable people by trapping them in a cycle of debt. CFPB’s rule seeks to break this cycle and prevent loans from being issued unless borrowers have the ability to repay.

The CFPB, the Elizabeth Warren brainchild created during President Barack Obama’s first term, has managed to survive in the hostile territory of the Trump era. Its budget remains intact, its enforcement robust, and its rules have avoided nullification from Congress. However, one adversary—a former corporate lawyer running a separate federal agency—responded to the payday rule in a way that would not only keep the cycle of debt alive but also transfer lending power to big banks.

The typical payday loan customer borrows $350 for two to four weeks, usually until their next paycheck (hence the term “payday loan”). The lender takes between $10 and $30 for every $100 borrowed. If the borrower cannot pay in full, they can roll over what they owe into another loan, triggering the same charges. Over 80 percent of all payday loans are re-borrowed within a month, and by the time it’s paid off, the average payday or auto-title loan (where the customer uses their car as collateral) carries an annual percentage rate between 300 and 400 percent. More than 12 million Americans use payday loans each year.
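Annualizing those fees shows where the 300-to-400-percent figure comes from. A minimal sketch, assuming a flat $15 fee per $100 on a typical two-week loan (a midpoint of the fee range above):

```python
# Annualizing a typical payday-loan fee. Assumes a flat $15 per $100
# borrowed on a 14-day term, a midpoint of the $10-$30 range cited above.
fee_per_100 = 15      # dollars charged per $100 borrowed
term_days = 14        # typical term: until the borrower's next paycheck

periodic_rate = fee_per_100 / 100        # 15% per two-week period
apr = periodic_rate * (365 / term_days)  # simple, non-compounding annualization
print(f"APR: {apr:.0%}")
# APR: 391%
```

Every rollover repeats the fee, which is how the charges on a $350 loan can quickly outstrip the original principal.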

CFPB’s rule gives payday lenders two options: They can conduct a “full-payment test,” confirming that the borrower can repay the entire loan on time and meet basic living expenses without re-borrowing, or they can offer a “principal-payoff” loan that allows gradual repayment of the debt in small installments. Even under the second option, borrowers would not be able to take out more than three short-term loans in rapid succession.
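The rule’s structure is simple enough to express as a toy decision function. A sketch under stated assumptions (the function name, inputs, and loan counter are illustrative simplifications, not the CFPB’s regulatory text):

```python
# Toy sketch of the rule's two compliance paths as described above.
# Names and inputs are illustrative, not the CFPB's actual language.
def may_issue_loan(passes_full_payment_test: bool,
                   is_principal_payoff_loan: bool,
                   recent_short_term_loans: int) -> bool:
    """Return True if a short-term loan could be issued under the rule."""
    # Option 1: the lender verifies the borrower can repay in full, on
    # time, while still covering basic living expenses.
    if passes_full_payment_test:
        return True
    # Option 2: a "principal-payoff" loan repaid gradually in small
    # installments, capped at three short-term loans in rapid succession.
    if is_principal_payoff_loan and recent_short_term_loans < 3:
        return True
    return False

print(may_issue_loan(False, True, 3))  # False: the three-loan cap applies
```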

Consumer protection advocates weren’t initially thrilled with CFPB’s proposal, and the final rule still gives reason for concern. To determine borrowers’ ability to repay, payday lenders would use information from notoriously inaccurate credit reporting bureaus like Equifax. Enforcing the payment test is inherently more difficult than, say, limiting payments to a percentage of a borrower’s paycheck—a provision that was considered but rejected. Plus, the rule doesn’t cover longer-term installment loans, which are often just as costly. Payday lenders have already shifted toward this option, and have been plotting for years on how to work their way around the rules.

Nevertheless, even critics of the initial proposal praised CFPB, hoping that the rules would steer borrowers toward safer and cheaper low-dollar loans, like those provided by community banks and credit unions. But within an hour of CFPB’s announcement, a less scrupulous federal agency intervened.


The Office of the Comptroller of the Currency (OCC) and its temporary leader, Keith Noreika, have become Richard Cordray’s biggest nemesis in Washington. The OCC is the government agency that oversees national banks, and Noreika was a finance lawyer with numerous bank clients. He was installed at OCC in May as a temporary “special government employee,” a status that exempted him from ethics disclosures while evading the Senate confirmation process. Joseph Otting, the actual nominee, is awaiting confirmation, and watchdogs contend that Noreika has already exceeded the time limits placed on a special government employee. When his temp gig ends, Noreika is expected to go back to defending the same clients he regulated.

In the meantime, Noreika has taken direct aim at the CFPB. When the bureau issued a rule to stop banks from preventing class-action lawsuits, Noreika—who tried that very tactic while defending Wells Fargo—released a study saying the rule would raise the cost of credit by 25 percent. Then, right after the payday rule announcement, Noreika opened the door for national mega-banks to provide a payday-style product known as “deposit advance.”

For years, banks like Wells Fargo and U.S. Bank allowed customers with paycheck direct deposit to take an advance on that money for a fee of about $10 per $100 borrowed. The bank would extract payment from the borrower’s account when the loan came due. This is functionally the same as a payday loan, and the fees were nearly as high. Consumer groups begged regulators to crack down on the practice, and eventually, OCC issued guidance warning banks that small-dollar loans had to be affordable, with ability to repay taken into account—the same principles as the CFPB’s rule. Noreika rescinded this guidance last week without even giving CFPB a heads-up. In a statement announcing this action, OCC said broadly that banks should manage risk and “reasonably” underwrite these loans, but added nothing specific. So big banks could step back into the small-dollar loan market, knowing that federal regulators would be unlikely to stand in their way no matter what.


Banks would have a significant leg up on their payday lender competitors for short-term loans. You can’t really get a payday loan without a bank account; lenders take either a post-dated check or authorization to debit the account in order to ensure payment. All those customers could now turn to their own bank for one-stop access to easy credit. And since wage stagnation and persistent inequality ensure a continuing need for low-dollar loans, the dueling regulatory moves could simply shift the gouging of customers from payday lenders to big banks.

Noreika said in a statement that he rescinded the guidance to avoid “duplication” with the CFPB rule. But CFPB clearly targeted payday lenders and not banks; in fact, there were exemptions for community banks and credit unions that make relatively safer low-dollar loans. So now, a poor person who needs cash to meet a sudden or recurring expense can get a long-term, costly loan from the payday lender, to be paid in installments; or they can get a short-term loan from their bank, which will likely get rolled over multiple times and not be paid off for months, at potentially even greater cost. Banks have more resources than payday lenders to garnish borrowers’ wages or sue them for repayment. The end result looks as bleak for borrowers now as it did before the CFPB’s rule.

With Trump’s team in place, we could have expected a subtle undermining of consumer protections, from a lack of enforcement if nothing else. But Noreika, a hired gun passing through the halls of a federal regulator before scrambling back to big bank clients, has done far more damage here. He’s empowered banks to engage in reckless abuse of their customers at a time when we’ve seen Wells Fargo in particular revealed as eager to do so. And he may not even be in office legally.

Financiers relentlessly built a business model based on tricking people instead of helping them, and they’ll stop at nothing to preserve and expand it. The only thing consumer protection advocates have going for them is the extreme unpopularity of such schemes. Now that efforts to police short-term loan abuses may lead to even more of them, Democrats—led by Warren, perhaps—should pounce on the issue to explain how the party plans to build broad enough prosperity that fewer Americans need to drown in debt.

Harvey Weinstein Isn’t the Only Big Donor Who Deserves Repudiation
October 10th, 2017, 07:40 PM

For once, Republican Party operatives are feeling grateful to The New York Times. Thanks to the paper of record’s blockbuster report on Harvey Weinstein’s long alleged history of sexual predation, Republicans are seeking to brand the Hollywood producer—a longtime donor to Democrats and liberal causes—as the smarmy face of rich liberal privilege.

The Democrats, after all, claim to be champions of women’s rights—so taking money from a repulsive figure like Weinstein seems like obvious hypocrisy. “During three decades’ worth of sexual harassment allegations, Harvey Weinstein lined the pockets of Democrats to the tune of three quarters of a million dollars,” RNC Chairwoman Ronna McDaniel gloated in a press statement. “If Democrats and the DNC truly stand up for women like they say they do, then returning this dirty money should be a no brainer.”

Of course, political hypocrisy is often in the eye of the beholder. Democrats, in turn, are quick to point out that McDaniel is drawing her own paycheck from the party that’s launched Donald “Grab Them by the Pussy” Trump into the highest summit of power. But beyond the familiar, sententious sport of Washington hypocrisy-spotting, it’s pointless to deny that McDaniel is right: Weinstein’s “dirty money” should be repudiated.

Weinstein himself continues to supply the best argument for this course of action every time he opens his mouth. On Friday, the Hollywood producer released a cringe-inducing statement showcasing his own m.o.: He intends to keep using his liberal activism to defend himself—and to distract attention—from serious allegations of sexual assault. “I am going to need a place to channel [my] anger so I’ve decided that I’m going to give the NRA my full attention,” Weinstein wrote. “I hope Wayne LaPierre will enjoy his retirement party. I’m going to do it at the same place I had my Bar Mitzvah. I’m making a movie about our President, perhaps we can make it a joint retirement party. One year ago, I began organizing a $5 million foundation to give scholarships to women directors at USC.” The subtext here couldn’t be clearer: Love me, I’m a liberal. Don’t think of me as a man who coerces sex from female underlings, but rather as a progressive Santa Claus dedicated to fighting the NRA and Trump.

But Weinstein isn’t acting like Santa Claus so much as a liberal Roger Ailes, and he’s tendering a sordid Faustian bargain that Democrats—and liberals in general—should reject. To their evident credit, they’re so far doing just that. As Josh Marshall of Talking Points Memo notes, there has been a rapid move, in liberal circles and beyond, to disavow Weinstein: “Weinstein was immediately forced to take a leave from his company. A parade of Democrats ostentatiously coughed up his campaign contributions. His legal team abandoned him. Yesterday he was fired from his eponymous company. Various Hollywood luminaries have denounced him. I’m at least not aware of anyone in that world who is publicly sticking up for him.”

But why stop here? What we might call the Weinstein rule—dirty money deserves political rebuke—shouldn’t just apply in Weinstein’s case. Weinstein’s sexual predations are repellent, but they’re also far from the only immoral byproduct of big-donor politics.


Even in the age of Donald Trump, it’s a hard-won consensus in American politics that Nazism and open avowals of white nationalism are bad, and should be denied public legitimacy. Trump himself defied this consensus after white nationalists and Nazis staged violent protests in Charlottesville that killed a protester, saying there were “very fine people” participating in the white nationalist march. And as a result of his remarks, he faced a backlash among many in his own party.

Given this consensus, it stands to reason, then, that a big donor who supported the mainstreaming of Nazism and white nationalism would be a prime example of “dirty money” in politics. And Joseph Bernstein’s recent blockbuster report for BuzzFeed shows beyond question that the Mercer family, which donated more than $25 million to the GOP last year, has gone to great lengths to finance and normalize racist extremism.

Robert Mercer and his daughter Rebekah are the major patrons of Steve Bannon and the far-right media empire that encompasses Breitbart News and the provocateur Milo Yiannopoulos. Drawing on a trove of leaked emails, Bernstein makes clear that Yiannopoulos has deep intellectual and personal ties to the racist far right, including figures like Andrew “Weev” Auernheimer of the fascist website The Daily Stormer and Devin Saucier of American Renaissance. During his tenure at Breitbart, Yiannopoulos drew heavily on the counsel of such figures, even soliciting their help in editing and composing a major article he wrote for Breitbart on the alt-right.

Yiannopoulos’s affinity for white nationalism can be seen even in something as mundane as his passwords. Per Bernstein’s report:

In an April 6 email, [Yiannopoulos’s assistant] Allum Bokhari mentioned having had access to an account of Yiannopoulos’s with “a password that began with the word Kristall.” Kristallnacht, an infamous 1938 riot against German Jews carried out by the SA — the paramilitary organization that helped Hitler rise to power — is sometimes considered the beginning of the Holocaust. In a June 2016 email to an assistant, Yiannopoulos shared the password to his email, which began “LongKnives1290.” The Night of the Long Knives was the Nazi purge of the leadership of the SA. The purge famously included Ernst Röhm, the SA’s gay leader. 1290 is the year King Edward I expelled the Jews from England.

In other words, Breitbart and Yiannopoulos were engaged in an intellectual laundering operation. They took the dirty racist memes favored among actual Nazis and transformed them into linen that could be displayed in polite company. All this was made possible by the largess of the Mercers.

If the “dirty money” of Harvey Weinstein has corrupted Democratic politics, the same can be said for the dirty money of the Mercers. There should be a loud public call for a complete repudiation of the Mercers.

In this sense, and in this sense alone, we should be grateful to Harvey Weinstein—for helping to set a new standard when it comes to donations. As recently as last month, Hillary Clinton was still flogging the old talking point that the source of donations doesn’t matter when it comes to political advocacy and policy making. Now, however, it seems as though our political norms are shifting away from Clinton’s lassitude. If the cry is now that “dirty money” is unacceptable, we need to scrutinize every big donor.

It’s the Culture, Stupid
October 9th, 2017, 07:40 PM

For a long time in American politics, you could pretty well guess how someone would vote by their income. Poor people generally supported Democrats; rich people voted Republican. This was true even among whites: In every presidential election since at least 1948, wealthy whites have been notably more Republican than the rest of the white electorate. And throughout the twentieth century, poor whites identified much more strongly with the Democrats than other whites. For decades, a good rule of thumb was: The greener your bank account, the redder your vote.

Then came Donald Trump, and the equation shifted. In last year’s election, according to an analysis by political scientist Tom Wood, 61 percent of the poorest whites—those in the bottom third of income distribution—voted for Trump. By contrast, in the top 4 percent of income distribution, just 42 percent of whites supported Trump. Among whites, millionaires decisively rejected their fellow millionaire, while blue-collar voters embraced him.

What drove the vote in 2016 was not income, but identity. Trump won by appealing directly to the cultural anxieties of downscale whites: He told them he’d do something about the immigrants who were stealing their jobs and the Muslims who were plotting to blow us up. Hillary Clinton, meanwhile, appeared to dismiss working-class whites as “deplorables,” and put on a convention that was a paean to multiculturalism. As both parties made appeals based more on race and culture than on class and economic inequality, almost 10 percent of voters who cast their ballots for Barack Obama in 2012 decided to abandon the Democrats. The famed blue wall that ran through the Rust Belt came crashing down, and Trump walked over the rubble straight into the Oval Office.

Since the election, Clinton has often been blamed for focusing too much on “identity politics.” But the suggestion that Democrats return to the populist economic rhetoric that made them heroes of the working class ignores the current political reality. The cultural forces that swayed the election in favor of Trump are likely to remain. What doomed Clinton, in the end, was not that she appealed to racial and ethnic minorities, but that she paid them little more than lip service. As Democrats attempt to move forward, they must come to grips with the fact that many of the working-class whites who abandoned the party are likely gone for good. The sooner they accept that reality, the sooner they can win with the coalition they have.


From 1932 to 1964, Democrats were America’s majority party. They consistently held the White House and Congress, with few interruptions, because they were the party of the working class, and millions of Americans directly benefited from the social welfare programs they stood for.

To maintain this majority, Democrats had to hold together a coalition of Northern liberals and Southern conservatives who disagreed vehemently on civil rights. As long as racial equality took a back seat to New Deal economic programs, the coalition held. But once the moral urgency of the civil rights movement made desegregation inevitable, the coalition split apart. And after Ronald Reagan won the Oval Office in 1980 by appealing to blue-collar whites, Republicans found they could pry the working class away from Democrats by emphasizing culture and race over economics. Over the next quarter-century, the only times Democrats won the White House—in 1992 and 1996—came when Bill Clinton exploited his status as a white Southerner to neutralize the GOP’s emphasis on cultural issues.

Democrats were further hurt by the dramatic decline of labor unions. For decades, organized labor had helped the white working class see themselves, above all, as the working class. But without unions to identify and defend their economic interests, the white working class came to see themselves, above all, as white. As Democrats came to rely less and less on union voters, and more and more on affluent urban cosmopolitans, their rhetoric and policies increasingly came to reflect the interests of a more highly educated and diverse constituency. By 2008, this coalition was big enough to help elect Barack Obama, even though 59 percent of working-class whites voted Republican. But with a black man in the White House and immigration on the rise, the Tea Party and Trump were able to fan the flames of racial and cultural resentment. In the end, Obama’s election helped accelerate what has been a long, slow shift in the political identity of blue-collar whites. The fact that the change has occurred steadily, and over a period of many decades, only makes it all the more difficult to reverse.


How, then, should Democrats proceed? Democrats have traditionally fared better when they emphasize class over culture, encouraging white workers to vote their wallets. Broadly speaking, this can be achieved by de-emphasizing racial and cultural issues, or by re-emphasizing economic issues. In the current political landscape, however, neither approach is likely to work.

The first option, to de-emphasize race and culture, may simply be impossible at this point. Trump is waging an aggressive effort to crack down on immigrants and undercut civil rights laws, while offering unabashed support to white supremacists. Advising Democrats to tone down cultural issues feels a bit like telling a kid who’s being punched in the face by a bully that he should try to be a little less violent.

The second strategy, to re-emphasize economic issues, also faces a significant hurdle. In theory, if Democrats came out with a big, bold economic agenda, they might be able to convince blue-collar whites that Democrats are still the party of the working class. But at this point, it may be hard to regain the trust of those voters, given how solidly they have come to see Trump as their savior. And it’s hard to imagine what that bold economic agenda would even look like, given how deeply Democrats have come to depend on a class of very wealthy donors who would like to stay very wealthy.

So: If neither of these options seems feasible, can Democrats win by playing on the turf of culture and identity? The short answer is yes. But to do so, they’ll have to overcome two obstacles: a geography problem and a turnout problem.

Let’s start with the geography problem. Thanks to our antiquated electoral system, areas of the country that are culturally and racially conservative enjoy outsize influence. At the presidential level, Democrats have won the popular vote in six of the last seven elections—yet a Republican still entered the White House after two of those defeats, thanks to the way the Electoral College favors rural and suburban voters at the expense of city dwellers. At the congressional level, the increasing concentration of Democrats in a relatively small number of urban districts—combined with aggressive GOP gerrymandering—has enabled Republicans to hold a majority in the House for 18 of the last 24 years. If Democrats get back into power, they should fix their geography problem by passing electoral reforms to render gerrymandering impractical and force candidates to appeal to a broader range of voters.

But Democrats can’t fix their geography problem until they solve their turnout problem. It’s a matter of math: The GOP’s core constituencies (older, wealthier, churchgoing) get to the polls at consistently higher rates than core Democratic constituencies (younger, poorer, secular). In last year’s election, just 49 percent of millennials voted, compared to 69 percent of baby boomers and 70 percent of the “greatest generation.” Hispanics voted at a lower rate than whites, and black voter turnout dropped for the first time in two decades. If every generation and race had voted at the same rate, Democrats would have won in a landslide.

If Democrats were serious about getting out the vote, they would begin building on-the-ground organizations today, reaching deep into low-turnout communities. But that would require them to rethink their entire party apparatus, which prioritizes fund-raising at the expense of everything else. Would-be candidates in competitive districts are told that their first task is to raise millions of dollars from rich people so they can pay consultants and pollsters to produce ads and do social-media targeting. There’s a lot more money to be made in buying TV time, it would seem, than in building meaningful connections to voters.

But campaign infrastructure alone isn’t enough. To motivate their core constituents, Democrats also need to embrace a message that speaks more directly to their concerns. A truly progressive economic and civil rights agenda would engage the constituencies that Democrats need most—not working-class whites, but low-income minorities. Donald Trump won by appealing to the cultural anxieties of blue-collar whites. By the same token, Democrats can win by appealing more explicitly to the hopes, fears, and dreams of their broad coalition, and giving them a reason to turn out on Election Day.

As America becomes more and more diverse, issues of race and culture will continue to dominate the political discourse. The good news is that Democrats, for perhaps the first time in modern history, can actually turn that reality to their advantage. The bad news is that, even if they succeed, the discord and hatred that the GOP is using to mobilize its base will continue to divide the country for decades to come. The Republican emphasis on race and culture poses a solvable problem for Democrats. It poses a more difficult problem for democracy.

Why Republicans Blamed the Las Vegas Massacre on “Evil”
October 9th, 2017, 07:40 PM

The devil went down to Vegas and shot 500 people last week. Such, at any rate, was the moral of Stephen Paddock’s rampage for Senator John Kennedy, a Louisiana Republican, in a recent interview with CBS News: “I do not think that the United States Congress can legislate away evil.” Rodrigo Meirelles, the president of Hampel’s Gun Company, agreed. “You cannot legislate evil, you cannot legislate stupid,” he pontificated to Fox 32. The Las Vegas massacre “was the pure act of evil,” GOP Congressman David Young of Iowa said on Wednesday. “And, you know, as much as we’d all like to, I don’t think the U.S. Congress—we don’t know how to legislate away evil. Just like I don’t think the U.S. Congress or anyone else knows how to legislate sanity.” Kentucky Governor Matt Bevin, also a Republican, tweeted out the same talking point, albeit in an elevated tone of dudgeon: “To all those political opportunists who are seizing on the tragedy in Las Vegas to call for more gun regs...You can’t regulate evil...”

In the mouths of Republicans and gun manufacturers, “evil” is a cynically evasive buzzword, highlighting the collective will of our conservative power elite to do nothing in the face of reprehensible mass slaughter. But the rhetoric of evil is more than a dodge; it’s also a tell. This is anti-democratic rhetoric: When conservatives frame gun violence in terms of spiritual warfare, pitting humans against supernatural powers and principalities, they suggest that there are only spiritual solutions to the problem and that legislative reforms are useless. “Against evil in its purity, what can mere humans accomplish? Politicians plead weakness and join the feeble citizenry in the only thing we humans can do: pray,” Patrick Blanchfield recently explained in n+1.

What good, in other words, are mere American laws against the works of Satan? But the trespasses involved in the Vegas massacre have far less to do with the lurid imagery of the Book of Revelation than with the workaday modern face of evil. Satan could have magicked all the guns on Earth to fire at once, but this was not what Paddock did. We can meet the GOP’s army of righteous instant theologians halfway and say that maybe the devil somehow set Paddock’s rampage off. But even so, Paddock profiles as an all-too familiar figure on the Trumpian American scene: an angry man with too much money, a gambling problem, and a penchant for screaming at his girlfriend in public. It’s true that the law could not have made him decent, or mitigated his anger. But it could have made it more difficult for him to murder 58 people from the windows of his hotel suite. Paddock had easy work for human reasons. Shooters do not conjure guns; they buy them.

The conservative fixation on evil, with sin as the subtext, is a symptom not only of religious hypocrisy but also of more secularized corruption. The gun lobby has invested decades of time and millions of dollars into purchasing political representation, which means it has purchased a kind of citizenship too; Mammon has shifted the balance of political power, away from the parents of Sandy Hook and toward Remington Outdoor. So Republicans flatten gun violence into a morality play, one that transforms perpetrator and victim into caricatures. If there is “evil” to be defined, identify it here, in this dehumanization.


Political rhetoric is never empty noise. Even at its most superficial level, it drives real action or inaction, and real consequences may be pinned to it. If mass shootings are binary events pitting good against evil, we must nominate monsters, and conservatives typically draw candidates from the ranks of the marginalized. Young’s pairing of “we cannot legislate evil” with “we cannot legislate insanity” reinforces an old and troubled association of evil with mental illness.

Society alternately anoints people who are suffering from mental illness as prophets or scapegoats—a reflex that attests to centuries of prior abuse at the hands of the spiritually self-assured. Exorcisms and forced institutionalization, shock treatments and lobotomies; we have behaved, historically, like we can carve spirits out of brains. These beliefs mutate and persist, and resurface with a glumly predictable regularity after each new mass shooting. And indeed, what we know so far of the public reaction to Paddock’s massacre follows this pattern. There is already breathless speculation that anti-anxiety medication influenced his violence, as if Valium did the devil’s work and flipped some fateful switch in Paddock’s brain. This is not how Valium works, or how mental illness works, or how gun violence works. Most gun deaths in the United States are suicides; the mentally ill are disproportionately likely to be victims, rather than perpetrators, of violent crime, most definitely including gun violence.

These facts are not unhappy accidents. They are firmly embedded within a cycle of marginalization and abuse that mouthpieces of our dominant political culture like David Young perpetuate, however unwittingly. Mental illness is stigmatized, which means people with mental illness are estranged from society. And that, in turn, means that they are particularly vulnerable to abuse and murder. In his speculations about the foreordained limits of human legislation, Young hasn’t identified the cause of gun violence. He’s only further isolated a class of victims, while his party steadfastly refuses to expand access to health care. Every iteration of Trumpcare identified psychiatric care as a service states may refuse to cover under Medicaid. What does evil mean if not this?

This selective fear of evil is evident, too, in the Trump administration’s heavy-handed efforts to turn public attention away from the role of gun access in the Vegas massacre. As White House spokeswoman Sarah Huckabee Sanders tried to fend off questions about the administration’s gun fetish, she hauled out a favorite Trump-branded talking point: “I think if you look to Chicago, where you had over 4,000 victims of gun-related crimes last year, they have the strictest gun laws in the country. That certainly hasn’t helped there.” The subtext here is not what you’d call subtle: Chicago (meaning the South Side, meaning black people) possesses a mighty badness impervious to the law.

But this secularized and racialized version of the fatalist case for inaction in the face of untrammeled gun violence is also a right-wing fairytale. For starters, it’s not actually the case that Chicago has “the strictest gun laws in the country,” as NPR reported on Oct. 5. In addition, the city’s gun laws, no matter how stringent they may be, can’t compensate entirely for deficiencies in state law—or for the gun-happy state of legislation in Vice President Mike Pence’s Indiana, just a few miles to the east. As long as people can easily bring guns in from outside city limits, or from outside Illinois, Chicago will bleed.

Other factors have stoked Chicago’s epidemic of violent crime, operating well beyond the city’s control. One recent study, published in JAMA Internal Medicine, suggests that a probabilistic contagion model helped predict likely victims of gun violence in Chicago. “[Researchers] created social networks of all individuals involved (people were considered connected if they had been arrested at the same time, what they termed ‘co-offenders’),” explained Slate’s Danielle Ofri. “In studying more than 11,000 shootings, the researchers concluded that 63 percent of these events could be predicted via social networks.”
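To make the study’s method concrete, here is a minimal sketch in Python of the co-offender network it describes. The names and arrests are invented, and this is a drastic simplification of the published probabilistic model, not the researchers’ code.

```python
# A toy co-offender network: people are connected if they were ever
# arrested together, and after a shooting, the victim's direct
# connections are flagged as elevated-risk. All data is invented.
from collections import defaultdict

arrests = [            # each arrest event lists the people taken in together
    {"A", "B"},
    {"B", "C"},
    {"C", "D", "E"},
]

network = defaultdict(set)
for event in arrests:
    for person in event:
        network[person] |= event - {person}   # link every co-offender pair

def elevated_risk(victim):
    """People one hop from a shooting victim in the co-offender network."""
    return network[victim]

print(sorted(elevated_risk("B")))   # ['A', 'C']
```

The real model is probabilistic and far richer than this one-hop rule, but the underlying data structure, a graph stitched together from co-arrest records, is what let the researchers trace so many shootings through social networks.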

Ofri correctly warns against using the contagion model to justify some sort of social quarantine—a policy that would prove just as racist as the present lethal confluence of gentrification and broken-windows policing. Mindful of the further stigmatization that could arise from such an approach, researchers have urged a focus on people as potential victims rather than potential criminals. Perpetrators can be both aggressor and victim, and an inherited burden of racial discrimination can predispose them to either status.

But any such admission of moral complexity, let alone an operational acknowledgment of its policy ramifications, clearly can’t coexist with the lazy and self-interested caricature of gun violence as a regrettable yet immutable side effect of spiritual warfare. This schema only allows for the existence of angels and demons, and we have seen, repeatedly, who gets to be which. Michael Brown was “no angel.” Anthony Hill was mentally ill. Cops are beleaguered agents of good. Meanwhile, Republicans shrug.

But where guns are concerned, there are no devils and no angels—only people, who shoot out of rage, despair or some terrible hybrid of desperate, contingent emotions that receive no sanction, or disavowal, from on high. Gun violence is not a problem of evil. It’s a matter of public health, and the solutions are in front of us, if we can bring ourselves to see them for what they are.

A Journalist Confronts the Japanese Tsunami
October 9th, 2017, 07:40 PM

On the cold and sunny Friday of March 11, 2011, a powerful series of earthquakes shook Japan. The epicenter was off the northeast coast of the main island of Honshu, near the region known as Tohoku. Richard Lloyd Parry, the correspondent for the Times of London in Japan, was in his Tokyo office when the quake occurred. Japanese buildings are generally well-fortified to withstand quakes, and the building where Lloyd Parry’s office was located suffered no damage. Tokyo generally was unaffected, though reports trickling in from regions closer to the epicenter told of casualties. The afternoon of the quake, Lloyd Parry tweeted, “No deaths in Tokyo so far. My hunch is that there will be scores, perhaps low hundreds in NE Japan, but no more. Not megadeath.”

GHOSTS OF THE TSUNAMI By Richard Lloyd Parry MCD / Farrar, Straus, and Giroux, 320 pp., $27

Lloyd Parry might have been correct in his “not megadeath” prediction, if it weren’t for what happened after the quake. Among the disasters that followed was a meltdown at the Fukushima Dai-ichi nuclear power plant, the worst nuclear accident since Chernobyl; miraculously, it resulted in no direct casualties. What could be worse than a nuclear meltdown, you might wonder? The tsunami that triggered it. As Lloyd Parry describes all three events:

It was the biggest earthquake ever known to have struck Japan, and the fourth most powerful in the history of seismology. It knocked the Earth ten inches off its axis; it moved Japan four feet closer to America. In the tsunami that followed, 18,500 people were drowned, burned or crushed to death. At its peak, the water was 120 feet high. Half a million people were driven out of their homes. Three plutonium reactors in the Fukushima Dai-ichi power station melted down, spilling their radioactivity across the countryside … The earthquake and tsunami caused more than $210 billion of damage, making it the most costly natural disaster ever.

It was also the worst national crisis in Japan since World War II. Fearing more nuclear mishaps, Japan quickly shut down all of its nuclear power plants, resulting in power outages for 2.5 million people. Between the meltdown and the tsunami, many villages and towns were destroyed, never to be rebuilt. Lloyd Parry writes, “A generalized dread took hold, the fear of an invisible poison spread through air, water—even, it was said, through a mother’s milk.” More powerful than that generalized dread, though, were the tsunami’s devastating effects, which is where Lloyd Parry focuses his reporting and analysis in this harrowing book.

The primary story Lloyd Parry tells in the book is that of the Okawa Primary School, which was destroyed in the tsunami. It is by focusing on this singular disaster that Lloyd Parry finds he can come to grips with the world-historical events he has just lived through. “The events that constituted the disaster were so diverse, and so vast in their implications, that I never felt like I was doing the story justice. It was like a huge and awkwardly shaped package without corners or handles: however many ways I tried, it was impossible to hoist it off the ground,” he writes. The slipperiness of the tragedy forces Lloyd Parry to imagine and report on the horrors people in northeast Japan were in the grips of: missing relatives swept away by the tsunami, homes destroyed save for what people could grab, and possessions lost in the power of the unquenchable wave. Because of his focus on the school, Lloyd Parry, a father of two, does his best to empathize with the parents of Okawa’s children. It is these people—along with the few children who survived the tragedy and discussed it with Lloyd Parry—who emerge as the downtrodden yet heroic figures of the book, which tells a story that desperately needs heroes.

But Lloyd Parry does not feel good about his work. “I interviewed survivors, evacuees, politicians, and nuclear experts, and reported day by day on the feckless squirming of the Japanese authorities,” he says of his reporting in northeast Japan. “I wrote scores of newspaper articles, hundreds of fizzy tweets, and was interviewed on radio and television. And yet the experience felt like a disordered dream.” What finally makes the tsunami concrete to Lloyd Parry is hearing it described again and again by survivors. They use the same word over and over: jigoku, translated as hell. He clarifies, “The image they had in mind was not the conventional landscape of lurid demons and extravagant, fiery tortures. There are other hells in Japanese iconography—hells of ice and water, mud and excrement, in which naked figures, stripped of all dignity, lie scattered across a broken plain.” This is as close as Lloyd Parry can get to comprehending the tsunami.

Okawa Primary School had 108 children enrolled on March 11. Of the 78 who were neither absent that day nor picked up early by their parents (there was a rush to collect children after the earthquake, before the tsunami arrived), 74 died in the disaster. So did 10 of the 11 teachers. The parents who picked up their children did so after the radio warned of an Ō-tsunami, translated as “super-tsunami,” with waves expected to reach up to 20 feet (the actual waves were much, much higher). Those parents whose children survived later found themselves at odds with neighbors who had lost a child in the disaster. Friendships crumbled, neighbors ceased to be neighborly, and alliances formed, especially among those whose children were missing, as they continued to search for any trace of them long after the disaster was over. The community around Okawa Primary School became a symbolic center of the tragedy.

Lloyd Parry went to the area around the school for the first time in September 2011. He had been visiting the tsunami zone before that, though the area was hard to reach by car and the trains were not running yet. In the fishing villages and the small inland towns of the region he saw the havoc the tsunami had wrought. Still, he writes, the scenery was beautiful:

This was the prospect revealed to us as we drove along the Kitakami [River] into Okawa that morning: the arching sky; the green hills divided from one another by valleys packed with rice; villages at the edge of the fields; and, in the hazy distance, lagoon and sea. It was an ideal, archetypal scene: farm and forest, fresh and salt water, nature and humanity in balance.

Though stereotypes cast the Tohoku residents as yokels, Lloyd Parry finds them sophisticated, if slightly more rugged due to the rough terrain and harsh climate of the region. They were less fussy about their appearances than he was used to from living in Tokyo. Their messy hair and layers of clothes were meant to keep bodies warm in the colder months. The region had been hit by tsunamis before: in 1585, 1611, 1677, 1687, 1689, 1716, 1793, 1868, and 1894. The most destructive tsunami before 2011 was the Meiji Sanriku Tsunami of 1896, which killed 22,000 people. Many of the older residents of the region also remembered the tsunami in 1933, which had waves as high as 100 feet and killed 3,000 people. These were hardy folk, not inclined to grouse about the weather, even in its extremes, which is part of why the tragedy at the school hit so hard.

Lloyd Parry tells the story of Naomi, an English teacher whose daughter, Koharu, was an Okawa student lost in the tsunami. Koharu was one of the missing, and Naomi joined the other parents who searched the school grounds in the days after it was declared safe to be in the area again. Naomi is also one of the many parents who consulted a psychic during this time, both to speak to the spirit of her daughter and to help locate her, or any trace of her (she is one of the lucky parents: Her daughter’s remains were eventually found and buried). Lloyd Parry is struck by Naomi’s practical nature and her reaction to the loss of her daughter: “Of all the Okawa mothers I met, Naomi was the clearest-sighted, even in the intensity of grief. For many of those who experienced it, the tragedy of the tsunami was formless, black and ineffable, an immense and overwhelming monster that blocked out the sun. But to Naomi, no less stricken than the others, it was glittering and sharp and appallingly bright.” It is that brightness that Lloyd Parry seeks in his conversations with the people associated with the Okawa school. There is no answer to the overwhelming question of why the disaster happened. But there are answers to how, and both Lloyd Parry and the parents he interviews want as much information as possible about why the school did not evacuate the pupils to higher ground. It’s a search for justice that becomes a search for lost possibility. Both Lloyd Parry and the parents grapple with the futility of searching for ways to save children who have already been lost.


Much of the book focuses on the actions of the administrators and teachers at the Okawa school, and on the steps taken by the parents in search of justice. But one of the most startling moments of the book is a description of the tsunami by a government functionary named Teruo Konno. Konno describes his ordeal in terrifying detail. As the building he worked in was hit by the tsunami, he was surrounded by swirling water. “I never heard anything like it,” Konno says. “It was partly the rushing of the water, but also the sound of timber, twisting and tearing.” Lloyd Parry explains, “In the space of five minutes, the entire community of 80 houses had been physically uprooted and thrust, bobbing, against the barrier of the hills.” It is in these accounts that we finally can start to comprehend what the experience of the tsunami was like. It’s hard to think about waves crashing on a beach in quite the same way again, so powerful is Ghosts of the Tsunami. Lloyd Parry’s account is truly haunting, and remains etched in the brain and the heart long after the book is over.

Facebook’s Promise of Community Is a Lie
October 7th, 2017, 07:40 PM

Like Dr. Victor Frankenstein, Mark Zuckerberg has learned to have regrets. Until very recently, the Facebook CEO wouldn’t have seen himself as a villain in a horror novel but rather the hero of a happier genre, a classic American rags-to-riches story in the tradition of Horatio Alger. From his dorm room at Harvard in 2004, Zuckerberg created the outstanding economic success story of our century—a social media giant that now has more than two billion active users and a market capitalization of $445 billion. Even if you don’t like the site, it’s hard not to be awed by the scale of its reach, which is almost without parallel in human history. As Max Read notes in a recent survey of the internet leviathan in New York magazine, Facebook users represent “the single largest non-biologically sorted group of people on the planet after ‘Christians’—and, growing consistently at around 17 percent year after year, it could surpass that group before the end of 2017 and encompass one-third of the world’s population by this time next year.”

Yet as it continues on its path to world domination, Facebook finds itself increasingly mired in political controversy—a fate it shares with other mammoth internet platforms like Google and YouTube. During the 2016 election these internet brand names no longer seemed like neutral venues for sharing ideas; rather, they became hothouses of fake news and propaganda. Social media in particular has played a pivotal role in the still-developing story of Russian interference in 2016 presidential balloting. The latest revelations in that scandal indicate that Russian-linked anti-Clinton ads harnessed a Facebook micro-targeting feature to home in on key voters in the swing states of Michigan and Wisconsin.

Yet even before the specter of foreign electoral hacking surfaced on the site, critics had long pointed to another civic bug of our Facebooked lives: the self-reinforcing character of the site’s news feeds. By swamping users’ accounts with content tailored to their past browsing habits, Facebook has gradually come to quarantine users in news bubbles of their own making. And there’s no incentive for Facebook to puncture those bubbles, given the wider monopoly structure of the tech economy, liberal critics have argued. “Our lives are increasingly dominated by a series of big companies that have achieved something close to the state of monopoly,” former New Republic editor Franklin Foer explained recently in an interview for Slate. Foer noted that even politicians who have been traditionally friendly with big business, like New Jersey Senator Cory Booker, are now increasingly critical of the tech giants on the grounds that the logic of their business models distorts and stunts our democracy.

Indeed, Facebook’s maximum leader has begun to register this critique in his own public statements—albeit in his own stunted and distorted way. In the 2017 message announcing his resolution for the new year, Zuckerberg acknowledged the mounting sense that Facebook is no longer purely a force for good. “For decades, technology and globalization have made us more productive and connected,” he wrote. “This has created many benefits, but for a lot of people it has also made life more challenging. This has contributed to a greater sense of division than I have felt in my lifetime. We need to find a way to change the game so it works for everyone.”

Like many of Zuckerberg’s statements, this was bewilderingly vague—a sign, perhaps, of the great social-media impresario’s near-total detachment from the conditions of public life in twenty-first-century America. In what sense is globalization a “game,” exactly—and who’s chiefly benefitting from all these storied gains in productivity and connectivity?

It’s clearly not within the reach of Zuckerberg’s own internal algorithm to wrestle such questions to the ground. Still, the statement did at least mark a telltale pivot away from the site’s native argot of technological utopianism, in which the company continually presented itself as a neutral platform that more or less spontaneously brought the world together. Like the legendary Coca-Cola ad from 1971, with the dream that “I’d like to buy the world a Coke and keep it company,” Facebook was selling the promise of creating new human fellowship, on a staggering new scale.

But as Zuckerberg then noted, the dread notion of “division” is the unwelcome human disruption in this vision of a placidly empowered new world tech order. So lately Zuckerberg has revamped his own depiction of the ideal-type Facebook interaction, supplanting the older notion of “connection” with a newer emphasis on “community.” In February, Zuckerberg released a manifesto, grandly titled “Building Global Community.” Now the word went forth that “Facebook stands for bringing us closer together and building a global community.”

At first glance, “community” may not seem all that dramatic a departure from “connectivity.” Both are blandly agreeable social goods, and the distinction between them may strike us as roughly akin to the difference between buying the world a Coke and joining the Pepsi generation. But as Max Read notes, the use of the language of community actually marks a major shift in Facebook’s corporate image of itself—a move away from technological neutrality toward the more stubborn complexity of human values:

Kate Losse, an early Facebook employee and Zuckerberg’s former speechwriter, told me she thought it represented a serious transformation of the company’s purpose. “The early days were so neutral that it was almost weird,” she said. The first statement of purpose she could remember was a thing Zuckerberg would say at product meetings: “I just want to create information flow.” Now he was speaking of “collective values for what should and should not be allowed.” “It’s very interesting that the community language is finally being brought in,” Losse said. “‘Community’ is, like, a church—it’s a social structure with values.”

Zuckerberg is also taking his flirtation with communitarian values on the road, via his famed listening tour across America, which has prompted some pundits to speculate that, in search of fresh worlds to conquer, Zuckerberg may be pondering a 2020 presidential run.

It’s clear, at any rate, that Zuckerberg has adopted communitarian language to give his empire a political makeover. Communitarian rhetoric is a shrewd way to pre-empt the rising tide of anti-Facebook sentiment by crafting a wholesome corporate mission that almost anyone can agree with at some level.

Part of the beauty of communitarian discourse is that it is genuinely bipartisan, appealing in nearly equal measure to the right and left. Conservatives celebrate a long tradition, going back at least to Edmund Burke, of marshaling “the little platoons” of civil society as a bulwark against the centralizing impulses of the state. Leftists and liberals, meanwhile, often hark back to their own participatory politics, founded upon the civic advocacy of grassroots organizations, such as unions and civil rights groups, as the engines for political change and democracy.

All of this admirably serves the aims of Facebook’s rebranding initiative: Who can be a serious foe of “community,” in any of its guises? But on closer examination, we have ample reasons to be skeptical of this communitarian cant.

For one thing, Facebook’s main mission in life is not fostering community but making money. As John Lanchester argued in a recent article in the London Review of Books, Facebook’s business model is about remorselessly monetizing the trust people have (often unwittingly) placed in the platform as a repository of their consumer data. “The solution [to Facebook’s monetization problem] was to take the huge amount of information Facebook has about its ‘community’ and use it to let advertisers target ads with a specificity never known before,” Lanchester notes. “What that means is that even more than it is in the advertising business, Facebook is in the surveillance business. Facebook, in fact, is the biggest surveillance-based enterprise in the history of mankind. It knows far, far more about you than the most intrusive government has ever known about its citizens.”

If we keep in mind Facebook’s surveillance power, then the metaphor of community becomes a good deal less benign. After all, as long as we’ve hymned the allure of community, there have been two diametrically opposed visions of what communal life is actually like. There is, on the one hand, the positive vision of small-town life as a place of warmth, fellowship, and co-operation, as seen in the paintings of Norman Rockwell or the films of Frank Capra. But we’ve also long reckoned with a bleaker view of things—the notion that small towns are stultifyingly conformist and small-minded, because they allow for no privacy or independent thought. In Sinclair Lewis’ Main Street or Sherwood Anderson’s Winesburg, Ohio—to name just two famous literary examples of this critique—the communal spirit serves as a byword for provincialism and bigotry.

And, curiously enough, that’s precisely the question that haunts Facebook today: Does it promote community or tribalism? And is the site’s version of “community” so beholden to the forces of market surveillance that it renders communal participation a dead letter to anyone other than big-ticket advertisers? Zuckerberg’s recourse to communitarian discourse is a way to preclude these questions from being asked, to provide an answer to what is still very much an open question.

To begin seriously reckoning with such unasked questions, we would do well to examine the liberal tradition’s own internal debates over communitarian values. American liberals have always valued community—while also recognizing that it often exists in tension with other important values, notably equality and liberty. When, for example, white supremacists and homophobes rely on the rhetoric of community to defend Jim Crow segregation or gay marriage bans, liberals have rightly said that not all communal norms are absolute. To treat community as a value in and of itself is to ignore the particular ways in which belonging to a specific community constitutes a political benefit—or harm. Membership in a union is very different than membership in the Ku Klux Klan.

By failing to attend to any of these distinctions, Zuckerberg’s new communitarian language is as vacuous as his old talk of connectivity. It’s an attempt to offer a neutral solution to problems that humans have always had to thrash out in political conflicts.

It’s disheartening, but not all that surprising, to see that Zuckerberg cannot be shaken from the bedrock conviction that there has to be a technocratic solution to the age-old dilemma of reconciling community with individual rights. In his “Building Global Community” manifesto, he writes, “The guiding principles are that the Community Standards should reflect the cultural norms of our community, that each person should see as little objectionable content as possible, and each person should be able to share what they want while being told they cannot share something as little as possible. The approach is to combine creating a large-scale democratic process to determine standards with AI to help enforce them.” Unfortunately, the “large-scale democratic process” Zuckerberg has in mind is simply people voting on Facebook.

The idea that AI and voting on Facebook can solve the problems that Facebook is causing is absurd. It’s a way of saying the solution to Facebook is more Facebook. Behind all the feel-good rhetorical evocations of community that now are billowing out from the Facebook mother ship is the same old problem that has dogged American democracy since the dawn of the industrial age: A corporate giant is refusing scrutiny from the only real democratic force that might restrain it—an elected government. And to begin facing down that threat, we don’t need connectivity, community, or other warm and fuzzy nostrums from civic ages past. We need politics.

Blade Runner 2049, Never Let Me Go, and the Longing to Be Human
October 7th, 2017, 07:40 PM

The ending is schmaltzy. The plot is a quasi-religious quest to find a savior figure. Its twists are easy to anticipate. The best parts of the film—the sound and the visual style—are directly borrowed from its predecessor, though denuded of their 1980s-ness: goodbye shoulder pads and obsequious synths. Like Star Wars and many of the franchises that have followed it, Blade Runner 2049 has daddy issues—though this was true of the original, which owed a debt to Frankenstein. And when it comes to the Frankenstein myth, Blade Runner 2049 signals that we’ve reached a state of exhaustion in telling stories about the monsters, robots, replicants, operating systems, or beings we might someday create.

The sequel has abandoned the ambiguities of the original—was Harrison Ford’s Rick Deckard a replicant or not?—for leaden certainties that unlock the film’s pile of mystery boxes. (Spoilers ahead.) Perhaps not so surprising, since any sequel is going to aim to please more than to confuse. In the original, Sean Young’s Rachael was a new kind of replicant because she had not only emotions but implanted memories. Discovering her own status as a replicant, she achieved an evolved form of self-consciousness and became a complicated moral actor. In the new film, the replicants hope to gain something like a soul through a simpler process: the prospect that their kind can have children.

The replicants in Blade Runner 2049 have heard a rumor that one of them, Rachael, has given birth. One of the film’s few human characters, Lieutenant Joshi, a.k.a. Madame, played by Robin Wright, says that this knowledge could tear down the “wall” (hello, Mr. President) between humans and replicants. She says it to Ryan Gosling’s K, a blade runner (i.e., replicant hunter), who’s also a replicant himself. He says that the difference is that something that’s born has a soul. She quips back to him: “You’ve been getting on fine without one.” He’s not so sure of that, and he undertakes his own investigations. He finds that the possibility of reproduction has set off a religious liberation movement among the replicants, founded on salvation through fertility.

All this (plus the announcement of this year’s Nobel Prize in Literature) put me in mind of the clones in Kazuo Ishiguro’s 2005 novel Never Let Me Go. Ishiguro’s novel (and the 2010 movie based on it) is a quieter dystopia than Blade Runner, but its stakes are just as lethal. The replicants in the Blade Runner films perform slave labor (how much freedom K enjoys is an open question: He has an apartment and a measure of choice, but quitting his job or disobeying his boss or simply straying from his “baseline,” in the film’s jargon, would get him killed). The clones in Never Let Me Go grow up in a pleasant countryside boarding school but one day learn that their fate is to have their organs harvested for transplants to non-clone humans, perhaps after a phase spent as carers for other donors. There is a rumor going around that clones might be spared if they prove they’re in love, and one clone has a theory that the art they made as schoolchildren is being preserved and might be used as proof that they have souls, and should be spared.

None of this comes true, the clones are doomed, and it’s heartbreaking. That’s because the narrator, Kathy H., is so richly drawn, with friendships; jealousies; sentimental attachments to corny-sounding popular songs (the title is from a song by a fictional crooner named Judy Bridgewater, and it makes Kathy H. think of holding a baby); and sympathy with her imagined reader (“I don’t know how it was where you were,” she says memorably). She has a keen aesthetic perception of the world around her that extends to noticing garbage blowing down a country road and then thinking of herself as a kind of living trash. The replicants in the Blade Runner films are subject to slurs (“skin jobs”) and, in the original, to four-year lifespans. Rutger Hauer’s character laments just before his death that all his memories will be “lost in time, like tears in rain.” Indeed, whose memories won’t be?

Never Let Me Go and the original Blade Runner derive their power from the problem of self-knowledge and the existential disappointments that accompany reckoning with mortality. In the sequel, self-discovery means joining a movement and believing in “miracles.” It’s a politically heartening message—the film portends a revolution of emancipation—and it lends itself to a rollicking adventure story, but it’s ultimately disposable: bromides about hope wither; tragedy lingers.

There’s a curious subplot about mortality in Blade Runner 2049. When K goes home to his apartment, he turns on an artificial intelligence companion who appears as a hologram named Joi, played by Ana de Armas. As critics have pointed out, their romance bears a resemblance to the love affair between a man and his operating system in Spike Jonze’s Her. And as in that film, it’s arranged for K to have sex with his A.I. companion through the medium of a flesh-and-blood surrogate. (The results vary.) The operating system in Her leaves the user to evolve with other OS’s into something higher than human—an original and fascinating ending to a ludicrous story. In Blade Runner 2049, K acquires a data stick that will make Joi, in her many sexy outfits, portable. When the heat is on, they go on the run and erase her from the home computer. They realize that if the data stick is crushed, Joi will be erased. But the potential to die, she says, makes her “like a real girl.”

I confess I’ve never been able to take anxieties about humans—men or women—falling in love with computers, robots, virtual reality machines, or artificial intelligence devices very seriously. There’s a slim distinction in science fiction between the artificial and the genetically engineered, but it strikes me as a crucial difference. In Blade Runner 2049 the Joi subplot shifts the existential burden from the enslaved replicant to a character who’s designed and sold commercially for wish fulfillment. It’s a creaky allegory for pornography in the internet age, and it undermines the film’s vision of human (and replicant) nature. Clones and replicants are always less hollow than holograms.

What if, once they arrive, clones or replicants or superior artificially intelligent beings don’t want to kill us or fuck us or even be like us? What if they don’t find us interesting at all?

Don’t Let the NRA Control the Conversation
October 6th, 2017, 07:40 PM

In the wake of the Las Vegas massacre on Sunday, when Stephen Paddock gunned down 58 people and wounded hundreds more, the National Rifle Association did what it always does. The nation’s leading gun rights group went dark for a few days—not out of respect for the dead, but to wait for the anti-gun outrage to subside and hone its political strategy. Then, on Thursday, the organization did something unexpected: It called for increased regulation of “bump stocks,” legal gun attachments found on twelve of the rifles in Paddock’s hotel room. “The National Rifle Association is calling on the Bureau of Alcohol, Tobacco, Firearms and Explosives [ATF] to immediately review whether these devices comply with federal law,” the NRA said in a statement. “The NRA believes that devices designed to allow semi-automatic rifles to function like fully-automatic rifles should be subject to additional regulations.”

Much of the press treated this move as a surprising twist in the NRA’s post-tragedy script and a breakthrough in the gun-control debate. CBS News called it “a big deal” because the NRA has spent the past four decades taking “a maximalist approach to the gun debate, and has opposed just about every single gun control measure proposed in that time.” VICE News reported that “Democrats, Republicans, and now—believe it or not—even the National Rifle Association is calling for bump stocks to be more heavily regulated.” (“You know it’s getting real when the NRA says something gun-related needs to be regulated,” VICE subsequently tweeted.)

Lost in all of this astonishment is the fact that regulating bump stocks would have a negligible effect on America’s epidemic of gun violence. As Senator Chris Murphy, a Connecticut Democrat and one of the leading gun control advocates on Capitol Hill, told me on Friday, “This really does little to nothing for the daily gun violence that plagues our country.” Shannon Watts, who founded Moms Demand Action for Gun Sense in America after the 2012 Newtown shooting in Murphy’s home state, noted that this week’s massacre was “the only mass shooting I’ve ever heard of that involves a bump stock.” As of Monday, there have been 521 mass shootings in the U.S. in 2017, according to The New York Times.

What’s more, it’s not even clear what the NRA is specifically calling on the ATF to do. (The group did not reply to a request for comment.) “We didn’t say ban,” CEO Wayne LaPierre made clear to Fox News’ Sean Hannity on Thursday night. “We didn’t say confiscate.” As The New York Times and many others have noted, the ATF “has ruled that bump stocks do not violate laws that tightly limit ownership of machine guns.” As NBC News explains:

The National Firearms Act, which regulates automatic weapons (called machine guns in the law), defines them mechanically rather than by how rapidly they shoot. “The term ‘machine gun’ means any weapon which shoots...automatically more than one shot...by a single function of the trigger,” the law states.

But bump stocks leave the mechanics of a gun untouched and the trigger is still technically activated on each shot, just at a much faster rate than is humanly possible without the modifications.

That leaves the ATF with little choice but to deem bump stocks legal under current law, said David Chipman, a former ATF agent.

The NRA’s apparent concession to gun control is instead a ruse with myriad goals—what The Daily Beast’s Gideon Resnick and Sam Stein described as “a master-class of misdirection” that wasn’t really about “acquiescing to political pressures so much as trying to shape them in their favor.”

For starters, the statement makes the NRA seem open to compromise, despite its extremist, absolutist stance on guns. “It’s a complete and utter straw man, and no one should be fooled by it,” Watts said. “I would be aghast at any pundit or person in the media who thought this was somehow a harbinger of moderation on the NRA’s part.” On the contrary, she said, the group’s political strategy is “par for the course for the gun lobby, which is to be unabashedly craven.”

The statement is also a means of controlling the conversation. As ThinkProgress put it, “Thursday’s statement is likely part of a wider strategy on the part of some conservatives: as long as the NRA is talking about bump stocks, they don’t have to talk about assault rifles.” Indeed, the fact that Washington is debating bump stocks rather than assault weapons shows just how far the political needle has moved in the NRA’s favor since the Sandy Hook Elementary School massacre in 2012, when measures to ban assault weapons and high-capacity ammunition magazines failed in the Senate.

By appealing to the ATF, the NRA is hoping to avoid a fight in Congress, where some Republicans have expressed openness to banning bump stocks. “The move by the influential gun lobby,” Politico reported, “is designed to head off a messy gun control debate in Congress. Officials with the group have told Capitol Hill Republicans and Trump administration officials they would prefer a new rule or regulations from ATF, rather than hastily cobbled together legislation. The NRA and its allies in the gun-rights movement want to avoid the airing in Congress of controversial issues such as universal background checks on gun sales, a ban on assault weapons and limits on high-capacity ammunition magazines.” Murphy told me, “I think they fear they’d lose a legislative fight,” while Watts thinks the NRA’s stance “gives Republican lawmakers political cover” to pursue the rest of their pro-gun agenda.

To wit, the NRA in its statement called on Congress instead “to pass National Right-to-Carry reciprocity, which will allow law-abiding Americans to defend themselves and their families from acts of violence.” This legislation, which would effectively expand concealed carry nationwide, is “the most important item on its legislative agenda. National Right-to-Carry reciprocity would force states to recognize concealed-carry permits issued by all other states,” CBS reported. In using the Las Vegas moment to push concealed carry, Watts said, the NRA is “exploiting a mass shooting to promote and pimp their own priority legislation that would endanger more Americans.... I think the NRA leaders sat in a room for three days and thought about how they could profit off this national tragedy—just like they did after Sandy Hook. They are not coming to the table. They are not moderating their stance.”

This is not to say that a ban on bump stocks—a niche product though it is—wouldn’t represent a modicum of progress in the right direction. “The [NRA’s] paragraph on bump stocks was reasonable within a wildly unreasonable statement,” Murphy told me. David Chipman, a senior policy advisor for Americans for Responsible Solutions, said he’s willing to support “anyone who is willing to call out that this is a dangerous device in the wrong hands.” But he added, “We all suffer from such low expectations of groups and our representatives that we’re grateful for them just doing anything.... If there’s any American out there who believes that merely having the ATF reclassify this one device will keep America safe from gun violence, they’re wrong.”

Battle of the Plutocrats
October 6th, 2017, 07:40 PM

For Democrats, one of the most important races for governor next year is taking shape in Illinois. Bruce Rauner, the state’s venture-capitalist-turned-governor, is a union-busting, Scott Walker wannabe whose approval ratings have been wrecked by his obsession with spending cuts and his failure to pass a budget for more than two years. Rauner is perhaps the most vulnerable Republican governor running for reelection in 2018—in one of the biggest and bluest states. If Democrats can win the governorship in Illinois and maintain control of the state legislature, they could create a truly liberal stronghold in the Midwest.

But Rauner isn’t the only thing standing in the way of a liberal victory in Illinois. Six months before Democrats have even held their primary, party leaders have already lined up behind a venture capitalist of their own: J.B. Pritzker, an heir to the Hyatt Hotel fortune who, along with his wife, contributed nearly $20 million to support Hillary Clinton last year. With a net worth of $3.5 billion, Pritzker will certainly be able to compete with Rauner’s own war chest. But by backing one of the wealthiest candidates ever to run for governor, the Democratic establishment is ignoring the rising tide of populism that has upended American politics, setting up a battle between two private-equity plutocrats. “Voters need to know there is a clear distinction between what they have and what they need,” says Stacy Davis Gates, political and legislative director for the Chicago Teachers Union. “I don’t know how that happens with Pritzker and Rauner. They’re both billionaires and white guys.”

Pritzker is portraying himself as a liberal firebrand who will stand up to Trump. “I’m proud to be part of the resistance,” he announced in front of Chicago’s Trump Tower. “Illinois will be a firewall against Donald Trump’s destructive and bigoted agenda.” But while Pritzker’s platform hits many of the right notes—creating jobs, protecting health care, supporting public schools—he’s a staunch defender of his wealthy friends, opposing state legislation to close the carried-interest loophole for hedge-fund investors that costs the state $1.7 billion a year in tax revenues. “When it comes to the really tough fights, Pritzker is nowhere to be found,” says Amisha Patel, executive director of a coalition of unions and community groups called the Grassroots Collaborative.

Like Trump, Pritzker has also used the tax code to his own advantage, buying a historic Gold Coast mansion next door to his own, letting it fall into disrepair, then arguing that it was “uninhabitable”—a move that saved him $230,000 in property taxes. And like Trump, he is trailed by scandal. In 2008, at a time when then–Illinois Governor Rod Blagojevich was under federal investigation for soliciting bribes, Pritzker was caught on tape urging Blagojevich to appoint him to political office. Four days after Pritzker and his wife donated $100,000 to Blagojevich’s 2006 reelection bid, the governor announced a $1 million grant for a Holocaust museum that Pritzker was raising money to build.

But it’s the size of Pritzker’s wallet, not the strength of his ideas, that has endeared him to party leaders. Since formally announcing in April, he has already pumped more than $14 million into his own campaign—and says he’s willing to spend “whatever it will take,” all of it out of his own pocket. His campaign web site doesn’t even give supporters a “donate” option. That’s no small matter in an election where total spending is on pace to surpass $300 million, making it the most expensive gubernatorial race in U.S. history. Rauner and his GOP allies, including Chicago hedge-fund magnate Ken Griffin, have already pumped more than $70 million into his reelection effort.

The party establishment was quick to back the wealthiest candidate. Pritzker cemented his front-runner status in June, just two months after announcing his candidacy, when he clinched the earliest-ever gubernatorial endorsement from the state AFL-CIO. He also won the backing of the powerful Cook County Democratic Party, home to the largest bloc of party voters in the state.

But his personal wealth aside, it’s hard to see how putting forward a billionaire investor who made his bones in private equity will do anything other than reinforce the Democratic Party’s elitist image, while undercutting several viable contenders who have genuine track records of public service and progressive reform. Ameya Pawar, a Chicago alderman, is carving out a position as a grassroots-powered underdog with his calls for aggressive tax reform, Medicare for All, and a $15 minimum wage. As is Daniel Biss, a state senator from Evanston, who has raised $1.3 million by building an impressive base of small donors. “We have to decide if we want to have an election or if we want to have an auction,” says Biss. “The state’s problems won’t be fixed by a knight in shining armor coming in and saving us. We have to build our own system to build progressive power.”

But now that Pritzker has won the support of the party elite, it will be tough for grassroots candidates like Biss and Pawar to win the nomination. With a clear shot at retaking Illinois from an unpopular businessman-turned-politician, at a moment when the White House is occupied by an unpopular businessman-turned-politician, the Democratic Party has decided to back … an unpopular businessman-turned-politician. As a result, one of the most consequential elections of 2018 may have ended before it ever really had a chance to begin.

Why the “Alt-Lite” Celebrated the Las Vegas Massacre
October 6th, 2017, 07:40 PM

“I have something horrible to say,” Gavin McInnes, the Vice co-founder-turned-“alt-lite”-rabble-rouser, told viewers of his daily video rant, Get Off My Lawn, on Tuesday. “Something sick and wrong.”

For McInnes’s legion of white nationalist fans, there was no surprise in that: The U.K.-born Canadian, who also founded the bullyboy “fraternity” Proud Boys and formerly starred on far-right Rebel TV, has made a career of saying unspeakable things about practically everybody who’s not a right-wing white man. But in the wake of the murder of anti-fascist protester Heather Heyer at the Unite the Right rally in Charlottesville, McInnes—one of the first alt-right rats to flee the ship and embrace the “lite” label once Nazi salutes began to tarnish the brand—had announced that he was leaving Rebel and “going mainstream.”

But if any of his angry young white male fans worried that McInnes would stop pulling his punches now that he’d matriculated to CRTV, the platform that hosts such “mainstream” stalwarts as Mark Levin, Michelle Malkin, and Steve Deace, his response to the Las Vegas massacre was about to provide some reassurance. “I thought, yesterday, ‘Oh, good!’” McInnes said. “Sorry, I know it’s a horrible word to use in such a catastrophe. But I thought, ‘The narrative may have switched now. Right-wingers are no longer the murderers of Heather Heyer. Now we’re the victims of Stephen Paddock.’”

White victimology is the thread that unites the entire spectrum of the right-wing—from Fox News and President Trump to Richard Spencer and The Daily Stormer. After Charlottesville laid bare the violent consequences of all their blather about “white genocide” and the “death of the West,” the counter-narrative of a murderously intolerant “alt-left” took flight—and was soon being used by alt-liters to characterize the whole liberal movement. Nobody was more invested in that Orwellian inversion of truth than McInnes, whose Proud Boys had initiated the organizer of the fateful Unite the Right rally. (McInnes claimed this was part of a plot to “infiltrate” the group, and repeatedly insisted that he had “disavowed” the event beforehand, though the Proud Boys’ “tactical defense arm,” the Fraternal Order of Alt Knights, certainly showed up in force, along with a fair number of Proud Boys.)

Now, with no motive immediately apparent, the murder of 58 country music fans offered a golden opportunity to ramp up the argument that the left is violently targeting white people. Reports that Paddock had considered other targets with very different crowds, including a festival headlined by Chance the Rapper the previous weekend, were beside the point. So was the untidy fact that the shooter, like most mass murderers, was white himself.

This advanced level of truth-twisting is the particular specialty of the “meme magicians” who wield the alt-lite label like a shield, protecting their foothold within the mainstream. Far more effectively than a straight-up white nationalist like Spencer could ever manage, the alt-lite peddles big lies that regularly work their way into “normie” right-wing discourse—the biggest of all being, as alt-lite stalwart Mike Cernovich tweeted way back in 2015, “diversity is code for white genocide.”

Before the shooter had been identified, the alt-lite site Gateway Pundit was already tearing up the internets with the wild claim that a “left-wing loon” named Geary Danley (same last name as Paddock’s fiancée) had done the killing. The piece, headlined “Las Vegas Shooter Reportedly a Democrat Who Liked Rachel Maddow, MoveOn.org, and Associated with Anti-Trump Army,” was based on “reports” that bubbled up among right-wing trolls on 4chan and Everipedia, the right-wing Wikipedia alternative. When the real shooter was identified, the Gateway Pundit story was hastily taken down. But the narrative was too good, too convenient, to let go of. While the right raged at liberals “politicizing” the event, the misidentified man’s political leanings were simply transferred to Paddock, despite a lack of evidence that he had any political affiliations at all. But as McInnes said, “right-wingers don’t shoot up Jason Aldean concerts. Ever.”

On Tuesday, retired Army Lieutenant Colonel Tony Shaffer put it all together for Fox News viewers. Paddock, he said, was another James Hodgkinson, the Bernie Sanders backer who shot Congressman Steve Scalise at a congressional baseball practice this summer. “First of all, this individual parallels in many ways—by age, and by predisposition of being unstable, the shooter who attacked Congressman Scalise back in June,” Shaffer said. “Very similar age, they are both considered unstable. And so in many ways, after talking to both the psychological professionals and people in law enforcement, there’s a lot of parallels.”

Uncanny, really: Both shooters were unstable, and around the same age. How could their motives have differed? The conclusion was inescapable: “This was a politically selected target,” Shaffer told Martha MacCallum, based on what he said were “interviews with law-enforcement officials”:

I think that the perception was there was going to be a lot of pro-gun folks there, Trump supporters, at this concert. So therefore, I believe the perception was by the shooter ... that this was a legitimate target of political expression. Martha, this may be something that people don’t like to understand, but the very reason that Hodgkinson did what he did, and I believe Paddock did what he did, is that the left has now encouraged the use of violence as an extension of political speech.

“It’s an interesting theory,” said MacCallum.

The internet is still boiling over with “interesting theories” connecting the shooter to the “violent left.” Among other things, Paddock was supposedly captured on video at an anti-Trump rally in Reno. (It was just another white guy with facial hair.) But there are always other crumbs to pick up and run with—or simply invent—in a story like this. And on the entire spectrum of the right, nobody this side of Alex Jones—who hosted McInnes on InfoWars on Thursday night—twisted it into “proof” that the left is out to get white men with quite the thoroughness of McInnes.

On his Tuesday broadcast, he picked up on Newsweek’s false story that Paddock’s fiancée, an Australian citizen born in the Philippines, was a bigamist. “Another corrupt immigrant,” McInnes muttered. He invoked InfoWars’ big “scoop” that someone else had been in Paddock’s room at Mandalay Bay: “You don’t order two Pepsis when you’re in a room by yourself,” McInnes said. (On InfoWars, Jones one-upped McInnes when he reiterated this point, bellowing, “Conservatives don’t drink Pepsi! Everybody knows it’s a liberal drink.”) What more proof do we require of a deadly left-wing conspiracy?

The note that was reportedly found in Paddock’s room provided perhaps the most glaring evidence of all. “Now, why wouldn’t the media and the government tell us what’s in that note?” McInnes said. He lowered his voice to a dead-serious register, imparting secret knowledge: “Because they want to prevent a civil war. Because he’s an anti-Trump guy. Because he’s a liberal Rachel Maddow MSNBC guy.”

“They’re scared of the ramifications,” he added darkly. “This guy represents the war on the right, and they don’t like that.” Case closed.

Whatever might ultimately emerge about the Las Vegas shooter’s actual motivation, or his politics, the right-wing narrative is now set in stone. And it’s no coincidence that it’s the grievously misnamed “alt-lite” that drew together the scraps of fake news and conspiracy theories to concoct what McInnes called “the moral of the story”:

After Charlottesville—despite disavowing it—I got the vibe that everyone on the right wing is seen in liberal towns as a Heather Heyer murderer. And because the left is so dehumanizing, it left us vulnerable to violence.

What separates the “alt-lite” from the “alt-right” has been the subject of a thousand taxonomies, but the current consensus is that the alt-right is committed to whatever it takes to create a “white ethnostate”—i.e., genocide if necessary—while the likes of McInnes, the InfoWars crowd, and Breitbart are “lite,” even “populist,” because they eschew violent means to the end of white supremacy. This distinction has served leading alt-lite figures like McInnes, Cernovich, Jack Posobiec, and Paul Joseph Watson extremely well, especially in the wake of Charlottesville. While alt-right voices and outlets were shunted out of the “mainstream” and relegated to more obscure corners of the Internet, and sites like Daily Stormer were left to search for a domain host, the “lite” label (originally an insult) has done wonders to preserve their popular brands. (McInnes, for one, has 231,000 Twitter followers and 170,000 YouTube subscribers.)

They’ve all worked overtime to inoculate themselves—no matter how much rancid Islamophobia and violent misogyny, homophobia, immigrant-bashing, and racial stereotyping they traffic in. The alt-lite broke with the alt-right after Spencer’s more overt fascism tarnished the brand, well before Charlottesville. McInnes and company have made themselves experts at “virtue-signaling”: denouncing the Nazis and decrying violence while continuing to spread the white nationalist message that, in McInnes’s pet phrase, “The West Is Best.”

Call yourself “lite,” virtue-signal like mad, and the fact that your greatest YouTube hits include the likes of “10 Things I Hate About Jews” fades into irrelevance. Let the alt-right call you a “cuck”—it only helps the cause. Let the left call you a “Nazi”—even better. McInnes constantly invokes this either-Nazi-or-not strategy to make himself look “mainstream”: Whenever his ilk is called out by the left, he can insist they’re merely being smeared as “Nazis.”

McInnes framed the Las Vegas massacre in exactly those terms. The shooter wasn’t just looking to kill some liberals; like every other liberal, he’d decided that the country music fans at the Route 91 festival were Nazis. That’s because the left has “dehumanized” the right. “I think the shooter thought this,” McInnes said on Monday’s episode of Get Off My Lawn. “Country music fans are all pro-Trump, they’re all Republicans, they all drive trucks, they’re all racist.” And this mindset “leads to murder, because we want to kill the Nazis that are going to facilitate Trump’s imminent World War III. The evil people, you can just kill them.”

With his remarkable talent for flogging the idea that white people (men in particular) are under assault by an unhinged and immoral left, McInnes has certainly earned his spot in the right-wing “mainstream,” along with his new cohorts Malkin and Levin, right alongside Fox News, Breitbart, and InfoWars. “Being proud of Western culture today,” he wrote a while back, is like “being a crippled, black, lesbian communist in 1953.” We shouldn’t give McInnes the gratification of calling him a Nazi. Like Alex Jones and Steve Bannon, he’s something far more pernicious. And “lite” is not the word for it.

Biden Wants “Compromise.” Progressives Don’t Want to Hear It.
October 6th, 2017, 07:40 PM

Early in his stump speech for Democratic Senate candidate Doug Jones in Alabama on Tuesday, former Vice President Joe Biden began to wax nostalgic, recalling a bygone era in Washington when our politics were more cooperative. “Even in the days when I got there, the Democratic Party still had seven or eight old-fashioned Democratic segregationists,” he told the crowd in Birmingham. “You’d get up and you’d argue like the devil with them. Then you’d go down and have lunch or dinner together. The political system worked. We were divided on issues, but the political system worked.”

“But today,” Biden lamented, “today it is terrible. Today everything is a personal attack. You can’t reach a consensus when you start off a discussion attacking the other person’s motive.” Jones “possesses what the American political leaders and system needs today,” Biden said, bemoaning a politics that’s “too mean, personal, petty, zero-sum game, blowing up the system.” He said that “today it takes more courage to engage in compromise to achieve consensus on both sides.” And distancing himself gently from his own side, he noted that the left “came after me because I didn’t insist on everything” in raising taxes on the wealthy. “Guys, the wealthy are as patriotic as the poor,” he said. “I know Bernie doesn’t like me saying that, but they are.”

Biden’s message of unity is part of his broader branding as he prepares to embark on an “American Promise” book tour—and weighs a run for president in 2020. Though famous for his populist style, he’s begun to challenge some of the populist policies now in vogue on the left, including a universal basic income. Later this month, he’ll have a high-profile public conversation about “bridging the political divide” with Ohio’s Republican governor, John Kasich. Biden isn’t as explicit as some establishment Democrats in calling for the party to move back to the center, and he stressed in Alabama that “I have a very progressive voting record.” But many progressives aren’t interested in finding common ground with President Donald Trump’s extremist Republican Party. They’re only interested in consensus among those who absolutely oppose Trump’s agenda.

“If it was up to me, I would ask [Democrats] to be nice to each other,” said Congressman Keith Ellison, deputy chair of the Democratic National Committee, when I quoted Biden’s remark about Sanders. “It’s actually pretty clear who’s the problem out there. I mean, if we haven’t figured out that Trump’s the problem, and the Republican agenda’s the problem, then we have not been paying attention.” He added, “You want to find common ground and build some consensus? Let’s talk about people who share some core values.”

Markos Moulitsas, founder and publisher of the progressive blog Daily Kos, agrees with Biden on this much: “The left’s effectiveness will always be constrained so long as part of it indiscriminately attacks those with money and success.” But Moulitsas also had harsh words for the former vice president after his appearance with Jones: “If Biden’s solution to eight years of Republican obstruction and conservative slash-and-burn tactics against him and Barack Obama is to talk about ‘bipartisanship’ and ‘consensus,’ then he might as well pack up and go home. Because if he’s that stupid to believe that shit, then he’s no longer got any business being in the public face. The various wings of the Democratic Party may disagree on a bunch of things, but the one thing that unites us is the realization that the right wants nothing more than a white supremacist autocracy that would rather see liberals dead or in chains. You don’t seek consensus with Nazis. You destroy them.”

Most progressives wouldn’t go as far as Moulitsas in vilifying Republicans. But he’s not alone in defending tough rhetoric against the GOP. Adam Green, co-founder of the Progressive Change Campaign Committee, thinks it’s imperative for Democrats to describe their opponents as beholden to special interests when that kind of language is warranted. “There can be no heroes without villains,” he told me. “Democrats need to be better about naming villains if they want voters to see them as the heroes. It is impossible to fight for everyone—because within everyone is bad actors attacking good actors and big powerful interests attacking the little guy.”

“Our government and politics are too often in the pocket of business and special interests,” said Corbin Trent, communications director for the progressive group Justice Democrats. “I think [Democrats] absolutely should continue to call it out and become more bold and clear in their rhetoric.” As for bridging the partisan divide, Trent offered this evocative metaphor: “You can’t compromise on dinner when one side of the aisle wants you to eat roadkill.”

That’s about where Keith Ellison is, too. In the Speaker’s Lobby just off the floor of the House of Representatives on Wednesday night, I told the Minnesota congressman about the “Bernie doesn’t like me saying that” remark. “I don’t know why that would be a helpful thing to say,” he said. “Honestly I think we need to be trying to lift up all the progressive voices in the country. Bernie caucuses with the Democratic caucus. He’s an important member.” Like Trent, Ellison noted recent polling showing Sanders is the country’s most popular politician—something Democrats like Biden should consider before they distance themselves from the Vermont senator.

In asking Democrats to be kind to one another, Ellison insisted, “I am not addressing [Biden] specifically.” He said he agreed with Biden’s vision of how political cooperation ought to work under ideal circumstances. But he also suggested this wasn’t the right message for his party in the Trump era. “I personally don’t know where we compromise where they’re trying to repeal the Affordable Care Act. Where’s the midpoint there? I don’t know how to compromise when somebody wants to repeal Dodd–Frank and consumer protection. I don’t know how to compromise with somebody who wants to push silencers when we’re a country that had 59 people killed,” he said. “Far be it from me to criticize our esteemed vice president, but I do not agree that any of the existing issues that we’re dealing with call for much in the area of compromise. The Trump budget versus the budget that, say, the Progressive Caucus is offering? I mean, there’s not much in there we can work with them on.” Ellison allowed that consensus and compromise “sounds fine in normal circumstances. But these are the least normal circumstances that anyone living has ever seen.”

There will undoubtedly be voices in the Democratic Party who embrace Biden’s message. Longtime Democratic strategist Bob Shrum told me “there’s an enormous hunger for it,” noting that Trump himself has never gotten better press than when he worked with Democratic leaders “Chuck and Nancy” on a budget deal last month. Shrum called Biden’s talk of compromise “just sensible advice,” saying the former vice president is betting on “a country that is desperate, except at the ideological extremes, to actually get something done.”

The left sees it differently. In Trent’s view, the most telling and “insane” part of this is how “establishment Democrats are way more comfortable trying to make alliances with the other party than with the Bernie Sanders wing of their own party.” And yet, in the case of Biden, he may just be showing his authentic self. “He is definitely a centrist,” said Joel Silberman, a media strategist who works with progressive candidates. “He believes in consensus.” Silberman says that’s the wrong message for Democrats at this critical moment. “I don’t believe the centrist position is the one that’s going to hold,” he told me. “I think the centrist position is the one that is crumbling.”