An airplane is the perfect, and perhaps the only, place to actually read Monocle, the globe-trotting lifestyle magazine founded in 2007 by Tyler Brûlé, a Canadian editor and erstwhile war correspondent. Its logo, an “M” with a twisted loop inscribed in a circle, lurks at airport terminal bookstores all over the world, the magazine’s glossy black cover—which the late David Carr likened to “a slab of printed dark Belgian chocolate”—conveying a placeless, easily translated sort of luxury. Inside, one encounters articles on Canadian soft power, Latin American soap operas, and Finnish domestic architecture: the casual reading of an armchair diplomat.
Over the years, Monocle has become as much a status symbol as reading material. Its editor is one of the world’s foremost lifestyle auteurs, a tastemaker of late capitalism and “a Martha Stewart for the global elite,” per New York magazine. Brûlé has promoted his personal version of the good life via high-profile columns in T magazine and, more recently, the Financial Times, where he reflects on travel etiquette and philosophizes about “relaxation” for the one percent. Monocle’s tote bags, which come with the $130 annual ten-issue subscription price, mark an itinerant tribe of 80,000 who may identify more with the magazine than with the country on their passports.
As older titles like GQ, Vanity Fair, and The Economist lose their grip on millennials, the Monocle Man—spot him by his sharp suit that ends at bare ankles, glasses with prominent frames, and wide, unbuttoned collar, just like Brûlé himself—is alive and well. In 2014, Nikkei, the Japanese media conglomerate, bought a minority share in Monocle at a purported valuation of $115 million. Although the company is private and doesn’t release metrics, it has only grown since then, launching new publications and opening several retail spaces. The editors celebrated their 101st issue this March with an understated redesign. You can’t overhaul a classic, so not much has changed: The header logo is a little bigger and the layouts less cluttered. It’s more of a victory lap than a pivot.
While Monocle projects confidence in the march of globalization, it barely hints at the growing threats to the world of open borders and free-flowing capital it depicts. The magazine’s globalist chic contrasts sharply with the nationalist movements in the United States and Europe seeking to limit immigration, including visa programs for the skilled workers in tech and finance who might read Monocle. Yet the publication shares with the right a faith in free-market economics; Brûlé himself is less a citizen of the world than a shopper in its gigantic, globalized mall. His magazine, which built its brand by identifying the world’s hippest (and most profitable) trends, feels increasingly out of touch.
Jayson Tyler Brûlé’s long, careful path to world domination began in rural Winnipeg, where he was born in 1968 to a Canadian football player father and an artist mother. His parents, rumor has it, did not include accents over their last name. In interviews, Brûlé likes to say that his international sensibility formed during his childhood, inspired perhaps by his mother, an Estonian immigrant. News and decorating magazines were plentiful in the homes the family passed through in Ottawa, Montreal, and Toronto, as was Danish furniture. Aspiration came early. When he was 14, Brûlé cleaned yachts for a summer job and bought a Rolex with the proceeds.
Idolizing his fellow Canadian, the ABC news presenter Peter Jennings, Brûlé aspired to be an anchorman himself, and in his early twenties worked in London for the BBC. He reported freelance from Afghanistan for several years, and in March 1994 he was shot twice in a sniper attack, leaving him with little use of his left hand. While recuperating from surgery back in London, he read home decor and cooking magazines and contemplated, as we all must, how to live. In 1996, Brûlé took out a small-business loan to launch Wallpaper, a design and lifestyle magazine. Its polished photo shoots and furniture-as-fashion aesthetic paved the way for commercially and critically successful publications such as Dwell and Apartamento.
Brûlé’s description of the ideal audience for Wallpaper is often quoted: “I call them global nomads,” he told The New York Times in 1998, before the term had become a technology-inflected cliché. “Whether they’re a West Coast snowboarder, a copy writer for a hot advertising firm in Stockholm, or a grunge kid working in an indie record shop that suddenly got a film deal, there’s a degree of affluence all of a sudden.” In other words, bohemians made good who could suddenly spend a lot of money, anywhere, fast. They were a new archetype, with taste, capital, and mobility in equal measure. The magazine’s hopeful globalism, meanwhile, fit with the end-of-history triumphalism that permeated the late ’90s, in the wake of the Cold War but before September 11.
The concept was a hit. In 1997, after only four issues, Time Warner acquired Wallpaper for $1.6 million. Brûlé was all of 29 years old. He stayed on as editor in chief until 2002, when, chafing at corporate leadership, he left. (Another version of the story says that he and Time Warner clashed over his habit of indulging in “private helicopter rides.”) Having signed a noncompete that prevented him from starting another magazine, Brûlé launched Winkreative, an advertising agency for clients that might have bought pages in Wallpaper: tourism boards, airlines, and luxury condo developments.
After his noncompete expired, Brûlé assembled a few wealthy investors for Monocle, “a briefing on global affairs, business, culture & design,” according to its cover slogan for the first hundred issues. In some ways, the timing did not seem opportune: The first issue appeared in February 2007—on the eve of the financial crisis, and at the start of a sharp decline in the fortunes of print media. But Monocle performed a crucial function for the wealthy and those who market to them: Its content provided the global elite with a fresh definition of luxury and a renewed sense of confidence in the material rewards of the economic system that had just failed. A recurring column in the magazine analyzes a world leader’s personal style, and each year Monocle prints a Quality of Life Survey that suggests which city you might move to if you could move anywhere. (Tokyo was the 2016 winner. The same city also won in 2015. In 2014, it placed second.) Monocle gave its readers—and those who aspire to be like them—a way to define themselves and recognize each other, making Brûlé’s company an influential, if unlikely, force in publishing.
Monocle hasn’t just given globalized capitalism a hip aesthetic; it has also operated skillfully as a business in the many markets it both covers and covets. In the magazine, it’s next to impossible to discern what space is paid for as advertising and what is not. In the March 2017 redesign issue, for instance, there are “collaboration” ad packages with the nations of both Thailand and Portugal; each package appears next to unpaid Monocle editorial content about said countries. These are less evenhanded appraisals than buoyant travel guides to new resorts, museums, and shopping districts—the flows of creative capital, broadly speaking, that one might experience while doing business in such places.
This murkiness stems directly from Monocle’s business structure. Under an umbrella entity incorporated in Switzerland called Winkorp, Brûlé’s ad agency, Winkreative, sells creative services to companies that also often buy ads in Monocle. Both firms share a London headquarters that Brûlé, a Japanophile, has branded Midori House. The model is similar to that of the T Brand Studio at The New York Times, The Washington Post’s Brand Studio, and other recent in-house creative agencies developed by legacy media brands. Winkreative, which predates them by a decade, must inspire jealousy both for its financial success and for its weak business-editorial divide.
Brûlé has positioned himself at the head of a vertically integrated and ever-expanding media empire. On top of the Winkorp cake, he has added Monocle-branded clothing lines that can be bought through the magazine; hardcover books about home decoration and nation building; a 24/7 streaming radio station; retail stores around the world; and cafés in Tokyo and London that serve up a wan, placeless cuisine of Monocle chicken katsu sandwiches and Monocle taco salad bowls. The magazine is essentially a giant branding opportunity, an omnipresent advertisement for the true product: the Brûlé lifestyle.
It’s a way of life that has increasingly come under fire from political leaders. “If you believe you are a citizen of the world,” British Prime Minister Theresa May pronounced last year, “you are a citizen of nowhere.” May became prime minister in no small part because she argued that the people of her country couldn’t trust the Monocle class—not just the Eurocrats in Brussels, but also the global financial elite and PR gurus like her predecessor, David Cameron—who were too untethered to act in the best interests of the nation. The rise of May, the vote for Brexit, the election of Donald Trump—all represent a conscious renunciation of the globalist ideal that Monocle has helped to cultivate.
When Brûlé was recently asked about May’s observations on “citizens of nowhere,” he neither engaged with the discontents of globalization—the real-world worries over lost jobs and the erosion of local identity—nor did he make the case for his magazine’s brand of enlightened cosmopolitanism. He simply dismissed critics like May as “silly people.” “This is the way of the modern world,” Brûlé told the South China Morning Post. “It’s not just nationalism, it’s petty.” In other words, putting up barriers is more than a political mistake; it’s tasteless—a cardinal sin in the Monocle theology.
With the recent redesign, some glimmers of political reality are beginning to enter the magazine’s editorial voice. The new page layouts are more text-heavy, with longer articles and fewer glossy photos and twee spot illustrations. The content has a new seriousness, though it remains ever-optimistic. In an interview for the March issue, the CEO of Lufthansa says he is confident that globalization “cannot be stopped or slowed down, even though some people are trying hard.” The president of Portugal, adopting the vocabulary of a start-up founder, pitches his country as “a platform between cultures, civilizations, and seas.” (“We were an empire,” he reassures readers, “but not imperialistic.”)
Monocle views the world as a single, utopian marketplace, linked by digital technology and first-class air travel, bestridden by compelling brands and their executives. Diversity is part of the vision—the magazine’s subjects are from all over the world, and its fashion models come in every skin color—but this diversity is presented, in a vaguely colonialist way, more as a cool look to buy into than a tangible social ideal. Cities and countries are written up as commodities and investment opportunities rather than real places with intractable problems that require more than a subsidy to resolve. If London is too expensive, Brûlé proposes, why not found your next business in Lisbon, or Munich, or Belgrade? If you don’t, someone else will, and you might just get priced out again.
The magazine doesn’t idealize homogeneity of race or gender norms, but rather a global sameness of taste and aspiration. Every Monocle reader, regardless of where they live or work, should want the same things and seek them out wherever they go in the world, forming an identity made up not of places or people but of desirable products: German newspapers, Thai beach festivals, Norwegian television. The end result of this sameness is that a country can pitch itself to the monied Monocle class simply by adopting its chosen signifiers, or hiring Winkreative to do it for them in a rebranding campaign. In this way, the magazine warps the real world in its own editorial image.
A recent article in The Guardian about Lisbon described the city as embracing “Monocle urbanism,” a shorthand for all that the magazine glorifies: plentiful local culture, a relaxed pace of life, modernized airports, and co-working spaces. The new Lisbon resembles “a speeded-up east London,” as the city transforms from off-the-grid backwater to Airbnb-infested production hub of the international creative elite. Capital, and with it cultural capital, floods to the place where it can most efficiently reproduce itself, places that Monocle takes pains to identify and share. In 2015, in fact, the magazine hosted its very first Quality of Life Conference—in Lisbon.
It’s as easy to be charmed by Monocle as it is to hate it. Who doesn’t like a good Japanese leather origami bag? But if nationalists have a point in decrying the “global citizenship” that Monocle epitomizes, it lies in the magazine’s subtle approach to cultural homogenization. Brûlé’s stylistic vision has reproduced itself to the point of banality: Whether due to his own efforts or to the changing tide of taste, Danish furniture, clean cafés, shared offices, and artisanal food and clothing can now be found everywhere, attracting a floating tribe of international consumers the way flowers attract bees. The magazine’s worst offense may be that it is boring.
Now that Monocle has helped make the global lifestyle so ubiquitous, giving it up is hard, even if you don’t believe in it. British Home Secretary Amber Rudd wants to implement “barista visas” post-Brexit, to enable Europeans to work in U.K. bars and coffee shops for two years, without claim to housing or other benefits. The Brexiteers want strong borders, but not at the expense of their daily flat whites—which are, in fact, an Australian innovation. Mobility to travel, consume, and work is a luxury primarily available to the wealthy; the poor and disenfranchised find themselves blocked on every front. Refugees and asylum seekers don’t number among Brûlé’s “global nomads,” nor do itinerant workers who can only travel at the mercy of ever more stringent visa regulations.
The challenge confronting globalization is not, in the end, one of individual style. It is one of inclusion and independence. Is it possible to form a truly global community without sacrificing local identity and self-determination? While the vision formulated by Monocle may have been profitable, it no longer looks plausible. In the first century A.D., the Roman philosopher Seneca wrote, “One should live by this motto: I was not born to one little corner—this whole world is my country.” Today, to the satisfaction and benefit of Monocle’s globe-trotting readership, the whole world may indeed be our country. But we’re still deciding what kind of country it is, and whether Brûlé’s target audience will be the ones who ultimately control it.
When it comes to accommodating religious individuals or groups, the government has always been given a significant amount of discretion. For example, churches don’t pay many taxes, and they’re exempt from certain anti-discrimination laws when it comes to things like hiring clergy members. But the benefits that accrue from the First Amendment’s protections of religious freedom have historically included limitations on the government’s ability—and obligations—to fund church activity. Thanks to the Supreme Court, that symmetry is disappearing.
The Court on Monday struck down a major barrier between church and state, ruling for the first time ever that the government must give direct cash aid to a church. In a 7-2 ruling, the justices found that Missouri violated the Free Exercise Clause when it excluded Trinity Lutheran Church from a reimbursement grant program to resurface its playground with recycled tire scraps. The exclusion was based on a provision in Missouri’s constitution prohibiting the use of public funds for church aid.
According to Chief Justice John Roberts, Missouri discriminated against Trinity Lutheran “simply because of what it [i]s—a church.” “There is no dispute that Trinity Lutheran is put to the choice between being a church and receiving a government benefit,” he wrote in the court’s majority opinion in the case, Trinity Lutheran Church of Columbia v. Comer.
If a state denies public funding to a religious group, the chief justice said, it has to have a compelling reason for doing so beyond mere “policy preference.” It seems that the guiding principles of the Establishment Clause, which established the church-state divide, were not enough.
Justice Sonia Sotomayor addressed this glaring omission in her searing dissent. “The Constitution creates specific rules that control how the government may interact with religious entities,” she wrote, joined by Justice Ruth Bader Ginsburg. “And so of course a government may act based on a religious entity’s ‘status’ as such.” Rather than “disfavor” religion, a state’s decision not to fund a church is a “valid choice to remain secular,” according to Sotomayor.
The chief justice remained silent on Sotomayor’s point: that the government treats religious entities differently all the time. But with Monday’s decision, the Supreme Court chipped away at the discretion states have to exclude religious entities from certain government programs.
“You’re now saying that the First Amendment not only permits but requires the government to use taxpayer funds for a church,” said Greg Lipper, a former senior litigation counsel for Americans United for Separation of Church and State. He called the decision a “minefield in playground’s clothing.”
The decision is best understood in the context of three parallel trends: the outsourcing of public services to private institutions; the increasing participation of religious groups in providing those services; and the growth of the religious exemption regime through wide-ranging laws like the Religious Freedom Restoration Act. As a result, public services are being replaced by “private programs that can place all sorts of religious- and cultural-based tests on who gets those services,” according to Lipper. Many religious groups are insisting that they be treated neutrally when it comes to benefits but receive special treatment when it comes to regulations.
A press release published Monday by the Alliance Defending Freedom, which argued the case on behalf of Trinity Lutheran, highlights this contradiction.
“We didn’t ask for special treatment,” said ADF President Michael Farris. “We asked for equal treatment for people of faith. And the court agreed that the government cannot discriminate against people of faith by treating them unequally.”
Also on Monday, the Supreme Court agreed to hear Masterpiece Cakeshop v. Colorado Civil Rights Commission, in which a Colorado baker is seeking exemption from a generally applicable anti-discrimination law because of his religious views. Lawyers from the ADF are representing the baker, who is basically asking for special treatment.
Viewed in this light, the court’s ruling in Trinity Lutheran is “an asymmetry that can only be explained by a preference for religion,” according to Nelson Tebbe, who teaches constitutional law at Brooklyn Law School. Caroline Mala Corbin, a law professor at the University of Miami, agreed. “The bottom line is an incredible privileging of religion,” she said.
But Missouri had an Establishment Clause concern that went beyond protecting secular values. It’s worth noting that religious groups weren’t unanimously on Trinity Lutheran’s side in this case. Several, including the Baptist Joint Committee for Religious Liberty, the General Synod of the United Church of Christ, and the Union for Reform Judaism, submitted briefs to the Supreme Court in favor of Missouri, arguing disestablishment is necessary to protect individual religious liberties and the rights of religious dissenters.
After all, the Establishment Clause was written in large part to protect religion from government intervention, as well as to prevent some religious groups from being favored over others.
When I spoke with Michael McConnell, a former judge on the Tenth Circuit Court of Appeals who now teaches at Stanford Law School, he said the Trinity Lutheran ruling was “spot-on, exactly right.” He echoed the chief justice’s conclusion that the rubberized playground grant is an entirely secular public benefit akin to police and fire protection.
And yet fire protection is granted to everyone equally. The scrap tire program required applications, and grants were awarded to only 14 of the 44 applicants. The grants were more likely to go to majoritarian religious groups with enough funding, membership, and connections to make it to the top of the application pile.
Justice Sotomayor raised this concern in her dissent, writing that the program favors “those with a belief system that allows them to compete for public dollars and those well-organized and well-funded enough to do so successfully.”
Beyond hampering the state’s interest in maintaining secular policies, Monday’s decision is a “terrible blow” to the Establishment Clause values that protect religion, according to Corbin, who astutely warned, “We are not a point in time in our history where we should be ignoring the ramifications of a decision on minority religions.”
Lisa Durden, an adjunct professor at New Jersey’s Essex County College, appeared on Tucker Carlson’s primetime Fox News show earlier this month to debate the merits of a blacks-only Memorial Day party held by a New York City chapter of Black Lives Matter. Carlson, as he does, provoked Durden. “You’re demented, actually,” he said. “You’re sick, and what you’re saying is disgusting.” Durden, a media personality herself, did not respond in kind. “Boo-hoo,” she said, “you white people are angry because you couldn’t use your ‘white privilege’ card to get invited to the Black Lives Matter all-black Memorial Day celebration.”
Two days later, Essex County College suspended Durden, and last week she was fired. Because of Durden’s appearance, school president Anthony E. Munroe wrote on Friday, “The College was immediately inundated with feedback from students, faculty and prospective students and their families expressing frustration, concern and even fear that the views expressed by a College employee (with influence over students) would negatively impact their experience on the campus…. While the adjunct who expressed her personal views in a very public setting was in no way claiming to represent the views and beliefs of the College, and does not represent the College, her employment with us and potential impact on students required our immediate review into what seemed to have become a very contentious and divisive issue.”
This justification for Durden’s firing, while disappointing, didn’t surprise me. I’m also a college professor, and appeared on Carlson’s show in April to defend my New Republic article about why colleges have a right to reject hateful speakers. While my appearance didn’t generate the same controversy as Durden’s, a wave of people contacted my school, Colby College in Maine, in an attempt to have me fired (they apparently missed the irony of trying to get someone fired for their speech, about speech, because they disagreed with that speech). In addition, my colleagues in the English department, plus a few lucky senior administrators, have been hapless recipients of racist and anti-Semitic diatribes, thus burdening our IT staff.
Meanwhile, I received thousands of insults and threats. Beginning mere minutes after my appearance, I was deluged with emails and instant messages calling me a “fucking liberal idiot,” “pussy snowflake,” “ignorant and hypocritical cunt,” “fucking Nazi,” and “Jew fag.” Strangers threatened to break my legs, scalp me, and make me “eat a bullet.” One wished, in all-caps, “Hopefully you get robbed and killed by an illegal immigrant that was deported five times and he leaves you to bleed to death slowly so you have time [to] realize how fucking stupid you were all along.” There have also been repeated and likely actionable defamation attempts, which I’m still dealing with. (It’s also worth noting that I’m a straight, white, male, tenure-track professor at a top-tier institution that supports my public engagement, which is to say my experience with strangers eroticizing my slow death has been less traumatic than most professors’. I can only imagine—though I’d rather not—the bile directed privately at Durden.)
This was all in response to my careful argument about how campuses should handle invitations to speakers who are intentional provocateurs. Indeed, I had to remind Carlson and his audience no fewer than three times, in a roughly seven-minute TV segment, that I stand emphatically opposed to any sort of violence, from the left or the right, that would shut down free speech. While the right was calling for my job (and my head), I was meeting and corresponding with members of Colby’s chapter of the Young America’s Foundation, a conservative youth organization, to discuss campus free speech issues. Over five years, in hundreds of pages of teaching evaluations by my students, I haven’t received a single complaint about political bias.
Nevertheless, Carlson and other members of the right-wing media push the contradictory argument that the campus left is at once fragile and bloodthirsty. Liberal students and professors are incapable of dealing with ideas they disagree with, so they either retreat to safe spaces or use violence to shut down opposing points of view. Right-wing media have been selling the oxymoronic image of the “snowflake” “mob” for a while now, but have recently shifted emphasis, framing political violence on campus as a particularly left-wing phenomenon. Students who protest peacefully get lumped in with Antifa, a radical wing of leftist activism.
I’m not here to brush off violent left-wing protest. While I believe campus communities have good reason to disinvite or prohibit speakers like Charles Murray or Ann Coulter, I oppose the use of violence to shut down these speakers—not only because people get hurt, but because it creates an environment in which the threat of violence can indeed stifle speech, and not only overtly bigoted speech. Just as peaceful student protesters on the left don’t deserve to get caught up in violent, outside agitation, conservative students don’t deserve to be chased out of the room as “racists” for opposing affirmative action or advocating “school choice.”
The fact remains, however, that suppressive violence does not belong only to the left. Based not just on my experiences, but on a rash of recent incidents on and off campus, the right is demonstrating its propensity to make a big show of defending free speech, only to become Antifa’s mirror image.
Johnny Eric Williams, a sociology professor at Trinity College in Connecticut, reportedly posted the following message on Facebook last week, several days after a shooter opened fire on a congressional baseball practice in Alexandria, Virginia: “It is past time for the racially oppressed to do what people who believe themselves to be ‘white’ will not do, put end to the vectors of their destructive mythology of whiteness and their white supremacy system. #LetThemFuckingDie.” Campus Reform, Fox News, The Washington Times, and The Blaze all jumped on Williams’s post, suggesting to varying degrees that he was saying Representative Steve Scalise and the other shooting victims should have been left to die. Williams neither mentioned nor alluded to the shooting, but he had also shared a Medium post titled “Let Them Fucking Die,” written by someone else, which asked, “What does it mean, in general, when victims of bigotry save the lives of bigots?” Regardless, the anger whipped up by right-wing media led to death threats against Williams, including a bomb threat. Williams took his family into hiding, and Trinity shut down for a day.
Just a week before Williams’s Facebook post—for which he apologized—another professor faced death threats for a social-media message. Syracuse University professor Dana Cloud had tweeted to encourage more people to join a counter-protest to an anti-Muslim gathering in downtown Syracuse. Ann Coulter retweeted Cloud, noting where the professor worked, and Campus Reform called the tweet a “veiled call to violence,” which prompted right-wing activists to inundate Cloud with threats and harassment.
Meanwhile, there’s an emerging phenomenon of right-wing agitators threatening Democratic candidates for office—and their families—in Iowa and New York, compelling the candidates to withdraw. Quartz reported recently:
Hours after discussing his bid for mayor in Binghamton, New York on local radio in April, Michael Treiman said he was emailed threats directed at his wife and children. The same evening, someone driving by his home yelled “liberal scumbag,” and hit him with a soda container while he was holding one of his toddlers.
Treiman said last week he has moved away from his hometown of 32 years, but will run for Binghamton mayor again when he can afford to hire a private security team.
These attempts at intimidation from the right should be enough to dispel the myth that the gravest threats of suppressive violence are coming from the left. Even at Berkeley, where violent responses to potential speeches by Coulter and Milo Yiannopoulos were largely attributed to the left, we know that Antifa and others employing “black bloc” tactics were squaring off with equally violent groups on the right. We’ve seen the same thing happen at Portland rallies, as well as at the University of Washington in Seattle, where Elizabeth Hokoana allegedly shot a Yiannopoulos protester through the stomach and her husband, Marc, allegedly fired pepper spray; the day before, according to police, Marc messaged a friend on Facebook stating he “can’t wait for tomorrow… I’m going to the Milo event and if the snowflakes get out off hand [sic] I’m just going to wade through their ranks and start cracking skulls.”
As the New Republic’s Clio Chang reported, militant groups affiliated with white supremacy and the alt-right movement have been suiting up in homemade body armor, arming themselves with bats, sticks, and other makeshift weapons, and seeking out protests to start fights with leftists. The Southern Poverty Law Center describes one such group, the Fraternal Order of Alt Knights, as a “‘fight club’ ready for street violence,” and FOAK founder Kyle Chapman affirms: “We don’t fear the fight. We are the fight.” Meanwhile, after the Public Theater’s production of Shakespeare’s Julius Caesar angered conservatives who misread the play as an endorsement of Trump’s assassination, Shakespeare theaters not even associated with the Public Theater’s production have been receiving death threats.
There’s no question, then, that the right has no standing to chastise the left for suppressive violence. But countering such intimidation from the right, which claims to believe passionately in freedom of speech, will require more than just pointing out their hypocrisy. Because the far-right activists behind these threats are playing by a different set of rules from those of us who have a responsibility to educate.
I consider it my duty to model rational, deliberative, and respectful engagement in public life as best I can for my students. So even when Tucker Carlson engages in theatrics to antagonize me, I won’t demean or mischaracterize him in return. And if I won’t return his hyperbolic rhetoric, I certainly won’t sink to the level of viewers who lob threats. Right-wing extremists understand this, which is partly why they target professors and local political candidates—people who feel a duty to behave responsibly, but who are also unlikely to have public relations teams and media strategists at their disposal to handle the formidable interruptions that threats and harassment cause to their daily lives.
As we saw recently in Montana’s special congressional election, when then-candidate Greg Gianforte body-slammed Guardian reporter Ben Jacobs, violent intimidation is tolerated, if not rewarded, by the right, even among people holding or vying for positions of civic responsibility. By contrast, every single professor I’ve named in this essay—and many more—has had to contend with threats to their jobs as a consequence of being attacked because of, or through, right-wing media. Crude insults, intimidation campaigns, and other tactics that win plaudits on the right as signals of “dominance” are the same tactics that, if practiced by professors in return—even in our private lives off campus—would lead to calls for our jobs, or worse. The attempt by right-wing media to frame suppressive violence as a leftist tactic is a form of projection, meant only to obscure and deflect from the extent to which hatred and violence have come to define the right in the Trump era.
Lately, the novelist Arundhati Roy has been receiving death threats. Last month, the popular Bollywood actor Paresh Rawal took to Twitter to condemn her support of Kashmir’s bid for azadi (autonomy) from the Indian state. In his tweets, Rawal, who is also a Member of Parliament, said that Roy should be tied to the bonnet of an Army jeep as a human shield to protect the armed forces in Kashmir. The actor’s comments about Roy were reinforced by droves of nationalist supporters on social media, who echoed the threats. He seems to have made the comments after reading a bogus interview that misquoted Roy. But his Twitter rant was perhaps prescient, as Roy’s second novel, The Ministry of Utmost Happiness, will almost certainly rankle India’s nationalist establishment. Roy’s return to fiction after two decades cements her long-running disapproval of Indian policy, particularly in Kashmir, and it’s a stance that has come with consequences: The threat of being branded a traitor whenever she writes, thinks aloud, or makes a public appearance has become the defining feature of Roy’s career.
Roy has been a challenge to the Indian state ever since she won the Booker Prize for her debut novel, The God of Small Things (1997). In her subsequent career, writing anti-establishment non-fiction, she’s railed against India’s injustices: the inequitable distribution of wealth born of capitalist enterprise, for instance, and the enduring ills of the Hindu caste system. Roy’s idealism fits snugly with her unabated dedication to the “others” of Indian society—from tribal Maoists to Kashmiri rebels to Dalits (untouchables) to slum-dwellers. She writes from the cracks of Indian democracy, prying them open with unforgiving resolve in her attempt to reveal a deeply fractured nation.
Over the two decades since the publication of The God of Small Things, Roy has also earned her stripes as an activist, spending a night in jail on charges of sedition, camping out with the guerrilla Maoists in Chhattisgarh for a few days, and being treated like a pariah by political elites and the privileged class. Roy’s radicalism is at odds with the nationalists’ support of free market economics and Prime Minister Modi’s ambitions for rapid economic growth. Her relentless agitations frustrate their narrative of India’s emergence as a formidable force in the new world order and question the spoils of India’s rich and the newly empowered Hindu nationalists. Roy’s crusades are validated by the various social injustices that have emerged under the pretext of economic development in Narendra Modi’s India—she’s right to rail against these injustices. But Roy never quite passed muster as a non-fiction writer: She lacks the cogency to make best use of the journalistic form. Her true talent was always in fiction.
The Ministry of Utmost Happiness, the highly anticipated follow-up to The God of Small Things, was 20 years in the making. It’s an ambitious fictional treatment of the author’s political discontents. At its heart, the novel is an ode to the outcasts of India, highlighting the betrayals of a society that holds them in contempt, and the many ways that society has failed its minorities and marginalized communities. Almost every character in the book is a product of oppression, subject to the whims of corrupt political forces and the tyranny of Hindu nationalism.
The book starts out with the story of Anjum, a hijra (third gender person). As a Muslim hijra, Anjum practices her Islamic faith while also inhabiting a queerness defined by age-old Hindu tradition. Hijras date back at least as far as the Mughal Empire, when they were included in the queen’s entourage. The tension makes for an intriguing reflection on the evolution of identity politics in the Indian context, and Roy negotiates the dynamics between old world nostalgia and the modern-day notion of gender as a spectrum with a fine delicacy: The complications of Anjum’s position aren’t shied away from, and her individuality is never compromised.
When Anjum (née Aftab) is born, her mother is forced to confront the narrow definitions of sexual identity and the politics of language: “Was it possible to live outside language?” she wonders, defining the question that will follow Anjum throughout her life. In her telling of Anjum’s story, Roy gracefully narrates the peculiarities and tragedies of the hijra community: though they’ve been a part of Indian culture for ages, hijras were not officially recognized as a third gender by the Indian state until 2014. In the novel, Anjum emerges as a matriarch of inclusion, who serves as a juncture for the complicated, intersecting lives of the other marginalized characters. But her suffering is acute. Having survived the 2002 anti-Muslim riots in Gujarat, Anjum is traumatized. She’s debilitated by grief and rage, and begins to feel more and more alienated—from the Indian society that rejects her, and from those around her who cannot or will not help.
Anjum builds her home in the graveyard where her ancestors are buried—the better to be close to them, and to a phantom sense of community and belonging that she’s denied elsewhere. Gradually, her home doubles as a guest house for the lost, the downtrodden, and the discriminated against. It is “the ministry of utmost happiness” for misfits, their lives suspended and delegitimized at the whim of the state.
In her later life, Anjum finds a close comrade in Saddam Hussain. Saddam is a Dalit, or “untouchable,” who took a fake name to escape his position in the Hindu caste system—he chose Hussain’s name out of naïve admiration for the dictator, after seeing a viral video of his execution. He is consumed by his desire to avenge his father’s death: He plans to kill the cop who riled up a mob of Hindu vigilantes to beat his father to death when they found a dead cow in his truck. Since the Modi administration’s encouragement of beef bans in India, not only have Muslims been targeted, but also Dalits, who are traditionally tasked with skinning dead cows.
In the midst of this, Roy counters with a sympathetic take on the ritual sacrifice practiced in Islam. Anjum raises, with great affection and tenderness, a goat that is to be offered up on the holy festival of Eid. “Love, after all, is the ingredient that separates a sacrifice from ordinary everyday butchery,” she says. Anjum’s character arc is perhaps the book’s triumph, and one wishes Roy had not strayed from her to address a multitude of other social issues manifested in other characters and storylines. Any injustices of Indian society that can’t be embodied in Anjum’s story are represented by a motley crew of characters that populate her universe. In this regard, The Ministry of Utmost Happiness stands in stark contrast to The God of Small Things: The novel’s politics are aggressively present. This might be to the edification of her readers, but it is not always to the benefit of the story.
That’s not to say that Roy is an uncritical observer of the Indian left. For the most part, the novel is set in Delhi, and Roy explores the city’s new wave of dissent, centered around the Jantar Mantar—an ancient observatory where civil protests are authorized. Roy describes a circus of protestors, setting up shop for their various causes at the Jantar Mantar. She does not paint every protest with the same brush—those seeking delayed justice for the Bhopal gas tragedy, for instance, are portrayed sympathetically—but she points to the personality cult that some of the leading activists cultivate around themselves. Roy is wary of activists she considers deceptive—like those who are aspiring politicians, or journalists who engage in a kind of showmanship that Roy calls “performative radicalism.” Interestingly, the author’s activism has been viewed through a similar lens, with critics on both the left and right calling her insincere, self-righteous, or “performative” in her concern for India’s marginalized.
Another major storyline deals with the contentious question of Kashmir. This story is told through the character Tilo, who looks conspicuously like Roy herself: both are educated, both are political, and both have Syrian Christian mothers. Tilo finds herself involved in the Kashmiri separatist movement. Her world is full of young idealists negotiating their politics and beliefs with the realities of life in Kashmir, which is beset by a vile government bureaucracy and the rise of rabid radicalism. Roy paints a grim picture of a dystopian Kashmir ravaged by human rights abuses, the co-option of the resistance by violent organizations, and the brutality inflicted on Kashmiri civilians by the Indian army. There is no ministry of utmost happiness here: Anjum’s sanctuary house in the graveyard is a place of redemption and acceptance, but in Kashmir, we find only funerals.
Despite the 20-year wait, Roy’s fiction writing might leave fans wanting. There are strong echoes of her Booker-winning voice in The Ministry of Utmost Happiness: Her writing is still suffused with the delightful whimsy and levity of The God of Small Things. But in dealing with such weighty and dark themes, the book suffers from an inconsistency in tone—jarring transitions, from virtuosity and snark to heavy-handedness and moralizing, make Roy sound unconvincing as a narrator. When she writes about low-wage laborers in the city, she does so with a conspicuous tin ear: “Their emaciated parents, hauling cement and bricks around in the deep pits dug for new basements, would not look out of place in a construction site in ancient Egypt, heaving stones for a pharaoh’s pyramid.”
Roy’s critique of Indian society becomes distracting, as the surfeit of social issues diffuses the book’s thematic core. It’s a labored chronicling of the tragedies that plague India’s conscience: the Indira Gandhi emergency, the Sikh riots, Kashmir’s resistance movement, the Gujarat riots, the Maoist revolution, the oppression of Dalits, the election of Prime Minister Narendra Modi. At times, the ambition to take on all of these tragedies undermines the intimacy of her storytelling. Eventually, most of the main characters meet in Anjum’s guesthouse: The moment is supposed to tie together the threads of injustice that run through all of their lives, but the scene is clumsy, and the ultimate resolution is as ambiguous as the future of India’s oppressed.
The Republican health care effort is, like the Democratic effort before it, largely about making significant changes to Medicaid, the nation’s largest insurer. Some 20 million Americans got coverage under the Affordable Care Act, three-quarters of them thanks to the law’s expansion and promotion of Medicaid. Under Trumpcare—that is, the respective GOP plans in the House and Senate—20 million Americans or more would become uninsured largely due to massive cuts to Medicaid. This is a fact, just as it’s a fact that Republicans are cutting Medicaid to fund a tax cut for the rich.
The non-partisan Congressional Budget Office’s analysis last month of the House bill, the American Health Care Act, said, “The largest savings would come from reductions in outlays for Medicaid”—a reduction of $834 billion over 10 years, to be precise. The CBO’s analysis of the Senate bill, the Better Care Reconciliation Act, was released on Monday. “The largest savings in the federal budget,” it concluded, “would come from reductions in outlays for Medicaid—spending on the program would decline in 2026 by 26 percent in comparison with what CBO projects under current law... By 2026, among people under age 65, enrollment in Medicaid would fall by about 16 percent and an estimated 49 million people would be uninsured, compared with 28 million who would lack insurance that year under current law.”
Republicans are not proposing to repeal Obamacare, exactly, but to slowly starve it to death by rolling back the Medicaid expansion.
Faced with such truths, though, the Trump administration and the Republican Party are betting everything on a strategy of outright lying.
“These are not cuts to Medicaid, George,” White House advisor Kellyanne Conway told George Stephanopoulos on Sunday’s This Week. “You keep calling them as cuts. But we don’t see them as cuts. It’s slowing the rate of growth in the future and getting Medicaid back to where it was.” This proved too much even for a Republican colleague, Senator Susan Collins of Maine, who noted that the Senate bill’s failure to increase Medicaid funding to keep pace with inflation would mean people would lose benefits. “I respectfully disagree with her analysis,” Collins said on the same program. “Based on what I’ve seen, given the inflation rate that would be applied in the outer years to the Medicaid program, the Senate bill is going to have more impact on the Medicaid program than even the House bill.”
Collins is exactly right. But her candor is the exception, while Conway’s lie has become standard Republican fare. On Friday, White House press secretary Sean Spicer said that Trump is “committed to making sure that no one who currently is in the Medicaid program is affected in any way, which is reflected in the Senate bill and he’s pleased with that.” Pennsylvania Senator Pat Toomey claimed on Face the Nation that “no one loses coverage” under the GOP plan. On Fox News, Wyoming Senator John Barrasso said that “the amount of dollars going to Medicaid, from today on out, continues to go up year after year after year.” (As with Conway, these deceptive comments ignore inflation and population growth.) Republicans in the House of Representatives have taken a similar approach to their own bill.
The Republicans have settled on a policy of deception to weather the potential political storm over the Medicaid cuts. In theory, they should face dire consequences from constituents, given that Medicaid is a broadly popular program that provides crucial assistance to millions of people, including many who voted for President Donald Trump. But their punishment is by no means assured. In fact, recent political history suggests a more disheartening possibility: Republicans’ relentless health care lies might well work, at least long enough to do massive damage to America’s most vulnerable citizens.
In the Washington Post last week, Greg Sargent cited a Kaiser Foundation poll on public attitudes toward health care. The findings were disturbing. “[O]nly 38 percent of Americans know that the GOP plan makes ‘major reductions’ in Medicaid spending,” Sargent wrote. “Another 27 percent say it makes ‘minor’ reductions; 13 percent say it makes no reductions; and 20 percent say they don’t know. If this polling is right, that means at least 6 in 10 Americans are unaware of the central feature of the GOP plan to reconfigure one-sixth of the U.S. economy, one that will impact many millions of people over time.” The poll also found that 74 percent of Republicans think their family would be better off if Trumpcare passes.
These numbers suggest how the Republicans could pull off an amazing feat of passing a much-hated bill, with provisions that will hurt their own constituents, and survive politically. Politics in America have become so tribalized along partisan lines that it’s possible to lie on a grand scale about policy and get away with it; most Republican voters will believe whatever elected Republicans say. As Post reporter Dave Weigel observed, Karen Handel’s defeat of Jon Ossoff in the special congressional election in Georgia, where Republican ads relentlessly tied Ossoff to House Minority Leader Nancy Pelosi, “only reinforces the GOP’s sense of immunity from consequences,” one effect being that Republicans “think that whatever happens w this bill—if they face angry voters—they can paper over it w Pelosi ads.” And if Trumpcare does cause misery in the coming years, everything can be blamed on Obamacare, which might convince enough Republican voters to help keep Congress under GOP control.
A contributing factor is the mainstream media’s failure to adequately cover the Republican health care effort. Senate Majority Leader Mitch McConnell wants a vote on the Senate bill this week, but most American newspapers still aren’t giving the Medicaid cuts front-page attention.
While hardcore Republican voters might be fooled by the deceptive words of the Trump administration, the larger reality is that most voters, including persuadable ones, aren’t being told about potentially monumental changes on healthcare coming down the pike. Perhaps that will finally change thanks to Monday’s CBO score.
It’s easy to blame Trump for introducing a new level of deception into American politics. Yet in many crucial ways, his administration is just perfecting longstanding Republican tactics. When supply-side economics was first promoted by fringe economists like Arthur Laffer and George Gilder in the 1970s and 1980s, it was viewed with scorn by many old school Republicans who believed in the party’s devotion to balanced budgets. The promise of supply-side economics (tax cuts that would pay for themselves) seemed like an obvious con. It was “voodoo economic policy,” as then-presidential candidate George H. W. Bush said in 1980.
But once Ronald Reagan won the nomination and the presidency, Bush and the rest of the party got on board; supply-side economics became GOP orthodoxy. Never mind that this theory failed to live up to its promises time and again: Reagan’s tax cuts were accompanied by a boom that owed more to cuts in interest rates, and they led to massive deficits, while George W. Bush’s tax cuts were followed by anemic growth and high deficits culminating in a global economic crisis. “Reagan proved deficits don’t matter,” Vice President Dick Cheney famously said in 2002. “We won the midterms. This is our due.” The lesson Cheney drew from history is still GOP dogma to this day, with the current efforts at tax cuts justified on supply-side grounds. The whole sorry history shows that you can keep selling dishonest policy as long as it pleases the donor class and the ideological base of the party.
On occasion, reality does catch up with lies, as the latter Bush found out when the truth became clear about Iraq’s non-existent weapons of mass destruction. But even so, Bush won re-election in 2004, and his party only started suffering electoral losses for the Iraq war in the 2006 midterms, before paying dearly in the 2008 election. And despite these losses, and even after the victory of Trump, who often falsely claims to have opposed the Iraq war from the start, there is little evidence the Republican Party regrets the lies that led to the war. Indeed, with prominent Trump allies like Senator Tom Cotton calling for regime change in Iran, the only lasting lesson might be not to make embarrassing claims about WMDs but to find another casus belli.
The incentive structure for Republican lawmakers is perverse. If they lie about the consequences of Trumpcare, there is a possibility that they could get away with it by creating a new party orthodoxy, just like they did with supply-side economics. Or, at worst, they could suffer electoral defeat a few years down the road. But in the meantime, their donors would be rewarded for their loyalty with massive tax cuts. It’s no mystery why the Republicans are lying about Trumpcare: Lying pays.
Senate Majority Leader Mitch McConnell is just a few votes away from passing legislation that would reduce health insurance coverage by 22 million people.
Ironically, the impact would be most severe in his home state of Kentucky, where the Affordable Care Act covered half a million people—a huge share of that state’s population.
The person most responsible for that success story is Kentucky’s former governor, Steve Beshear. He joined us by telephone from Lexington to discuss GOP efforts to repeal Obamacare, and his prescriptions for a Democratic Party revival.
Not so long ago, Lake Chad was one of the largest bodies of water in Africa. The thick reeds and vital wetlands around its basin provided vast freshwater reserves, breeding grounds for fish, fertile soil for agriculture, and grasslands where farmers grazed their animals. In 1963, it spanned almost 10,000 square miles, an expanse roughly the size of Maryland. But as climate change has taken its toll, the lake has shrunk by 90 percent. Today, only 965 square miles remain. Wetlands have given way to sand dunes. Farmers have abandoned their fields. Those who still live by the lake struggle to survive, beset by chronic drought and the slow onset of ecological catastrophe.
This looming crisis has only worsened with the rise of Boko Haram, which has driven some 74,000 Nigerians into neighboring Cameroon. In response, Cameroon’s government has banned farmers from using some brands of fertilizer, an ingredient used in homemade explosives, and has ordered that staples like maize, millet, and sorghum growing along roadsides be no higher than three feet, to prevent Boko Haram from hiding in planted fields.
More refugees and fewer crops have proven to be a deadly combination in a region already ravaged by climate change. More than seven million people around Lake Chad are now suffering from severe hunger, including 500,000 children wracked by acute malnutrition. Some huddle in makeshift shelters they have erected; others forage in the woods. Those fortunate enough to be granted a spot in a refugee camp often receive no more than one meal a day. Food, even in the most minute quantities, has become scarcer than hope.
We often turn away from images of the starving and hungry, from the skeletal profiles and hollowed-out eyes that attest to the misery and suffering. But photographer Chris de Bode has found a way to focus our attention on this forgotten crisis. A single vegetable, a dried fish, a bowl of red maize—sometimes this is all a mother has to divide between her children each day. She may have to choose to feed her two youngest and send the teenagers to a village to beg for food. These images do not ask us to look into their eyes and see ourselves. They ask us to look at the emptiness of their bowls and reflect on the fullness of our own. We see their hunger through what little they have. We measure their suffering in the most universal unit of all: a single meal.
The health care legislation proposed by Senate Republicans is a moral abomination leadership is trying to pass through undemocratic means. Some of the villains responsible for this are obvious, starting with Donald Trump, Paul Ryan, and Mitch McConnell. But one conservative who played a major role in making the passage of an awful, widely despised bill frighteningly likely is flying under the radar: the chief justice of the Supreme Court.
John Roberts received a lot of praise from liberals for casting the swing vote in 2012 to mostly uphold the Affordable Care Act. But his decision in the case NFIB v. Sebelius to ham-handedly rewrite the ACA’s Medicaid expansion denied a lot of people health insurance, made the Republican demolition of Medicaid more likely, and ensured that the death and suffering caused by TrumpCare will be harder, perhaps impossible, to fix.
It’s not just Roberts, of course. Seven justices voted to hold that Obamacare’s Medicaid expansion was optional for states, rather than a requirement to receive existing Medicaid funds. The Court’s five Republican nominees were joined by Obama nominee Elena Kagan and Clinton nominee Stephen Breyer. I think it’s overwhelmingly likely that the votes of Kagan and Breyer were strategic gestures intended to solidify Roberts’s decision to vote to uphold the ACA (after, apparently, initially voting to strike it down entirely). But it doesn’t matter—whether Kagan and Breyer voted sincerely or not, this holding is one of the very worst handed down by the Roberts Court, and it changed the political dynamics surrounding health care in critical ways.
The direct consequences of the decision were bad enough. Nineteen states still haven’t taken the Medicaid expansion, with the result that millions of poor, disabled, and/or elderly people are being denied insurance. But the indirect effects have also been very bad. The utter decimation of Medicaid is at the core of TrumpCare (it is even worse in the Senate version than in the House one). This would have been a lot harder to pull off if those 19 holdouts—all of them Republican-controlled—had taken the expansion money.
A new study shows that when a state took the Medicaid expansion, its residents became more likely to support the ACA. It would be more difficult to wreck Medicaid if more Republican voters had benefitted from the expansion. As the policy analyst Sean McElwee acidly put it, “The Republican Party’s strategic choice to brutalize their own voters by denying them health care basically worked.”
It would be one thing if these awful consequences came from a decision with a compelling legal basis. But the Medicaid expansion holding in Sebelius was, at best, a massive stretch. Nothing in the text of the Constitution places explicit limits on the conditions the federal government can place on money it offers to the states.
When the Supreme Court held in 1987 that it was constitutional for the federal government to use the threatened withholding of federal highway funds to create a de facto national drinking age of 21, it argued that the Constitution places implicit limits on the ability of the federal government to coerce the states to achieve national objectives. But the implication of the decision was that, if there were a case in which conditions on federal spending power were unconstitutional, it would involve an indirect objective only obliquely related to the central purpose of the spending. Nothing in the Court’s decision suggested to Congress that a straightforward condition—such as, “if you want Medicaid money you have to accept the conditions of the Medicaid program”—would be unconstitutional. In fact, the conditions placed on states that take Medicaid had been changed many times before the ACA.
Far from having the compelling legal basis that would be needed to justify its sweeping implications, the Medicaid expansion holding in Sebelius is a ludicrously incoherent mess.
This matters a great deal going forward. Let’s say President Kirsten Gillibrand takes over in 2021 with Democratic majorities in both houses of Congress. One of their top priorities will be to fix the damage inflicted by the Medicaid cuts, which in the Senate’s version of TrumpCare will phase in fairly slowly. How can Congress be sure its attempts to restore funding won’t be found to be unconstitutionally “coercive” changes to Medicaid spending? The answer is, it can’t be sure. Sebelius didn’t create any kind of workable standard, providing no meaningful guidance to Congress about how far is too far. And even worse, the more people a restored Medicaid insures, through new conditions on the disbursement of Medicaid funds, the more likely it is to be struck down. It’s a truly perverse situation.
One answer is to bypass the spending power issues by simply making a greatly expanded Medicaid a purely federal program like Medicare. Only this creates its own problems. While Roberts voted to uphold the mandate in the ACA as a valid use of the taxing power, he found that it exceeded Congress’s powers under the Commerce and Necessary and Proper clauses. Especially if Trump and McConnell are able to confirm one or two more justices, there’s a real chance that expanded public insurance programs might be struck down based on whatever quarter-baked constitutional argument cooked up on a conservative legal blog sounds best to the Republican Supreme Court nominees.
The Supreme Court not only made the destruction of Medicaid more likely—it also handicapped future efforts to repair the damage and provide genuinely universal insurance. There are many institutional barriers facing health care reformers in the United States, and the Supreme Court is likely to continue to be one of them.
It is infuriatingly easy to imagine Senate Republicans passing legislation that throws millions of people off of their health plans to finance a tax cut for the wealthiest Americans this week, just days after introducing it—without holding a single hearing for debate and amendments, or waiting for a final impact analysis from the Congressional Budget Office.
That may not seem like it matches the available facts. After all, more than enough Republican senators have announced their opposition to the plan to sink it in an up-or-down vote. But the counterintuitive arithmetic of legislative horse-trading means that declaring the bill dead likely requires stiff opposition from twice the number of GOP senators needed to kill it.
Majority Leader Mitch McConnell can only afford to lose two Republican votes in the final tally. Three kills it. But unless three moderate Republicans and three conservative Republicans oppose it for mutually exclusive reasons—the moderates because it soaks the poor too much, the conservatives because it does not soak them enough—then it can’t be considered a dead letter. Over the course of the next several days, McConnell will try to pry reluctant members loose from their opposition with gimmicks and kickbacks. If three conservatives and two moderates oppose the bill, he can buy off the three conservatives, and vice versa.
Without a pincer movement like this, McConnell could race a substantially modified bill to final passage, long before CBO has weighed in on the effects of the changes.
Under the circumstances, and given McConnell’s determination to freeze liberals and the media out of the legislative process, there’s nothing concrete Democrats can do to change the mathematical problem. Most have concluded that their best options are to protest and delay and hope the backlash turns Republicans against the bill. But there is one other thing they can try.
On Friday, in a press conference with his state’s Republican governor, Brian Sandoval, embattled GOP Senator Dean Heller made it pretty clear that his vote is not gettable. Appeasing him would alienate more than one conservative on the opposite side. It is possible that Heller would like the bill to pass without his fingerprints on it, but he can’t let on that that is the case.
Separately, Republican senators Bill Cassidy and Susan Collins marked themselves as potential opponents of the McConnell bill, by teaming up earlier this year to draft an Obamacare alternative that wasn’t organized around the principle of transferring hundreds of billions of dollars in health spending dollars from the poor to the rich.
Rather than hoping support for the bill will unravel, conservative Democrats, with Senate Minority Leader Chuck Schumer’s blessing, can enlist these three (and perhaps other Republicans) in a discussion over that bill.
The Cassidy-Collins plan would preserve the revenue streams that finance the Affordable Care Act coverage expansion, and allow states to choose whether to keep Obamacare, switch by default to a system that automatically enrolls the uninsured into a catastrophic insurance plan and a subsidized health savings account, or opt out of the expansion altogether.
Earlier in the year, when Republicans seemed like they might repeal Obamacare much more rapidly than they've been able to, I argued that Democrats should engage in backchannel negotiations with Cassidy and Collins as a failsafe. Arguably January would've been too early to form a bipartisan health care working group. But with the Senate otherwise poised to pass a cruel and unvetted bill that would uninsure tens of millions of people, the time for such a working group couldn't be riper.
Democratic senators like Joe Manchin, Claire McCaskill, and Heidi Heitkamp, all of whom face tough elections next year in states that Donald Trump won, could represent their party in the discussion. The most immediate value of such a negotiation would be to stop partisan Obamacare repeal in its tracks. As anyone who followed the 2009 debate over the Affordable Care Act knows all too well, these kinds of senatorial gangs can waste a tremendous amount of time.
What would distinguish the 2009 Gang of Six from today's is that representatives of the minority party wouldn't be negotiating in bad faith. It's possible that the discussions would end in gridlock, and reopen the path to a more outright ACA repeal. But it's also possible that the negotiation would widen and end in an agreement on far more modest health care reforms that don't leave 20 million Americans exposed to health catastrophes and financial ruin.
If a bill like that passed the Senate, it’d be up to House Republicans to decide whether to take it up or leave Obamacare in place. That’s a much better outcome than the alternative we face today, which is that McConnell rushes his Trumpcare bill through the Senate at blinding speed, and the House sweeps the Affordable Care Act into the dustbin before the month is out.
When I asked Alexandra Petri, The Washington Post’s humor columnist, to imagine a world without Twitter, she saw endless possibility. “I have no idea what I would do with that spare time. Probably rethink my life, finish my book proposal, and, uh, reaffirm my personal connections with human beings face to face,” she said. “I guess I’d become a more productive, better person who had to go physically type in the URLs of websites. I’d regain the strength that’s currently sapped from my fingers.”
Which isn’t to say Petri is ready for anything that radical.
“What would I do in the restroom?” And if she lost her 78,000 followers, she said, “My self-esteem would evaporate, obviously.”
Petri was joking, but there’s a serious underlying truth: Many of us in journalism are addicted to Twitter. It’s a professional tool for following breaking news, sharing insights, finding story ideas, and promoting work, but it’s also more than that. Twitter is a social environment unto itself, one in which reporters often spend more time than in actual, real-world social environments (and even then, they’re still on Twitter). “If Twitter went away,” Daily Beast senior editor Erin Gloria Ryan told me, “at first it would feel like a phantom limb.”
Journalists’ awareness of their addiction, and inability to quit it, causes no shortage of angst for some.
“Like everyone, I have a love-hate relationship with it,” said Emmett Rensin, a contributor to the New Republic and contributing editor to the Los Angeles Review of Books. “I’m one of these people who declares every few months that I’m quitting Twitter and then shamefully slinks back. I think we’ve entered an era where we’re just never going to log off.”
Assuming, that is, Twitter continues to exist, which is no sure thing. As The New York Times reported in April, the company "has struggled to maintain growth even as other social media companies, including Facebook, Instagram and Snapchat, have seen their popularity and user numbers surge." In January, Wired declared 2017 the "Year That Twitter Learns to Thrive or Dies," noting the site's inability to curb user harassment and the "revolving door on the company's C-suite." The company isn't on the brink of collapse—its first-quarter results beat earnings and growth expectations—but its long-term survival is by no means assured.
This has major implications for journalism, perhaps more than for any other industry. Yes, many reporters and editors would miss the "social media" aspect of Twitter, but more important, they would have to manage without a major tool of the trade—like a handyman suddenly deprived of a wrench. Given Twitter's significance to the Fourth Estate—not to mention to the sitting president—would a private or public entity step in to save it, if necessary?
Petri, who told me "Twitter-less me sounds like a better person," isn't the only journalist who thinks the death of the platform would give a jumpstart to deferred life goals. Everyone, it seems, has writing projects they've been putting off, or exercise regimens they've been neglecting.
“I would write the goddamn TV scripts I’ve been saying I’m going to write for years,” Ryan said. “I’d get a lot more done on feature writing. I’d probably spend a lot more time outdoors.”
“I sometimes joke that Twitter is what I do instead of smoking,” Garance Franke-Ruta, Yahoo News’ senior politics editor, told me. “It occupies the same interstitial space. I think if Twitter went away we would all go into withdrawal and have three very uncomfortable weeks—followed by being healthier, happier people.”
Dave Pell, author of the popular NextDraft newsletter, isn’t so sure. “In some ways it would mirror the effects of coming off an addiction,” he said. “On one hand, I’d be happy to be getting off the crack. On the other hand, seriously, where the hell is my crack?”
It would also be professionally devastating to some journalists.
“It would be a disaster for our news gathering here at HuffPost,” said Ethan Klapper, the site’s global social media editor. He said nothing compares to Twitter as a “real-time platform that provides for the real-time exchange of ideas and information,” and without it, reporters would have a harder time tracking down eyewitnesses in breaking news situations. Ryan told me it would be more difficult to find interview subjects who fit specific profiles—a female military veteran who spends time in Tennessee, for example.
"I do think it would be a loss for wise journalists when it comes to listening to what the public is saying and thinking and asking, and when it comes to collaborating with the public," said Jeff Jarvis, a journalism professor at the City University of New York. He pointed to the Post's David Fahrenthold, who recently crowdsourced his investigation of President Donald Trump's charitable giving. Dave Weigel, a political reporter for the Post, said Twitter doesn't compare to "the classic internet of bloggers reacting to articles and hashing it out in a more humane way," but that it does allow him to debate other writers, and often they can "figure each other out."
Other journalists view Twitter less favorably, and are weaning themselves off of it. New York Times columnist Bret Stephens declared in a column last week that he’s forswearing the platform “for good.” “Twitter is terrific when tailored as a personalized wire service and can be a useful way to communicate with readers,” he wrote, but it “erases nuance, coarsens thought, [and] facilitates a form of self-righteous digital bullying and mob-like behavior that can wreck people’s lives.” He called Twitter “the political pornography of our time: revealing but distorting, exciting but dulling, debasing to its users, and, well, ejaculatory. It’s bad for the soul and, as Donald Trump proves daily, bad for the country.”
And yet, even Stephens isn’t quitting cold turkey. “I’ll keep my Twitter handle, and hopefully my followers,” he wrote. Which raises this crucial point: Many writers also depend on their Twitter following, which they’ve worked hard to grow, for spreading their stories widely. “Yeah, I’d be bummed,” Rensin said. “I’d be unsure at first how to distribute my work.” Weigel said that selling his new book on progressive rock would be harder without “a fan base that’s easy to activate” on Twitter. Having a large Twitter following is also an asset for journalists in the job market, because of the traffic it drives to their employers’ websites. Were the platform to evaporate, these journalists would lose that competitive edge.
Ryan said she would celebrate the death of some of the laziest forms of journalism spawned by the platform. “If Twitter didn’t exist,” she told me, “a lot of people who have relied on the bad-tweets beat would have to develop some skills. Those people would be the first eaten post-zombie apocalypse: the bad-tweets people. I’m sure they’re delicious. They’re very soft.”
So the curators at Twitchy would have to find a new line of work. But more important, what would President Donald Trump do without his early-morning megaphone?
For now, the collapse of Twitter is an unlikely hypothetical. None of the journalists I spoke with were particularly alarmed about the platform’s future. “It’s hard to imagine you can’t find a business model and survive,” Jarvis said. “My fear isn’t that it dies. The question is who buys it when the price gets low enough, and the likely candidates give one pause: Comcast and Verizon.” (He worries these companies “would try to be more controlling for the brand risk,” threatening the platform’s openness.)
“I’ve always thought that Twitter was sort of too big to fail, and that if they were in a precarious position financially someone would swoop in and buy it,” Klapper said. It would be a “Jeff Bezos type of a person” or “Facebook could buy it.” Weigel told me he’s less libertarian than when he was working at Reason magazine years ago, but he still puts faith in the free market to sort this all out.
Journalists to Weigel’s left entertain a federal solution instead. “Sometimes I like to think the only upside of the Trump administration,” Rensin said, “is that Trump loves Twitter so much that if it were failing he’d nationalize it.” Ryan told me, “I could imagine him trying to make overtures for bailing it out. If he’s going to bail out anything, it would be something that contributes to his vanity.” And if Trump didn’t rescue Twitter? Pell sees a silver lining there: “Every journalist knows that however much losing Twitter hurts them, it will be hurting Trump twice as much.”
On the other hand, Twitter could well be what brings Trump down. For his opponents, this is reason enough to root for Twitter’s continued existence, even if it becomes irrelevant to everyone but the president. “It’s tempting to wish for Twitter’s demise just to get rid of Donald Trump’s tweets, but the best thing he can do for democracy is to keep implicating himself in crimes,” Jarvis said. “I suppose one fantasy would be that it does turn into MySpace and the only person still talking there is Donald Trump, just talking to the wall.”
Ever since Donald Trump launched his presidential campaign in 2015, critics have worried about his authoritarian tendencies. He's done much to justify those fears. At his rallies, he whipped crowds into such a frenzy that protesters were beaten. As president, he's flouted elementary rules about nepotism and conflict of interest; undermined the independence of the judiciary by impugning judges overseeing cases involving him or his administration; obstructed justice by firing James Comey after he refused to pledge loyalty to him; and arbitrarily limited media access by, for instance, replacing daily White House press briefings with off-camera gaggles where recording is banned.
Trump’s presidency is already starting to resemble the one imagined by David Frum in his March article for The Atlantic, “How to Build an Autocracy,” where he described a global “democratic recession” in terms that intentionally evoked Trump:
Worldwide, the number of democratic states has diminished. Within many of the remaining democracies, the quality of governance has deteriorated.... What is spreading today is repressive kleptocracy, led by rulers motivated by greed rather than by the deranged idealism of Hitler or Stalin or Mao. Such rulers rely less on terror and more on rule-twisting, the manipulation of information, and the co-optation of elites.
By Frum's definition, Trump is indeed an aspiring autocrat. But he's only one wing of the anti-democratic trend in American politics—in the Republican Party, to be specific. Equally dangerous, and intimately connected with Trump's mode of authoritarianism, is the degradation of democratic norms in the Senate under Mitch McConnell. The majority leader and a dozen other senators—all of them male—wrote their Obamacare repeal bill in secret, concealed not only from the public and Democrats, but even from their Republican colleagues. Now, having finally released his version of the American Health Care Act on Sunday, McConnell is trying to force the bill through the Senate as quickly as possible. A vote is expected next week, meaning the public and press have only a few days to pore over the 142-page document—and even fewer days to digest the Congressional Budget Office's analysis, which is expected on Monday or Tuesday.
Compare this with the Democrats' legislative approach to the Affordable Care Act. Republicans have long lied that Democrats rammed Obamacare into law, when in truth, as The New York Times reported this week, there were "months of debate that included more than 20 hearings, at least 100 hours of committee meetings and markups, more than 130 amendments considered and at least 79 roll-call votes." On Pod Save America, a podcast hosted by four former aides to President Barack Obama, Dan Pfeiffer articulated how McConnell's approach imperils democracy:
This is not how the process is supposed to work. What you sort of realize in watching how Trump has conducted himself [and] in how Mitch McConnell has conducted himself is that [the] functioning democratic process as we know it is not embodied in law or in the Constitution. It depends on both parties ... believing in a set of democratic norms about the value of public input, about the value of transparency, about allowing the public to have a say in what’s happening. And if one of those parties ... decides to disavow all those norms, we get to a place where ... this is not American democracy. We basically have an election and live in a quasi-authoritarian state until the next election.
Trump's authoritarianism and McConnell's are two very different strains. The president is a narcissist who gathers power for personal gain and self-gratification. He cares little for the specifics of policy outcomes, and merely wants victories that he can boast about. For instance, on Friday morning he tweeted—
—and then appeared on Fox and Friends to make the patently false claim, "I've done in five months what other people haven't done in years." Constant displays of alpha-male dominance are also central to Trump's brand of authoritarianism. He taunted his GOP rivals during the Republican primary, and since then has mocked his Democratic foes—first Hillary Clinton, and now Nancy Pelosi and Chuck Schumer. This week, he tweeted that the House and Senate minority leaders, respectively, were doing the Republicans a favor by remaining in charge, the implication being that they're ineffective if not incompetent.
This is the authoritarianism of pure spectacle. McConnell, by contrast, is withdrawn and diffident in public. (He's jokingly likened to a turtle because of his appearance, but behaves like one, too.) While the majority leader doesn't crave attention, he does care deeply about a specific policy agenda: advancing the plutocratic preferences of the Republican Party's donor class. Infinitely more knowledgeable than Trump about how government functions, McConnell subverts norms with a laser-like focus on advancing that agenda. His authoritarianism, in other words, is one of procedure.
As different as they are, these two forms of authoritarianism depend on each other. It's unlikely that the Republican Party would have won a unified government last fall without Trump's theatrical flair. To judge not only by last year's election, but also this week's special congressional election in Georgia, Trump's tribalist politics have far more appeal to the Republican base than a forthright agenda of tax cuts for the rich and entitlement cuts to the poor. And when it comes to that agenda, all that really matters is that the policies be sold through the lens of negative partisanship. After all, Trump campaigned on a promise not to cut Medicaid, whereas McConnell's version of the AHCA would slash the program by hundreds of billions of dollars over the next decade. But Trump easily resolves such dissonance by reminding his supporters of the real enemy here: Obamacare.
If the Republican Party needs Trump, the president is equally dependent on the GOP. Given his manifest indifference to policy and the details of governance, he would be unable to pass anything without crafty leaders like McConnell and House Speaker Paul Ryan. But there is a more sinister dimension to Trump's alliance with these Republican leaders: Congress has the power to check the president, including impeachment and removal if necessary. Ryan and McConnell are the bulwarks protecting Trump from accountability in a wide range of areas. If they wanted to, they could push for laws requiring him to reveal his taxes, force him to place his assets in a blind trust, and use nepotism rules to limit the power of family members, among a range of other checks.
Republicans in the House and Senate have implicitly made a devil’s bargain with Trump, giving him a free hand to indulge his kleptocratic and autocratic tendencies in exchange for what they want: stalwart conservative judges for the Supreme Court, and a presidential signature on whatever bills Ryan and McConnell manage to pass. And make no mistake: He will sign any major legislation that crosses his desk.
In theory, Trump could reject a health care bill from Congress on the grounds that it would violate his campaign promises to protect Medicaid and “to have insurance for everybody.” He’s reportedly grumbled in private that the House bill, which closely matched the Senate’s, is “mean.” But that should not raise the hopes of Obamacare’s defenders. A presidential veto, or even the threat of one to lessen the cruelty of the AHCA, is out of the question. Reveling in his emerging kleptocracy, Trump is smart enough to know that he has to stay on Ryan and McConnell’s good side if he wants to hold on to the presidency.
Trump also wants his second big “win” as president, after the confirmation of Justice Neil Gorsuch. Consider how he bathed in imagined glory after the Senate confirmed his Supreme Court pick. “We are here to celebrate history,” he said. “I have always heard, the most important thing that a President does is appoint people, hopefully great people, like this appointment, to the United States Supreme Court. And I can say, this is a great honor.” He added, “And I got it done in the first 100 days. That’s even nice. You think that is easy?”
Consider, also, the Rose Garden celebration Trump threw for House Republicans after they passed their health care bill—the one he would later call “mean.” He described Ryan as a “genius,” and left little doubt that he would treat McConnell and Senate Republicans exactly the same if they accomplished the same feat: “It’s going to be an unbelievable victory when we get it through the Senate, and there’s so much spirit there.” Rest assured, Trump is already wetting his pants in anticipation of a second Rose Garden bash, and of an eventual Oval Office photo op where he can scratch his electrocardiographic signature on a bill condemning millions of Americans to poorer health.
Frum wrote in The Atlantic that the repressive future he imagines under our authoritarian president “is possible only if many people other than Donald Trump agree to permit it. It can all be stopped, if individual citizens and public officials make the right choices.” Alas, our public officials—the ones who control all of the levers in Washington, anyway—are making the wrong choices, by cynically enabling the president. Mere months since Frum’s essay, the war against American democracy is now being fought on two fronts: Trump’s autocratic assault on the presidency, and McConnell’s dismantling of democratic norms in the Senate. But the two fronts are part of the same ultimate struggle; these two men are the authoritarian twins, and it’s up to the final bulwark—individual citizens—to stop them.
Earlier this month, the Democracy Fund Voter Study Group’s Lee Drutman released a fascinating study that challenged many of the dominant assumptions about the 2016 election. Drutman’s data suggested that Hillary Clinton and Bernie Sanders voters were largely aligned on economic issues, and that the 2016 election was decided by issues of culture and identity, rather than economics. Drutman concluded that Donald Trump’s victory largely stemmed from his ability to drive populists to the polls by hammering home the importance of protecting America’s cultural identity and keeping immigrants out of the country.
The VSG study also explains the mystery behind the Obama-Trump voter—that odd figure who voted for Barack Obama in 2008 and 2012 and Donald Trump in 2016. According to Drutman's studies, these were voters who simultaneously held prejudicial values and relatively liberal economic views. In 2008 and 2012, they were forced to prioritize one set of views over the other and, without a candidate running an explicitly prejudicial campaign, largely came down on the side of the Democrats. In Donald Trump, however, these populists found a political unicorn: a candidate who shared both their right-leaning cultural values and their left-leaning economic ones.
I talked to Drutman about the lessons of the 2016 election, the realignment currently taking place in American politics, and how Trump might fare in 2020. This interview has been edited and condensed for clarity.
Your study found that, contrary to much of the coverage of both the election and the primaries, it was cultural issues, not economic ones, that ultimately decided the 2016 election.
If you compare the 2016 and the 2012 elections, 2016 was fundamentally about who should be an American and what it really means to be an American in ways that were off the table in 2012. Immigration was front and center. Whether Muslims should even be allowed into the United States was front and center. These were issues that were just not discussed in 2012—and discussing these issues really activated attitudes surrounding race and identity. There were a lot of voters who supported Obama who held these beliefs, but the campaign wasn’t about these issues in either 2008 or 2012.
By talking so much about the threat of immigration, and the threat of Muslim “infiltration” in America, Trump made these the most salient issues for many voters. In 2016 they decided that Trump was the candidate who understood how they felt about those issues, whereas Clinton was going to be an “open borders, letting all the Muslims in” candidate. That had an impact.
Did the Democrats’ shift on cultural issues help perpetuate this trend?
If you compare where Democrats were in 2008 to 2016, I think you would see that Democrats moved to the left on immigration. In a lot of places that had virtually no immigration for a very long time, people started to see more people speaking Spanish, or more people of Muslim heritage. If this is new to you—even if the Spanish-speaking population of your town goes from 1 percent to 3 percent and the absolute change is not high—the rate of change can feel very fast.
If you look at states that had the most rapid increases in immigration, it’s places like Arkansas. There still aren’t a lot of immigrants in Arkansas, but it feels like a huge shift. The Democratic Party also moved considerably left on these issues in a way that made it more challenging for some of these voters to support it. If Trump wanted to start a culture war, Clinton gave him the culture war he wanted to start.
Your study found that there wasn’t much difference between Clinton and Sanders voters on economic issues, though there were considerable differences in how they viewed the political establishment.
If you look at where Clinton voters placed themselves on economic issues and where Sanders voters placed themselves on economic issues, there's really no difference. There is a difference in how Clinton and Sanders talked about economic issues, of course. Sanders said he was going to take on the banks and take on the pharmaceutical companies. Hillary Clinton didn't use that language. But there's also a lot of evidence to suggest that a lot of Sanders's support was motivated by the fact that his voters didn't like Clinton and didn't trust the Washington establishment. They felt that Clinton was an insider and that the process was set up to favor her. There's something about the Clintons that a lot of Democratic voters don't trust.
Do you think that the kind of economic arguments that Sanders made about money in politics and the growing power of corporations could be a way for Democrats to win back some of the populist voters they lost to Donald Trump?
As long as Donald Trump is the Republican standard bearer and pursues the policies he does on race and immigration, these identity issues are going to remain quite salient. It’s not clear to me that a more populist economic message will necessarily cut through given the loudness of the race and identity politics.
That said, I think the good news for Democrats on that front is twofold. One is that a lot of the people who voted for both Mitt Romney in 2012 and Hillary Clinton in 2016 also feel that economic inequality is a tremendous problem. They’re a little more skeptical about the government’s ability to do anything about it, but they share a lot of those concerns. Two is that there’s also a turnout and mobilization issue: it’s not just a question of reaching out to swing constituencies, it’s also a question of making sure your own constituencies turn out. As you know, midterm elections tend to have much lower turnout than presidential elections.
If Democrats have bold economic policy ideas that excite their core voters, that’s a way of getting voters to turn out and feel like they’re turning out for something. A lot of them will turn out just because they want to beat Trump, but perhaps not everybody. The younger you go in the electorate the more dominant that liberal quadrant is, whereas the populist quadrant is mostly older.
Reading your study I couldn’t help but come to the conclusion that both parties are in the middle of an existential crisis. Do you think we’re in the middle of a party realignment right now?
I think that’s right. I was writing last year that Trump heralded a realignment, in the sense that he has caught up to where the Republican electorate has been drifting for a while. What you’ve seen over several decades is a shift in which the Republican Party has become the party of the white working class, which used to predominantly vote Democrat. This shift started with Reagan Democrats and it has continued ever since. Meanwhile, the Democrats have increasingly become the party of the professional class and ethnic minorities.
Parties take a while to catch up to their voters, because party leaders tend to give up power reluctantly—they want to hold on to how things used to be because that keeps them in power. The struggle with Trump transitioning the Republican Party into a more populist, more ethno-nationalist party is that there is no intellectual infrastructure or policy infrastructure to implement the vision that Trump ran on. He certainly doesn't offer it himself. What you're starting to see now is how some folks on the right are wrestling with what it means to put forward a vision that fits in this nationalist, populist framework. As that vision expands to eventually take over the party, what you'll also see is an exodus from the party of more traditionalist Republicans.
Do you see continued friction between this older party infrastructure and Trump’s electorate?
The Republican Party is very much a party in transition now, as it’s caught between the remnants of an older policy vision that no longer fits and the electorate that it now represents. Presumably there’s going to be some backlash when voters who were excited about Trump in the primary—precisely because he filled a void within the Republican Party—see him put forward policies that are antithetical to the policies that he promised. Maybe he can get away with that because these voters have rejected the Democratic Party on cultural and identity grounds, but I suspect they will continue to want the Republican Party to be the party that Trump promised it would become.
I think you will start to see more candidates, like [the neo-Confederate] Corey Stewart in Virginia, coming forth. One possible reason why Stewart came close to beating Ed Gillespie in the Republican primary for governor was that a lot of people who would have been Gillespie supporters eight years ago no longer consider themselves Republicans, and they now vote for Democrats.
The fault lines that Trump exposed in 2016 aren't going anywhere. Even with Democrats confident that the unpopularity of the American Health Care Act will propel them to victories in 2018 and 2020, Trump seems to be set on hitting immigration and identity over and over again.
Hitting immigration and identity is what brought him to the White House. If that's the most salient issue for voters, they will stay supporters no matter what he does because he's picked the right enemies and he's signaled that he's on their side. If the focus is on whether he's still hanging on to his promises of delivering government entitlements, he loses. If the question is over American identity, he has a chance of retaining support.
If you think about it, the broad takeaway on American politics generally is that on economic issues, on welfare issues, the country is overwhelmingly liberal. On cultural issues, the American population is conservative, particularly in rural areas that tend to be overrepresented in our system of government.
Trump was able to pitch himself to voters as a marriage of social conservatism and economic liberalism, arguing that he would maintain (and possibly even expand) the welfare state. He’s clearly going back on many of the promises he made on the campaign trail and he won’t be able to run as a unicorn in 2020. Could that cost him his presidency?
Well, it may. That may be the case, but it may also be the case that as long as Trump maintains the right enemies, then he seems like he’s still the lesser of two evils to many voters. Republicans may succeed by making Democrats look like the party of globalist multiculturalism undermining American Christian greatness. That could still be enough to win.
Gore is a new element in the work of Sofia Coppola. There’s not too much of it in her new film The Beguiled, but it’s memorable: the mangled flesh and bone of Colin Farrell’s leg after he’s fallen down the stairs and broken it; the blood smeared on Nicole Kidman’s bright white nightgown as she prepares to amputate it below the knee.
Kidman is Martha Farnsworth, headmistress of Farnsworth Seminary, a school for girls in the woods of Virginia, housed in her ancestral mansion, a gated zone deprived of men and slaves by the Civil War, which is raging somewhere beyond the forest, as we can tell from the drifting smoke and occasional cannon boom. Farrell is Corporal John McBurney, a wounded Union soldier whom Amy (Oona Laurence), one of Miss Martha's students, has found under a tree while picking mushrooms and dragged home like a stray dog with a busted leg.
Before the gore, the sponge bath. The Beguiled, adapted from Thomas Cullinan's 1966 novel, is a dreamily stylized film. No scene is more dreamy than the one in which the corporal, confined to a daybed in the school's music room, has his torso scrubbed by Miss Martha. The water pools in the crevices of his chest and belly and the camera closes in. What damage could this body do when restored to health? The scene is so crucial to the film that it required the omission of a character. In Don Siegel's 1971 adaptation of the novel, the cleansing duties were handled by the slave Hallie (Mae Mercer), the one woman at the Farnsworth Seminary entirely immune to the surly magnetism of Clint Eastwood's McBurney. Hallie's absence also signals that Coppola's film isn't a work of realism, or particularly faithful to or interested in American history.
Siegel’s film belonged to Eastwood; Coppola’s belongs to Kidman. Where Eastwood was monstrous, Farrell is mopey. The predator has been softened into a mere cad, but that’s part of the plan. Siegel was making a thriller. Coppola’s film is a Southern gothic costume drama. She’s elided the saw-and-all amputation scene that was Siegel’s cringe-inducing centerpiece, as hard to watch as the gruesome bits in Lars von Trier’s Antichrist. Instead Coppola cuts from Kidman’s bloody nightgown to the women and girls of the Farnsworth Seminary burying the detached foot and shin in the yard. This is one of the few moments in the film where the tonal score, based on Monteverdi’s Magnificat, creeps in.
The lack of a slyly calculated pop soundtrack in The Beguiled might be the biggest departure Coppola makes from her earlier work. I think of her films in pairs with the last three films revising (and improving on) the first three. The Virgin Suicides (1999) was a melancholy confection but a conventional piece of work, heavily reliant on the first-person-plural voiceover narration (executed by Giovanni Ribisi) that distinguished Jeffrey Eugenides’s novel and with several sequences that could be confused for fan videos for the greatest hits of Heart. Working with another story of frustrated teens, Coppola delivered an altogether more satisfying romp in The Bling Ring (2013), which bestowed Dostoyevskian depth on a gang of creepy-crawling kleptomaniac L.A. kids dosed on Adderall in the morning and homeschooled on lessons from The Secret by day.
Lost in Translation was a revelation in style when it appeared in 2003, but it hasn’t aged well. Scarlett Johansson, then still a teenager, has grown up into more interesting, less blank roles, and the post-Kingpin, non-slapstick-dependent, late-middle-aged Bill Murray persona has become a little too familiar over the years. The jokes about cultural difference, trite and offensive to some Japanese audiences, are also boring. The centerpiece scene of Murray’s Bob and Johansson’s Charlotte shot from above, lying chastely on a bed at the Park Hyatt Tokyo, confiding in each other about their so-so marriages, was always pretty banal: “It gets a whole lot more complicated when you have kids,” Bob says. “It’s the most terrifying day of your life when the first one is born.” Charlotte replies, “Nobody ever tells you that.” Clearly Charlotte needs to get out more.
Somewhere (2010), another study of languor in a luxury hotel and Coppola’s only other original screenplay, is also her finest film. It fixes a lot that was off about Lost in Translation. Another actor on the wrong side of celebrity, Stephen Dorff’s Johnny Marco is captive to L.A.’s Chateau Marmont and his own disappointments. Dorff is even more deadpan than Murray, and his character more thoroughly depressed, given to passing out on top of the women who throw themselves at him while they’re getting it on. Coppola’s intimate vision of L.A. is stranger than her smooth but clichéd view of Tokyo. In place of the tentative, awkward romance between Charlotte and Bob, there’s the achingly sweet father-daughter bond between Johnny and Cleo, played by the eleven-year-old Elle Fanning. When the pair jet to Milan for a publicity junket, the foreigner jokes land because the Italian-American writer-director is on home-away-from-home turf. Best of all, a lot of the music—like the Foo Fighters’ “My Hero,” blasted while twin pole dancers (Kristina and Karissa Shannon) perform for Johnny in his room—is corny and deployed with an acid wit.
The film that The Beguiled revises is Coppola’s previous costume drama, Marie Antoinette (2006). There was too much music in the earlier film, which posited what the last years of the Bourbons would’ve been like if they had iPods and a taste for British Punk and New Wave. In the title role, Kirsten Dunst was irresistible as usual, which made the biopic motions the film went through, adapted from Antonia Fraser’s biography, all the more perfunctory. Marie Antoinette is delicious but disposable, like the many cakes the camera lingers on. When the rabble show up with pitchforks and torches, the queen comes to the balcony and quiets them. If only. It’s hard not to suspect the daughter of the director of The Godfather felt a little defensive about dynasties.
No longer. The Beguiled is not shy about violence. Nor is it shy about sex, and it’s the first time that Coppola has framed Dunst—who plays Edwina, Martha’s repressed deputy schoolteacher, and the one at the seminary who really falls for McBurney—as something other than a princess in a gilded cage, a dream version of femininity. Here Edwina is both the accidental avenger and the vulnerable, real human inside the dream. Reinventing Kirsten Dunst is Coppola’s most intriguing revision.
Nobody Speak: Trials of the Free Press, a documentary about the death of Gawker premiering Friday on Netflix, opens with A.J. Daulerio, the former editor of the site, reading an email from his bank: “We placed a hold on your account.” Daulerio can’t help but laugh as he reads out the unfathomable sum: “$230 million is the amount of the hold.”
The big legal fight in 2016 between Gawker and Peter Thiel (featuring Hulk Hogan!) was seemingly preordained for movie treatment. It featured an irresistible cast of characters: a vampiric Silicon Valley billionaire, an aging pro wrestler, and a group of dirtbag New York reporters. But Nobody Speak, directed by Brian Knappenberger, isn’t really about Gawker or Thiel or Hulk Hogan’s penis. It’s about inequality.
It was under Daulerio’s byline that Gawker published the Hulk Hogan sex tape that ultimately led to Hogan, a.k.a. Terry Bollea, being awarded $140 million in damages by a Florida jury. Gawker Media, founder Nick Denton, and Daulerio all ended up filing for bankruptcy. Only after the trial ended did Forbes report that Thiel had been secretly funding Bollea’s case. Thiel claims that he did so because Gawker “has been a singularly terrible bully,” most notably by outing him in a blog post in 2007. Denton, for his part, is convinced that Thiel was mad that Gawker Media, through its Valleywag blog, was a thorn in the side of Thiel and his powerful friends in Silicon Valley.
But as Nobody Speak points out, these are mere details. The story of Gawker’s murder boils down to the fact that a very, very rich man was able to destroy a publication he disliked with impunity. As Floyd Abrams, a lawyer specializing in the First Amendment, says in the documentary, what Thiel has done is to “potentially imperil entities who upset large, rich, powerful people and institutions. And it’s not limited to individuals. This can be corporations.” By opening with the comically enormous hold on Daulerio’s bank account, the film viscerally illustrates the division between billionaires like Thiel and the rest of us.
Knappenberger underscores this idea by dedicating the final third of his documentary not to Gawker, but to another media company: the Las Vegas Review-Journal. In January 2016, the paper was mysteriously bought by an anonymous entity—even Review-Journal staffers were kept in the dark. In audio included in the film, Michael Schroeder, the man who helped facilitate the deal, is pressed by the staff to reveal the new owner, and fumblingly responds, “We really don’t think … they want you to focus on your job.”
So they did. It was Review-Journal reporters themselves who uncovered their new owner: the billionaire casino magnate and Republican heavyweight donor Sheldon Adelson. After the purchase, it was reported that Adelson barred reporters from writing stories about him and that stories about Adelson’s business deals were either killed or heavily edited.
By linking these two seemingly disparate cases, Knappenberger argues that the media story of our time isn’t about the destruction of a single news organization (Gawker) or the takeover of another (Review-Journal). He suggests that our new Gilded Age has seen the return not only of monopolization and astronomic inequality, but also outsized oligarchical influence over the media, whether it’s an envelope-pushing website, a storied newspaper, or something in between.
Amazon founder Jeff Bezos owns The Washington Post—and while he has promised not to interfere with the paper’s coverage, that promise is, of course, dependent on the whims of Jeff Bezos. The Wall Street Journal is in the grip of Rupert Murdoch. Jared Kushner, son of a wealthy New Jersey developer and son-in-law to a wealthy New York developer-turned-president of the United States, owns the New York Observer. Fancy sportswear mogul Peter Barbey bought the Village Voice in 2015 and is now reportedly in the process of destroying its union. A billionaire Republican donor, Frank VanderSloot, nearly bankrupted Mother Jones. The very idea of media as a public service has long been in decline, with television news networks bowing before the demands of advertisers and Donald Trump threatening to eliminate funding for public broadcasting.
It’s hard to escape the sense that empowered members of the one percent are spending their buckets of money to remake the media in their preferred image. As NPR’s David Folkenflik states in the documentary, “Peter Thiel’s decision to get involved was of a different order. What he did in financing the suits against Gawker was a kind of adversarial stance and attack that was sheathed from public view.” And all of this comes amidst a historic rise in distrust of and hostility toward the press on the part of the American public itself. Trump rode his way to the presidency in part by taking the “dishonest media” to task.
It should be no surprise, then, that income inequality is strongly correlated with freedom of the press. The World Press Freedom Index ranks Norway, Sweden, Finland, and Denmark as the countries with the freest press, while the United States sits at an embarrassing 43rd. Those countries also have much lower Gini coefficients (a measure of inequality) than the United States.
In the end, Nobody Speak devolves into a kitschy paean to journalistic heroism. The swelling music indicates that these are the Davids who will bring down the Goliaths that stride across the American media landscape. “This is a moment of real definition for the press,” Folkenflik states. “Journalism has to be independent.” The camera pans slowly across the inside of different newsrooms, presumably the sites where this battle will take place.
Of course, journalists have an important, essential job when it comes to safeguarding democracy. But journalism is not isolated from the same power dynamics that govern the rest of the country. It’s fun to think of the little guys taking on the Goliaths of the world. But the real problem is that the Goliaths exist at all. Who is Peter Thiel, after all, but a man with billions of dollars to spend?
President Donald Trump’s position on repealing and replacing Obamacare has been all over the place. After the House narrowly passed the American Health Care Act in May, he and the chamber’s Republican leadership celebrated in the Rose Garden. Trump declared it an “unbelievable victory,” despite the fact that the ball he was spiking was still on the 50-yard line. Then six weeks later, Trump told the Republican senators who were crafting their own version of the bill that the AHCA, which would uninsure some 24 million people, was “mean.” He reportedly urged them to create a “more generous” alternative.
This whiplash-inducing messaging is characteristic of Trump’s entire presidency, but it’s particularly relevant to his approach to health care. There’s evidence that Trump’s numerous promises on the campaign trail about health care helped propel him to victory. While many voters were skeptical of the GOP’s position on the issue, Trump’s appeals were surprisingly liberal. He promised “insurance for everybody.” He said that there would be “no cuts to Medicaid,” that “no one will lose coverage,” and that “nobody will be worse off financially.” In a memorable exchange with Senator Ted Cruz in February 2016 during the Republican primary, he insisted that, unlike his heartless conservative rivals, he would not let people “die in the streets.”
The message was clear: Voters might think that the GOP’s approach to health care was cruel, but Donald Trump was not a normal Republican.
The bill that the Senate unveiled on Thursday—the Better Care Reconciliation Act—is cruel. If enacted it will result in people—particularly lower-income and middle-class people—paying substantially more for substantially less coverage. The bill’s $800 billion in Medicaid cuts would result in an unprecedented transfer of wealth from the poor to the rich. Experts that looked at the House’s bill projected that it would directly lead to thousands of people dying every year, and the Senate’s bill will likely do the same. If Trump signs this bill, people will die in the streets.
This is precisely the outcome Trump said he would not allow. On top of that, the AHCA is incredibly unpopular—more unpopular than Trump, which is no small accomplishment. And while Trump has spent much of his first six months in office fighting back allegations that his campaign colluded with Russian intelligence, all the available polling data suggests that health care is the most damaging issue facing his administration. Everything suggests that signing this bill would be political suicide.
Still, he might just do it anyway.
When Trump finally acknowledged the Senate’s bill late on Thursday, he was noncommittal. He wrote that he’s “very supportive” of the bill, but noted that it’s not a finished product.
Space for negotiation has already been baked into the Senate’s bill. It seems likely that both right-wing Republicans and moderate Republicans who are ostensibly on the fence will extract concessions that will allow them to vote for the bill. However, for liberal activists and Democrats looking to influence the debate, the potential holdouts suggest that there is a narrow path forward to blocking the bill in the Senate, in the form of organizing, calling, protesting, and generally raising hell.
Another path may be through Trump himself. While he has stated repeatedly that he wants to repeal and replace Obamacare, he has also gotten in the way of his own message by insisting that a reform bill does things these bills do not do, most notably lower costs, increase coverage, and not kill lots of people. If the bill does pass the Senate and some form of it ultimately ends up on the Resolute Desk, the calls for Trump to veto the bill will be deafening. He will also have to contend with the fact that signing the bill as is could very well cost him his job in 2020.
Congressional Republicans have apparently concluded that repealing Obamacare is worth the political costs. Senate Majority Leader Mitch McConnell can look at the four special elections in 2017, which were all won by Republicans, and argue that a combination of negative partisanship and an influx of super PAC money may be enough to withstand the liberal mobilization that will follow the repeal of Obamacare. They can also cynically continue to blame Obamacare for whatever horrible things are happening in the market, claiming it has nothing to do with their legislation.
This would seem too clever by half, but it’s better than the alternative, which is running on the merits of the AHCA. Just as Trump has continued to run against Hillary Clinton, he and other Republicans will continue to run against Obamacare. The suffering the AHCA will cause will in effect be its own political reward.
Then there’s the suffering that the bill will cause to Democrats in particular. If Trump understands one thing it’s the tribalism that defines American politics. The mere fact that Democrats are upset is appealing to Republican voters. In Georgia, Karen Handel ran a campaign largely based on a desire to punish and humiliate Democrats, and the fact that the AHCA does both things will undoubtedly be a huge part of Trump’s messaging on the issue.
In the GOP’s (inaccurate) version of events, Obamacare was Democrats taking benefits from Republicans and giving them to people who weren’t American (read: nonwhite). Trumpcare rights the scales. Trump ran and won in 2016 by making precisely this kind of pitch, and there’s no reason he won’t do it again in 2020.
It has been nine days since a gunman opened fire on Republican members of Congress during an early-morning baseball practice in Northern Virginia, nearly killing House Majority Whip Steve Scalise.
Scalise was only struck once, far from his most vital organs, but the bullet traversed his hip, shattering bones and unleashing concussive forces that caused severe internal bleeding and organ damage. When he was medevaced off the field, he was reportedly conscious and in good spirits. By the time he arrived at MedStar Washington Hospital Center, in the District of Columbia, he was in critical condition: unconscious, and on the brink of death.
On Wednesday, after three surgeries and a week of intensive hospital care, doctors upgraded Scalise’s condition to fair, and said he is “beginning an extended period of healing and rehabilitation.” The additional good news—such that any of this news can be described as “good”—is that Scalise is medically insured.
Before the implementation of the Affordable Care Act, members of Congress were insured through the Federal Employees Health Benefit Plan, but the ACA removed them from that system, and allowed them to spend their employer-provided health insurance subsidies in D.C.’s small-business exchange instead. Not every member of Congress took the government up on this offer. Some chose to pay full freight for insurance in their home-state marketplaces. Others joined their spouses’ employer-provided plans. But uninsurance is not a widespread problem for people who work on Capitol Hill, which means Scalise will likely be spared the second-most horrifying consequence of his injuries: the financial cost.
Through no fault of his own, Scalise has just incurred hundreds of thousands, if not millions, of dollars in medical expenses. And while he may ultimately be responsible for a tiny, tiny, tiny fraction of these costs, he and his Republican colleagues in Congress are, as he convalesces, attempting to expose millions of Americans to the kind of financial ruin he has so far avoided.
The elephant in the room since the shooting in Alexandria has been the tension between elected Republicans’ reflexive expectation that one of their colleagues receive outstanding care at essentially no monetary cost to him, and what they believe millions of other Americans should expect if they meet a similarly unlucky fate.
Because Republicans are the governing majority, they have no interest in letting Scalise’s ordeal become a symbol of anything related to health policy. I sent Scalise’s office two emails requesting redacted copies of his insurance statements when they become available—to make his total medical costs and his out-of-pocket costs a matter of public record—and received no response.
Democrats have been reluctant to politicize the shooting for different reasons: Scalise is a colleague, the dead shooter was a former volunteer for Bernie Sanders’s presidential campaign, and the media surely would have punished anyone who interrupted the Kumbaya moment on Capitol Hill.
As a consequence, one of the most vivid examples of the importance of a kind and sensible health insurance system has been cordoned off from the active, urgent debate in Congress over whether the health insurance system should be crueler and more irrational. The unveiling of the Senate GOP health care bill presents an opportunity to change that.
With some modest but important exceptions, the Senate and House versions of Trumpcare are designed to do the same things. “From what I understand, their bill tracks in many ways along the lines of the House bill,” House Speaker Paul Ryan told reporters Thursday. “I think that’s very good.”
This is Ryan’s somewhat bloodless way of admitting that the Senate health care bill would force millions of people—disproportionately old, poor, and sick people—off of their health plans to finance a huge, regressive tax cut. The House bill would uninsure 14 million in the first year alone. It is nearly a mathematical certainty that some of those people will end up getting shot, or hit by buses, or diagnosed with cancer, and incur enormous health care bills, just like Scalise. Unlike him, none of them are likely to have their own dedicated security details, but the more important differences are that their lives will be destroyed, and they will be likelier to die.
The cruel irony is that when the Alexandria shooting occurred, Republicans were far enough along in secret health care negotiations that, in their zeal, they might end up further victimizing one of their own. The House and Senate Trumpcare bills gut protections for people with pre-existing conditions in different ways: the former by allowing insurers to price gouge sick people; the latter by allowing insurers to exclude the treatments sick people need from covered benefit schedules, creating adverse selection. Both would destabilize insurance markets for people with pre-existing conditions in at least some states. The Senate bill does not exempt members of Congress, and House Republicans have gone on record with the promise that Trumpcare will apply to them, too.
We don’t know if Scalise’s recovery will take years, or if he will need chronic care when he gets through rehabilitation. Hopefully the answer to both questions is no. But it’s dreadfully easy to imagine that if a Republican health care bill becomes law, Scalise will ultimately be uninsurable under its terms, leaving him exposed to the long-term costs of his injuries, and to the costs of other ailments that might befall him between now and when he becomes eligible for Medicare.
It is painfully obvious that Republicans would like to pretend that the issues raised by the Alexandria shooting and by their health care repeal efforts don’t overlap at all. It is just as obvious that the health and financial security of people they don’t know, or who aren’t independently wealthy, isn’t of concern to them as public officials. But a recurring theme of conservative politics in America is the discovery of empathy when consequences of right-wing policies hit home. The best thing that could possibly come of Scalise’s shooting wouldn’t be some fleeting moment of political unity. It would be pulling Republicans back from the brink of trading American lives for tax cuts.
On September 18, 1858, Abraham Lincoln avowed that “I am not, nor ever have been, in favor of bringing about in any way the social and political equality of the white and black races.” Lincoln wanted nothing to do with the abolitionists. At that moment, he was running for a U.S. Senate seat in Illinois against the proslavery Democrat Stephen Douglas, but he ran on an almost identical platform in his successful bid for president two years later. Douglas tried to smear Lincoln as an abolitionist on account of his well-documented opposition to slavery. Lincoln, a Republican, made a modest antislavery proposal central to his platform: He would respect each state’s constitutional right to maintain slavery where it already existed, but he would oppose slavery’s expansion into federally controlled western territories.
Though Lincoln is remembered as the Great Emancipator, it is easy to forget that he won the presidency on a platform that not only opposed immediate emancipation, but also endorsed white supremacy. In deference to slaveholders, he pledged to uphold the Fugitive Slave Law of 1850, which forced the federal government to help slaveholders retrieve escaped slaves. He even promised anxious white northerners that he would oppose giving free blacks equal rights: “I am not nor ever have been,” he said in his 1858 speech, “in favor of making voters or jurors of negroes, nor qualifying them for office, nor to intermarry with white people.” Recovering Lincoln’s racial politics, as well as his “excruciatingly slow” embrace of immediate emancipation—the abolitionist agenda he spent a lifetime avoiding—is the central aim of Fred Kaplan’s Lincoln and the Abolitionists: John Quincy Adams, Slavery, and the Civil War. Kaplan, an accomplished biographer of Lincoln and John Quincy Adams, covers well-worn territory. But he argues that it is particularly relevant now because of the deep racial prejudices that divide us still.
“We do ourselves a disservice when we self-servingly massage the record,” Kaplan writes. Lincoln did not solve the nation’s race problem: “He left us with it.” That Lincoln’s most remarkable achievement, the emancipation of four million enslaved Americans, was made possible only by circumstances beyond his control, and entailed an accommodation of the nation’s racism, is not meant as an attack on Lincoln. Rather, Kaplan argues, if we “see Lincoln plain” we are able to see our present more clearly too.
Kaplan has a vital argument to make but has chosen an odd way to make it. Rather than pit Lincoln against the antislavery activists who pressured him to slowly change his views, he makes John Quincy Adams into Lincoln’s chief abolitionist foil. To be an abolitionist meant, at base, to demand the immediate end of slavery, regardless of its constitutional protections, and to fight for free black equality; by contrast, many northerners simply held antislavery views—meaning that they were morally opposed to slavery, but were willing to tolerate the South’s right to maintain slavery within its borders, and would try only to limit its expansion westward. They were also ambivalent about, if not downright opposed to, free black equality.
Kaplan calls antislavery politicians like Lincoln “antislavery moralists,” and sees their public pronouncements against slavery as tantamount to “making oneself feel good without accepting the moral obligation to act.” In contrast, he uses the term “antislavery activist” to describe abolitionists. The problem is less Kaplan’s depiction of Lincoln than his view of Adams. Both were antislavery politicians, but he considers Adams an “antislavery activist” as well—something that, by Kaplan’s own definition, does not hold.
In certain ways, Adams makes for an apt comparison. As Kaplan explains, decades before Lincoln became an antislavery president, Adams was making an antislavery agenda central to his political career. In the mid-1830s, Adams, a Massachusetts congressman, began to attract abolitionists’ attention for his strident attacks on the Gag Rule, the 1836 congressional rule that prevented abolitionist petitions from being read on the House floor. Five years later, he agreed to help abolitionists defend the fifty-three enslaved Africans who rebelled aboard the Amistad. Adams successfully argued their case before the Supreme Court, and was increasingly remembered as a staunch abolitionist ally. In November 1860, Wendell Phillips, a white abolitionist and fellow Harvard graduate, praised Adams’s memory. The “last ten years of John Quincy Adams were the frankest of his life,” Phillips said: “He poured out before the people the treason and indignation which formerly he had only written in his diary.”
Phillips set Adams in direct contrast with Lincoln, who was “not an abolitionist, hardly an antislavery man at all.” These kinds of comparisons carry much of the weight of Kaplan’s argument because, in fact, the political careers of Lincoln and Adams barely overlapped. Adams was nearly four decades older than Lincoln, his career winding down just as Lincoln’s was starting up. They served in the House together for only one term—from 1846 to 1848—Adams’s last, and Lincoln’s first. To Kaplan’s credit, he notes how much their voting records overlapped. “Both voted the same way in every instance in which slavery was an issue,” he writes. And on the core issues, they fundamentally agreed: The Constitution constrained the federal government’s ability to end slavery in states where it already existed, but Congress could abolish slavery in lands it controlled—the western territories and Washington, D.C. In addition, both supported abolitionists’ right to petition.
Kaplan, however, is eager to emphasize smaller differences: Lincoln, he tells us, argued that any bill abolishing slavery in the capital required the consent of its white residents. By contrast, “Adams made no conditions: the evil needed to be eradicated.” But Kaplan does not mention that it was Lincoln, not Adams, who wrote one of the earlier bills proposing to abolish slavery in the nation’s capital, in 1849. Adams never tried.
One of Kaplan’s larger conclusions about the two is that Adams was far more prescient: He realized long before Lincoln did that slavery was the core issue dividing the nation, and that only a war would end it. It is a fair point, but not a particularly illuminating one. Kaplan quotes Adams’s diary entry from 1820, shortly after the Missouri Compromise, which drew a line across the western territories, north of which slavery was prohibited and south of which it could expand. “If the Union must be dissolved, slavery is precisely the question upon which it ought to break,” Adams privately mused.
Given that the Missouri debates dominated Congress’s attention for nearly two years, however, Adams’s remarks are neither particularly surprising nor unique. Thomas Jefferson realized something similar, famously writing at the time: “This momentous question, like a fire bell in the night, awakened and filled me with terror. I considered it at once as the knell of the Union.” Many politicians understood that the Missouri Compromise was not the end of slavery’s intrusion into national politics, only the beginning. And while it is true that Lincoln’s thoughts at the time are unknown, it is also true that he was eleven years old.
But the central problem is this: Adams’s political positions on slavery and race were far more similar to Lincoln’s than they were different. Kaplan often ascribes Lincoln’s cautiousness to his political ambitions: Lincoln “no doubt detested slavery,” he argues, “but practical politics, especially elections, took precedence.” The question is why he does not apply the same standard to Adams. After all, when Adams’s political career lay ahead of him, he was just as cautious in his antislavery politics. In the last decade of his life, Adams was indeed an outspoken antislavery politician, but he was in his seventies, his best political days behind him.
Between 1817 and 1825, while serving as secretary of state to President James Monroe, a Virginian slaveholder, Adams consistently prioritized union over slavery, just like Lincoln. And like Lincoln, he did so because he wanted to be president. When Monroe asked for Adams’s advice on the Missouri issue, he said that while slavery ran counter to the spirit of the Declaration of Independence, “this is not the time, nor was this the proper occasion, for contesting it.” When Adams finally became president, in 1825, he refused to help Britain enforce the Atlantic slave trade ban; he even tried to arrange a convention to help southern slaveholders get back their escaped slaves from Canada.
Where full black equality is concerned, Kaplan provides little evidence to suggest that Adams, unlike Lincoln, could accept a “multiracial America.” Adams may have thought plans to send African Americans to live in colonies outside of the United States impractical, in contrast to Lincoln, who became an outspoken colonizationist in the 1850s. But that is hardly enough to justify depicting Adams as believing in African Americans’ “civic equality.” Kaplan puts a generous gloss on an essay Adams wrote on Shakespeare’s Othello, in 1835, in which he states: “The moral of the tragedy is that the marrying of black and white blood is a violation of the law of Nature.” Kaplan tries to thread the needle: Adams may have wanted to keep racial bloodlines separate, he argues, but he still believed that the races should live equally under the same laws. By that logic, Adams is best described as a segregationist, not, as Kaplan suggests, a believer in racial equality.
When Kaplan focuses only on Lincoln, his general interpretation is sound. Up until the war years, little in Lincoln’s career would suggest that he would be the one to emancipate the nation’s enslaved women and men. When he ran for president in 1860, his approach to slavery was “safe and conservative.” Until 1862, Lincoln argued only for slavery’s non-extension westward. If slavery were ever to end—something he promised never to force upon the South—he believed slave-owners should be compensated. Moreover, he embraced voluntary black colonization, something that was heresy to abolitionists.
Kaplan carefully explains Lincoln’s conservative antislavery agenda by placing him within his political context. Except for the small band of abolitionists, no one would vote for a candidate who advocated immediate emancipation. Lincoln understood that most northerners preferred maintaining the union over enforcing their “antislavery moralism.” Kaplan also makes a strong case that Lincoln, despite his shrewd political skills, was naïve to think that the Deep South, where slavery’s supporters were strongest, would accept anything less than the total renunciation of an antislavery agenda. When the South seceded shortly after Lincoln’s election, Kaplan rightly emphasizes, he did not fight the war to abolish slavery but to save the Union.
So how did Lincoln end up the Great Emancipator? Kaplan correctly argues that, for Lincoln, the Emancipation Proclamation, issued on January 1, 1863, was a “military necessity.” The Union forces were losing badly, and by attacking the heart of the Confederacy’s wartime economy—slavery—Lincoln could force them into submission. Yet Kaplan misses the radical nature of Lincoln’s act. His eagerness to highlight Lincoln’s failure to embrace racial equality forces him to downplay the Emancipation Proclamation’s true significance. The proclamation said nothing about black citizenship; Kaplan is right about that. But it also meant that nearly all abolitionists finally embraced Lincoln’s war effort: No longer a war to save the Union alone, it was now a war to end slavery.
Kaplan tries to argue that Lincoln’s Emancipation Proclamation should have gone further. Beyond enshrining black citizenship, Kaplan argues, Lincoln could also have freed all the nation’s slaves; instead, he chose to free only the slaves in the rebellious states, not those in the border states that stayed out of the war. Had Lincoln interpreted the Constitution’s “war powers” clause more broadly—the clause that allowed the president to enact martial law and impose emancipation during wartime—he could have argued that the border states were part of the larger “theater of war,” Kaplan writes, giving him the constitutional authority to end slavery everywhere.
But that greatly underestimates the importance of keeping the border states out of the war. If Lincoln had angered them, as emancipating their slaves surely would have, they might well have joined the Confederacy; Lincoln would have risked losing the war, and with it, the opportunity to enforce the Emancipation Proclamation. Instead, he chose to leave ten percent of the nation’s enslaved population—the share held in the border states—in bondage, while freeing the other ninety percent. It was a temporary bargain, but one that even abolitionists accepted. In fact, while abolitionists remained clear-eyed about Lincoln’s failures in regard to race, all of them realized that, when it came to slavery, Lincoln was now their man. Even Frederick Douglass, hardly blind to Lincoln’s limitations, called him in 1865 “emphatically, the black man’s President: the first to show any respect for their rights as men.”
Kaplan has found an important subject for a book, but he has misidentified the abolitionists. Had he focused more on genuine antislavery activists and less on politicians, he might have arrived at a different conclusion. As he rightly emphasizes, the visionaries were not politicians like Lincoln. But they were not politicians like Adams, either. The visionaries—the ones who saw a multiracial future—were activists like Frederick Douglass and Sojourner Truth, Frances Ellen Watkins and William Lloyd Garrison. They were runaway slaves. They were women, men, black, white, rich, poor. They understood that politics involved compromises. But they also understood what an accomplishment emancipation was. And they were under no illusions: a long fight for racial equality lay ahead.
To call Blood Drive a bad show would miss the point. It is bad, but it wants to be. Specifically, it wants to be so bad that it’s good: like a low-rent ’70s thriller you happen upon while flicking through the channels one afternoon and later remember as a kind of fever dream, or a horror movie filmed in a backyard or an abandoned mall, so delighted with its own schlockiness that it can’t really be scary. Blood Drive wants to be bad, good-bad, like that, and figuring out how it fell so wide of the mark—how it ended up just feeling tired and bland—means finding out what good dystopian entertainment does for us, where it comes from, and why it is so hard to produce.
Blood Drive, a new horror series on SyFy, is set in a distant future in which a series of earthquakes caused by fracking has set off an environmental catastrophe. The police have been privatized, water is being rationed, the people are desperately poor, and gas is obscenely expensive—$2,000 per gallon. One way to make money, if you can stand it, is to sign up for the Blood Drive, an ambiguously legal car race through vast expanses of the empty West, where the cars run on blood and the losers of each leg can be killed off for fuel. The premise touches on multiple issues that plague our own world: police brutality, economic inequality, climate change. But the show mostly ignores them. For the most part, the producers are interested in these injustices only insofar as they give them excuses to indulge in campy visuals of cars, half-dressed women, and gory death scenes crammed with butcher shop viscera and syrupy blood.
Ours is a potent time for dystopia. Hulu’s The Handmaid’s Tale has started to feel as ripped from the headlines as Law & Order ever was, and, two weeks after the presidential election, Amazon pulled ads for its series The Man in the High Castle from the New York City subway after Mayor Bill de Blasio called them “irresponsible and offensive.” The series imagines an alternate reality in which Nazi Germany colonized the United States; in his statement, de Blasio cited his concern for “World War II and Holocaust survivors,” but the ads, coming so soon after Trump’s victory—and the strains of American racism, white supremacy, and actual Nazism that victory empowered—seemed to speak not just to the past, but to the future.
Can we no longer idly imagine dystopia? Do we now have a moral responsibility to see which machinations of the present are enacting our worst fears? And, maybe more to the point—at least for the people who make a living by conjuring our nightmare visions on TV—can dystopia still be fun?
Blood Drive is not fun, or at least it wasn’t fun for me. I wanted it to be. I like movies about murderous road races through futuristic wastelands (the Mad Max series, Death Race 2000), and movies about people forced to compete in life-or-death games motivated by corporate greed (The Running Man, Rollerball), and movies about outcasts struggling to survive in cities that the rest of the world has done its best to forget (Blade Runner, Escape from New York). Yet Blood Drive is not a response to the dystopian fears of 2017, but a pastiche of old dystopias. It’s a sizzle reel of old anxieties, and, for all its joyful goriness—the nudity! The knocked-out teeth! The blood!—it’s more froth than heft, a confection that can harm no one.
One of Blood Drive’s most thought-provoking missteps comes from its determination to recreate the hallmarks of dystopian worlds, but without the context that summoned them forth to begin with. The show’s main plot kicks off when Arthur (Alan Ritchson), a by-the-book cop, stumbles across the celebrations marking the start of the annual Blood Drive, a race undertaken by drivers whose cars run on human flesh. He’s discovered, and forced to become a racing partner to Grace (Christina Ochoa), an outlaw driver whose sexuality the writers have chosen to express by making sure she’s licking a lollipop at all times. (In a futuristic wasteland where gas costs $2,000 a gallon, how much is hard candy?) The Blood Drive itself is presided over by a master of ceremonies named Julian Slink (Colin Cunningham), who’s a little like Mad Max’s Toecutter, a little like The Running Man’s Damon Killian, and a lot like the emcee in Cabaret, which depicted its own form of dystopia—the rise of the Third Reich. “To the queer and the strange, in the crowd and on the stage,” Slink announces at Blood Drive’s kickoff, “to the violent, the malevolent, and those seeking a grave: welcome home.”
Blood Drive feels most alive—and least like a collage of older stories—in scenes of the giddy crowds that watch the race. Dwelling on the spectator’s perverse joy, Blood Drive hints at that unique longing that animates the most haunting dystopian narratives: The longing we feel to see society disintegrate around us, and to see what would happen to us in the aftermath. Dystopian narratives, after all, rarely depict the destruction of civilization itself. Instead, the story begins after the fact, and often lets us imagine what fun we will have playing in the wreckage of the world. John Carpenter’s Escape from New York gives us a future in which Manhattan Island has become an enormous and entirely unsupervised penal colony, but also takes visible pleasure in imagining an abandoned city, creating villains who patrol the ruined streets in chandelier-bedecked art cars, put on all-inmate can-can shows, and, in the case of Harry Dean Stanton’s character, move into the New York Public Library. Blade Runner tells a whole shadow narrative through billboards and neon: We know Los Angeles has somehow become a place where an entire skyscraper is used to broadcast a Japanese-language commercial featuring a Geisha swallowing a birth control pill, but we can only imagine how.
Dystopia allows us to see not just beautiful ruins, but the strange cross-cultural bonding that can occur when society as we know it no longer exists. Dystopia is queer time, in Jack Halberstam’s formulation of the term: an experience of life in which there is no established order of events, particularly with regard to relationships, and therefore perhaps more room for intimacy. A dystopia’s potential for unexpected trauma can be matched by its potential for fostering intimacies that would, in another society—in any society—be impossible. Dystopian stories afford us this comfort, and perhaps it is for this reason that we continually seek them out, even when they are capable of cutting us so close to the bone.
Like its plot, the visuals of Blood Drive are a collage. Grace’s world of desert blacktops, muscle cars, and blistering sun is an Easy Rider-style post-apocalypse, while Arthur’s L.A. is the drizzly, crime-riddled metropolis of Blade Runner, and the contradictory aesthetics both characters wander through mean that it’s feasible for there to be a global water shortage, as the series claims in some shots, and conspicuous rain in others. The show has no internal consistency; it looks the way it wants to when it wants to, based on the influences it’s working to conjure. Even the Blood Drive itself draws heavily on stereotypes of apocalyptic possibility. The festivities surrounding the race look a lot like Burning Man; there are amps and gas flames and the word MAYHEM scrawled large. The possibilities of dystopian narrative are distilled to a visual vocabulary, severed from any real meaning.
In the days following the presidential election, the dystopian narrative I reached for first was 2006’s Children of Men, a movie that has come to seem, in the last decade, more alarmingly prescient than ever. It imagines a world where women have lost the ability to bear children, and allows the viewer to imagine that the scenes of societal destruction they witness—of terrorist splinter groups and totalitarian governments, sudden violence and state-sanctioned torture—are mankind’s response to imminent extinction. “As the sound of the playgrounds faded, the despair set in,” one character, a former midwife, recalls. “Very odd, what happens in a world without children’s voices.”
But Children of Men also suggests that the dystopian scenes it depicts may be just as much the cause of mass infertility as its result. Children of Men shows us nothing new. Its images of torture echo news from places like Iraq and Serbia; these are things we have seen before, but never in Britain, and therein lies their power. Children of Men is about trauma coming home, and about a world in which the apparent end of the human race is not the cause of our manifest inhumanity to each other, but just punishment for it.
This is a question that perhaps every dystopian narrative hazards, at its core: Does a dystopian world make us treat each other cruelly, or does our cruelty create a dystopian world? Blood Drive doesn’t take on these questions: It’s happy to plunder the aesthetics of other dystopian dramas without taking on their curiosity. That refusal also renders it, for all its gory charm, extremely boring. Not all dystopias have to be grim, or even political, but the genre may require its creators to take at least their own questions seriously in order to create a world that a viewer can fully experience—in pleasure, hope, or fear.
Jon Ossoff’s disappointing loss in Tuesday’s special election in Georgia has triggered rebellious feelings within the Democratic rank-and-file, as some call for House Minority Leader Nancy Pelosi to step aside. “I think you’d have to be an idiot to think we could win the House with Pelosi at the top,” Filemon Vela, a congressman from Texas, told Politico. “Nancy Pelosi is not the only reason that Ossoff lost. But she certainly is one of the reasons.” Vela’s impassioned comments are all the more startling because he supported Pelosi in the leadership race in November, when Congressman Tim Ryan of Ohio ran as the populist alternative. While Pelosi won decisively, 134 to 63, Ryan did well enough to prove that congressional Democrats were, like the broader party, deeply divided.
If the race were held again today, Pelosi likely would face a much tougher fight—though as Politico reports, “There is no challenge to Pelosi’s leadership, and none is going to happen at this point, said numerous Democrats.” Pelosi, who is 77 and has led House Democrats since 2003, isn’t going anywhere for a while—and that’s what worries restless Democrats.
The case against Pelosi is by no means clear cut. Her detractors note that Republican attack ads in the Georgia race gave her prominence, apparent evidence that her unpopularity is a drag on the party. “The fact that Republicans spent millions of dollars on TV ads tying Democratic hopeful Jon Ossoff to Pelosi — and the brand of progressive policies she represents — shows that she will once again be an issue for Democratic challengers in the very districts that the party needs to win to make her speaker again,” Politico notes. Yet, as Dartmouth political scientist Brendan Nyhan has argued on Twitter, that evidence is less damning than it first appears.
The ads that featured Pelosi were aimed at energizing highly partisan Republicans, the type of people who would know and hate any Democratic leader. Pelosi is a villain in these ads almost by default, since more prominent party leaders—Barack Obama and Hillary Clinton—have stepped aside, and thus lack sufficient scariness as bogeymen. Moreover, getting rid of Pelosi just because Republicans hate her would be a singularly craven move for Democrats—and would probably be ineffective to boot: Her successor would simply become the GOP’s new top enemy.
The ideological critique of Pelosi is equally confused. To Republicans, she’s the archetypal “San Francisco Democrat,” committed to unrestrained liberalism and out of touch with heartland values. Yet to Pelosi’s left-wing critics, she’s utterly without principles and cares only about holding the reins. “The Democratic House leadership is dedicated to retaining power for themselves and nothing else,” argues Matt Stoller, a fellow at the New America Foundation. “Nancy Pelosi is utterly incoherent. She’s not a leader, she’s in charge of making sure no other leaders emerge.”
Both these critiques miss the central fact about Pelosi: She’s been an extraordinarily effective parliamentarian. The daughter of a former congressman and mayor of Baltimore, she was born into the machine politics of the Democratic Party and has the gift of a ward boss who knows how to trade favors and cut backroom deals in order to hold a coalition together. She is also perhaps the most talented fundraiser in American politics, having brought in more than half a billion dollars to Democratic coffers since taking over the leadership in 2003. She inherited a minority party that had been in retreat for more than a decade, and brought it back to majority status in 2006.
Unlike the Republican speakers who preceded her (Dennis Hastert) and succeeded her (John Boehner, Paul Ryan), Pelosi held her caucus together in the minority (under President George W. Bush) and delivered major legislative victories in the majority (especially in President Barack Obama’s first two years). A 2009 profile in Time surveyed her accomplishments: “Pelosi holds the highest post ever attained by any woman in U.S. history, and stands second in line of succession to the presidency. She has consolidated more power than any other Speaker in modern history, scholars of the office believe. In the first year of the Obama presidency, she has used that power — and an 81-seat Democratic majority, the largest either party has enjoyed in the House in 14 years—to pass every item on his agenda: health care, energy, regulatory reform, education, pay equity.”
The determination and strong-arming Pelosi showed in pushing through this ambitious agenda might also point to why it’s time for her to go. Her iron grip on House Democratic leadership is preventing a new generation of leaders from rising, which the party desperately needs to attract new voters. Her use of the Democratic Congressional Campaign Committee is a case in point. As Tim Dickinson notes in Rolling Stone, “The committee has functioned as the political machine of Nancy Pelosi, leader of House Democrats since 2003, who is the DCCC’s prodigious chief fundraiser and has hand-picked its chairman. On Pelosi’s watch, the committee has caught flak from allies for being slow to adapt to the digital and demographic revolutions in politics, creating a disconnect with the emerging electorate.”
When Pelosi herself first rose to national prominence in the early 2000s, she was the upstart politician rebelling against a sclerotic leadership. As Karen Tumulty wrote in a 2009 profile for Time, when Pelosi argued for new strategies to win back the House she “could feel the dismissiveness from the House Democratic leadership, which hadn’t added a new face to its top echelon in nearly a decade.” The Democrats are once again in a situation where the party elite needs new blood. But if it’s time for fresh leadership, Democrats would do well to remember and celebrate Pelosi’s genuine achievements as one of the most substantive legislators in modern American politics. In fact, they should be so lucky as to find the next Nancy Pelosi.
How many buildings have been designed but never built? Countless cities, whole extra urban universes. Every time a sketch or a blueprint is scrunched up and thrown in the bin, it’s like a wrecking crew has razed it. But in the ghostworld of imaginary structures, every unbuilt building stands.
A new show at the Museum of Modern Art collects the drawings (and a few three-dimensional objects) from the archive of Frank Lloyd Wright—the architect with the best brand-name recognition in America—on the occasion of his 150th birthday. Being a celebrity, however, did not guarantee that Wright’s visions would be built. The show is full of gorgeous places that never came into existence. Strange houses, buildings designed around hexagonal units and arranged on the diagonal.
Wright designed 532 buildings that were made, and about the same number again that never were. His career spanned seven decades. His personal life was beset by chaos. He left his first wife, Kitty; then, in 1914, his partner Mamah Cheney was murdered alongside six other people by a domestic worker named Julian Carlton. His second wife, Miriam Noel, was a hopeless morphine addict. His third marriage, to Olgivanna, seems to have been all right. Wright famously said, “Not only do I fully intend to be the greatest architect who has yet lived, but fully intend to be the greatest architect who will ever live.” Walking around this show, a beautiful edifice built of the flotsam and jetsam of a long career, one realizes that even a man like that didn’t always get his way.
In the late 1920s and 1930s, Wright made three houses that defined his “organic style”—Graycliff, Taliesin West, and Fallingwater. Wright’s ideas about organic form are among his most influential legacies. He began using the word “organic” as early as 1908, although he never really articulated it into a slogan (as his mentor Louis Sullivan had done with his own mantra, “form follows function”).
Fallingwater, also known as the Kaufmann Residence, is Wright’s most famous work and probably the model of organic architecture that lingers closest to the front of the American imagination. The principle of organic architecture is simple, commanding its followers to seek sympathetic and congruous relations between the structure and the environment in which it is built.
The building is daringly cantilevered, with reinforced concrete balconies shooting off in multiple directions over the water flowing beneath it. The underside of the balconies is naturally a pretty humid spot, and Fallingwater has become very moldy at times. In fact, the whole building resembles a fungus, one of those tough ones that grow out horizontally from a tree like a stubborn little shelf. (Kaufmann himself nicknamed the building “Rising Mildew” because of the mold and the leakiness.)
This principle of integration with the landscape carries over into Wright’s visions for ideal cities, which are characterized, to quote a MoMA wall text, by an “urbanism of dispersal or decentralization made possible by new forms of transportation.” Although midcentury concrete of the Wright style has come to signify the plantless, dense urbanism of the city, drawings from The Plan for Greater Baghdad Project and for the Rosenwald School (both exhibited at length here) show a more sensitive, greener vision.
On a much smaller scale, this show also emphasizes Wright’s interest in ornamentation. This work is surprisingly colorful, bold, even Kandinsky-esque (Wright liked orange). One of the show’s most interesting galleries focuses on Wright’s work on the Imperial Hotel in Tokyo. He designed the building roughly in the shape of the hotel’s logo, an H with an I cutting through it. The Imperial Hotel pieces in this exhibition include a number of decorative terracotta blocks, as well as upholstery textiles and furniture.
The building suffered greatly from a series of earthquakes. After the Great Kantō earthquake on September 1, 1923, Wright received a telegram from Baron Kihachiro Okura testifying to its endurance. It read: “Hotel stands undamaged as a monument of your genius hundreds of homeless provided by perfectly maintained service congratulations.” However, this was not quite true. It had been damaged, and the hotel also gradually fell into disrepair. It closed in 1967 and was subsequently demolished and rebuilt.
Four hundred and fifty pieces make for a big exhibition, and this one takes a while to walk around. There are architectural drawings, of course, but also models, film clips, textiles, photographs, bits and bobs. The show is in twelve themed sections and MoMA says that it is “structured as an anthology rather than a comprehensive, monographic presentation.”
When I visited the show, by far the most crowded room contained nothing but a large, projected clip of Frank Lloyd Wright on What’s My Line? The clip reiterated the obvious, that Frank Lloyd Wright was and remains a very famous man indeed.
But if Fallingwater is a big celebrity place you should “see before you die,” like the Taj Mahal or the Parthenon, MoMA’s exhibition draws attention to the careful, quiet labor of the studio. In placing many medium-sized drawings at eye height through a number of rooms, without much fanfare, this exhibition creates a tribute to the unrealized.
To preserve a drawing of a place that never had a chance to exist is to treasure the quality of an idea over the beauty of its actual form, the way that it looks once made flesh. That act of preservation is also a tribute to the mind that had those visions, rather than the works that remain in the world. But where exactly is the line between the mind and the work of an architect? If that line exists, it is in the difference between two and three dimensions, or perhaps in the translation of scale that comes between model and construction. The line may exist in multiple ways and in multiple forms, criss-crossing like the floor-plan of a house built to hang over a river.
Mamie Till-Mobley wrote her memoir, Death of Innocence: The Story of the Hate Crime That Changed America, in 2003, the same year she died of heart failure, and 47 years after the lynching of her son, Emmett Till. “When I am out and about,” she explained, “people recognize me and they want to talk about him, what his death meant to them, what I mean to them still. They just can’t help it.”
If the public’s curiosity about Till-Mobley’s suffering had come to seem natural over the years, it was in part because she established a tradition of victims’ families publicly grieving for the lives taken in brutal racist attacks. It is thanks to Till-Mobley that those of us who are familiar with Emmett Till’s story know him through a few select images, those she decided to make public after his death. In the first of these photos I ever saw, a 13-year-old Till sports a wide-brim hat and the kind of nonsmiling smile of the reluctant subject of a photograph. In another, taken that same Christmas Day in 1954, he is faintly grinning, and if you look hard enough at his round, unblemished face you can see the outline of a mustache growing above his lips—the kind of facial hair a boy anxious for the world to call him a man may be a little too proud to show off.
And then there’s the most famous photograph of Till, one in which none of these features can be distinguished because they no longer exist. Till lies dead in his casket, his face recognizable as a face only because we know where eyes, ears, nose, and mouth are supposed to be. It is this photo that his mother wanted published in Jet magazine and The Chicago Defender, leading black publications of the time. “People had to face my son and realize just how twisted, how distorted, how terrifying race hatred could be,” she wrote in her memoir. “The whole nation had to bear witness to this.” It is also this photo—evidence of Till’s lynching at the hands of two white men from Money, Mississippi—that helped to launch the civil rights movement. “I thought of Emmett Till,” Rosa Parks said in 1956, “and when the bus driver ordered me to move to the back, I just couldn’t move.”
Mamie Till-Mobley’s memoir keeps company in a small literary subgenre alongside books by Myrlie Evers-Williams and Coretta Scott King, the widows of assassinated civil rights leaders Medgar Evers and Martin Luther King Jr. In 2013, the year after Trayvon Martin was killed, Jesmyn Ward published her gorgeous memoir, Men We Reaped, which chronicles the lives and deaths of five young black men, including Ward’s brother, in her hometown of DeLisle, Mississippi. In the wake of Trayvon’s death and the acquittal of his killer, George Zimmerman, Ward’s stories of these unrelated black men served as a stand-in for Trayvon’s story. Now his parents, Sybrina Fulton and Tracy Martin, have published their own book about his life and death, while Lezley McSpadden has told the story of her son, Michael Brown, and his killing at the hands of a white police officer in Ferguson, Missouri. These works are an outlet for grief, but also part of what has become an obligation for black families to mourn in public.
In an America where mass shootings are a common occurrence, there is no shortage of white people who have lost loved ones in a highly publicized tragedy. But the responsibilities foisted on them are not the same: Relatives of white victims can choose to become activists—they might take up, say, the cause of gun control—but they aren’t required to, and they never have to prove that their very lives have value. Their grief is their own. Black grief belongs to the world, and is regulated by the same forces that caused such deep pain in the first place. Black families become advocates, activists, and spokespeople, historians, journalists, and policy experts, while also being the gatekeepers of the legacy and humanity of those they’ve lost.
And they must somehow do all of this while comforting a society that both produced the conditions for these tragic deaths and still refuses to acknowledge its role in them. If the condition of black life is one of mourning, as Claudia Rankine has reflected, we should at least be able to own our own tears.
In the five years since George Zimmerman, a neighborhood watchman, shot and killed Trayvon in Sanford, Florida, Trayvon has often been referred to as this generation’s Emmett Till. That’s a heavy distinction to bear, for reasons beyond the loss. To brand a black child’s death as the latest incarnation of Till’s is to acknowledge that, in the 60 years between the two killings, the conditions that produced Till’s brutal murder have remained relatively unchanged.
Till-Mobley waited until the end of her life to tell her and her son’s story. “It took quite a while for me to accept how his murder connected to so many things that make us what we are today,” she wrote, but she came to see that “there was an important mission for me, to shape so many other young minds as a teacher, a messenger, an active church member.” Her responsibility, as she saw it, was to children like Trayvon, who would be able to read what happened to Emmett—or Bo, as she called him—and to see what white supremacy is capable of. In their joint memoir, Rest in Power: The Enduring Life of Trayvon Martin, Sybrina Fulton and Tracy Martin—they write alternating chapters—describe how, in fact, they raised their son to stay on guard in this very way. Tracy Martin writes:
I knew good and well that if there was a racial confrontation, no matter the right and wrong of it, the black person involved would be saddled with the presumption of guilt.... A generation later, I had to give my sons the same instructions my mother gave me.
Education and vigilance were, of course, not enough. “Progress,” Trayvon’s father adds, “is sometimes hard to find.”
Martin and Fulton stepped into the responsibility they felt to tell Trayvon’s story much more swiftly than Till-Mobley, but clearly not without their own reservations. “How can I show you the hole in my heart?” Sybrina Fulton asks in her introduction to the book. “How do I write about the death of my son?” For Fulton, this is a question about how to communicate an insurmountable emotion—grief—but it’s also about the extraordinary difficulty of grieving amid so many public responsibilities. How can she relive her pain over the course of writing a book? How can she sit still for long enough to process it all, between speaking engagements and presidential endorsements, between foundation fund-raisers and media interviews, between eating and sleeping and breathing her way back to normal? These obligations are work: labor that is necessary to preserve the meaning of Trayvon’s death for generations to come, but that weighs on those who grieve nonetheless.
Some of this work of grieving is undeniably gendered: Fulton and Martin’s memoir is an anomaly in that Martin, as the father, is a participant in telling Trayvon’s story. He recalls what their attorney told him in the early days of media coverage: “We have to get Sybrina involved. She’s the mother, and people need to hear from her.” His voice, in short, will resonate less than Sybrina’s. It is the mother’s pain we want, the mother’s tears and anguish. When racist violence kills their children, black women are called forth to perform a version of womanhood that is meant to convince white people they value motherhood and to soothe white fear. Black women must be devastated, but not angry. They must stand up for their children, but never question the system. They must fight for justice, but forgive everyone when none is delivered. And when the dust settles, and white people have decided that the “race conversation” has run its course, the mothers must quiet themselves and fade away, until white people decide it’s time for the families to perform their grief once more.
It is no doubt because of the fraught nature of grieving that people reacted so strongly to Dana Schutz’s painting of Emmett Till at the Whitney Biennial earlier this year. Some said that Schutz, as a white artist, could never understand the nature of the racial violence that her painting depicted. She defended herself by shifting the focus from race to gender. “I don’t know what it is like to be black in America,” she said, “but I do know what it is like to be a mother.... The thought of anything happening to your child is beyond comprehension. Their pain is your pain.” On one level, that may be true. But as Fulton’s chapters underscore, there are aspects of motherhood that an artist not connected to this history would struggle to notice, much less convey in a work of art.
Rest in Power is difficult to read, in part because Sybrina Fulton and Tracy Martin did not ask to become authors. The pain of their obligation hangs over every page, every sentence. They are not professional writers choosing to make beautiful sentences out of the darkness as their vocation—as Jesmyn Ward does brilliantly in her memoir. In Men We Reaped, Ward sets out to help us understand what makes her stories important for the rest of us—that acknowledging who killed these young men is a vital part of the healing process:
I write these words to find Joshua, to assert that what happened happened, in a vain attempt to find meaning. And in the end, I know little, some small facts: I love Joshua. He was here. He lived. Something vast and large took him, took all of my friends. Roger, Demond, C.J., and Ronald. Once, they lived. We tried to outpace the thing that chased us, that said: You are nothing. We tried to ignore it, but sometimes we caught ourselves repeating what history said, mumbling along, brainwashed: I am nothing.... There is a great darkness bearing down on our lives, and no one acknowledges it.
Where Ward’s was a private grief made public, Fulton and Martin’s grief was never fully their own. They are charged with taking a story that many of us feel connected to and filling in the personal details we did not know. Their son’s death gripped a nation and sparked a movement. There is a built-in understanding of what is important about their story; the function of their grief is not to illuminate. What is being asked of them is to turn their personal pain into a healing process for the rest of the country. The same nation that denies that the problem of racist violence even exists asks the most vulnerable to diagnose and treat it.
Over and over again, Tracy Martin reiterates that he is a truck driver. In other words, he is an ordinary man, with an ordinary job, trying to lead an ordinary life. He didn’t set out to be an activist or an expert on the Stand Your Ground policy that allowed Zimmerman to escape any legal responsibility for Trayvon’s death; Martin wanted only to provide a decent life for himself and his children. “All I wanted was to be a mother, to work at my job and raise my kids and live a normal life,” Fulton writes, echoing her ex-husband’s sentiment. “Then my son was killed and that world went with him.” Her new world is one in which she must perform her grief, not only because of a public desire to know, but because of a public penchant for distorting the victims’ characters.
If Fulton and Martin’s book lays bare the work of grieving, Michael Brown’s mother, Lezley McSpadden, has taken on an even bleaker task: the labor of rehabilitating her son’s image. “Michael Brown, 18, due to be buried on Monday, was no angel,” The New York Times wrote less than three weeks after he was shot by Ferguson police officer Darren Wilson. After much outrage, Margaret Sullivan, the newspaper’s public editor at the time, acknowledged that it was an “ill-chosen phrase.” But the piece accurately reflected a narrative that had formed about Brown. Video had surfaced that showed him aggressively handling a cashier at a convenience store and allegedly stealing a pack of cigarillos shortly before he was shot in the street by Wilson. That was enough to convict Brown—not of stealing or assault, but of the higher crime of being a “bad nigger.”
In polite society today, of course, no one would actually say those words aloud. (Though in a day and age when a candidate can be elected president after referring to “bad hombres” and “nasty women” on the campaign trail, it’s not hard to imagine that “bad nigger” is only biding its time until it is once again socially acceptable.) But the same narrative repeated itself in killing after killing. Eric Garner’s death at the hands of an NYPD officer in Staten Island earlier that year became an indictment of Garner’s illegal activity of selling loose cigarettes; Sandra Bland’s mysterious death in a Texas jail cell in 2015 was reduced to her “mouthing off” and failing to comply with an officer during a traffic stop.
In Tell the Truth & Shame the Devil: The Life, Legacy, and Love of My Son Michael Brown, McSpadden sets out to make clear that while her son may have been “no angel,” he was human. He was a person with goals. He was kind, gentle, a caring friend, and a son she loved with all her heart. Not only was his life taken from her, but in death he was defined as something other than the son who brought her joy. And now it has become her responsibility to set the record straight.
“I wasn’t there when Mike Mike was shot,” she writes. “I didn’t see him fall or take his last breath, but as his mother, I do know one thing better than anyone, and that’s how to tell my son’s story, and the journey we shared together as mother and son.” Her book, which includes a foreword by Myrlie Evers-Williams, largely recalls Brown’s life before he was killed. She writes lovingly of his affinity for music from an early age, and less so about her struggles to raise him as a teenage mother who faced abuse at the hands of his father. Mostly, what she strives to get across is that before Wilson killed her son, Michael Brown was working hard to live a normal life. Even if the rest of the world never comes around to embracing Michael Brown as human, with all of the contradictory and complex characteristics implied by the term, his mother now owns her own narrative of his life. In these pages, he exists, whole and loved.
Yet it should never have been Lezley McSpadden’s job to reclaim the story of her son’s death. Part of owning the narrative must be the ability to grieve on your own terms. On April 29 this year—the twenty-fifth anniversary of the day four LAPD officers were acquitted in the beating of Rodney King, an act of injustice that ignited the L.A. riots—15-year-old Jordan Edwards was shot and killed by Roy Oliver, a white police officer in Balch Springs, Texas, who fired his rifle into a car full of teenagers leaving a party. In a statement released ahead of Edwards’s funeral, his family wrote:
At this time, we ask that you please be respectful of our family, and allow us the opportunity and space to grieve. This entire ordeal has been inescapable.
And in a second statement: “Though we understand what his life and death means symbolically, we are not ready to make a martyr of our son.” In the absence of justice, perhaps progress is simply returning to black families their right to grieve in peace.
On June 18, Wisconsin ironworker Randy Bryce announced his campaign to unseat Speaker Paul Ryan with a campaign ad that instantly went viral. In it, he talks about his own battle with cancer and his mother’s needs as a patient with multiple sclerosis. He attacks Ryan’s opposition to the Affordable Care Act. And he presents himself as an outsider, populist candidate who can re-energize the Democratic Party. “I decided to run for office because not everyone is seated at the table,” he says in a voiceover. “It’s time to make a bigger table.”
At Payday Report, Mike Elk reports that Bryce has already lined up endorsements from the Milwaukee Building Trades, state Senator Chris Larson, state Representative JoCasta Zamarripa, and former House candidate Rob Zerban.
Though this isn’t Bryce’s first run for office—he ran for state assembly and state senate and lost both times—he may be just the candidate his beleaguered party needs. Not only is he running as a blue-collar progressive and well-known union member in a state with a storied history of labor politics and agrarian populism, but Bryce could also be something of a unity candidate for Democrats. “Although he was a Sanders surrogate during the primary, he campaigned for Hillary Clinton in the general election and would have been an elector for her had she won,” Elk writes.
Bryce has one Democratic primary challenger so far: David Yankovich, who announced his candidacy on May 30. In this interview, Bryce explains to the New Republic his reasons for running and how he plans to win a district the Democratic Party hasn’t held since 1995. This interview has been edited and condensed for clarity.
Tell me why you decided to challenge Paul Ryan.
I’m a lifelong resident of Southeastern Wisconsin. I graduated from public schools, went into the Army after that. When I came back, I was diagnosed with cancer and I didn’t have insurance, and now it’s considered a preexisting condition. I worked sometimes two full-time jobs to make ends meet. Finally, I joined the union, the Ironworkers Union, which had an apprenticeship. I got my journeyman’s card and I’ve been doing that for 20 years now. As I drive through the district I can look and see, “I worked on that, I built that.” So literally I spent the last 20 years of my life building the district. Looking over at Paul Ryan, I’m wondering what he’s been doing.
Things have been taken away from us. Autoworkers used to have a lot of great-paying jobs building cars. Right now they’re tearing down the UAW plant—the General Motors plant—in Kenosha, there’s a huge abandoned facility in Janesville, and some of the best-paying jobs in Waukesha County are going up to Canada.
People are working harder these days and having less to show for it. Paul Ryan hasn’t been in the district for a town hall in over 600 days and it’s time to make a change. If I can’t perform my job I get fired at work. And it’s time to get someone who can do the job Paul Ryan was hired to do.
How will your experience with the union influence your campaign?
I see this as an opportunity to create stewardship, to look out for the rest of the people in the community. Just like I’ve done as a member of the union’s executive board. It’s about taking care of people, and making sure that they’re heard, and that people are treated fairly. Nobody’s been heard, and that’s the biggest complaint right now.
Donald Trump won an area in Kenosha that had traditionally been Democratic, but people are waking up and they’re seeing that it was all talk. They have buyer’s remorse now. I’m a working person, I don’t play one in a video. That’s my life, and I’ve always stood with working people. That’s where I’m coming from. The majority of the people in this district are working people. They’re not corporate donors, and that’s who Paul Ryan’s been spending most of his time with.
Do you support the Fight for 15 campaign?
I do, and I’ve been at numerous actions on behalf of providing people a livable wage. I feel strongly that anyone who works a full-time job deserves the freedom to be able to stand on their own two feet.
Your first ad focused prominently on health care. Do you support single-payer health care?
I do. I am convinced we need to move towards single-payer. It works every place else. There are improvements that need to be done with Obamacare, but to completely remove it and the protections that are in place, I see that as the wrong way to go.
Can you tell me a little bit about how you decided that your first ad would be about health care and that you would feature your mother?
Well, it’s one of the issues that’s intergenerational. The Ironworkers are self-insured, so it’s based on hours worked. So especially during the winter months, when there’s not a lot of work, it makes me, as a dad, concerned I might lose health insurance, which would affect my son. Do I make him stay inside in his room and wear knee pads and a helmet to eat dinner? Or can I let him be a kid? Parents shouldn’t have to worry about that. It also affects me personally being a cancer survivor. Luckily I’ve been in remission, but what if it comes back? How is it going to affect me? And with medical bills being the leading cause of bankruptcy I don’t want to be in that position. I don’t want to have to choose between paying my rent or seeing my doctor.
And it affects my parents: My mom, who is in the video, has multiple sclerosis. Luckily she has insurance that can get her the medication she needs, but there are too many people that don’t. If one person can’t get the medication they need, that’s wrong. My father’s in assisted living because he has Alzheimer’s, so that affects my mom too. Thankfully, she is able to have her independence due to the medication she takes, so she can go see my dad. Health care is a universal issue that affects all ages and all races—everybody, regardless of economic status.
What’s your position on abortion rights?
I am firmly committed that it is a woman’s choice to make decisions about what happens to her body.
And you support LGBT rights as well?
For people who aren’t from your district: What do you want them to understand about it?
It’s a broad section of Wisconsin. There’s large urban areas, cities like Racine and Kenosha, which is now the third-largest city, and more to the west it’s all farmland. It’s a big cross-section of working people. You could pick up the first district in Wisconsin and put it pretty much any place on the map and it would blend in anywhere across the United States. It’s a lot of people, and it’s made up of different ethnicities, and it’s a melting pot of what America should be. We take care of our neighbors.
How are you going to address the urban-rural divide in your campaign?
It’s easier to hit the urban areas as far as reaching more people, but there’s going to be emphasis placed on going to the harder-to-reach places. We need to pay attention to everybody in the district. It’s easier for me, living where I do, to reach out to the urban people, but there are concerns too for farmers—like making sure that rural roads are taken care of, that they have access to things like broadband service.
I’ve always had such a healthy respect for farmers. I know the hours I put in are hard hours, but we have eight-hour days. Farmers work from sunup to sundown; they don’t get days off and they have to worry about their retirement. Maybe it’s getting them access to some kind of pension system, so that after donating the best years of their life to raising the farm they have some restful years to enjoy what they earned.
Paul Ryan often appeals to his roots. But you seem to have a very different vision about what it means to be from a state like Wisconsin.
It’s obvious who Paul Ryan is making his decisions for when he has time to go to 50 fundraisers throughout the country and not have one town hall in his own district. If I don’t show up for my job, I’ll get fired and they’ll get somebody else to take my spot. I can’t imagine asking somebody for $10,000 to have their picture taken with me. That’s unimaginable. It shows where his priorities are and they aren’t the people in this district. When he shows up there are breaking news alerts: Paul Ryan has been seen in the First Congressional District at such and such a place. It’s so wrong.
How do you plan to build on the success of your first ad?
Within one day of the video coming out we’ve managed to bring in over $100,000 in donations. The average donation has been about $30, so there’s been a lot of donations of smaller amounts and I appreciate that as a working person. I know how hard money is to make these days, and just having so many people interested and willing to part with their hard-earned money means a lot to me. People have been offering from throughout the area: “How can I volunteer? How can I help?” We’re going to build on the momentum that the video generated. People are going to see exactly what this is about: standing up for people exactly like them.
It is 1996 in Brooklyn. The crime rate is on the decline, artists are fleeing Manhattan and its staggering rents for neighborhoods such as DUMBO, Williamsburg, and Greenpoint, immigrants are flocking to the borough, and you can still buy a brownstone for under $500,000. This is also the year of the Fugees’ iconic album The Score, Lil’ Kim’s Hardcore, Foxy Brown’s Ill Na Na, and Jay-Z’s Reasonable Doubt. The era is one of creativity, movement, and rapid innovation, making it fertile ground for the racial dynamics explored in Danzy Senna’s highly anticipated third novel, New People. In a decade when the country has witnessed the Rodney King beating, the Los Angeles riots, and the O.J. Simpson trial, racial tensions are at an all-time high. This is not the time to try to escape one’s race. But there are black Americans whose trauma from decades of racism leads them to retreat into a world of the light-skinned elite, a world where they hope they will be safer, more compatible with the American Dream.
This is the world in which we meet Maria Pierce and Khalil Mirsky, two light-skinned, mixed-race black people who want it all and are on track to get it: a Brooklyn brownstone, a wedding at a lighthouse in Martha’s Vineyard with nouveau soul food, a dog named Thurgood, and two children “with skin the color of burnished leather” and “hair the color of spun gold” named Indigo and Cheo. Maria and Khalil met at Stanford, where they fell in love over conversations about interracial dating and misogyny in hip-hop, Giovanni’s Room and Cosby episodes, chicken and waffles. Now, Khalil, a part-time technology consultant, is about to take advantage of the dot-com boom by creating an online community of “modern tribalism” with his friend, while Maria spends her days finishing up her dissertation on the Jonestown Massacre. It’s perfect. Until it isn’t.
The novel opens with Maria entranced by a dark-skinned poet, one who could never racially pass, at a reading. You might think that this spellbinding moment is the genesis of an affair. Instead, we realize that Maria is watching this performance with Khalil and her future sister-in-law, Lisa. The undercurrent of this story is not about Maria’s identity as a wife, but rather about coming to terms with her racial identity and all the trauma she harbors around it. Maria believes her blackness can be reinforced through TV shows, hairstyles, music, and food—she doesn’t seem to grasp that blackness can exist beyond these things. Her obsession with quantifying and defining her own blackness leads her down an emotional spiral. In an odd scene, a trip to a Scientology church puts Maria face to face with an interrogator, who asks, “Can you remember a time when you were really real?” She breaks down.
No matter how many Different World episodes she watches or how many times she perms her hair to look like Whitney Houston’s, Maria is never at peace. Her mother believes that this is just the inevitable plight of a light-skinned person, and she’s not entirely off-base. With Khalil, her life will be safe and comfortable, even easy. Their trajectories are too privileged, too set in stone. All she has to do is put one foot in front of the other and together, she and Khalil will enter into a polite happily ever after, with a New York Times wedding feature to boot. But she’s floundering. She cannot finish her research on the Jonestown Massacre, which seems like a symbol of Maria’s internal darkness. At least 75 percent of the Peoples Temple members were black, while the core leadership of the cult was white. Its “messiah,” a white man named Jim Jones, incited his followers to commit “revolutionary suicide,” and the primary victims were black women. But originally, the Peoples Temple, which was founded in 1955, had been a radical project influenced by Marxism and the utopian ideals of the New Left. The black women who died in Jonestown were people who had wanted to work toward a better world, but ultimately yielded to the deadly force of white authority. It was, in a sense, a failed experiment of a mixed-race utopia. Herein lie echoes of Maria’s conundrum: How can she really be liberated if the foundation of her relationship with Khalil is based on their light skin? This fascination is reinforced when they are approached to be a part of a documentary, New People, about people who blur racial boundaries in order to usher in a new era, a new race. Khalil and Maria are going to be the focus of the narrative even though their story—of passing, of racial ambivalence, of conflicted identity—isn’t new.
There is a long history of light-skinned African-Americans forming exclusive communities among themselves. Throughout the 20th century, if you were darker than a brown paper bag, you could not access certain social clubs, sororities, fraternities, or churches. In 1790, the light-skinned free black men of Charleston, South Carolina, created the Brown Fellowship Society, a funeral organization for black men that acquired proper burial grounds, supported widows, and educated surviving children. Besides light-skinned or “brown” men, only dark-skinned men with naturally straight hair were allowed to join. Perhaps as retaliation, darker-skinned men led by Thomas Smalls created the Society of Free Blacks of Dark Complexion in 1843. It wasn’t until after the Civil War that the Brown Fellowship Society expanded to women and those of darker complexion irrespective of hair texture and changed its name to the Century Fellowship Society. There was also the Bon Ton Society of Washington, D.C., and the Blue Vein Society of Nashville; you had to be light enough for someone to see the spidery veins on your wrist in order to join the latter group.
As the daughter of the Afro-Mexican poet Carl Senna and the half-Irish, half-English writer Fanny Howe, Danzy Senna has never shied away from race in her works. Her first novel, Caucasia (1998), which explores racial passing, gave her literary fame and became a national bestseller, winning the American Library Association’s Alex Award. Her other works—the novel Symptomatic, an autobiographical book called Where Did You Sleep Last Night?, and a short story collection titled You Are Free—deal with similar themes around mixed heritage, lineage, and identity. In a 2011 NPR interview, Senna says that although she is mixed race, she was raised to identify as black. When the interviewer asks Senna “What are you?” she responds, “I’m interested in kind of deconstructing the question itself and asking the person who wants to know, ‘Why do you need to know? And why is it uncomfortable for you that you don’t know?’ And, you know, I’m less interested even in that than I am in writing about ambiguity and power and economics, and looking at the history of these terms.”
In New People, the brown-skinned poet holds an interesting position, both within the prose and within Maria’s interiority. He doesn’t have many lines, and Senna is very conservative when it comes to describing him—he doesn’t even have a name. Sometimes it’s hard to suss out if this poet is a real person or a projection of Maria’s desire to heal her clashing views of blackness by achieving intimacy with someone darker than she. Maria seems to yearn for a bond, a connection to another black person on the other side of the color spectrum. This might be why Maria is obsessed with the poet from the moment she lays eyes on him, breaking into his apartment and holding onto one of his hats so he can only retrieve it by meeting up with her for a drink. Khalil is still her partner, but when he reappears in the story, he arrives alongside some responsibility or chore: wedding dress fittings at Bergdorf Goodman, wanderings at Crate & Barrel, New People documentary shootings. Between the sex on their apartment floor and their Waldorf salads, it is very easy to pick up the banality between them. He is moving forward with their life while Maria is pulling back from it, despite the black women around her telling her that Khalil is a good man and it doesn’t get any better than this. The poet, on the other hand, unearths her impulsiveness—he brings out her need to expand, to exist beyond the comfort of her privilege.
The release of New People six months after Donald Trump became president might be fortuitous. Although the conversation surrounding identity politics has become a much-belabored one, considering the alliances and nuances within racial groups remains relevant. It is not enough to argue that political alliances happen among and between women, black people, Latinos, immigrants, and LGBT people. Within communities and individual lives, the realities are more complicated: privileges and oppressions compound and collide. Are you rich? Are you poor? Are you light-skinned? Are you dark-skinned? Are you educated? Are you not? These questions shed light on the divisions within the aforementioned groups; they broaden and complicate our ideas of solidarity.
Still, as a novel, New People is not without its flaws. The story seems rushed, with dialogue that reads as too abrupt or out of place to be realistic. The ending may throw readers for a loop with its ambiguity, even if the open-endedness is compellingly provocative. But the question of “Can you remember a time when you were really real?” reverberates on each page. Are any of these characters really real? Could they exist beyond the confines of their own knowledge of what it means to be black, or would they be destroyed by such a potentiality? Where the novel succeeds is in creating a dense psychological portrait of a black woman nearing the close of the 20th century: inquisitive, obsessive, imaginative, alive. She is as puzzling as she is alluring, even if one may finish the novel feeling as though the issues are unresolved. Maybe that’s just how it is to live a life that transcends what’s written on the page.
Rahm Emanuel has a battle plan for Democrats, and it looks mighty familiar. On Tuesday, just hours before two special congressional elections were called for Republican candidates, the mayor of Chicago and political operative Bruce Reed published a piece in The Atlantic on “How the Democrats Can Take Back Congress.” As architects of the party’s midterm strategy in 2006, when Republicans lost control of both the House and Senate, they argued that “Donald Trump came to Washington to make waves—and he may deliver a wave election powerful enough to sweep his party out of control of Congress.” But, they added, “Waves don’t happen on their own: Democrats need a strategy, an argument, and a plan for what they’ll do if they win.”
Emanuel and Reed cautioned that the party can’t “rely entirely on one side’s enthusiasm or the other side’s disenchantment,” and that “Democrats don’t need to spend the next year navel-gazing over how to motivate their base.” Instead, they need to “choose the right battles” and “choose credible candidates who can win them.” “Winning hotly contested swing seats,” they argued, “requires candidates who closely match their districts—even if they don’t perfectly align with the national party’s activist base.”
By definition, good candidates match their districts. But whether that requires deserting the “activist base” is an increasingly contested question in Democratic politics. In the wake of Bernie Sanders’s unexpected success in last year’s primary, and Hillary Clinton’s unexpected loss in the general election, the ascendant populist left is arguing that its policies are broadly popular—and that energizing the base is a more fruitful path to victory than fielding centrist candidates who can court Republicans or independents.
Assessing Democrat Jon Ossoff’s loss in Tuesday’s special congressional election in Georgia, some on the left blamed his centrist messaging—his focus on deficit reduction, for instance, and lack of anti-Trump rhetoric. “In the closing weeks of the race, Ossoff and the DCCC missed an opportunity to make Republicans’ attack on health care the key issue, and instead attempted to portray Ossoff as a centrist, focusing on cutting spending and coming out [in] opposition to Medicare for All,” Anna Galland, MoveOn’s executive director, said in a statement. “This approach did not prove a recipe for electoral success. Democrats will not win back power merely by serving as an alternative to Trump and Republicans.”
Ossoff, one could argue, was tailored to his district, and yet he fell short. It’s one of many reasons the Democratic establishment could face criticism if they pursue a midterm strategy similar to Emanuel’s, which drew objection from the left at the time—and still does.
Meredith Kelly, communications director at the Democratic Congressional Campaign Committee, says Ossoff didn’t have trouble courting progressives. “His base was there,” she told me on Wednesday. “I would argue there’s no evidence he had a problem on the progressive side.” Nor does she see evidence he should have been more overtly anti-Trump. “This is a district where Trump still is not underwater the way he is nationwide,” Kelly said. “This is a place where Trump was still 50-50 in terms of approval.” Ultimately, she told me, “There are going to be a lot of opinions. I don’t think anyone knows for sure why Jon Ossoff lost other than this district is really hard and he ran out of Democrats and independents to support him.”
Politico reported last month that national Democrats were consulting Emanuel on how to replicate his strategy for 2018: “Democrats believe President Donald Trump has already given them enough to make the ‘cronyism, corruption and incompetence’ argument they employed in 2006 — when [House Minority Leader Nancy] Pelosi and Senate Minority Leader Harry Reid first implored voters to ‘drain the swamp’ in Washington.” It’s hard to imagine any Democrat quibbling with that message—at least as part of their midterm argument—or objecting to the type of early and aggressive fundraising Emanuel employed in 2006. He relentlessly tied rank-and-file Republicans to President George W. Bush, and there’s no doubt Democrats should soil every GOP candidate with Trump’s disastrous agenda. (To be sure, Emanuel is also stressing that Democrats need their own positive agenda to complement their negative messaging—another popular conclusion drawn from Tuesday’s results.)
Yet as much as 2006 was a historic victory for Democrats, Emanuel’s strategy prompted intra-party divisions—debates that remain relevant to the future of the party today. Wall Street Journal editor Naftali Bendavid covered the campaign for The Chicago Tribune and authored The Thumpin’: How Rahm Emanuel and the Democrats Learned to Be Ruthless and Ended the Republican Revolution. He told me, “There was also this huge debate going on, then as now, over whether you go with populist base-type candidates or recruit for the district,” which in 2006 meant running moderate and even conservative candidates. The debate then “was less economic populism, which seems to be part of the discussion now, but it had to do with gun control, abortion, and a certain degree of social conservatism,” Bendavid said. Victory in November didn’t resolve these arguments, either; some on the left argued they could have won bigger majorities with more progressive candidates.
In 2006, Democrats were also divided over Emanuel’s focus on swing districts versus Howard Dean’s push, as chair of the Democratic National Committee, for a “50-state strategy.” The disagreement resulted in a long-running feud between the two men, each of whom credited his own approach for the party’s midterm victory. Asked to comment on Emanuel’s Atlantic piece, Dean said, “I don’t think anyone outside the Beltway will read this or care.” He told me Emanuel made major contributions in 2006, particularly with fundraising, but Dean downplayed the extent to which Democrats can emulate that year’s strategy for 2018. “The playbook’s going to be very different,” he said. “To think that we’re going to use the strategy from 12 years ago that was important but not sufficient is silly.”
In Dean’s view, the DCCC and the DNC should leave candidate recruitment to people outside of Washington, support activist groups, and reengage millennials. “The activist base will support some moderate Democrats in the appropriate districts,” he stressed. “We have to trust these people.” If Democrats follow this approach, Dean is incredibly optimistic about their prospects. “I believe we’re going to take the Senate back in 2018 as well as the House, but we’re not going to do that if we start screwing around inside the Beltway, thinking we know best,” he said. “Washington does not understand what’s going on in the rest of the country, and if they did Donald Trump wouldn’t be president of the United States.”
Dean and other progressives also reject the idea that the DCCC’s strategy made the difference in 2006. Markos Moulitsas, founder and publisher of the progressive blog Daily Kos, told me, “We won in 2006 because George W. Bush had worn out his welcome.” Moulitsas argues that voters were primed to oppose Republicans “no matter what warm body Democrats had thrown into a district,” and there’s a temptation to “way overplay the quality of a candidate in these wave scenarios.” Similarly, he said, “2018 will be a referendum on Trump, and the only thing that will matter for Democrats is having a ‘D’ next to their name. And given this year’s special election results, upward of 100 Republican-held seats could be in some level of play. We won’t win that many, for sure, but 40-60 isn’t out of the realm of possibility. And at that point, candidates aren’t winning based on their charming personalities and milquetoast politics. They are winning based on massive negative public sentiment.”
Moulitsas cautions Democrats not to depress the intensity of their base by softening their anti-Trump stance or trying to engineer candidates. “If the base sees Senate Democrats slowing the Senate to a standstill to kill TrumpCare, they will be more motivated to work hard for Democrats next year. That’s what riding the wave looks like,” he said. “That means not overthinking how some candidate fits in with a district (often based on past voting patterns, not aspirational future ones based on non-voting potential base voters in a district), but rather, taking actions, starting today, that will further rev up base intensity, priming them for next year’s vital elections.”
“Midterm elections are overwhelmingly about turning out the base,” Ben Wikler, MoveOn’s Washington director, told me. Like Moulitsas, he thinks the “unified Democratic message about the culture of corruption” was effective a decade ago, and “gains that Democrats made in 2006 were the result of progressive organizing that capitalized on the public’s rejection of George W. Bush and cronyism.” “That is not a run-corporate-friendly-centrists strategy,” Wikler emphasized. “Our sense is that in 2018, even more than in 2006, the public is furious with politics run for the benefit of billionaires.... Especially in this moment of populist resistance energy, we’re going to be best served by candidates who unite progressives and the disaffected with authenticity and vision people can believe in.”
Kelly insists that the DCCC “has done a number of new things this cycle that get us outside the supposed bubble and closer to the ground on these districts,” including hiring local organizers to work with activists on the ground. She said “there are definitely parallels” with 2006, “but I think we’re also doing things in brand new fresh ways that are unique to the environment we’re in.” She added, “We’re in a new frontier and the grassroots are some of the most powerful people in our politics right now.”
“I think our candidates will be able to walk and chew gum at the same time in terms of appealing to Democratic base voters who are critical to winning in swing districts while also appealing to independents and some moderate Republicans,” Kelly said. “To Rahm’s point, these are Republican districts. Our whole battlefield is Republican-leaning.” She added that the party’s base “is not really imposing a litmus test as far as I’ve seen.”
The question is whether winning on a Republican-leaning battlefield requires centrists. “I’m not going to prescribe a specific ideology that a candidate needs,” Kelly said, “but what we do absolutely care about is that a candidate fits the district. We really want authentic people that understand these communities. There’s no one-size-fits-all approach to these districts. We absolutely reserve the right to get involved in these primaries where necessary.” That’s certainly similar to the DCCC’s rationale in 2006. John Lapp, the committee’s executive director then, told me, “2006 was less about an ideological profile for candidates and more about recruiting non-traditional candidates in non-traditional places. Helping recruit and win with sheriffs, military veterans, and even high school football coaches in purple and ruby red Republican districts all across the country. We’re certainly doing that now in 2017, as well.”
Everyone seems to agree on recruiting non-traditional candidates. “Donald Trump led the way in showing how the nation’s populist sentiment is anti-politician,” Moulitsas said. “Democrats would be wise to run veterans, teachers, firemen, mothers, and other such non-traditional faces. Let’s skip the typical politician, and let’s run people that voters can better identify with.” In a memo released after Ossoff’s loss, DCCC chairman Ben Ray Luján wrote, “Let’s look outside of the traditional mold to keep recruiting local leaders, veterans, business owners, women, job-creators, and health professionals. Let’s take the time to find people who fit their districts, have compelling stories, and work hard to earn support from voters.”
Where there’s likely to be debate, of course, is when these non-traditional candidates depart from progressive policies. In 2006, Bendavid said, “A lot of the people with military, police, and athletic backgrounds also were not down-the-line liberals.” Similarly, Luján’s memo argues that “the road back to a Democratic House majority ... necessitates fielding strong candidates with diverse profiles that fit unique Republican-leaning districts. It demands that we continue embracing a big tent mentality.”
The Democratic Party’s strategy aside, there’s already a sense that the grassroots are shaping the landscape more today than they did in 2006. Democrats are “busy sorting through potential candidates, who in some cases number more than a dozen interested prospects for a single district,” according to Politico. “The DCCC has been succeeding much earlier than usual in landing strong recruits.” “Democrats were largely demoralized after the 2004 election,” Lapp told me. “It took a real leap of faith for Democrats to believe again, particularly in these purple and red districts. We were desperate to have people run. That’s very different from 2017, where we’ve got an embarrassment of riches. Folks are coming out of the woodwork to run for Congress—all on their own.”
The Democratic Party will certainly play a major role in these races, as its various arms decide which candidates to allocate resources to and fundraise for. But it’s not clear that the DCCC, for instance, wields the same king-making power that it did under Emanuel. If the grassroots are indeed much stronger today than in 2006, they may well have the power to set the party on a new, leftward course, just as Sanders’s movement nearly did last year. As Lapp said, “It is up to Democratic primary voters to sort out who they prefer—pitchfork-wielding progressives or more moderate-minded candidates.”
In 2014, when Paul Ryan was trumpeting his soon-to-be-released anti-poverty plan, he accused welfare recipients of suffering from a “culture problem.” “We have to re-emphasize work and reform our welfare programs, like we did in 1996,” he stated in a radio interview:
We have got this tailspin of culture, in our inner cities in particular, of men not working and just generations of men not even thinking about working or learning the value and the culture of work.
Immediately, liberals pounced on Ryan’s racist “poor people are lazy” rhetoric. Charles Blow at the New York Times wrote, “When we insinuate that poverty is the outgrowth of stunted culture … we avert the gaze from the structural features that help maintain and perpetuate poverty.” The Daily Beast’s Jamelle Bouie asserted that Ryan wanted to blame black people for their own ills, without acknowledging that “inner-city poverty didn’t just happen, it was built” through decades of racist policies. Over at Jacobin, Keeanga-Yamahtta Taylor wrote that only the “reintroduction of the ideas of structural inequality, institutional racism, and injustice can make sense of the reality millions of African Americans find themselves trapped in.”
It is a ballet that has played itself out in American politics over and over again: The right argues that poor people are living in poverty because of their individual failings. In response, the left insists that in fact a historically racist system, bad education, segregated housing, and stagnating wages are to blame. The debate revolves around whether someone can be held personally responsible for their outcomes: The right argues that most people can, while the left answers that most people cannot. The left is correct to point to the overwhelming evidence of the huge role that structural issues play. But its failure lies in agreeing to engage on these terms at all.
In his new book, The Age of Responsibility, Harvard lecturer Yascha Mounk argues that this “responsibility framework” is woefully inadequate and leaves liberals hamstrung. Debating “who bears responsibility for what” diverts us away from the more important question of how we can foster “equality and solidarity in capitalist democracy.” Instead of denying that responsibility ever plays a role, the left should, Mounk argues, often ask a simple question: Why do we predicate the receipt of benefits on personal responsibility at all?
It was not always this way. Although responsibility rhetoric is nothing new, it has transformed over the past few decades: During the Cold War, politicians embraced the idea of “responsibility-as-duty,” the notion that people had a collective responsibility to sacrifice for the nation in order to uphold freedom in Western society. But by the 1980s, Ronald Reagan’s rhetoric, which focused on the individual achievement of the American dream and the racist scapegoating of “welfare queens,” who he claimed were gaming the system, converted the idea of “responsibility-as-duty” into one of “responsibility-as-accountability.” Reagan pushed the belief that if someone made bad choices in life, they did not deserve any help from the government. The primary role of the welfare state then became to punish those who were not thought to live responsibly (mainly poor people and minorities) and reward those who did (mainly rich, white people).
Since then, politicians both on the right and the left have run with this punitive idea of responsibility. As Bill Clinton signed the 1996 welfare reform bill (aptly termed the Personal Responsibility and Work Opportunity Reconciliation Act), he famously declared, “this legislation provides an historic opportunity to end welfare as we know it and transform our broken welfare system by promoting the fundamental values of work, responsibility, and family.” The shift, Mounk argues, was not limited to the United States. Britain’s prime minister, Tony Blair, emphasized the same ideal as Clinton, asserting that “the left argued for rights but were weak on responsibilities.” In France it was “responsabilité individuelle,” in Italy “responsabilità individuale,” in Poland “odpowiedzialność osobista.”
This heralded what Mounk terms the “Age of Responsibility”—a change so sweeping that it has become deeply entrenched in politics around the world. The result, in America at least, is telling: Since welfare reform, the number of households living in extreme poverty (defined as living on less than $2 per person, per day) in the United States has risen drastically, from 636,000 to 1.65 million. States are threatening to add work requirements to entitlement programs like Medicaid. We now have an administration that wants to cut $272 billion from welfare programs.
The solution that Mounk proposes is no less radical: to reinvent the idea of personal responsibility itself. To make the case, Mounk runs through some of the pitfalls of the left’s strategy of denying that people have personal responsibility. First off, it’s counterintuitive. As Mounk notes, “ordinary people have reason to value responsibility, and they recognize this fact.” By denying that poor people and minorities have agency over their lives, liberals inadvertently diminish them.
It would be more effective, Mounk argues, to push for a “positive conception of responsibility,” one that “reminds us that our political institutions exist for a reason: In particular, the welfare state is meant to ensure the equal standing of all citizens, to give people assurance that they will continue to have access to the material goods they need to live a life of simple dignity” and to “facilitate a life full of meaningful, freely endorsed responsibilities.” Importantly, it would require us to put forward a positive vision for our welfare system: things like health care for all, free college, and guaranteed economic security.
What, in practice, would it mean to push for a welfare state that facilitated freely endorsed responsibilities? Consider the choice to have a child. For many people, having a child while young and healthy might mean being forced to raise that child in poverty because of, say, the exorbitant costs of child care, relatively lower salaries for early-career employees, and a lack of affordable housing. This, of course, is not really a choice at all. Only a welfare state that provided some form of subsidized child care, universal basic income, and high-quality public housing, allowing parents to raise children without the fear of being relegated to poverty, would make childrearing a meaningful, responsible choice.
It’s important to note that Mounk purposely stops short of actually proposing a vision of society, stating, “It is ultimately up to the people to decide how important it is for them that their institutions should support them in caring for their children or in tending to their sick relatives.” His goal is to reveal the inadequacy of the left’s current strategy, so that it can break out of the constrictive framework the right has erected around it.
It’s also crucial to point out that personal responsibility rhetoric in America, at least when it comes to welfare reform, has been, in large part, about white supremacy. Reagan’s “welfare queen” and Ryan’s “inner cities” both play into white people’s fear of black people bilking the benefit system and a desire to punish them for their supposed choices. Thus, a huge obstacle for any push to define our welfare state as an institution that has the collective duty to offer both societal and material equality to people is, of course, racism.
Which leaves us with an immense, but necessary task: to reimagine our welfare state as one that endorses a positive, collective concept of responsibility, whose primary goal is to ensure material and social equality for all people; to blow up the deep-rooted assumption that we should hold people accountable by denying them benefits. The post-New Democrat left cannot fall into the same traps as their predecessors. There is perhaps no better time than now for a radical rethinking of the purpose of both our policy institutions and political rhetoric.
Until a modest breakthrough this week, supporters of the Affordable Care Act had spent the month of June desperately trying to make an issue of the Senate Republicans’ secret plan to take health insurance away from millions of Americans.
On May 31, I described the GOP health care heist as a “scandal marked by secret meetings, violated norms, collusion, and deceit,” but one about which, unlike the Trump investigation, “most of Washington has decided not to care.”
In a series of segments beginning June 9, MSNBC host Chris Hayes has lit his hair on fire to underscore just how aberrant the Republican approach to its health legislation has been. “It is remarkable that this process is happening,” he said. “The House process was truncated. What’s happening now is, I think, completely unprecedented.”
On June 13, as on many other days this month, Senate Minority Leader Chuck Schumer used his daily floor remarks to highlight “one of the greatest acts of legislative malpractice Washington has ever seen…. [T]hey don’t want the American people to see how poorly they would do under this bill. They don’t want people to see just how well the special interests do under this bill.”
Soon, activist groups began demanding that Senate Democrats use all the procedural levers in their reach to force the bill’s contents into the light—and then defeat it. A draft of the bill is now expected to be released on Thursday, though the public may have as little as a week to digest it before a Senate vote.
What animated all of this justified alarmism wasn’t just the secrecy of the bill-writing process—the anti-democratic method of hiding fine print from the public—but that Republicans were using that method to fade the entire purpose of Trumpcare out of public memory. And it was working.
HuffPost reporter Jeff Young wrote last week that “as important as the legislation’s details will turn out ... to be, there’s a simple, fundamental, incontrovertible fact about whatever the Senate health care reform bill winds up looking like: The purpose of this bill is to dramatically scale back the safety net so wealthy people and health care companies can get a massive tax cut.”
This is the throughline of the entire, horrifying Obamacare repeal story, and almost without exception, it was omitted from all the places most Americans get their news—television, print, and online front pages—until the past few days.
It is tempting to attribute the weeks of lost headway to a lack of liberal imagination. Many Democrats and journalists “get tangled up in policy literalism and boxed out of being able to speak clearly about the political reality that is coming,” wrote TPM’s Josh Marshall. “To be more specific, even if they don’t quite get that this is happening, many Democrats think that there’s nothing to discuss or attack since we don’t know the fine print of the legislation despite the fact that its broad scope and impact are clear.”
For the reasons spelled out above, I think this misdiagnoses the source of the challenge and the solution to it. Senate Majority Leader Mitch McConnell didn’t lock down the bill-writing process in order to block liberals from going over the bill with a fine-tooth comb. His chief insight was in recognizing a bias—not among liberals, but within the news industry—toward what you might call “new news.” Things we didn’t know before, but do know now. It is that bias, more than anything else, that has brought us to the brink of living under a law that almost nobody on the planet has seen but that will uninsure millions to pay for millionaire tax cuts.
If you consider how the secret Republican health care bill story ultimately broke through (to the extent that it has), or refer back to the much-more-thoroughly-covered health care debate in 2009, the new-news bias effect becomes fairly obvious.
What ultimately got Trumpcare a modicum of mass coverage wasn’t a critical mass of liberal outrage about secrecy, preventable deaths, or bloodless, soak-the-poor, right-wing ideology. It was that Democrats, responding to grassroots pressure, stopped cooperating with Republicans to run the Senate in an orderly fashion, and made Republicans actively reject requests to open up the process, protect children and veterans and so on. Which is to say, Democrats made a little bit of news.
The reason the Affordable Care Act debate was so thoroughly covered eight years ago wasn’t that the reporters who covered it were better, but that the debate then was like the Sutter’s Mill of news. Reporters had a surfeit of hearings, drafts, amendments, CBO reports, speeches, symposia and votes to cover, and those stories commanded prime media real estate. Because Democrats didn’t try to pass their entire reform agenda through the filibuster-proof budget process, they needed 60 votes. And because there were exactly 60 Democratic senators at the time, every single senator was a kingmaker—a newsmaker. Any Democrat who had a change of heart about anything—whether the bill should include a public option, whether the marketplaces would be organized at the state or national level, whether the government should tax expensive health plans—could reshape the bill, and could thus turn trivial changes in senatorial brain function into critical scoops.
This is what makes Republican denunciations of the debate over Obamacare so outrageously dishonest. While Republicans faked hysteria over the supposed secrecy of the process, what McConnell recognized about the 2009 debate is that nearly all of the Democrats’ struggles and setbacks stemmed from its openness. It wasn’t Democrats who set the template for Trumpcare; what Republicans are doing now is a through-the-looking-glass adoption of the lies they told about Obamacare.
As in 2009, almost every single Republican senator today has the power to change the contents or legislative course of the secret health care bill, or kill the repeal effort altogether. They are instead using their ignorance of the bill’s contents—feigned or otherwise—to shield the bill itself. The absence of new news is the bill’s greatest source of strength. The resurfacing of old Republican tweets and comments attacking the authors of Obamacare is fine by those Republicans, because it limits the newsiness of the health care story to examinations of Republican hypocrisy rather than Republican goals and values. By withholding details, they limit the range of reportorial inquiry to questions about the process itself. Have you seen the bill yet? No. Will you withhold support for the bill unless it runs through an open process? I am very dismayed about the process.
All of this underscores the importance of treating the coming bill text, and next week’s Congressional Budget Office analysis, as if they were vaguely written letters from James Comey. There will be mere days if not hours to distill the contents and effects of the secret bill to the public before senators cast their final votes.
But it would be better in the long run for the news industry to migrate toward a more nuanced standard of newsworthiness that doesn’t cede all agenda-setting power to people who can commandeer front pages with misleading information just because it’s new, or escape scrutiny for moral crimes whenever they want to, simply by going dark.
Karen Handel’s narrow victory over Jon Ossoff in last night’s special election in Georgia shows how Republicans can keep their coalition together despite President Donald Trump’s unpopularity. True, Trump was a drag on Handel, who won by four points in a conservative district that Tom Price, now Trump’s Health and Human Services secretary, carried by 23 points just six months ago. But in the end, Handel convinced enough Republicans to come home to the party, which she did by shrewdly realizing what unifies the party: anti-anti-Trumpism.
During the campaign, Handel, a former Georgia secretary of state, took care to avoid mentioning Trump’s name whenever possible, referring to him only as “the president.” But as David Weigel reported in the Washington Post, Handel and her political allies ran a tribalist campaign designed to remind Republican voters that, whatever they might feel about Trump, they hate his opponents more. They relentlessly linked Ossoff to Trump’s critics, from establishment figures like House Minority Leader Nancy Pelosi to outliers such as controversial comedian Kathy Griffin.
“Griffin, whose involvement in the race was limited to one April tweet in support of Ossoff, has now been linked to [Bernie] Sanders and Pelosi in a lineup of ‘childish radicals’ who back the Democrat,” Weigel wrote. “The ad strategy, and the campaign visit from Republicans such as House Speaker Paul Ryan, have had almost nothing to say about what Republicans were working on in Washington. The message was that Republicans would feel terrible if they had to watch Democrats celebrate.”
Handel hit on the magic formula for keeping Republican voters from jumping ship: a politics of negative partisanship taken to its logical extreme, where political identity is based solely on opposition to the other side. This anti-anti-Trumpism is now the glue holding together the otherwise fraying Republican coalition. It’s a weirdly contorted ideology, a counter-punching worldview that shows hatred can be the strongest force in politics.
Anti-anti-Trumpism is a natural outgrowth of longstanding Republican tendencies toward negative politics, which ramped up in the 1990s when then–House Speaker Newt Gingrich and fellow Republicans made opposing President Bill Clinton the primary feature of their party. But this anti-Democratic and anti-liberal philosophy has been updated today to account for an unpopular Republican president: Whatever you dislike about Trump, rest assured his opponents are far worse.
Anti-anti-Trumpism pervades conservative thinking, and is especially strong in an unexpected quarter: among “Never Trump” Republicans. Media outlets like National Review and The Federalist, which once warned that Trump was a menace to conservatism, are now devoted to decrying the president’s critics, sometimes portraying them as subversives who will stop at nothing, not even violence, to defeat Trump. Matt Lewis, a conservative writer at the Daily Beast, on Wednesday lamented this “shift” at The Federalist, writing, “It’s one thing to point out the left’s hypocrisy and the media’s hyperventilation; it’s another thing to cast Trump as a victim.”
Anti-anti-Trumpism is an increasingly comfortable mode for many conservatives because it allows them to maintain a right-wing identity, and support the Republican Congress, without affirmatively backing the toxic president himself. It’s an especially convenient position for traditional conservative writers who want to remain relevant—that is, to retain their readership—in the age of Trump. “The anti-anti-Trump position is a safe one,” John Ziegler, a Mediaite columnist and conservative talk show host, told Lewis, “because you’re giving the Trump cult what they want while you’re also trying to pretend you’re standing on some sort of principle.”
The powerful appeal of anti-anti-Trumpism is evident in the latest New York Times column by David Brooks, once the embodiment of intellectual Never Trumpism. Brooks compares the ongoing Russia investigation with the fake Whitewater scandal that Republicans ginned up in the 1990s, a comparison that immediately falls apart when Brooks admits he doesn’t even know what Whitewater was all about: “I was the op-ed editor at The Wall Street Journal at the peak of the Whitewater scandal. We ran a series of investigative pieces ‘raising serious questions’ (as we say in the scandal business) about the nefarious things the Clintons were thought to have done back in Arkansas. Now I confess I couldn’t follow all the actual allegations made in those essays.”
Starting from this place of ignorance, Brooks confidently concludes, “In retrospect Whitewater seems overblown. And yet it has to be confessed that, at least so far, the Whitewater scandal was far more substantive than the Russia-collusion scandal now gripping Washington. There may be a giant revelation still to come. But as the Trump-Russia story has evolved, it is striking how little evidence there is that any underlying crime occurred—that there was any actual collusion between the Donald Trump campaign and the Russians.” He later writes that “frankly, on my list of reasons Trump is unfit for the presidency, the Russia-collusion story ranks number 971.”
The Whitewater scandal grew out of investments the Clintons made with friends Jim and Susan McDougal in Arkansas in the 1970s and 1980s. While the McDougals received felony convictions for various shady business dealings, multiple government inquiries found no evidence connecting these crimes to the Clintons and no member of the Clinton administration was implicated. The Russia investigation has already had far more real-world consequences—for starters, the resignation of national security advisor Michael Flynn and the firing of FBI Director James Comey. And let’s not forget what started it all: Russia interfered in the 2016 presidential election with the intent of helping Trump, and perhaps was responsible for his election. The Russia investigation is still in its early days, but it is already much closer to Watergate—which, it’s worth noting, began with the burglary of a Democratic Party office by Nixon White House operatives, not a widespread hacking campaign by a hostile foreign power—than Whitewater.
Why is David Brooks suddenly running interference for Trump? For the same reason that National Review and The Federalist have become organs for attacking Trump’s foes, and that most Republican voters reverted to partisan loyalty and voted for Handel: Politics is tribal. Brooks might present himself as a thoughtful, above-the-fray conservative, but at the end of the day, he too feels the tug of loyalty. A Republican president is being attacked, and his instinct is to find extenuating reasons for the man’s controversial actions.
In making sense of anti-anti-Trumpism, Lewis wrote that “the Trump presidency is dangerous for conservatives, in part because it confuses things. It’s hard to justify your existence as a balance to the liberal media if you are spending most of your time criticizing a Republican president.” This may well be what Brooks, a fierce critic of Trump last year, has come to realize. But his essays are now weaker for it. As Lewis wrote, “If you’re not keen on defending the indefensible (which would be most of Trump’s rhetoric), you end up making a lot of tu quoque arguments that become hackneyed and predictable.”
Trump himself might even realize the power of anti-anti-Trumpism—that would explain his otherwise inexplicable decision to keep harping on “Crooked H,” more than half a year after he defeated her. Because anti-anti-Trumpism is the cohesive force keeping the Republicans together, we can expect both Trump and other Republicans to continue to demonize his critics at every turn. This will only intensify as Trump finds himself in more political trouble—and it should give Democrats pause. Ossoff, like Clinton before him, bet that he could convert enough disaffected Republicans to win. But in this age of negative partisanship, as Tuesday night’s results prove, it’s extremely hard to create enough converts. As they strategize for next year’s midterms, Democrats should accept the indomitable force of anti-anti-Trumpism and focus instead on energizing the very people whom anti-anti-Trumpers are demonizing.
While congressional Republicans are busy working on their secret health care bill, President Donald Trump is already undoing pieces of Obamacare on his own. He pledged during the campaign that he would roll back a regulation issued as part of the Affordable Care Act that requires contraception to be covered without co-pay in insurance plans. The rule had angered religious employers, who objected to what they described as being made complicit in providing birth control to their employees.
Despite numerous workarounds offered by the Obama administration, the Trump administration is reportedly undertaking a sweeping change that will allow virtually any employer to wriggle out of the mandate. There would be no requirement that they find another way to provide contraception, such as through a third party. Trump’s reversal would thus put employees’ free coverage at risk, potentially leaving them back on the hook for hundreds of dollars a year. Before the ACA, 85 percent of large employers’ plans covered birth control, but most required co-pays and deductibles. Without the mandate, thousands of employers could quickly rescind the benefit.
States will not be outdone by the federal government, however. The Missouri Senate just approved a bill that includes a provision, tucked alongside a number of other attacks on reproductive rights, rolling back employment protections in St. Louis for women who use contraception. The legislation would make it perfectly legal for an employer to fire or refuse to hire someone because she uses birth control.
Allowing employers to discriminate against employees for using contraception, or to exclude contraception coverage in health plans, isn’t just objectionable because it gives bosses the power to dictate private aspects of their employees’ lives. It’s economic nonsense. The economy, and therefore every employer in the country, owes a huge debt of gratitude to contraception. The pill played a central role in allowing women to flood the workforce, improving prospects for us all.
Research has definitively established that the higher the female fertility rate, the lower women’s chances of working in paid jobs. It’s a lot harder to find and hold down work if you have multiple children to care for. Up until the 1960s, women tended to drop out of work once they hit their prime childbearing age.
That’s particularly true if you aren’t even able to control how many children you have and when. While the FDA approved a birth control pill in 1960, widespread use didn’t really get going until the late 1960s, after state laws were relaxed regarding who could actually obtain it. By 1973, nearly 70 percent of married women used the pill, a share that climbed to about three-quarters in the late ’80s, where it has stayed since.
The economic effect of that change was enormous. Once women started controlling their own bodies, and their fertility rates fell, their employment rates skyrocketed. Research by economists Claudia Goldin and Lawrence Katz has found that, starting around 1970, the pill played a big role in women graduating college more frequently and marrying later. Both of those changes allowed them to have careers. Women were freed up to seriously pursue education and paid work without the risk that an unexpected baby would stop them in their tracks.
“The pill directly lowered the costs of engaging in long-term career investments by giving women far greater certainty regarding the pregnancy consequences of sex,” the authors write. This was true especially for professional careers like law and business that require extensive training. Why begin a law degree if a child could easily interrupt it? The authors found that more than 30 percent of the increase in women holding these jobs was thanks to the pill.
Another paper gets even more specific: Legal access to the pill before women turned 21 increased both how many women were in the labor force and how much they actually worked. Access to the pill reduced the likelihood that a woman would have a baby before the age of 22 by 14 percent. That, in turn, increased young women’s labor force participation by 7 percent. The women who first had legal access to the pill because of their states’ laws worked 650 more hours than their peers who only got it later.
The benefits of all this work experience accrued to women themselves. The women who had early access to contraception made 8 percent more by the age of 50 than the others. The pill is responsible for about a third of the closure in the gender wage gap that was achieved by 1990.
But the benefits also accrued to everyone. Between 1972 and the early 1990s, the share of women of prime working age who were in the workforce rose from 72 percent to nearly 84 percent. Had that not happened, the economy would have been about 11 percent smaller in 2012.
Not all of that, of course, was thanks to access to contraception. But birth control played a huge role. Women’s careers, and the entire economy, would look very different without their ability to use it.
This remains true today. The ubiquity of birth control—contraception has now been used by nearly 100 percent of sexually active women—means that its impact is no longer quite as transformative, but studies of contemporary women find that having fewer children still increases their participation in paid work, while having more children reduces it. In a survey of women who use birth control, about half said they need it because it allows them to complete their education or find and keep a job.
Employers may not feel like any of this is their problem. But the potential pool of hirable employees would shrink considerably if women weren’t able to access contraception reliably and were constantly dealing with unwanted or unexpected pregnancies. And every American business benefits from the size and strength of this country’s economy, at least some of which is thanks to women who popped the pill and got a job.
There are no good reasons to let employers decide whether employees can or can’t use contraception. The list of reasons in favor of giving women the ability to control their own reproduction, on the other hand, includes bosses’ own bottom lines.
“Have you ever been beaten?”
This is the question Vladimir Putin, barely suppressing a mischievous smile, asks Oliver Stone at the conclusion of The Putin Interviews, Stone’s ambitious, illuminating, and often bonkers series of conversations with the Russian strongman.
“Beaten?” Stone responds. “Oh yes, I’ve been beaten.”
“So it’s not going to be something new,” Putin says, now openly chuckling, “because you are going to suffer for what you are doing!”
“I know,” Stone says, nodding. “But it’s worth it.”
It’s a self-important note to end on, but a fitting one for a director who views himself as a martyr to the truth. The Putin Interviews aims to undercut every story you’ve heard about Russia since Donald Trump was elected president—and much more. Stone has long been intent on exposing the seedy underbelly of U.S. foreign policy and the many-tentacled reach of the deep state, even if it means he’s dismissed as a conspiracy nut. “Why does that bother people so much?” he asked Matt Zoller Seitz, as recorded in the indispensable The Oliver Stone Experience. “Because it could be true that their government is a monster? That there’s a malignancy in our military-industrial-security complex that has grown so much more immense now?”
These are sane questions, but they become something else when they start to dovetail with Donald Trump’s own pet theory for the troubles that have hobbled his young presidency: that it is being derailed by leaks, insubordination, betrayals, and other manifestations of the deep state. And those questions become even more problematic when they are funneled through the mouth of Vladimir Putin, who has an axe to grind and Western democracy to undermine. But while Stone has gotten a lot of grief for his interviews with Putin, this is actually the kind of argument that he has been making for decades, through movies that show hidden puppet-masters pushing the country into conflict to perpetuate their power and serve their self-interest. The Putin Interviews is just the crude culmination of a long and singular career.
The Putin that emerges in The Putin Interviews could almost be an Oliver Stone character. Some of Stone’s best characters—most notably James Woods’s wiry performance in Salvador—are manic figures driven by restless, nervy energy, but his signature leading men, from Jim Garrison in JFK (Kevin Costner) to Chris Taylor in Platoon (Charlie Sheen) to Ron Kovic in Born on the Fourth of July (Tom Cruise), are straight shooters. Their defining trait is an almost simplistic patriotism, albeit one that is eventually shaken by the horrors of their country’s sins. Throughout The Putin Interviews, Stone marvels at Putin’s composure. The Putin that Stone presents is deeply patriotic, clear-eyed, optimistic. He recognizes injustice, in this case the injustice of the United States’s treatment of Russia, from the “shock doctrine” policy of the 1990s to the expansion of NATO, all of which is portrayed as the justification for Russia’s interference in the 2016 election.
Above all, Putin is macho—a quality he shares with Stone’s characters—joking about menstruation and showing off his hockey prowess in a clearly rigged game.
Putin, of course, could also be an Oliver Stone villain. He is vain, conniving, dishonest. He is a genuine puppet-master, one who manipulates Russian patriotism to amass power and serve his own interests. And he lies about it, arguably the biggest sin in Stone’s worldview, articulated throughout his body of work, from Salvador to Snowden.
But Stone is not particularly interested in discerning the “real Putin,” if such a thing exists. Putin emerges from the documentary just as mysterious and contradictory as ever, partly as a result of his own shrewdness and partly because of Stone’s lack of guile. Stone is so willing to acquiesce to Putin’s version of events that he ends up being viciously trolled, convinced that a video Putin shows him of Americans fighting Taliban insurgents in Afghanistan is actually a video of Russians heroically fighting ISIS.
Perhaps unsurprisingly, The Putin Interviews is far more about the United States than it is about Russia. The case that Putin makes against America is one that Stone himself has made throughout his oeuvre: the disastrous impact of American foreign policy on the people and countries it arrogantly aims to help (Platoon, Salvador, and Born on the Fourth of July); the way the military-industrial complex both controls the country and reflects its dark heart (JFK, W., and Snowden); the inevitable corruption of empire (Alexander); the spiritual poison of Western-style capitalism (Wall Street, Any Given Sunday, Natural Born Killers).
As Zoller Seitz writes in The Oliver Stone Experience, in JFK
the Kennedy assassination is, incredibly, mere means to Stone’s larger end: to warn the viewer that since the end of World War II, the United States has not truly been a democracy, in the sense that school textbooks idealistically claim, but a whey-faced dictatorship run by the military-industrial complex—a loose consortium of interests linked by the desire to acquire and hold power by generating public fear of “enemies” within and without, then generate profits by selling arms and munitions to the U.S. military in order to defend against those same enemies.
The United States, Stone argues, is trapped in this destructive imperialistic cycle. At his free-wheeling best, such as in JFK and Salvador and Platoon, Stone offers a refreshing counterpoint to the cant about American greatness. At his feverish worst, such as in W., his movies read like a Salon.com article circa 2005.
Putin paints this version of America repeatedly in The Putin Interviews, both unprompted and with Stone’s encouragement. Asked about the United States’s decision to expand NATO in the two decades that followed the end of the Cold War, Putin says, “I get the impression that they have to enforce control over the Euro-Atlantic camp, and to that end, they need an external enemy.”
In Putin and Stone’s conception of events, the ongoing tension between the United States and Russia is an inevitable result of the U.S.’s imperialistic ambitions and a deep state that is addicted to conflict. Asked if anything changes now that Trump is president, Putin, with a twinkle in his eye, says nothing does: “Everywhere, especially in the United States, bureaucracy is very strong—and bureaucracy is the one that rules the world.”
It’s a more compelling argument than many in the foreign policy establishment would admit. But in The Putin Interviews, the message isn’t as big a problem as the medium—and as Stone has aged, he has increasingly embraced any medium that shares his message. Stone himself referred to the idea that Russia hacked the Democratic National Committee during the 2016 election as a “great fiction.” The irony is that Putin’s coy denials in The Putin Interviews all but concede that Russia did it. “When there is an action, there is always a counteraction,” Putin says, not-quite admitting what everyone already knows.
But this, too, lines up with Stone’s great thesis. In Stone’s universe, this is a truth that the people of the United States must understand: that the U.S.’s hostile acts abroad have consequences, that chickens ultimately come home to roost. But Vladimir Putin is not Jim Garrison or Chris Taylor, even if he isn’t exactly Natural Born Killers’s Mickey Knox either.
“[W]e have reached Roman Empire grotesquerie,” Stone told Zoller Seitz. “The change of leadership means nothing; the new emperors mean nothing. They’re wrestling with greater and greater scandals, more distracting things. There’ll be a thousand Fergusons. There’ll be a thousand ISISes, groups that are essentially our creation. There’ll be messes everywhere, and we’ll be fighting with them. We can’t even get judges appointed. Gridlock has reached the point of Roman corruption madness. How do you come out of that? Hopefully, for us, it’ll be a slow decline. The barbarians, so to speak, came into Rome and extended it another seven hundred years.”
In that wild hodgepodge of ideas, there is a message. It would just mean a lot more coming from Stone than Vladimir Putin.
It’s perhaps the most ubiquitous image of Trump’s administration to date: the president at his desk, preening for the cameras as he affixes his jagged signature to yet another executive order. Since his first day in office, when he signed an order to roll back the Affordable Care Act, Trump has issued a flurry of commands on everything from education and the environment to immigration and sanctuary cities. Not since Franklin Roosevelt has a president relied so heavily on administrative proclamations in his first 100 days—in Trump’s case, to create the appearance of forward motion amid legislative gridlock.
Trump’s executive orders serve another purpose: They enable him to pursue a doctrinaire conservative agenda outside the legislative arena. In a direct contradiction of his campaign pledges, Trump has sought to enrich private corporations at the expense of his blue-collar base. He has called for reorganizing or eliminating all federal agencies—a move that will create lucrative opportunities for private contractors. He has ordered government employees to lift “regulatory burdens” on American businesses. He has signed decrees that would open up millions of acres of federally protected land to private development, promote oil and gas drilling, fast-track infrastructure projects, and roll back regulations designed to protect consumers against Wall Street scams.
For the most part, Trump’s orders are merely aspirational: While they may require federal agencies to start drawing up plans or reviewing procedures, they’re largely powerless to set policy. But in one recent memo, Trump took his pro-business agenda a step further. On April 21, the president issued an executive memorandum that could directly benefit one of his generous corporate backers: the insurance giant MetLife.
At issue is the Financial Stability Oversight Council, an unheralded panel of top banking regulators. The FSOC was established in 2010, after Wall Street cratered the global economy, to monitor the financial system for undue risk and the threat posed by financial institutions that are “too big to fail.” One of the FSOC’s most important functions is to decide which financial giants should be designated as “systemically important”—large enough to threaten the overall economy. Megabanks already fall under this category, but the FSOC must decide which other institutions also deserve such a designation. So far, the FSOC has singled out three insurance companies: AIG, Prudential, and MetLife. Given how important these firms are to the country’s financial health, they are required to raise more capital to safeguard against an unforeseen catastrophe, and are subject to more stringent supervision from the Federal Reserve.
MetLife, however, has been fighting back against the increased oversight. In 2015, it sued the FSOC, claiming it had been improperly designated. The insurance giant was represented by the notorious bank lawyer Eugene Scalia—son of the former Supreme Court justice. Last year, U.S. District Judge Rosemary Collyer, a George W. Bush appointee, sided with MetLife, overturning the designation on two grounds. First, she ruled, the FSOC should have assessed the likelihood that MetLife would experience financial distress and projected specific losses. Second, it should have considered the cost of the designation to MetLife’s business.
The FSOC appealed the ruling, and a federal appeals court was on the verge of issuing its decision in the case. But then, in what looks like a blatant attempt to protect MetLife, Trump stepped in with his memo to Treasury Secretary Steven Mnuchin. In the order, Trump instructs Mnuchin, as chair of the FSOC, to review the process used to designate financial corporations as systemically important and to recommend improvements. What’s more, Trump ordered the Treasury Department to analyze the same criteria that Judge Collyer cited in her ruling: whether the FSOC should assess the likelihood of financial distress, include specific loss projections, and consider the financial costs to companies.
Trump’s memo was issued on a Friday. The following Monday, MetLife asked the appeals court to delay its ruling until the Treasury Department completes its 180-day review. In its motion—large parts of which consist of quotes from Trump’s memo—MetLife suggests that a delay “will enable the new administration to determine whether any of the FSOC’s positions in this case should be reconsidered and whether it is appropriate for the government to continue pressing this appeal.” Translation: Trump’s memo could kill the entire case against MetLife. The Justice Department, which is representing the FSOC in the case, quickly agreed to delay the ruling by 60 days while it reconsiders its position. The court will take no further action until July.
At the very least, Trump’s order bought additional time for MetLife, which contributed $100,000 to his inaugural committee. Trump also has direct ties to Scalia’s law firm, Gibson, Dunn & Crutcher, which had two associates working on the president’s transition team. In addition to blocking an immediate ruling against MetLife, Trump’s order is saving the company lots of money in compliance costs, while weakening the FSOC’s ability to protect the public. “In modern history, this is the only executive order that’s custom-designed to help a single company in litigation against the government,” says Dennis Kelleher, the president of Better Markets, a nonpartisan Wall Street watchdog group.
Trump is hardly the first president to use executive actions to get his way. The suspension of habeas corpus during the Civil War, the internment of Japanese-Americans during World War II, and the use of federal troops to enforce school desegregation in Little Rock all took place through executive orders. But Trump is already averaging more executive orders than any president since Truman, according to data from the University of California, Santa Barbara. More important, the nature of Trump’s FSOC memo appears to be unique: According to Better Markets, the order represents “a carefully choreographed dance between the Trump administration and Wall Street’s lawyers and lobbyists”—an effort by MetLife to secure an official-sounding pretext to tilt a court case in its favor.
There’s no guarantee that Trump’s memo will succeed at weakening the FSOC or altering the MetLife ruling. Although Mnuchin chairs the FSOC, he must get support for any rule changes from the ten-member panel, which includes several Obama appointees. And the federal appeals court could still rule against MetLife and uphold the company’s designation. But Trump’s memo has effectively lowered the standard for executive orders. They’re supposed to ensure that the nation’s laws are faithfully executed—not that a rich corporate ally gets bailed out.
The opening shot of My Cousin Rachel, based on Daphne du Maurier’s 1951 novel, instantly connects it to the other movie adaptations of du Maurier’s books. Peaceful, green cliffs give way to a sea that seethes and writhes, and we think of Maxim in Hitchcock’s Rebecca dragging the second Mrs. de Winter to a clifftop. We might think also of Melanie Daniels crossing the doomed bay in The Birds, or of the horror of Venice’s canals in Don’t Look Now.
This, however, is England. We meet an orphan named Philip Ashley (Sam Claflin), who has been raised by his cousin Ambrose in a house where no women are allowed to enter. The plot is set into motion when Philip receives a letter from Ambrose, who has moved to Italy for his health. Ambrose is ill and afraid. Philip sets forth to recover his cousin but arrives to find him dead of a brain tumor. He also discovers that Ambrose had married a beautiful woman named Rachel. Philip is filled with rage at Rachel, whom he blames for Ambrose’s death, until the day that she walks through his door in Cornwall.
It takes at least 15 minutes for Rachel Weisz to appear on screen, but when she does it is a quiet, contained event. Nonetheless, the effect of her face is a payoff akin to an explosion in an action movie. Weisz’s face is very slightly asymmetrical, and she can control the cadence of her voice in a way that introduces the character of Rachel as instantly magnetic. She’s festooned in widow’s weeds throughout the movie, and from inside that black cocoon her face is like one of those floodlights the cops put up in New York at night: blinding, gentle violence.
The mystery animating My Cousin Rachel is about the motivations of its two main characters, and it’s explored through a pas de deux between Claflin and Weisz. The manor house where no woman has been admitted has a big hole at the center of it, where Philip’s dead mother should have been. His maternal ancestors are only present through a super-symbolic pearl necklace, worn by Philip’s mother and grandmother and great-grandmother at their weddings. All he has is this bauble and some money. The orphan has no understanding of women, and so he is powerless to resist Rachel’s searching face.
The movie thus plays out like the worst nightmare of a man who hates women. The story is about the terror of women’s unknowability, a power that doesn’t quite reside in their bodies or their actions but in their quasi-mystical ability to manipulate. Early on in Rachel’s stay at Philip’s house, he comments that a local has described her as “feminine.” She asks him about the quality of her femininity, how Philip might describe it. He has no language.
The domestic element of My Cousin Rachel is only about half of its drama. The rest takes place outdoors. This is where director Roger Michell’s ability really comes into play. The palette of the plein air scenes is dark and mineral. The grass, horses, waves, and rocks look painted. In one scene, Philip and Rachel sit on the beach. They are to the left of the frame. In the center, a massive tangle of rocks and dead branches splays. To the right, a gorgeous horse nestles in the crook of the cliff. The contrast between light and shade is as bold as chiaroscuro, but the scene looks naturalistic, in the sense that “nature” is dramatic.
In another key scene, the two ride out into a wood filled with bluebells. He is a little drunk—it’s his birthday—and it’s unclear whether she wants to be there at all. We see Rachel’s face against the bluebells on the ground; the face is like a wildflower, a natural phenomenon. Like a bluebell, she seems beautiful and totally without conscience. When they stand up, the bluebells are crushed like a city destroyed by an earthquake.
As some critics have pointed out, the mystery at the heart of My Cousin Rachel is not all that compelling. But as Julie Myerson wrote in the Guardian about du Maurier’s original novel, the joy in this story is not where you think it is.
The mysterious cousin Rachel is funny. Myerson quotes du Maurier: “With her eyes full of a ‘solemnity’ which ‘spelt mischief,’ we can’t help but warm to her.” Rachel Weisz’s performance layers these elements expertly, and that is where the movie’s center rests. We see her make others laugh, and we warm to her. Later, we see her do the same thing, but in Italian: Philip doesn’t understand Italian, and so she now seems sinister. In every case, we see Rachel through the gormless eyes of the young man.
When Rachel is crying and her pale hands snake around Philip’s shoulders to turn his comfort into an embrace, it’s as if we are interpreting her actions from the perspective of the shoulders themselves. Roger Michell has done an interesting and difficult thing by replicating in cinema the sensibility of first-person literary narration. At a level deeper than the operations of gender is a horror about the unknowability of other people and their motivations, even in the most intimate relationship. Du Maurier was a virtuoso of the hidden; her novels are best in the places where information is missing. Rachel Weisz delivers an equally virtuosic turn as a woman barely understood by the very story in which she lives.
Books authored by sitting politicians can take a number of non-fictional forms, but the books themselves tend not to be very good. Whether the work is a memoir or policy briefing, self-help or polemic, these authors can’t be expected to take the risks necessary to produce something great. Considering how busy their day jobs keep them, politicians can’t even be expected to actually write the books with their names on them. If politicians’ books are known for anything, it’s that their campaigns buy big piles of them in a sketchy way. For his book, Senator Ben Sasse (Republican of Nebraska) made an unconventional choice: He wrote a parenting manual.
Sasse has tried to cultivate an unconventional image for a senator: A young faith-and-family-first prairie Republican, Sasse distinguished himself in his first term, after being elected in 2014, by refusing to endorse or vote for his party’s 2016 presidential nominee. As a leader of the failed #NeverTrump movement he may be on the outs with the executive branch for a while, but Sasse is only 45 years old, and in the long run he kept his dignity intact by declining to back the president. It’s the kind of test that every God-fearing Christian should have passed, but among elected Republicans it took a guy with degrees from Harvard, Oxford, and Yale.
The Vanishing American Adult: Our Coming-of-Age Crisis—and How to Rebuild a Culture of Self-Reliance was in some ways a brilliant premise for a politician’s book. Railing against the young wussy generation—with their safe spaces and iPods and blah blah blah—is an easy shortcut to perceived seriousness. The media knows this strategy well; the New York Times op-ed page in particular has been running the same play once a month or so for the past few years. And if it works for op-ed writers, it might work for a politician: Sasse gets to position himself first and foremost as a dad. He is not the cool dad who lets you stay up late, nor is he the stodgy dad who can’t imagine why you’d want to. Sasse is the tough but fair dad who has loaded you with enough responsibility (the ethic) and responsibilities (the tasks) that you go to bed early so you can get up and do your chores. After a term or two of Trump, it’s not a bad bet that the American people would be open to someone like him.
Sasse focuses on five habits that he sees as centrally important in the development from child to adult: intergenerational experience, limiting consumption, developing a work ethic, travel, and reading. It’s a list that could come out of a parenting guide from almost anywhere on the ideological spectrum, and honestly, it’s not a terrible one. Sasse is extremely corny from time to time (“We have some friends who camp occasionally not because they like it, but so that they appreciate their home even more”), but that seems like more of a feature than a bug. He doesn’t pose traditionalism as a new counter-culture because Sasse doesn’t have any interest in being part of a counter-culture. And yet, his fidelity to timeless values feels almost refreshing in a political moment when all the compasses seem to be spinning. Even Rand Paul starts to sound good when no one else will speak against a blank check for war.
In the book, Sasse goes far out of his way to be uncontroversial and extend his appeal across the board. Policy disagreements are reduced to asides, and he spends roughly zero time complaining about Obama, something I don’t think many elected Republicans could manage over nearly 300 pages. Sasse doesn’t foam at the mouth. More interesting is the way he reflects a growing willingness on some parts of the right to incorporate left-wing critiques of capitalist society. The senator cites hippie favorite Paul Goodman on public education, C. Wright Mills and John Kenneth Galbraith on consumerism, and even concedes that Karl Marx had a point about the “alienation of labor.” Sasse is a big supporter of traditional values, and it doesn’t bother him to take a shot or two like this one: “Our global systems of production have radically reduced the prices of almost everything, but they have also come at the cost of promoting a new mentality that everything is disposable.” Compared to the grinning nihilism of the neoliberal consumer implied by popular television (and overrepresented by both parties), it’s easy for even a Christian conservative of conscience to appear progressive.
I have no doubt that there are political consultants already very excited about Sasse and his future. He manages an inclusive, above-the-fray rhetoric while remaining one of the most dogmatically conservative legislators. His ratings from various interest groups are highly polarized: The Heritage Foundation rates him the second-best member of the Senate (behind Mike Lee), while the NAACP ranks him the second-worst (also behind Mike Lee). Sasse wants to force women to bring their pregnancies to term, put Christ back in the classroom, and cut taxes to make sure that’s all the government can do. At his day job he lacks even the occasional rogue streak; he is as consistently and as conventionally conservative as anyone (except Mike Lee). But you wouldn’t necessarily know that from his parenting advice. Whining about millennials could be coming from pretty much anywhere.
The biggest publicity hit The Vanishing American Adult has received reduced the book and the author to afterthoughts. Sitting one-on-one with Bill Maher, Sasse invited the comedian to come out to Nebraska and “work in the fields.” Maher couldn’t resist. It was a big swing and a miss on Maher’s part, which is too bad, because it was also an opportunity to poke at a weak spot in the guest’s argument. Sasse’s section on developing a work ethic is based on his own experience as a kid weeding soybean fields and detasseling corn (a detail mentioned in nearly every article about the senator, as well as the book’s short biographical note), but there aren’t enough ears to go around, and for most young Americans work experience is less picturesque. It’s not clear how a summer behind the counter at Starbucks drives home the value of “Work first, play later; and limit your play as much as necessary to get back to bed to be able to work first thing again tomorrow.” I’d agree that there is value to midwestern communalist agricultural practices, but to focus on that would require Sasse to consider the social relations of production instead of individual virtue. Easier to say that kids should work harder, like he did, weeding the soybean fields and detasseling corn.
Perhaps Sasse foregrounds his idyllic Nebraska childhood because the rest of his biography doesn’t gel quite as well with the brand he wants to project. When contemplating a worthy work life, Sasse quotes Martin Luther’s advice to a cobbler convert: “Make good shoes, and sell them at a reasonable price.” Except Sasse doesn’t make anything. As an adult, he has bounced between academia, the worlds of consulting and finance, and the government. That would all make more sense if he placed a high value on scholarship or public service, but despite his degrees Sasse relentlessly attacks schools—and as for public service, don’t forget about his Heritage rating. He exalts earthy labor that connects men to the land they live on, but he worked as an outside advisor to McKinsey: a consultant to consultants.
Reading Sasse’s book, I was interrupted by a persistent thought: Why isn’t this guy a youth pastor? All of the values and advice he so passionately dispenses sound like they should be coming from someone who has dedicated himself not just to service, but to modest service. Even with the ultra-conservative policy preferences, there’s nothing in Sasse’s set of declared principles that suggests pursuing that agenda is of life-defining importance. Many people oppose abortion in many different ways; very few of them become Republican senators. If anything, Sasse’s small government ethic should make him uneasy about living off the taxpayers. But maybe for him ambition isn’t a general-purpose value. Maybe he thinks most people should be humble, and others should not.
One of Sasse’s claims to fame is that he turned around a small, struggling hometown college. In the book he mines that tenure for credibility and a hard-to-believe anecdote about an entitled student. But the way Sasse reformed Midland owes more to his experience in private equity than to wisdom about hard work. As president, Sasse rebranded Midland Luther College as Midland University, swallowed up half the students from a nearby de-licensed for-profit college, survived his own close call with the licensing board, invested in sports and a business program, and changed the school colors. For that he was paid hundreds of thousands of dollars a year, at a university with an enrollment smaller than that of many public high schools. There’s nothing folksy or visionary about his work at Midland; it’s straight out of the corporate consultant playbook. And there’s no sign from Sasse’s professional life that he knows how to do much else.
Writing in favor of delayed gratification and attempting to channel Augustine, Sasse says, “I remain selfish and impatient today, but it is surely not fake or wrong to seek to sublimate these traits.” When he was in his late 30s, Augustine sold his property, gave the money to the poor, and pursued a life of preaching, introspective scholarship, and monastic friendship. Sasse is a little behind, but it’s not too late to try.
Republicans are crafting a secret health care bill, and nobody has any idea how to stop them from passing it.
President Trump is champing at the bit to fire the Justice Department special counsel investigating him and his associates for obstruction of justice, and financial and election-related crimes.
In the suburbs of Atlanta, voters will determine Tuesday night whether Democrats can flip the conservative sixth district and set a new benchmark for the political environment Trump’s presidency has created.
Here’s how all these stories interlock.
We were one month deeper into the 2009 health care debate than we are into the current one when a development that seems quaint by today’s standards nearly derailed the whole process.
“I’m going to really put you on the spot,” Senator Kent Conrad, the Democratic chairman of the budget committee at the time, asked Doug Elmendorf, then the director of the Congressional Budget Office. “From what you have seen from the products of the committees that have reported, do you see a successful effort being mounted to bend the long-term cost curve?”
Elmendorf answered, “No, Mr. Chairman.”
For the most part, the only people who will remember this exchange are reporters who covered the Obamacare legislative process, and an assortment of other wonks and nerds, but it is no exaggeration to say Elmendorf’s unassuming testimony rocked Capitol Hill and threw the fate of health care reform (not for the first time or the last) into doubt. The Washington Post called it a “devastating assessment … of the health-care proposals drafted by congressional Democrats.” Soon, the White House’s frustration with the CBO’s pronouncements spilled out into a very public fight.
Today it is impossible to imagine Republicans inviting the CBO director to a public hearing to testify about the American Health Care Act. Republicans consider it a matter of great urgency that they pass a bill with minimal public scrutiny before the July 4 recess. They intend to release a bill text Thursday and hold a final vote within a week, with no hearings in between. It is even harder to imagine Republicans imperiling the bill’s odds of passage by exposing it to constant public scrutiny for a full year, as Democrats did with the Affordable Care Act.
But while some Republicans might admit that their own methods are contemptible, they will never allow that they are essentially unprecedented or even worth stopping. The aftershocks of the AHCA, should it make the jump from secret bill to public law, are almost impossible to contemplate, and will roil politics for years. Every single Republican in the Senate is responsible for the fact that we’ve reached this precipice.
The popular perception that Democrats “rammed” Obamacare down the country’s throats is entirely a product of Republican myth-making. Having succeeded in perverting the public record with lies about Obamacare, Republicans in Congress see no downside to asserting that what they’re doing today is the same thing Democrats did eight years ago.
Generally speaking, this false equivalence takes two forms. The first is characterized by crocodile tears from Republicans who want credit for criticizing their own party, without taking any affirmative steps to change their party’s course.
The second is characterized by testiness and pre-adolescent retribution.
Both camps of Republicans are posturing based on a falsified history of Obamacare, with the single aim of controlling the bounds of dissent. Senator John McCain’s mealymouthed admission that secrecy is bad lays down the marker that Republicans are allowed to whine disingenuously about the process, but not to use their senatorial powers to do anything about it. It permits critics in the media who are reluctant to consult the public record to level charges of hypocrisy, but nothing more.
The insistence or implication that Democrats did it first is fabricated to protect the Senate health care bill from being treated as the scandal it is. Senator Rand Paul didn’t defend his claim, because he knew it was false; he just wasn’t expecting a reporter to challenge his perversion of historical fact. There are two interlocking things happening here: Republicans think they need to defend their actions with indefensible claims, and think they can make those claims to reporters’ faces without being asked to defend them. Both things should incense journalists and the broader public.
Republicans’ contempt for the intelligence and hard work of people who covered the 2009 health care debate—indeed the contempt for truth itself—doesn’t just make this attempted heist easier to pull. The fate of their bill turns on it.
During last year’s presidential campaign, Donald Trump and Hillary Clinton had heated debates over Syria. Clinton took a hardline stance, advocating more airstrikes against the Islamic State and the enforcement of no-fly zones, even at the risk of clashing with President Bashar al-Assad’s ally in the conflict, Russia. Trump countered by arguing that if defeating ISIS was so important, the United States should be willing to work with anyone. “I don’t like Assad at all, but Assad is killing ISIS,” Trump said during his second debate with Clinton. “Russia is killing ISIS. And Iran is killing ISIS.... I think you have to knock out ISIS.”
Trump was, in effect, calling for a grand alliance against the Islamic State, but it’s become abundantly clear that this is not to be. Instead, the president is undergoing a remarkable transformation on Syria: He’s rapidly turning into Hillary Clinton, and it’s making a major U.S. war more likely.
Though Trump derided such an approach in debates with Clinton, the U.S. is now fighting a multi-pronged war in Syria, as opposed to focusing exclusively on ISIS. In April, after a chemical weapons attack by Assad forces, Trump ordered a missile strike on a military airfield—a harbinger of a shift in policy, it now seems. Over the past week, the U.S. has downed two Iranian-built drones, and on Sunday the military shot down a Syrian jet—the first time the U.S. has done so in this war. In response, the Russian government suspended its air-traffic hotline with the U.S. and warned that it might target U.S. and allied planes if they fly west of the Euphrates.
The U.S. now finds itself in a much more dangerous situation in the Middle East, where the war against ISIS, which has broad bipartisan support, could become a wider regional conflict of the type that Trump specifically promised to avoid.
This shift is partially a response to success on the battlefield. ISIS is in retreat, and expected to lose its strongholds in Mosul (in Iraq) and Raqqa (in Syria). Thus, the various factions fighting ISIS have no reason to form the alliance that Trump hinted at. Instead, they are already preparing for a post-ISIS world by securing as much territory as they can. For Iran in particular, the end of ISIS would be their biggest unexpected bounty since the early days of the George W. Bush administration, when the U.S. obligingly removed Saddam Hussein and the Taliban from power in Iraq and Afghanistan, respectively. With ISIS out of the picture, Iran would have an unbroken swath of allies from Iraq to Syria to Lebanon—a land bridge to Israel’s border.
“As ISIS disappears off the map,” Ilan Goldenberg, a former State Department official, told the Guardian, “this tolerance that Shia Iranian-supported groups and American-supported groups have shown for each other—there is a danger that will go away. You can see it all going haywire pretty quickly.”
Goldenberg’s warning should be heeded. Clinton’s policies were controversial but at least sprang from a coherent foreign policy worldview that could be debated. Trump might be following Clinton’s policies in broad outline but with little thought to consequences. He appears to be responding to unfolding events in an ad-hoc way, rather than being guided by a broader strategy. This “incremental escalation,” according to national security expert and former Obama adviser Colin Kahl, could quickly spiral because of the “asymmetry of stakes.”
Kahl notes that despite Trump’s more aggressive stance in Syria, the Assad regime, Russia, and Iran “haven’t backed down. They keep pushing, probing, testing, countering. They haven’t been cowed & deterred.” That’s because, whereas “the interests for the U.S. are important,” they are “vital” for the “Axis of Assad.” For Iran, the conflict is a chance to prop up a key ally in its broader regional struggle against Saudi Arabia. For Russia, propping up Assad would be a victory for their policy of national sovereignty versus America’s preference for regime change. For Assad, as Kahl says, the interests are “existential”—and not just for his government. If his regime falls, he may well end up like Hussein.
So the “Axis of Assad” won’t back down. Will Trump? Kahl seems pessimistic. If the axis retaliates over these recent provocations, the hawks in Trump’s National Security Council “will argue U.S. credibility has now been engaged, so we have to keep punching.” In effect, Trump is playing a game of chicken with foes that cannot afford to surrender, and the president might refuse to back down out of fear of losing face.
The pressure on Trump to escalate will come from outside his administration, too. Trump has made the alliance with Saudi Arabia the cornerstone of his Middle East foreign policy, which means supporting the Saudi government’s view of Iran as the chief promoter of instability and terrorism in the region. Saudi Arabia would welcome America joining in its larger regional proxy war against Iran.
In Congress, Iran hawks will welcome Trump’s shift on Syria. Because the president is a foreign policy novice with heterodox views, Congress has been unusually eager to serve as backseat drivers for his diplomatic efforts. On Iran, there’s a strong contingent of Republicans and Democrats who think Trump is not hawkish enough. House Intelligence Committee chairman Devin Nunes, known for his protectiveness of Trump in the Russia investigation, told the Washington Examiner, “One of my highest concerns is the Iranians’ ability to get a land bridge out to the Mediterranean to increase their logistical support for terrorist networks.” New York Congressman Eliot Engel, a member of the foreign affairs committee, said he worries that “there will be some collusion between Russia and Iran.”
A final factor that makes escalation likely is that the State Department, which normally would establish a modus vivendi for the post-ISIS Middle East, is being gutted by the Trump administration. With the department understaffed and demoralized, and managed by a secretary who seems isolated from the agency he oversees, America’s Middle East policy won’t be guided by those who believe in diplomacy. Trump’s preference for bilateral, zero-sum deals is at odds with the balance-of-power approach needed with divergent parties like Syria, Saudi Arabia, Iran, and Russia. This would be a monumental challenge for even the most skilled diplomat. Under Trump, it looks impossible.
The good news is that with ISIS on the decline, the type of nihilistic violence it sponsored might also diminish. The bad news is that the broader regional problems that created ISIS—sectarian strife, failures of governance, and despair over the permanent rule of autocrats—remain unchanged, if not worse. Indeed, if the U.S. continues to escalate this proxy war in Syria, these problems will only spread. “Red alert” indeed, and may cooler heads prevail somewhere within Trump’s hot White House.
On July 19, 2015, Raymond Tensing, then a 25-year-old police officer at the University of Cincinnati, shot and killed Samuel DuBose, 43, a musician, father, and likely pot dealer. Their encounter, during a routine traffic stop in Mt. Auburn, a hilly and near-impoverished neighborhood just north of downtown Cincinnati, lasted less than two minutes and was captured on Tensing’s body camera.
Two years later, Tensing’s trial on charges of murder—his second trial after a mistrial was declared in November 2016—has come to an end. On Monday, the two sides delivered their closing arguments, with the prosecution claiming that Tensing had repeatedly changed his story about the incident, had been the beneficiary of a “good ole boy” network among Cincinnati police, and had shown no genuine remorse for DuBose’s death. As the jurors begin deliberations, supporters of police reform are understandably pessimistic that they will return with a guilty verdict. Just last week, Jeronimo Yanez, the police officer who killed Philando Castile in 2016 during a similar traffic stop in Minnesota, was acquitted of all charges. Video evidence also played a role in that case, but unlike Castile, DuBose was unarmed at the time of his death.
There are concerns about how the city will respond if Tensing is found not guilty. Cincinnati suffered a massive urban riot in 2001, sparked by the killing of a 19-year-old unarmed black motorist named Timothy Thomas by Cincinnati police. The city’s police force had been responsible for the killings of 15 black men in the previous five years, but the city had never convicted a police officer of murder. Stephen Roach, the officer who shot the fleeing Thomas in the back after a brief foot chase, was acquitted. The county that encompasses Cincinnati, Hamilton County, uses only registered voters as potential jurors, which tends to skew jury pools whiter and more affluent than the general population. Hamilton County is 26 percent black, whereas Cincinnati, within which lies the narrow parkside corridor where DuBose was shot, is 42 percent black.
In the video of the killing, DuBose fails to produce a license time and again, asking Tensing, “Why’d you stop me?” He was unaware that not having his front license plate attached is against the law in Ohio. It’s an unevenly enforced law at best, but a good pretext for many a police officer to meet ticket quotas. DuBose tries to show Tensing that the plate is in the glove box, but Tensing is more interested in his identification, which DuBose can’t, or won’t, produce. Noticing a bottle near DuBose’s feet, Tensing asks what it is. DuBose hands Tensing cheap perfume housed in a bottle of even cheaper gin. Tensing decides to officially detain DuBose, saying, “Go ahead and take your seatbelt off.” He tries to open the car door. DuBose stops him from doing so, putting his left hand on the door and drawing it back toward him.
DuBose had been arrested on marijuana-related charges 25 times previously. He had five containers of marijuana in the car at the time. In the video he appears agitated, perhaps fearful of having his suspended license and the marijuana detected, and puts the keys back in the ignition and begins to start the car. Then something deeply tragic happens over the next five seconds or so. In the jarring and blurry footage that unfolds, we see Tensing lunge into the car, ostensibly to prevent DuBose from driving off; Tensing yelling “stop” twice; Tensing raising his gun, which enters from frame right; and Tensing firing a round that severs Samuel DuBose’s brain stem. But what took place during those five seconds has been up for debate ever since.
Unbeknownst to those watching the footage, Tensing was wearing a shirt under his uniform depicting the Confederate flag. He had pulled over a higher percentage of African-Americans than any other cop on the force at that time, issuing 83.5 percent of his tickets to minority drivers. Everyone he had pulled over earlier that day was black. This is what it meant to protect and serve.
A week after the shooting, following lawsuits filed by local media outlets demanding the release of the footage, Hamilton County prosecutor Joe Deters published the video and indicted Tensing for murder as well as a lesser charge of voluntary manslaughter. A formerly scandal-ridden state treasurer who had once been labeled “pro cop at any cost” by the local alternative weekly Cincinnati Citybeat, Deters was quickly cast as a villain by the Fraternal Order of Police for throwing cops under the bus, and hailed as a hero by the Los Angeles Times for saving the city from unrest.
Tensing’s trial was originally slated for November of 2015, but it began a year later, in the days leading up to the election of Donald Trump. It was widely viewed in the local media as a dangerous flashpoint reminiscent of the city’s troubled past; fear of unrest gripped Cincinnati both before the indictments were handed down and as the trial came to a conclusion. One juror was excused for fear of having their identity revealed by the media. On the third day of testimony, jurors left for a lunch break and did not return that day, allegedly due to fear of exposure.
Despite his reputation (or perhaps because of it), Deters was aggressive in his presentation of the case, calling Tensing a liar for claiming that DuBose was speeding off as Tensing’s arm was trapped in the car. Tensing said he was being dragged, prompting him to shoot DuBose from two feet away in fear for his life. “The evidence is going to show you that this tragic case, this murder, was totally unwarranted, was completely intentional, and was truly unjustified,” Deters declared in his opening statement.
Tensing cried during his testimony, tears both Deters and DuBose’s family claimed were fake. “We were at the Oscars just now. We were watching an actor on stand right now,” DeShonda Reid, DuBose’s fiancée, who had become engaged to him only two days before his death, told FOX 19 News. “Imitating tears that we’ve been crying over a year as a family, very passionate tears. Tears because we lost someone.”
If you watch him long enough, the fear and regret in Raymond Tensing’s blue eyes startle you. People pay the best Hollywood actors millions of dollars to be able to fake that kind of expressive pain. He’s a tall, open-faced, handsome man, a cross between a young Tim Robbins and a younger Jeremy Renner, and when he weeps with sorrow as he tries to describe what led to him ending Samuel DuBose’s life, it is easy to imagine how someone might feel bad for him.
In the end, the jury couldn’t reach a verdict; only two jurors were unwilling to convict Tensing of voluntary manslaughter, while four of the twelve refused to convict him of murder.
After the trial, examination of the pre-trial jury questionnaires led the Cincinnati Enquirer to speculate that several jurors on the panel exhibited opinions that could be construed as racially biased. Five of the jurors agreed or strongly agreed that “some races and/or ethnic groups tend to be more violent than others.” Of the twelve jurors on the panel, ten were white. An official from the Hamilton County sheriff’s office later revealed that at least one juror vehemently objected to the release of pre-trial jury questionnaires.
“I don’t understand how they were able to sit on the jury and why they weren’t removed for cause,” Donyetta Bailey, president of the Black Lawyers Association of Cincinnati, told the Cincinnati Enquirer. “To me, it’s an automatic challenge for cause. It shows racial prejudice.”
Local activists demanded a retrial and Deters promised his office would press forward with the case again. A new trial was set for May 25. The prosecution wanted the trial moved out of Hamilton County, fearing that it would be difficult to seat an impartial jury there; the defense wanted it to remain in southwestern Ohio. Judge Leslie Ghiz, a tough-talking 47-year-old former Cincinnati city councilwoman, took over as judge for the second trial. She sided with the defense; the retrial would remain in Hamilton County.
Ghiz was once an operative for George W. Bush’s 2004 re-election effort, one that was alleged to have suppressed the black vote in this portion of Ohio. She is one of the stars of the fascinating 2006 political documentary ...So Goes The Nation, which explores the contest between John Kerry and Bush in Ohio at the local organizer level. She emerged furious from chambers on day one of a pre-trial press hearing. There was a conflict over whether local and national news outlets would sign an unusual edict she had handed down.
Citing the jurors’ concerns during the last trial, Ghiz didn’t want the media prying into their identities. She told us that, because of media scrutiny, it would be hard to keep the trial in Hamilton County, where you’d have to have lived under a rock to not have an opinion about what happened to Samuel DuBose. She would restrict access to the jury selection process, and allow only three journalists, selected via lottery, only one pool news camera, and one still photographer in the courtroom each day when testimony began. Journalists would have to surrender their phones and other electronic devices capable of taking photos.
“This is unconstitutional!” shouted Monica Dias, an attorney representing local news channel WCPO.
“Hush,” Ghiz replied. This was going to be tough sledding for the fourth estate.
“If jurors’ identities are kept secret, that’s a big chunk of the case that’s walled off from the public and the public cannot determine if justice has been administered fairly,” Dias later told two reporters from WCPO. Jury voir dire was then suspended when Ghiz’s earlier order restricting media access was vacated by Ohio’s First District Court of Appeals. Another hearing was held. And then jury selection began, with somewhat fewer media restrictions than before. We would be allowed to bring our recording devices into the courtroom, but taking photos of jurors was strictly off limits. The pool camera, positioned right next to the jury box, would never show them during the trial, which would be streamed on the websites of all the major local news outlets.
The jury would not be allowed to know about Tensing’s Great Smoky Mountains Confederate flag T-shirt. Tensing had testified in the first trial that the shirt held no meaning for him, and Ghiz found the shirt “far too prejudicial” to enter as evidence. She barred mention of DuBose’s copious marijuana-related arrests as well.
In her opening statement, assistant prosecutor Stacey DeGraffenreid, an African-American woman who, along with assistant prosecutor Seth Tieger, had replaced Deters for the second trial, said that Tensing had purposefully killed DuBose, and in the process had committed a crime against the “peace and dignity of the State of Ohio.” But she did so in an impassive way, much different than the fiery anger that had fueled Deters’s remarks, which may have been less effective than he had hoped. “The evidence will show this is clearly a murder,” DeGraffenreid said with clinical restraint, to the jury of nine whites and three blacks, nine of whom are women.
Defense lawyer Stew Mathews, tall, folksy, and with a full head of silver-white hair, attempted to humanize Tensing in his opening statement. After acknowledging that Tensing’s use of the term “dragged” may not have been the best choice of words, he claimed that Tensing had indeed become trapped by the car and legitimately feared for his life. He pointed out that Tensing had wanted to be a cop since he was a boy, and had been nice to all the other black people he had stopped that day. It was DuBose, in his lack of cooperation, who had threatened Tensing’s life. “They want to portray him as a racist. He wasn’t a racist. He wasn’t a hard-head, no-nonsense cop,” Mathews said, as his client sat sullen-faced. “He was an extremely human kind of guy.”
The prosecution called on the testimony of video analyst Grant Fredericks, who narrated the video frame by frame for the jury. Fredericks showed that, without a shadow of a doubt, Tensing had not been dragged by DuBose’s car. Alicia Napier, a young mother who witnessed the encounter firsthand, claimed that she heard the gunshot, and only then did DuBose’s car start accelerating down Rice Street, before it crashed through a guard rail and came to rest against a utility pole. This contradicted Tensing’s contention that the car had already begun moving before he fired.
The police who backed up or investigated Tensing left more room for ambiguity. Officer Philip Kidd, who had reported to the scene to assist Tensing and who initially said he had seen Tensing get dragged by the car, backed off those claims, just as he had in the first trial. When grilled by Tieger as to why he had changed his story, Kidd said that he had seen Tensing “moving with the vehicle” and that he “wasn’t going to sit there and argue” with Tensing about whether he had actually been dragged. Sgt. Shannon Haine, the investigating homicide detective and a witness for the prosecution, poured cold water on the prosecution’s case during a cross-examination from Mathews, saying, over Tieger’s objection, “I thought I was looking at an officer-involved shooting where its action may be determined to be justified based on the events surrounding the actual shooting.”
The defense offered its own video expert, James Scanlon, a retired Ohio cop with no training in analyzing crime scene videos, and a “forensic animator” named Scott Roder who was ultimately barred from presenting an animated simulation of the events that had been prepared by his firm. As in the first trial, Mathews tried to insinuate that DuBose was a drug dealer and a deadbeat who, if not fully deserving of his demise, instigated enough reasonable fear in Tensing to present just cause for his grisly killing. Although unable to mention it himself, he coaxed Dr. Karen Looman, chief deputy coroner of Hamilton County, into revealing that DuBose had $2,620, a pack of rolling papers, and two small baggies of marijuana in his pockets at the time of his death.
Mathews’s cross-examination of DeShonda Reid, whose integrity he also called into question, grew explosive. After Reid questioned the relevance of the queries Mathews posed to her concerning DuBose’s means of employment, he asked her if she had been convicted of “falsification and possession of criminal tools” the previous November. Reid admitted that she had, although it had been in October. She added, “Unlike your client, I know how to own up to what I’ve done.”
Finally, on June 16, just as news spread that Yanez had been acquitted on all counts of shooting and killing Philando Castile, Tensing spoke in his own defense, on the final day of testimony in his retrial. He claimed that DuBose had “mashed the accelerator to the floor” as his left arm was pinned to the steering wheel. He said that the car had turned into him as it began accelerating, causing him to lose his balance and fall backward. “I did it to stop the threat,” he said, again and again, unable to contain his tears, unable to acknowledge that by knowingly and purposefully shooting a man in the head from two feet away, he was engaged in the intentional act of killing.
As Terina Allen, DuBose’s sister, was consoled by a friend, across from where members of the Tensing family sat stoic and red-faced, Tieger gave Tensing little quarter. “After you shot him in the head, did you think he would just walk away?”
“I never thought about that,” Tensing said.
Few journalists covering the case think Tensing will be convicted. The mood of pessimism is even starker among African-American civilians, especially in the wake of Yanez’s acquittal. There is little reason to believe, in a country where a patrolman thinks it is acceptable to wear the flag of the Confederacy, as hateful a symbol as the swastika, in precincts populated by people whose ancestors were held in bondage under its aegis, that things would turn out any other way.
It was a hot day in July, a Saturday afternoon, and Kim James was bored. Her older sisters had taken her to a church event in their small hometown in Indiana, where the girls were spending their summer. Her parents were back in Bangladesh, working at the remote Baptist missionary compound where the family had lived, on and off, for five years. For an adventurous and high-spirited 13-year-old like Kim, Indiana seemed dull compared to Bangladesh. She missed her friends, the dozen or so missionary kids everybody called “MKs.” She missed the menagerie her parents let her keep: goats, cows, a parrot, a monkey. She missed the jackals that called in the distance at night, and the elephants that sometimes crashed through the compound fence.
As she thought about the mission, though, Kim felt troubled. Something was weighing on her mind. So she decided to skip out of the church event—it was for little kids, anyway—and go see the pastor. She found him in his office, trying to compose the next day’s sermon. Kim ambled around his desk, picking things up, putting them back down. Eventually, with feigned casualness, she pointed between her legs and said, “Is it wrong when someone does this—touches you here?”
The pastor dropped his pen and looked up. “Kim,” he asked, “has this happened to you?”
At first, Kim said no. But as the pastor gently persisted, she began to sob. Yes, she had been touched, there and there, lots of times.
The pastor asked Kim who had touched her.
Uncle Donn, she said.
Donn, the pastor would soon learn, was not really Kim’s uncle. He was Donn Ketcham, the 58-year-old chief doctor at the mission hospital in Bangladesh. His father had co-founded the Baptist denomination that sponsored the missionary group, the Association of Baptists for World Evangelism; its goal was to create a “militant, missionary-minded, Biblically separate haven of Fundamentalism.” Little known outside the world of Christian fundamentalists, ABWE is among the largest missionary groups in the United States, deploying more than 900 Baptists to 70 countries. His father’s legacy made Ketcham a sort of prince within the world of ABWE: the doctor with the “magical name,” as one missionary later put it, much beloved by the family of churches that supported the group. He’d been the undisputed patriarch of the Bangladesh mission for almost three decades.
Kim gave the pastor only a partial, fuzzy account of what had happened to her; as a child raised in a fundamentalist “haven,” she lacked the vocabulary to describe sex acts, let alone understand them. But rather than call Kim’s parents or contact the police, the shocked cleric turned to a higher authority, placing an urgent call to ABWE headquarters in Harrisburg, Pennsylvania.
That, Kim would realize many years later, was when the cover-up began.
Kim’s parents, Ken and Sue James, moved to Bangladesh with their four daughters in 1982, driven by a humble calling. Their life had always revolved around the church. When Ken was a boy, his family had hosted Baptist missionaries in their home, and he’d heard them talk about having to interrupt their work as doctors to re-roof a building or do other manual labor on their compounds. So when Ken grew up and became a handyman, he saw a role for himself at the mission. “We’re not pastors or preachers,” says Sue, “but he always wanted to have his hands to help.”
It took years for the family to raise the money they would need in Bangladesh—$4,000 a month—to cover travel, living expenses, insurance, and contributions to ABWE. They spent their Sundays and Wednesdays driving to conservative Baptist churches across Indiana and several other Midwestern and Northeastern states, giving presentations about the missionary work overseas. It was a long slog, but when they got to Bangladesh, they knew it had been worth it.
The 50 or so missionaries and children at the compound all seemed like one big family. Adults were called “aunt” and “uncle”; the kids were “cousins.” Presiding over the mission was Uncle Donn Ketcham, who had moved to Bangladesh with his wife, Kitty, in the early 1960s. A handsome man, with sideswept hair and a trim mustache, Ketcham cut a dashing figure as he rode his motorcycle on the jungle roads around the compound. He wasn’t just the medical authority, but a spiritual authority too, often leading the group in worship. “He was the ideal missionary,” Richard Stagg, an ABWE official and a friend of Ketcham, would recall years later. “He was a good surgeon. If your car broke down, he could fix it. If the generator broke, he could fix that. He was also my favorite preacher. He was smooth as silk. He had everybody fooled.”
Around the compound, it was an open secret that there had been “extramarital affairs” involving Ketcham and missionary women. Mission officials back in Pennsylvania had been receiving troubling reports since 1967, and had disciplined Ketcham for inappropriate relationships on multiple occasions. At one point, ABWE had issued a directive that single women at the compound were forbidden from riding with the doctor on his motorcycle. “Watch out for this man,” one missionary couple later recalled being warned when they arrived at the mission. “He’ll sweep you off your feet.”
But despite Ketcham’s obvious pattern of misconduct, ABWE seemed more interested in silencing the women involved than in punishing their abuser. A few years after Kim’s family arrived at the compound, a female missionary wrote directly to ABWE officials, recounting how Ketcham’s own daughter suspected he was having an affair; she’d caught him in his office, door locked, with a woman. When questioned, Ketcham claimed that he had been “trying to help” the woman and insisted that “there was NOT an immoral relationship.” At Ketcham’s request, ABWE resolved the situation by banning the woman from the mission.
In a culture where sex was only discussed in whispers, and where submission to authority was paramount, Ketcham’s privileged status remained unquestioned. Besides, he had a way of making others feel thrilled to receive his attention—especially the kids. “If he saw you coming, he’d light up and smile and say ‘Hi,’ and maybe give you a hug,” recalls Kim’s sister, Diana Durrill. “I don’t ever remember it being creepy. That was the scariest thing about him: We weren’t aware of what he was doing, because he was too good at it.”
Kim quickly became Ketcham’s favorite. Donn tutored her in math; Kitty gave all the MKs art lessons. When Kim’s parents had to travel for missionary training, they often left their daughters with the couple. Ketcham let Kim hang around the hospital after school, allowing her to visit patients, change bandages, even hold newborns after birth. “I think that’s how he got his hooks into me,” Kim says. “Going up there every day and seeing what was happening really intrigued me. Overseas, there were really no rules.”
Like most fundamentalist families, the Jameses never talked about sex. Sue and Ken rarely even kissed in front of the girls. On the cusp of puberty, Kim was clueless. “I knew I was going to start my period, but I never knew why,” she says. “I thought you could get pregnant from kissing a guy.” So when, at twelve, she had questions about her body, she turned to Ketcham. “He was like a cross between a father figure and a grandpa,” she says. “I thought the world of him—that he could do no wrong. He was perfect.”
Kim wanted to know about masturbation: Was it as harmful, as wrong, as she’d been taught? Ketcham reassured her. “There’s nothing wrong with that,” she recalls him telling her. “In fact, this is how a guy does it.” And then he proceeded to demonstrate.
Kim was embarrassed. “I’d never seen a penis in my life,” she says. “I didn’t know what it was. I just saw this foreign thing and I was shocked.”
She tried to protest. “What are you doing?” she said. But Ketcham insisted it was fine. “This is normal,” he told her. “This is what guys do.”
“I don’t think this is normal,” she said. “Because my dad has never done this.”
Ketcham encouraged her to touch him. Kim “didn’t know what to do,” she recalls. “All I kept thinking was, ‘This is a doctor, the most godly man here. He wouldn’t do anything that’s not right.’ Then I thought, ‘Kim, just accept it. It’s OK.’ That’s everything that was going through my mind.”
After that incident, Ketcham began conducting regular breast and pelvic examinations on Kim when she stopped by the hospital or came to his house for math tutoring. At first, she’d ask questions, challenge him a little; Kim had never been shy.
“What’s wrong with me, what are you feeling for?” she wanted to know.
“Nothing,” he would reply. “Just checking.” Although the exams were medically unnecessary—breast and cervical cancers are extremely rare in preteen girls—they became an almost daily occurrence.
In the months that followed, Ketcham escalated his abuse. He kissed Kim. He asked her to touch his penis. He had her over to his house while his wife was away, and sometimes when she was there, in the next room. He gave her pills that made her drowsy. He told her how special she was, and gave her affectionate nicknames, like “Mugwamp,” to prove it.
The Bangladesh compound was its own small world. Everybody noticed how close Kim was to Uncle Donn, and some were worried—though they never reported their suspicions. One day, a missionary who’d been watching the James girls for Ken and Sue found Ketcham leaving Kim’s bedroom after an in-home visit; later, when she checked on Kim, she found her crying. Another adult, passing by Ketcham’s house one day, saw Kim sitting on the doctor’s lap in his bedroom. A few minutes later, Kim came bolting out of the house, begging: Please don’t tell my mother. Other missionaries blamed Kim for what they were seeing; the girl, some said, had a reputation as “boy-crazy,” with a “habit of clinging to men.”
Kim’s confusion only grew. “My body likes it,” she thought, “so either I’m a bad person or it’s OK.” The biblical teachings she’d grown up with, in a culture that preached abstinence and the sinfulness of sex, told her it was wrong. But Uncle Donn was a close friend of her parents, and the holiest man she knew. He wouldn’t do anything that’s not right, she thought. Ketcham told her God was using him to help her. If she told her parents, he warned, her entire family would be banished from the mission, “where God wanted them to be.” Kim was old enough to understand the implicit threat: By speaking up, she would be ruining God’s purpose for her family. And the blame would be no one’s but hers.
In the spring of 1989, when Kim was 13, Ketcham raped her at the hospital. As she lay on an exam table—“like a corpse, waiting for him to get off”—she had only one way to understand the assault: Love, she thought. This must be love.
For evangelical Christians like Ken and Sue James, bringing up kids in a close-knit fundamentalist community feels like blessing them with the ultimate “safe space” from the moral laxity of the larger culture. Sexual abuse is something that happens in the secular world, not among the God-fearing. This, after all, is the universe of abstinence pledges and old-fashioned courtship, where parents build their entire lives around shielding their children from worldly temptations.
Yet the potential for sexual abuse is actually exacerbated by the core identity of fundamentalist groups like ABWE. Like Catholics, fundamentalists preach strict obedience to religious authority. Sex is not only prohibited outside of marriage, but rarely discussed. These overlapping dynamics of silence and submission make conservative Christians a ripe target for sexual predators. As one convicted child abuser tells clinical psychologist Anna Salter in her book Predators: Pedophiles, Rapists, and Other Sex Offenders, “Church people are easy to fool.”
Over the past five years, in fact, it has become increasingly clear—even to some conservative Christians—that fundamentalist churches face a widespread epidemic of sexual abuse and institutional denial that could ultimately involve more victims than the pedophilia scandal in the Catholic Church. In 2012, an investigation at Bob Jones University, known as the “fortress of fundamentalism,” revealed that the school had systematically covered up allegations of sexual assault and counseled victims to forgive their attackers. Sovereign Grace, a network of “neo-Calvinist” churches, has been facing multiple allegations of child molestation and sexual abuse. In 2014, a New Republic investigation found that school officials at Patrick Henry College, a popular destination for Christian homeschoolers, had routinely responded to rape and harassment claims by treating perpetrators with impunity, discouraging women from going to the police, and blaming them for dressing immodestly.
Allegations of sexual misconduct have also engulfed four of fundamentalism’s most venerated patriarchs. Doug Phillips, a prominent leader of the Christian homeschooling movement, was forced to step down in 2013 from his nationwide ministry, Vision Forum, after he was sued by a former nanny who claimed he groomed her as a teenager to be his “personal sex object.” The following year, Bill Gothard, founder of the influential Institute in Basic Life Principles, resigned amid more than 30 allegations of sexual harassment and molestation by former staffers, interns, and volunteers. In the first case to cross over into the cultural mainstream, Josh Duggar, the beloved eldest son of reality TV’s favorite fundamentalist family, fell into disgrace in 2015 with the revelation that he had molested five underage girls, including four of his sisters. And this July, the chief of another fundamentalist reality-TV clan, Toby Willis, is scheduled to stand trial on four counts of child rape.
This burgeoning crisis of abuse has received far less attention than the well-documented scandal that rocked the Catholic Church. That’s in part because the evangelical and fundamentalist world, unlike the Catholic hierarchy, is diverse and fractious, composed of thousands of far-flung denominations, ministries, parachurch groups, and missions like ABWE. Among Christian evangelicals, there is no central church authority to investigate, punish, or reform. Groups like ABWE answer only to themselves.
The scale of potential abuse is huge. Evangelical Protestants far outnumber Catholics in the United States, with more than 280,000 churches, religious schools, and affiliated organizations. In 2007, the three leading insurance companies that provide coverage for the majority of Protestant institutions said they received an average of 260 reports per year of child sexual abuse at the hands of church leaders and members. By contrast, the Catholic Church was reporting 228 “credible accusations” per year.
“Protestants have responded much worse than the Catholics to this issue,” says Boz Tchividjian, a former child sex-abuse prosecutor who is the grandson of legendary evangelist Billy Graham. “One of the reasons is that, like it or not, the Catholics have been forced, through three decades of lawsuits, to address this issue. We’ve never been forced to deal with it on a Protestant-wide basis.”
To investigate and expose sexual abuse in evangelical churches, Tchividjian founded GRACE, short for Godly Response to Abuse in a Christian Environment. In 2011, the group was hired to look into what had happened on the Bangladesh compound. While the abuse itself took place long ago, ABWE’s denial and coverup spanned more than two decades—a pattern that eerily replicates the Catholic scandal. The authoritarianism that often prevails in fundamentalist circles, Tchividjian says, is what sets the stage for widespread abuse—and for the systematic mishandling of reported cases. “When you have so much concentrated authority, in so few fallible individuals, problems percolate,” he says. “And when they do, they’re not often addressed. Because the leaders who hold all the authority decide what to do with them.”
It didn’t take long for Kim to wish that she had never said a word to her pastor. Two days after his emergency call to ABWE headquarters in Pennsylvania, two high-level staffers from the organization landed in Indiana. Kim came to think of them as “the Russes.” Russell Ebersole was ABWE’s executive administrator for the Far East. Russell Lloyd was a prominent counselor for the missionary group, eschewing traditional psychology for “Bible-based” therapy methods. They arrived at her pastor’s home looking grim and official, and immediately set about determining whether Kim’s story was true or merely “the exaggeration of an immature teenager,” as Lloyd put it in “Journey to Bangladesh,” a diary he kept about the case.
For the next two days, the Russes interrogated Kim with only the pastor and his wife present, and without the knowledge or consent of her parents, who were still in Bangladesh. “It was nothing like, ‘Kim, we’re going to be talking about some things,’ ” she recalls. “No taking time to get to know me—just point-blank, we have to do this fast.” Ebersole and Lloyd asked questions involving terms that Kim didn’t understand. Did Ketcham have intercourse with her? Had he touched her clitoris? “What’s that?” she responded, bewildered. “I think I was in shock. I’m 13, and I’m being taught the whole story of sex by these men.”
The Russes already knew that Uncle Donn had a sketchy past; Ebersole had personally fielded a complaint involving one of his affairs with an adult woman at the mission. But now Ketcham was being accused of molesting an underage girl—the first time ABWE officials had heard an allegation of child sexual abuse. As Kim struggled to answer their questions, the Russes became convinced that she was telling them the truth about Ketcham touching her. What they couldn’t believe, given fundamentalist precepts about the nature of sex and women, was that she was an innocent party. “It was lust in its most base form, uncontrolled in the body of a spiritually immature woman,” Lloyd wrote of the 13-year-old in his diary. Ketcham, he wrote, had become Kim’s “secret lover.”
The next thing Kim knew, she was flying back to Bangladesh with her two interrogators. Ebersole and Lloyd had decided to confront Ketcham by surprise, to prevent him from concocting a cover story in advance. However much they blamed Kim for the “affair,” they knew the doctor would have to leave the mission if he couldn’t exonerate himself. On the long flight, they sat Kim between them and continued to drill her with questions. At one point, when she got up to go to the bathroom, Kim weighed whether to tell a flight attendant that she’d been kidnapped.
The Russes “strongly encouraged” Kim to sign a statement, styled as a confession, in which she apologized for her role in a “relationship” that “transgressed God’s word.” She didn’t understand much of it, but she signed it anyway. “I did exactly what I was told,” she says. “I think I was trying to protect Donn, because I cared about him. So I said whatever responsibility I have is fine. I guess that’s the way I was raised: You accept your responsibility, and I wanted to accept mine.”
Across the world, her parents were in a panic. All they had received was a cryptic message that their youngest daughter would be flying back from Indiana alone with ABWE officers. Unable to reach their other daughters back in Indiana, the Jameses came to fear that something awful had happened to everyone but Kim. Had their other daughters been in an accident of some kind? Were they dead?
When the plane touched down in nearby Chittagong, Ken James was almost beside himself. “Are my other two kids alive?” he asked the Russes. Assured that their other daughters were safe and sound, the Jameses were almost relieved to hear the actual news: There had been some inappropriate touching, the Russes told them, between Kim and Dr. Ketcham. Nothing was said about a rape or a signed confession. Sue and Ken wouldn’t know about any of that for many years to come. “They said it was fondling,” Sue recalls. “We thought, ‘We can handle that.’ ”
But the parents were given no opportunity to “handle” the situation. ABWE was in charge, and Ebersole and Lloyd refused to leave Kim alone with her parents. The plan, the Russes explained, was to drive back to the compound and confront Ketcham in person. If Ketcham was caught off-guard when they arrived, however, he recovered quickly. The doctor blithely owned up to what he called a “bittersweet relationship” with Kim, characterizing it as one in a long line of extramarital indiscretions. Saying he wanted to start at “the real beginning” and confess it all, Ketcham recounted “illicit sexual relationships with other women” dating back to his college days and stretching through his nearly three decades at the compound. Lloyd was impressed by the accused man’s poise. Ketcham’s “sense of humor was intact,” he wrote. “His creative wit and clever use of words were still laced throughout his comments. He even laughed on occasion. It was as if he were describing someone else!”
Still, the Russes told him, he’d have to leave the mission. After Ketcham excused himself to go home and tell his wife what was happening, the men from ABWE were surprised to see the couple return in less than half an hour. According to Lloyd’s diary, he and Ebersole assured Kitty that her husband had not “seduced” the 13-year-old, that Kim had been “a willing partner.” And when they told the Ketchams they would have to leave the mission after nearly three decades, Kitty seemed as unaccountably unruffled as her husband. “Interestingly,” Lloyd wrote, “her only notable question pertained to how long they would have to pack and be off the field, and to the severance package that ABWE would give.” He chalked this up to “unspeakable shock,” predicting that “torment, rage, bitterness, resentment, betrayal, shame, embarrassment, grief—if not present then—would soon visit her.” The Russes, so indignant over Kim’s “lust in its most base form,” were brimming with sympathy for the Ketchams. “How we ached for both of them!” Lloyd wrote.
The return to Bangladesh had been a big blur to Kim. But as dazed and terrified as she was, one thing was clear: It was her job to apologize. As soon as the Russes had finished praying over the Ketchams, they brought them to the James home for a healing visit. Tearfully, as Lloyd recounts in his diary, Kim told Uncle Donn she was sorry. As Kitty held a weeping Kim in her arms, Donn asked for the girl’s forgiveness as well. At Lloyd’s prompting, he praised her courage and integrity in coming forward. After a second prompting, he told Kim not to worry that “she alone was responsible for Donn’s ruined missionary career.” It was not, in other words, all her fault.
The matter was now closed, the Russes told everyone. There would be no need to talk about this unfortunate episode again. As everyone hugged and cried, Kim went to embrace Uncle Donn. But Lloyd and Ebersole stopped her—to her “great dismay,” Lloyd wrote in his diary, interpreting this as yet another sign of Kim’s “strong sexual bonding to Donn Ketcham.”
As the Ketchams packed their belongings, the Russes took pains to contain the story. They held what they called an “Extraordinary Meeting” to give adults at the compound a euphemistic account of what had happened, telling them Ketcham was leaving and not to discuss the matter further. (A nurse who attended the meeting recalled years later that the missionaries complied, in part, because of their strong belief in a verse from the Bible: “For it is shameful to mention what the disobedient do in secret.”) Next the Russes called together the MKs. Kim had had an inappropriate relationship with Uncle Donn, the kids were told. It was wrong, and she was sorry. Then they were instructed to give her a hug. The children did as they were told. “Some even privately offered unsolicited forgiveness,” wrote Lloyd. He was “encouraged.” Kim was numb. “I don’t remember feeling anything,” she says.
Finally, to complete the façade of healing, the Russes convinced Kim’s parents to invite the Ketchams over for dinner before their departure. That evening, Kim was bundled off to another family’s home, while Ken and Sue shared a meal with their daughter’s assailant. Ketcham—to Sue’s astonishment and relief—was “his usual, laughing, carefree person.” But looking back, she can’t believe that she and her husband were “dumb enough” to agree to the make-believe. “I know it sounds like we weren’t good parents,” she says. “But in a compound situation like that, where you eat, go to church, and work together, it’s like one big family. So when everyone is telling you to do this—and he did ask forgiveness—that’s what God wants you to do.”
Ken and Sue wanted to take Kim home to the United States to get help. The Russes talked them out of it. Bangladesh was where the family’s support system was, they said. It would be healthier for the Jameses to stay put.
Donn and Kitty Ketcham flew back to the States and settled down in Allendale, Michigan. ABWE officials rallied around the couple. On their advice, Ketcham wrote a letter to the churches that had sponsored him, confessing to “sin,” but not to child sexual abuse. Ebersole sent his own letter, explaining that Ketcham had left the mission over “immoral conduct”—a vague charge that most interpreted as mere adultery. Ebersole asked the churches to keep funding the Ketchams for two more months, until they’d resettled and found jobs. “A beloved brother has fallen!” Ebersole wrote. “May God help us to biblically restore him and to help bear the deep burden that he and his dear wife, Kitty, carry at this time.”
Because no one from ABWE alerted police or the state medical board that Ketcham had confessed to sexually abusing a child 45 years his junior, he was able to return to practicing medicine and teaching Sunday school. He would go on to see patients for another 23 years. For years after their departure, ABWE president Wendell Kempton continued to write the Ketchams warm letters, sending “love and prayers.”
Back at the compound, Kim became increasingly withdrawn and isolated. “I wasn’t allowed to talk about it,” she says. “We were told it was over and done with: Move on.” The other missionaries blamed Kim and her family for driving away the compound’s most revered leader. “Donn is needed here,” a few told Kim to her face. “You aren’t.”
Worst of all, Kim felt betrayed by her own parents. “It almost killed me to see my mom and dad hug Donn and Kit, like nothing had ever happened,” she recalls. She tried telling herself they were just being dutiful Baptists, “doing what they thought God would do: God wouldn’t slap the crap out of him. God would turn the other cheek.” But deep down, it was hard not to wish they had come to her defense. She stopped talking to them. She stopped eating. In 1991, two years after she told her pastor about Ketcham, she attempted suicide, taking an overdose of the Paxil she’d been prescribed. “I just felt alone,” she says. “I told God if I could talk to him, I’d rather be there with him than down here not able to talk.”
The family returned to Indiana, but Kim’s downward slide continued. She repeatedly cut herself, requiring emergency runs to the hospital. She developed multiple eating disorders, at one point shrinking to 96 pounds. She tried repeatedly to kill herself. She enrolled in community college, but couldn’t keep up. She couldn’t hold a job. She couldn’t make a life.
Sexual abuse often derails the lives of its victims in painful and lasting ways. But when the abuse happens in a church setting, there’s an additional burden—a kind of spiritual abuse, the sense that religious leaders have betrayed the power bestowed on them by God. “It really rattles people at their core in terms of faith,” says Diane Langberg, a psychologist and seminary professor who serves on the board of GRACE. “People walk away thinking that God is a perp or complicit.”
That’s precisely how it felt to Kim. “God,” she prayed, “you’re a sick God to allow this to happen.” But she was losing more than her faith; she was losing her entire world, the close-knit missionary community that had served as her extended family. ABWE was her whole life—the only one she had ever known. So when the organization finally reached out and offered to help her, Kim jumped at the chance.
In the summer of 2002, unbeknownst to the Jameses, a group of Bangladesh MKs gathered for a reunion in Pennsylvania. Nine of them asked to meet with Michael Loftis, ABWE’s then-president, to discuss Donn Ketcham. The meeting lasted for three hours, until 1:30 in the morning. Seven of the women told nearly identical stories of how Ketcham had molested them as children, often under the same guise that he used with Kim: breast and pelvic exams, sometimes conducted with their mothers sitting unaware in the room. One former MK recalled going on a trip with Ketcham and blacking out, leading her to wonder whether she had been drugged and molested. ABWE officials, the women told Loftis, had “always protected Uncle Donn”—and poor Kim James had been blamed for her own abuse.
Loftis seemed shocked. He promised to launch an investigation and pay for any treatment the MKs needed. But the investigation went nowhere, and ABWE still neglected to report Ketcham to the authorities. Loftis did take action on one front, though: He called Kim and invited her to come to ABWE headquarters in Pennsylvania for free medical assistance and counseling. Kim, who was unemployed and living with her boyfriend at the time, had heard of a program for eating disorders that she wanted to try. What did she have to lose?
In a bizarre reprise of the events 13 years earlier, Kim’s parents had no idea what was happening. One Sunday morning in July, Sue got a call from their pastor. “Kim’s in Harrisburg,” he told her. “Russ Ebersole wants to call you.” Later that day, when the family reached Ebersole and Russell Lloyd in Pennsylvania, the two Russes told them that Kim was once again with them. And she had something to say. Then Kim’s voice came on the line. “I got saved!” she told her parents.
ABWE, they feared, had taken over Kim’s life again. The Russes told the family to meet them a few days later at the airport; they were flying to Indiana with Kim to clear out her possessions from her boyfriend’s apartment while he wasn’t home. When they showed up, Ken and Sue thought their daughter looked dazed, out of it. They couldn’t understand why she was being rushed out of her apartment, but the Russes were adamant. “Kim,” her father told her, “you don’t have to go. I’ll tell them you’re not going.” But Kim said she wanted to.
This was the start of what the family refers to as the “Bermuda Triangle years.” For nearly two years, ABWE blocked almost all contact with Kim. When she arrived in Harrisburg, Kim says, officials took away her cell phone, telling her not to contact her family so she could focus on getting well. When her parents tried to check on her, she was only allowed to speak to them with the church’s staff or lawyers monitoring the call. They begged her to come home, but the ABWE handlers would cut in, telling them not to interfere with Kim’s “recovery.” Then the calls stopped.
Kim remembers little about the next 22 months. She was bounced between Pennsylvania and North Carolina, where she lived at one point with Lloyd and his family. Ken and Sue James received occasional letters, which didn’t sound like they were written by Kim. Then the letters stopped, too. Ken tried to track her down in North Carolina, to no avail. Finally, in a panic, he called ABWE and threatened to “get in the pulpit of every church in the country and say what’s going on” unless he heard from Kim immediately.
That week, Kim called. She was in a homeless shelter in Asheville, North Carolina. Her sister Diana was living about 70 minutes away in South Carolina, and a shelter worker drove Kim to her house. She was disheveled and confused. She didn’t want to talk about what had happened—partly out of embarrassment, Diana suspected. The next day, Kim called her old boyfriend back in Indiana, who bought her a plane ticket home.
Over the next five years, Kim continued to struggle. She still cut herself, still had eating problems. Whatever had happened to her during her time with ABWE, it hadn’t helped. She had not been saved.
Then, one day in 2010, Kim got a call from a former MK named Susannah Beals Baker. The gathering at the reunion eight years earlier hadn’t forced ABWE to reform itself—but it had gotten former MKs talking about Donn Ketcham and recalling experiences that sounded disturbingly like Kim’s. Talking to Baker, Kim knew for the first time in her life that she hadn’t been the only one. It gave her an unfamiliar burst of empowerment.
At the urging of a new counselor, Kim demanded that ABWE hand over all the documents it had on her case. Not surprisingly, officials resisted at first. But Kim told them that her counselor needed to have her history to help her. “If you want to talk to my lawyer,” she added, “feel free.” That did the trick. ABWE didn’t send all the documents, but they did include portions of Lloyd’s diary, a copy of Kim’s “confession,” and the correspondence that allowed Ketcham to reestablish himself in the United States. ABWE, for all its efforts to bury Ketcham’s crimes, was finally losing control of the story.
In 2011, Kim helped Baker launch a blog, Bangladesh MKs Speak. They began publishing testimonies of those who had suffered sexual abuse at the hands of Ketcham—and, most explosively, the ABWE documents Kim had obtained. Within the first week, the blog attracted some 1,600 comments, including stricken responses from ABWE parents and former MKs, and a horrified testimony from Ketcham’s pastor in Michigan, who said ABWE had grossly misled him about why Ketcham had left the mission.
The blog sparked new allegations of abuse. One day, as it was preparing to launch, Diana called an old missionary friend to talk about how best to be supportive of her sister’s project. As they chatted about Ketcham, Diana recalled the time she’d stayed at Uncle Donn’s house while her parents were away. She was in bed, fading in and out of consciousness. Ketcham, leaning over her, told her she had typhoid fever. The rest was a blur.
Her friend was stunned. “The same thing happened to me,” she said. Left with the Ketchams, the friend had also been diagnosed with “typhoid.” She woke up foggy-headed and troubled by strange dreams, with symptoms of a urinary tract infection. Soon thereafter, she began to experience insomnia, depression, and severe anxiety—symptoms of PTSD that would last into early adulthood.
ABWE officials were undone by the public revelations on the blog. They posted their own “confession,” acknowledging that “a precious 14-year-old should never have been asked to sign a confession,” and asking—nine times—that the MKs “please, please forgive us.” They held a bizarre “sackcloth and ash” ceremony, captured on video, in which Loftis, the ABWE president, prostrated himself before a representative MK and cut his hair and clothes as he confessed the church’s failure to protect children from Ketcham. (The MK to whom he confessed later called the episode a “freak show,” and said she just sat there “frozen in shock and horror, disbelief.”) More important, ABWE finally reported Ketcham to the Michigan Medical Licensing Board. Twenty-three years after he’d admitted to child sex abuse, Ketcham, who was still practicing medicine in his early eighties, forfeited his license.
But ABWE wasn’t done with the coverup. To placate the Jameses and the MKs, the group hired GRACE to dig up the whole story. Then, when GRACE was only two weeks away from publishing its report, ABWE abruptly fired the group. The MKs and their families were livid. ABWE announced it had hired a private investigative firm, Professional Investigators International, to replace GRACE. But PII, the MKs quickly discovered, had been founded by a Mormon couple who also ran an image-consulting business. The former missionaries were convinced that this “investigation” would amount to nothing more than a whitewash. Kim and Diana declined to be interviewed.
Last spring, however, PII published 280 pages of findings, drawn from 204 interviews. Even for the MKs, the report was a bombshell. Donn Ketcham, the firm found, had been molesting girls and women at the Bangladesh mission as far back as the 1960s. Investigators identified at least 23 missionaries who had been molested or raped, 18 of them children. “Donn Ketcham engaged in a wide range of sexual misconduct,” PII determined, including “sexual harassment, consensual extramarital affairs with adult women, sexual abuse of minors and adults under the guise of medical care, rape, and statutory rape.”
In exhaustive detail, the investigators confirmed both Kim’s story of abuse—finding she had “a minimum of 10 to 15 sexual encounters” with Ketcham—and her subsequent mistreatment by ABWE, which “treated the victim as if she were complicit.” Other former missionaries told stories that were sickeningly similar to Kim’s. Several said that Ketcham had started giving them breast and pelvic exams when they were as young as three. In 1969, an eight-year-old girl had come down with a bad case of shingles—rare among children—after seeing Ketcham and possibly having sexual contact. In 1970, one victim said Ketcham raped her during a physical. In 1975, an MK ran away to another family’s home rather than go to her physical with Ketcham. And over the years, several former MKs had said they’d received injections from Ketcham and blacked out during exams; medical staff at the mission’s hospital had speculated that Ketcham might have administered ketamine, a powerful anesthetic, and molested the girls. The hospital eventually stopped using ketamine, in fact, because multiple women had reported that after surgical procedures, they “dreamed” they had been raped.
The investigators were unsparing in their description of Ketcham, a “confessed pedophile” who expertly practiced “manipulation, deceit, and sociopathic behaviors.” But they came down hardest on ABWE, which gave Ketcham “preferential treatment,” blamed his victims, and failed to dismiss him from the mission field years sooner. By 1974, they found, ABWE officials had more than enough evidence to warrant Ketcham’s removal, “which would have preempted his access to many of his victims.” While other missionaries were banished for minor infractions—one man for being “cocky,” a woman for showing a “lack of essential reserve” in dealing with Bengali men—Ketcham went unpunished. Instead, ABWE kept missionaries silent about his abuses by requiring an “unquestioning compliance with authority”—an approach that drew on the “prevailing attitude toward authority in evangelical circles.” To cover up the scandal, the group had burned files related to Ketcham, and redacted huge portions of the documents it did turn over. ABWE administrators had even proposed creating a “Dark Information Book” to hide similar scandals. As a result, PII concluded, “children were ‘sacrificed’ so that the ministry would not be ‘discredited.’ ”
ABWE officials who dealt with Ketcham, including Loftis and Russ Ebersole, refused to comment for this story (Russ Lloyd could not be reached for comment). The group’s current president, Al Cockrell, responded to questions by issuing a statement. He suggested that PII’s report may include unspecified “misinterpretations or errors,” but acknowledged that it contains “absolute facts” showing that “past ABWE leadership failed to act with integrity and accountability in our handling of abuse perpetrated by Don Ketcham,” and “utterly failed in our response to his victims.”
The report was undoubtedly incomplete; the number of Ketcham’s victims was almost certainly higher, and investigators made no attempt to interview the Bengali “nationals” who were his main patients at the hospital. But for the MKs and their families, it was enough. “I was frankly shocked that ABWE actually released the report,” says Diana. “It was accurate for the most part, as far as how the mission covered it up, how they treated our family.”
For the Jameses, the report underscored just how much they’d been kept in the dark for decades. They never saw Kim’s “confession” until it was posted on the blog, and didn’t know that Ketcham had raped her until they read her account of what happened. “We saw it when everybody else did,” Diana says. “It was absolutely, completely devastating.” But after the initial shock, Diana started asking Kim questions about what Ketcham had done to her. It was the first time the two sisters had discussed it in detail. “She started answering and we just cried,” Diana recalls. “She thought our parents knew.”
Now Sue and Ken finally understood why their daughter had struggled so much. “Kim thought we were choosing God’s work and the mission over her,” Sue says. ABWE had lied to them. If only they’d known, perhaps Kim could have moved on. “The knowledge of that would have changed the last 22 years,” Diana says. “It would have changed her life if my parents had been told the truth.”
The explosive findings about Donn Ketcham’s serial abuse, and ABWE’s role in covering it up, did not make big headlines. Such stories rarely do. It’s another product of the sprawling, disparate world of Christian fundamentalism: Even the ugliest story about a relatively obscure Baptist denomination isn’t going to get Catholic scandal–level attention. But the report that finally emerged, almost three decades after Kim James was raped in Bangladesh, added to the growing evidence of a widespread crisis of sexual abuse in conservative Protestantism.
Kim’s name is now on legislation that would close the legal loophole that helped Ketcham evade punishment. The Kimberly Doe Act, drafted by GRACE founder Boz Tchividjian and conservative activist Michael Reagan, would hold U.S. citizens overseas to the same requirement to report suspected child sexual abuse that applies stateside. (If such a law had existed in 1989, ABWE officials, doctors, nurses, and parents would have been obligated to report what happened to Kim.) The bill would also hold organizations like ABWE responsible if they don’t train their employees to report sexual abuse.
“Someone asked me: Are you more mad at ABWE or Donn?” says Sue James. “Donn Ketcham, yes, we’re very angry at him. But in some ways it’s a different anger at ABWE. All these kids would have been safe if they’d taken the guy off the field when these things first happened. Think how many MKs would have not been hurt.” (Ketcham, who refused to cooperate with PII’s investigation, did not respond to repeated requests for comment for this story.)
Even if Kim’s law passes, it won’t enable her or other MKs to hold Ketcham accountable for his crimes in Bangladesh. But the accounts from the blog and the PII report may yet result in the doctor receiving a measure of justice. Last August, Ketcham was charged with abusing a six-year-old patient in Michigan while conducting a medical exam. The alleged abuse, which took place in 1999, came to light after the patient’s mother happened on the blog and read Kim’s documents. In February, Ketcham was ordered to stand trial in Michigan District Court. At 86, he faces a life sentence for first-degree sexual assault—half a century after he started abusing women and girls in Bangladesh. Twenty-eight years after he raped Kim. Eighteen years after he allegedly molested a six-year-old.
Within that timeline is a world of blame—and warning. Sexual abuse among the nation’s thousands of evangelical denominations may never come into focus the way it has in the Catholic Church. But more and more cases will inevitably come to light—revelation by revelation, report by report, headline after headline—even as conservative churches cling to their happy-family images, no matter who gets hurt. Boz Tchividjian, the founder of GRACE, says his fellow Protestants should reject the impulse to view the scandal the way many Catholics did for years: as a matter of a few bad apples being belatedly punished. “Protestants are going to have to accept the fact that we have many more similarities than differences with our Catholic brothers and sisters when it comes to how we have failed to protect and serve God’s children,” he says.
Kim, who’s now 42, still can’t bring herself to read the PII report. Her parents can only manage to digest small portions at a time. But the family can talk now. “My daughter and I are mending for the first time, because the truth came out,” says Sue. Kim still struggles. This past summer, she cut deep gashes in her legs. She doesn’t have full-time work, but she helps her boyfriend with his car-detailing business and volunteers in the doctor’s office where her mother works—a small way, she says, of finding her way back to the medical field that she loved as a child.
Not long ago, Kim’s sister Diana was back in Indiana to attend a wedding. She and her two daughters, ages twelve and fourteen, took Kim out to eat. Staring at her two young nieces, Kim was suddenly struck by a thought: Do you realize this girl here is the age you were when Donn started molesting you? And the girl next to you is fourteen—the age you were when you brought it to light?
The moment had the impact of a revelation. “I looked at the twelve-year-old and I was like: I was that young? It just hit me.” It really hadn’t been her fault. “I never saw that before, I never did. It’s a shame it took me this long.”
Last month, Saudi leaders took the dramatic step of leading a coalition of Gulf states to cut ties with Qatar, accusing the tiny emirate of supporting regional terrorist groups. Qatar has long had antagonistic relations with Saudi Arabia, but this marks the most severe confrontation in over three decades. Given that Qatar is critical to stability in the Gulf—and is home to a major U.S. airbase—it is of international interest to find a speedy solution to this brewing regional crisis. Among the concessions under consideration, according to the Washington Post, is shutting down Al Jazeera, Qatar’s state-funded news outlet.
If people outside the Gulf are familiar with Qatar, it’s likely because of Al Jazeera, which has been the emirate’s most visible export for more than two decades. Positioned between Iran and Saudi Arabia—“like a mouse sharing a cage with two rattlesnakes,” as journalist Hugh Miles put it in his excellent 2005 book on the network—Qatar was little more than an isolated desert outpost until the late 1930s, when it discovered troves of natural gas second in size only to Russia’s. Now one of the richest countries in the world, its population has more than tripled in the past fifteen years, though just under twelve percent of its 2.3 million residents are Qataris, the rest being migrant laborers. Since 1995, when Sheikh Hamad bin Khalifa al Thani seized control of the country from his father, Qatar has sought to assert its independence from pushy neighbors through Western-style educational initiatives, and human rights and business reforms, all touched in some way by the Emir’s combative style. The goal, according to Miles, was to become “an Arab version of Switzerland: rich, neutral, and secure.”
Launching a credible, Arabic-language news network that was neither foreign-run nor a government mouthpiece was central to this plan, though critics are always quick to note that editorial independence is suspect when the Emir is signing the checks. Al Jazeera was formed in 1996 from the remnants of a failed BBC-Saudi endeavor to start an Arabic-language news channel. When Saudi censorship proved unacceptable to the Brits, the Qataris swooped in, hiring 120 laid-off BBC journalists and broadcasters, staffing up with locals, and giving the network what was intended to be a one-time loan of $137 million to get things going. Within two years, it was watched all across the Arab world, distinguished by its taste for controversy and its willingness to give airtime to figures who had historically been censored, including Israelis, members of Hamas, and—in a move that enraged both the Saudis and Americans—Osama bin Laden. It introduced panel-style shows to the region: one of the most popular programs featured a well-known Islamic cleric who fielded calls on “everything from extramarital sex to suicide bombing,” and incurred the wrath of conservatives by declaring that the Koran did not prohibit fellatio.
Al Jazeera was accused, variously, of being anti-Western, pro-Israel, Islamist, pro-Iraq, anti-religious, and funded by the CIA. It was also enormously popular. By 1999, when the channel began 24-hour broadcasting, it had twelve international bureaus and employed over 500 people.
After 9/11, the network gained prestige and made an enemy out of the Bush administration by regularly broadcasting interviews with Bin Laden and footage of American airstrikes in Afghanistan, where it was initially one of the only international channels with a camera crew on the ground. When Colin Powell asked Emir al Thani during a visit to Washington if Al Jazeera could “tone down” the coverage of the Afghan war, the Emir displayed his knack for PR by making this request public. As the American public’s support for the war began to wane, the network’s credibility rose, though its revenue did not. While Al Jazeera was expected to be profitable within five years of its launch, it was hindered by Saudi and Kuwaiti pressure on advertisers, the prevalence of illegal satellite dishes in the Middle East, and the network’s tendency to provoke regional governments into blacklisting it. In 2001, Al Jazeera borrowed an additional $130 million from the Qatari government to keep the lights on, and to prepare for the launch of an English-language channel.
Al Jazeera English went live in 2006 with lofty editorial aims. As William Youmans puts it in An Unlikely Audience, his history of the network’s English-language expansion, the ambition was to “cover parts of the world to which the global news titans gave scant attention: Southwest Asia, Sub-Saharan Africa, Latin America, and urban ghettoes in the West.” Its news agenda would range from “poverty and the plight of minority groups, to the social, cultural, and environmental costs of global capitalism and power politics.” This soft power strategy earned the network little attention until 2011, when revolutions broke out in Tunisia, Egypt, Libya, and Yemen, and Al Jazeera became the go-to channel for international viewers. AJE’s coverage of the Arab Spring was not without controversy—it was noticeably cool on the rebellion in Bahrain, a Gulf Cooperation Council ally—but its livestreaming attracted over 1.6 million American viewers to its website, and solidified the network’s reputation enough that its leaders decided to pursue their most ambitious expansion to date: entering the U.S. market.
From beginning to end, Al Jazeera America (AJAM) lasted little more than three years and is known to have cost at least $2 billion, the majority going to expensive real estate and TV distribution deals that never managed to push the channel above a dismal average of 30,000 viewers a day. Along the way, it provoked a raft of lawsuits, highlighting the tensions between Qatari managerial culture and that of the left-leaning American newsroom, and provided a cautionary tale about pursuing old-school TV news in the age of the Internet.
It also produced some excellent journalism, taking a page from AJE’s mission to serve as the voice of the “global south” while also focusing on underrepresented groups in the U.S. After a series of scandals and personnel shuffles, everything came to an abrupt end one morning in early 2016. At that point, ratings were still in the toilet, oil prices were plummeting, and rumor had it that the new Emir preferred spending his money on thoroughbred racehorses. Staff members were notified of their impending unemployment in a grim all-hands meeting in the ballroom of Manhattan’s New Yorker hotel, the Moonie-owned building where the TV network had been leasing space until construction was completed on the 55th Street headquarters. Getting out of the 55th Street lease alone cost Qatar about $45 million, and this financial hit was deepened by severance and health care payouts for around 900 employees, as well as settling the remaining lawsuits, of which there were more than a few.
I had joined the network a year earlier as a features editor on Al Jazeera America’s website, which was based downtown on Hudson Street in a large, airy office directly above the New York Review of Books. It was a collegial newsroom with a dedicated and talented staff of around 50, who, lacking direction from the higher-ups, decided to do good work on their own. I was told that the CEO, a former management consultant with no newsroom experience, once remarked that he enjoyed reading the Economist. This was the most I heard about editorial guidance.
By the time I got there, much of the early optimism had waned, and my colleagues would refer darkly to the arbitrary machinations of Doha (the Qatari capital) or the employees who had left for greener pastures. Many had been lured from prestigious and progressive publications with the promise of creating the news network they wanted to work for, only to be disappointed by an arcane bureaucracy whose mandates arrived in cryptic bilingual emails. (Our internal newsletter was called Tawasul, which can be translated as “a position of power due to one’s proximity to the king.”) Reporters and editors were in the early stages of unionizing, and I awkwardly interrupted many furtive conversations in the hallway before figuring out what was going on. Despite the mood, there were perks. My team had no budget constraints (nor an actual budget), editors were encouraged to pursue substantive stories, and we were overseen by a boss who quoted Gramsci at staff meetings. Uptown, at the 34th Street TV headquarters, the stakes were higher and the turmoil heavier.
In An Unlikely Audience, Youmans, an assistant professor of media at George Washington University, offers a forensic account of why the endeavor tanked, contrasting the failure of AJAM’s TV station with that of AJ+, the network’s digital media channel, which has prospered as a producer of youth-oriented viral news videos. (AJAM’s website gets almost no consideration, which more or less reflects how it was treated by management.) Unless you are a serious Al Jazeera Kremlinologist, this is not a book for you; it is written for specialists, not a general audience. It makes frequent use of terms like “glocalization” and approaches its subject through the concept of “port of entry,” which, as far as I can tell, is the notion that companies are shaped by the places they choose to set up shop. Still, Youmans’s research is impressive, and it is his misfortune that “clusterfuck” is not part of the accepted vocabulary of media studies.
There are few late-breaking surprises in AJAM’s story—the network’s problems were evident from the beginning. AJAM came into existence on January 2, 2013, when Doha announced that it had purchased Al Gore’s Current TV network for half a billion dollars. For Al Jazeera, the deal meant readymade bureaus in New York and San Francisco (the latter would become the laboratory for AJ+) and access to sixty million viewers: in other words, an instant foothold in the American market. For Al Gore, the deal was a golden parachute out of a rapidly failing business endeavor, as TV was quickly losing ground to Internet streaming services, and cable companies were attempting to ward off the threat by packaging together even more channels, effectively watering down the offerings. The golden days of TV news were over, and Doha had not yet gotten the message.
The Current TV arrangement was followed by months of lawsuits and infighting with skittish cable companies, which resulted in concessions such as the agreement that no TV content could be featured online—effectively partitioning AJAM’s channel and website. Doha also turned a blind eye to the company’s negative associations in the U.S. Though Al Jazeera had been fondly nicknamed the “terror network” by the Bush administration, company officials refused to tweak the name or logo to assuage its stateside audience. This obstinacy likely had the effect of alienating (and inflaming) potential viewers. While I was sitting in a park several weeks ago reading An Unlikely Audience, a man noticed the cover and started opining on Al Jazeera’s strategy of exporting extremist ideology. When pressed, he confessed that he had confused the network with Al Qaeda.
The network went live eight months and 900 new hires after January 2, and did so without much of a plan, or a sense of how it was expected to interact with existing Al Jazeera structures. While AJ+ benefitted from the sandbox-experimentation culture of San Francisco (and a ten-hour time difference with Qatar), AJAM was hamstrung by the sober traditionalism of TV newscasting, which was becoming more outdated by the day, and by a vexed relationship with Doha. Though HQ was hands-off at first, problems arose when editorial agendas began clashing with AJE’s, and as seasoned American producers found themselves having a hard time adjusting to the perspective of their new employer.
The best parts of An Unlikely Audience capture these problems succinctly: Youmans recounts a conversation in which a producer tells him that the network would not know whether to lead with “a report of a massive attack that leaves many casualties in Syria or a Texas shooting that kills three Americans,” suggesting that contrary to AJE journalists, “those trained in U.S. news organizations appeared conditioned to think ‘American lives are more important.’” Similarly, while many Western news organizations promoted the perspective of the home country, Al Jazeera took a more decentered approach, encouraging correspondents to present the viewpoint of wherever they were reporting from.
The most interesting wrinkle behind all these problems, however, was that Al Jazeera did not need to be profitable. The network, like the BBC, is more of “a state, public broadcaster than a private, commercial, profit-maximizing company,” meaning that it could theoretically exist off the largesse of the Emir. While it claimed to aspire to financial independence, Doha defined success in terms of influence, rather than profit, and exerted minimal effort in finding advertisers. Unlike other U.S. networks, which would commonly dedicate fifteen to seventeen minutes of ad time per hour, AJAM sold just six. This put it in the strangely luxurious position of “being uncompromising in journalism… to the point of being anti-commercial,” which, in turn, made it less relevant to viewers and less appealing to cable companies seeking to bring in large audiences. (Only in America does the lack of a profit motive raise suspicion.) Over time, problems escalated. Nasty lawsuits emerged in which top network officials were accused of anti-Semitism, discrimination, and sexism. High-level female employees began resigning en masse. By 2015, big changes were in order.
To paraphrase Tolstoy, all dysfunctional companies are dysfunctional in their own way. (To wit: the “Al Jazeera controversies” Wikipedia page is thousands of words long and subdivided into fourteen different country sections.) In New York, things took a noticeable turn for the worse in the fall of 2015, several months after Doha sacked the Jordanian former management consultant and replaced him with a prosaic Brit with TV-anchor hair. In a last-ditch effort to consolidate the network, the new CEO announced plans to move website employees to the 34th Street office, a maneuver that required desk-sharing and, as there was no extra space, seating journalists in the hotel basement, uncomfortably close to the bathrooms. Following a complaint to the health department, about a dozen employees (myself included) were left in the Hudson Street offices pending the results of an asbestos and bedbugs investigation, which was where we stayed until everything folded. The last several months of Al Jazeera America saw a slow parade of scandals.
In November, the New York Times revealed that Al Jazeera’s general counsel, the former president of Def Jam Recordings, had been practicing law without a license. He vanished soon thereafter, though reportedly remained on the payroll. Several weeks later, The Intercept reported that Doha had geo-blocked an op-ed by a Georgetown University law professor condemning Saudi Arabia’s plans to execute more than 50 alleged terrorists on a single day. (The piece was later reinstated, and it is worth mentioning that Doha never explicitly prohibited journalists from covering particular issues, though self-censorship was certainly a factor.)
The coup de grâce came two days after Christmas, when Al Jazeera’s Investigative Unit—a six-person team headed by an American ex-Marine known for promoting the controversial theory that Yasser Arafat was poisoned with radioactive polonium—released a documentary accusing more than half a dozen professional athletes, including NFL star Peyton Manning, of illegal doping. The story relied on a single source, a former intern at an Indianapolis clinic, caught bragging on a hidden-camera recording. He denied everything in a YouTube video the day before the story was published. Several of the athletes sued, the piece was widely panned, and AJAM, which had been suddenly instructed to run the story, incurred the wrath of media critics and the general public. Manning went on to lead the Broncos to a Super Bowl victory that February, and seven months after the story aired, the NFL cleared him of all doping charges. By then, Al Jazeera America had been off the air for three months, and most former employees were looking for work.
The tragicomic story of Al Jazeera America would be more amusing if it weren’t for all the good journalism it produced. As major networks were competing over access to celebrities and exclusive quotes from politicians, the website routinely covered issues of domestic poverty and inequality, racism and environmental injustice; it sent reporters all over the world to dig into underreported subjects, and spent thousands on lavish multimedia projects and images from world-class photographers. This was at times admittedly anti-commercial, but nobody else was running front page pieces on the elections in Burundi or the refugee crises in Myanmar. We won awards for our coverage of Native Americans, which was a standalone beat, and ran story after story on the riots in Ferguson and the effects of coal mining in West Virginia.
There wasn’t, and isn’t, any American news outlet like it. While the organization was conceived to further Qatari influence, in New York it was an earnest leftist agenda that emerged, propelled by the impulse to cover issues routinely overlooked by profit-driven outlets. Should Qatar concede and shut down Al Jazeera, it will be abandoning many headaches and a long legacy of poor choices, but also this tradition, which is worth preserving and honing as international media outlets struggle to find their footing. That DNA still exists in AJ+ and AJE. As for the AJAM TV channel, I can’t say. I never watched it.
Energy Secretary Rick Perry made it official on Monday: He denies the science behind human-caused climate change—specifically, the fact that burning fossil fuels increases carbon dioxide in the atmosphere, trapping more heat. Asked whether he believes CO2 is “the primary control knob for the temperature of the Earth and for climate,” Perry replied, “No, most likely the primary control knob is the ocean waters and this environment that we live in.” Scott Pruitt, the head of the Environmental Protection Agency, uttered similar falsehoods in March: “I would not agree that [carbon dioxide] is a primary contributor to the global warming that we see.” Certain industry leaders have become more comfortable saying the same. Southern Company CEO Tom Fanning, also in March, said CO2 was “certainly not” causing climate change.
All three of these denials happened on the same cable news morning show—not Fox and Friends, but CNBC’s Squawk Box, where “the biggest names in business and politics tell their most important stories,” per their marketing copy. And the denials were all in response to nearly identical questions from Joe Kernen, who, of the three Squawk Box co-hosts, is clearly the most interested in climate change. Indeed, one look at Kernen’s Twitter feed reveals that he’s a fervent denier of mainstream climate science; that he believes those who accept that science are part of a “cult” who have succumbed to “groupthink;” and that, even though 97 percent of actively publishing climate scientists agree that carbon dioxide causes global warming, contrarians deserve just as much airtime to explain why they think unprecedented carbon dioxide concentrations are somehow beneficial to human life.
Though the frequency of climate discussion on Squawk Box is a recent phenomenon, the show has been a denial haven for some time. In June 2016, former General Electric CEO Jack Welch railed against President Barack Obama’s ambitious strategy to combat global warming. “It’s almost unbelievable,” Welch said, alleging that Obama’s plan would create “an Air Force that doesn’t have parts” and “an economy that won’t move.” Kernen agreed, and blasted the media for accepting mainstream scientists’ characterization of climate change. “The media are just lapdogs, yeah yeah yeah, they just lap it right up,” he said.
Princeton University physicist William Happer, who has been cited as a leading candidate to be President Donald Trump’s science adviser, appeared on Squawk Box in a 2014 episode to provide, in the chyron’s words, a “defense of carbon dioxide.” Kernen asked Happer to explain recent extreme weather caused by a shifting polar vortex, which scientists attributed to climate change. Happer described the shift as normal, and went on to say that climate models were too sensitive to be trusted. “CO2 is very clearly a benefit,” Happer said.
Happer did not escape the interview unchallenged, but only because Squawk Box co-host Andrew Ross Sorkin was present (the aforementioned interviews were one-on-ones with Kernen). “You don’t believe in climate change at all,” Sorkin told Happer, who became extraordinarily defensive. “Just a minute, just a minute, just a minute—I believe in climate change, shut up!” Happer said. “I get called a denier and anyone who objects to all the hype gets called a denier... The demonization of carbon dioxide is just like the demonization of the poor Jews under Hitler. Carbon dioxide is actually a benefit to the world and so were the Jews.” The Holocaust comparison rightly alarmed Sorkin, but not Kernen, who ended the segment by defending Happer’s position.
Squawk Box’s periodic focus on climate science might seem odd, given that the show covers financial news. But Kernen, a former stockbroker, also has a science background—and a robust one at that, for someone in his line of work. According to his CNBC bio, he holds a master’s degree in molecular biology from the Massachusetts Institute of Technology and majored in that subject at the University of Colorado. He even published peer-reviewed research in 1979 and 1980.
Perhaps this is why Kernen often uses scientific terminology to defend his own climate-change denial. For instance, he often notes that carbon dioxide only makes up 0.04 percent of the atmosphere, which he characterizes as “trace.” This ignores the fact that trace amounts of many substances can have huge impacts, and that the atmosphere currently contains more carbon dioxide than it ever has in human history. Indeed, as Andrew Freedman wrote at Mashable, the last time there was this much carbon in the atmosphere, “Megatoothed sharks prowled the oceans, the world’s seas were up to 100 feet higher than they are today, and the global average surface temperature was up to 11°F warmer than it is now.”
Of course, a background in one scientific field does not make one an expert in another scientific field—Happer is proof enough of that. Robert Levenson, a distinguished professor of pharmacology at Penn State College of Medicine who published at least three papers with Kernen, confirmed that their research was unrelated to climate science. “We did basic cellular biology,” he told me. “It had nothing to do with atmospheric science.” And when I told him that his old lab partner had become a climate denier, Levenson was surprised. “He’s not a stupid guy,” he said. “He should go stick his head in the garbage can.” Later, Levenson sent me an email: “If you speak with Joe, tell him hi from me and ask him what he’s been smoking.”
A CNBC spokesperson declined to make Kernen available for an interview, and only offered this brief statement: “Squawk Box is built for balance focusing on issues that impact business, finance, investments and economies.” If, by “balance,” CNBC is saying Squawk Box also interviews people who accept climate science, that much is true. But such “balance” is really false equivalence: It suggests to viewers that there are two equally informed, reasonable sides to the climate debate—that whether humans are causing harmful climate change is a matter of opinion, not fact.
But that’s exactly why officials like Pruitt and Perry appear on Squawk Box in the first place. Whereas many cable news hosts would never let them get away with the nonsense claim that carbon dioxide has nothing to do with global warming—Fox News’ Chris Wallace grilled Pruitt on this very subject in April—they know that such ignorant, dangerous views will go unchallenged by Kernen. Squawk Box is their safe space, and Kernen is their megaphone.
At his annual live call-in show late last week, Vladimir Putin wryly offered political asylum to James Comey “if he faces prosecution of any kind” in the United States, asking, “What’s the difference between the FBI director and [Edward] Snowden?” There is, of course, a clear distinction between slipping your own unclassified memo to the press and leaking a massive trove of data about classified National Security Agency programs. But there’s another important difference: the latter redounded to the Russian president’s benefit, whereas the Justice Department’s Russia investigation decidedly has not.
Putin surely recognizes this. But he has been striving to maintain his air of unruffled equanimity in what has been a rough stretch for him, from the massive protests across Russia to the near-unanimous vote in the U.S. Senate, which imposed new sanctions against Russia and curbed the president’s power to lift them unilaterally. True to form, Putin has coolly described these sanctions as “harmful” to U.S.-Russia relations, but called any talk of retaliation “premature.”
Donald Trump’s election was supposed to be a boon to Putin. Instead, things have been going quite poorly for him. Whatever goals the Russians had in meddling in the U.S. presidential election last year, be it to elect a president more favorable to lifting sanctions, punish Hillary Clinton, discredit Western democracies, or, as many analysts say, sow chaos in Washington and disrupt the international liberal order, Putin seems to be failing on most counts.
There is indeed chaos in Washington, though largely contained to the White House, and Trump has injected some uncertainty into longstanding relationships with allies. His refusal to affirm the mutual-defense commitment of the NATO treaty and his withdrawal from the Paris climate agreement are sowing divisions between America and Europe that could do long-term damage. The Trump administration seems uninterested in promoting democratic values abroad, and surely any time the U.S. retreats from its leadership role, other world powers such as Russia and China benefit.
But earlier fears of a dramatic shift in U.S. foreign policy, driven by Trump’s isolationist rhetoric and friendliness toward strongmen, are not panning out. Trump stacked his national security and foreign policy teams with establishment picks who have largely stuck to conventional Republican positions: punitive policies against Russia, Cuba, and Iran; cooperation with China on deterring North Korea; more troops in Afghanistan, and more bombs in Syria.
It seems Putin may have misjudged just how powerful our presidency is. Even more so, he seems to have severely misjudged the power of the American media, which is determined to overturn every rock with regard to the Russian hacking story. Back home, Putin is used to receiving far more favorable press—and when Russian media outlets don’t fall in line, he simply shuts them down or finds ways to change the subject.
But the subject stubbornly refuses to change in America, and it is getting worse by the day. The domino effect since the Russian hacking revelations—starting with national security advisor Michael Flynn’s firing, then Comey’s, and now the appointment of special counsel Robert Mueller—has not only forced Republicans to double down on their anti-Russia rhetoric, but has even forced the president to abandon any hopes for a Russian reset, for fear of corroborating the collusion narrative. As defiant as Trump can be, even he must realize that any overture toward Russia now will be viewed as suspect.
Putin’s troubles are hardly limited to the U.S. Trump’s election and Brexit were supposed to herald a new era of nationalism and isolationism in the West. Instead, pretty much every election this year, from the Netherlands to France, has seen a backlash against the far-right parties that presumably would be more favorable to Putin.
The current wave of protests in Russia, by far the largest since 2011, should give Putin the most pause. Driven by young Russians on social media, these protests bring to mind the populist enthusiasm of Bernie Sanders’s rallies, especially when opposition leader Alexei Navalny rants about Russia’s “corrupt billionaire class.” There is little doubt that these Russians, contemptuous as they are of Russian state media, have seen the massive anti-Trump protests in America and are feeling emboldened.
Putin may be hoping that Trump’s troubles in America will convince the Russian people that American democracy is in disarray. But there is another narrative that might prevail: that a democracy, led by a free press, may be able to hold power accountable. Should Trump resign or be impeached—or simply be cut down to size by Congress—this will only reinforce the idea that Western democracies are functioning, and that with enough resistance, those who abuse their authority can be held to account.
One has to wonder if Putin still believes his gamble with Trump was worth the effort. He may have thought he was helping to elect an American puppet, but it turns out he’s not holding the strings. Instead, Putin seems to have pulled off the nearly unthinkable—pushing a historically partisan and divided Senate to come together for a lopsided 98-2 vote in favor of sanctions. Trump’s election was supposed to bring chaos and discord, but at least when it comes to Russia, the U.S. and Europe are moving toward consensus. And Putin, alone again on the world stage, has only himself to blame.
Megyn Kelly’s highly anticipated interview with Alex Jones, the notorious peddler of conspiracy theories, aired as planned on NBC on Sunday night, despite widespread protests and Jones’s own attempts to sabotage the segment. People praised Kelly’s performance, with Politico’s Jack Shafer writing that “she took the mendacious Jones apart in such a textbook manner you had to wonder what all the shouting had been about.” But it was telling that there was no “Megyn Kelly moment,” the term Jim Rutenberg of The New York Times used to describe that point in an interview when “you, a Fox guest—maybe a regular guest or even an official contributor—are pursuing a line of argument that seems perfectly congruent with the Fox worldview, only to have Kelly seize on some part of it and call it out as nonsense, maybe even turn it back on you.”
This was Kelly’s way of distinguishing herself at Fox News—of building her reputation as a rare independent voice, even a feminist, in a crank-filled world dominated by the likes of Roger Ailes and Bill O’Reilly. It was moments like these that convinced the executives at NBC that Kelly could be a crossover star with mainstream appeal. But her inability to replicate that strategy in the non-Fox News world suggests we were giving her far too much credit, both as a feminist and as a journalist.
It was not supposed to go this way. The Jones interview was meant to be controversial enough to boost her ratings (which have been mediocre since she joined NBC) but not so controversial that it eclipsed the interview itself. Kelly quickly ran into trouble with the most sympathetic people in America, the parents of the children slain in the 2012 Sandy Hook massacre, which, according to Jones’s site InfoWars, never actually happened. (Jones’s theory is that the parents faked the deaths of their children to push for tighter gun control.) Critics said Kelly should not air the interview, since it would only give Jones a bigger platform. At least one NBC affiliate boycotted airing the show.
Meanwhile, Jones also called on Kelly to shelve the interview, claiming that it was unfairly edited to misrepresent his views. He had spent the preceding weeks riling up his fans about the controversy. On Thursday night, he pulled out the big guns, releasing secret tapes of pre-interview conversations with Kelly in which she stated, “I’m not looking to portray you as some kind of bogeyman” and promised to run the clips past Jones before airing them. Jones called on NBC to post the full unedited video on its website, or else he would publish it himself.
Contrary to those who found the interview “important journalism,” I thought it was hardly worth the hype. Kelly ran through Jones’s most damaging conspiracy theories, including Sandy Hook and his defamatory accusations against the yogurt company Chobani. She also brought up Pizzagate, in which Jones claimed that the Democratic Party was running a child-sex ring out of a Washington pizzeria, Comet Ping Pong, which prompted a man to enter the restaurant and open fire. Kelly stressed the symbiotic relationship between Jones and President Donald Trump, no doubt to bolster her argument that Jones is an important figure who needs to be interrogated. In an interview with Neil Heslin, one of the Sandy Hook parents, Kelly asked if he had a Father’s Day message for Jones. “I think he’s blessed to have his children, to spend the day with, speak to. I don’t have that,” Heslin said.
There was very little footage of the actual interview with Jones himself—which his supporters will likely cite as proof that the video was heavily manipulated. And the positive reviews Kelly has received likely owe much to the fact that she reportedly recut the show drastically at the last minute to address those concerns.
Ultimately the controversy reveals less about Jones than it does about how the world continues to stoke Kelly’s reputation as an “intrepid gal reporter.” Her fencing with Trump during the campaign brought her great acclaim; she was hailed as the “new gold standard in American journalism.” Then she took a stand against Roger Ailes, becoming the most prominent member of a small army of women who had accused the former Fox News head of sexual harassment. She wrote a bestselling memoir, Settle for More, that cemented her reputation as a kind of hard-charging role model for women.
This was a remarkable turnaround for a person who made her name preying on white people’s fears about black people. Remember, this is the same person who once said that a black teenage girl who was slammed to the ground by a police officer “was no saint”; that student protesters who got pepper-sprayed were just getting hit with “a food product, essentially”; and that it is a “verifiable fact” that Santa Claus is white. She consistently downplayed the issue of police brutality against minorities, was obsessed with claims that the New Black Panther Party was intimidating white voters, and called Black Lives Matter “obviously beyond the bounds of decency.” These all tapped into the conspiracy theory mindset that Fox News has excelled in popularizing. It was no surprise that, in the tapes he leaked, Jones himself can be heard saying, “I’ve always been a fan of yours until everything happened.”
Her elevation to respectability could only have happened with Fox News as a toxic backdrop. (The fact that her interviewing a conspiracy theorist has drawn more backlash than all of her racist comments at Fox News put together is very telling.) Ditto her reputation as a feminist. As Jia Tolentino put it in the New Yorker, “She was a diamond partly because her company was so rough.” At Fox News, with Ailes and Trump looming over the network, Kelly was a veritable bra-burner without having to do much at all. Many of her signature Megyn Kelly moments came when she pushed for the rights of women in the workplace, such as when Mike Gallagher called Kelly’s maternity leave a “racket” and she invited him on her show to excoriate him. Even Gawker lauded it as a “momentary feminist triumph.”
All this despite the fact that Kelly rejects the term. In a review of Settle for More, Jennifer Senior of the New York Times observed, “The needle she threads has an almost microscopic eye. She is trying simultaneously to appeal to both her new Lean In fan base and the regular Fox News watchers who abhor identity politics.” In a world in which corporate titans like Sheryl Sandberg are lauded as feminist heroes, Kelly can make the transition to feminist icon herself. She is part of a Lean In circle and Sandberg herself once called Kelly to say, “I love you, you are awesome” after Kelly challenged conservative Erick Erickson for criticizing a rise in female breadwinners.
In fact, there is nothing that Kelly has said or done that runs counter to Sandberg’s empowerment feminism. Personal success is perceived to be inherently feminist, despite the fact that some people may have been steamrolled along the way. In Kelly’s case, there are a lot of people who have been run over: Newtown families, people of color, all other women. And until the world she inhabits changes, Megyn Kelly will continue to settle for more.
In 2015, when Amazon opened its first brick-and-mortar bookstore, there was a lot of speculation about what the company was really up to. Were the stores designed to sign up new Prime subscribers? Were they hubs for the company’s expanding same-day delivery operation? Could they be stations for the company’s budding drone fleet? Or perhaps these stores were just what they appeared to be: Amazon’s first step to replace the bookstores it had driven out of business.
By the end of 2017, Amazon will have seven bookstores, making it the fifth-largest chain bookstore in the country—itself an alarming sign of what happens when the company infiltrates a supply chain. But with its $13.4 billion purchase of the high-end grocery chain Whole Foods, all of that speculation seems quaint. Shortly after it opened its first bookstore, Vox’s Matthew Yglesias wrote, “Amazon is opening a bookstore for two big reasons: 1. It can. 2. It is driven by a relentless desire to conquer literally everything in its path, and brick-and-mortar retail is a thing.” The purchase of Whole Foods is a sign that CEO Jeff Bezos’s vision of the Everything Store is frighteningly literal: Amazon’s goal is a takeover of retail itself, both physical and digital.
The impact of the Whole Foods acquisition has already been dramatic. After the deal was announced, the value of Amazon’s stock rose by more than the purchase price, which means the deal paid for itself. Meanwhile, the value of Amazon’s closest competitors—Target’s stock dropped by 10 percent, Walmart’s by 5 percent—fell by an even larger amount.
This is not what is supposed to happen. Amazon’s stock is supposed to drop with the acquisition of a troubled company. But Wall Street is only reacting to what is obvious: that Amazon is so powerful that anything standing in its way is toast. And while this is great for Amazon’s shareholders, most Americans can’t afford to be so blithe. When one of the nation’s biggest companies enters—and threatens to overwhelm—a whole sector of the economy, the consequences are enormous. If Amazon now controls the pricing in the book industry, just imagine what it can do in the broader world of retail.
“This is horrible for competition,” Barry Lynn, director of the New America Foundation’s Open Markets program, told the New Republic. “This is the crushing of competition. Amazon is monopolizing commerce in the United States. It set out to become the company that when you said to yourself, ‘I’m going to go buy something online,’ you would say, ‘I’m going to go to Amazon.’ Now Amazon is seeking to become the company when you say to yourself, ‘I’m going to go buy something’ you think Amazon.”
Like other unimaginably gigantic tech companies—most notably Google and Facebook—Amazon has benefitted from decades of a remarkably narrow interpretation of antitrust law. “Amazon has built its business strategy and rhetoric around lowering prices for consumers and serving consumers more generally,” Lina Khan, a fellow with the Open Markets program, told the New Republic. Under the current interpretation of antitrust law—which was deeply influenced by Robert Bork’s 1978 book The Antitrust Paradox—“harm to consumers is the only plausible harm,” Khan said. As long as Amazon keeps prices low—in other words, as long as it refrains from using its monopoly power to extort consumers—it’s safe from scrutiny.
But with its move into physical retail, the necessity for rethinking antitrust law has never been greater. Amazon and other tech quasi-monopolies have benefitted greatly from the relaxing of antitrust laws that began in earnest in 1982. “Walmart was the perfect creature to emerge from the antitrust changes that took place after the Bork revolution,” Lynn said. “But for the digital revolution that took place in the deregulated marketplace, the perfect creature is Amazon. Walmart understands that and they are scared.”
With the Whole Foods acquisition—and, to a lesser extent, its growing bookstore sideline—Amazon is changing the narrative that it and other tech companies have been selling for years. Tech companies have argued that the decimation of retail was an inevitable result of the digital revolution. It was the paradigm-altering innovation of companies like Amazon that was hollowing out entire sectors of the economy. It was a “don’t hate the player, hate the game” argument—except they were telling people to love the game. The digital frontier was the future, and if that drove mom-and-pop stores and big-box giants out of business, well, that was the cost.
But Amazon’s acquisition of 400-plus stores of retail space suggests that it sees real value in brick-and-mortar. “You don’t spend $13 billion on physical stores because you believe that physical retail is over,” Lynn said. “What they’re proving by acquiring Whole Foods is that the collapse of retail that we’re seeing has less to do with the technological revolution than it has to do with the abuse of power by a dominant retailer that intends to become much more dominant.”
What Amazon has planned for Whole Foods is anyone’s guess. “There are so many ways that Amazon can use its power that it’s simply impossible to figure out what it will do. Amazon probably doesn’t even know yet; it will discover and test them, relentlessly,” wrote Matt Stoller, riffing on Bezos’s original name for Amazon, relentless.com. Amazon Prime, same-day delivery, drones—all are options. What’s important is that, having taken over e-commerce, Amazon is looking for new worlds to conquer—and looking for new ways to encroach on Walmart’s turf. If the acquisition of Whole Foods isn’t challenged, Amazon will certainly continue to eat up physical retail the way it once ate up e-tailers like Diapers.com.
It is possible that the Whole Foods purchase will provoke a legal challenge from the likes of Walmart and Target. “Until this point they didn’t want to ever call Uncle Sam and say, ‘Uncle Sam we got a problem here,’ because that looks like weakness,” Lynn said. “But right now they are in a corner, their backs are against the wall. They’re going to call Uncle Sam because that’s the only thing that is going to save them from death.”
The judiciary is stacked with conservative appointees who have a narrow reading of antitrust law. But the issue isn’t with antitrust law itself so much as the current understanding of monopoly. Prices may stay low, but the effect of Amazon’s retail push will be profound for both consumers and producers, with Amazon controlling all kinds of supply chains.
“It’s really easy for the government just to say no to this merger,” Lynn said. “It’s pretty easy for enforcers to take on Amazon, the platform monopolists, to take them on in a coherent way; they’ve got more than ample tools with which to do that.” The problem, however, is finding willing regulators and an executive ready to pick the fight. On the campaign trail, President Donald Trump suggested he would be willing to take on companies like Amazon (though his ongoing feud with Bezos and The Washington Post would certainly add political complications to any such effort). But in office he has shown an eagerness to placate monopolists and the mega-rich.
If Amazon’s bookstores were a toe-dip in the waters of physical retail, its acquisition of Whole Foods is a cannonball. “We’ve got ourselves a little challenge here in America: On one side you have Jeff Bezos and on the other side you’ve got democracy,” Lynn said. “We can choose who we want to trust in. Do we want to trust in America and Americans and American history? Or do we want to trust in Jeff Bezos? That’s what this comes down to.”
Like a lot of feminists who hoped against hope for Hillary Clinton’s election, I looked forward to one thing with special, unqualified joy: The traditional model of the first family was going to be turned on its head at last. Goodbye to the impossibly perfect exemplar of patriarchy we’ve come to expect, if not demand. No more benevolent and hardworking dad who still somehow makes time. No more selfless mom with the over-scrutinized clothes and hair and civic-minded projects. No more well-scrubbed children and adorable dogs and high-jinks-prone cats. America’s archetypal family unit was going to be organized around a powerful woman with a grown-ass daughter and an annoying, retired husband puttering around and causing mischief. If not a smashing of the patriarchy, it would have made a dent.
What we got instead was an even more radical restructuring of the first family than Hillary herself could have envisioned. As Father-in-Chief, Donald Trump hasn’t simply introduced some twenty-first-century version of The Brady Bunch, with a herd of kids from three different mothers all thrown together in a big new house, complete with maid service. He has scrapped any normal notion of the family unit, organizing his personal life around those who advance the same principles that drive the companies that bear his name—taking what you want, doing as you please, and living off other people’s money. We’ve traded the Bushes, the Clintons, and the Obamas for First Family LLC. And in the process, we’ve lost something of genuine value to the country, the world, and ourselves.
However archaic, the institution of the first family carries real cultural and political force. America looks to the White House for some sense of itself, for a reflection of what most of us aim to have in our lives: a unit of mutual affection and mutual responsibility, a place of comfort and normalcy in a chaotic and frightening world. The tools of Madison Avenue were long ago applied to the shaping of the first family brand; it’s always been a focus-grouped projection of the country’s idea of its best self. We define our national selves, in part, by the cultural conversations that the first family stirs, the image it projects. The rest of the world also looks to the president and his family for a gauge of what America stands for: They’re the ambassadors of Brand America.
The Trump family brand mirrors America at its worst—a version in which capitalism deforms all relationships, twisting everyone and everything to serve its basest needs. This is a family only in the Mafia sense of the word, ruled by a ruthless and imperious Don who offers protection in return for fealty. Trump’s children are more than mere relatives: They are executive vice presidents, the capo bastones of an organized racket. In the organizational chart, there’s no box labeled First Lady. Mother, wife, provider of counsel and comfort—these maternal roles have no place in the family business. Melania, Marla, and Ivana have their gracious livings secured, mob-style, by their silence and invisibility.
We’ve come a long way, in a short time, from the days when we argued over whether First Lady Hillary should be baking cookies or running things, or pondered late at night how Laura Bush could be a pro-choice ex-librarian and still play the gracious hostess for a husband who was so clearly her inferior. Despite myself, I long for the days when first families lived together in a place called the White House, expressed discernible tastes in music and culture, and gave every appearance of serving the country, rather than the other way around.
With the advent of First Family LLC, there is a gaping hole in American culture where the president and his family used to be, in all their outdated glory. Even a patriarchy-smasher can feel their absence. The old model, at least, was built around recognizable human needs, even if those needs were distorted or limited by the structure imposed on them. Once, we assumed, there was love in the White House. Now there is only power and greed. Those who cannot produce—who do not serve an immediate, utilitarian function—are no longer welcome there. We feel what this tells the world about America. And maybe, we are forced to concede, the rest of the world was right about us all along.
The arrival of a first family in Washington has always served as an opportunity to redefine American culture. Some families are avatars of a rising region or a cultural trend—the Bushes were both, with their transplanted Texas-ness come to D.C. The Obamas were harbingers of history. The Carters walked into the White House from the Deep South, while the Reagans gave conservatism a Hollywood glow. The Clintons brought Bubba-ness. There was a lot to talk about.
First families have always made a stab, in their own way, at promoting the best of Americana. By the end of the Obamas’ second month in the White House, they had celebrated Stevie Wonder with an East Room concert; early in their first year, the George W. Bushes honored jazz great Lionel Hampton at a star-studded show. The Nixons were always showcasing country singers and Lawrence Welk and Up With People, those icons of the “silent majority.”
As I write, there are no such events on the White House schedule—not one. The closest we’ve come to a cultural moment since the Trump clan took over was the West Wing visit by that classic-rock trio Ted Nugent, Kid Rock, and Sarah Palin. They took photos with the boss and mocked Hillary Clinton’s portrait. There is no music in Trumplandia.
And there is barely a first lady. Reportedly reluctant to take up the role, Melania stayed in New York for Barron’s spring semester in prep school, avoiding the kind of spotlight that could deflect attention from her publicity-crazed husband. Her public appearances as first lady have been few, almost as though that’s the way President Trump likes it. If anything, her absence has served to underscore her husband’s misogyny and inhumanity.
The marginalization of the first lady’s role strikes a blow against female power. This might sound strange, given that the traditional role of the first lady has always been to stay in her place. “Good” first ladies don’t overstep; they attach themselves to worthy causes that stem from the kinds of things a mom would do: tell you to eat your vegetables, read a book, stay away from drugs, get up from the computer and move. These are viewed as less important than the manly things the president does, in no small part because they are woman things.
But even a first lady who minds her p’s and q’s can wield an awesome wand of soft power. Think Jacqueline Kennedy perched atop an elephant in India, charming Nikita Khrushchev in Vienna, or taking Paris by storm. A first lady who does not mince her steps, by contrast, can challenge the world from a rarefied platform. Think Eleanor Roosevelt hosting civil rights leaders in the White House. Or Hillary Clinton going to Beijing to declare, “Women’s rights are human rights.” Or Michelle Obama challenging America’s tropes about race in ways both subtle (taking her White House portrait in front of a Thomas Jefferson painting) and direct (saying she’d lived eight years in a house “built by slaves”). Another kind of first lady can elevate a significant concern through the language of aesthetics, as Lady Bird Johnson did with her crusade to “beautify” America, a gently effective way to raise environmental consciousness.
Above all, first ladies have been arbiters of culture in Washington and beyond. The music they celebrate, the historical moments they commemorate, the art they elevate—all help shape both the form and substance of our national identity. A first lady is what organizational experts call a “power center,” however hampered she may be by the gendered nature of her role. But for Trump, apparently, even that much power is too much to grant a woman. Perhaps Melania’s recent move into the White House with Barron will add a familial touch to the enterprise. But I doubt it. Barron is almost always photographed wearing a suit, a little Trump Organization executive-in-training. (The only photographic evidence we’ve seen so far of Barron looking like a regular kid is a picture of him disembarking from Marine One on June 11 to take up residence in his new home, wearing a T-shirt emblazoned with the words, “The Expert.”)
If you believe, as I do, that the Trump administration amounts to one big project of plunder, it makes perfect sense that the president would fill the boxes on his First Family LLC organizational chart with only those relatives he trusts to shepherd his interests and play all the angles: Ivanka, Don Junior, Eric, and, by affinity of spirit, Jared Kushner. These are the family members who share the president’s confidence, who reveal his notion of the exemplary family—an autocratic CEO served by loyal deputies who are the fruit of his loins, as well as one brought in through the bond of matrimony with the CEO’s favorite loin-fruit.
This degenerate idea of family makes me miss the old-school first family—that relic of the bad old days before the notion of equality between the sexes was even a thing. Compared with the new archetype of the first family as a soulless LLC led by a pussy-grabber-in-chief, I’ll take the softer bigotry of the old sexism. I know how to wage war against the bad parts of it. And now, thanks to Trump, I know how to appreciate the hidden virtues of the old model. At least it was familiar, in every meaning of the word.
Originally published in the July 2017 print issue of New Republic, this article has been updated online to reflect recent news events.
“That man is like superhuman,” Joy Manbeck told The New York Times at the annual People’s Summit in Chicago earlier this month. “He still plays basketball. He walks to work. I don’t care. I want him. Period. I want Bernie.” So does RoseAnn DeMoro, the indefatigable leader of the National Nurses United union. In an interview with McClatchy, she said the Vermont senator is “so clearly ahead” of other potential contenders in the nascent 2020 Democratic presidential field. She added, “He has the most comprehensive program. He’s been doing this his whole life. What people want is what Bernie is saying.”
She has a point. An April Harvard-Harris survey revealed that Sanders is the most popular politician in the country: 80 percent of registered Democrats and 57 percent of voters overall say they approve of him. During the 2016 primary, Sanders won key states that Hillary Clinton later lost to Donald Trump; her defeat was at least partly attributable to an “enthusiasm gap” among expected Democratic voters. Sanders himself has not indicated that he’s planning another White House run, but thanks to his popularity and the unexpected viability of his primary campaign, questions about his political future are being asked.
But all this speculation does prompt another, equally important, question: Should Sanders run? The answer is no, though not for the reasons his critics think.
The imprudence of a Sanders run has nothing to do with his status as an independent. Nor does it have anything to do with the other criticisms his opponents tend to lob in his direction. He did not cost Clinton the election; her loss can be attributed to myriad campaign failures, Russian interference, voter disenfranchisement, and the infamous Comey letter. He is not working at cross purposes with the Democratic Party; he has in fact campaigned for the party’s candidates at its request and in fulfillment of his role as the party’s outreach chair. He and his followers do not represent some existential threat to the party’s identity; any winning Democratic coalition will need their numbers and energy, and as such it should reflect their policy preferences.
Still, there is a strong case for Sanders abstaining from making another presidential run. The first obstacle is obvious: He will be 79 next Inauguration Day. Basketball notwithstanding, advanced age is a vulnerability for any politician. This is particularly true of a politician who inhabits the Oval Office—and this critique applies to Joe Biden and other potential contenders of a certain age.
Second, while Sanders’s campaign ignited public interest in democratic socialism, he was hardly the perfect candidate. He could have been stronger on gun control, particularly at the beginning of the primary campaign. And he too often ceded ground on foreign policy to Clinton—an unnecessary failing, considering her deeply troubling record on the issue. These are questions that Sanders will have to answer all over again if he chooses to run in 2020, and they’re a reminder that there may be a better progressive candidate out there.
And there is the matter of his fame. Name recognition is key to victory, but it can also strangle movements. Sanders the individual now gobbles up so much airtime and so many column inches that he threatens to eclipse the American left, to its long-term detriment. This is hardly his fault, but Sanders must now consider the broader interests of the left.
Sanders’s candidacy served as a catalyst for a surge of positive activity across the left. Interest has swelled the ranks of groups like the Democratic Socialists of America, which existed before Sanders ran for president and will hopefully outlast his career. It has propelled Our Revolution, which emerged from the ashes of his campaign to elect progressive candidates. Though Our Revolution’s results have been mixed, it’s the sort of political infrastructure that must exist to support candidates who are otherwise ignored by the Democratic Party.
If this resurgent left is to survive and flourish, it needs to prove that it can work without a personality to prop it up. This won’t be news to longtime activists on the left, who organize mostly in obscurity on behalf of a constellation of progressive causes. It won’t be news to Sanders either, who was an activist long before he was a politician. For others, though, it may be more difficult to accept. Sanders’s candidacy energized a new generation of young voters, and he is now a celebrity. It is much easier to be publicly socialist now than it has been in decades, and this is a credit to the Sanders campaign.
But celebrity is a double-edged sword for any grassroots movement. Barack Obama’s Organizing for America withered without Obama’s involvement. The Clinton campaign also serves as a warning in this respect: Party leadership rolled Clinton out almost as if she were a product and urged voters to be “with her” rather than with any political philosophy or mission, as if her name alone implied a set of positive characteristics. Clearly, this was not enough to help Clinton last November. It may have even helped her lose the election.
Sanders, in contrast, had no national name recognition when he launched his campaign. He first became popular because of his policies; the persona followed later. It is now undeniable that his public profile rivals Clinton’s. His fame presents him with a singular challenge. If he were to run again, he would be synonymous with the left. If he abstains from another run, he’d highlight a key difference between left and center: For leftists, politics is based on policy, not personality.
Here, Sanders backers can learn from their British peers. As Sanders noted in a recent op-ed for The New York Times, Jeremy Corbyn’s Labour Party made up massive deficits when it released a positive manifesto that articulated a clear, concise anti-austerity message. Though Corbyn is the leader of the party and therefore its prospective prime minister, it seems to have been the manifesto, along with some serious policy missteps by Prime Minister Theresa May, that attracted defectors from parties on the right. And the party’s ads consistently reinforced a substantive political message: Vote for Labour because it is for the many, not the few.
This tactic led to huge gains. Post-election polls show Labour with a five-point lead over the Tories. Corbyn’s humane reaction to the Grenfell Tower tragedy has only sharpened the distinctions between the two parties, to the Tories’ detriment. Labour membership is growing, and the party is in an excellent position to make a bid to reclaim government. This is partly because Corbyn directed party energy to building a movement centered on policy.
The popularity of Labour’s manifesto suggests an ideal role for Sanders. And he seems eager enough to embrace it: “While Democrats should appeal to moderate Republicans who are disgusted with the Trump presidency, too many in our party cling to an overly cautious, centrist ideology,” he wrote in his Times op-ed. The American left needs Sanders to help build a movement and to serve as a Senate ally to the progressive government we will hopefully elect. It needs him to shape the Democratic Party’s aspirations, to help demand a $15 minimum wage, Medicare for all, and free public college. Sanders can play a historically important role by mainstreaming these policies. Handing leadership off to another, younger progressive would be a sign of confidence in the party’s willingness to move left.
But there’s an important exception here. If there is no clear progressive frontrunner—no promising campaign for him to prop up—Sanders should run regardless of age. He’ll have a better shot at victory than he did in 2016, and after four years of Donald Trump, the country will need genuine progressive policies even more desperately than it does now.
Barring this scenario, Sanders should let another progressive run and embrace a movement-building role. In this capacity his age would become a boon, a testament to his long and often unrewarded dedication to a broader cause.
You have to go back to the Woodrow Wilson administration to find an example of a legislative exercise as opaque and regressive as the Republican health care bill. Back then, according to Don Ritchie, historian emeritus of the Senate, it was the Democrats who shut Republicans out of the process. But, as Ritchie told the Los Angeles Times last week, the strategy did not wear well and hasn’t been reprised in almost a century.
Until now. The GOP’s Obamacare repeal effort has been rightly criticized as secretive, partisan, and male-dominated. But an economical way to fold all of these criticisms into a single framework would be to say that the Senate Republicans writing the bill have demonstrated immense contempt for democratic and Enlightenment values.
Republicans are attempting something that hasn’t been done since before World War I, and that has rarely succeeded, precisely because the country was founded to embody nobler ideals. Rather than run draft legislation through an open committee process, Republicans have outsourced the entire deliberation to 13 male senators from 10 states. The Senate GOP advantage in small states is reflected in the working group, and then compounded by the fact that it includes both senators from Wyoming and Utah. The senators who have been looped into the process represent less than one quarter of the nation’s population. If you hail from either of the coasts, your interests are being safeguarded by zero of the senators endeavoring to overhaul the U.S. health care system.
There is no provision of Senate rules holding that legislation must run through regular order, or win the support of senators whose constituents account for half or more of the country. Republicans could pass a bill nobody has had time to read, with 50 votes, and, pending action in the House of Representatives, it would become the law we all live under. But it would not reflect the majoritarian spirit democratically enacted legislation should aspire to.
As Missouri Senator Claire McCaskill noted in a colloquy with Utah Senator Orrin Hatch that went viral online almost two weeks ago, the contrast with the process that resulted in the Affordable Care Act is striking.
That bill, of course, secured 60 votes from senators representing the substantial majority of the country. It ultimately passed on a party-line basis, but only after a lengthy public hearing process in two committees through which senators in both parties were given a say in the final product. Republicans weren’t simply allowed to propose amendments to the legislation—many of their amendments were adopted. Nobody’s interests went unconsidered along the way, and though then-Minority Leader Mitch McConnell ultimately corralled his members into lockstep opposition to the entire Democratic agenda, the great irony is that the law operates as a substantial transfer of resources from states Democrats represent to those Republicans represent.
While Obamacare has grown into a popular law that insures tens of millions of people, the Republicans’ American Health Care Act is unpopular in every state in the country. We can’t know with any certainty whether a more open process would result in better, more popular legislation. McConnell’s determination to cut the broader public out of the process points to a belief that sunshine would kill any bill that reflects GOP priorities.
Republicans cannot grapple with this basic fact honestly. If the point of keeping the bill text tightly held is to prevent the public from weighing in on unpopular provisions before the Senate votes, then admitting that there’s anything unusual about their bill, or their strategy, would give away the game. They thus respond to critiques of their conduct not by opening the process, but by lying about what’s really happening.
The good news is that Republicans can’t stave off democratic accountability forever. In a sense that should be obvious to them, their decision to hide the legislation from a public they know won’t like it will become a defining fact about the law once people begin suffering under it. Republicans will face a reckoning at some point, in other words; the only thing we don’t know yet is whether the reckoning will come before it’s too late for the rest of us.
Donald Trump seems hell-bent on becoming the first president to be impeached for tweeting. On Thursday and Friday, he fired off tweets that both confirmed he was under investigation and seemed to be trying to sabotage the process.
Faced with a lawless president, Democrats have to start thinking about their constitutional duties. Trump is not a normal president whom Democrats can agree with, or dispute, on policy terms. He represents a fundamental challenge to the functioning of American democracy, and he raises the most serious questions about presidential power.
As California Senator Dianne Feinstein argued on Friday, the tweets showed an alarming contempt for the rule of law. “The message the president is sending through his tweets is that he believes the rule of law doesn’t apply to him and that anyone who thinks otherwise will be fired,” Feinstein, a Democrat, said in a statement. “That’s undemocratic on its face and a blatant violation of the president’s oath of office.” Feinstein’s words aimed to shore up the government officials whom Trump is threatening, and to serve as a warning that there would be political backlash if Trump tried to fire Deputy Attorney General Rod Rosenstein or Justice Department special counsel Robert Mueller.
As far as it goes, Feinstein’s gesture was appropriate. If Trump did fire Rosenstein or Mueller, this would be further evidence of obstruction of justice. Yet protecting them can’t be the main instrument of resisting and restraining Trump. Indeed, relying on Rosenstein and Mueller as barriers against Trump’s worst excesses is a prime example of a trap that liberals have fallen into time and again when dealing with presidential abuse of power—a tradition of “prosecutorial liberalism,” which seeks legal rather than political remedies to punish presidential misdeeds. Such an approach is dangerous because it allows legislators to pass off political problems to apolitical law enforcement officials.
Serious political crimes aren’t the same as regular ones: They require not just punishment for lawbreakers, but also political fixes. That’s why the Democratic Party can’t rely on the likes of Rosenstein and Mueller. In fact, since Republicans largely remain loyal to Trump, Democrats are the only ones capable of truly solving this crisis—if they’re given the power to do so. They just have to convince voters of it.
Watergate is often seen as the zenith of modern political scandal. Yet there was only a minimal attempt by Congress back then to solve the problem of the imperial presidency. Instead, almost every subsequent presidency has gotten bogged down in legal quagmires, as Congress uses law enforcement as a Band-Aid without grappling with the real problem of presidential power. To criminalize the political process is to evade checks and balances, and it has resulted in a never-ending tit-for-tat, in which one party seeks revenge by scandalizing the other.
Gerald Ford poisoned his own presidency from the start by pardoning Richard Nixon, thereby setting a precedent for protecting executive branch lawbreaking. Ronald Reagan’s presidency nearly capsized because of the Iran-Contra affair, which stained his successor, too; George H. W. Bush pardoned many leading figures, including Caspar Weinberger and Robert McFarlane, which broadened the precedent by showing how a wide-ranging criminal conspiracy could be shielded after the fact. Bush’s son followed this tradition by commuting the sentence of Scooter Libby, the former chief of staff to Vice President Dick Cheney, who had been convicted of perjury and obstruction of justice. Democrats got a taste of the criminalization of politics with various ginned-up scandals against the Clintons, ranging from Bill Clinton’s perjury during the Monica Lewinsky affair to Hillary Clinton’s use of a private email server.
President Barack Obama seems to have escaped this pattern, since his administration was notably squeaky clean. The public largely saw the Benghazi, Fast and Furious, and IRS controversies for what they were: desperate, partisan attempts by Republicans to damage a popular president. Yet in a different way, Obama contributed to the larger constitutional crisis that has gone unresolved. He greatly expanded the power of the president to operate unilaterally, notably through drone strikes and executive orders on domestic policy. This left a dangerous set of tools to be abused by future presidents, beginning with Donald Trump.
In all the major modern presidential scandals, prosecutors and law enforcement officials have played a central role—from Lawrence Walsh to Ken Starr to Patrick Fitzgerald to James Comey to Robert Mueller. It’s easy to see why both liberals and conservatives look to these lawmen as the solution to scandals real or imagined. They fit a familiar cultural pattern found in Law and Order and many other shows: the heroic prosecutor, often an overgrown Boy Scout with a crew-cut, who works relentlessly to put the bad guys behind bars. Prosecutorial liberalism is the dream that the messiness of politics can be replaced with the moral clarity of a cop show.
While Mueller has a role to play in gathering evidence against Trump, liberals need to stop extolling prosecutors and realize that the true task of holding Trump accountable belongs to Congress. The only constitutional remedies for Trump’s actions are to be found in the legislative branch—remedies that include not just impeachment, but also passing new laws restricting executive power so that future presidents can’t behave as Trump does. Congress needs to start working on rolling back the imperial presidency, a task left incomplete since Watergate.
Given that we now know that the White House can be inhabited by someone as unstable as Trump, Congress needs to sharply limit the president’s war-making ability. Congress could also pass laws requiring presidents to make full financial disclosures before taking office, making more difficult the numerous conflicts of interest that have already bedeviled Trump’s administration.
Both impeachment and pushback against the imperial presidency are going to be hard. As Jonathan Rauch noted in a Brookings Institution report, it will be an uphill battle to get enough Republicans in the Senate to support dislodging Trump, given the president’s popularity within his own party and the GOP Congress’s obsession with repealing Obamacare and cutting taxes. Talking about impeachment might also endanger Democrats’ chances of regaining a majority in the House of Representatives in next year’s midterm elections, particularly in districts that Trump won.
Still, a political agenda can’t be defined by what is easiest to do. Trump’s threat to the Constitution is a real one. Moreover, having so unfit a president makes clear the dangers of the imperial presidency. Obama enjoyed the power to rain death upon countries all over the world, even killing American citizens. Are we comfortable with Trump having such powers?
Democrats face an enormous problem: How can they restore faith in a political system that elected Trump? This question extends beyond the president, as Republicans in Congress continue to downplay his wrongdoing. That Trump is symptomatic of much deeper problems can also be seen in the Republican Senate’s attempt to sneak through Obamacare repeal behind closed doors. The public and Democrats are likely to receive only the smallest of windows to see the actual bill before it is jammed through.
In their attempt to retake the House and Senate in 2018, Democrats would do well to campaign not only on dethroning Trump, but also on addressing the crisis in American democracy. The argument would be a straightforward one: Trump and the Republicans have broken our politics, and only Democrats can fix it.
In 2011, Kassidy Pelletier was washing dishes at her kitchen sink when she heard that the Canadian government was evacuating everyone in her community of Lake St. Martin. Extensive flooding had swept the region, and the government had decided to protect the predominantly white city of Winnipeg by redirecting the waters to Lake St. Martin, a First Nation reservation. Within days, some 1,300 indigenous residents were evacuated from their homes, their entire community destroyed.
The evacuees were initially relocated to hotels in Winnipeg. Six years later, they’re still there, living in temporary government housing. Unable to return to their ancestral lands, a generation of elders has passed away; teenagers who grew up in hotel rooms have begun giving birth to children of their own.
Two years after the flood, when Pelletier returned to Lake St. Martin to salvage her family’s belongings, she discovered her childhood home looted, the valuables gone. Walking into her kitchen, she covered her face with her sweatshirt to mask the stench.
Lake St. Martin’s council recently broke ground on a new community for the displaced residents, not far from their original home. But the 280 houses the Canadian government has agreed to fund are just half of what the community needs.
The displacement has taken the biggest toll on the teenagers. “They’ve come of age when their sense of belonging and identity have been stripped,” says photographer Michelle Siu, who has documented the community’s ordeal. “I wonder now, as they get settled on new land, how many teens will return to a place that was once their entire world, but now must feel like a world away?”
A watery path on the land that will house the new Lake St. Martin reserve.
Rebecca Sinclair, originally from Lake St. Martin, has been displaced to Winnipeg.
Diane Sinclair holds her granddaughter at their temporary home in Winnipeg. Her 20-year-old daughter Alexis took her life weeks after the evacuation in 2011.
Community members watch as votes for their chief and council are tabulated at the makeshift Lake St. Martin school in Winnipeg. Incumbent Chief Adrian Sinclair was re-elected, but the long displacement has divided the community in many ways, and this was palpable during the elections.
A young couple from Lake St. Martin watches television.
Faith Woodhouse, 14, returns to Lake St. Martin for a few days with her family to protest an emergency channel that would further devastate commercial fishing conditions. She and her family are among the few residents who have been able to return to visit their ancestral land during the displacement.
In October 1969, a national security official named Daniel Ellsberg began secretly photocopying 7,000 classified Vietnam War documents. He had become increasingly frustrated with the systematic deception of top U.S. leaders who sought to publicly escalate a war that, privately, they knew was unwinnable.
In March 1971 he leaked the documents—what would become known as the Pentagon Papers—to a New York Times reporter. The newspaper ended up publishing a series of articles that exposed tactical and policy missteps by three administrations on a range of subjects, from covert operations to confusion over troop deployments.
In the decades since, the Pentagon Papers helped shape legal and ethical standards for journalistic truth-telling on matters of top secret government affairs in the United States. Openness, in the eyes of the public and the courts, would usually prevail over government secrecy. In this sense, the transparency that came from the papers’ release shifted power from politicians back to citizens and news organizations.
That balance of power is taking on renewed significance today. In the wake of Reality Winner’s alleged national security leak, the prosecution of members of the press over the past few years, and the Trump administration’s pointed anti-press and anti-leak rhetoric, one must ask: Are we witnessing a swing back toward strengthened government control of information?
The Pentagon Papers helped Americans realize that government officials had no qualms about lying to them about policy. Perhaps more important, they showed that the news media could act as a key conduit between the country’s most powerful political elites and a public those elites meant to keep in the dark.
“They made people understand that presidents lie all the time, not just occasionally, but all the time. Not everything they say is a lie, but anything they say could be a lie,” Ellsberg later said.
The New York Times began publishing the Pentagon Papers in June 1971. Citing national security concerns, the Nixon administration sought to stop publication of the papers. The case went all the way to the U.S. Supreme Court, where, in a landmark 6-3 ruling, The New York Times and The Washington Post won the right to continue publishing information contained in the documents.
From the Pentagon Papers until the Obama administration, there was “an unspoken bargain of mutual restraint” between the press and the government, according to legal scholars David McCraw and Stephen Gikow. The press would occasionally publish classified information, and the executive branch would treat those leaks as a normal part of politics.
Veteran investigative reporter Dana Priest described such a relationship as giving reporters “a greater responsibility to be thoughtful about what it publishes and to give government the chance to make its case.”
But since 2009, the federal government has grown increasingly hostile toward leakers and news organizations that have published classified information. As the New York Times noted in its coverage of Winner, President Trump, “like his predecessor Barack Obama, has signaled a willingness to pursue and prosecute government leakers.”
During Obama’s tenure, his administration prosecuted more leakers than all prior administrations combined. It also continued to pursue high-profile cases against reporters who published stories using classified information. James Risen, a veteran national security reporter at the New York Times and a target of such a case, called the Obama administration “the greatest enemy of press freedom in a generation.”
So what happened? How did this “unspoken bargain” fall apart?
Technology has certainly created more tension between the government and media outlets. Government employees and contractors can electronically access and release information to websites like WikiLeaks, which, in turn, can instantly publicize tens of thousands of pages of classified records.
Mainstream news organizations are also experimenting with new ways for leakers to submit classified information. The Tow Center for Digital Journalism at the Columbia University Graduate School of Journalism created a guide for news organizations using SecureDrop, described as an “in-house system for news organizations to securely communicate with anonymous sources and receive documents over the Internet.” ProPublica offers information on its website about how to leak “to hold people and institutions accountable.”
In a sense, this is part of a continuing battle between the seemingly incompatible traditions of a free press and a national security apparatus that benefits from secrecy.
Government transparency is a necessary ingredient of democracy. To elect leaders, citizens at the local, state, and federal levels need as much access to accurate information about policy and policymakers as possible. On the other hand, when it comes to national security, complete transparency could mean exposing information whose release puts lives at risk.
However, according to University of Minnesota law professor Heidi Kitrosser, the threats that leaks pose to national security are often exaggerated by a political system that benefits from a public that’s kept in the dark about its leaders’ actions. Kitrosser wrote that in one warrant filed during the Obama administration, a member of the press was labeled as “an alleged leaker’s criminal coconspirator.”
Meanwhile, even though it’s become easier to leak information—and for news outlets to expose government corruption and misdeeds—the public has grown increasingly wary of leaks.
A 2007 Pew Research Center report found nearly 60 percent of Americans felt the U.S. government criticized news stories about national security because it had something to hide. That same study showed 42 percent of Americans thought leaks harmed the public interest. By 2013, 55 percent of Americans believed Edward Snowden’s leaks about the National Security Agency surveillance programs did more harm than good.
Such a dramatic change in public opinion raises questions about whether the public today will even defend the media’s right to access and publish leaked information.
It certainly hasn’t helped that, during the first year of the Trump administration, the press has been attacked ad nauseam. The president routinely calls news organizations “fake news” and threatens increased prosecution of leaks.
The rhetoric comes at a time when the public has expressed a growing disdain for journalism. A September 2016 Gallup poll revealed Americans’ trust in the news media to “report the news fully, accurately and fairly” dropped to its lowest level since the group began asking the question in 1972.
Public opinion on this issue matters because there are flimsy legal protections for journalists and leakers. And if politicians realize they can go after journalists without facing a backlash at the voting booth, they could become emboldened.
Because of Reality Winner’s leak, there are new questions about how much Russia interfered with the 2016 election. The Intercept, which published the document, called it “the most detailed U.S. government account of Russian interference in the election that has yet come to light.”
Nonetheless, Winner now faces 10 years in prison. There hasn’t been any legal action against The Intercept, perhaps because the government was able to track down Winner on its own.
Meanwhile, there’s no federal shield law—also known as reporter’s privilege—for journalists. Such a law would give journalists the legal right to protect the identities of confidential sources. However, 49 states and the District of Columbia offer some variation on reporter’s privilege through either case law or statute.
In 2009, a federal shield law to protect journalists from testifying against their sources made its way onto the agenda. The bill had bipartisan support and a Democratic Congress behind it, but Obama said he would refuse to sign it unless it included a significant national security exemption. The bill went nowhere.
In 2008, law professor RonNell Andersen Jones studied 761 news organizations and found that reporters or editors in 2006 received 3,062 subpoenas “seeking information or material relating to newsgathering”—a number that, Andersen Jones argued, justified federal legislation to protect them. Without firm legal protections, journalists face a lengthy, and potentially expensive, fight to fend off the government.
As journalism observers and researchers like me study how leaks, prosecutions and anti-media rhetoric impact everything from media trust to the free flow of information, we may be entering a post-Pentagon Papers era that shifts the power back to political elites, who seem more emboldened to go after leakers.
That’s not good for the average citizen. Ellsberg knew it in 1969. We should pay more attention now, too.
Summer is for scary movies. Everybody stays up late and the night is full of noises. In the heat the city shimmers with tension, in the countryside animals seem to be up all night, watching. Two movies attempt to offer up blockbuster shivers this summer: the latest installment of The Mummy franchise and the much-hyped contagion thriller It Comes at Night. Both movies connect to venerable movie-making traditions, invoking classics of American horror.
Considering that this is the 14th movie about an undead Egyptian person, perhaps the new Tom Cruise picture should have been called A Mummy. Instead, it is the fourth movie to be titled The Mummy, after The Mummy (1932), The Mummy (1959), and The Mummy (1999). Unlike any of its predecessors, this Mummy is set in Iraq and England. (It is also the first in the “Dark Universe” series of Universal monster movies, which I predict will outlive us all.)
The premise for the relocation to Iraq is that the villainous Princess Ahmanet (Sofia Boutella) was so naughty that she had to be dragged far from home for interment. Then, British crusaders carried off her favorite jewel and knife before settling under the Thames. The triad of Cruise’s American protagonist, skirmishes in Iraq, and London action scenes thus makes for a horribly parodic sketch of the alliance against the Axis of Evil, although this time the generalized, brown-skinned enemy has risen from the dead.
The Mummy stars a millennia-old lobotomized hottie with mystical powers, and a mummy. I’m kidding, but at 54, Tom Cruise looks and acts like an embalmed version of himself in the first Mission: Impossible movie. He opens the film by dodging “insurgents” and “ordering an airstrike” on an Iraqi village that presumably has been dragged into the American war. The 1999 version of The Mummy similarly opened with warfare, but that was a colonial fantasy from 1920s Egypt. Equally horrible, of course, but at least not about people who are dying now.
This new The Mummy has some excellent action scenes, particularly when our undead heroine (who remains sexy despite partial decomposition and having four eyes) summons all the glass in London to explode into sand. Unfortunately, neither Tom Cruise nor his co-star Annabelle Wallis is anywhere near as funny as Brendan Fraser and Rachel Weisz were in the 1999 classic. That mummy was funny, too—remember how he was afraid of cats? Cruise at least gets some good lines in, but Wallis is essentially a blonde plank with a face drawn on it, occasionally breathing the phrase, “Oh my god.”
To be fair to director Alex Kurtzman, he has a long tradition to contend with. The 1932 The Mummy, starring Boris Karloff, was a Karl Freund joint. The plot is remarkably similar to the 1999 version: A sinning priest named Imhotep accidentally gets resurrected, and he’s looking for his girlfriend. But the interesting difference is that Imhotep in the 1932 world has been masquerading as a regular guy. The horror hinges on not knowing whether the mummy is dead or alive. He presents a category problem, bringing a strange and unholy presence to the screen.
The mummy’s physical repulsiveness is crucial to the 1932 version, and then every mummy-centric picture since, until now. Sofia Boutella brings a gorgeous intensity to her role, but there’s none of the scarab-chewing grossness of Arnold Vosloo’s mummy in 1999, or the looming menace of Karloff, or the grodily flaking flesh of Christopher Lee in 1959.
This mummy has trendy bangs. There’s a thrill to be had in her feminine power, not least when she sucks the life out of men by kissing them (at last, a truly feminist action hero!). But by removing the grossness from the movie, The Mummy loses some of its purpose.
It Comes at Night is grotesquely misadvertised. The trailer shows black-eyed demons puking into people’s mouths, while the title implies some marauding monster. In reality, It Comes at Night is a movie about a highly infectious disease: A family is camped out in a house in the woods, escaping a contagion that is destroying society. A new family comes along. Should they be trusted, or not?
It’s an interesting premise, and It Comes at Night blends a remarkable number of traditions. There’s the family-as-microcosm thing, hearkening back to those isolated rural families of Chekhov, while the boarded-up-house-plus-contagion-fear recalls Night of the Living Dead. The woods have a menace that matches The Blair Witch Project. And the terrible prospect of losing loved ones to disease most closely matches 28 Days Later.
It Comes at Night makes a number of striking moves with genre. It will also be remembered as the breakout role for Kelvin Harrison Jr., whom you may recall from The Birth of a Nation. As the teenage son Travis, Harrison holds the movie together; his embodiment of teenage fear is excruciating to watch. He speaks little, communicating with his parents through tear-filled stares and keeping his eyes down when talking to strangers. He acts his adolescence with his whole body.
The character of Travis also develops the movie’s most interesting theme around physical intimacy. He eavesdrops on others’ breathy, nighttime conversations. We hear his breath inside his gas mask, see him in bed, enter his dreams. The house is filled with intimacies that make its inhabitants feel soft and vulnerable. The fear that haunts the house therefore feels like a commentary on the terrible emotional danger of cohabitation and family units, as much as on mistrust.
It Comes at Night is a valuable meditation on what becomes of people when they become desperate. But it isn’t the frightfest that its trailer so strongly implies. In the theater, I heard a baffled “What the fuck?” from a few rows back as the credits rolled.
In fact, neither of the two supposedly frightening movies that opened this past weekend is actually frightening at all. Each feels too concerned with engaging with or rejecting the classic tropes of horror movie-making. It Comes at Night teases with a conventionally bloody trailer, then offers its audience a product overloaded by cinema history and without enough new to offer. The Mummy tries to get out of its own box, but remains mired in the legacy of the 1999 version, which after all is only just old enough to vote.
It is hard to make scary movies in the 21st century, and a studio might be forgiven for investing in dumb Michael Bay robo-violence instead of even trying to make a new kind of monster. Genre will always cast a long shadow over any movie attempting to make us jump. But as psychologically innovative pictures like The Conjuring or The Babadook should remind us, it is worth at least trying to do something new, instead of clumsily defanging something old.
Donald Trump’s presidency might be a catastrophe of epic proportions, but you have to grant him one thing: He’s made Americans pay attention to whatever the commander-in-chief is saying. On April 28, for instance, the White House issued the kind of presidential proclamation that is usually the proverbial tree falling in the forest, unheard and unseen. Like every president since Eisenhower, Trump proclaimed May 1 to be Loyalty Day—the occasion first invented during the Red Scare of the 1920s to counter the traditional pro-worker May Day. JFK had done it, LBJ had done it, Obama had done it. But when Trump did it, half the country—and all the Twittersphere—went into panic mode. What was this new Loyalty Day that Trump had come up with? Another step on the road to fascism?
For once, there was actually nothing to panic about. But the reaction spoke volumes. Loyalty Day seemed to symbolize the defining neurosis of Trump’s presidency: his maniacal need for loyalty above all else. It’s the reason he has failed to nominate anyone to fill hundreds of key federal jobs. It’s the reason he has surrounded himself not with “the best” advisers, as he promised in the campaign, but with a gaggle of ego-stroking family members, hangers-on, profiteers, and rogue ideologues. And it’s why, the week after Loyalty Day, “loyalty” became the pretext for his decision to fire the FBI director who was investigating Trump’s own campaign. James Comey’s dismissal left little doubt that Trump’s preoccupation with personal loyalty—even more than incompetence, stupidity, or corruption—could be the thing that wrecks his presidency.
As Comey told associates at the time, and testified last week in Congress, the trouble began when he refused Trump’s demand that he pledge his personal fealty, as if he were a monarchal subject. Instead, Comey promised to be “honest”—and thus loyal to the duties of his office, rather than the whims of his boss. After the firing, Trump denied that he’d asked Comey for loyalty, but emphasized that it wouldn’t have been wrong if he had. “I don’t think it would be a bad question to ask,” he told his friends over at Fox News. “I think loyalty to the country, loyalty to the United States, is important. You know, I mean, it depends on how you define loyalty.”
We know how Trump defines it—and that’s the problem. This is a man who, at a campaign rally, asked his supporters to pledge allegiance not to the flag, but to him. This is a man who made his bodyguard the head of Oval Office Operations, a key gatekeeper role. Every president needs a few die-hard loyalists around him; Trump needs everyone to be a die-hard loyalist. And since Trump already rivals Richard Nixon as the most dangerously isolated president in American history, it’s essential to understand why.
In some ways, the tunnel-vision insistence on loyalty is the least surprising thing about Trump’s presidency. After all, the man has always been a self-described “loyalty freak.” Asked a couple of years back what he looked for most in an employee, he shot back the “l”-word without pause. Some observers have chalked this up to Trump’s paranoid personality and deep-seated insecurity. More tangibly, it stems from his business past: Navigating the notoriously cutthroat, mob-ridden, and litigious world of New York real estate, he naturally came to value personal allegiance above all else. When your business model orbits entirely around yourself, everyone else must yield to your gravity.
But Trump’s loyalty-mania can’t be chalked up solely to his personal quirks. For all our armchair analysis of the president’s predilections, we’ve been missing a key reason for Trump’s destructive obsession with loyalty: It’s a function of his politics.
Any populist who wins the presidency must face an inherent contradiction. To cast himself as the candidate of the people versus the establishment, Trump made radical promises: overturn trade deals, carry out mass deportations of illegal immigrants, punish runaway American companies, and replace Obamacare with “insurance for everyone.” But such demands are unacceptable to the diverse array of forces required to enact major changes in public policy.
Perhaps in Latin America, or in one of the newly minted Asian democracies, a winning candidate, with the army on his side, could cow the legislature and judiciary into following his lead. But in the United States, the presence of constitutional checks and balances, a military subordinate to civilian authority, semi-independent agencies like the FBI and the Federal Reserve, and a powerful nexus of business and interest groups all combine to undercut any threat to the system posed by populism. Trump isn’t just lonely because he’s Trump; he’s lonely because his politics require him to govern that way.
After his surprise election, Trump could have tried to ingratiate himself with the GOP establishment the way Ronald Reagan did after running as a far-right “maverick” in 1976 and 1980. But Trump bought into the notion, promoted by his chief adviser, Steve Bannon, and his ally Nigel Farage of the nationalist UK Independence Party, that he could transform the Republicans into a “party of the American worker,” as he vowed at this year’s Conservative Political Action Conference. His inaugural address was a ringing endorsement of populism: “For too long,” he declared, “a small group in our nation’s capital has reaped the rewards of government while the people have borne the cost. Washington flourished, but the people did not share in its wealth. The establishment protected itself, but not the citizens of our country.”
That’s not the kind of talk that wins you friends in Washington and corporate boardrooms. During the early GOP primaries, only a single senator—Jeff Sessions—endorsed Trump, along with fewer than a dozen House members. After the election, others in Congress had little incentive to bow down to the president, since 178 of the 241 winning House Republicans outpolled him in their own districts. Trump also had no traditional network of powerful lobbyists and donors to back his agenda: Influential Republican business interests like the Koch brothers and the U.S. Chamber of Commerce sat out the election; the Club for Growth denounced Trump. It’s small wonder that, as he suffered one frustration after another in trying to impose his platform, Trump’s impulse was to keep barnstorming the country with campaign-style rallies. He needed to manufacture a popular surge behind his policies.
But Trump has never commanded that level of public support. In November, fewer than 40 percent of his votes came from hard-core supporters who were devoted to Trump. The rest came from life-long Republicans, single-issue conservatives, and independent voters who either backed Trump with reservations or loathed Hillary Clinton more than him.
Trump is boxed in by his lack of support both inside and outside Washington. In staffing his administration, he cannot command loyalty in the various ways most presidents do. He can’t point to party, because he trashed the party to win the White House. He can’t call in the political favors he’s owed, because he hasn’t done any. He can’t draw on his business relationships, because he’s burned almost every business associate he’s ever had. Unable to attract experienced veterans from the political and business establishments, Trump can offer newcomers to his court only one thing to kneel before: himself. Which explains the composition of his inner circle—they’re the only people who will join him.
Under Trump, personal loyalty is the sole qualification that counts. After taking office, Trump installed campaign toadies to monitor at least 16 key departments and report heretics to a White House deputy chief of staff. He rejected Elliott Abrams for a top role in the State Department because he hadn’t sided with Trump during the campaign, and abruptly fired an aide to Housing Secretary Ben Carson in February after it surfaced that he had written an op-ed critical of Trump last fall. But it’s impossible to staff an institution as vast and complex as the federal government if the only people you can count on are those who have never disagreed with you. Trump has yet to nominate anyone for 455 of 557 key posts that require Senate confirmation, effectively crippling his own administration.
In theory, there’s still time for Trump to come to terms with the realities of power in Washington. His tax cuts and deregulatory ambitions appeal to Republican business lobbies, and his get-tough immigration politics enjoy the support of House Republicans. He’s already changed federal abortion rules to please the religious right. By courting key factions of the establishment, he could conceivably govern like a conservative Republican.
But Trump appears too enamored of his own image as a billionaire populist to play by such rules. So he will continue to purge his administration of anyone he cannot control, as he did with Comey. He will keep staging misguided shows of strength, as he did by trying to bully House Republicans into rubber-stamping his initial replacement for Obamacare. And as his political isolation deepens, his need for unquestioning allegiance will only grow more desperate. Operating as an outsider, he cannot overcome the structure of checks and balances that is thwarting his radical agenda; that same structure, however, is unlikely to lead to his removal, barring a Watergate-level scandal. But like those of the dictators he’d hoped to emulate, his reign will one day come to an end. And when it does, he will leave office as he entered it—accompanied only by his most loyal henchmen and enforcers, and thus utterly, irredeemably alone.
Originally published in the July 2017 print issue of New Republic, this article has been updated online to reflect recent news events.