Selma, the Sequel

Celebrated as a civil rights milestone, the three marches in Selma 50 years ago also ushered in a new style of social and political advocacy. In the March 18 issue of The Christian Century, I write about what many of the marchers went on to do after Selma, and the faith-based movement they built. (By the way, the Century has a website worth checking every single day.) Here is the Selma piece, in full:

Fifty years ago, thousands marched across the Edmund Pettus Bridge in Selma, Alabama. They were led by an eye-catching row of marchers, including a bearded rabbi, an unidentified nun in flowing habit, and Martin Luther King Jr. The third Selma-to-Montgomery march, which began on March 21, 1965, is rightly remembered as a watershed in the struggle for civil rights. Less known is how Selma refocused the lives of many, black and white, who gave the march its spiritual hue.

The trek to Montgomery began with more than 3,000 of the civil-rights faithful, whose ranks swelled by the thousands along the way. In that initial vanguard were several hundred clergy and untold numbers of lay religious activists from around the country. Voting rights became law five months later, just as many who had marched were letting loose their faith in a wider field of activism, taking on a host of social wrongs. They and others forged a new style of advocacy eventually known as the “prophetic style.”

Until the 1960s, white church people were easy to spot at a civil rights protest in the South, because they were scarce. Standing out among them was William Sloane Coffin, the 30-something Yale chaplain and a former CIA agent. In May 1961, Coffin made front-page news nationwide because he was white, well connected—and leading a group of Freedom Riders, who rode interracial buses across state lines to challenge segregated transportation in the Deep South. He emerged as the brash young face of incipient white solidarity with southern blacks.

In the next few years, religious involvement in civil rights—beyond black churches—gradually grew. In Selma, it was finally brought to scale.

On March 7, police brutally assaulted several hundred marchers, mostly young black men attempting to cross the Edmund Pettus Bridge. Known as “Bloody Sunday,” the event is dramatically and faithfully rendered in the movie Selma. In response, King issued an urgent plea for a “Ministers’ March” to Montgomery. Within a couple of days, an estimated 400 members of the clergy were wandering Selma, many of them having flown in with one-way tickets.

In Cambridge, Massachusetts, Jonathan Daniels had an epiphany during a recitation of the Magnificat prayer at the Episcopal Divinity School. The white seminarian dropped everything to go see the humble exalted in Alabama.

In New York, renowned Jewish scholar Abraham Joshua Heschel momentarily agonized over whether to travel on the Sabbath. Two years earlier, he had met King at a national conference on religion and race in Chicago, where the two became fast friends. “The Exodus began,” said Heschel in his opening address there, “but it is far from having been completed. In fact, it was easier for the children of Israel to cross the Red Sea than for a Negro to cross certain university campuses.”

Heschel was outspoken, but had never quite taken to the streets before. Two weeks after Bloody Sunday, there he was, marching with King in the front row.

After Selma, many intensified their activism and broadened their faith-and-justice lens. Jonathan Daniels stayed in Alabama, living with a black family, fighting in the trenches to register black voters, and earning a stay in county jail. In August 1965, at the age of 26, he was shot dead by a segregationist construction worker moonlighting as a sheriff’s deputy.

That summer, King turned his attention to the subtler humiliations of northern racism. Soon, he and his family were living as tenants in Chicago as he shifted his focus from lunch counters and voting booths to knottier problems such as housing and employment. King was also trying to reason with new, harsh adversaries, ranging from white northerners to black militants who dismissed his inclusive, interracial vision of a “beloved community.”

Seven months after Selma, some of its alumni pivoted to a different cause altogether. They formed a national organization that came to be called Clergy and Laity Concerned About Vietnam. Its leaders included Heschel, Coffin, radical Jesuit priest Daniel Berrigan, and then-Lutheran pastor (later Catholic neoconservative) Richard John Neuhaus. The New York-based coalition spearheaded some of the first broadly based mobilizations against escalated warfare in Southeast Asia.

It was this group that brought King firmly into the antiwar fold, with his then-controversial “Beyond Vietnam” speech at Manhattan’s Riverside Church in April 1967. At the time, King called the war “an enemy of the poor,” linking the expense of intervention in Vietnam to the lagging War on Poverty at home. By the end of that year he was announcing the Poor People’s Campaign, an interracial effort for economic justice. It was King’s last crusade, a dream unfulfilled.

Through these struggles, King and others nurtured a style of politics rooted most deeply in the prophetic literature of the Hebrew Scriptures. It was a politics of vehemence and passion.

If this loosely bundled movement had a bible—other than the actual one—it was arguably Heschel’s 1962 book The Prophets. This study of the ancient radicals helped usher out the soothing spiritual happy talk lingering from the 1950s. Heschel wrote jarringly (and admiringly) that the biblical prophet is “strange, one-sided, an unbearable extremist.” Hypersensitive to social injustice, the prophet reminds us that “few are guilty, but all are responsible.” Heschel also explained that a prophet feels responsible for the moment, open to what each hour of unfolding history is revealing. “He is a person who knows what time it is,” the rabbi wrote, checking his watch.

The book caught on among spiritually minded civil rights workers. After perusing its pages, a young aide to King named James Bevel started going around in a knit skullcap, his way of paying homage to ancient Israel’s prophets. On the day of the final march in Selma, scores of other young, black, and presumably Christian men also chose to incongruously sport yarmulkes. Andrew Young, one of King’s top lieutenants, has recalled seeing marchers arrive with copies of The Prophets in hand.

After King’s assassination in 1968, Heschel and many other Selma veterans pressed forward (though not for long, in the case of Heschel, who died in late 1972). In 1968, Coffin became a household name as he stood trial for aiding and abetting draft evasion through his counseling of young men. So did Berrigan, who exceeded Coffin’s comfort zone by napalming draft records. In keeping up their prophetic ministries, they and others also spawned an assortment of imitators.

In the late ’60s and early ’70s, student antiwar radicals mimicked the so-called “prophetic style,” denouncing and confronting like the spiritual radicals but adding contempt and sometimes even violence to the mix. They designated themselves, in the words of New Left leader Tom Hayden, a “prophetic minority.” Later on, from another ideological galaxy, came Jerry Falwell and the Moral Majority, which explored the boundaries between prophetic denunciation of perceived social evils and demonization of one’s opponents.

In these and other imitations, much of the prophetic spirit was lost, and so was the tone. King and like-minded clergy of the 1960s may have been quick to denounce and confront, but they scarcely if ever demonized or even denigrated. Typically they managed to blend strong moral convictions with degrees of civility and good will often unseen in politics today.

Issues that galvanized the 60s clergy still haunt us today. Racism, poverty, and war remain with us; even voting rights is a present-day cause, due most notably to the voter ID laws passed by a majority of states. The “Black Lives Matter” uprising against police violence has exposed racial chasms in many cities. Jails are increasingly packed with poor people who committed minor offenses or were unable to pay court-imposed costs. In 1968, King considered the level of the federal minimum wage to be beneath dignity; today, adjusted for inflation, it’s worth substantially less.

Such challenges invite a theological perspective—and a prophetic one. It’s not hard to find people acting on that impulse, people like Kim Bobo of Chicago-based Interfaith Worker Justice, who has crusaded against wage theft while invoking Nehemiah’s censure of plundering the poor. She and many others breathe life into a far-flung movement that hit stride 50 years ago on a bridge in Selma.


Of Martyrs and Murderers

Students at the University of St. Thomas, in St. Paul, Minnesota, reenact the slaughter.


Who is a martyr? The question comes to mind 25 years after what has become known as “the Jesuit massacre” in El Salvador.

On November 16, 1989, an elite battalion of the Salvadoran military forced its way into the Jesuit residence at the University of Central America, or UCA, in San Salvador. Most of the soldiers had received counter-insurgency training in Georgia, at the U.S. Army School of the Americas. They proceeded to murder six Jesuits, their housekeeper, and her teenage daughter.

Unlike the martyrs of ancient Christianity, these men were not killed simply because they professed the faith. They were targeted specifically for speaking out on behalf of the impoverished and against persecutions carried out by the U.S.-backed military. Still, in the view of many, they died for the faith no less than the martyrs of old.

This happens to be subject to dispute in some quarters. The argument has surfaced mostly in connection with the sainthood cause of Archbishop Oscar Romero, who was gunned down by a paramilitary death squad while saying mass in the chapel of a cancer hospital in San Salvador, in 1980.

Friends of the cause would like to see Romero declared a martyr, a move that would unblock his path to beatification (the next-to-last step to sainthood) by making it unnecessary to prove that he performed a miracle. In other words, if you’re a martyr, you don’t need to be miraculous, at that critical stage of the process. Your advocates do need to prove just one miracle, though, in the final lap of canonization.

Those less thrilled with this prospect say Romero was not a martyr, because he didn’t die defending Christianity in general or a core doctrine such as the Resurrection. In this right-leaning view, Romero perished because he defended something so ancillary to the faith as the rights of the poor and powerless.

The argument is a little tendentious. It’s a bit like saying Derek Jeter doesn’t deserve a spot in baseball’s Hall of Fame because he didn’t hit all that many grand slams. All he did was rack up .300-plus batting averages, steal bases like they were gold, and, speaking of which, walk off with five Gold Glove awards. Of course, all of that counts in Major League Baseball, just as standing up for the lowly and dispossessed matters in Christianity. The analogy veers off, because Romero was more than the theological equivalent of a great singles hitter. He knocked the ball out of the park in a way he could never have done by merely self-identifying as a Christian or endorsing the doctrine of transubstantiation.

In essence, Romero’s detractors are arguing that justice and the poor aren’t all that central to revealed faith. So, if you were forced to lie face down on the grass in the courtyard of UCA’s Jesuit residence, before shots were fired into your head, you didn’t have to go through all that trouble on account of your religious convictions. It was a sort of private choice you made, on the basis of your left-of-center political preferences, according to these skeptics.

But what happens if solidarity with the poor and marginalized is no small part of the story told in the Hebrew and Christian Scriptures? What if the so-called “preferential option for the poor,” articulated over the past generation in Catholic social teaching, means something?

I asked a Jesuit about this, specifically in the context of martyrdom. The Rev. Stephen A. Privett, S.J., is no random member of the Society of Jesus. He is the former president of the University of San Francisco, a Jesuit institution, and he knew the UCA Jesuits as a refugee worker in El Salvador during the late 1980s. The six priests were Ignacio Ellacuría (UCA president and internationally renowned theologian), Ignacio Martín-Baró, Segundo Montes, Amando López, Joaquín López y López, and Juan Ramón Moreno. They were slain together with Julia Elba Ramos and her 16-year-old daughter, Celina Maricet Ramos.

Privett and many others refer to all of them simply as “the martyrs.” He explained why, in an article I did for the U.S. Jesuit Conference, on the 25th anniversary of the predawn rampage at UCA. (The full story is available here, and my follow-up piece was also posted yesterday at the Conference’s site, www.jesuit.org.)

“When you sacrifice your life because of your active support for the marginalized, you are a martyr in the traditional sense. You are witnessing to a transcendental reality that is not comprehended by others, particularly the folks who are wielding the power,” explained Privett, underscoring that work for justice is an inherent part of his faith.

“I think the church needs martyrs in every era, to remind us that we can never be comfortable with the world as it is. We have to work for a better world, and often we pay a pretty heavy price, but that price is not that heavy when you look at it through the lens of the Resurrection, or through the eyes of the martyrs,” Privett added, putting a doctrinal and specifically Christian spin on the matter. “It’s a really important part and a dynamic piece of our tradition that keeps us moving and engaged, never comfortable with any status quo this side of heaven.”

For now, Privett and others will have to remain content with this supernatural form of justice. That’s because, in the case of the six Jesuits and two women, human justice was never done. None of the top military commanders who gave the orders to kill was ever prosecuted for the crimes. And we know their names, thanks in part to a 1993 report by a United Nations truth commission that investigated the atrocities.

Human-rights activists, including the San Francisco-based Center for Justice and Accountability, would like to see some long-delayed justice in this matter. So would Spain, which is now claiming jurisdiction in the case because five of the six Jesuit victims were Spaniards. Prosecutors there are trying to extradite some of those named by the U.N. commission. International justice might be catching up with the murderers, as one way of honoring the memory of the martyrs.

Why Mandela Forgave the Butchers

Mandela with Archbishop Desmond Tutu


Back in the early 1960s, black South African lawyer and activist Oliver Tambo was asked to describe a colleague who had just gone to prison for resisting white minority rule in that country. He replied that this man is “passionate, emotional, sensitive, quickly stung to bitterness and retaliation by insult and patronage.” Tambo was talking about his law-firm partner, Nelson Mandela—remembered today for his grace, humor, and empathy, as well as his remarkable courage and leadership.

What happened to Mandela in prison, what changed him so radically, is still a bit of a mystery in my mind. He was often asked about a slice of this question—how he let go of the anger he felt specifically toward whites—and his responses were usually of a fairly standard therapeutic variety. Bill Clinton, in an interview aired last night by CBS Evening News, related one such exchange with Mandela.

I said, “Now, Mandela, you’re a great man but you’re a wily politician. It was good politics to put your jailers in your inauguration and put the heads of the parties that imprisoned you in your government. But tell me the truth, when you were walking to freedom the last time, didn’t you hate ’em?” He said, “Yes. Briefly I did. I hated them and I was afraid. I hadn’t been free in so long. And then I realized if I still hated them after I left, they would still have me. I wanted to be free. And so I let it go.” He said, “That’s what you have to do. That’s what we all have to do. We have to let it go.” I mean, that’s the kind of thing he would say to me just in ordinary conversation.

“They would still have me.” How true. But does this explain the difference between the petulant man sized up by Oliver Tambo, circa 1963, and the Nelson Mandela we came to know and revere? Former Time managing editor Richard Stengel, author of Mandela’s Way: Fifteen Lessons on Life, Love, and Courage, has offered some further insight into Mandela’s personal transformation during his 27 years locked up in a tiny cell. Asked in an interview if prison was one of Mandela’s great teachers, he said:

Yes. Because prison changed that young man, and it burned away a lot of the extraneous parts of his character. And again, part of it was through his own self-analysis, but part of it is through this imposed control that prison has on you. I mean, the only thing you could control when you were in prison for all those years was yourself.

I mean, I remember when I first went to his cell on Robben Island. And I walked in, I walked—nearly walked in, but I gasped when I saw it, because—I mean, Nelson Mandela, as you know, is a big man. He’s 6’2″ tall, he has big hands and a big head. And he is larger than life in a literal and figurative way.

And this prison cell—I mean, he couldn’t even lie down and stretch out his legs. I mean, it could barely contain him. But what he learned and what he taught himself was how to contain himself, how to practice the self control that he actually didn’t have before he went into prison.

I don’t know if even this explains how someone becomes a strikingly different human being, although prison has been known to bring about extraordinary changes in people. What’s clear is that Mandela left prison with forgiveness in his heart—but there’s no getting around the politics.

Mandela’s Politics of Forgiveness

Mandela understood the difference between personal forgiveness and forgiveness in politics. In one of many symbolic and deeply personal gestures, he made his white jailer an honored guest at his presidential inauguration in 1994. But he knew that something else was needed in dealing with the larger ranks of white South Africans (often in the police and military) who had committed terrible human-rights violations. Mandela did not, as is widely believed, simply let those people go free, unconditionally. They had to do something in return for political amnesty. And that something was enshrined in the post-apartheid Truth and Reconciliation Commission that he set up with Anglican Archbishop Desmond Tutu as its chairman.

Human-rights abusers had to go before this tribunal, whose proceedings were televised, and tell the whole truth about their atrocities. They had to reveal, in some cases literally, where the bodies were buried, and they did so often in grisly detail. Or else, they faced criminal prosecution.

This is not garden-variety forgiveness. It is not a single, unconditional act of letting bygones be bygones. Political forgiveness is different. It is a process, usually a negotiated one. It calls for truth and acknowledgment, if not necessarily repentance, and there are trade-offs and conditions. Without the conditionality, forgiveness loses a vital link to justice and restitution. It ceases to have a reason for being in politics.

Mandela knew this. At the same time, he realized that justice alone (investigations and prosecutions) was not the answer. For one thing, there might not have been a negotiated settlement with the apartheid regime, without clear provisions for amnesty. In other words, there might have been the bloodbath between white and black South Africans that many had predicted.

Beyond that, Mandela had other pragmatic considerations that didn’t arise simply from the goodness of his heart. His clear-eyed view was that the stability of the New South Africa depended on a well-calibrated process of reconciliation. He went down this road at least partly because there was no real alternative. As a politician as much as a person, Nelson Mandela knew there was no future without forgiveness.

Posted today also at Tikkun Daily.

Lascivious Swedes and other Vindications of Calvin

Lately I’ve been exploring Vice, not the awful habits (those come naturally), but the international print and online magazine by that name. This week I’ve clicked on pieces with such headlines as “A Muslim’s Adventures in Pork,” “Massachusetts Might Force a Woman to Share Parental Rights with the Rapist Who Impregnated Her,” and “You Can’t Just Walk Around Masturbating in Public, Swedish People.” The last story was about a 65-year-old man who did the deed on a public beach in Sweden but was acquitted on grounds that he wasn’t seeking to harass “any specific person.”

But what really drew me into Vice was not a lascivious Swede, but an interview with Marilynne Robinson, Pulitzer Prize-winning author of acclaimed novels including Housekeeping and Gilead, and one of the more clear-eyed observers of the human situation.

When I saw the headline, “A Teacher and Her Student … Marilynne Robinson on Staying Out of Trouble,” my first thought was that she’s a creative choice for a publication called Vice. Robinson has a fresh and thoughtful take on the theological sensibility of John Calvin, who had a searching eye for all manner of human frailty.

Asked if she had any notable vices, Robinson quickly mentioned “lassitude,” apparently alluding to the second definition of that word—“a condition of indolent indifference.” She recalled a comment by a scientist on why creatures sleep—“It keeps the organism out of trouble.” She added, “So every once in a while I sit on the couch thinking, I’m keeping my organism out of trouble,” suggesting another human foible, that of self-rationalization.

“I do get myself involved in things that require a tremendous amount of work. And of course, I’m always measuring what I do against what I set out to do,” she continued. “My other vices—I cannot have macaroons in the house! I’m a pretty viceless creature, as these things are conventionally defined. On the other hand, one of the reasons I have taken [John] Calvin to my heart is that I can always find vices in the most unpromising places.”

Asked what a vice is, Robinson gave a sort of classically Calvinist response, “I have no idea. Underachievement, I suppose. The idea being that you have a good thing to give and you deny it.”

The Trouble with Seeing

The interviewer, Thessaly La Force (a former student of Robinson’s at the Iowa Writers’ Workshop), evinced no interest in the theological side of Robinson’s ruminations. And the part of the conversation I’ll remember for a while had to do not exactly with a vice, but with the decline of a virtue—simple respect for others and their degrees of goodness. Here’s how she unpacks the problem:

I think that a lot of the energies of the 19th century, that could fairly be called democratic, have really ebbed away. That can alarm me. The tectonics are always very complex. But I think there are limits to how safe a progressive society can be when its conception of the individual seems to be shrinking and shrinking. It’s very hard to respect the rights of someone you do not respect. I think that we have almost taught ourselves to have a cynical view of other people. So much of the scientism that I complain about is this reductionist notion that people are really very small and simple. That their motives, if you were truly aware of them, would not bring them any credit. That’s so ugly. And so inimical to the best of everything we’ve tried to do as a civilization and so consistent with the worst of everything we’ve ever done as a civilization.

On the surface, the notion that human beings are deserving of cynicism might seem to be an instinctively Calvinist (read dour) view. But that’s not how Robinson presents this misunderstood man of the Reformation. She has pointed out elsewhere that Calvinism starts with the idea that human beings are images of God, and every time we see another person, we’re encountering this image. The complication is that humans don’t have very good vision, in that regard.

Every act of seeing “tends to be enormously partial, just given the human situation,” Robinson told my friend and collaborator Bob Abernethy a few years ago. We may see things in a person that bolster our cynicism without seeing much else. And so, in her hands, this Calvinist perspective, this awareness that we never see adequately or exhaustively, “sensitizes you to the profundity of the fact of any other life—that people can’t be thought of dismissively.” And yet, that’s exactly how we are often made to think of the other, courtesy of this human situation.

Sacred Space, at the Corner of Boylston and Berkeley

At Boylston and Berkeley, 8:00 a.m., Monday April 22


Prepared for today’s edition of Tikkun Daily.

Two days after the Boston Marathon bombings, Massachusetts Governor Deval Patrick was asked in a public radio interview if there would be a permanent memorial to the victims of that horrific act. Patrick understandably felt it was too early to speculate about such a memorial—this was before the dramatic lockdown of Boston and surrounding communities. He went on to say that the most fitting tribute would be to return next year with the biggest and best marathon ever.

That surely would be a testimony to the city’s spirit, but it seems the governor, as a good technocrat, was missing the point. Fact is, people were already finding makeshift ways to memorialize the event. And if past atrocities are a guide, they’ll eventually find a permanent space for that solemn purpose.

If I didn’t know this already, I’d have found out just by standing for a few minutes near Copley Square this past Monday morning, at the intersection of Boylston and Berkeley streets.

Boylston, a crime scene, was still closed at the time. But people stood silently on a sidewalk at the corner, leaning against a police barricade in front of a popup memorial. They gazed at the flowers, flags, candles, handwritten notes, and other items left by anonymous people. They stared at three white crosses in the center of that growing memorial—in remembrance of the three who perished in the twin bombings of April 15. The shrine to eight-year-old Martin Richard was teeming with teddy bears, balloons, and children’s books.

People will memorialize, because they know hallowed ground when they see it. It’s extraordinary, when you think about it—how the heinous and the hallowed can share the same space, how a site of evil can be transfigured as holy. But this seems to happen every time. It happened at the Twin Towers, at the Murrah Federal Building in Oklahoma City, at Pearl Harbor, and most profoundly, at Auschwitz. Each of those names marks out a distinct space in the timeless realm of evil. And each space is inviolable.

But how about Boylston Street, or a consecrated corner of it? Is it now part of this geography of the sacred? It is, if you think of such space the way historian Edward Linenthal does. In an interview adapted in a book I did some years ago with Bob Abernethy of PBS’s Religion & Ethics Newsweekly, he said:

My definition of a sacred space is a simple one. Any place that’s capable of being defiled is by definition sacred. You can’t defile ordinary space. Any place that for a group of people is so special that a certain way of being there would be an act of disrespect means that that place is charged with a particular kind of meaning.

Linenthal, who now teaches religious studies at Indiana University, continued:

I tell my students, if they were sitting in the parking lot at K-Mart with a boom box, no one’s going to really care. They might be irritated that the noise is too loud. But if they had a boom box at Gettysburg or in the grove of trees at Shanksville [into which United Airlines Flight 93 crashed on September 11, 2001, in Pennsylvania] or in a church, a mosque, or a temple, it would be considered an act of defilement.

This is why questions about what to do with these places fraught with meaning can be so vexing and contentious. Consider the 9/11 memorial in Manhattan. The decision to store the unidentified remains of victims in an underground repository—rather than a more visible place of tribute—stirred resistance from victims’ families.

Sure enough, a debate erupted this past week over the impromptu memorial at Boylston Street—how to preserve it, where to move it. Such a discussion would have been ludicrous, if this were ordinary space. If it were incapable of being defiled.

And that’s just a prelude. A few days ago, Boston Mayor Tom Menino’s office let it be known that the process of figuring out how to permanently memorialize the bloodshed at Boylston has begun.

“What the Hell’s the Presidency for?”

On Monday of this week, the police chief of Montgomery, Alabama, formally apologized to Georgia Congressman John Lewis, for what the police did not do in May 1961—protect Lewis and the other young Freedom Riders who arrived at the city’s Greyhound Bus station and were summarily beaten by a white mob. The day before the ceremony (the first time anyone had ever apologized to him for that particular thrashing, the congressman noted), Lewis, Vice President Joe Biden and 5,000 others joined in an annual reenactment of the 50-mile march from Selma, which led to passage of the Voting Rights Act in 1965. On that occasion 48 years ago, state troopers took a less passive approach and brutalized Lewis and others themselves. A few days before the reenactment, President Obama unveiled a statue of Rosa Parks that will stand permanently in the U.S. Capitol’s Statuary Hall, making her the first African American woman to be so honored.

One name that doesn’t figure notably in these various commemorations is that of Lyndon Baines Johnson. But it should. At least that’s my feeling after reading Robert A. Caro’s The Passage of Power, the latest in his magnificent series of Johnson biographies. The writer makes it clear that Johnson wasn’t just a pragmatic politician who acceded to the prophetic demands for action on civil rights. LBJ made it happen, partly out of a visceral identification with the “dispossessed of the earth,” as Caro puts it.

True, there probably wouldn’t have been a Civil Rights Act of 1964 (not that year, anyway) if Parks had lost her nerve on the bus in Montgomery, in 1955, and given up her seat to the white passenger, or if King hadn’t led his nonviolent warriors into the streets of Birmingham in 1963. And the same goes for the Selma marchers and the Voting Rights Act (which the Supreme Court now seems poised to undo). But it’s also true that civil rights legislation was heading nowhere in the administration of the Brothers Kennedy.

JFK and RFK meant well, once they decided to push a bill of that kind. But they didn’t fully grasp what Johnson saw, which is that powerful southern lawmakers would be able to slam the brakes on civil rights, just as they had blocked other liberal domestic reforms ever since the late 1930s. A new strategy was needed to break open the dams holding back progressive legislation.

Dixie Democrats, in league with sympathetic Republicans, had perfected the art of legislative hostage taking in Congress. They would stall a critical piece of legislation, such as an appropriations bill, or something else that key lawmakers absolutely wanted, until the progressive measure was withdrawn. That’s how they fought off higher minimum wages, expanded unemployment insurance, greater federal aid to education, and other initiatives beginning in the Roosevelt administration (after the early-to-mid-thirties onslaughts of New Deal legislation).

When the Kennedy administration decided to press for a civil rights bill, in June 1963, it sent the measure up to Capitol Hill along with other must-pass items. Johnson, as vice president, had warned against doing exactly that. He had told Kennedy and his senior aides that they needed to shepherd the other bills through the process before trotting out civil rights.

Relating a conversation between Johnson and Kennedy confidant Ted Sorensen, Caro writes:

He tried to explain to Sorensen how the Senate works: that when the time came for the vote on cloture [halting a filibuster], you weren’t going to have some of the votes you were promised, because senators who wanted civil rights also wanted—needed, had to have—dams, contracts, public works projects for their states, and those projects required authorization by the different Senate committees involved, and nine of the sixteen committees (and almost all of the important ones) were chaired by southerners or by allies they could count on.

The vice president was ignored as usual—frozen out of the administration’s legislative efforts, partly due to the machinations of RFK, who detested him. The Kennedy people thought they understood legislative realities better than the man who had been “the Master of the Senate,” as Caro dubs him, and they proceeded to play straight into the hands of southern tacticians, who bottled up the civil rights bill. Because of that, Kennedy did not live to see progress on that front.

The general wisdom is that his assassination is what galvanized the country behind his legislative program. And, as shown in The Passage of Power (covering the years 1958-1964), Johnson did move at breakneck speed to capitalize on that momentum. At the same time, he resisted calls to send civil rights to Congress right away, together with other bills deemed necessary—calls issued by Martin Luther King Jr. and the other civil rights heroes. Johnson waited. He kept his eye on the hostage takers, realizing that the best way to thwart them was to not hand them any hostages. He let other bills (appropriations, foreign aid, etc.) pass first. Then he mounted his attack. That’s how civil rights became law in the summer of 1964.

Don’t Leave Out Lyndon

Caro points out that many have questioned the sincerity of Johnson’s commitment to civil rights. The author says those people should pay closer attention to remarks Johnson made during a meeting with governors at the White House (days after the Kennedy assassination), about why they should fight inequality and injustice: “So that we can say to the Mexican in California or the Negro in Mississippi or the Oriental on the West Coast or the Johnsons in Johnson City that we are going to treat you all equally and fairly.”

Note the “Johnsons in Johnson City,” Texas, where he grew up. Caro analyzes:

He had lumped them all together—Mexicans, Negroes, Orientals and Johnsons—which meant that, in his own heart at least, he was one of them: one of the poor, one of the scorned, one of the dispossessed of the earth, one of the Johnsons in Johnson City. What was the description he had given on other occasions of the work he had done in his boyhood and young manhood? “Nigger work.” Had he earned a fair wage for it? “I always ordered the egg sandwich, and I always wanted the ham and egg.” Nor was it financial factors alone that accounted for his empathy for the poor, for people of color—for the identification he felt with them. Respect was involved, too—respect denied because of prejudice.

Caro continues, relating what President Johnson said as he further reflected on his experiences as a young man teaching impoverished Mexican American children near San Antonio:

He had “swore then and there that if I ever had the power to help those kids I was going to do it.” And now, he was to say, “I’ll let you in on a secret. I have the power.” “Well, what the hell’s the presidency for?”

Lyndon Johnson is not known as one of the prophetic personalities of the civil rights era, and shouldn’t be. It was King and others who shaped the vision (in King’s case, of a “beloved community”) and expanded the realm of the possible, which enabled the “Master of the Senate” to work his legislative magic. Still, it’s hard to picture a Civil Rights Act of 1964 or a Voting Rights Act of 1965 without LBJ as well as MLK on history’s stage at that moment. That ought to be recognized more often than it is.

This item was first posted yesterday at Tikkun Daily.

When MLK was Old

King at Boston University

A new study published in Science magazine invites a fresh take on Bob Dylan’s refrain, “Ah, but I was so much older then, I’m younger than that now.” The study of 19,000 adults found that most people realize how much they’ve changed in the past ten years but seriously underestimate how different they’ll be in the future. People of all ages think they’ll stay pretty much the same—incorrectly, according to the Harvard and University of Virginia researchers. They call it the “end of history illusion.”

That’s to say, we think we’re so much older and wiser, but we’re younger than that now. There’s more growth to experience—different values, preferences, and personality traits to make our own. I don’t know if that’s necessarily comforting. Depends on how much you want to stay “just the way you are” (with apologies to Billy Joel). There were helpful summaries of the study and its methodology in Science Times and the Boston Globe, and at NPR online.

With Martin Luther King Day coming up, it’s worth asking how many of history’s great figures would have predicted how different they’d be, ten years out. I don’t think MLK, sprinting to his doctorate in theology at Boston University in 1953, had a clue.

Absorbed in Hegel, Tillich, Niebuhr, and others, King had what he saw as a clear picture of his future self. It involved standing at the front of a class in social ethics at a seminary or university, preferably a northern institution. As Stephen B. Oates recounted in his 1982 biography of King:

He hadn’t all the answers, by any means. He realized how much more he had to learn. But how he enjoyed intellectual inquiry. He would love to do this for the rest of his life, to become a scholar of personalism [the philosophical school that engaged his mind at B.U.], the Social Gospel, and Hegelian idealism, inspiring young people as his own mentors had inspired him. Yes, that would be a splendid and meaningful way to serve God and humanity.

King—on track to become a tweedy tenured theology professor—was so much older then.

A year later, he accepted what he assumed would be a sleepy temporary pastorate in Montgomery, Alabama. Newly married to Coretta, he took the job at Dexter Avenue Baptist Church, a relatively affluent congregation, figuring he’d get a little pastoral experience and draw a paycheck while wrapping up his doctoral dissertation.

Coretta wanted to get out of the Deep South as soon as possible. But on December 1, 1955, a 42-year-old seamstress named Rosa Parks refused to surrender her seat on a city bus to a white passenger, and was escorted to the police station. Uproar ensued, and King’s fellow clergy, a fairly timid bunch, drafted the 26-year-old into the leadership of what became the Montgomery Bus Boycott. There was no turning back.

Postscript

Last week, the Bible that MLK used in his early ministry made news. It was announced that Barack Obama would take the oath of office with his hand on King’s Bible as well as Lincoln’s. That’ll come at the high point of the January 21 inauguration ceremony, which happens to fall on the King holiday.

On the inaugural platform, you won’t have to look far to find a living person whose identity changed in unexpected ways. Just keep an eye out for Barry Obama.

Heschel’s Prophets, and Ours

Some forty years ago, in one of his last public appearances, the celebrated Jewish philosopher Abraham Joshua Heschel said in an NBC television interview that “one of the saddest things about contemporary life in America is that the prophets are unknown.” He was referring to the ancient Hebrew prophets, who proclaimed the divine truth and yet were often “grossly inaccurate” because they concerned themselves with meaning, not facts, as Heschel had written. The rabbi spoke prophetically in that interview—which is to say, not very accurately.

Heschel died a few months later on December 23, 1972. But he lived to see and help usher in what he surely knew was one of the most prophetic moments in American history.

His timeless study, The Prophets, was published in late 1962, and it ushered out the soothing spiritual happy talk of the ‘50s. The Polish-born mystic wrote admiringly that the biblical prophet is “strange, one-sided, an unbearable extremist.” Hypersensitive to social injustice, the prophet reminds us that “few are guilty, but all are responsible,” Heschel declared.

The book was read widely in civil rights circles. In his 1963 “I Have a Dream” speech, the Rev. Dr. Martin Luther King Jr. echoed the prophet Amos—“Let justice roll down like waters and righteousness like a mighty stream.” King used a translation barely known at the time but common today—Heschel’s translation. The standard rendering had been “judgment” rather than “justice.”

King and Heschel had first met in January of that year at a conference on religion and race, in Chicago, and the two became fast friends. In 1965, they and others locked arms in the first row of the march from Selma to Montgomery—a lasting image of that whole struggle. Afterward, Heschel remarked, “I felt like my legs were praying.”

By then, with his surfeit of white, wavy hair and his conspicuous white beard, Heschel looked as well as sounded the part of an Old Testament prophet. And within a year he was waging prophecy on another front, as co-leader of Clergy Concerned About Vietnam, a collection of kindred spirits emanating from New York City, where Heschel taught at the Jewish Theological Seminary.

This prophetic club included, among others, the otherworldly Jesuit priest Daniel Berrigan and the swashbuckling liberal Protestant minister William Sloane Coffin, and the group persuaded King to ramp up his antiwar activism. Heschel struck the spiritual high notes when he preached at a 1968 mobilization in Washington about “the agony of God in Vietnam.” He declared: “God’s voice is shaking heaven and earth, and man does not hear the faintest sound.”

Devolving Prophecy

In the late ‘60s, young radicals imitated this style of prophetic denunciation; leaders of the secular New Left often spoke self-referentially of a “prophetic minority.” The counter-cultural stance took on a conservative hue in the late ‘70s, with the ascending religious New Right. Fundamentalist leader Jerry Falwell often credited King with his conversion to a political and confrontational faith.

In a way, much of politics today has gone prophetic. The vilification of one’s opponents, the overstatements about a “war on religion” or a “war on women,” the jeremiads against the one percent or the 47 percent, have come to be expected. (Outside of the religious right, it is largely in secular politics that one sees this skewing of prophetic discourse.) Do we really need more prophets uttering their “strange certainties” and speaking “one octave too high,” as Heschel affectionately wrote of the biblical prophets?

The rabbi would say yes, but he’d have in mind a different prophetic style.

He, King, and company usually found a way to join prophecy with civility, denunciation with doubt. This isn’t like walking and chewing gum at the same time. It’s much harder. Heschel said in his old-world way (as related by his biographer, Edward K. Kaplan), “Better to throw oneself alive into a burning furnace than to embarrass a human being in public.” King, in his Letter from Birmingham Jail, pleaded with his white-clergy critics to forgive him “if I have said anything that overstates the truth.”

They did sail over the top at times, as when King, appearing with Heschel at Manhattan’s Riverside Church in 1967, branded America “the greatest purveyor of violence in the world today.” But that’s what prophets do. In their realm, unreasonableness is no vice, particularly when seeking to “strengthen the weak hands,” as the prophet Isaiah said in regard to the lowly and oppressed.

And that was Heschel’s prophetic calling—not so much to take the right stands, but to stand in the right places.

Even Less Moral

Niebuhr on Time’s cover, March 8, 1948

In December 1932, a 40-year-old theology professor who had recently left his Michigan pastorate drew nationwide attention with his book, Moral Man and Immoral Society. Two sentences into the introduction, the author, Reinhold Niebuhr, was already walking back the title, saying the distinction it suggested was too unqualified. Reflecting on his classic work of social ethics three decades later, Niebuhr wrote that a better encapsulation of his thesis would have been, “Not So Moral Man and Even Less Moral Society.” By then he had become one of the principal definers of 20th century American liberalism.

The notion behind the title was that while individuals might be able to muster sympathy “for their kind,” human groups and societies have little such capacity for self-transcendence. It might have been the least emphatic argument of this unsettlingly unsentimental book, which can be as startling today as it was 80 years ago, in the throes of the Great Depression.

Niebuhr wrote Moral Man in a time arguably not unlike our own, when both economic and political power had concentrated in fewer hands. The wealthiest Americans had succeeded in making government “more pliant to their needs,” he argued. But the professor at New York’s Union Theological Seminary did not unleash his brash analytical power on plutocrats alone. He aimed squarely at his fellow liberals, who believed in the efficacy of moral suasion and rational argument, and who imagined that “men of power will immediately check their exactions and pretensions in society, as soon as they have been apprised by the social scientists that their actions and attitudes are anti-social.” Niebuhr’s intent was to disabuse them of these illusions.

One essay in this volume that seems to especially evoke our situation today is titled, “The Ethical Attitudes of the Privileged Classes.”

The attitudes have largely to do with economic inequalities. The chapter starts with a bow to the truism that such gaps are inevitable and stem partly from different levels of talent and skill. Niebuhr’s clear-eyed view of human nature and destiny could hardly make him suppose that inequality, along with a fair bit of misery, is unnatural. But he quickly adds that personal attributes never explain extraordinary degrees of wealth inequality. These are due chiefly to “disproportions of power,” he says, alluding in part to money’s grip on politics.

For Niebuhr, the task of plutocracy or government by the wealthy is to justify this power and privilege. Plutocrats do so by identifying their special interests with the general good. “Since inequalities of privilege are greater than could possibly be defended rationally, the intelligence of privileged groups is usually applied to the task of inventing specious proofs for the theory that universal values spring from, and that general interests are served by, the special privileges which they hold,” he observes.

Such thinking requires a certain amount of self-deception, according to Niebuhr. But he says it also involves hypocrisy—in that the privileged often salute one thing (the good of all) and engineer something else (narrow self-interests). He continues:

The most common form of hypocrisy among the privileged classes is to assume that their privileges are the just payments with which society rewards specially useful or meritorious functions. As long as society regards special rewards for important services as ethically just and socially necessary … it is always possible for social privilege to justify itself, at least in its own eyes, in terms of social function, which it renders. If the argument is to be plausible … it must be proved or assumed that the underprivileged classes would not have the capacity for rendering the same service if given the same opportunity. This assumption is invariably made by privileged classes.

As Niebuhr further limns this mindset, he points to its understanding that the masses of people are economically unfit not simply because of their lesser intellects or purported lack of opportunity. They are also seen as succumbing to character flaws, namely their inclination toward what the Puritans (his spiritual ancestors in the Calvinist fold) styled as “laziness and improvidence.”

Plutocracy Revisited

Niebuhr’s analysis echoes in current debates. For instance, Chrystia Freeland, author of Plutocrats, notes a tendency among the super rich to “confuse their own self-interests with the common good.” Niebuhr’s plutocrat, though at times a cardboard figure, finds voice in billionaire activists such as Leon Cooperman (quoted in Freeland’s book), who wrote an open letter a year ago to President Obama, enumerating services rendered by his class: “As a group we employ many millions of taxpaying people … fill store shelves at Christmas … and keep the wheels of commerce and progress … moving.”

The “special rewards” today might include Wall Street bailouts, preferential tax rates for capital gains, and the carried-interest loophole that withers tax bills for hedge fund managers like Cooperman. “Specious proofs” abound with the notion, for example, that half of all Americans will never “take personal responsibility and care for their lives,” as Mitt Romney declared in his famous behind-closed-doors remarks about the 47 percent.

Yet few commentators would match Niebuhr’s unrelievedly unsentimental view.

Most decent people would hope to see different parties and factions engage in good-faith dialogue about the common good. Niebuhr would say: Don’t count on it. Because he saw reason as largely subservient to self-interests, he felt that relations between groups must always be “predominantly political rather than ethical,” meaning that those who favor greater equality should rely on sheer power and political mobilization, not just cogent arguments and appeals to conscience. The clear message: Expect little from conversations with plutocrats.

Among the many who found little uplift in Niebuhr’s critique was Niebuhr himself. “All this is rather tragic,” he said at the end of the book. He was speaking of unpalatable means toward the goal of greater equality, such as appealing to raw emotion and even resentment.

At times it’s hard to tell if Niebuhr is endorsing such behavior or trying to whip up an air of crisis. He certainly preferred loftier means such as civil discourse—provided they were effective. But a word he used favorably in this context is “coercion,” directed at the powerful, by the people through their government; he also saw an eternal need for power blocs such as labor unions and the pressures they apply. This would be “class warfare” by today’s squeamish standards.

Niebuhr Now

Moral Man and Immoral Society was Niebuhr’s first major work. At the time, many readers and reviewers (including his fellow liberal Protestant clergy) were understandably alarmed by what they saw as his cynicism, and Niebuhr’s response was characteristically defiant. Gradually, however, he gave a little more due to the possibilities of grace and goodness in political life. He also turned a scornful eye to self-righteousness on the left as well as right.

At the same time, Niebuhr applied his thoughts about the “brutal character of all human collectives” to an increasingly dangerous world. He inspired many a liberal Cold Warrior—and a latter-day adherent, Barack Obama, who calls Niebuhr his favorite philosopher. In recent decades the Niebuhr brigades have arguably been filled with neoconservatives more than liberals, animated by their interpretation of Niebuhrian realism, the idea that the search for perfect justice is dangerously utopian.

Still, Niebuhr was always a creature of the left. He cofounded the liberal Americans for Democratic Action in 1947 and opposed the Vietnam War, which was still raging when he died in 1971. And he remained a sober prognosticator of the human condition. He often said that the only empirically verifiable Christian doctrine was Original Sin, which he found more steadily reliable than any belief in human perfectibility.

With his acute sense of tragedy and paradox, Niebuhr would not put full faith in grand designs of economic justice (if those existed today). But he would also doubt there could be even proximate justice, apart from a confrontation with privilege and an unabashed plying of worldly power.

Last Rites for Capital Punishment?

Model of a late 19th century French guillotine

On September 10, 1977, France raised the 88-pound blade of its guillotine one last time and let it drop on a Tunisian immigrant who had sexually tortured and murdered a young French nanny, lopping off his head in just a fraction of a second. After that, a cry of “off with their heads” heard anywhere in the Western world would likely suggest little more than a taste for metaphor, not a thirst for blood. And soon, all manner of executions, not just the heads-roll variety, would be declared illegal throughout Western Europe. In due time scores of countries elsewhere—from Mexico and the Philippines to Cambodia and Rwanda—would put away their death penalty statutes. Only the rare developed nation would kill to show that killing is unacceptable.

The United States would be rare. Lethal injections, electrocutions, and other means of judicial death would offer an eye-popping display of American exceptionalism. The death penalty is still very much with us in America, and only eight other countries, with not a democracy among them, executed more than two or three people last year. That said, in recent years we have become less exceptional on this score.

The latest case in point is Connecticut, where lawmakers voted yesterday to abolish new death sentences. Governor Dannel Malloy, a Democrat, has vowed to sign the measure, which will make Connecticut the fifth state in the past five years to forsake punishment by death. (The others are New Jersey, New Mexico, New York, and Illinois; California voters will probably have their say at the ballot box in November.)

The biggest story, however, is not about the handful of states that are shuttering their death houses altogether. It’s about the slow death of capital punishment throughout the country, though I’d lay emphasis on slow. The numbers of executions as well as new death sentences have been falling steadily in recent years. In 2011, 43 people were executed nationwide, a 56-percent drop since 1999, according to the Death Penalty Information Center in Washington.

Even Texas has been less eager to administer the heart-stopping potassium chloride and other lethally injected drugs. Texas extended the death protocol to 13 inmates in 2011, compared to 24 two years earlier. That’s just one way of sizing it up, though. Another way is to note that if Texas were a country, it would rank eighth in reported executions worldwide, right behind North Korea and the rest of the United States, but way ahead of countries such as Somalia and Afghanistan.

Moral Principle, Political Reality

For decades many in the United States have opposed capital punishment on moral and religious grounds. Such a culturally conservative force as the American Catholic hierarchy has repeatedly denounced the practice as a violation of the sanctity of human life. To me, one of the most cogent moral arguments against the death penalty came from Pope John Paul II. He argued time and again that the only possible justification for capital punishment (or any use of deadly force) would be strict self-defense—which rules out the death penalty in almost every conceivable circumstance. That’s because, as John Paul noted, there are many other ways of protecting society against a killer, ways known collectively as the modern penal system.

As someone who dislikes capital punishment for more or less those reasons, I’d be happy to give the credit for its decline to the abolitionists and their excellent principles. But I’d be kidding myself.

It’s not moral revulsion against the whole idea of capital punishment that has thinned the execution ranks. It is the well-founded fear of executing the innocent, a real possibility brought to light not by moral arguments but by the evidentiary wonders of DNA, which has led to multiple exonerations in recent years. Polls show that most Americans, though a declining number, still support capital punishment at least in theory, and the basic reason is that most inmates on death row are not innocent. They’re guilty as hell.

So, Americans haven’t yet had a moral conversion on this issue. And that’s okay. In a pluralistic society, citizens—even those on the same side of an issue—will bring diverse values and considerations to the table of public conversation. When it comes to the death penalty, some worry about faulty procedures that could lead to wrongful execution or simply about the costs of seemingly endless appeals. It’s the job of others including the theologically motivated to add moral principles to the mix, and to do so with humility and what the Declaration of Independence refers to as a “decent respect” for the opinions of humankind. It’s fair to say that many different opinions have coalesced to put the greatest pressure on capital punishment in decades.

Counting on Conservatives

What might eventually tip the scales toward abolition is not liberal outrage but conservative caution. True, many conservatives have taken the untenable view that government—which, in their minds, is incapable of adequately performing a simple task like creating a construction job or an affordable housing unit—is somehow so adept and infallible that it can be trusted to make ultimate decisions about life and death. This logic is no longer flying with increasing numbers of Americans, however. And they include many who lean right.

The last words here go to Richard Viguerie, a father of what used to be called the New Right, now known as the Tea Party.

Conservatives have every reason to believe the death penalty system is no different from any politicized, costly, inefficient, bureaucratic, government-run operation, which we conservatives know are rife with injustice. But here the end result is the end of someone’s life. In other words, it’s a government system that kills people (his emphasis)….

The death penalty system is flawed and untrustworthy because human institutions always are [my emphasis]. But even when guilt is certain, there are many downsides to the death penalty system. I’ve heard enough about the pain and suffering of families of victims caused by the long, drawn-out, and even intrusive legal process. Perhaps, then, it’s time for America to re-examine the death penalty system, whether it works, and whom it hurts.