In the NSA Debate, Where’s the Common Good?

TheoPol is off its weekly schedule, running occasionally during the summer.

As I scan the headlines and hear the radio talk about the federal surveillance program, one thought keeps coming to me: Why don’t I give a poop about any of this?

Maybe it’s because I don’t understand the implications of collecting domestic telephone data. Or maybe it’s because I cling to the rustic notion of the common good, in which personal liberties are of course balanced with the needs of community. That would basically mean balancing my right not to be surveilled with our need not to be bombed.

There’s a chance I’d react differently if the NSA’s algorithms were to spit out a particular innocent person—me. And I guess there are real questions that need to be answered about the NSA program, questions framed well by the Times today. But I don’t feel that the government is necessarily trampling upon my liberty by scanning for networks and patterns of telephone use. Google already knows more about me than I know about me.

And then there’s that quaint idea of the common good. What is it, anyway? Someone in the field of Catholic social ethics once said that defining the common good is like trying to nail Jell-O to a wall. But that hasn’t stopped theologians and church authorities from hammering away at it.

For instance, the Second Vatican Council defined the common good as “the sum of those conditions of social life which allow social groups and their individual members ready access to their own fulfillment.” There goes the gelatin, dribbling from the wall.

The Catechism of the Catholic Church tried to get more of a handle on the concept by breaking it up into pieces. The Catechism cited three components of the common good: 1) “respect for the person” (including individual freedom and liberties); 2) “social well-being and development” (including rights to basic things like food and housing); and 3) peace—“that is, the stability and security of a just order,” the Catechism said.

It’s abstract, but I like it. The Catechism’s rendering makes it clear that this principle is about balancing, not choosing between, various personal and social goods.

But I think the common good will always be subject to the Potter Stewart rule of knowing it when you see it. I see it in a raft of initiatives like gun control and progressive taxation, and yes, maybe even in Obama’s surveillance program. The critics of that program have real concerns about personal liberties, but these ought to be balanced with “social well-being” and “the stability and security of a just order.” The common good would seem to call for that.

The Moral Minimum: Part 2

Filed under the heading of everybody-and-his-aunt-wants-a-higher-minimum-wage:

Madeline Janis, on “Moyers & Company”

And we kept seeing this, something that we thought was wrong. We had to be in an Alice in Wonderland story or something. We would see a “Romney for President” sign and a pro-Tea Party for Congress and “Yes on the Living Wage,” all on the same lawn. And that’s because the idea of a living wage for people and their neighbors to be able to spend money in local stores resonated.

Madeline Janis made this comment in a Bill Moyers PBS interview earlier this month. She led a campaign in Long Beach, California, to enact a startling $13-an-hour minimum wage—specifically for hotel workers in that city. That’s almost six dollars above the $7.25 per hour federal minimum. The measure appeared on the ballot last November and passed easily with 63 percent of the vote.

In the interview, Janis’s main point was that small business owners rallied behind the voter referendum. Their reasoning was, “We want more customers. We want these hotel workers to be able to buy our clothes and our food,” as she related.

But surely, this is an anomaly. Or is it? Small business owners are typically cast as dogged opponents of the minimum wage. Is it possible that most are actually in favor of jacking up the minimum?

It’s more than possible.

Late last month, the organization Small Business Majority released the results of a national poll on raising the minimum wage. Small business owners were asked whether they agree or disagree with the following statement:

Increasing the minimum wage will help the economy because the people with the lowest incomes are the most likely to spend any pay increases buying necessities they could not afford before, which will boost sales at businesses. This will increase the customer demand that businesses need to retain or hire more employees.

Nearly two-thirds (65 percent) of those surveyed agreed with this boilerplate case for a more generous minimum wage. What’s more, 67 percent of these business owners agreed with the idea of raising the minimum (a dollar figure wasn’t specified) and “adjusting it yearly to keep pace with inflation.”

You might ask: Was the polling sample skewed toward bleeding-heart liberals, the kind who set up shop in hip districts of Boston and Southern California? It doesn’t seem that way. Forty-six percent of the respondents identified themselves as Republican, 35 percent as Democrat, and 11 percent as independent.

People like me often talk about the need to nurture a moral consensus on important questions facing our society. But I find it hard to talk that way, when it comes to the minimum wage. And that’s because we already have a moral consensus on that issue. (See my previous post, on public opinion.)

Apparently, most Americans agree pretty much with Martin Luther King: “There is nothing but a lack of social vision to prevent us from paying an adequate wage to every American [worker] … ” But for some reason, our political system today is unable to process this conviction. The minimum wage, adjusted for inflation, remains lower than it was when King fell to the assassin’s bullet in 1968. Special interests are trumping national consensus.

It’s clear that public sentiment in favor of a higher minimum wage is powerful. The problem is that the American people aren’t.

TheoPol will skip the week of Memorial Day and resume the following week.

The Moral Minimum: Part 1

If the word “democracy” means anything, it means that the people usually wind up getting their way—after careful deliberation by representative bodies and broad public debate. Much has been made of the fact that the American people haven’t gotten their way lately with regard to gun control. Recent polls indicated that nearly 90 percent of Americans thought universal background checks were a sensible idea, but 46 members of the U.S. Senate disagreed, enough under the chamber’s 60-vote threshold to prevail. As a result, a modest bill to that effect was gunned down.

Gun control is probably not the most eyebrow-raising case of public sentiment ignored, however. That distinction might well go to a bread-and-butter issue: the minimum wage.

The people began favoring stricter gun laws only recently, in the wake of the Newtown massacre, and it appears the trend is already letting up. On the other hand, for decades polling has shown support for a higher minimum wage running somewhere between 70 and 90 percent, depending on factors including the size of the raise. Americans aren’t polarized on this issue; the politicians are.

In March, a Gallup poll found that 71 percent of the people favored President Obama’s proposal to lift the bottom wage to $9 an hour. That’s $1.75 more than the current federal minimum; it would also be the largest increase ever passed by Congress. Past polling indicates that if people were simply asked whether they support an unspecified increase in the minimum, or a somewhat smaller one, the backing would be even stronger.

50 Percent of Republicans

Try to identify a single major subgroup of Americans that doesn’t want to see the minimum wage go up.

You’d think, for example, that self-identified conservatives would be pretty down on the idea. They aren’t, according to the Gallup survey. They favored the $1.75 hike by a clean 54-44 percent margin. Meanwhile, the support among self-identified “moderates” was rather immoderate (75 percent). Republicans were the only subgroup that didn’t give clear majority support to the proposal—but even they backed it by a plurality, 50-48 percent.

And keep in mind that we’re talking about a relatively big jump for the minimum wage. The numbers, again, would undoubtedly be higher if the boost were smaller. Very, very few people would be opposed to a raise, in principle.

There appears to be a common moral sense among Americans that a full-time wage shouldn’t keep a family in poverty; it should get a family out of poverty. Whether the federal minimum wage is the only way to do that is, of course, debatable (there’s also the Earned Income Tax Credit, for instance). In any event, Obama’s $9 an hour wouldn’t get a family there. It would deliver a $3,000-a-year raise to minimum-wage workers, a bump up to $18,000 a year. That’s more than four thousand dollars below the official (and badly outdated) federal poverty line for a family of four.

And that’s why liberal Democrats recently pushed a bill that would have ramped up the minimum to $10.10 an hour by 2015. Even that higher amount is quite a bit lower than what the minimum wage would be today if it had merely kept up with inflation since the late 1960s. There were no takers, however, on the other side of the aisle.

On March 15, Republicans in the House of Representatives unanimously rejected the $10.10 proposal. Six Democrats joined them in voting it down, 233-184. If there’s a common moral sense on this issue, it doesn’t seem to be broadly shared in Congress.

Note: for Part 2, go here.

Inertness, U.S.A.

Posted earlier today at Tikkun Daily.

Part of what fascinates me about the civil rights struggles of the 1960s is that, through these upheavals, America changed. Compare that to today’s inertness: we can barely budge on gun control and the minimum wage (to take two examples), despite overwhelming support among Americans for change on those fronts.

Yes, there are real questions about how much progress towards racial justice we’ve made. What’s clear is that a little over a year after the May 1963 “children’s crusade” in Birmingham, Alabama, we had the Civil Rights Act of 1964. And five months after the Selma to Montgomery march came the Voting Rights Act of ‘65. Which particular piece of landmark legislation has followed the Occupy Wall Street protests?

More to the point: How did change happen, half a century ago?

That question often comes up—and is answered all too readily. Many are quick to credit the vision, courage and sacrifice personified by the civil rights heroes. Others just as quickly recite with Bob Dylan that the times they were a-changin’. (Consider the reforms that washed over the Catholic Church during those years at the Second Vatican Council, which bookended Birmingham and the Civil Rights Act from 1962 to 1965.) Still others would single out the strategy of nonviolent confrontation, the purpose of which was to create an air of crisis.

One could also be impressed by the accidents of that history, arguably including the career of Martin Luther King. Earlier this year, I wrote about how, in 1954, the young MLK had a dream—to become a tweedy tenured theology professor. A year later, Rosa Parks sat on the bus and catapulted the reluctant neophyte pastor into the leadership of the Montgomery Bus Boycott. There was no turning back.

Add to this the accidental presidency of Lyndon Baines Johnson. One could argue we wouldn’t have had a Civil Rights Act in 1964 or a Voting Rights Act in 1965, without LBJ in the White House. Or those landmarks might not have been enacted until later. But it’s also true that King, Parks, and other storied figures, with their moral vision and mass movement politics, expanded the realm of the possible. That enabled Johnson to work his legislative magic.

Mysteries of Social Change

In their 2010 book, Switch: How to Change Things When Change is Hard, Chip Heath and Dan Heath made the simple observation: “For anything to change, someone has to start acting differently.” Nonviolent direct action was one clear innovation. As King explained in his 1963 Letter from Birmingham Jail, those who engage in such resistance are not “the creators of tension. We merely bring to the surface the hidden tension that is already alive,” in an unjust system. In Birmingham, the explicit strategy was to bring the brutality of segregation into the open by provoking it.

In addition, during the early 1960s King and other spiritual radicals—notably his friend, Abraham Joshua Heschel—resurrected the tradition of prophetic discourse. That is, the style of denouncing social evils and chastising the powers that be, while envisioning a radically better future, as King did in his “I Have a Dream” speech in August 1963. Such a religious challenge to the status quo was a far cry from the soothing spiritual happy talk of the 1950s. King and company issued their jeremiads, but they also usually managed to join prophecy with civility, social struggle with social friendship.

Those varied elements converged in Birmingham 50 years ago. In early May of 1963, thousands of children as young as six years old strode out of schoolhouses to join in the marching downtown. And, in a bracing display of cognitive dissonance, King declared: “Bomb our homes and go by our churches early in the morning and bomb them if you please, and we will still love you.”

During the protests, King projected through his megaphone not only resoluteness, but also a longing for what he had limned on other occasions as a “beloved community.” It was a vision of solidarity between whites and blacks, rich and poor. And it was vitalized not just with love but with power, with both confrontation and a spirit of cooperation.

Whether that rare combination of moral and political sensibilities made the civil rights crusade successful is hard to say with certainty. There are too many imponderables. It should be noted too that King, depressed and guilt-ridden at the end of his abbreviated life, began to see himself as a failure, partly due to the unrealized dream of economic justice for all, both blacks and whites.

What we know is that by the end of the Birmingham campaign, there were thousands of freedom-chanting children jamming the city’s prisons. There was the thick air of crisis that King and others had prayed for, and there were the heartfelt pleas for love and reconciliation in the throes of intense agitation. All that provided what every movement for social change seems to need—the element of surprise.

I wouldn’t venture much further in trying to explain the developments of May 1963, any more than I’d pretend to unravel the mysteries of change. Perhaps these are best left as perennial questions.

May 2, 1963

D-Day in Birmingham

On this day 50 years ago, African American children began laying their little bodies on the line, in Birmingham, Alabama. Streaming out of schoolhouses by the thousands, they poured into downtown to join in the civil rights demonstrations led by Martin Luther King. My friend Kim Lawton has crafted the best piece of broadcast journalism I’ve seen or heard, on that extraordinary moment in America’s history.

This past weekend she filed the report for PBS’s Religion & Ethics Newsweekly, and one of the people she tracked down was Freeman Hrabowski III, now president of the University of Maryland, Baltimore County. He was 12 years old when he came up against the arrayed forces of Bull Connor. Birmingham’s commissioner of public safety issued the order to turn fire hoses on the young, nonviolent protesters and to unleash German shepherds on them.

The water came out with such tremendous pressure and, uh, it’s a very painful experience, if you’ve never been hit by a fire hose, and I thought, whoa. You know, I got knocked down and then we found ourselves crouching together and trying to find something to hold onto. People ran, people hid, people hugged buildings or whatever they could to keep the water hoses from just—just knocking them here and there.

After Lawton further described the scene with the police dogs and billy clubs, Hrabowski continued.

The police looked mean, it was frightening. We were told to keep singing these songs and so I’m singing, [he sings] Ain’t gonna let nobody turn me ‘round … keep on a-walk’n, keep on a-talk’n, march’n on to freedom’s land. And amazingly the other kids were singing and the singing elevates when you can imagine hundreds of children singing and you feel a sense of community, a sense of purpose.

And then …

There was Bull Connor, and I was so afraid, and he said, “What do you want little nigra?” And I mustered up the courage and I looked up at him and I said, “Suh,” the southern word for sir, “we want to kneel and pray for our freedom.” That’s all I said. That’s all we wanted to do. And he did pick me up … and he did spit in my face, he really—he was so angry.

For weeks, the protests against Birmingham’s segregated public facilities had been for adults only. Those acts of civil disobedience (marching without permission) had little effect, however. They were petering out by the time of the so-called “children’s crusade.” It was during April of ’63 that King also wrote his “Letter from Birmingham Jail,” but that literary classic fell on deaf ears at the time, as Robert Westbrook relates in his piece about the 50th anniversary of the letter, in the April 8 Christian Century. (A half-century later, King’s letter has finally received a proper reply from a group of tardy clergymen, as Adelle Banks reported last month in Religion News Service.)

The children’s crusade turned around the Birmingham campaign—and the nation. It prompted John F. Kennedy, a month later, to go on national television and call for civil rights legislation.

In a recent post, I floated a broader question: How did it happen? How did America change so quickly (there’s room for debate about the degree of change), and on the most polarizing issue of the time, race? I’ll get back to that next week.

Sacred Space, at the Corner of Boylston and Berkeley

At Boylston and Berkeley, 8:00 a.m., Monday April 22

Prepared for today’s edition of Tikkun Daily.

Two days after the Boston Marathon bombings, Massachusetts Governor Deval Patrick was asked in a public radio interview if there would be a permanent memorial to the victims of that horrific act. Patrick understandably felt it was too early to speculate about such a memorial—this was before the dramatic lockdown of Boston and surrounding communities. He went on to say that the most fitting tribute would be to return next year with the biggest and best marathon ever.

That surely would be a testimony to the city’s spirit, but it seems the governor, as a good technocrat, was missing the point. Fact is, people were already finding makeshift ways to memorialize the event. And if past atrocities are a guide, they’ll eventually find a permanent space for that solemn purpose.

If I didn’t know this already, I’d have found out just by standing for a few minutes near Copley Square this past Monday morning, at the intersection of Boylston and Berkeley streets.

Boylston, a crime scene, was still closed at the time. But people stood silently on a sidewalk at the corner, leaning against a police barricade in front of a pop-up memorial. They gazed at the flowers, flags, candles, handwritten notes, and other items left by anonymous people. They stared at three white crosses in the center of that growing memorial—in remembrance of the three who perished in the twin bombings of April 15. The shrine to eight-year-old Martin Richard was teeming with teddy bears, balloons, and children’s books.

People will memorialize, because they know hallowed ground when they see it. It’s extraordinary, when you think about it—how the heinous and the hallowed can share the same space, how a site of evil can be transfigured as holy. But this seems to happen every time. It happened at the Twin Towers, at the Murrah Federal Building in Oklahoma City, at Pearl Harbor, and most profoundly, at Auschwitz. Each of those names marks out a distinct space in the timeless realm of evil. And each space is inviolable.

But how about Boylston Street, or a consecrated corner of it? Is it now part of this geography of the sacred? It is, if you think of such space the way historian Edward Linenthal does. In an interview adapted in a book I did some years ago with Bob Abernethy of PBS’s Religion & Ethics Newsweekly, he said:

My definition of a sacred space is a simple one. Any place that’s capable of being defiled is by definition sacred. You can’t defile ordinary space. Any place that for a group of people is so special that a certain way of being there would be an act of disrespect means that that place is charged with a particular kind of meaning.

Linenthal, who now teaches religious studies at Indiana University, continued:

I tell my students, if they were sitting in the parking lot at K-Mart with a boom box, no one’s going to really care. They might be irritated that the noise is too loud. But if they had a boom box at Gettysburg or in the grove of trees at Shanksville [into which United Airlines Flight 93 crashed on September 11, 2001, in Pennsylvania] or in a church, a mosque, or a temple, it would be considered an act of defilement.

This is why questions about what to do with these places fraught with meaning can be so vexing and contentious. Consider the 9/11 memorial in Manhattan. The decision to store the unidentified remains of victims in an underground repository—rather than a more visible place of tribute—stirred resistance from victims’ families.

Sure enough, a debate erupted this past week over the impromptu memorial at Boylston Street—how to preserve it, where to move it. Such a discussion would have been ludicrous, if this were ordinary space. If it were incapable of being defiled.

And that’s just a prelude. A few days ago, Boston Mayor Tom Menino’s office let it be known that the process of figuring out how to permanently memorialize the bloodshed at Boylston has begun.

When Liberals Feared Equality

This piece was posted earlier today at Tikkun Daily.

Late one evening in April 1963, Dick Gregory came crashing through the door of his Chicago apartment – drunk – and was informed by his wife that the president of the United States was looking for him. As Diane McWhorter related in her 2001 book, Carry Me Home, about the drive to desegregate Birmingham, Alabama, the comedian returned the phone call to the White House and spoke with John F. Kennedy, who reportedly told him, “Please, don’t go to Birmingham. We’ve got it all solved. Dr. King is wrong, what he’s doing.” Gregory, a celebrity at 30 years old, replied – “Man, I will be there in the morning.”

Kennedy and his aides were hardly the only ones pleading for racial calm in that place, 50 years ago. Birmingham’s liberal white clergy and even its black newspaper had urged Martin Luther King Jr. (who died 45 years ago, on April 4) to jettison plans for a campaign of nonviolent direct action. They feared that an escalation of tactics would only make the segregationists angrier.

It’s not that the city’s men of the cloth were devoted to milder tactics. Christian pastors had looked upon civil rights not as a moral problem, which would rightly claim their attention, but as a political one, which would not; Jewish leaders, opting to sit out the battle of Birmingham, viewed segregation as a “Christian problem” between whites and Negroes, McWhorter notes. The campaign was foundering in early May when King, desperate, resorted to letting schoolchildren join in the civil disobedience (which essentially involved marching without permission).

A month later, Kennedy – who had said publicly that he was “sickened” by televised images of police dogs and fire hoses mowing down children – sent a civil rights bill to Congress. A year after that, the Civil Rights Act of 1964 became law.

That struggle for racial justice is often held up as an example of how change is possible. And its stories have helped instruct movements of nonviolent resistance in countries ranging from the Philippines to Poland to South Africa. But how was change possible at that time?

These days the lack of progress in our politics is a given, and it is usually chalked up to fierce polarization, chiefly between Democrats and Republicans. As it is today, the national politics of 1963 (certainly on the domestic front) was deeply fractured along ideological lines between liberals and conservatives, if not strictly between Democrats and Republicans. Still, change happened – and on the most flammable question, race.

How?

I’ll let that question float for now. And I’ll listen in on conversations this month surrounding the 50th anniversary of the Birmingham campaign.

Sightings of Moral Life in the Deficit-Hawk Universe

Jeffrey Polet: An unlikely advocate of single-payer healthcare.

After Paul Ryan unveiled another one of his trademark balancing-the-budget-on-the-backs-of-the-poor plans, I found myself asking again, What’s the moral grounding for this fiscal sternness?

I raised that question in an item posted late last month. At the time I noted that while faith-based objections to draconian budget cuts are familiar enough, the moral and religious case in favor of such slashing is less clear. I promised to keep an eye out for real moral content in the arguments for balancing the government’s books.

In my search for such reasoning, I’ve scanned blogs, checked in on publications catering to fiscal conservatives, and broached the question with friends. I’ve also happily made the acquaintance of Jeff Polet, a scholar, writer, and not-so-predictable conservative.

Polet is a political scientist at Hope College in Holland, Michigan, and a senior editor of the conservative online journal Front Porch Republic. He provided some evidence for the existence of moral and theological thinking in the deficit-hawk universe. For example, many liberals who speak on budget matters invoke values such as compassion and solidarity. Polet was just as quick to cite other legitimate virtues—temperance and prudence, among them.

“We’re spending money we don’t have,” he told me by phone in an interview I did for Our Sunday Visitor. “The bottom line is that we want a full range of services and we don’t want to pay for them.” He continued, “It’s a combination of greed, intemperance and a kind of luxuriousness. In an older time it would have been called decadence.”

When I asked him who the greedy are, he pointed to “interest groups” that oppose any cuts in programs that affect their constituencies, and fingered the AARP. I’d find it hard to pinpoint the elderly as a glaring source of national greed, not in these plutocratic times, anyway. But let’s stay on this trail.

As I noted previously, perhaps the only well-known moral claim on the fiscal right is a generational one—that we are saddling our children and their children with a crushing debt burden. Polet, a Catholic convert, roots the generational concern more deeply and evocatively in Scriptures. He pointed to the familiar biblical motif of inheritance (as in Genesis — “Abraham gave all he had to Isaac”).

“There’s this idea that parents owe their children an inheritance. You don’t take your inheritance and squander it, to the disadvantage of your own progeny,” said Polet, who chairs the political science department at Hope, an ecumenical Christian institution with Calvinist roots. “And that’s what I see us doing,” he added. “We’ve taken the cultural, financial inheritance we’ve been given, and we’ve squandered it in a lot of ways. So the world that we’re giving our children doesn’t seem to be as well-ordered as the world we inherited, certainly not from a financial viewpoint.”

I asked Polet if there might be another way of looking at the moral question of intergenerational solidarity. Do our obligations to the future extend only to the national debt? Or does the “well-ordered world” also need to include good schools, a solid infrastructure and a clean environment — which would require public investment now?

All that is part of a balanced way of looking at fiscal obligations, Polet acknowledged. “But if the debt problem gets too out of control, it’s going to make all those other things impossible,” he argued, falling back on a much-debated policy point (that our debt is unsustainable).

The Real Surprise

This will do as a moral and religious case for fiscal hawkishness (and of course Polet has much more to say in his own writings). I didn’t come across much of that elsewhere—even among theocons, conservative religious types. I was unimpressed, for instance, by the Acton Institute for the Study of Religion and Liberty’s “Principles for Budget Reform,” which barely even try to root policy assertions in moral or theological soil. The same goes for something called Christians for a Sustainable Economy, a largely evangelical ad hoc group that seems more ideological than biblical.

But Polet’s attention to moral and biblical foundations is not really what surprised me. I assumed that at some point I’d run into such thoughts among deficit foes. What I found intriguing were a few of his policy conclusions.

Here’s one: After arguing like many conservatives for scaling back Medicare, Polet added—“At this point, America would be better off going to a single-payer system.” The single payer, of course, would be the government, as national health insurer. He thinks this radical approach might be the only way to control healthcare costs in the future.

Needless to say, principled liberals have been making this particular case for quite some time. But I’ve never heard it from a conservative—maybe not even from a centrist. That gives me hope for a richer and less predictable dialogue on budgets and values.

Fumbling and Fallibility at the Vatican

One of the many questions being asked about Pope Francis is whether he’ll be able to get a handle on the unruly and unpredictable Roman curia, the central administration of the Catholic Church. In the past year, that governing body has delivered such spectacles as the case of the pope’s butler, the so-called “Vatileaks” affair, and a continuing corruption scandal at the highest levels of the Vatican bank. Infighting and skullduggery have made it clear that Vatican politics, like the secular variety, can be all too human and at times brutish.

Partly with that in mind, a number of Vatican experts are saying that the new successor of St. Peter needs to have the skill set of a CEO, to manage the unmanageable. I don’t know if that’s necessary (or sufficient). Pope John Paul II was not especially noted for his managerial brilliance, but he was able to transcend the bureaucracy and project a global presence that overshadowed it. The curia was generally trying to keep up with him, not the other way around.

But now, many are asking an oddly necessary question about the most famously hierarchical organization on earth: Who’s in charge there? John Thavis, a longtime Rome correspondent, digs deeply into the paradox in his new book, The Vatican Diaries (Viking). My review of the memoir appears in the current edition of America magazine, and here it is, in full:

After turning the last pages of The Vatican Diaries, I noticed an Associated Press item that began, “The Vatican praised President Barack Obama’s proposals for curbing gun violence.” The report was based on a radio commentary by the Vatican press secretary, Federico Lombardi, S.J., on Jan. 19. Those who read John Thavis’s vivid recollections in The Vatican Diaries will have cause to be at least initially skeptical whenever they hear that “the Vatican” said this or that definitively about anything.

Recently retired as the longtime Rome bureau chief of Catholic News Service, Thavis argues that the popular image of the Vatican as a monolith, eternally on message, is a myth. On the contrary, it “remains predominantly a world of individuals, most of whom have a surprising amount of freedom to operate—and, therefore, to make mistakes,” he writes.

Re-enter Father Lombardi.

The author tells of an incident when Lombardi, during Pope Benedict XVI’s visit to Jerusalem in 2009, lashed out at “lies” circulating about the young Joseph Ratzinger in Nazi Germany. “The pope was never in the Hitler Youth, never, never, never!” the Vatican spokesman declared to an incredulous press. The problem was that Ratzinger’s Hitler Youth involvement was a matter of historical record. As Thavis explains, Lombardi (whom he describes otherwise as “a gentle soul with a sharp mind”) had overheard the papal secretary remark offhandedly at breakfast that Ratzinger was never an “active” Hitler Youth member. By lunch, the misconstrued comment had become the Holy See’s “latest media fiasco.”

Thavis points to the “fragmented chain of command in what is arguably the world’s most hierarchical organization,” and he relishes the irony. For him, the fumbling and fallibility humanize the institution. But not even the bureau chief was charmed by another episode he recounts that revealed both bungling and deception.

Thavis unfolds the story in a riveting chapter titled “Cat and Mouse,” about negotiations between Rome and the ultra-traditional Society of St. Pius X. Some at the Vatican sympathized with the breakaway order and saw no need to inform top officials that one of four traditionalist bishops whose excommunications were being lifted as part of a reconciliation effort, Richard Williamson, was a Holocaust denier. But most of those who could have averted this particular fiasco—the Williamson affair became one of the biggest religion stories of 2009—were not scheming. They were just snoozing. In the end, the pope admitted publicly that anyone with an Internet connection could have known of the bishop’s bizarre anti-Semitism.

In recent years I haven’t followed Catholic News Service closely, so I’m not sure how much of the book would have been politically incorrect and therefore not publishable in that official news outlet. But I’m guessing Thavis did not often portray Benedict unflatteringly alongside his immediate predecessor, as he does in this memoir.

Here is how the author, with help from Bob Dylan, teases out one contrast at the start of his last chapter, “The Real Benedict”:

The first thing I noticed was the twitching leg. It was dark backstage, but I could make out the slight figure standing at the edge of the platform. He wore a black suit with a white stripe running down the side, and his right leg was jerking up and down involuntarily. It had to be Dylan. And he must be nervous, I thought. Singing for the pope was not an everyday thing.

The performance took place at a Eucharistic congress in Bologna in 1997. Pope John Paul II followed with some reflective riffs on “Blowin’ in the Wind,” evoking the Holy Spirit in motion. Meanwhile, back at the Roman Curia, Cardinal Ratzinger was exuding disapproval, openly disparaging Dylan and other pop icons as “false prophets.” As Thavis writes in another chapter, John Paul traveled to remote lands to be with “tribal dancers in feathered headdresses.” Benedict prefers sitting “in a concert hall filled with dignitaries like himself, listening to Mozart.” John Paul projected a spirit of openness to the wide world. Benedict? Not so much.

Thavis also looks probingly at how the AIDS pandemic has provoked genuine debate within the Vatican about the use of condoms to prevent transmission of the disease. That aside, I was surprised to find little in the book that throws light on global justice issues. During Thavis’s 29 years in Rome, Communism imploded in Eastern Europe, Jesuits and others were massacred in El Salvador, and two popes issued encyclical letters refreshing Catholic social teaching—to mention a few developments. But hardly any of that is recalled in these pages.

Then again, income stratification does not make the most scintillating subject matter for a book subtitled A Behind-the-Scenes Look at the Power, Personalities, and Politics at the Heart of the Catholic Church. And I’m glad Thavis has offered this rare, perceptive and highly readable glimpse into a power structure that is less in control than many would have us believe.