Resident Theologian

Brad East

Screentopia

A rant about the concern trolls who think the rest of us are too alarmist about children, screens, social media, and smartphones.

I’m grateful to Alan for writing this post so I didn’t have to. A few additional thoughts, though. (And by “a few thoughts” I mean rant imminent.)

Let me begin by giving a term to describe, not just smartphones or social media, but the entire ecosystem of the internet, ubiquitous screens, smartphones, and social media. We could call it Technopoly or the Matrix or just Digital. I’ll call it Screentopia. A place-that-is-no-place in which just about everything in our lives—friendship, education, finance, sex, news, entertainment, work, communication, worship—is mediated by omnipresent interlinked personal and public devices as well as screens of every size and type, through which we access the “all” of the aforementioned aspects of our common life.

Screentopia is an ecosystem, a habitat, an environment; it’s not one thing, and it didn’t arrive fully formed at a single point in time. It achieved a kind of comprehensive reach and maturity sometime in the last dozen years.

Like Alan, I’m utterly mystified by people who aren’t worried about this new social reality. Or who need the rest of us to calm down. Or who think the kids are all right. Or who think the kids aren’t all right, but nevertheless insist that the kids’ dis-ease has little to nothing to do with being born and raised in Screentopia. Or who must needs concern-troll those of us who are alarmed for being too alarmed; for ascribing monocausal agency to screens and smartphones when what we’re dealing with is complex, multicausal, inscrutable, and therefore impossible to fix. (The speed with which the writer adverts to “can’t roll back the clock” or “the toothpaste ain’t going back in the tube” is inversely proportional to how seriously you have to take him.)

After all, our concern troll asks insouciantly, aren’t we—shouldn’t we be—worried about other things, too? About low birth rates? And low marriage rates? And kids not playing outside? And kids presided over by low-flying helicopter parents? And kids not reading? And kids not dating or driving or experimenting with risky behaviors? And kids so sunk in lethargy that they can’t be bothered to do anything for themselves?

Well—yes! We should be worried about all that; we are worried about it. These aren’t independent phenomena about which we must parcel out percentages of our worry. It’s all interrelated! Nor is anyone—not one person—claiming a totality of causal explanatory power for the invention of the iPhone followed immediately by mass immiseration. Nor still is anyone denying that parents and teachers and schools and churches are the problem here. It’s not a “gotcha” to counter that kids don’t have an issue with phones, parents do. Yes! Duh! Exactly! We all do! Bonnie Kristian is absolutely right: parents want their elementary and middle school–aged kids to have smartphones; it’s them you have to convince, not the kids. We are the problem. We have to change. That’s literally what Haidt et al are saying. No one’s “blaming the kids.” We’re blaming what should have been the adults in the room—whether the board room, the PTA meeting, the faculty lounge, or the household. Having made a mistake in imposing this dystopia of screens on an unsuspecting generation, we would like, kindly and thank you please, to fix the problem we ourselves made (or, at least, woke up to, some of us, having not been given a vote at the time).

Here’s what I want to ask the tech concern trolls.

How many hours per day of private scrolling on a small glowing rectangle would concern you? How many hours per day indoors? How many hours per day on social media? How many hours per day on video games? How many pills to get to sleep? How many hours per night not sleeping? How many books per year not read? How many friends not made, how many driver’s licenses not acquired, how many dates and hangouts not held in person would finally raise a red flag?

Christopher Hitchens once wrote, “The North Korean state was born at about the same time that Nineteen Eighty-Four was published, and one could almost believe that the holy father of the state, Kim Il Sung, was given a copy of the novel and asked if he could make it work in practice.” A friend of mine says the same about our society and Brave New World. I expect people have read their Orwell. Have they read their Huxley, too? (And their Bradbury? And Walter M. Miller Jr.? And…?) Drugs and mindless entertainment to numb the emotions, babies engineered and produced in factories, sex and procreation absolutely severed, male and female locked in perpetual sedated combat, books either censored or an anachronistic bore, screens on every wall of one’s home featuring a kind of continuous interactive reality TV (as if Real Housewives, TikTok, and Zoom were combined into a single VR platform)—it’s all there. Is that the society we want? On purpose? It seems we’re bound for it as if our lives depended on it. Indeed, we’re partway there already. “Alarmists” and “Luddites” are merely the ones who see the cliff’s edge ahead and are frantically pointing at it, trying to catch everyone’s attention.

But apparently everyone else is having too much fun. Who invited these killjoys along anyway?

Brad East

Conversions, Protestantism, and a new mainline

Reflections on the appeal of Catholicism rather than Protestantism to public intellectuals as well as the possibility of a new conservative Protestant mainline in America.

Why do people convert to Christianity? Why do intellectuals and other public figures convert so often to Catholicism (or Eastern Orthodoxy) and so rarely to Protestantism? And what is the fate of both Catholicism and Protestantism among American elites and their institutions, given the decimation of the liberal mainline? Could a new mainline arise to take its place, and if so, who would it be and what would it look like?

Dozens of writers have taken up these questions in recent weeks, some (not all) prompted by Ayaan Hirsi Ali’s conversion and her written explanation for it. Here’s Douthat and Freddie and Tyler Cowen and Alan Jacobs (and Alan again). Here’s Justin Smith-Ruiu. Here are two reflections about why Catholicism instead of Protestantism. And here is a series of pieces by Jake Meador on both the “new mainline” question and the “why Catholicism” question—with a useful corrective by Onsi Kamel.

I’ve got some belated thoughts; in my mind they connect to all of the above.

  1. It’s worth making clear at the outset that countless people defect annually from Catholicism and Orthodoxy, whether into unbelief or into some Protestant sect. So the question isn’t about who’s winning or which group people in general prefer or comparing overall numbers. The question is about public figures and intellectuals and their conversions, as adults, from unbelief to faith. Why does that type of person always seem to be joining “catholic” traditions (defined, for now, as Roman Catholicism, Eastern Orthodoxy, and perhaps also the Anglican Communion)?

  2. Summed up in a single sentence, the reason as I see it is that Catholicism is a living tradition embodied in a global institution that stretches back millennia, claims divine authority, and contains both a storehouse of intellectual resources and a panoply of powerful devotional and liturgical practices. Let’s unpack that.

  3. Catholicism is a world. Protestantism is not. Protestantism is not anything particular at all. It’s an umbrella or genus term that encompasses numerous unconnected or at best half-related Christian traditions, the oldest of which goes back five hundred years and the newest of which is barely older than a generation. There are not “Protestants,” somewhere out there. No ordinary layperson says, “I’m Protestant.” What he or she says is, “I’m Presbyterian” or “I’m Methodist” or “I’m Pentecostal” or “I’m Evangelical” or “I’m Lutheran” or “I’m Church of Christ” or “I’m Moravian” or “I’m Calvinist” or “I’m Baptist” or some other name. And the thing about midlife conversions on the part of public intellectuals is that they aren’t looking for a sub-culture. They’re looking for a moral and spiritual universe. They don’t want a branch of the tree; they want the tree itself—the trunk, the very root. “Protestantism” makes no exclusive claims to be the trunk as such. Its trunkness is never even in view. The question, therefore, is almost always whether Catholicism East or West is, properly speaking, the Christian trunk. Folks already in the West typically, though far from always, opt for the West’s claim of primacy.

  4. Note well that this observation isn’t per se a critique of Protestants or a presumption against them. The fundamental feature of Protestantism is an ecumenical evangelicalism in the strict sense: a Christian whole created and sustained and defined by nothing other than the gospel itself, so that second-order sub-gospel confessional identities are subsumed in and comprehended by God’s singular work in Christ, which is the sovereign word proclaimed by the good news. In this way, according to Protestants, any and all attempts to be, or searches to find, “the trunk” are a distortion of true catholicity.

  5. Be that as it may, the catholicity of Catholicism tends to be what wayward, agnostic, restless public intellectuals are after. And so they find it elsewhere than in Protestantism.

  6. There is a reason why so many evangelical and Protestant graduate students in theology move toward “higher church” traditions. Intellectually, they discover thinkers and writings their own “lower church” traditions either ignore or lack; liturgically, they discover practices handed down century after century that function like a lifeline in a storm. Reading Saint Ignatius or Saint Justin or Saint Irenaeus or Saint Augustine, it occurs to them that they don’t have to imagine what the church’s ancient liturgy looked and felt like; they can simply visit a church down the street.

  7. Speaking only anecdotally, I have never known students of Christian theology to move “down” the ecclesial ladder. I have only known them (a) to move “up,” (b) to move “left,” or (c) to move “out.” That is, relative to where they started, they go catholic, they go liberal, or they go away, leaving the faith behind. This remains true even of those who do not shift from one tradition or denomination to another: Baptists start reading Aquinas, evangelicals start celebrating Ash Wednesday, non-denom-ers start reciting the Creed. Or, if the move is lateral instead of vertical, one retains inherited beliefs and practices but changes on moral and social questions. Either way, “down” is not an option in practice.

  8. Once again this fact, or observation, need not mean anything in itself. The populist or evangelical criticism might well be apt: Theological education places obstacles between students and the plain gospel. A student of theology “classes up,” and is thereby rendered unable to join “lower” classes in the purity of normal believers’ unadorned worship. Perhaps, then, this is an argument against the sort of theological education dominant today!

  9. All this applies, mutatis mutandis, to public intellectuals. Put another way, suppose you are an atheist or agnostic exposed, over time, to the desert fathers, or to the pro-Nicene fathers of the fourth and fifth centuries, or to Saint Maximus Confessor or Saint John of Damascus, or to Benedictine monks, or to Saint Thérèse of Lisieux, or to Julian of Norwich, or to Saint Francis or Saint Bernard or Saint Anselm. It would simply never occur to you that what you find in these authors is what you’d find in the Methodist congregation on the corner, or the Baptist church around the block, or the non-denom start-up across town. Not only do the devotional and liturgical, spiritual and theological worlds conjured by these writers and texts not exist in such spaces; the traditions themselves do not claim the figures in question. You go, therefore, to the people and the places who are bold enough to say, “Those names are our names; those saints are our saints; those books are our books. We nurture and preserve and pass them on. Come learn them from us; indeed, come learn from us what they learned themselves, in their own time.”

  10. In sum: What intellectuals, especially agnostic intellectuals in midlife, are restlessly searching for is something not man-made, but divine; not provisional, but final; not a question, but an answer. They are looking for rest, however penultimate in this life, not more open-ended restlessness. Something that lasts. Something that can plausibly make a claim both to antiquity and to permanency. A bulwark that will not fail. Something to defer to, submit to, bow one’s head in surrender to; something to embrace and be embraced by: a teacher but also a mother. And the truth is that Rome plausibly presents itself as both mater et magistra, the pillar and bulwark of the truth. Orthodoxy does as well. The plausibility explains why so many intellectuals find safe harbor with each of them. The reverse, in turn, explains why so few of those sorts of people convert from rudderless adult atheism to Protestantism with a capital-p.

  11. As for motives, if what I’ve outlined so far is true, then it makes perfect emotional sense for restless brainy seekers whose spiritual midlife crisis is prompted by perceived civilizational decline, torpor, and decadence to turn to catholic Christianity, East or West, as a haven in a heartless, spiritless, lifeless world. They aren’t making a category error, nor are they (necessarily) joining the church in a merely instrumental sense. For all we know, their search for capital-t Truth in a culture that refuses the concept altogether may be wise rather than self-serving. As Alan remarked, “what matters is not where you start but where you end up.” Doubtless there are people who join Christianity as a cultural project; must they remain there forever? I see no reason why we must, as a matter of necessity, say yes, for all people, always, in every circumstance. No adult is baptized without a confession of faith; if a new convert makes an honest confession and receives the grace of Christ’s saving waters, then he or she is a new creation, God’s own child, whatever the mixed motives involved. To say this isn’t to worship the God-shaped hole in our hearts instead of God himself. It’s to acknowledge, from the side of faith, that the hole is real. Because the hole is real, different people will find themselves knocking on Christ’s door—which is to say, on the doors of the church—for every manner of reason in every manner of situation. What Christ promises is that, to the one who knocks, the door will be opened. He does not lay down conditions for what counts as a good reason for knocking. Nor should we.

  12. See here the opening paragraph of Christian Wiman’s new book, Zero at the Bone: Fifty Entries Against Despair (from entry 1, page 5):
    Thirty years ago, watching some television report about depression and religion—I forget the relationship but apparently there was one—a friend who was entirely secular asked me with genuine curiosity and concern: “Why do they believe in something that doesn’t make them happy?” I was an ambivalent atheist at that point, beset with an inchoate loneliness and endless anxieties, contemptuous of Christianity but addicted to its aspirations and art. I was also chained fast to the rock of poetry, having my liver pecked out by the bird of a harrowing and apparently absurd ambition—and thus had some sense of what to say. One doesn’t follow God in hope of happiness but because one senses—miserable flimsy little word for that beak in your bowels—a truth that renders ordinary contentment irrelevant. There are some hungers that only an endless commitment to emptiness can feed, and the only true antidote to the plague of modern despair is an absolute—and perhaps even annihilating—awe. “I prayed for wonders instead of happiness,” writes the great Jewish theologian Abraham Joshua Heschel, “and You gave them to me.”

  13. Now: Given this apparent movement among once-secular intellectuals toward faith, or at least a renewed openness toward the claims of faith, what about a parallel movement toward a kind of Christian establishment—and in America, a “new Protestant mainline”? Any answer here is always subject to the ironies of divine providence. Christ’s promise to Saint Peter stands, which means that the forces arrayed against Christ’s body will never finally succeed. That doesn’t mean all or even any of our local or parochial ecclesial projects will succeed. But some of them might, against the odds. That’s God’s business, though, not ours. For now, then, some earthbound comments and fallible predictions.

  14. I can’t speak to the situation in Europe or Great Britain, though my two cents, for what little it’s worth, is that we will not be seeing anything like a renaissance of established religion among elites and their institutions in our or our children’s lifetimes. In the U.S., I likewise think anything like a renewed liberal mainline is impossible. The once-dominant mainline—mainly comprising Episcopalians, Presbyterians, Lutherans, and Methodists—is on life support where it isn’t already dead and buried. As a coherent civic bloc, much less a motive force among elites, it is undeniably a thing of the past. I take that as read.

  15. So the only viable question in the American context, if there were ever to be a “new” mainline, is whether it would be Catholic, magisterial Protestant, or evangelical. There was a moment, as many others have written, when American Catholicism was in process of making a bid to function like a new mainline. That period runs basically from the birth of Richard John Neuhaus in 1936 (the height of the Great Depression, the end of FDR’s first term, with World War II imminent) to his passing in 2009 (Bush in disgrace, Obama triumphant, the Great Recession, in the sixth year of the Iraq War). Catholics were well represented in elite universities, in think tanks, in D.C., in presidential administrations, in magazines that fed and fueled all of the above. But between the priest sex abuse scandals, Iraq, the divisiveness of abortion, and rolling political losses on social issues (above all gay marriage), the dream of an American Catholic Mainline proved not to be.

  16. As for conservative Protestants and evangelicals, the former lack in numbers what the latter lack in everything else. Here’s what I mean. A genuine mainline or unofficially established church has to possess the following features: (a) so many millions of adherents that they’re “just there,” since some of them are invariably “around,” no matter one’s context; (b) powerful centralized institutions; (c) an internal logic that drives its laypeople to seek and acquire powerful roles in elite institutional contexts; (d) a strong emphasis on education in law, politics, and the liberal arts and their various expressions in careers and professions; (e) an investment in and sense of responsibility for the governing order, both its status quo and its ongoing reform; (f) a suspicion of populism and a rejection of revolution; (g) a taste for prestige, a desire for excellence, and an affinity for establishment; (h) wealth; (i) the ears of cultural and political elites; (j) networks of institutions, churches, and neighborhoods filled with civic-minded laypeople who can reliably be organized as a voting bloc or interest group; (k) groups of credentialed intellectuals who participate at the highest levels of their respective disciplines, whether religious or secular; (l) a loose but real shared moral and theological orthodoxy that is relatively stable and common across class and educational lines; (m) an ecclesial and spiritual culture of thick religious identity alongside popular tacit membership, such that not only “committed believers” but mediocre Christians and even finger-crossing public figures can say, with a straight face, that they are members in good standing of said established tradition.

  17. If even part of my (surely incomplete) list here is accurate, it should be self-evident why neither evangelicals nor conservative Protestants could possibly compose a new American mainline. It’s hard to put into words just how tiny “traditional” or “orthodox” magisterial Protestantism is in the U.S. It would be unkind but not unfair to call it a rump. Its size has been demolished by a quadruple defection over the past three generations: to secularism, to liberalism, to evangelicalism, to Rome. It’s arguable whether there ever even was any meaningful presence of magisterial Protestantism in America of the sort one could find in Europe. The four-headed monster just mentioned is a ravenous beast, and old-school Lutherans and Wesleyans and Reformed have been the victims. You need numbers to have power, not to mention institutions and prestige, and the numbers just aren’t there; nor is there a path to reaching them. It’s not in the cards.

  18. Evangelicals still have the numbers, even if they’re waning, but as I said before, they lack just about everything else: the institutionalism, the intellectualism, the elite ethos, the prestige and excellence, the allergy to populism—nearly all of it. Evangelicalism is Protestant populism. This is why evangelicals who enter elite spaces slowly, or sometimes not so slowly, lose the identifying marks of evangelicalism. It isn’t strange to learn that Prestigious Scholar X on the law/econ/poli-sci faculty at Ivy League School Y is Roman Catholic. It is a bit of a surprise to learn that he’s an evangelical. The moment you hear it, though, you wonder (or ask) whether he’s an evangelical Anglican or some such. Consider high-ranking Protestant universities with large evangelical faculties, like Wheaton or Baylor or George Fox. Ask the religion, theology, and humanities professors where they go to church. Chances are it’s an Anglican parish. Chances are that not a few of them have moved from evangelical to Anglican to Roman Catholic or Eastern Orthodox, whether after leaving or, where the university permits it, while staying. This is just the way of things in higher-ed as well as other elite institutions.

  19. Here’s one way to think about it. An evangelical who climbs the elite ladder is more or less required, by the nature of the case, to shed vital elements of her evangelical identity. But a Catholic is not. And a Catholic is not for the same reason that, once upon a time, a liberal Protestant was not. A high-church Episcopalian wasn’t working against the grain by earning a law degree from Princeton or Yale a century ago. That’s what Episcopalians do. It’s what Episcopalianism is. Moreover, if said Episcopalian began as a wide-eyed conservative and ended an enlightened liberal, he would remain Episcopalian the whole time. There’d be no need to leave for some other tradition; the tradition encompassed both identities, indeed encouraged passage from one to the other. Whereas an evangelical who becomes liberal becomes a self-contradiction. A liberal evangelical is an oxymoron. He lacks any reason to exist. Evangelicalism isn’t liberal, in any sense. It is axiomatically and essentially illiberal. To become liberal, therefore, is to cease to be evangelical. That’s not what evangelicalism is for. Evangelicals who become liberal remain evangelical only for a time; they eventually exit faith, or swim the Tiber, or become actual liberal Protestants, where they feel right at home. Which means, for the purposes of this discussion, that every single time evangelicals send their best and brightest to elite institutions to be “faithfully present” there, only for them to become liberal in the process, evangelicalism loses one of its own. The same goes, obviously, for a rising-star evangelical who loses faith or becomes Catholic or Orthodox.

  20. The other thing to note is that the “moral” part of “moral and theological orthodoxy” is absolutely up for grabs right now, in every single Christian tradition and denomination in America. No church has successfully avoided being roiled and split in two by arguments over gender and sexuality. Nor is there some happy middle ground where everybody agrees to disagree. One or another normative view is going to win out, in each and every local community and global communion. We just don’t know, at this point in time, where the cards are going to fall. In that light, any ambition for conservative Protestants (or Catholics, for that matter) to form an established religious backdrop for elite cultural and political organs in America is a pipe dream, given what “conservative” means regarding sexual ethics. Whoever is still standing, Christianly speaking, at the end of this century, the wider culture is not going to welcome new overlords who oppose the legality of abortion, same-sex marriage, no-fault divorce, and artificial contraception. I mean, come on. Most Protestants I know take for granted the legality (and usually the morality, too) of all but the first, and are politically ambivalent about the first as well. Protestants are in numerical decline anyway, a fact I’ve bracketed for these reflections. Put it all together, and the reasons why public intellectuals don’t convert to Protestantism are inseparable from, and sometimes identical to, the reasons why magisterial Protestantism is not poised to become a new American mainline. Do with that what you will.

Brad East

Decline and its possibilities

Is decline possible? Not just for society, but for the church? How, in any case, should Christians write and think about it? A reflection occasioned by anti-declinist authors.

Is society in decline? Is our culture getting worse? Is the West less Christian by comparison to the past? And are these things a cause for lament?

Your answers to these questions do a lot of work in locating you in debates among Christians about the state of things today. I’m thinking in particular of anti-declinists, who resist and reject declension narratives as hysterical, overwrought, incomplete, or even unchristian. It occurred to me recently that anti-declinists confuse a number of concepts—which is not their fault, since their opponents often confuse them as well—that are distinct from one another.

First is decline. In what sense decline? In what area(s) of life? Religious in particular? Social, moral, aesthetic, political, economic? Or is the question a matter of net gain/loss? Claims about net loss are always going to be inordinately subjective. Countering a sense of loss in, say, church attendance with an equivalent gain in household wealth is a textbook instance of changing the subject. To be frank, Christians concerned about whether people have left church for good don’t care about the economy. It’s apples and oranges.

The second issue is blame. It’s often assumed that, if things have in some sense gotten worse (over the years, decades, or centuries), then those responsible are them, out there, the Bad Guys. But this doesn’t follow. It may well be the case that things have gotten worse because of us. The culprit, in other words, is Christians. Christians are on the hook for the things they bemoan. The church is culpable for social, moral, or religious decline. The proper response to recognition of genuine decline, then, is not pointing fingers at the world but donning sackcloth and ashes; fasting and prayer; contrition and repentance. (Next is figuring out a faithful path forward, but doing so is still an in-house affair.)

The third issue is judgment. Christians believe that God is Judge. And within history, God’s mysterious providence does not withhold all consequences of the actions of individuals, communities, and nations until the End, but enacts them, in part, in time, as anticipations of the Final Judgment. Our chickens do not always come home to roost. But sometimes they do. And when they do, it is the work of God. In the realm of social or cultural decline, then, Christians are inclined to see the hand of God. And often as not, it is against the church that his hand works. Judgment begins at the house of God. Nor is divine judgment a sneaky way of outsourcing blame back onto the culture. Well, this is God’s work—guess y’all will acknowledge him now! On the contrary, sometimes divine judgment works for the world to the detriment of God’s people. Assyria wins and Israel loses; Babylon wins and Judah loses. Not ultimately, but for quite a while. The gates of hell shall not prevail against the church. But Caesar made thousands of martyrs for a full three centuries before the bloodletting stopped.

The fourth issue is suffering. Not all suffering on the part of Christians is God’s judgment on our sins. But all of it is an opportunity to imitate Christ. That is, to unite our sufferings to his in patient endurance as a witness to the power of his resurrection. The blood of the martyrs is the seed of the church. Let us say, even, that the sweat of the saints—the smallest of pains in the mundane challenges of daily discipleship—is likewise the seed of God’s word scattered on the soil of our culture. It is God who gives the increase. All we can do is witness. We are not to seek after suffering. But when it comes, as it always will, in whatever form, our calling is to persevere. Not to whine. Not to complain. Not to eschew martyrdom for a martyr complex. Not, certainly, to feign bemusement at this strange disruption of how things should be, namely, life running smoothly. Life running smoothly is a blessed temporary relief from the ordinary run of things in a fallen world. So even and especially when society is in process of some kind of decline, even and especially if it is religious in nature, the task of the church is the same: faithful witness in suffering.

The fifth issue is hope. To spy decline is not to deny grounds for hope. Hope is in God alone. His promise is sure: the gates of hell will not prevail against the people whose God is the Lord. Nevertheless the gates of hell will claim victories along the way: provisional victories, but victories all the same. Christian hope lies neither in the absence of decline nor in renewal following decline but in the ultimate victory of Christ above and beyond the waxing and waning of human civilizations. Behold, the nations are as a drop in a bucket. For the Lord, that is, and therefore for the church. A whole political theology is contained in that single verse. The peoples are like grass, which wither under the Lord’s breath; he makes nations rise and fall, but his word stands forever. Decline is both real and inevitable. It is also subordinate to the mission and worship of the church.

If I’ve been training my sights on the errors of the declinists, a few implications follow for anti-declinists as well.

First, decline is possible. To read some anti-declinists, you’d imagine they’ve sincerely bought into the myth of progress. They’re like film critics who tell you, every year, that the movies have never been better. Novels and music and all the other arts, too. Did I mention epic poetry and its public performance and memorization by children? Nothing ever changes, except when it does, and it’s for the better. That’s a mighty dollop of silliness no serious person should countenance. And it’s ruled out a priori for Christians. So any Christian writer or thinker, even one not driven by pessimism about our age, must at least admit the possibility that decline may happen and perhaps even is happening.

Second, Christian decline is possible. Some anti-declinists read as though, even if a civilization might get worse, the state of the church within that civilization can never get worse. But we’ve already seen that this, also, is a red herring. I often think, in this respect, of Alan Jacobs’ marvelous review of Rod Dreher’s Benedict Option, which opens with a description of the extraordinary Christian culture of fifth-century Cappadocia created and fostered by saints like Basil and Gregory of Nazianzus. Yet no such culture exists there today. Here’s Alan:

If the complete destruction of a powerful and beautiful Christian culture could happen in Cappadocia, it can happen anywhere, and to acknowledge that possibility is mere realism, not a refusal of Christian hope. One refuses Christian hope by denying that Jesus Christ will come again in glory to judge the living and the dead, not by saying that Christianity can disappear from a particular place at a particular time.

Christianity can disappear from a particular place at a particular time. When I read non-declinists, I wish they would begin every paragraph with an affirmation of this possibility. Because most of the time, it sounds as though they do not believe it is possible. Yet we know it is possible, because it has happened time and again throughout history. Which means that people worried about it happening here are not fantasists; they are not hysterical just because they worry about it. Their worry may take the form of hysteria. They may be foolish in their response or ungodly in their lament or wicked in their prescriptions, not to mention their treatment of others. But the worry is valid in principle. The only question, ever, is whether it is a plausible worry given our time and place, and thence whether it is likely. It becomes a cultural question, a hermeneutical question, even in a sense an empirical question. That’s the arena for debate. Not its sheer possibility.

Third, non-declinists have to distinguish not just the mode of pessimism from its judgments but also the judgments from their implications. Bemoaning some particular religious or social loss is not ipso facto an endorsement of everything in the past (whether that past be the 1980s or ’50s or ’20s or 1770s or 1530s or 1250s or whenever the ostensible golden age is said to have occurred). Nor is it a denial that anything has improved. Declinists are on the hook for their rhetoric, which should be nuanced and accurate rather than generic and overblown. But anti-declinists are on the hook, too. They often write as though to give an inch is to lose a mile. Any decline means total decline. Perhaps also despair. None of which follows.

The Christian theology of providence is an extraordinary analytical tool. It allows the soberest of diagnoses and the most confident of hopes. It’s always a delicate balancing act. Any Christian who writes about the state of society is going to fail in some way. But we can fail a little less, it seems to me, not least by being mindful of the distinctions I’ve elaborated above.


Defining “culture”

Responding to Alan Jacobs’ critique of my undefined and indiscriminate use of “culture” in my recent Mere O essay.

I’ve been grateful for the responses I’ve seen to my Mere O essay “Once More, Church and Culture.” Andrew Wilson was particularly kind in a post about the piece.

I’ve especially appreciated folks understanding what I was (and was not) trying to do in it. Not to suggest Hunter’s typology is bad or discreditable. Certainly not to suggest an alternative singular image or approach. Rather, to suggest that the very search for one dominant or defining posture is a fool’s errand, not to mention historically and culturally parochial. I was worried the length and winding nature of the essay would mislead readers. I was wrong!

Alan Jacobs also liked the essay, but was left with one big question: What is “culture,” and why did I—why does anyone, in writing on the topic—leave it undefined? More to the point, “there is no form of Christian belief or practice that is not cultural through-and-through.” In which case it sounds rather odd to pit the church against something the church herself includes and which, in turn, includes the church.

Alan’s criticism is a legitimate one, and I don’t have any full-bodied defense of my unspecified use of the term. I do have a few quick thoughts about what I had in mind and why folks use the word generically in essays like this one.

First, “culture” is one of those words (as Alan agrees) that is nigh impossible to pin down. You know it when you see it. You discover the sense of what a person is referring to through their use. The term itself could call forth an entire lifetime’s worth of study (and has done so). In that case, it’s reasonable simply to get on with the discussion and trust we’ll figure out what we’re saying in the process.

And yet—this intuition may well be wrong, and its wrongness may be evidenced in the very interminability of the post-Niebuhrian conversation. Granted! I’m honestly having trouble, however, imagining everyone offering a hyper-specific definition of “culture” or avoiding the term altogether.

In any case, second, it may be true that Niebuhr made various errors in his lectures that became Christ and Culture, but I’m surprised that Alan thinks—if his tongue is not too far in cheek—that the book should be dismissed. My surprise notwithstanding, I engage Niebuhr in the essay not to defend him but to take up the ideas he put into circulation through his typology; or rather, to display the pattern in his approach, which became the template for so many who followed him.

Third, I would like to point out how I actually use the word “culture” in the essay. I do in fact mention it in the opening sentence, along with similar items in a grab-bag list meant to suggest a kind of comprehensive civic repertoire: “Christendom is the name we give to Christian civilization, when society, culture, law, art, family, politics, and worship are saturated by the church’s influence and informed by its authority.” After that, I use the term exclusively in four ways: in scare quotes, in actual quotations, in paraphrase of an author’s thought, or in generic reference to “church and culture” writing—until the final few paragraphs. Here they are:

  • “…Niebuhr, Hunter, and Jenson are right to see a dialectic at work in the church’s encounter with various cultures.”

  • “As I see it, there is no one ‘correct’ type, posture, or model. Instead, the church has four primary modes of faithful engagement with culture.”

  • “God is the universal Creator; the world he created is good; and he alone is Lord of all peoples and thus of all cultures.”

  • “When and where the time is right, when and where the Spirit moves, the proclamation of the gospel cuts a culture to the bone, and the culture is never the same.”

  • “…[my approach] does not prioritize work as the primary sphere in which the church encounters a culture or makes its presence known.”

  • “The mission of the church is one and the same wherever the church finds itself; the same goes for its engagement with culture.”

  • “Sometimes … the Spirit beckons believers, like the Macedonian man in the vision of St. Paul, to cross over, to enter in, to settle down, to build houses and plant vineyards. In other words, to inhabit a culture from the inside. Sometimes, however, the Spirit issues a different call…”

In my humble opinion, these uses of the term are clear. Either they are callbacks to the genre of “church and culture”/“engage the culture”/“church–culture encounter” writing, which I am in turn riffing on or deploying for rhetorical purposes (without any need for a determinate sense of the word). Or they are referring to a society or civilization in all its discrete particularity, as distinct from some other society or civilization.

Had Alan been the last editorial eye to read my essay before I handed it in, I would have cut “culture” from the first sentence and replaced most or all of these final mentions with “society” or “civilization.” I don’t think I would have defined “culture” from the outset, though I might have included a line about the term being at once inevitably underdetermined and unavoidable in writing on the topic, that is, on the church’s presence within and mission to the nations.

At the very least, I’ll be mindful of future uses of the term. It’s a slippery one, “culture” is, and I’m grateful to Alan for reminding me of the fact.


Once more, negative world

A response to Alan Jacobs’ response to my response (and others’) to Aaron Renn’s “Three Worlds” framework.

I sometimes think of what I write on this blog as mostly just drafting off two other, far superior blogs: Richard Beck’s and Alan Jacobs’. Both are friends whose work I’ve been reading for more than a decade, and who have been kind enough, more than once, to link to my own work or to respond to it in some way.

Recently Alan wrote a follow-up to his blistering rejection of Aaron Renn’s “Three Worlds” framework for understanding Christians’ social status in the United States. In the follow-up, Alan mentions both Derek Rishmawy’s and my respective attempts to interpret and commend a version of Renn’s framework. Gently but firmly, he rebukes these attempts and underscores why he finds the whole business—the whole conversation—a misdirect: a futile, self-regarding failure to attend to the main thing, namely following Christ irrespective of our surroundings and their purported (in)hospitality to the gospel. We do not, Alan argues, need detailed plans in order to fulfill this charge. Nor do we need an ostensibly (or fantastically) friendlier society in order to succeed. We just need the will, the resolve, the obedience to Christ requisite to set one foot in front of the other, answering the call of the Lord whatever it may be, wherever it may lead, whenever it may come.

I see that Derek has written his own response to Alan (though I haven’t yet read it). I’m going to attempt my own here, with the aim both of understanding what Alan is concerned about and of clarifying my own position.

The simplest way to put my view is in the form of two broad questions:

  1. Do different societies, in different times and different places, treat an individual’s or a community’s public identification as Christian in different ways?

  2. If yes, does knowledge of those differences make some relevant difference for how Christians should understand, approach, engage, and inhabit their societies?

I take the answer to the first question as read: yes, obviously. I take the answer to the second to be yes as well.

To me, that settles the matter—at least at the formal level.

The third question descends from the heights of history and missiology, respectively, to applied sociology: Is it accurate to say, all things being equal, that being known publicly as a Christian in the U.S. is less likely to enhance one’s social status than at any time since World War II? Or, to put it differently, that public identification as Christian is more likely to downgrade one’s social status than at any point in living memory? Or, to put it more weakly and less comparatively, that in general “being identified as Christian” is not something a non-Christian would, in our society today, be tempted to pursue nominally for the sole reason of trying to enhance his or her social status?

Granted, the U.S. is a big country. I live in a town of 120,000 in west Texas. Having a nominal membership at a local church one doesn’t actually attend or care much about might still grant a certain cachet here. (Though, in most circles, I doubt it.) Any comment, then, about “the U.S. today” is going to be an “in general, on balance, all things being equal, thinking about the country as a whole” comment. If you don’t think such comments can be meaningful, fair enough. But if you do, then this sort of comment is permissible like any other.

Region and subculture are one element here. Institutions and professions are another. Some organizations and careers will be neutral as regards religious identity; others, far from it. Also granted.

The upshot, all qualifications made, is simply that something has changed in the last century regarding how self-identifying as a Christian orients one to the wider culture; how one is perceived as a result. And apart from claims about this as a change, the point about the present moment is that, whether or not there ever was such a time (in this society or another) when being seen to be a Christian was something that might raise one’s prospects—marital, educational, financial, professional, political—this time, in this society, is not one of them. We can haggle over whether it’s preferable to say “it is not” one of them versus “it is no longer” one of them. But either way, it’s not.

Suppose Alan agrees with me (though, if I’m reading him correctly, I don’t think he does). Does it matter?

I think it does. But let me say how I don’t think it matters before I say how I think it does.

It does not matter “because America is no longer a Christian nation.” It does not matter, that is, as if this analysis were at heart a declension narrative, according to which things have been getting worse and worse and now, at this moment, we’ve reached the nadir; or at least have crept up to the edge of the cliff. No. The social status of being-seen-as-Christian is simply one among many sociological variables relevant to Christian consideration of the church’s mission.

I also don’t think it matters “because things are really bad out there.” They’re not. It’s bad when Christians get thrown to the lions. It’s bad when Christians can’t vote. It’s bad when certain Christians aren’t afforded basic rights and privileges common to civic society. It’s bad when it is against the law for Christians to gather on Sunday mornings, to pray and celebrate the Eucharist, to read their Bibles and worship without fear, to share the gospel with whomever will listen.

American society does not fit these descriptions, and it isn’t close to any of them. Christians in America are remarkably free; our privileges are innumerable. Words like “persecution” are inapt to our context, and unwise to use—not least since we have sisters and brothers elsewhere in the world who suffer actual persecution at this very moment.

How, then, is the social status of public identification as Christian relevant? In this respect:

The church cannot bear faithful witness to Christ in a given context if she lacks awareness of the particular features that constitute that context, that make it what it is.

Think about different locations and cultures today. Does Christian witness look the same in Riyadh, Nairobi, Beijing, St. Petersburg, Buenos Aires, Miami, Milan? Does it look the same in 2022, 1722, 1422, 1122, 822, 522, 222? Surely not. And surely all Christians would agree that differences of context in each time and place call for different forms of response to those differences? Such that the specific contours of Christian witness actually and rightly look different based on when and where one lives, and how a culture or society in question responds to—welcomes, rejects, shrugs, punishes—public identification as Christian?

Perhaps, again, Alan would agree with this. Let me try to say a bit more, then, to get enough meat on these bones to prompt a meaningful disagreement.

Consider the difference between life under Diocletian, about half a century before St. Augustine’s birth, and life under Honorius, when Augustine was bishop of Hippo. The former was a time when the imperial authorities were your enemy, if you were known as a Christian; the latter was a time when claiming to be a Nicene Christian might enhance one’s political or financial prospects (though not necessarily). How should the church navigate each setting? This was a real question faced by bishops, monks, priests, and laypersons around the Mediterranean. The first was, in Renn’s language, a “negative world”; the second, a (more) “positive world.” I see no reason to declare a priori that such labels, and the analysis underlying and following from them, are an inadmissible distraction.

Now for an example closer to home. I teach undergraduate students of all kinds, but every semester I have a class all to myself composed only of Bible and ministry majors: i.e., young persons preparing for a life of formal service to the church in the form of teaching, preaching, pastoring, and so on. These students largely come from the Bible Belt, and many of them come from big churches in big cities where being Christian and attending such churches doesn’t feel abnormal. This experience in turn nurtures in a good number of them a sense of their context, present and future, as either neutrally or favorably disposed toward Christianity. A world of megachurches and popular pastors and celebrity Christians and spiritual influencers is just “the world”: yesterday, today, and forever. The churches they one day will lead will be large, healthy, full, and financially stable. The folks in the pews will lead lives as middle-class American Christians long have (so they imagine): unthreatened and tacitly buoyed by the surrounding culture.

Not for all of them, but for quite a few, it is something of a shock to learn about the declining rates of identification as Christian in America; about decades of falling church attendance; about how many churches are closing their doors each month; about some of the modest but real social, political, and professional challenges facing folks known to be Christian in what once were considered mainstream careers and institutions in this country.

In a word, most of my students believe they live in Renn’s Positive World. They really do. Others suppose it’s a Neutral World for Christians. Few to none see it as a Negative World. And I’m telling you, it makes a difference for how they understand their faith, their future, and their eventual ministry in the church.

This is one reason, in my view, why we keep seeing so many pastors quitting formal ministry in their 20s and 30s. It’s hard out there. And many of them are unprepared for what’s awaiting them. As I see it, part of that lack of preparation is a gap between the “World” they expect to inhabit as ministers and the actual “World” they find. And the gap is perpetuated if and when professors and writers like me fail to help them see—clearly, soberly, and accurately. I want them to see the world as it is. Not to scare them. Not to lament the supposed loss of a prior world. Not to remake the world in our desired image, in the image of what it “should” be. Not to be fatalist about the future or to forsake the challenge of persuasion or to give up on faithful witness until the world is nicer to us. By no means. The world owes us nothing, and as the apostle teaches, friendship with the world is enmity toward God.

What I want, rather, is for them to be equipped to minister in the real world, not the cloistered world of their childhoods, or the 1990s/2000s, or a fictional 1950s, or any other time and place. In that sense and to that extent, I find the “Three Worlds” heuristic to be useful. As a starting point. As a conversation starter. As an initial sociological, historical, and missiological framework, by which to help normie Christians and ministers to begin thinking about the particular challenges facing the mission of the church today—here and now, in our setting, not our parents’, not someone else’s: ours.

Maybe Renn’s “Three Worlds” comes with social or political baggage not worth onboarding in this particular conversation. Maybe it’s overdetermined by the uses to which various of its adherents want to put it. Maybe it’s wrong in certain key details, not least its laser focus on the last few decades and specific public events that occurred during them; a myopic legal and juridical cultural frame. Maybe its examples are wrong, such as offering the rhetorical style of Tim Keller as an artifact of a now-past “World,” no longer relevant. Maybe the “pre-1994” timeframe of “Positive World” is far too open-ended, and needs bracketing closer to the World Wars than to the Founding Fathers. Maybe the emphasis on elite institutions combined with a blurring of the lines between “public profession of Christian faith” and “actual discipleship to Christ” renders the framework finally useless at the practical level.

Maybe, maybe, maybe. With the qualifications I make above, I find it useful enough. More broadly, analysis like it seems to me self-evidently helpful, even needful. Not because Alan is wrong, but because he is right: The content of Christian witness is always and without exception the same: the life and teachings of Jesus of Nazareth. But what does imitation of and conformity to that life and those teachings require in this time, in this place, by comparison to other times, other places?

That’s the question I want to answer. And I’ll take all the help I can get.


Jimmy’s change

So far as I can tell—but I haven’t been trawling Twitter for contrarian takes—Alan Jacobs’ negative reaction to the Better Call Saul finale (spoilers herein, obviously) is the exception to the rule. The people I read loved it. Friends and family who watch it loved it. I loved it. But it’s useful to read a take against the grain. Is Alan right?

His case is simply stated. Jimmy’s volte-face in the third act of the finale is unwarranted either by the episode’s events or by the series’ narrative thrust as a whole. Who Jimmy is, deep down, has more or less always been set in stone, and that concreteness was not softened by these final episodes. Jimmy-as-Saul-as-Gene has only become more narcissistic, more reckless, more negligent, even murderous and sociopathic. Are we really supposed to believe that a single remarkable deed by Kim has the power to undo all of that, to make of Jimmy a Sydney Carton bound, selflessly, for the guillotine of a lifetime in prison?

Answer: Yes, actually. I think so. But before I defend that view, let me say why I resonate with Alan’s disappointment.

His disappointment was my disappointment with the original finale of Breaking Bad, which just about everyone else I read and know thought was perfect. It was not perfect, and for the same reason Alan offers regarding BCS. Vince Gilligan loved his character too much to let him down. Though Walt had to die, though he had to be humiliated, he also had to go out in a blaze of glory. He had to earn some degree of redemption. He had to do something good, or at least something on his own terms. And thereby, all fans, not just “bad” fans, could get some measure of catharsis for watching and secretly (or not so secretly) cheering on a wicked and murderous drug dealer for five years.

Ever since that finale first aired nearly a decade ago, I’ve proposed an alternate ending. It’s only slightly different from what occurs in the final ten minutes of the episode. Walt arrives in his car, parks it where he does, walks into the building with all going according to plan. Only: when the button is pressed and the machine gun lets loose, the bullets spray wildly without hitting anybody. The plan fails. The cowboy’s last hurrah is an anticlimax. Walt doesn’t win. Instead, once the bullets are finished, the neo-Nazis look at Walter, look at Jesse, and shrug. Then they take them both outside. First they shoot Jesse, as Walter looks on. Then they shoot Walter. They dump both in an unmarked grave. Fade to black; end credits.

That’s a bleak ending, but its bleakness matches the bleakness of the show’s story. For that story is one without any happy endings. Walt doesn’t get to save his orphaned would-be son. His outlandish plan doesn’t succeed. Such plans don’t always work. He doesn’t get to pass out and pass on in the midst of the humming chemistry of a meth lab, happy in his own way, dying as he lived. He doesn’t get to set the terms of his exit from this life. That’s the way he thought he could live. But he was wrong. And the show’s writers mistook their protagonist’s self-understanding for the show’s own inner meaning. An easy error to make, but a costly one. Not just the bad fans rejoiced at the finale. Even ordinary viewers left with a sense of cathartic release: Jesse got away, the bad guys lost, and Walt redeemed himself. Good for him.

It seems to me that Alan thinks the same (whatever his actual thoughts on the BB finale) of the BCS finale. I wondered, going into the episode, whether Gilligan and Gould would be tempted by the same error: the need to make their evil lead good by the end; the desire to make things right that can’t be made right; the pull to let Saul undo, by TV magic, what can’t be undone.

I understand why one might see “Saul Gone,” the name of the series finale, as indulging that temptation. But I don’t agree, for the following four reasons.

First, there’s a lot more going on in Jimmy’s incredible courtroom speech than breaking good, for Kim’s sake. He’s putting on a performance. That performance is Heisenberg-like in its pomposity and pride. He doesn’t want seven years in a cush prison with the world thinking he was a victim. Instead, he wants the world—the feds, the judge, his future inmates, even Kim—to know that without him, Heisenberg wouldn’t have lasted a month as a free man. The real hero of Walt and Jesse and Mike and Gus’s story was Saul Goodman. He made it all possible. After all, he’s the only one still standing. Are you not impressed? Are you not entertained?

Second, so much of what the previous 61 episodes of BCS gave us, and what the 62 episodes of BB did not, is that, unlike Walt, who was rotten to the core from the beginning and just needed the opportunity to show it, there was always a goodness to Jimmy intermingled with the bad. Not only certain good inclinations, but the desire to do and to be good. Granted, that desire is snuffed out by the time he’s transformed into Saul. But we have no reason to suppose that it’s gone forever, that it’s beyond recovery. Moreover, he didn’t leave Kim; she left him. It is precisely her reentry into his life that reawakens that desire once more. On the phone, she tells him to turn himself in; he scoffs and tells her to take her own medicine. She does. At great cost to her own life, possibly bringing it to an end. I find it wholly plausible, not that her extraordinary good deed converts him from pure evil to pure good, but rather that her action, like a flash of lightning, transforms the scenery before him. It shuffles the board of his potential actions. It makes possible certain decisions that he would never have considered before. He doesn’t become a martyr. But he does tell the truth.

Why? Because, third, what we know of Jimmy—again, from those prior 61 episodes—is that his moral psychology is not defined solely by greed or victory or successful schemes. An additional and irreducible element is his desire to please those he loves or reveres, even in spite of himself. (In this, too, he is different and, I think, a more complicated character than Walter White.) That’s the thing that made his relationship to Kim so complex. Together, they were bad. But in truth, while he made her worse, she made him better. She kept him from the dark side, from truly breaking bad all the way. Only in her absence does he do that. All his worst propensities, however much he toyed with them and leaned in their direction, he kept at bay so long as she was still around. That’s not to say such an arrangement would have lasted forever. But he always cared what she thought. Because he always truly loved her, as she did him. And what he is doing in that courtroom is trying to earn her approval, trying to see a glimmer of the love that once burned bright in her eyes. I have to say, this strikes me as absolutely and unquestionably psychologically and emotionally plausible. The man is a living image of self-sabotage in service of his insatiable desires. He never knows when to stop. Only now, he isn’t risking everything for the sake of some petty score. He’s forsaking a short time in prison for an indefinite one for the sake of the woman he never stopped loving, because the one and only thing that ever competed with his love for self and love for money was love for her. Which is to say, his need for her requited love. So he schemes one last performance for the ages. (Showtime!) And, as ever, he gets what he wants. It works—like Walt’s plan worked—except no one thinks him a hero, and the cost is a life sentence.

Fourth and last, it’s essential not to overlook what Gould shows us on the bus and in the prison. Jimmy isn’t in chains. His spirit isn’t quenched. He’s finally at rest. He’s among the people he always worked for and with and among. He always had their back, and now they’ve got his. They’re chanting his name. They’re fist-bumping him as he swaggers by. He’s not a fish out of water. He’s not suffering in squalor. He’s king of the castle. He’s come home. This is where he belongs. This is where he’s comfortable. This is where he was always meant to be, where his path always led. There’s not a trace of pain or resentment on his face. Not, again, because he’s a martyr. But because he’s accepted who he is and what he’s done, in an irresolvable combination (one that defined his life from start to finish) of chest-thumping pride, feigned performance, and quiet shame.

Nor is Kim’s visit an absolution. Their few words reflect the years and the distance between them. There’s nothing he can do to change the past, to rectify his wrongs. But behind bars, in the plain light of day, he can acknowledge who he is to the one person (apart from his brother) whose opinion he values, and she can accept that knowledge so long as he isn’t hurting anyone or inciting her to do the same. His quiet bravado (“…with good behavior…”) is a sign that he’s no Sydney Carton, nor does he imagine himself to be. He’s Jimmy. But then, Jimmy isn’t the antithesis of Saul Goodman, since Jimmy always was and always will be Slippin’ Jimmy. Kim, though, always loved Jimmy, and Jimmy always loved Kim. If what it took to see her again, to see her look at him like he was Jimmy, not Saul, one more time, was this—getting all the credit for Heisenberg’s crimes while serving time he always knew was coming down the pike—then so be it.

His whole life was a tissue of tradeoffs, anyway, cooking up some brilliant idea in the moment to get what he wanted most, without necessarily thinking of the long-term effects. He did it one last time. Who’s to say he’d regret it now any more than he did in the past? In the time machine motif that haunts the episode like the ghosts of another Dickens tale, Jimmy wonders about regrets, his own and others’. We know he always regretted losing Kim. His moment in the courtroom is his last chance to hop in his own personal time machine and make one single change. Not to alter the laws broken, the people conned, the lives ruined, the victims murdered. Not even—though he does regret it—to unwind his brother’s end.

No, the one change concerns Kim, having once lost her, seemingly forever. Once that change is made, he can live with the consequences.


Kim, breaking bad

A comment on Kim Wexler in Better Call Saul and, you know, original sin.

You heard it here first. To be specific, on March 3, 2020, here’s what I wrote:

A brief comment on Better Call Saul, prompted by Alan Jacobs' post this morning:

I think the show rightly understands that Kim is, or has become, the covert protagonist of the show, and by the end, we (with the writers) will similarly come to understand that the story the show has been telling has always been about her fall. No escape, no extraction, no pull-back before the cliff: she, like Jimmy, like Mike, like Nacho, like Walter, like Jesse, like Skyler, lacks the will ultimately and decisively to will the good. They're all fallen; and in a way, they were all fallen even before the time came to choose.

In this way the so-called expanded Breaking Bad universe has made itself (unwittingly?) into a dramatic parable of original sin. Not that there is no good; not that characters do not want to do good. But they're all trapped in quicksand, and the more they struggle, the deeper they sink.

This was only three episodes into season 5; the closing moments of the eventual season finale—in which Kim not only initiated an unnecessary, risky revenge-scheme (now being played out in season 6) but also wryly double-barreled Jimmy just the way he had done in the closing moments of season 4 (“It’s all good, man!”)—signaled that the writers have known this was the destination, and the overriding theme of the show, for some time.

The present two-part final season is stretching out that slow burn to the breaking point, in peerless, masterly form as usual. In Gilligan and Gould we trust.


Deflating tech catastrophism

There’s no better way to deflate my proclivities for catastrophism—a lifelong project of my long-suffering wife—than writers I respect appealing to authorities like St. Augustine and Wendell Berry. And that’s just what my friends Jeff Bilbro and Alan Jacobs have done in two pieces this week responding to my despairing reflections on digital technology, prompted by Andy Crouch’s wonderful new book, The Life We’re Looking For.

I’m honored by their lovely, invigorating, and stimulating correctives. I think both of them are largely right, and what anyone reading Crouch-on-tech, East-on-Crouch, Bilbro-on-East-on-Crouch, East-on-tech, Jacobs-on-East-on-tech, etc., will see quickly is how much this conversation is a matter of minor disagreements rendered intelligible in light of shared first principles. How rare it is to have more light than heat in online (“bloggy”) disputations!

So thanks to them both. I don’t want to add another meandering torrent of words, as I’m wont to do, so let me aim for clarity (I would say concision, but then we all know that’s not in play): first in what we agree about, second in what we perhaps don’t.

Agreements:

  1. Andy’s book is fantastic! Everyone should buy it and do their utmost to implement its wisdom in their lives and the lives of their households.

  2. The measure of a vision of the good life or even its enactment is not found in its likelihood either (a) to effect massive political transformation or (b) to elicit agreement and adoption in a high percentage of people’s lives.

  3. Digital is not the problem per se; Mammon is. (Both Jeff and Alan make this point, but I’ll quote Alan here: “the Digital is a wholly-owned subsidiary of Mammon.” I’ll be pocketing that line for later use, thank you very much.)

  4. We cannot expect anything like perfection or wholesale “health”—of the technological or any other kind—in this life. Our attempts at flourishing will always be imperfect, fallible, and riddled with sin.

  5. Christians are called to live in a manner distinct from the world, so the task of resisting Mammon’s uses of Digital falls to us as a matter of discipleship to Christ regardless of the prospects of our success.

  6. Actual non-metaphorical revolutionary political change, whether bottom-up or top-down, is not in the cards, and (almost?) certainly would bring about an equally unjust or even worse state of affairs. Swapping one politico-technological regime for another turns out to mean little more than: meet the new boss, same as the old boss. A difference in degree, not in kind.

  7. What we need is hope, and Christians have good grounds for hope—though not for optimism, short of the Kingdom.

  8. What is possible, in faith and hope, here and now, is a reorientation (even a revolution) of the heart, following Augustine. That is possible in this life, because Christ makes it possible. Jeff, Alan, and Andy are therefore asking: Which way are we facing? And what would it take to start putting one foot in front of the other in the right direction? Yes. Those are the correct questions, and they can be answered. And though I (I think defensibly) use the language of principalities and powers with respect to Digital, I do not disagree that it is not impossible—check out those negatives piling up one on another—for our digital technologies to be bent in the direction of the good, the true, and the beautiful. Which is to say, toward Christ’s Kingdom.

Now to disagreements, which may not amount to disagreements; so let us call them lingering queries for further pondering:

  1. For whom is this vision—the one outlined above and found in Andy’s book—meant? That is, is it meant for Christians or for society as a whole? I can buy that it is meant for the church, for some of the theological premises and commitments I’ve already mentioned. I’m less persuaded, or perhaps need persuading, that it is one that “fits” a globalized secular liberal democracy, or at least ours, as it stands at the moment.

  2. Stipulate that it is not impossible for this vision to be implemented by certain ordinary folks (granting, with Jeff, that Christians are called not to be normies but to be saints: touché!). I raised questions of class in my review and my blog posts, but I didn’t see class come up in Jeff or Alan’s responses. My worry, plainly stated, is that middle-to-upper-middle-class Americans with college degrees, together with all the familial and social and financial capital that comes along with that status, are indeed capable of exercising prudence and discipline in their use of digital technology—and that everyone else is not. This is what I meant in the last post when I drew attention to the material conditions of Digital. It seems to me that the digital architecture of our lives, which in turn generates the social scripts in which and by which we understand and “author” our lives, has proven most disastrous for poor and working-class folks, especially families. They aren’t the only people I mean by “normies,” but they certainly fall into that category. It isn’t Andy et al.’s job to have a fix for this problem. But I do wonder whether they agree with me here, that it is not inaccurate to describe one’s ability to extricate oneself even somewhat from Digital’s reach as being a function of a certain class and/or educational privilege.

  3. In which case, I want to ask the practical question: How might we expand our vision of the good life under Mammon’s Digital reign to include poor and working class families?—a vision, in other words, that such people would find both attractive and achievable.

  4. If pursuing the good life is not impossible, and if it begins with a reorientation of the heart to the God we find revealed in Christ, then it seems to me that—as I believe Jeff, Alan, and Andy all agree—we cannot do this alone. On one hand, as we’ve already seen, we require certain material conditions. On the other hand, we need a community. But that word is too weak. What we need is the church. This is where my despairing mood comes in the back door. As I’ve written elsewhere, the church is in tatters. I do not look around and see a church capable of producing or sustaining, much less leading, prudent wisdom in managing the temptations of Digital. I see, or at least I feel, abject capitulation. Churches might be the last place I’d look for leadership or help here. Not because they’re especially bad, but because they’re the same as everyone else. I mean this question sincerely: Is your local church different, in terms of its use of and reliance on and presumptions about technology, than your local public schools, your local gyms, your local coffee shops? Likewise, are your church’s leaders or its members different, in terms of their relationship to Digital, than your non-Christian neighbors? If so, blessings upon you. That’s not my experience. And in any case, I don’t mean this as some sort of trump card. If our churches are failing (and they are), then it’s up to us to care for them, to love them, and to do what we can to fix what’s ailing them, under God. Moreover, the promise of Christ stands, whatever the disrepair of the church in America: the gates of hell shall not prevail against his people. That is as true now as it ever was, and it will remain true till the end of time. Which means, I imagine my friendly interlocutors would agree, that we not only may have hope, but may trust that God’s grace will be sufficient to the tasks he’s given us—in this case, the task of being faithful in a digital age. Yes and amen to all of that. 
The point I want to close with is more practical, more a matter of lived experience. If we need (a) the spiritual precondition of a reasonably healthy church community on top of (b) the material precondition of affluence-plus-college in order (c) to adopt modest, though real, habits of resistance to Mammon-cum-Digital … that’s a tall order! I hereby drop my claim that it is not doable, along with my wistful musings about a Butlerian Jihad from above. Nevertheless. It is profoundly dispiriting to face the full height of this particular mountain. Yes, we must climb it. Yes, it’s good to know I’ve got brothers in arms ready to do it together; we don’t have to go it alone. But man, right now, if I’m honest, all I see is how high the summit reaches. So high you can’t see to the top of it.


Teaching a 4/4: freedom

This is the fourth and final post in a series of reflections on what it means to be a scholar in the academy with a 4/4 teaching load. The first discussed the allotment of one’s hours in a 4/4 load; the second identified the unavoidable tradeoffs that come with either having a family or serving at a particular kind of institution; the third offered a series of tips and strategies for 4/4 profs who want to make, foster, and protect the time necessary for reading and writing. This fourth and last post is a companion to the third, discussing in broader terms some of the gifts and opportunities afforded by serving at a teaching-heavy university as well as having a time-demanding family life outside of work.

Let’s say you accept my terms and agree that it’s possible to find the time to publish while teaching a 4/4. Still, you reply, that doesn’t make the high teaching load good; the load remains a hindrance to research, only a hindrance that can be (partially) overcome.

There are two things to say to this. First, teaching isn’t a hindrance. Nor is it just your job. Teaching is a calling. If it’s not your calling, you might want to get out of the game. As I’ve repeated throughout this series, research is a component of and companion to teaching; the job is twofold, a balance or dance. You don’t teach in order to write. You teach and you write; that’s the scholarly life.

Second, a high teaching load isn’t solely a set of challenges for research by comparison to positions in the loftier heights of the ivory tower: the Ivy League, the R1 state schools, the super-rich private universities whose research operations are a well-oiled machine. A high teaching load, which is usually a function of serving at an institution with less prestige, less money, less power, and so on, also presents unique opportunities for your research.

How so?

First, by taking the pressure off. I cannot put into words the relief I have felt every day on my job not having to publish or perish. Note well: I’m still publishing. But there is no one peering over my shoulder, no one nudging and shoving me toward some invisible finish line extending ever ahead of me, always within sight but never within reach. The sheer benefit to one’s mental health makes it worth a positive mention. When I compare notes with friends who work in The Big Leagues, their jobs sound claustrophobic, stultifying, enervating, depressing. Like a panic attack waiting to happen. Who wants that?

Second, by taking the pressure off what I’m supposed to write. Three months ago my first book was published. I first drafted it two years ago, in the fall of 2019. Would I have written that book, the way that I wrote it, were I at a different, more research-heavy institution? Answer: Not on your life. And it’s the best thing I’ve ever done. I couldn’t be prouder of it. But the spirit that breathes across its pages, a spirit I trust you can sense as a reader, is the spirit of freedom. I wrote exactly what I believed to be true, in the style I thought most apt to the content. The book is what I wanted it to be. Never, not in a million years, would I have done that had an administrator been breathing down my neck, asking me when my Next Big Book would be coming out, and with what university press, and on what topic, and written with what level of dense and unreadable prose. I didn’t write to make a splash. I wrote what was burning up my insides, what was begging to come out. And you know what? If the book ends up making any kind of splash, it’ll be because I wrote from passion and desire, not from administrative pressure or T&P criteria. And praise God for that.

Third, of a piece with an overall reduction of pressure is a broader freedom to pursue interests as they arise. In the last five years I have somehow, by God’s grace (and editors’ largesse), become a person who writes essays and criticism for magazines on wide-ranging topics that include but are not limited to my scholarly expertise in theology. I’m going to say it again: I could not and would not have done that at an Ivy or R1 institution. Why? Because their incentive structures do not care about such writing. But you know what? Alan Jacobs is right: Literary journalism (a term he takes from Frank Kermode) is a lot harder to write than peer-reviewed journal articles. Not only that, doing so makes you a much better writer, doubly so if you have no training as a writer and the only writing you cut your teeth on was inaccessible, jargon-heavy academic “prose.” Working with editors from The Point and The Los Angeles Review of Books and The New Atlantis and The Hedgehog Review and Commonweal and First Things and Comment and elsewhere has made me an immeasurably better writer than I was before I started, indeed than I ever would have been had I never tried my hand at such writing. Thank God, then, too, that I’m here at ACU and not at some soul-destroying publish-or-perish elite place that doesn’t care one whit whether you write well or whether what you write is read widely, only whether the right number of PRJAs is checked on the T&P portfolio. No thanks.

Fourth, my entire “research profile” bears the imprint of this pressure-free vocational freedom afforded by working at a teaching-heavy institution. On one hand, my third book (beginning to draft next month!) is a 25,000-word popular work on the church meant for lay audiences. Would I have signed that contract elsewhere? Probably not. On the other hand, my fourth book (not due for a few years) is a similarly popular work, longer and more detailed, however, on the proper role of digital technology in the life of churches and their ordained leaders. I’m currently reading my way into being able to write about that topic; I’ve also just finished a pilot course teaching on the same. Are you sensing a theme? My writing habits have followed an unplanned and undirected course these last few years; or rather, I have allowed those habits to follow desires that sprang up organically from my reading and teaching, and my institutional location not only permitted but encouraged that process. I can tell you, colleagues at other institutions do not report a similar story.

Fifth, I have learned to accept my reading and writing limits in (what I take to be) a healthy way. I’ve always loved a moment that comes early in Wallace Stegner’s novel Crossing to Safety; the narrator is speaking of his early days as a college prof:

I remember little about Madison as a city, have no map of its streets in my mind, am rarely brought up short by remembered smells or colors from that time. I don’t even recall what courses I taught. I really never did live there, I only worked there. I landed working and never let up.

What I was paid to do I did conscientiously with forty percent of my mind and time. A Depression schedule, surely—four large classes, whatever they were, three days a week. Before and between and after my classes, I wrote, for despite my limited one-year appointment I hoped for continuance, and I did not intend to perish for lack of publications. I wrote an unbelievable amount, not only what I wanted to write but anything any editor asked for—stories, articles, book reviews, a novel, parts of a textbook. Logorrhea. A scholarly colleague, one of those who spent two months on a two-paragraph communication to Notes and Queries and had been working for six years on a book that nobody would ever publish, was heard to refer to me as the Man of Letters, spelled h-a-c-k. His sneer so little affected me that I can’t even remember his name.

Nowadays, people might wonder how my marriage lasted. It lasted fine. It throve, partly because I was as industrious as an anteater in a termite mound and wouldn’t have noticed anything short of a walkout, but more because Sally was completely supportive and never thought of herself as a neglected wife—“thesis widows,” we used to call them in graduate school. She was probably lonely for the first two or three weeks. Once we met the Langs she never had time to be, whether I was available or not. It was a toss-up who was neglecting whom.

Early in our time in Madison I stuck a chart on the concrete wall of my furnace room. It reminded me every morning that there are one hundred sixty-eight hours in a week. Seventy of those I dedicated to sleep, breakfasts, and dinners (chances for socializing with Sally in all of those areas). Lunches I made no allowance for because I brown-bagged it at noon in my office, and read papers while I ate. To my job—classes, preparation, office hours, conferences, paper-reading—I conceded fifty hours, though when students didn’t show up for appointments I could use the time for reading papers and so gain a few minutes elsewhere. With one hundred and twenty hours set aside, I had forty-eight for my own. Obviously I couldn’t write forty-eight hours a week, but I did my best, and when holidays at Thanksgiving and Christmas gave me a break, I exceeded my quota.

Hard to recapture. I was your basic overachiever, a workaholic, a pathological beaver of a boy who chewed continually because his teeth kept growing. Nobody could have sustained my schedule for long without a breakdown, and I learned my limitations eventually. Yet when I hear the contemporary disparagement of ambition and the work ethic, I bristle. I can’t help it.

I overdid, I punished us both. But I was anxious about the coming baby and uncertain about my job. I had learned something about deprivation, and I wanted to guarantee the future as much as effort could guarantee it. And I had been given, first by Story and then by the Atlantic, intimations that I had a gift.

Thinking about it now, I am struck by how modest my aims were. I didn’t expect to hit any jackpots. I had no definite goal. I merely wanted to do well what my inclinations and training led me to do, and I suppose I assumed that somehow, far off, some good might flow from it. I had no idea what. I respected literature and its vague addiction to truth at least as much as tycoons are supposed to respect money and power, but I never had time to sit down and consider why I respected it.

Ambition is a path, not a destination, and it is essentially the same path for everybody. No matter what the goal is, the path leads through Pilgrim’s Progress regions of motivation, hard work, persistence, stubbornness, and resilience under disappointment. Unconsidered, merely indulged, ambition becomes a vice; it can turn a man into a machine that knows nothing but how to run. Considered, it can be something else—pathway to the stars, maybe.

I suspect that what makes hedonists so angry when they think about overachievers is that the overachievers, without drugs or orgies, have more fun.

My wife just about spit out her coffee when she read that for the first time. She could relate. But part of the point, aside from the sheer dictatorial vision required to devote all the time one has to what one wants to achieve, is that hour-counting and hour-assigning is not a way of disregarding limits. It’s a way of admitting them and working within them.

For me, those limits bear less on writing than on reading. I’m a fast writer but a turtle-slow reader. I’ve never known someone who reads regularly, for work or for pleasure, who reads as slowly as I do. What that means is that I have to make choices. Here are two choices I’ve had to make that, upon reflection, have made me a better scholar—or at least a practicing scholar, whatever my merits; someone who’s in the game, not on the sidelines.

One choice was to accept that my reading would never be comprehensive. That’s an obvious thing to say, but you might be surprised by how few academics accept it in their heart of hearts. And it’s true, I’ve known one or two polymaths who genuinely seem to have read it all. But that ain’t me. Not even in a single area, not even in a subtopic of a subtopic of a subtopic, like the doctrine of Scripture, about which I’ve now written two books. What I’ve not read vastly outweighs what I’ve read. That truth (and it is a truth) can be paralyzing or liberating. I’ve chosen to let it be liberating. Read what I can and write what I’m able, and if people find it of any use, God be praised; if not, then I guess I didn’t meet the magical threshold of “enough, though not everything.” Naturally you don’t want this permission to forgo comprehensiveness to slide into a license for laziness, for failing to cover your bases. Yet the point stands: it’s never enough; let that be enough. Get on with it and do your work, in acceptance that someone someday will read what you’ve written and point out the text you should have cited. It’ll happen. Be grateful they pointed it out to you. You can take the time to read it for the next thing you write!

The second choice followed from the first. If I wasn’t going to be an independent scholar or research professor who reads 1,000 pages a day (as I’ve heard the encyclopedic Wolfhart Pannenberg did, before writing a book by dinner time), then I might as well broaden my reading to include both academic works far from my area of research and books (old or new) acclaimed for their insights, their impact, or the beauty of their prose. Not only has this practice proved a revolution in my reading habits, and for the better; it has made me a far better academic, scholar, writer, and teacher. Why? Because what I read and know is more than a mile deep and an inch wide. I try to read the dozen or two dozen annual “biggest books,” whether trade or academic, that get press in the NYRB or NYTBR or New Yorker or elsewhere. I read political philosophy and biblical studies and philosophy of science and social science and critical theory and memoirs and novels and collections of essays. Sometimes I review them. I don’t do this reading only at home. I do it in my office. It’s part of my scholarly labor. At this point I’d feel irresponsible if I stopped. It’s helped me resist the siren song of becoming a hedgehog, or a hedgehog alone. In the few areas on which I publish in academic journals, I am a hedgehog: ecclesiology, bibliology, Trinity. But otherwise I’m a fox, reading and writing on as many topics and authors and books as I can lay my hands on.

And I’m telling you: Not only would I not have done that were I not teaching a 4/4. It wouldn’t even have occurred to me. It wouldn’t be possible. The material conditions don’t encourage it, at least for pre-tenure faculty, at least most of the time. Usually they actively block or prohibit it.

That’s why I’m happy where I am. That’s why I don’t resist my high teaching load. That’s part of what makes teaching a 4/4 not just “not as bad as you think,” but (apart from the teaching, which is itself fun and rewarding and good work) a surprisingly conducive environment for research and publication. If you can make and guard time for it, it might actually turn out to be better than it would have been were you elsewhere. Who would have thought?

*

I’m not quite done, though. Consider the following something of a coda.

In the second post in this series I discussed not just institutional but personal and familial tradeoffs. So I want to add a word here about how and why having a rich life beyond work, full of a bustling household bursting with children as well as friends, neighbors, churches, sports leagues, and community service, is not only good—being far, far more important than publishing—but, perchance, itself a boon to your academic work.

Here’s the nutshell version. Having something to come home to makes the work you do during the day meaningful, even when it doesn’t always feel like significant work. I never ask myself the question, Why am I even doing this? What’s it all for? That’s not because I think my writing will outlive me. It certainly will not. It’s because having four children who don’t know and don’t care that Dad’s an author (not to mention a wife whose stated marital purpose in life is to be unimpressed with me) puts my work into perspective. The souls for whom I am responsible are not only worth more than the straw that is my writing and teaching. They are a reminder that my job, though a vocation, is also, well, a job. More than a job, perhaps, but not less than one. It pays the bills and puts food on the table. That’s a worthy thing in its own right. And given that I do what I love with a flexible schedule that more than pays the bills, the truth is that I’ve got it made in the shade, professionally speaking. I’m employed, in a time of precarity, anxiety, uncertainty, and fear. What’s not to be grateful for?

Furthermore, as I briefly alluded to in the second post, having a family does important motivational and boundary-setting work, if you’ll let it. I don’t choose not to bring work home with me. That choice is made for me by my children. I can’t be working while driving them to basketball or picking them up from school or attending church or cooking dinner or singing them to sleep. And when they’re finally in bed, should I be a good husband and spend time with my wife, or each and every night march to my office to get a few more hours in? The question answers itself.

Here’s the irony, if it counts as one. Having fewer hours in the office and stronger boundaries between work and home—having, in a word, both more and more fixed limits on one’s time—can have the unexpected effect of supercharging what work time you have. Because I know I have to accomplish X, Y, and Z in only three or twelve or twenty hours, I don’t have a choice (there’s that freedom-in-unchosen-commitments theme again): I’m just going to have to get it done in the time I have, because once I clock out, the workday is over, finished or not. What such expectations within limits produce, at any rate in me, is a singularity of vision that crowds out all the usual distractions and detours and time-sucking routes of avoiding work. No Slack, no Twitter, no Facebook, no Instagram, no Gmail, no Messages, no WhatsApp, no nothing. Turn Freedom on or the internet off; kill your inbox or set your phone in another room. Whatever it takes, read the book or write the essay or fix the draft or review the submission or complete the grant or prepare the lesson or grade the papers. Just do it. The only time you have is now. Take advantage of it.

My anecdotal experience in doctoral studies confirms this dynamic. Especially when ABD, my single friends—some of them, I should say, some of the time—had many a day like the following: sleep in (that is, relative to my 6:07am baby-crying human alarm), check email and social media, drag themselves to a coffee shop, work for an hour or two, meet a friend for a late lunch, work a little more, grab drinks at a bar, then work into the wee hours of the night. Their self-report would then describe such an experience as “working all day”—not without some self-awareness, but all the while underwritten by a mixture of disappointment, frustration, and resentment at the lack of some objective structure or set of involuntary strictures organizing their time.

By comparison I often had exactly four total daytime hours in which to get the same amount of work done (sharing, as I did, childcare with my wife; stipends rarely stretch so far as to cover daycare or nannies). And so I did the work in the time I had. I didn’t have another choice. Would I have been as efficient had I been in their shoes? No way. It was the inflexible limits placed on my time that forced my hand. And I’m grateful they did.

The same dynamic obtains beyond the PhD, if you’re fortunate enough to have a tenure-track gig. Limits aren’t the enemy. They’re the secret sauce of happiness. Once accepted, or even befriended, they might just help you publish.

Even while teaching a 4/4.


Foundation

Later this month the television adaptation of Isaac Asimov’s Foundation will premiere on Apple TV+. I had been planning on writing something about it, then doubled down on that plan when I read a piece resorting to that laziest but most common of critical terms of approval these days: the R-word. You know what it is. “Relevant.” As in, and I quote, Asimov’s story is “deeply relevant” and represents “something that feels more relevant than ever these days.” Foundation may be relevant, but if it is, it’s not because Asimov has something useful to say about our lives. Nor is it because Asimov offers us a critique of the late decadent phase of the American imperium. It’s because Asimov’s text begs to be read against itself, as an unconscious window on the late modern technocratic mind that believes itself to be the solution to decadence, when it is actually its principal symptom.

I have, or rather had, a lot more to say about that. But then Alan Jacobs beat me to the punch. He notes how, in Asimov’s trilogy and Arthur C. Clarke’s Childhood’s End, all written between 1942 and 1952, each author is “deeply invested in thinking about the ways old political orders give way to self-proclaimed Utopias; and both, also, see that the technocratic Utopia—as distinguished, I think, from the more traditional Utopias of authoritarian and totalitarian states—is a new thing in the world.”

Let me add one thing, which concerns the protagonist, Hari Seldon, and his Foundation scheme that sets the plot going. Not only is Seldon a pure projection of Asimov, or Asimov as he imagines himself and his ilk to be. The so-called science that Seldon has cracked is the science of predicting the future based on the past with perfect exactitude. And it’s the cranks who run the Empire who are fools not to believe his probabilistic calculations. I remember, when I first read the initial novel in the trilogy, thinking that Asimov was setting Seldon up to be a fool himself. I mean, imagine thinking “psychohistory” to be a legitimate empirical-mathematical enterprise in which the custodians of trillions of living souls should place their trust! But I was the foolish one. Naturally, that is exactly what Seldon-Asimov thinks world leaders should do: believe the science—in this case, the pseudo-science of technocrats tinkering with their algorithmic prediction machines. Knowing the unlikelihood of being believed, Seldon-Asimov sets in motion a series of events leading to the hoped-for future founding of a new intergalactic civilization with far less bloodshed and destruction than otherwise would have occurred (in the absence, that is, of his genius). His well-timed appearances and messages in the centuries to come are a running deus ex machina, only the god in the machine is Hermes, bringing one more message, just in the nick of time, from the omniscient Seldon-Asimov (speaking from the past). Not to put too fine a point on it, whereas the Foundation he establishes is meant to contain all the knowledge humans have amassed across the millennia, the cornerstone of the Foundation is—you guessed it—psychohistory. (It doesn’t help that whenever the novel specifies just what the Foundation is preserving, it’s always, or almost always, the deliverances of the technical and empirical sciences, and never, or rarely, the treasures of the humanistic arts. You can be sure the gadgets of Steve Jobs reside safely in some Foundation vault; less so the works of Bach or Rembrandt.)

All that said, the book is worth a read, not least for its influence on Frank Herbert and George Lucas. And it’s still a fun, if not especially well written, yarn. And I might check out the show; it would be nice if the showrunners signaled their having grasped the unintended subtext of the story instead of buying into its ostensible prescience and relevance to the year of our Lord 2021. But I’m not holding my breath.

Read More
Brad East Brad East

Alan Jacobs on avoiding unpaid labor for surveillance capitalism

...it’s important to understand that a lot of what we call leisure now is actually not leisure. It is unpaid labor on behalf of what Shoshana Zuboff calls surveillance capitalism. That is, we are all working for Google, we are working for Facebook. I would like to spread a model of reading that is genuinely a leisure activity and that escapes the loop of being unpaid labor for surveillance capitalism. That will start small, and maybe it will stay small, but my hope is that it would be bigger. Even people who have very hard, demanding lives can spend an enormous amount of time in this activity that we have been taught to feel is leisure, but is, as I have said, unpaid labor.

It’s interesting to see how things come into fashion. Think about how in the last few months we have decided that nothing is more important than the Post Office—that the Post Office is the greatest thing in the world. One of those bandwagons that I’ve been on for my entire life is libraries. I think libraries are just amazing. I grew up in a lower-middle-class, working-class family. My father was in prison almost all of my childhood, and my mother worked long hours to try to keep us afloat. My grandmother was the one who took care of my sister and me, and we didn’t have much money for books. There were a lot of books in the house, but that was because members of my family would go to used bookshops and get these, like, ten-cent used paperbacks that had been read ten times and had coffee spilled on them and so forth. So we spent massive amounts of time at the library. Once a week my grandmother and I would go to the library and come out with a great big bag full of books and we would just read relentlessly.

I’m the only person in my family who went beyond high school—in fact I’m the only person in my family to graduate from high school. Yet we read all the time. We always had a TV on in our house, but nobody ever watched it. A characteristic thing in my family would be me, my mother, my grandmother, my sister when she got old enough, sitting in a room with the TV on, the sound turned down, and we were all just sitting there reading books. That was everything to me. It opened the door for everything that I’ve done in the rest of my life. I owe so much to the city of Birmingham, Alabama’s, public libraries. I think when I was doing that I was simultaneously engaging in genuine leisure while also being formed as a thinker and as someone who could kind of step out of the flow of the moment and acquire perspective and tranquility. So I believe that I’m recommending something that is widely available, even to people who have very little money and very few resources, and I know that from my own childhood.

—Alan Jacobs, "The Fare Forward Interview with Alan Jacobs," Fare Forward (30 Dec 2020)

Read More
Brad East Brad East

New piece published in Mere Orthodoxy: "Befriending Books: On Reading and Thinking with Alan Jacobs and Zena Hitz"

I'm in Mere Orthodoxy with a long review-essay of two new books: Breaking Bread with the Dead: A Reader's Guide to a More Tranquil Mind by Alan Jacobs and Lost in Thought: The Hidden Pleasures of an Intellectual Life by Zena Hitz. Here's a section from the opening:

If the quality of one’s thinking depends upon the quality of those one thinks with, the truth is that few of us have the ability to secure membership in a community of brilliant and wise, like-hearted but independent thinkers. Search for one as much as we like, we are bound to be frustrated. Moreover, recourse to the internet—one commonly proffered solution—is far more likely to exacerbate than to alleviate the problem: we may find like-minded souls, yes, but down that rabbit hole lies danger on every side. Far from nurturing studiositas, algorithms redirect the energies of the intellect into every manner of curiositas; far from preparing a multicourse feast, our digital masters function rather like Elliott in E.T., drawing us on with an endless trail of colorful candies. Underfed and unsatisfied, our minds continue to follow the path, munching on nothing, world without end.

Is there an alternative?

Jacobs believes there is. For the community of potential collaborators in thinking is not limited to the living, much less those relatively few living folks who surround each of us. It includes the dead. And how do we commune with the dead? Through books. A library is a kind of mausoleum: it houses the dead in the tombs of their words. We break bread with them, in Auden’s phrase, when we read them. Reading them, we find ourselves inducted into the great conversation that spans every civilization and culture from time immemorial on to the present and into the future. We encounter others who are really and truly other than us.

Go read the rest here.

Read More
Brad East Brad East

Are there good reasons to stay on Twitter?

Earlier this week Nolan Lawson wrote a brief post exhorting people to get off Twitter. He opens it by saying, "Stop complaining about Twitter on Twitter. Deny them your attention, your time, and your data. Get off of Twitter. The more time you spend on Twitter, the more money you make for Twitter. Get off of Twitter."

Alan Jacobs picked up on this post and wrote in support: "The decision to be on Twitter (or Facebook, etc.) is not simply a personal choice. It has run-on effects for you but also for others. When you use the big social media platforms you contribute to their power and influence, and you deplete the energy and value of the open web. You make things worse for everyone. I truly believe that. Which is why I’m so obnoxiously repetitive on this point."

I've written extensively about my own habits of technology and internet discipline. I deleted my Facebook account. I don't have any social media apps on my iPhone; nor do I even have access to email on there. I use it for calls, texts, podcasts, pictures of my kids (no iCloud!), directions, the weather, and Instapaper. I use Freedom to eliminate my access to the internet, on either my phone or my laptop, for 3-4 hours at a time, two to three times a day. I don't read articles or reply to emails until lunch time, then hold off until end of (work) day or end of (actual) day—i.e., after the kids go to bed. I'm not on Instagram or Snapchat or any of the new social media start-ups.

So why am I still on Twitter? I'm primed to agree with Lawson and Jacobs, after all. And I certainly do agree, to a large extent: Twitter is a fetid swamp of nightmarish human interaction; a digital slot machine with little upside and all downside. I have no doubt that 90% of people on Twitter need to get off entirely, and 100% of people on Twitter should use it 90% less than they do. Twitter warps the mind (journalism's degradation owes a great deal to @Jack); it is unhealthy for the brain and damaging for the soul. No one who deleted their Twitter account would become a less well-rounded, mentally and emotionally and spiritually fulfilled person.

So, again: Why am I still on Twitter? Are there any good reasons to stay?

For me, the answer is yes. The truth is that for the last 3 years (the main years of my really using it) my time on Twitter has been almost uniformly positive, and there have been numerous concrete benefits. At least for now, it's still worth it to me.

How has that happened? Partly I'm sure by dumb luck. Partly by already having instituted fairly rigorous habits of discipline (it's hard to fall into the infinite scroll if the scroll is inaccessible from your handheld device! And the same goes for instant posting, or posting pictures directly from my phone, which I can't do, or for getting into flame wars, or for getting notifications on my home screen, which I don't—since, again, it's not on my phone, and my phone is always (always!) on Do Not Disturb and Silent and, if I'm in the office, on Airplane Mode; you get it now: the goal is to be uninterrupted and generally unreachable).

Partly it's my intended mode of presence on Twitter: Be myself; don't argue about serious things with strangers; only argue at all if the other person is game, the topic is interesting, and the conversation is pleasant or edifying or fun; always think, "Would my wife or dad or best friend or pastor or dean or the Lord Jesus himself approve of this tweet?" (that does away with a lot of stupidity, meanness, and self-aggrandizement fast). As a rule, I would like for people who "meet" me on Twitter to meet me in person and find the two wholly consonant. Further, I try hard never to "dunk" on anyone. Twitter wants us to be cruel to one another: why give in?

I limit my follows fairly severely: only people I know personally, or read often, or admire, or learn something from, or take joy in following. For as long as I'm on Twitter I would like to keep my follows between 400 and 500 (kept low through annual culling). The moment someone who follows me acts cruelly or becomes a distraction, to myself or others, I immediately mute them (blocks are reserved, for now, for obvious bots). I don't feel compelled to respond to every reply. And I tend to "interface" with Twitter not through THE SCROLL but through about a dozen bookmarked profiles of people, usually writers or fellow academics, who always have interesting things to say or post links worth saving for later. All in all, I try to limit my daily time on Twitter to 10-30 minutes, less on Saturdays and (ordinarily, or aspirationally) zero on Sundays—at least so long as the kids are awake.

So much for my rules. What benefits have resulted from being on Twitter?

First, it appears that I have what can only be called a readership. Even if said readership comprises "only" a few hundred folks (I have just over a thousand followers), that number is greater than zero, which until very recently was the number of my readers not related to me by blood. And until such time (which will be no time) as I have thousands upon tens of thousands of readers—nay, in the millions!—it is rewarding and meaningful to interact with people who take the time to read, support, share, and comment on my work.

(That raises the question: What happens should the time actually come, and I'm sure that it will, when I am bombarded by trolls and the rank wickedness that erupts from the bowels of Twitter Hell for so many people? I will take one of two courses of action. First, I will adopt the policy of not reading my replies, as wise Public People do. But if that's not good enough, that will be the day, the very day, that I quit Twitter for good. And perhaps Lawson and Jacobs both arrived at that point long ago, which launched them off the platform. If so, good for them.)

Additionally, I have made contacts with a host of people across the country (and the world) with whom I share some common interest, not least within the theological academy. Some of these have become, or are fast becoming, genuine friendships. And because we theologians find reasons to gather together each year (AAR/SBL, SCE, CSC, etc.), budding online friendships actually generate in-person meetings and hangouts. Real life facilitated by the internet! Who would've thought?

I have also received multiple writing opportunities simply in virtue of being on Twitter. Those opportunities came directly or indirectly from embedding myself, even if (to my mind) invisibly, in networks of writers, editors, publishers, and the like. (I literally signed a book contract last week based on an email from an editor who found me on Twitter through some writing and tweeting I'd done.) As I've always said, academic epistemology is grounded in gossip, and gossip (of the non-pejorative kind) depends entirely on who you know. The same goes for the world of publishing. And since writers and editors love Twitter—doubtless to their detriment—Twitter's the place to be to "hang around" and "hear" stuff, and eventually be noticed by one or two fine folks, and be welcomed into the conversation. That's happened to me already, in mostly small ways; but they add up.

So that's it, give or take. In a given week, I average 60-90 minutes on Twitter spread across 5-6 days, mostly during lunch or early evening hours, on my laptop, never on my phone, typically checking just a handful of folks' profiles, sending off a tweet or two myself, never battling, never feeding the trolls, saving my time and energy for real life (home, kids, church, friends) and for periods of sustained, undistracted attention at work, whether reading or writing.

Having said that, if I were a betting man, I would hazard a guess that I'll be off Twitter within five years, or that the site will no longer exist in anything like its current form. My time on Twitter is unrepresentative, and probably can't last. But so long as it does, and the benefits remain, I'll "be" there, and I think the reasons I've offered are sufficient to justify the decision.
Read More
Brad East Brad East

The value of keeping up with the news

If you've been paying attention the last two weeks, you know that an ongoing controversy erupted Saturday, January 19, and is still unfolding in one of the many seemingly endless iterations such controversies generate today through social media, op-eds, and the like. Last week, in Alan Jacobs's newsletter, he wrote this:

"On Tuesday morning, January 22, I read a David Brooks column about a confrontation that happened on the National Mall during the March for Life. Until I read that column I had heard nothing about this incident because I do not have a Facebook account, have deleted my Twitter account, don’t watch TV news, and read the news about once a week. If all goes well, I won’t hear anything more about the story. I recommend this set of practices to you all."

This got me thinking about a post Paul Griffiths wrote on his blog years ago, perhaps even a decade ago (would that he had kept that blog up longer!). He reflected on the ideal way of keeping up with the news—and, note well, this was before the rise of Twitter et al. as the driver of minute-by-minute "news" content. He suggested that there is no real good served in knowing what is going on day-to-day, whether that comes through the newspaper or the television. Instead, what one ought to do is slow the arrival of news to oneself so far as possible. His off-the-cuff proposal: subscribe to a handful of monthly or bimonthly publications ranging across the ideological spectrum and, preferably, with a more global focus so as to avoid the parochialism not just of time but of space. Whenever the magazines or journals arrive, you devote a few hours to reading patient, time-cushioned reflection and reporting on the goings-on of the world—99% of which bears on your life not one iota—and then you continue on with your life (since, as should be self-evident to all of us, no one but a few family and friends needs to know what we think about it).

Consider how much saner your life, indeed all of our lives, would be if we did something like Griffiths' proposal. And consider the alternative: "engaging in the discourse," posting on Facebook, tweeting opinions, arguing online. None of it does anything at all except raise blood pressure, foment discord, engender discontent, etc. Activists and advocates of local participatory democracy are fools if they think anything remotely like what we have now serves their goals. If we slowed our news intake, resisted the urge to pontificate, and paid more attention to the persons and needs and tasks before us, the world—as a whole and each of its parts—would be a much better place than it is at this moment.
Read More
Brad East Brad East

Scialabba, Jacobs, and God's existence: where the real problem lies

Alan Jacobs is right about George Scialabba's latest review essay, in this case of John Gray's new book, Seven Types of Atheism, for The New Republic. Scialabba is always great, but his theological instincts fail him here. As Jacobs observes, Scialabba wants to speak up for nonbelievers who wish God—if he does in fact exist—would simply make himself known in some inarguably clear way. But since, apparently, he does not and has not, that in itself is evidence that no such thing as an all-wise, all-good, all-powerful deity exists; or that, if he does, our knowledge of, beliefs about, or relationship with God is a negligible matter, and all will be sorted out in the hereafter.

Jacobs takes Scialabba to task for both the unthinking glibness on display (frivolous speculation about our ancient ancestors; writing contemporary mystics and charismatics out of the picture; etc.) and the more serious inattentiveness to what a truly incontrovertible divine self-revelation would mean. Jacobs uses the work of David Bentley Hart to remind us just what we mean, or rather do not mean, when we use the word "God," and how Scialabba is functionally reverting to a mythical picture of god-as-super-creature who yet inexplicably remains opaque to us here below. Jacobs then (being Jacobs) draws us to a speech of Satan's in Paradise Lost, bringing the existential point home.

Let me piggy-back on Jacobs' critique and suggest an even deeper problem with Scialabba's musings, one I've reflected on before at some length. The problem is twofold.

On the one hand, it cannot be emphasized enough that the kind of skeptical atheism that Scialabba sees himself standing up for here is vanishingly small in human history, both past and present. So far as we can tell, nearly all human beings who have ever lived have taken it for granted that reality is more than the empiricists suggest; that there is some Power or Goodness or Being that transcends the visible and tangible, preceding and encompassing it; that human life, though brief and sometimes terribly burdened, carries more weight and has more depth than those features would suggest on their face, and that it may or will in one way or another outlast its short span on this earth. Even today, the overwhelming majority of people on this globe "believe" in what we in the West call divinity or practice what we in the West call religion. The anxious queries of skeptical atheists, while worth taking seriously at an intellectual and emotional level, could not be less representative of how humanity in general relates to "the God question."

In short, the sort of defeaters Scialabba offers as evidence of God's lack of self-revelation bear little to no relation to the average person's thoughts or experience regarding God's existence. Most people don't need God to write his name on the sun. In a sense, he already has.

Such a response doesn't go very far, though, in addressing Scialabba's true concern. Perhaps most human beings, past and present, are just not philosophically rigorous or serious enough to ask the tough questions that inexorably lead to atheism. Or perhaps it's not "religious belief" in general but the challenge of revelational certainty, i.e., which religion/deity to believe in, that's at issue. Here, then, is what's most deeply wrong with his argument.

The Christian tradition does not teach, nor has it ever taught, that the most important thing to do is believe that God exists, or even that the Christian God exists. Instead, the most important thing is to love this God with all one's heart, soul, mind, and strength. What God wants from you—demands, in fact—is not affirmation of a proposition about himself or mental assent to the facticity of his being. Rather, it is the totality of your being, the absolute and unconditional lifelong allegiance of your very self. What God wants is faithfulness.

And it turns out, according to the painfully consistent testimony of Holy Scripture, that faithfulness is a lot harder than faith. By which I mean: total devotion to God is far more difficult than belief that God exists. As the epistle of James says, the demons believe that God is one—and shudder. Israel at Sinai doesn't lack the belief that YHWH exists; there's evidence aplenty for that: lightning and thunder and a great cloud and the divine voice and the Lord's glory; everything Scialabba wants from God! But what do the Israelites do? They make a golden calf and worship a false god. In doing so, they do not subtract belief in YHWH; they add to that belief "belief" in other gods. Which is to say, they add to worship of the one God the worship of that which is not God.

Our problem, therefore, isn't belief that God exists in the face of a thousand reasonable doubts. Our problem is idolatry. When the one true God comes near to human beings, when they hear his voice and see his face, they know it to be true—and they turn away. They know God—and sin. They believe "in" God—and disobey him. They lack doubt—and hurt others.

For Christians, this problem is illustrated most of all in the Gospels. Time and again the apostles see with their own eyes the identity and deeds of the incarnate Son of God, and time and again they misunderstand, mishear, misspeak, fall away, to the point of deserting him in his hour of need and even denying ever knowing him.

Scialabba wants God to make it impossible to disbelieve in his existence. But even if God were to do that, it wouldn't change the fundamental problem—our sinful, wicked hearts, prone to evil and violence from birth and a veritable factory of idols—one bit. Or rather, what we would need is the kind of belief, the sort of knowledge, that went to the root of that problem, transforming us from the inside out. Making true worship possible; ridding us of idolatry; supplying us the power to do what we could never do for ourselves; making faithfulness a reality, that we might finally and wholeheartedly love God and love our neighbors as ourselves.

Christians believe God has done just this for the world in and through Christ. No dispositive evidence will persuade a Scialabba that this is the case. But the gospel isn't meant to answer such a request. Contained within the solution it offers is an entirely different diagnosis of our situation and thus of our greatest need. If the gospel and the faith it proclaims are to be rejected, those are the terms on which to do so.
Read More
Brad East Brad East

Marilynne Robinson should know better

Like most seminary graduates and all theologians, I have long loved Marilynne Robinson. I read Gilead in the summer of 2005 and The Death of Adam shortly thereafter, and the deep affection created in those encounters has never left me. I have heard her speak on more than one occasion—at a church of Christ university and at Yale Divinity School, my two worlds colliding—and she was nothing but gracious, eloquent, and compelling in her anointed role as liberal Christian public intellectual. God give us more Marilynne Robinsons.

The beauty of her writing, her thought, and her public presence makes all the more painful the increasingly evident internal contradiction at the heart of her work. Others, with greater depth and insight than I am capable of offering here, have drawn attention to a related set of issues; see especially the reviews of her most recent volume of essays by Micah Meadowcroft, James K. A. Smith, B. D. McClay, Doug Sikkema, and Wesley Hill (alongside the earlier critique of Alan Jacobs, and the brief reflection yesterday by Bryan McGraw). The specific issue I am thinking of is not her inattention to original sin or total depravity, or the way in which such a lacuna is a serious misrepresentation of the Reformed tradition; or the way in which her politics is more or less a down-the-line box-checking of the Democratic National Committee, plus God and minus chronological snobbery; or how her public friendship with Barack Obama veers weirdly toward the obeisant—as if the former President did not order hundreds of drone strikes or deport millions of illegal immigrants or grant legal immunity to members of the CIA who engaged in torture under the previous administration or...

These and other matters are all worthy of interrogation and critique. What I'm interested in is her fundamental lack of charity or empathy toward people with whom she disagrees, namely red-state Christians, particularly those who voted for or continue to support Donald Trump.

Robinson is a novelist. The novelist's modus operandi is to create and embody the lives of human beings from the inside, human beings who are by definition not the author, whose beliefs and deeds are neither necessarily good nor necessarily defensible nor even necessarily consistent with one another. Moreover, Robinson has taken this deep attention to difference in the irreducible diversity of human affairs and applied it to history, recommending to others that they resist the thought-terminating prejudices of common cliché and instead offer to historically disreputable groups and personages the same benefit of the doubt they would hope to be given themselves. Moses and Paul, Calvin and Cromwell, the Lollards and the Puritans: all are set within their original social and cultural context, judged by standards available at the time, compared to peers rather than progeny, permitted above all to be complex, conscientious, fallible, fallen, gloriously human members of one and the same species as the rest of us.

Comes the question: Why not use this same hermeneutic on the Repugnant Cultural Other that is the Trump-voting red-state would-be evangelical Christian? For, to use Alan Jacobs's term, that is what such a person is for Robinson. Doubtless such a person, and such a group, deserve robust criticism. They need not be treated with either paternalism or condescension. Say what you think, and give reasons why those with whom you disagree are wrong.

But Robinson does not take even one proverbial second to imagine the motivations or emotions or experiences of such people, or to consider how they might have been formed, or to what extent, if at all, they might be legitimate, whatever their popular expression in national politics or cable news. She does not step into their shoes. She does not even appear to imagine that they have shoes to step in. They are shoeless ciphers, a caricature without flesh and blood. And so she can, without irony, slander them in an essay on slander and Fox News; or question their Christian identity while challenging their recourse to drawing lines around orthodoxy; or treat them as a monolithic bloc in paeans to individualism; or castigate them as a group without showing knowledge of any literate representation of their views, while wringing her hands about reading primary texts and avoiding capitulation to popular prejudice. Is there any popular prejudice more culturally acceptable in 2018 than dismissing all Republicans south of the Mason-Dixon line as bigoted dummies feigning faith as cover for benighted tribalism?

I am neither an evangelical nor a Republican nor a Trump voter. But I've lived in Texas, Georgia, and Connecticut, in major urban areas as well as "Trump country," with extended family spread out across the South and friends distributed along the Acela corridor. Robinson embodies the smug disdain common to all right-thinking people for folks I've counted and continue to count as neighbors, elders, family members, and fellow believers. More than anyone, Robinson should know that disagreement and critique, however fierce, are not obstacles to love—to the considerate love that generates attention, subtlety, and fellow feeling, without thereby obviating difference or conflict.

She should know better. She owes us more.
Read More
Brad East Brad East

New essay published at the LA Review of Books: "Public Theology in Retreat"

I've got a new essay available over at the Los Angeles Review of Books called "Public Theology in Retreat." It's ostensibly a review essay of three books published by David Bentley Hart in the last year, but I use that occasion to ask about the role of public theology in contemporary U.S. intellectual culture, using Hart as a sort of Trojan horse. Alan Jacobs's essay in Harper's last year serves as a framing device, and I look at Hart as an exception that proves the rule—even while portraying Hart's thought to a largely non-theological audience as a kind of specimen, to intrigue and possibly attract unfamiliar and potentially hostile minds. We live in perilous and fickle times, after all. Why not give theology a try? There have been stranger bedfellows.

My thanks to the editors at LARB for publishing a work of straightforward theological exposition like this; I know it's not their usual cup of tea. I confess that I have steeled myself for more than one failure to read the actual argument of the piece, but so it goes. Mostly I'm excited to see what charitable readers make of it, from whatever perspective. So check it out and let me know what you think.
Read More