Book Reviews Archive – Law & Liberty
https://lawliberty.org/book-reviews/

Getting Right with Buckley
https://lawliberty.org/book-review/getting-right-with-buckley/
Thu, 19 Jun 2025

The post Getting Right with Buckley appeared first on Law & Liberty.

Conservatives have been waiting for Sam Tanenhaus’s official biography of William F. Buckley Jr. for too long. Twenty-seven years, to be precise. The last line of text in the acknowledgments on page 868 quotes Buckley shortly before his death, “I know I won’t see my biography.” So much procrastination and delay on the part of the author indicate a divided mind and an inability to focus on the project. Things change over 27 years, including authors, potential audiences, and even the public memory of a figure like Buckley. This disappointed reader must ask: Does the biography give us Buckley, or rather, a disconnected series of reflections about who the American liberal mind needs Buckley to be?

Buckley: The Life and the Revolution That Changed America is a work of immense research and thorough study, replete with archival work, oral interviews, and an exhaustive examination of the subject’s life and career at every turn. Tanenhaus had untrammeled access to Buckley’s personal papers, calendar, and other materials. He presents a picture of the man enmeshed in a complex web of cultural, familial, educational, social, political, and interpersonal contexts. In certain respects, the reader now possesses more of Buckley than was perhaps wanted, and more than illuminates what Buckley believed was his life’s mission. Almost no stone is left unturned in the lives of Buckley’s parents and siblings either.

One of the problems is that the book is not proportional to Buckley’s life. In a tome of almost 1,000 pages, fewer than 100 are devoted to the momentous period in Buckley’s life and the conservative movement that ran from Reagan’s election to the presidency in 1980 to Buckley’s death in 2008. The one thing most needful is lacking: an appreciation and evaluation of the precise contours of the revolution that the author identifies in the subtitle as Buckley’s chief contribution to American life. What was this revolution? On that crucial topic, Tanenhaus seems reluctant to offer an opinion.

Since Tanenhaus took decades to write the book, one must take note of the significant shift in the American Left over that period. Did this change in left-liberalism influence the biography that Tanenhaus ended up writing? The Left in America has moved from being a relatively benign force, routed by conservative victories in the 1980s and 1990s and therefore content to manage the welfare state, keep tax rates reasonable, and ensure stable economic growth, to being a revolutionary force incapable of affirming American citizenship or race-neutral policies, or even of articulating the biological differences between men and women. Race and plasticized gender became its calling cards, weapons wielded to fashion an America in the image of egalitarian humanism. Political, educational, and cultural institutions, those most crucial to constitutional democracy, have been locked into racial stories, struggle sessions, or patriarchal oppression plays. Because the book is rooted in race, family, culture, and historical sweep rather than in the distinct presentation of Buckley’s ideas, the question is not easy to dismiss: Did a similar movement permeate the mind of Sam Tanenhaus?

Conservative readers loved Tanenhaus’s magnificent 1997 biography of Whittaker Chambers, which became a finalist for the National Book Award and the Pulitzer Prize. A former editor of The New York Times Book Review and a liberal in good standing with the ideology’s professional accreditation unit, Tanenhaus reevaluated, while researching that book, many of his convictions about anti-communism and the somewhat dismissive or complacent attitudes that liberals had long held regarding communist infiltration of the federal government during the New Deal and World War II. Through the biography’s thorough examination of Chambers’s sources and files, he came to realize that Chambers had told the truth about the espionage accusations against Alger Hiss, and much more besides. The strength of this biography, plus the favorable opinion of Buckley’s son, Christopher, convinced Buckley that Tanenhaus should write his biography.

Tanenhaus avoids the evidence and the seriousness of thought and purpose that a man like Buckley put into a lifetime of forging the American conservative movement.

In 2009, Tanenhaus authored The Death of Conservatism, a post-George W. Bush account of how movement conservatism was spent. The book’s thesis was that conservatives had held power and brought the country to the brink of domestic and international ruin. (He also hubristically argued that conservatism should serve only as a modest corrective to ascendant liberalism, not as a real alternative.) It now fell to the Obama presidency to rearticulate America’s meaning and purpose in a second New Deal policy matrix. Ironically, six months into Obama’s term, the Tea Party emerged, and the Obama presidency, following a pyrrhic victory on healthcare, was reduced to administrative state machinations. Tanenhaus had fallen indiscriminately in love with a moment in time and was badly exposed when reality intruded.

A few years later, Tanenhaus penned an essay asserting that American conservatism was one long footnote to the concurrent majority thesis of the former vice president and political theorist John Calhoun. In his works of political theory, Calhoun conceived of additional limits on the Constitution’s requirements for passing legislation in order to further protect certain minority interests, including those of slaveowners. In arguing that conservatism should be understood only in a Calhounite context, Tanenhaus was saying that the effectual truth of conservatism in America is that it is a heretical, ahistorical, and cloaked attempt to grab rights and liberties for various subgroups, such as greedy corporations, racists, and religious fanatics, by exploiting supermajority requirements in the federal government’s constitutional architecture. Tanenhaus’s message: the country would be better off without it. And he was already, not so subtly, playing the race card.

These points are worth mentioning, not merely to criticize Tanenhaus, but because they provide the background for understanding a book about a man of ideas, politics, and high culture who is reduced in many respects to the mere product of family and historical circumstances. In short, Tanenhaus seems fundamentally incapable of grasping that conservatives believe that ideas have consequences, that they take shape through the interaction of men’s minds with reality, nature, and God, and that we can articulate a philosophical foundation for American constitutionalism and the morality that underpins it. Tanenhaus, as a good American liberal, bows, however “moderately,” before the altars of class, race, and gender. These become the deft arrows he repeatedly shoots into Buckley’s oeuvre, rendering it a corpse instead of a foundational work capable of intelligently guiding future efforts.

In this biography, Buckley’s revolution remains undefined, not because of Buckley’s inability to limn its definition but because of Tanenhaus’s intellectual limits. The book opens oddly with an epigraph from Marcel Proust’s Swann’s Way:

Facts do not penetrate the sphere in which our beliefs are cherished. They did not engender those beliefs and are powerless to destroy them. They can inflict on them continual blows of contradiction and disproof without weakening them. And an avalanche of miseries and maladies succeeding one another without interruption in the life of a family will not make it doubt either the benevolence of God or the competence of its physician.

From that opening encounter, the discerning reader knows that this biography is not just about Buckley but also about Tanenhaus and his intellectual prejudices. And what of the line about the family that never doubts the benevolence of God or the competence of its physician, refusing to seek any other source of wisdom? Did Buckley not let facts intrude upon his thinking and action? Should we understand Buckley only within the context of his family? Tanenhaus’s words diminish Buckley—the man who lived a life arguing, writing, speaking, interviewing, and attempting to rebuild the principles of faith, freedom, patriotism, and a commitment to the highest ideals of Western civilization—to a figure ultimately lost in impenetrable prejudices and belief structures that remained impervious to reason and logic.

Praised and read by Ronald Reagan, Margaret Thatcher, Barry Goldwater, and Henry Kissinger, among other leading statesmen of his day, the man who founded National Review in 1955, whose thousands of syndicated columns and essays, numerous books and novels, and decades of his famous PBS program Firing Line elevated the status of American discourse and made conservative thought a non-negotiable force in American life, is ultimately sealed within his time, incapable of being a living presence to future thinkers and political actors. Tanenhaus does praise Buckley at the end: at his death, many gathered “in affectionate remembrance of one it seemed natural to speak of as a great man.” He had been “the country’s greatest conservative” and “left a vacuum no one since has been able to fill.” But this is all too little and too late.

The book is organized mostly along chronological lines. It begins with William F. Buckley Sr. and his ill-fated attempt to build an oil empire in early twentieth-century Mexico. He succeeded, only to have the revolutionary government confiscate his business, leaving him with almost nothing. Determined to strike again, he successfully did—this time in Venezuela. The family was Southern and devotedly Catholic, Buckley Sr. hailing from the border region of Texas and his wife, Aloise Steiner, from a New Orleans family. They would have ten children, with Bill Buckley the sixth. They split time between an imposing estate, “Great Elm,” in Sharon, Connecticut, and “Kamschatka,” a restored plantation in Camden, South Carolina.

Tanenhaus repeatedly reminds the reader of the father’s antisemitism and his less-than-enlightened attitudes about race and equality for blacks. He does note that the Buckley family’s attitude was also characterized by Christian charity, although this charity did not extend to advocating for civil rights and full legal equality. To his credit, Tanenhaus quotes Edward Allen, a former black servant in the Buckleys’ South Carolina home, who observed the decency and warmth of the family, including Buckley Sr., towards its mostly black servants. Allen recalled that a white man once attempted, right in front of his father, to take his father’s job as a groundskeeper at Kamschatka. Buckley Sr. responded forcefully, “I wouldn’t hire ten of you,” and told the man, “Get off my place.” Years later, after the Buckleys had sold Kamschatka, Allen, by then in his eighties, said that he remembered the Buckleys well. Sometimes, he said, he strolled past their home, and “every time, I look up to the heavens and thank God for the Buckleys.”

Seemingly, Tanenhaus’s overall purpose in writing this part is to situate Buckley, National Review, and conservatism within a web of Southern racism. He spends entire chapters detailing race and the Buckleys. He reveals, for instance, that Buckley Sr. encountered and approved of the earliest manifestation of the infamous “Southern Strategy” (at the time, a Republican electoral strategy meant to increase votes among the white electorate by stirring up racial anxiety about social change), one the family knew through its connection to Strom Thurmond, then South Carolina’s governor, who ran on the “Dixiecrat” ticket in 1948. Quoting Buckley’s father as saying in 1949, “I can’t exaggerate the interest in the East in a combination between dissident Southern Democrats and the Republicans,” Tanenhaus proposes that much of the Southern opposition to the New Deal and its economic program was rooted in racism and in the fear that the regulatory power of the government would lead to integrated workplaces and greater economic equality for blacks. In short, the through line for Tanenhaus, as for most liberal observers, is that the Southern Strategy and its incipient racism drove conservative politics down to the present day, helping create Goldwater, Wallace, Nixon, and Reagan, with Buckley riding the wave.

Tanenhaus notes that Buckley made racialist comments about blacks and intelligence at his prep school, the Millbrook School, in the 1930s. As a Yale undergraduate in the late 1940s, he apparently opposed making a dance interracial, even while supporting the event itself. Tanenhaus reminds us that National Review originally did not support civil rights legislation, although its stance would change as later thinkers, such as Harry Jaffa and Charles Kesler, among others, influenced the publication. Willmoore Kendall, Buckley’s academic mentor at Yale and a legendary early editor of National Review, also reversed his position. Tanenhaus does mention that Buckley later regretted and admitted fault for opposing the Civil Rights Act, but he cannot resist the temptation to add, as his own personal view, that some of Buckley’s famous editorials on race and the South, most prominently “Why the South Must Prevail,” continue to make it difficult for many to take conservatism as a set of ideas not informed by racial animus. Really?

Tanenhaus goes too far here and fails to recognize how Buckley and the larger conservative movement he led became a salutary force on this issue, right up to the Supreme Court’s 2023 decision in Students for Fair Admissions v. Harvard, in which the Court ruled that race-based affirmative action in admissions violated the Constitution and the Civil Rights Act. Liberals on the Court, in the academy, and from their various cultural perches denounced the decision. Dividing people up by race has become their core ideological position. Conservatives repaired to the Declaration to recover truthful thinking on race, while liberals further uncovered their progressive ideology rooted in group rights, collectivism, and the rejection of the American Founding’s constitutional principles. The results speak for themselves. Tanenhaus can’t bring himself to admit this. He approaches race and racism in a hopelessly Manichean way.

Buckley apologized and changed his mind about the issue. Conservatism itself would become the only significant force in American politics advocating for a color-blind constitution. But it’s not enough for Tanenhaus. The undertone of the book is that Buckleyite conservatism and racial discrimination are as thick as the morning dew on the South Carolina grass.

Another driving impetus of Buckley’s conservatism, Tanenhaus opines, was the “America First” antiwar movement led by Charles Lindbergh. The Buckley family rapturously followed Lindbergh, until Pearl Harbor. They were opposed to America entering the war and assisting the British. They believed that involvement in Europe’s war would lead to grave ills at home. Buckley followed his father’s lead, of course, and he was in prep school for most of this period. He later served as an officer in the Army, but never saw combat or left the United States. Tanenhaus’s rooting of Buckley’s conservatism in the America First movement seems a strange choice, completely belied by a deeper consideration of the ideas and the actions Buckley took after World War II.

While Buckley and his entire family were part of the Lindbergh movement, so were many others in America. It was not an exotic position. That tune changed dramatically for everyone when Japan attacked Pearl Harbor. Moreover, far from isolationist politics, what Buckley expressed after the war, when the Soviet Union rose in opposition to the US, was a belief that America must defeat its communist foe. He believed in the rollback of communism globally. Tanenhaus documents at length Buckley’s defense of the principles behind Senator Joseph McCarthy’s anticommunist pressure campaign, a defense set out in a 1954 book, McCarthy and His Enemies, jointly authored with his close lifelong friend and brother-in-law, Brent Bozell Jr. (married to Buckley’s sister, Patricia). Both Buckley and Bozell were supportive more of the galvanizing anticommunist thrust of McCarthy’s program than of his tactics and bruising personality.

Does the biography give us Buckley, or rather, a disconnected series of reflections about who the American liberal mind needs Buckley to be?

There seems to be little here linking pre-World War II America First-ism with the postwar conservative movement that Buckley brought into being. Buckley closely hewed to the counsel of former New York University professor and famous public intellectual James Burnham, who was National Review’s chief foreign policy theorist. Both favored “psyop” campaigns against communists and domestic sympathizers to create the right images in people’s minds of what they were fighting against during the Cold War. Opposition to communism was one piece of glue uniting figures as diverse as Milton Friedman, Russell Kirk, and Frank Meyer, with Buckley the winsome leader of it all.

What we hear even less of in the book is Buckley’s opposition to the real revolution within the nation: Franklin D. Roosevelt’s election in 1932 and his dramatic expansion of government, dismissal of the separation of powers, blurring of the lines between government and business, and creation of a welfare state in America. Opposition to the principles undergirding this progressive constitutional and economic project was the other main source of unity for the disparate band of libertarians, anticommunists, classical liberals, and religious and cultural conservatives that Buckley assembled in the pages of National Review. This practical fusion of thinkers, forged in anticommunist fervor and the need to roll back domestic power in the federal government, is crucial to understanding Buckley’s legacy. Its reason for existing lay not in racism or America First-ism, but in a collection of ideas, principles, and actions that loosely cohered intellectually and were forged together in the fires of national political combat.

Tanenhaus faults Buckley repeatedly for failing to write a theoretical conservative treatise that could stand the test of time, and seems to use this failure to relieve himself of any serious appraisal of the principles at stake in the conservative revolution. Buckley did attempt one, writing a few chapters of a book that would have been titled The Revolt Against the Masses, but theory wasn’t his comparative advantage. As most readers know, the conservative movement has never really been at a loss for theorists. Their names and books are well known and remain worthy of study and reflection. If we are in doubt about Buckley’s principles, perhaps this statement from Up from Liberalism (1959) offers guidance: “freedom, individuality, the sense of community, the sanctity of the family, the supremacy of conscience,” and “the spiritual view of life.” These pillars come into full view when “political power is decentralized.”

The liberal mind can easily relativize and dismiss the quote above, but for Buckley, each term is thick with theological, philosophical, anthropological, and constitutional content. In various ways and with different emphases, Buckley would draw on these pillars throughout his career. Conservatism is not an ideology or a rigid interlocking program of action with coin-in-the-slot answers, but rather the attempt to preserve the Good, that which is bigger than man’s will or the momentary interests of a group or generation. “Each age must find its own language to represent an eternal meaning,” Whittaker Chambers poignantly stated. This definition, offered by one of Buckley’s closest friends and confidants, and one that Buckley admired, represents conservatism at its best. To pretend, as Tanenhaus does, that “in his time, as in our own, no one really could say what American conservatism was or ought to be. Buckley himself repeatedly tried to and at last gave up” is to avoid the evidence and the seriousness of thought and purpose that a man like Buckley put into a lifetime of forging the American conservative movement. It is both wrong and ungenerous.

The supremacy of conscience and the spiritual view of life were the most important principles of Buckley’s life, emanating as they did from his profound Roman Catholicism. Tanenhaus never really considers Buckley’s faith, a faith that directed his life in full. What kind of biography misses the essence of its subject? Perhaps Buckley’s attempt to integrate his life fully with the teachings of Christianity is a bridge too far for a secular liberal to grasp, much less analyze. Buckley even wrote a book about his Catholic faith, Nearer, My God, whose contents are barely mentioned, though that faith traces back to his familial origins. Strange, then, that Tanenhaus wouldn’t evaluate it.

Sharper bones must be picked here at the end. In a book meant to be about William Buckley’s conservative revolution, Tanenhaus inexplicably repeats the rumor that William Rusher, an early publisher of National Review, was gay, and avers that peers and enemies alike considered Buckley to exhibit gay attitudes and tendencies despite his opposition to homosexuality. Not a scintilla of hard evidence is produced to support these insinuations. This calumny seems out of place and indicates a lack of sophistication in writing about a man and a family who were equally at ease in New York’s high society and in the smoke-filled rooms of political discussion.

This book will come to be seen by many as the final word on the life of William F. Buckley Jr. It should not. In the end, it is dishonest, ungenerous, and unworthy of its subject.

Steering Right
https://lawliberty.org/book-review/steering-right/
Thu, 19 Jun 2025

The post Steering Right appeared first on Law & Liberty.

The art of biography from Plutarch onwards shows how character is destiny. And superb books in the genre show how that character was shaped by upbringing and environment. In this respect, Sam Tanenhaus’s Buckley: The Life and the Revolution That Changed America is magnificent. Tanenhaus shows in detail “how everything Buckley learned and everything he became began at home.”

A middle child in a pack of ten, he had to become a performer from the start simply to be heard over his siblings. In such an articulate and rambunctious family, the young Buckley cultivated his innate talents for listening and then responding with witty repartee. From a father who was a wildcatter, as often on the cusp of bankruptcy as of great wealth, he inherited a risk-taking, almost swashbuckling, persona. Even his famous transatlantic accent was not a later-life affectation but a holdover from his formative years at a British boarding school, one of the many stops in a meandering journey of early learning.

And most of all, he grew up a cradle conservative. While his family’s principal residence was in Connecticut, his parents were emphatically not Yankees. His father was the son of a Texas sheriff, and his mother the daughter of a Louisiana cotton broker. His father hated the New Deal, and the Buckley children competed to improve on their father’s denunciations. Thus, when Buckley arrived at Yale, he had the preparation and confidence to astonish his classmates by making powerful arguments against his liberal professors on politically and economically contentious topics. But despite his verbal facility, Buckley did not become a scholar. He absorbed ideas quickly in conversation but rarely pursued their depths through sustained study. What made him the biggest man on campus was the brio of his chairmanship of the Yale Daily News, not the originality of his academic contributions.

This background prepared him for what he became—the greatest controversialist in the nation and the broker of the most important political movement of his time, transforming conservatism from a moribund and reactionary philosophy into an effective ideology of governance. Tanenhaus is at his best in describing the sheer improbability of the achievement. Republicans had enjoyed substantial success in electing Eisenhower, but Eisenhower was not a man of the right, because he had made his peace with the New Deal. Buckley recognized that this kind of Republicanism would merely prove an interregnum between eras of increasing liberalism. The right needed to argue for a fundamentally distinct set of principles, not merely slow the implementation of the consensus liberalism of the time.

Just as his father had the confidence to drill where there was no assurance of striking oil, Buckley was willing to set up National Review as a conservative magazine of opinion where there was a likelihood of failure. He assembled a group of writers that encompassed the entire spectrum of conservative opinion, from the traditionalist Russell Kirk to the fusionist Frank Meyer to the ex-Marxist, anticommunist dialectician James Burnham. Theirs was a cacophony of voices, but here, too, Buckley was a good listener, allowing each to make his case in the magazine while fashioning a politically astute synthesis of his own. Buckley’s empathetic nature enabled the kind of open tent that modern conservatism required if it was to corral different factions into an effective movement.

But Buckley also recognized that he needed to police the boundaries of conservatism, keeping out the crazies and extremists. Tanenhaus shows how he outmaneuvered Robert Welch and the John Birch Society, exiling them from the respectable right. Much later, he would do the same to the strand of the right that threatened to be antisemitic, represented by the mercurial Joseph Sobran.

He was also willing to put his money where his mouth was. Journals of opinion are notorious financial sinkholes, and thus Buckley needed to support National Review first with family resources and then with speaking-engagement fees. But no political magazine has ever earned such a good return on investment.

As Buckley made conservatism a vibrant intellectual force, it attracted a new generation who had tired of consensus liberalism. While much of the literature on the 1960s has focused on the SDS and other rebels of the left, Buckley midwifed the Young Americans for Freedom, who set forth their charter at his family’s estate. His brother-in-law and fellow NR contributor, Brent Bozell, ghostwrote Barry Goldwater’s Conscience of a Conservative, the book that propelled Goldwater to the 1964 Republican presidential nomination. Goldwater lost the election in a landslide but made the Republicans, for the first time in a generation, an indisputably conservative party. Tanenhaus rightly credits Buckley as the single most important architect of conservatism’s twentieth-century revival.

Even a great biography has its flaws. Tanenhaus is a liberal, and sometimes out of his depth or out of sorts in addressing conservatism.

In the face of disagreement with other conservatives, including many at NR, Buckley backed Nixon in 1968. This move was a matter of calculation: he did not believe the newly elected Governor Ronald Reagan was ready to be a successful national candidate. Buckley’s pragmatic dictum of supporting the most electable conservative candidate (surprisingly not quoted by Tanenhaus) was decisive here. When Nixon was elected, Buckley enjoyed substantial influence: Nixon needed him to protect his right flank. Tanenhaus shrewdly contrasts this ready access with the more standoffish treatment Buckley received in the Reagan administration. Reagan did not need the magazine’s protection and may have been annoyed by National Review’s announcement upon his election that “we now have a country to run.” Reagan was a far shrewder and more calculating politician than most observers ever realized.

For all his successes, Buckley had his limitations. Tanenhaus is correct that he was not an original “thinker, still less a theorist.” But a complex society enjoys a division of intellectual as well as physical labor. Men like Friedrich Hayek were deep thinkers, but they were not great controversialists. Milton Friedman was both, but he did not have the encompassing vision and charisma to renew and hold together a political movement.

Buckley was also late in embracing equal rights for African Americans. This flaw, too, stemmed from his upbringing. His parents were Southern paternalists when it came to race. They treated their black servants so well that descendants of those servants had tears in their eyes when they described the family’s kindnesses. But they also secretly funded a newspaper in their second residence of Camden, South Carolina, that championed white resistance to desegregation. Though Buckley was one of the great debaters of his time, his blind spot on race caused him to lose his most famous encounter, his debate with James Baldwin at the Cambridge Union. He came across as wholly without compassion for the degradations that African Americans had suffered as a group. But in time, he changed his views on the color line, welcoming and jousting with black radicals like Eldridge Cleaver and Jesse Jackson on his talk show, Firing Line, just as he did with other leftists.

More troubling is the evidence that Buckley’s risk-taking sometimes morphed into recklessness. He loved sailing but took unnecessary risks, resulting in two major insurance losses and a costly lawsuit after a man was lost overboard. He was sanctioned by the SEC for his manipulation of the radio company of which he was the dominant shareholder.

The book is full of revelations. While the popular image is that Buckley was rich because of his family’s money, that fortune rapidly dwindled. His wealth in fact had two sources. His wife inherited about $30 million, measured in today’s dollars. (Tanenhaus should have translated such past sums into present value; because of inflation, the numbers he quotes are misleadingly small.) Buckley himself earned enormous sums—over $5 million a year from Firing Line, plus ample royalties from his books. He was the only author of his time whose fiction and non-fiction were regular bestsellers.

Tanenhaus also shows that Buckley knew about the origins of the Watergate break-in even earlier than Woodward and Bernstein did. His friend from CIA days, Howard Hunt, had organized the burglary and, distraught after the death of his wife in a plane crash, confessed much of it to Buckley. Tanenhaus is extremely critical of Buckley for his silence, deeming it a violation of professional ethics, perhaps even of the law, to withhold this information. But Buckley was concerned about Hunt’s troubled children; his Catholic faith made his role as their godfather paramount.

Even a great biography, like a great man, has its flaws. Tanenhaus is a liberal, sometimes out of his depth or out of sorts in addressing conservatism. For instance, he contends that Buckley and National Review argued against civil rights by invoking John C. Calhoun, whom he characterizes as simply trying to protect the rights of minority slaveowners. But this analysis flattens the theories of Calhoun. To be sure, he was a defender of slavery, but his defense of minorities more generally, as John Stuart Mill recognized, made him one of America’s most distinguished political theorists. Tanenhaus faults Buckley for failing to have assimilated great works of political theory, but he himself shows no evidence of having read Calhoun’s Disquisition on Government. He also trots out the tired cliché of conservatism creating a New Gilded Age without analysis of the prosperity it helped bring to middle America and, through free trade, to many of the world’s wretchedly poor.

The book also feels rushed at the end. After the Reagan administration, we hear little of Buckley’s political analysis or ideas, despite his continuing to write a widely read column until his death in 2008 and appearing on Firing Line until 1999. Tanenhaus skates especially lightly over the succession crises at National Review—the magazine that Buckley recognized was his great legacy to the nation and conservatives. Like many company founders, he anointed heirs and then found them wanting. While Buckley got along famously with conservative titans like Burnham and Meyer in the early days, he could not find a replica of himself.

And Tanenhaus almost completely ignores Buckley’s personal life. Chris Buckley’s Losing Mum and Pup provides evidence of complex, contentious, yet loving and loyal relations between husband and wife and father and son. Perhaps Tanenhaus sees Buckley the husband and father as irrelevant to Buckley the conservative revolutionary, but public and private selves are rarely so neatly severed.

Yet this biography unfurls a remarkable canvas. It is, at once, a vivid portrait of a singular man, skilled in all ways of contending, and of the broad sail by which he caught the prevailing winds to steer his nation on a new course.

The post Steering Right appeared first on Law & Liberty.

Reimagining College https://lawliberty.org/book-review/reimagining-college/ Tue, 10 Jun 2025 10:00:00 +0000

These are trying times for higher education, and for more than temporary partisan reasons. Future demographic trends will exacerbate declining enrollment numbers. Facing budgetary shortfalls, colleges must cancel burdensome academic and athletic programs. Growing numbers of institutions will fail altogether, inflicting economic tragedy on the local communities that rely on them. There are no painless choices here; there will be losers in this readjustment process. But two new books present impending crises, new technology, and shifting consumer demand as opportunities for innovative reform.

Their provocative titles notwithstanding, Richard K. Vedder’s Let Colleges Fail: The Power of Creative Destruction in Higher Education and Kathleen deLaski’s Who Needs College Anymore? Imagining a Future Where Degrees Won’t Matter both articulate largely optimistic visions for higher education’s long-overdue course correction. Neither work will appeal deeply to readers who revere America’s traditional liberal arts undergraduate curriculum. Yet thoughtful and generous reading of both suggests creative and perhaps necessary means to harmonize new realities with an ancient heritage.

Russell Kirk once professed certainty that “if all schools, colleges, and universities were abolished tomorrow,” the young would nevertheless “find lucrative employment, and means would exist, or be developed, of training them for … work.” According to Kirk, the college exists not for vocational training per se, but for “liberal education,” which “defends order against disorder” by “cultivation of the person’s own intellect and imagination.” If pursued in this spirit, such education conduces to “order in the republic.” But much as lab-produced substitutes are an ineffectual mockery of real food, liberal education’s ancillary blessings cannot be reductively pursued as ends in themselves.

Even two generations ago, few Americans concurred with Kirk’s noble ideal. Today, six decades after Clark Kerr coined the term “modern multiversity” to describe their often conflicting array of interests, research universities are less coherent than ever. Kerr famously quipped at a meeting of Cal Berkeley’s faculty in 1957 that the major administrative challenges to the university were to “provide parking for the faculty, sex for the students, and athletics for the alumni.” For the typical large “R1,” this may be as complete a mission statement as possible. It is certainly the most honest. Even at many putative liberal arts colleges, the increasingly vocational focus of undergraduate education contributes to this confusion as both cause and effect. The three most common reasons given for college attendance in a recent New America survey—all with response rates above ninety percent—were “improve employment opportunities,” “make more money,” and “get a good job.” This is the consensus understanding of higher education’s purpose. But it may have been so for longer than we imagine.

Clayton Sedgwick Cooper wrote Why Go To College? in 1912, spawning a subgenre now large enough to fill a library by itself. Cooper recounts the scene, at an Ivy League graduation, of a couple whose “homely” clothes, “deeply lined” faces, and “hard, calloused hands” identified them as farmers, watching with pride as their son led the senior class. He imagined them “dedicating their lives to the task of giving [this] boy the advantages … they must have felt would separate him forever from their humble life.” Such scenes are a commonplace of American life. Attend any college’s commencement exercises; the families cheering loudest when their student’s name is read will be the spiritual descendants of those Yankee agrarians. Kirk’s noble admonition notwithstanding, America’s colleges have always served partly as entryways to its professional class. That is an essential function in a socially diverse, egalitarian republic. At any rate, people understand it as such—and in a democratic society, the people will have their way. Liberal education and vocational training must coexist somehow.

Consumer-driven schemes for vocational training may not be incompatible with a liberal education intended to cultivate moral imaginations.

Richard Vedder, emeritus professor of economic history at Ohio University, has long been among higher education’s foremost conservative critics. He is not, though, a wanton agent of chaos. Let Colleges Fail is less a celebratory paean to higher education’s imminent disintegration than a call for its renewal. Vedder does consider the possible value of “creative destruction” in higher education, noting that publicly subsidized colleges and universities “lack strong incentives to improve outcomes,” reduce overhead, or lower prices. “Though we may mourn the loss of [individual] schools,” he writes, “we should accept and even rejoice in more closing in the years ahead as resources shift away from” failing institutions toward “educationally stronger ones.” In this vein, the book retreads some familiar ground on higher education’s excesses, abuses, and inefficiencies, though often with characteristically insightful data analysis.

Vedder suggests many reforms, ranging in scale and consequence from minor and benign to the most sweepingly ambitious. “Reform efforts must … reduce market ignorance in higher education,” he writes. Other merchants, such as “big-box stores,” do not advertise one price, then charge each customer unique and undisclosed discounted prices—why should colleges be permitted to do so? Such commonsense proposals would meet little popular resistance, at least in principle. More controversial, perhaps, would be his scheme for voucher-style tuition aid, “converting subsidies given to schools to payments made to students directly.” Then there are Vedder’s most original suggestions, such as halving tuition costs by moving the academic year to three fifteen-week semesters, eliminating summer vacation, and condensing the bachelor’s degree into three years. In this plan, faculty base pay is increased, but large lecture sections are tripled in size, the number of tenured instructors is reduced, and faculty pay is moved to a sliding scale pegged to student enrollment in their courses. Pray for the poor dean who is tasked with presenting this plan at the next faculty meeting!

Vedder’s timeliest proposals are for the restructuring of research universities. Private companies that grow unwieldy, he writes, “are constantly spinning off operations that do not fit well with their core activities—shouldn’t universities do the same?” Do teaching hospitals, vocational schools, advanced research labs, and professional football teams still belong under the same institutional umbrella? Could inefficient, high-cost dormitories and dining halls be replaced with private boarding houses or similar free-market arrangements? Can independent laboratories not turn research grants into knowledge as well or better without campus bureaucracy? Wherever possible, Vedder urges institutions to shed distracting encumbrances to their core purpose of educating students.

Kathleen deLaski’s book, though, suggests that even this foundational mission may undergo revolutionary transformation in the near future. Who Needs College Anymore? explores the possibilities for education at the dawn of what she terms the “skills-first age,” in which the bachelor’s degree will no longer serve as the primary signifier of employability. Her book is a surprisingly engaging tour of the present state and likely direction of “the alternative credentials market.” Central to deLaski’s narrative is the “micro-credential,” a trendy catch-all term for industry-certified short-term training programs. Many, such as intensive “bootcamps” to learn software coding languages, offer direct, non-degree paths into remunerative careers. But such credentials and the traditional campus are not mutually exclusive models. Some institutions—the 250,000-student University of Texas system, for example—subcontract third-party providers to give students access to thousands of skills-based credential courses alongside their degree curriculum.

Two institutions that embedded “alternative credential” programs within their curriculum at their inception offer particularly illustrative models. Chartered in 1997, Western Governors University pioneered “competency-based” curricula, in which students “move through [self-guided] online course material” without any real-time instruction, then take assessments “to demonstrate mastery.” Chartered in 1912, Northeastern University in Boston was an even earlier pioneer. From the beginning, its undergraduate curriculum has required completion of a months-long off-campus work experience placement. Students are prepared for professional work environments with a mandatory general education course covering resume curation, interview etiquette, and the like. In both examples, once-uniquely innovative ideas are now commonplace in higher education. All major accreditors permit credit-bearing off-campus apprenticeships and competency-based curricula. In the same way, deLaski believes, “as less expensive alternate pathways become clearer and surer,” traditional bachelor’s degrees “will seem impractical for a new majority of learners.” But she asks, “Why does the degree have to be the only product colleges sell?” Campuses offer many advantages that could help savvy institutions adjust to a changing education landscape. deLaski suggests various means for higher-education institutions to offer non-degree credentials within, alongside, or as alternatives to their existing programs. In short, “the degree may be in trouble, but colleges can survive.”

In a democratic society, the people will have their way. Liberal education and vocational training must coexist somehow.

This would be cold comfort to Russell Kirk. But consumer-driven schemes for vocational training may not be incompatible with a liberal education intended, as Kirk wrote, to cultivate “the person’s intellect and [moral] imagination, for the person’s own sake.” A few institutions already offer suggestive examples for combining the two. Affectionately known as “Hard Work U,” Missouri’s College of the Ozarks’ work-study model enables every student to gain in-house job skills and graduate debt-free. Its robust general education curriculum includes required two-part course sequences in Christian Worldview, American history and civics, and Western Civilization. Another intriguing example is LeTourneau University in Longview, Texas—a private, religious, four-year vocational college. Typical major programs include various branches of engineering, computer science, nursing, and business. But since 2015, LeTourneau’s Honors College has offered an excellent slate of liberal arts courses. Roughly five percent of students complete the full nineteen-credit concentration, but a greater number take a few honors courses as electives.

These are rare and modest examples. But if technological change and consumer demand augur seismic change for higher education, they may hold out hope to those who cherish the old liberal education. Market forces, legislatures, or both may, as Vedder recommends, require large public universities to reorganize, shedding non-core functions and renewing their focus on undergraduate education. We should hope so. Selective liberal arts colleges may continue more or less unchanged. But what of non-elite smaller institutions lacking the mysterious appeal of “prestige” or the security of large endowments? Imagine a struggling four-year, private institution with low admissions standards, reliant on athletics and vocational majors to drive recruitment. In a world of readily accessible, rapidly adaptive short-term credential programs, why enroll in a four-year vocational degree whose curriculum is updated rarely and belatedly? Such programs may appear increasingly cumbersome and costly, chiefly benefiting their tenured faculty. Suppose this college abandoned the bachelor’s degree and replaced its numerous putatively pre-professional and vocational major programs with a single liberal arts associate’s degree. Imagine a three-year program, the first two years devoted primarily to a “great books” curriculum alongside some foundational vocational training and summer internship options. In year three, the focus shifts primarily to job-specific training gained through the latest micro-credential courses, perhaps taken online or through intensive “bootcamps” off-campus. Students would receive guidance from a corps of counselors with up-to-date training in the “alternate credentials market”—a much expanded role for the “drop-in” career centers presently an afterthought on many campuses.
In less time and at lower cost than the current bachelor’s degree, graduates of such a college might attain a “skills-based, job-ready” resume while also forming their minds and imaginations in a college-level liberal arts core curriculum.

This may be a fanciful hope. But if Kathleen deLaski is correct, new technology and probable consumer demands will permit such ambitious reimagination of college education very soon. How many administrators have the vision and courage to try such things? Richard Vedder suggests they may have no choice.

The Myth of Victimization https://lawliberty.org/book-review/the-myth-of-victimization/ Mon, 09 Jun 2025 10:00:00 +0000

History will harshly judge the United States’ prolonged vacillation over whether to honor the Fourteenth Amendment’s command of color-blindness by government actors in the wake of Brown v. Board of Education, with the Supreme Court earning much of the blame. Brown vindicated Justice Harlan’s lonely dissent in Plessy v. Ferguson (1896), which proclaimed that “our Constitution is color-blind.” Yet, it took the Justices 45 years, from Bakke in 1978 to SFFA v. Harvard in 2023, to reject the erroneous notion that racial discrimination in pursuit of “diversity” is acceptable under the equal protection clause of the Fourteenth Amendment and federal civil rights laws. Granting “preferences” to favored racial groups is invidious discrimination—and therefore unconstitutional.

The hand-wringing and indecision reflected in Bakke, Grutter, Fisher I, and Fisher II are an embarrassment to the High Court, which finally reached the right result in the Harvard case. The Supreme Court’s earlier endorsement of the dubious “disparate impact” theory (concocted out of whole cloth by the EEOC in a clear misreading of Title VII of the Civil Rights Act of 1964) in Griggs v. Duke Power Co. (1971) remains uncorrected. Statistical imbalances are not the same as intentional discrimination, and it is ludicrous to suggest otherwise. Title VII explicitly declined to impose racial quotas. Section 703(j) of Title VII specifically states that employers are not required to grant preferential treatment to any individual to correct statistical imbalances in the workforce, and Democrat Hubert Humphrey, the Senate floor leader for Title VII, famously promised to eat the pages from the statute if Title VII were shown to authorize preferential treatment for any group. Like many promises made in Washington, DC, this one was never carried out.

Wall Street Journal columnist Jason L. Riley’s excellent new book, The Affirmative Action Myth, is a thorough and balanced post-mortem of the Court’s bungled jurisprudence. The Affirmative Action Myth is really two books in one, as evidenced by the subtitle Why Blacks Don’t Need Racial Preferences to Succeed, because Riley argues that “affirmative action” (which he correctly points out is “synonymous with racial favoritism”) and other progressive innovations created in the name of “civil rights” have actually harmed blacks more than helped them. Since the 1970s, quota-driven affirmative action and welfare programs have upended decades of black upward mobility.

Thus, racial preferences are both improper and unnecessary. “The main purpose of this book,” Riley states, “is to explain how affirmative action has failed.” Riley deftly weaves together an accessible account of the Court’s tortuous decision-making and the failure of racial preferences to improve the status of affirmative action’s intended beneficiaries. He even digresses briefly into the Court’s dreadful busing decisions, using Lino Graglia’s aptly-titled book Disaster by Decree as a guide. As Riley proves, liberal largesse has made things worse for the black community. The Court’s affirmative action jurisprudence was infected by the same ideological flaw that has hijacked the civil rights movement in America. He cogently explains that the continued advancement of the black community in America will be hampered until that error is recognized and rejected.

Like many aspects of the Great Society’s social engineering programs, and their successors, affirmative action was well-intended. Beneficent motives, however, do not assure good results; as the saying goes, the road to hell is paved with good intentions. Affirmative action and other civil rights policies have been disastrous for blacks. The welfare state undermined the black middle class by encouraging fathers to abandon their families, incentivizing black women to have children out of wedlock, and resulting in the proliferation of single-parent households led by females. Family instability and, in particular, fatherless homes are strongly correlated with violent crime rates and other social pathologies.

As Riley demonstrates, all of this was a reversal of positive economic and educational trends experienced by black people between 1940 and 1960. Black Americans were making remarkable gains even under “peak Jim Crow.” Riley doesn’t claim that racism didn’t (or doesn’t) exist, only that the legacy of LBJ’s welfare programs and the advent of affirmative action has had detrimental effects on blacks that are often overlooked in favor of continued reliance on tired canards such as “systemic racism” and the presumed debility among blacks caused by the institution of slavery (which ended 160 years ago).

Riley does not deny the existence of racism in America, but he insists that it doesn’t explain the disparities within the black community.

As it is practiced today, “civil rights” is an industry in which many activists, scholars, bureaucrats, journalists, and organizations have a vested interest in perpetuating the myth of black victimization and helplessness. Riley argues (with extensive supporting footnotes) that “blacks have made faster progress when color blindness has been the policy objective.” Allowing equal treatment to be replaced by a regime of “oppression pedagogy” and identity politics, Riley suggests, is “one of our greatest tragedies.” Racial preferences “have been a hindrance rather than a boon for blacks,” he contends.

Riley makes a persuasive case. He reprises the work done by scholars such as Thomas Sowell, Walter Williams, Robert Woodson, Shelby Steele, John McWhorter, and Wilfred Reilly; as he notes, much of the research on this topic by center-right figures tends to be done by black academics, possibly due to white scholars’ well-founded fear of repercussions. (If you doubt this, recall the pariah treatment accorded Charles Murray, Amy Wax, Ilya Shapiro, and others who refused to genuflect to the prevailing orthodoxy.) Riley also draws upon the work of Stephan and Abigail Thernstrom, Richard Sander and Stuart Taylor Jr., and many others. Readers may be familiar with some of this work, but Riley usefully summarizes it and supplements it with census data, lesser-known academic studies, and historical and biographical profiles such as Hidden Figures, the book and movie about pioneering black mathematicians who helped NASA’s space program in the 1960s.

Riley devotes a compelling chapter to debunking the efficacy of affirmative action, but his critique is not limited to the harmful consequences of racial preferences in higher education—in the form of the “mismatch” phenomenon and otherwise. Riley contends, “One tragic legacy of the affirmative action era is that the number of black college graduates is almost certainly lower today than it would have been without racial preferences that mismatch students with schools for diversity purposes.” Riley nimbly tackles the whole array of liberal shibboleths on race: critical race theory (and its leading proponents), reparations, the false narrative of the 1619 Project, “mass incarceration,” DEI, redlining, and more.

What these topics have in common is that they share the premise—one that Riley debunks as a myth—that blacks are helpless victims of an oppressively racist system and cannot improve their status without special preferences and favored treatment. A more descriptive (but less catchy) title for the book would be “The Racial Victimization Myth,” because Riley explores the many facets of the false race narrative peddled by the Left.

The patronizing paternalism of the prevailing narrative harms blacks, Riley argues, because it instills a mentality of victimhood that fuels grievance and undermines effort and personal responsibility on the part of blacks. If the game is rigged due to “white supremacy,” and if “systemic racism” determines one’s fate, why bother to work hard, exercise self-restraint, adopt good habits (in the form of so-called “bourgeois values”), and so forth? This theme resonates powerfully throughout the book. History teaches that the path to upward mobility for minorities is assimilation into the mainstream culture, and the rigors and discipline of competition—the bedrock of meritocracy—foster a culture of striving instead of excuses, resentment, and despair. Treating black people as helpless victims discourages them from devoting themselves to achieving success through self-improvement.

One of Riley’s most effective rhetorical devices is the juxtaposition of attitudes about black self-reliance and personal responsibility from earlier eras and the contrived ideology of “anti-racism,” as exemplified by the writings of Ibram X. Kendi, Ta-Nehisi Coates, and Robin DiAngelo. Proponents of critical race theory (which Riley says “amounts to little more than a fancy justification for racial favoritism”) embrace what Riley calls “racial essentialism”: the notion that “anti-black bias in America is systemic … and must be eliminated root and branch before any significant narrowing of racial disparities can take place.” Not only is this premise contradicted by the well-documented progress blacks made in the first two-thirds of the twentieth century, even under Jim Crow, it is also contrary to the sentiments expressed by early civil rights leaders such as Booker T. Washington, W. E. B. Du Bois, Martin Luther King Jr., and even Malcolm X, all of whom emphasized the importance of self-reliance, hard work, and individual responsibility as indispensable to upward mobility. Riley notes that “Coates is waiting on white people to rescue black people. [Frederick] Douglass understood that black people must save themselves.”

Advocates of this concept, once dubbed the “politics of respectability,” called upon black Americans to adopt constructive manners, morals, and attitudes to achieve social and economic advancement, even in the face of discrimination. Other minority ethnic and racial groups, including Irish, Chinese, Japanese, and Jewish immigrants, overcame prejudice in America by adopting productive cultural habits—assimilation, in other words. Riley points out that this approach is anathema to modern-day civil rights activists, who scorn respectability politics as “ineffective and a waste of time. Studious black youngsters and other black people who adopt middle-class speech, dress, and behavior are accused of racial betrayal, or ‘acting white.’”

Critical race theory holds that “racism is mainly if not entirely to blame for black-white gaps in everything from income to incarceration to standardized test scores.” Riley strongly disagrees. Absolving blacks of any responsibility for improving their status in American society conveniently blames “white supremacy” for every aspect of dysfunctional black culture and encourages blacks to be dependent on favors bestowed by the welfare state. Riley states that affirmative action creates the “impression that black people are charity cases dependent on government programs.” Opponents of affirmative action and statist policies sometimes compare the retrogression of black advancement since the Great Society to a “return to the plantation.”

Like many aspects of the Great Society’s social engineering programs, and their successors, affirmative action was well-intended. Beneficent motives, however, do not assure good results.

Even mild suggestions for black self-improvement by sympathetic figures such as Barack Obama are rebuked by ideologues as “blaming the victim” and “talking down to black people.” Riley laments that “discouraging acculturation and assimilation in the name of racial solidarity is self-defeating.” The rejection of individual responsibility and the tendency to blame “whiteness” for all racial disparities have sabotaged the upward mobility and progress that blacks enjoyed before the civil rights era. Liberals understandably don’t want to acknowledge the body of data that Riley convincingly marshals. Hard work, thrift, sobriety, respect for authority, the nuclear family, recognizing the importance of education, and deferring gratification are not manifestations of white supremacy, but essential ingredients for success.

It is truly astounding to see how condescending leftist intellectuals deny black people any moral agency, and encourage them to wallow in grievance and victimhood. Riley observes that “black politicians and activists have a vested interest in a narrative that accentuates black suffering.” Victims require saviors, and those promising to deliver salvation are often rewarded with status, money, and influence.

The Supreme Court, left-wing scholars, and self-interested activists are not the only villains in The Affirmative Action Myth. Riley exposes the activist role of the Equal Employment Opportunity Commission in “turn[ing] Title VII on its head,” and points out that presidents from both political parties have muddied the waters by issuing executive orders mandating quotas (as Lyndon Johnson did with federal contractors in Executive Order 11246) or supporting the expansion of the Great Society welfare programs (as Richard Nixon did). There are few “heroes” in Riley’s account, although Justice Clarence Thomas—a longtime critic of affirmative action—comes close.

Riley does not deny the existence of racism in America, but he insists that it doesn’t explain the disparities within the black community (such as West Indian blacks versus African-American blacks) or the retrogression since the 1960s: “The elimination of white racism, however desirable then and now, is not a prerequisite for black socio-economic advancement.”

Riley’s sobering concluding chapter is chock-full of hard truths—and not for the faint of heart:

One reason antisocial behaviors [in black populations] became more common in the post-1960s era is because they became more tolerated and more lavishly subsidized by the government. … Low-income blacks began to adopt counterproductive attitudes and habits that previous generations had rejected and strived to eradicate. Even more tragically, academics began to intellectualize this degeneracy instead of calling it out for what it is.

Providing grisly details, Riley condemns the hip-hop culture and gangsta rap: “Too many young people have come to equate self-destructive behavior with black authenticity.” Put in economic terms, “government programs are no substitute for the development of human capital.”

Sadly, despite the proven failure of the victimization narrative, and its baleful consequences so readily apparent in our inner cities, the proponents of this model “have perhaps never been more celebrated in the academy and the media than they are today.” Affirmative action has never enjoyed popular support, and has now been declared unconstitutional and illegal. That it still holds sway in the influential spheres of academia and the media, Riley laments, “ought to be of deep concern to anyone who cares about the future of the black underclass.” Indeed.

The Affirmative Action Myth is a timely and well-written book that contains an abundance of common sense, solid arguments, and carefully researched historical data. One can only hope that it is widely read and provokes a long-overdue change in direction in the area of civil rights and race relations. Sixty years of failed policies are enough.

Plato in Syracuse https://lawliberty.org/book-review/plato-in-syracuse/ Thu, 05 Jun 2025 10:00:00 +0000
I have heard it said that attempts have been made at staging productions of Plato’s dialogues, but that all have invariably flopped. I can’t vouch for the truth of that claim, but I can believe it. Intellectually, of course, Plato is gripping. And yet, as great a lover of Plato as I am, I fear that even I would fall asleep watching a dramatic reenactment of the Republic. Not enough “action,” in the conventional sense.

But did you know that there’s a ripping good yarn contained in the texts known as Plato’s “letters”? Therein, and especially in the longest and most famous Seventh Letter, Plato writes autobiographically of his political entanglements with Dionysius the Younger, the dissolute young tyrant of Syracuse. By accepting the invitation of his longtime friend and pupil, Dion of Syracuse, to realize the ideal of philosophic rule made famous in the Republic, Plato, by his own account, inadvertently precipitated the unravelling of the Syracusan regime, which finally plunged the city into a brutal civil war. In the pages of Plato’s letters, we find Plato the teacher, the counselor, the ally, the statesman; intrigue and faction in the court of a tyrant; grand political hopes dashed as famous utopian dreams become living nightmares—it is a stunningly dramatic and dynamic portrait of Plato and his philosophy.

And yet the experience of picking up and trying to read Plato’s letters tends to disorient the modern reader. For one thing, the political dynamics and major players of fourth-century BC Sicily are not very well remembered, which can make Plato’s narrative hard to follow. And to make matters worse, the authenticity and provenance of these letters have been the subject of fierce debate for centuries, which raises questions regarding their philosophic, historiographical, and biographical value. Thus, even though the skeleton of the story they contain is probably true, Plato’s letters suffer from an obscurity out of all proportion with their inherent interest.

The task of excavating the story of Plato's political misadventures in Syracuse, of bringing it to life for a general audience of modern readers, is extremely complex. It requires extensive knowledge of classical Greek history, politics, culture, and philosophy, the ability to judge and to synthesize the accounts of varied and often conflicting historical sources, and, rarest of all, the discernment and literary talent to provide an education in the relevant background without sacrificing liveliness of storytelling. It is a great credit to James Romm that he was up to that task. In Plato and the Tyrant, Romm weaves together the threads of history, philology, philosophy, and archaeology with a deftness and erudition only a true classicist could possess. Historical details are illustrated by images of ancient Syracusan coins, vase paintings, and Sicilian ruins; the significance of Plato's word choices is explicated by etymological lessons on the Greek roots of English words; complex scholarly arguments over fine points of historiography are distilled and clarified with admirable brevity; and all the while the reader is borne along on a journey of great historical and human interest. One senses from reading Romm's work how lucky his students must count themselves for having found their way into his classroom.

In the long history of the study of Plato, there has never been widespread agreement about what Plato thought or how he meant to communicate it. It is inevitable that, as a scholar of Plato, I should have my disagreements with Professor Romm's approach to the Platonic letters and what he does with them—especially since my most recent work culminated in a monograph on the subject, containing my own new English translation of the Letters and an original interpretive essay. Romm will have no trouble finding supporters on those points where I will pick nits or bones; for the most part, he sides with the majority of scholars against my dissenting views. In particular, by assessing five of the thirteen letters in the collection as authentic works of Plato, dismissing one as spurious, and mostly leaving the others aside, he effectively rejects my heterodox contention that the Letters is actually a unified, semi-fictional work of Platonic philosophy, a one-sided epistolary novel in thirteen artfully crafted and purposefully arranged parts. In a way, Romm recognizes the possibility of what I propose: he sees Letters Three, Four, Seven, and Eight as "open letters," of which the salutation to one or more addressees is more literary device than real indication of Plato's intended readers. But if that's true, why couldn't all the letters—indeed, the "Letters" as a whole—have been intended for broader dissemination? Had Romm given more weight, for instance, to the significance of Letter Two in the context of the whole Letters, he would have seen the importance of the claim there that "it is not possible for things written not to be exposed": Plato writes nothing without anticipating that it will be published. In fact, the question of why Plato wrote as he did is a pervasive theme of the Letters—fitting for a text that, by its very form, emphasizes Plato's role as author far more than anything else he wrote.

Romm reduces Plato’s philosophic insight to his biographical circumstances. I believe that we do this to Plato at our own intellectual and cultural peril.

A first point I would make, then, is that we must be careful how we use the Letters. Romm mines the text for historical details of the story he wishes to tell, juxtaposing and triangulating with the versions given by later historians, biographers, and gossipmongers, to ascertain what “really happened” in Syracuse. But I would caution that even those Greek writers whom we now describe as “historians” were philosophers and teachers more intent on providing a profound education than on making a meticulous and comprehensive record of events. As for Plato’s activity in Syracuse, our primary source of information is Plato’s account in the Letters; details found in biographies written centuries later may, for all we know, have started as speculative rumors meant to fill gaps in Plato’s narrative. Just as the Peloponnesian War is not nearly so important an object of study in its own right as it is because a profound thinker, Thucydides, made it the canvas for his pedagogical masterpiece, the story of Plato in Syracuse, I would submit, should be of interest to us above all because Plato is its narrator.

But if Romm can interest a general audience in classics by bringing this Platonic drama to life—which I believe his book has done and will continue to do—why rain on his parade by insisting that he has not adequately distinguished the forms and original purposes of his various classical sources? I would be less concerned about this if Romm had not taken on a weightier responsibility for his book than the recovery of a good story for the entertainment and edification of his readers. In his introduction, Romm recounts how the "spell" Plato had cast upon him as an undergraduate gradually lifted, how he had come to question Plato's political philosophic wisdom, and how "the questions that first troubled [him] when the spell of Plato was broken" came to "trouble [him] even more when [he] came to the letters." He articulates the possibility that Plato, the great moralist, wound up "collaborating with evil" in Syracuse, and that the Republic is meant to "obscure" his hypocrisy. He reports the famous judgment of Karl Popper that Plato's Republic belongs to "the perennial attack on freedom and reason." And he concludes his introduction by saying that his book will show us how "the wise can become more tyrannical by the company of tyrants."

Romm thus suggests that Plato’s Syracusan story helps us to see a grave problem with Platonic philosophy, that we should not so much seek wisdom and understanding from the education Plato’s works provide as we should seek to learn an object lesson from them. We must not dodge this question. To put the matter bluntly, Plato’s political philosophy long predates, and so stands outside of, the liberal tradition upon which our civilization rests. If we have something of value to learn from Plato in our political moment, it will be no defense of liberalism per se—it will, in fact, appeal to pre-liberal and perhaps illiberal moral and political principles. This is not to say that Plato will counsel a break from liberalism. I, for one, see Platonic political philosophy pointing us to the preservation of liberal democracy through a reinvigoration of its noblest principles, guided by the Platonic virtue of moderation. Such a case can indeed be made on the basis of the Letters, where Plato repeatedly makes it clear—in letters of which Romm makes use as well as in some he doesn’t—that he never condoned the wholesale substitution of one form of regime for another, especially by violent revolution. In this, Romm sees a reflection of Plato’s Sparta-philia, a longing for permanence and stability in a Greek world plagued by war, stasis, and moral decay. He thus reduces Plato’s philosophic insight to his biographical circumstances. I believe that we do this to Plato at our own intellectual and cultural peril.

I don’t have the space here to remedy the excessive vagueness of the foregoing claims and critiques regarding Plato’s contemporary relevance. I will instead limit myself to describing one substantive disagreement I have with Professor Romm’s reading of Plato’s Letters, indicating how our evaluation of Plato might hang on such a point.

Plato did not really believe in the possibility of philosophic rule as presented in the Republic, which means that Plato and Dion were on drastically different pages.

As messy and multipolar as the struggle for Syracuse became (elegantly clarified by Romm), Plato presents it as a battle between two factions: the tyrannical party of Dionysius and the anti-tyrannical party of Plato’s devoted acolyte, Dion. Throughout the Letters, Plato presents himself as having fundamentally, if moderately and contingently, supported Dion’s side in that conflict. Given the terms of Dion’s initial invitation to Plato, Romm naturally assumes that Plato hoped Dion might rule Syracuse as a philosopher king. This involves what I see as a critical error. Even though Dion himself is keen on the idea of bringing philosophic rule to Syracuse, Plato only ever represents him as having wished for Dionysius to be educated in philosophy for that purpose. It is surprising but undeniable upon reflection that Dion is never, in any of the Platonic letters, spoken of as having taken an interest in philosophic study himself. (Letter Ten shows how willing Plato was to endorse a view of philosophy among Dion’s circle that rested on a misapprehension of what the activity of philosophy is really like. Likewise, a careful study of Letter Eight shows that Plato and Dion do not have the same counsel for the Syracusans: Dion’s overly hopeful proposals must be compared with Plato’s much more practical, down-to-earth advice in Letter Seven.)

The Letters is about how Plato sought to present philosophy to the world, with the Republic as the centerpiece of his presentation. But even the Republic allows us to see that this presentation is steeped in paradox. Romm puzzles over the question of why Plato thinks the philosopher would return to politics, to “the cave,” after he has beheld the true world of the eternal “forms.” He fails to take note of the fact that, according to Plato’s Socrates, the philosophers have neither any obligation nor any desire to do so except in the ideal city he and his interlocutors have built in speech (cf. Letter Six). Only in that ideal city will the philosophers be compelled to administer the city as a just repayment for the regime’s cultivation of their philosophic natures. All this lends itself to the view that Plato did not really believe in the possibility of philosophic rule as presented in the Republic, which means that Plato and Dion were on drastically different pages. The distance that has grown between Plato and Dion by the time the latter is waging a war for Syracuse, which is made evident in Letter Four, is the fruit of Dion’s failure to grasp the meaning of Platonic philosophy.

In my book on the Letters, I share my own speculation as to why, if not to create philosophic rule, Plato went to Syracuse at all. In brief, I believe Plato’s literary project of defending philosophy in the wake of Socrates’s death achieved a kind of political success beyond even his own expectations, and that Dion’s zeal for philosophic rule put Plato in a bind. His decision to accept Dion’s invitation, as he makes clear in Letter Seven, had more to do with what the fallout would be for the reputation of Platonic philosophy if he should turn his back on this enthusiastic follower and benefactor than with anything he hoped to achieve politically in Syracuse. To be sure, that interpretation is itself up for debate. What is important for my purpose here is simply to emphasize that, unless we read the Letters closely as a work of Platonic political philosophy, we are prone to misunderstanding Plato’s intention and misjudging his character.

It must be said that Romm never really settles the question of whether the study of the Platonic letters issues in acquittal or condemnation of Plato. He comes back to it now and again—especially in his ninth chapter—indicating where a certain construal of the evidence might point to a harsher assessment of Plato’s thought and action than has been typical. But he tends to leave things at the level of suggestive “maybes” more than he draws conclusions or lays out a case of his own. If anything, the book ends with the suggestion that Plato stood above all for a government in which good rulers are constrained by good laws—a view with which I agree, and which I am glad Professor Romm has expressed. Yet I remain uncertain whether he has handled the matter in the most conscientious and responsible way. In his attempt to present the most entertaining and intriguing version of this tale, Romm often seems to seek out and present the theories that will most excite his audience. The possibility that Plato and Dion were lovers, which adds little to the substance of the story but to which Romm devotes considerable space, is probably the best example, but many other rumors and legends from sources of greatly varying reliability are sprinkled freely throughout the book. When Romm includes a vignette of lust, jealousy, and murder preserved by “Parthenius, a collector of salacious stories,” admitting that it is “impossible to confirm but too good not to tell,” he seems to describe what led him to include a great many of the episodes recounted in Plato and the Tyrant. Is it for this same reason that he has chosen to highlight the ways in which the story of Plato in Syracuse might be seen as evidence of Plato’s moral weakness or worse? Much as I am reluctant to make that accusation, I do wish Professor Romm had given more sustained and serious attention to the very serious question he chose to raise.

Updated, June 6th, 2025: The original version of this review, posted on June 5th, 2025, claimed that Dr. Romm accepts “five of the thirteen letters in the collection as authentic works of Plato and dismiss[es] the remainder as spurious.” That was erroneous. Only Letter One is dismissed by Romm as an “obvious fake”; otherwise, where he touches on the letters aside from the five he explicitly accepts as Platonic, Romm generally expresses openness to their possible authenticity.

Reining in the Spies https://lawliberty.org/book-review/reining-in-the-spies/ Wed, 04 Jun 2025 09:59:00 +0000
The debate about the proper function of intelligence in the US is as old as the nation. Several founding fathers, George Washington foremost among them, recognized the need for espionage to be kept secret from the Continental Congress—with all the attendant risks of abuse—to help win the war for independence. The debate then centered on whether the new country could free itself from sullying Old World intrigues and who would, or even should, oversee a secret apparatus for the new republic.

Today, intelligence is a permanent fixture in the US government. Now the debate is about the appropriate scope and reach of national security intelligence balanced against the protection of American civil liberties. This is the "constant crisis" in Jeffrey P. Rogg's sweeping new book, The Spy and the State: The History of American Intelligence.

The book is a work of even-handed historical writing by an author with deep roots in national security studies (Rogg has taught as a member of the faculty of the US Naval War College, the Citadel, and the Joint Special Operations University). The book is also a balanced, thoughtful, and well-grounded discussion of the tumultuous growth of the national security intelligence bureaucracy, the professionalization of US intelligence, and the evolution of intelligence oversight.

The Spy and the State is a significant accomplishment of genuine scholarship. The author’s deep understanding of the US Intelligence Community (USIC) is evident in his excellent use of a wealth of primary sources, including published and archival materials ranging from government documents and period newspapers to relevant case law and the unclassified records of individual US intelligence agencies. Rogg also makes good use of secondary sources to provide insight and assessments from authors with special expertise, including the history of wartime US intelligence and of specific agencies. While The Spy and the State sometimes reads like a textbook, with some sluggish writing, Rogg is a disciplined researcher keen on offering detail. The book is well documented with more than 80 pages of notes and an outstanding bibliography. This book, then, will be welcomed by both scholars and students seeking to enhance and enlarge their understanding of the USIC.

Civil-Intelligence Relations

The Spy and the State is a history of the USIC seen "through the lens of civil-intelligence relations and the major themes of control, competition, coordination, professionalization, and politicization." For this work, Rogg adapted the groundbreaking model of civil-military relations advanced by Samuel P. Huntington in his book The Soldier and the State (1957). It's a worthwhile model for Rogg to have acknowledged and adopted. Mirroring Huntington's work, Rogg shows how the development of intelligence as a profession in the twentieth century, and attendant civil oversight, can regulate the role of intelligence in the national security state.

This work explores the USIC’s history by examining US intelligence in each of four wartime eras: the Revolutionary War to the Civil War; the Civil War to the end of World War II; the Cold War; and the present, post-Cold War era. This approach is more than a nod to the march of time. It acknowledges the dominant role military intelligence played in creating the USIC. Today, an estimated 80 percent of the nation’s classified intelligence spending is earmarked for military intelligence activities. Moreover, “each successive war,” Rogg explains, “saw the country engage in intelligence activities on an even greater scale, and each postwar period revealed the challenges that retrenchment posed.” With the era-by-era approach, the author illustrates how the changing nature of the US role in the world led to the establishment of the nation’s permanent intelligence community.

Bureaucracy and Rivalry

Rogg describes how the USIC grew by fits and starts, hamstrung as much by a failure to establish a profession of intelligence as by rivalries across government bureaus assigned various intelligence functions. For example, the author recounts episodes in the bureaucratic wrangle between the departments of State, Justice, and Treasury for control of various aspects of intelligence. For a time, Secret Service agents were “loaned” to other executive departments to pursue domestic law enforcement and counterespionage investigations, while still reporting to their managers at Treasury. That unsatisfactory arrangement spurred the Justice Department to create its own secret service, the Bureau of Investigation (BOI, later FBI).

The tangle of competing interests, Rogg observes in a telling insight, was made even more contentious because executive departments unilaterally formed their own intelligence services. Congress had no say in the creation, organization, and mission of the Secret Service and the BOI, much less a say in the War Department's Military Information Section (eventually the Military Intelligence Division of the Army General Staff in WWI), or the Navy Department's Office of Naval Intelligence. Ultimately, only two of the current eighteen US intelligence agencies—the CIA and the Office of the Director of National Intelligence—would be chartered by Congress.

While Americans have often been able to reset civil-intelligence relations after a threat has passed or egregious abuses have been checked, Rogg is far less sanguine about future relations.

Rogg contends that before the onset of the Cold War era, every intelligence service in government was “straddling a fault-line in American civil-intelligence relations,” a blurry area between acceptable foreign collection and detested domestic surveillance. Various agencies, and their respective executive departments, all attempted to collect foreign intelligence, conduct domestic law enforcement investigations, surveil American citizens, and launch counter-espionage operations in the US. This, Rogg explains, was an outgrowth not only of the lack of coordination between executive departments, but of “mission creep.” He gives the example that when Secret Service agents uncovered a threat to President Cleveland, the Service simply expanded its role beyond investigations of counterfeiting and financial crimes to include protection of the president. Rogg argues that unbridled expansion and duplication were also the result of the failure of Congress to exercise any effective oversight of the growing intelligence community as the nation entered the twentieth century.

Permanence and Oversight

The Spy and the State offers readers an illuminating record of the spotty, ineffectual, and often politicized nature of oversight of the intelligence community. Rogg makes the case that the USIC in its first historical era remained “discretionary, disorganized, uncoordinated and unprofessional.” The author also describes how the intelligence community expanded in times of war and contracted in times of peace. He then neatly traces the robust growth of the nation’s intelligence capabilities in World War II and shows how that growth and the onset of the Cold War marked the end of another historical era.

At this pivotal point in the history of the USIC, Rogg ascribes an outsized influence to William "Wild Bill" Donovan, the wartime head of the Office of Strategic Services (OSS). The author contends Donovan "permanently transformed the American intelligence system," and "set the conditions for an independent intelligence organization and, at long last, [a] profession." It is more likely that while the influential and well-connected Donovan was then in the right place at the right time, the exigencies of the Cold War, the catastrophic intelligence failure at Pearl Harbor, and growing Congressional discomfort with the power of the executive branch spurred the legislation that created the CIA in 1947. Rogg points out that the legislation gave the CIA two specific statutory missions: to coordinate the activities of the USIC and to furnish intelligence analysis to inform policymaking.

The fledgling CIA, however, attracted OSS veterans to its ranks who were intent on “seizing covert action” as part of its mission set. In so doing, the agency “absorbed an organization and culture that undermined its original statutory missions.” Rogg charts the uneven course of the CIA’s early covert actions. He acknowledges that policymakers steered the agency towards misguided forays and outright interference, for example, with the internal affairs of Burma, Guatemala, and Iran. By hewing to historical records, the author easily dispels any lingering notion that these were activities of rogue elements of the CIA; covert action was an integral part of Cold War strategy.

The Spy and the State recounts the covert missions of the 1950s and the agency's soiled record in the 1960s and 1970s. The CIA's mind-control experiments, surveillance of journalists and students, assassination plots, and other domestic intelligence operations did not escape public exposure. Media accounts spurred Congressional inquiry, and the Church and Pike Committee hearings led the way in establishing permanent legislative oversight. In the most telling part of his book, Rogg offers a clear-eyed account of how abuses and blatantly illegal actions by the USIC eroded public trust in government and fostered suspicion of the power of the administrative state.

Despite the growing professionalization of the intelligence community, and more vigorous oversight, the author shows that some of the most egregious abuses of the reach and power of the USIC occurred in the post-Cold War era. Rogg argues that "during the Global War on Terror, the government unleashed its powerful intelligence apparatus, undermining civil liberties and eroding constitutional rights in the process." Enabled by the PATRIOT and Intelligence Reform and Terrorism Prevention Acts, new guidance issued by then Attorney General Michael Mukasey, for example, blurred the line between law enforcement and domestic intelligence. As a result, the FBI was able to gain access to the NSA's powerful surveillance tools. The agency's PRISM program collected information from private companies and automatically sucked up data from Microsoft, Google, Facebook, Skype, YouTube, Apple, and others. The Bureau then expanded its use of National Security Letters (NSLs)—administrative rather than judicial subpoenas—to collect information from tens of thousands of individuals each year. Because the NSLs also contain non-disclosure provisions, the FBI now had "the power both to investigate and to silence."

The Spy and the State is as much a historical account as it is a work of keen contemporary observation and incisive commentary. Informed by the judgments of history, the author in his conclusions argues that the combination of the national security state, its attendant administrative state, omnipresent surveillance technology, Big Data and AI, and a massive intelligence apparatus looms as an authoritarian threat in American civil-intelligence affairs. While Americans have often been able to reset civil-intelligence relations after a threat has passed or egregious abuses have been checked, Rogg is far less sanguine about future relations.

“The American people,” Rogg warns readers, “must assert their role in the US intelligence system more directly in the future than they have in the past—their liberty and security depend on it.”

The Righteous Bureaucrat? https://lawliberty.org/book-review/the-righteous-bureaucrat/ Thu, 29 May 2025 10:00:00 +0000
Among its many innovations, The New Yorker turned the humble biographical sketch into “The Reporter at Large,” arresting prose that revealed what little-known people do and why their work matters. My own experience with the form began in high school English, where I profiled a bond-trading family friend under the headline “Buy Low, Sell High.” In Who Is Government?, Michael Lewis assembles a roster of celebrated authors to write profiles shining a light on obscure federal employees—advancing a simple thesis that these unsung officials are the backbone of America.

At a time when President Donald Trump is determined to shrink the federal workforce, investigating how these officials work and the worth of what they do is a potentially valuable project. Unfortunately, while all but one of the profiles are informative and elegantly etched, the book contributes little to a fair assessment of the government employees’ overall performance. First, the sample is tiny: seven portraits, plus a cameo for government statistics in general. Such a small, hand-chosen selection cannot fully illuminate a mammoth bureaucracy that numbers in the millions. The administrative state has become so vast that it is almost impossible to evaluate through personal stories, however compelling.

Second, the writers have chosen a very unrepresentative group. For instance, the substantial focus is on those who gather and disseminate information. One of these essays covers the information the government produces in general, like employment and inflation statistics. In another, we meet Pamela Wright, an employee of the National Archives who digitizes census information so it can be more easily used in the hinterlands, including her native Montana. A third profiles Heather Stone, who runs a clearinghouse for the FDA that catalogues off-label uses of drugs that may cure rare diseases.

Information production is a classic public good. It is undersupplied by the market because, once published, information is hard to restrict and thus to profit from. Yet the dissemination of such matters as price and wage statistics or information about drugs that help cure obscure diseases has obvious positive externalities. And not only is such information valuable, it does not impose substantial costs on anyone, unlike government regulations, but merely adds pennies to our tax bills. What's not to like?

But even with seemingly unobjectionable government functions, the writers skate over difficult issues posed by the work their subjects do. For instance, the chapter about government statistics does not even mention the most substantial way that the government systematically misrepresents the state of our economy. Many economists think that the government understates growth, because it does not capture the accelerating technological improvements in goods; a cellular phone today may cost as much as one ten years ago, but its capabilities are far greater. And recently, economists have calculated that certain free goods, such as Facebook and other social media platforms, have also boosted real economic growth substantially, although they do not even show up in the statistics. This distortion matters: it creates an illusion of stagnation and feeds politicians’ claims that America is in relative decline from the boom post-WWII years. Maybe it is not an accident that government workers tend to statistically slight the accomplishments of the private sector, of which they are not a part.

Many of the remaining jobs portrayed are somewhat eccentric. One chapter focuses on a team building a coronagraph, an instrument used in conjunction with a space-based telescope, such as the ones named for NASA executives James Webb or Nancy Grace Roman. The device blocks out the brighter light from nearby stars to reveal more distant and fainter stars and structures. This project, too, might be described as gathering information to advance basic science—the positive externalities of discovering new galaxies or exoplanets are less clear than curing disease or gauging the employment rate, but increasing such knowledge can be defended as a form of civic flourishing.


But here, too, the profile's author, Dave Eggers, glides past the less flattering facts surrounding his chosen agency. NASA has been shrinking, because presidents of both parties have concluded that private enterprise does a better job of supplying the infrastructure for outer space exploration than the state. As Law & Liberty contributing editor G. Patrick Lynch has written before, SpaceX, a for-profit company created by the world's richest man, Elon Musk, has revolutionized rocketry and has taken over the crewed trips to the International Space Station. Eggers never mentions that it will be this private company that will launch the Roman Telescope into the heavens. Worse still, Eggers insists, "This work is paid for by you; no billionaire would bankroll it—there's no profit in it," a claim belied by the philanthropic billions already pouring into science and health services with no expectation of personal return.

The most glowing write-up in the entire volume is that given to Ronald Walters, the principal deputy undersecretary for Memorial Affairs. That title is Washington-speak for overseeing the burials and cemeteries of our nation's veterans. It is an honorable calling, and Walters seems to be one of nature's noblemen, but again, it is a very unusual kind of work in the vastness of American bureaucracy.

And it has something in common with the other jobs described above: it steps on no one’s toes and is not much touched by ideological disputes. If gathering information, studying the stars, and caring for veterans’ cemeteries were all or even most of what the government did, the administrative state would be far less controversial than it is. But, of course, the administrative state does a lot more than that.  It generates thousands of ideologically controversial regulations with which citizens and companies must comply. No doubt some of this is necessary, but the challenging questions are how much and whether government employees are sufficiently sensitive to the fellow citizens who spin the wheels of commerce that pay their salaries. The book does not shed light on these issues.

Oddly, the collection profiles only one government employee who works for a rulemaking agency, the Department of Labor. But that employee, Christopher Mark, does not impose binding rules on industry either. Instead, he worked out a formula for calculating how much support mine shafts need for their roofs—a formula that mining companies may choose to use or not. Again, his work is best seen as information-generating. The argument for government provision is that coal companies might not have incentives to do this work themselves, because it may be hard to profit from.

But this last point is controversial. As the writer Michael Lewis acknowledges, an economic historian has suggested that the real barrier to making roofs safer was the learning curve in using bolts to shore them up, and that the market solved the problem naturally over time, given that mining companies did not want expensive tunnel collapses any more than workers did. Unfortunately, while Lewis takes a skeptical attitude toward this theory, he offers no reason to think it untrue. In a similar vein, Charles Murray has argued that the rate of workplace accidents declined at the same rate both before and after the establishment of the Occupational Safety and Health Administration. Greater wealth and more know-how, not government bureaucracy, reduced accidents.

The one worker who clearly encroaches on the activities of others is Jarod Koopman of the Internal Revenue Service. But he is a cyber sleuth who unearths financial fraud and schemes to evade taxes. The criminals he helps catch rightly deserve little sympathy, unlike the legitimate businesses that bear much of the brunt of government regulation.

The essays are, with one exception, superbly crafted and, like the best New Yorker profiles, provide the reader with a sense of what it would be like to be a colleague of these employees. The exception is an essay by Kamau Bell, who writes about a paralegal who has just joined the Department of Justice's antitrust division. The first problem is his choice of such a neophyte, which is hard to explain except that the paralegal is a goddaughter of his who hopes to be a social activist. Bell tells us almost nothing about what an antitrust paralegal actually does, substitutes first-name testimonials for evidence, and treats "fighting monopolies" as an unalloyed public good. Yet antitrust punishes only the abuse, not the mere possession of market power, and, as the Supreme Court reminds us, the prospect of monopoly profits is the spark that ignites innovation. By ignoring those trade-offs, Bell's essay crystallizes the book's larger defect: it invites us to cheer government in the abstract while averting our eyes from the hard arithmetic of costs, incentives, and unintended consequences that any serious accounting of the administrative state demands.

The essays also share a glaring omission: none confronts artificial intelligence, the algorithmic elephant now charging through the administrative state. No force is better designed to displace some of the very callings these writers celebrate—statistical clerks, archival digitizers, and government paralegals—than machine learning systems already scraping data, structuring case files, and drafting legal summaries in milliseconds. The Department of Government Efficiency's question in any audit is now brutally simple: can a stack of silicon outperform a bevy of GS-12s? By refusing to reckon with that prospect, the book reproduces bureaucracy's oldest vice—looking backward rather than forward and downplaying the private innovation that has made America great.

The post The Righteous Bureaucrat? appeared first on Law & Liberty.

Private Vices, Public Benefits? https://lawliberty.org/book-review/private-vices-public-benefits/ Tue, 27 May 2025 10:00:00 +0000 https://lawliberty.org/?post_type=book_review&p=67333

The title of John Callanan's Man-Devil: The Mind and Times of Bernard Mandeville, the Wickedest Man in Europe is not an exaggeration, since Mandeville (1670-1733) was truly the most scandalous writer of the Enlightenment period. The ambition of Callanan's book is to take Mandeville seriously as a thinker. Notwithstanding all his satire and jokes, Callanan claims, Mandeville put forward a unified worldview, one that we ignore at our peril.

It was particularly through The Fable of the Bees (1714) that Mandeville became one of the most notorious men in Europe. The essential idea of the Fable was that conventional vices—drunkenness, gluttony, luxury consumption, and materialism in general—were all part and parcel of a modern economy based on market exchange. Eliminating such traditional vices would only lead to economic stagnation and less prosperity, which would ultimately weaken states such as Mandeville’s native Dutch Republic and his adopted England. The Fable’s core argument is succinctly captured by its subtitle: Private Vices, Public Benefits.

As Callanan deftly shows, Mandeville addressed the relationship between economics and morality, and gave an answer that no one wanted to hear: “On the one hand, [he] agreed that increased commercial activity would bring about benefits for society. However, it would not—in fact, could not—do so by eliminating individual vice.”

Against Utopianism

The Fable was based on a poem called The Grumbling Hive: or, Knaves turn’d Honest, which Mandeville published in 1705. This poem tells the fable of a beehive as a microcosm of a booming commercial society, in which the vice-ridden but prosperous bees were deeply unhappy in their wealth, constantly “grumbling” about their immorality, while being envious of the most successful among them. This was crucial to their economic success: as the bees were always discontented with their lot and hungry for the latest luxuries, the poor benefited as a result of the industries needed to keep up with demand. But when their god, Jove, answers their prayers by removing all vain and depraved behavior and turning the bees honest, the beehive’s economy fails as all industries and institutions associated with immoral behavior and its corrections become superfluous. As the weakened beehive looks ripe for the taking, it is attacked by a foreign enemy, and most bees are killed. As Mandeville wrote:

Then leave complaints: fools only strive / To make a great and honest hive. / To enjoy the world’s conveniences, / Be famed in war, yet live in ease, / Without great vices, is a vain / Eutopia seated in the brain. / Fraud, luxury and pride must live, / While we the benefits receive.

With a nod to Thomas More, Mandeville was clear that "utopias" were literally "no wheres," only possible as the figment of someone's imagination. Callanan usefully compares Mandeville's recommendation for society with Machiavelli's The Prince. Without necessarily recommending vice over virtue in general, Machiavelli had encouraged his readers not to mistake political expedience for moral rectitude. In a similar vein, Mandeville suggested that we should not conflate a prosperous society with a morally upright one. Just as Machiavelli was convinced that the prince could lose his state if he pursued virtue instead of expedience, Mandeville believed that society could disintegrate if it became obsessed with perfecting the morals of its citizens.

Infamy

It would be wrong to think that this message would have been readily accepted in England. Despite the country being a commercial nation and a rising financial hub, much of the English reading public was, in fact, outraged by Mandeville’s message. When the second edition of the Fable was published in 1723, it was put on trial in Middlesex for attempting to “debauch the nation” and “run down religion and virtue as prejudicial to society, and detrimental to the state.” Although he was acquitted, the public outcry persisted. A few months later, the Bishop of London’s chaplain, Robert Burrow, targeted in a sermon “men, who strike at the foundation of virtue and morality.” In 1732, an anonymous author compared Mandeville to the Antichrist, from which Callanan takes his title: “And if GOD-MAN Vice to abolish came, / Who Vice Commends, MAN-DEVIL be his Name.”

Without being an intellectual biography, Man-Devil does an excellent job of situating Mandeville in his cultural milieu. His philosophical inheritance included such eminent figures as Erasmus, Montaigne, Spinoza, Pierre Bayle, and La Rochefoucauld, among others. A nice touch of the book is that each chapter begins with an epigram from La Rochefoucauld’s witty Maxims, whose importance for Mandeville’s thought is unmistakable. A few of them stand out, including: “Hypocrisy is a kind of homage that vice pays to virtue.”

One of the highlights of Man-Devil is the rich reception history of Mandeville and the Fable. Literary giants such as Alexander Pope, Jonathan Swift, Henry Fielding, and Samuel Johnson all had a view of Mandeville, as did the philosophers George Berkeley, David Hume, Adam Smith, Jean-Jacques Rousseau, and Immanuel Kant. The Methodist founder John Wesley’s reaction was rather typical. “Till now I imagined there had never appeared in the world such a book as the works of Machiavelli,” he wrote in his journal. “But de Mandeville goes far beyond it [sic].”

Mandeville’s defense against his many critics was that he did not encourage vice but only pointed out its permanence. In this way, he placed himself alongside Neo-Augustinians such as La Rochefoucauld, who viewed human nature as fallen. As he responded to his critics: “I am far from encouraging vice, and should think it an unspeakable felicity for a state, if the sin of uncleanness could be utterly banished from it; but I am afraid it is impossible.” Callanan highlights that, given the lack of evidence, we cannot say for certain whether this was indeed Mandeville’s sincere view. He speculates, however, that Mandeville, after having long labored in obscurity, may have been delighted by the notoriety he attained in the final years of his life.


Mandeville’s posthumous reception among philosophers was more complicated. Hume included Mandeville as one of the thinkers who had laid the foundation for his “science of man,” yet argued that Mandeville’s “selfish philosophy” was “contrary to common feeling and our most unprejudiced notions.” Smith, though overtly critical for similar reasons as his friend Hume, could find anticipations of both the division of labor and the invisible hand in Mandeville’s Fable. Rousseau engaged with Mandeville in his Discourse on the Origin of Inequality (1755), whereas Kant listed Mandeville as a crucial figure in the history of ethics.

Intentions and Innovations

Mandeville often purported that the Fable was purely meant as entertainment. But Callanan shows that Mandeville acknowledged his writing had wider ambitions:

If you ask me, why I have done all this, cui bono? and what good these notions will produce? truly, besides the reader’s diversion, I believe none at all; but if I was asked what naturally ought to be expected from them, I would answer, that, in the first place, the people who continually find fault with others, by reading them, would be taught to look at home, and examining their own consciences, be made ashamed of always railing at what they are more or less guilty of themselves; and that, in the next, those who are so fond of the ease and comforts, and reap all the benefits that are the consequence of a great and flourishing nation, would learn more patiently to submit to those inconveniences, which no government upon earth can remedy, when they should see the impossibility of enjoying any great share of the first, without partaking likewise of the latter.

This is a revealing but rare glimpse into Mandeville’s intentions. As Callanan notes at the outset, we know very little about Mandeville’s life, and we do not even know what he may have looked like. We do know, however, that he was a daytime physician, and one of the book’s rather novel aspects is the stress it places on Mandeville’s medical background for his conception of human beings. In 1711, Mandeville published what he took to be his magnum opus, A Treatise of the Hypochondriack and Hysterick Diseases in Three Dialogues. In this work, written after he had been a practicing doctor for twenty years, Mandeville argued that though many may have opted for a medical career with the aim of caring for others, we should recognize that such motives were often mixed with more selfish ones, including fame and remuneration.

Building on much recent scholarship on Mandeville, Callanan holds the second volume of the Fable of the Bees, published in 1729—not to be confused with the second edition from 1723—as especially innovative. It was here that Mandeville introduced his influential distinction between self-love and self-liking. Whereas self-love relates to straightforward self-interest, self-liking is a more complicated passion that is concerned with our capacity to value ourselves higher than others. It is self-liking that makes us fond of approbation and approval. As Callanan puts it, because of self-liking, Mandeville believed that human beings, in contrast to other creatures, “live and die not for the preservation of their lives but for the preservation of their good name.”

According to Mandeville, self-liking is an extremely powerful passion, even if we are usually unaware of its sway. “When we are applauded for extolling the virtues of fairness, impartiality, equality, and self-sacrifice,” Callanan writes, “we achieve the intended result of securing the admiration of others while covering up our true motives, even to ourselves.” This self-regarding passion, often hidden, Mandeville saw as the driver of social norms and indeed sociability itself, which emerges spontaneously. If human beings had been naturally benevolent, society would have developed very differently, and government would have been unnecessary. Because of his understanding of the importance of spontaneous order in human affairs, Friedrich Hayek celebrated Mandeville as one of the “master minds” of social theory, despite conceding that the Anglo-Dutchman may not have been an original thinker on technical economic topics.

Man and Society

Mandeville’s most radical insight, Callanan suggests, was that he saw through the intuitive idea that what is good behavior at the level of the individual will also be good at the societal level. As such, he presented a challenge to Kantian philosophy before it had been formulated. His actual targets were moral philosophers such as the Third Earl of Shaftesbury, who projected individual morality onto the group, indeed the universe. The idea that private vices could be public benefits was thus not just an eye-catching subtitle, but central to Mandeville’s unsettling philosophy. 

We need not endorse Mandeville’s entire philosophy to recognize that the questions he posed are as pertinent as ever. Is it possible to be morally good in a commercial society? Is there space for individual virtue in a market society? Are private vices really public benefits? Modern politics remains centered on the economy, as kitchen-table issues determine elections, populations clamor for growth and rising living standards, and global economic competition intensifies. But as thinkers from Smith to Joseph Schumpeter and Ludwig von Mises have shown, capitalism, despite all its obvious success, can be painful for the individual. Perhaps it is no coincidence that scattered evidence points to an ongoing religious revival among young people in the West, which, if sustained, means that the tension between wealth and virtue could become even more apparent. Wherever we happen to stand, this makes Mandeville an inescapable thinker for our time.

The post Private Vices, Public Benefits? appeared first on Law & Liberty.

Saying No to Smartphones https://lawliberty.org/book-review/saying-no-to-smartphones/ Thu, 22 May 2025 10:00:00 +0000 https://lawliberty.org/?post_type=book_review&p=67404

When I was a kid in the 90s, my parents would often take me and my three younger siblings to a playground or, if the weather was bad (and in northeast Ohio, it often was), to the PlayPlace at McDonald’s. In retrospect, this surely had its downsides. The ball pit was unsanitary, the food probably worse than cardboard, and I vaguely remember being bitten by a kid my age when I wouldn’t vacate the slide. But we were kids, enjoying the innocence of childhood: laughing, shouting, running around, returning to the table only for another bite of french fries.

In the years since, McDonald’s has closed nearly all of its PlayPlaces (including the one we frequented). On X, Nancy French recently shared a “heart breaking” photo of what has replaced the playground at one location in Tennessee: two plastic chairs in front of two computer screens, a poignant symbol of childhood in the twenty-first century.

Childhood and Properly Formed Loves 

In Plato’s Republic, Socrates and his interlocutors agree that “the beginning is the most important part of every work.” At a “young and tender” age, a child’s soul is easily moldable and, once formed, tends to harden into concrete. The book of Proverbs declares a similar rule: train children up in the way they should go, and when they are old, they will not depart from it.

After laying down this principle, Socrates turns to, of all things, music, claiming that a young child’s musical education is “most sovereign” because, more than anything else, it vigorously grabs hold of “the inmost part of the soul,” thereby forming his loves “before he is able to grasp reasonable speech.” The music we listen to shapes what we love and hate, what we judge as beautiful and ugly, what we hold just or unjust. 

Following Socrates’ lead, Allan Bloom warned in the 1980s that children’s addiction to the artificial exaltation of rock ‘n’ roll made it very difficult for them to take their education seriously. His point was not to lament falling standardized test scores or an unprepared workforce—or even to moralize about the lyrical content of contemporary music. Rather, Bloom sounded the alarm that, without “strong counterattractions,” Generation X would be dominated by their immature passions. They would not know the “pleasures of reason” and, therefore, would not know how to live the good life.

Rather than vibrantly loving what is lovely and hating what is hateful, many of Bloom’s students, it seemed to him, had “the color … drained out of their lives,” as if they were recovering from “a serious fling with drugs.” How can the serious study of great texts compete with the spectacle of, say, the Rolling Stones or Michael Jackson? Why would children need to cultivate an imagination when a constant stream of pop music overflows with graphically sexual and violent lyrics? What chance do the habits of the contemplative or prayerful life have against the raw and unearned emotions of contemporary music?

Of course, Bloom admitted, most adults recover from this adolescent obsession and eventually accept the grim responsibility and drudgery of work until they retire (and then die). But, as philosopher Zena Hitz writes, “If I work for the sake of money, spend money on the basic necessities for life, and organize my life around working, then my life is a pointless spiral of work for the sake of work.” If I never ask whether there is something worth doing for its own sake or what “the chief end of man” is, I’ll have no choice but to chase after the wind.

If Bloom was right—that rock music desensitized Generation X to the fundamental questions and deepest joys of life—then digital technology poses an even greater threat to Generation Alpha. For one, the problem is widespread. According to a study by Common Sense Media, more than two-thirds of 8-year-olds have their own tablet, and children aged 5-8 spend an average of 3 hours and 27 minutes on screen media every day. What’s worse, if our societal obsession with rock ‘n’ roll recognized, legitimated, and empowered children’s immature, bodily passions, our addiction to digital technology seems to form passionless, pseudo-disembodied souls. For kids with excessive screen time (colloquially known as “iPad kids”), it’s as if the lights have been turned off.

As a professor at a small liberal arts college, I can attest that many students are “functionally illiterate” and “unable to read and comprehend” serious adult novels. Screens are the medium of moving images, passively received by the user. If text does appear, the medium encourages skimming short paragraphs to inform, not deep reading of great books to challenge assumptions and cultivate wisdom. The “feed” is “snackable”—shaping attention into “discrete bursts that scatter and cascade.” (If you’re reading this article on your device, how much of it have you skimmed?) It’s no wonder that many in the generation raised on screens do not have the attention span or mental fortitude to read a full-length book. In 2024, “brain rot” was Oxford’s word of the year. And now, with powerful generative artificial intelligence at their fingertips, many students (though certainly not all) seem to think that the difficult work of independent thought can and should be “subcontracted” to a machine.

Enter The Tech Exit

“It is easy for me to imagine,” Wendell Berry wrote a quarter century ago, “that the next great division of the world will be between people who wish to live as creatures and people who wish to live as machines.” In The Tech Exit, Clare Morell—a young parent and scholar at the Ethics and Public Policy Center—takes up the task of advising parents about how to bring up creatures rather than machines. “This is the definition of the Tech Exit,” she writes: “no smartphones, social media, tablets, or video games during childhood,” while other screens, such as the family TV and computer, should be used sparingly, publicly, and purposefully. Parents, educators, and pastors will appreciate the “concrete, everyday advice” Morell offers. 

In the first part of the book, Morell cites the research of Jonathan Haidt, Anna Lembke, and Victoria Dunckley in order to highlight the dangers of digital technology—by which she especially means scrolling social media and interactive games on smartphones and tablets—and show why “harm reduction” doesn’t work.

Morell writes that digital tech is more like fentanyl than sugar. Its design features—the infinite scroll through personalized content curated by powerful algorithms and reinforced by constant notifications—work to rewire the reward system of young brains to crave dopamine, a hormone and neurotransmitter that creates and intensifies desire without satiating it. In other words, digital tech hyperstimulates the brain’s dopamine receptors, which is especially dangerous for children between the ages of ten and twelve (during which time their dopamine receptors double). Its addictive properties are a feature, not a bug.

The results are catastrophic. Digital tech hijacks the nervous system, decimates impulse control, and can lead to mental health issues. Habituated to the user-friendly, sleek digital world, children seem to “lose their appetite for things of the real world,” Morell writes.

Few parents, I would imagine, give their children unfettered access to the digital world. Instead, many well-intentioned parents seem to assume that, since screen-based technologies are “an inevitable part of childhood,” they should set time limits and controls on their children’s smartphones, tablets, and gaming systems. 


But Morell argues that parental controls, at best, are harm-minimization strategies—akin to (as she graphically writes) taking your kids to a bar or strip club but having them wear earplugs. In reality, they’re a myth. Since pocket-sized devices are almost always within reach and social media never turns off, they can occupy a child’s thoughts even when not in use. “These apps are designed to create a perpetual craving,” Morell reminds us. I know from my own experience the phantom twitching of a phone in my pocket. Similarly, content controls are porous, maybe intentionally so. To take just one example: Snapchat, TikTok, and Discord ostensibly offer “parental supervision” tools for accounts of 13- to 15-year-olds. (These apps block third-party controls.) But filters don’t always block sites accessed through in-app browsers; plus, the child can change the settings at any time; parents cannot see their child’s “feed” or messages; and there’s nothing stopping children from signing up for an adult account by lying about their age.

In the second and third parts of the book, Morell writes with religious imagery and advocates fasting from digital tech and feasting on the good things of life “in the real world.” She asks parents to commit to “a digital detox”: totally “eliminate all interactive screen time” for thirty days—enough time to normalize “biorhythms and brain chemistry.” A digital fast, Morell warns, will require a heavy investment of parents’ time, at least initially, as they might need to teach kids how to fill their day with board games or outside play instead of Minecraft. If committing to a full month seems daunting, Morell recommends starting with a week, a day, an hour, or even a meal. 

Using the acronym F-E-A-S-T, Morell then offers practical advice about creating “counterpressures” to our screen-saturated society and filling the void left by the digital detox: parents, find other “tech exit” families in your neighborhood or church; exemplify the low-tech life for your kids by physically distancing yourself from your phone while at home; adopt alternatives for your older kids (such as the Light phone, Gabb phone, Wisephone, or Bark phone); set up screen rules (“no aimless surfing” or scrolling); and trade screens for real-life responsibilities. Parents already familiar with the dangers of digital technology will probably find these chapters the most helpful, even if the acronym is a bit forced.

How realistic is Morell’s advice? Digital tech has reshaped the world, at times shortening the path to real human goods, perhaps especially for older kids. FaceTime (almost) eliminates distance. Language-learning apps, math games, Chessly, and lectures on YouTube present engaging, world-class instruction on portable screens. Coordinating plans is difficult if you can’t join the group chat. 

For parents, a book on parenting can feel intensely personal, like it’s a stranger in the grocery store making judgmental comments about how you’re raising your kids. Morell writes that no one likes being told what to do. And she tries not to, admitting that the tech exit is not without cost and that many parents she interviewed made reasonable exceptions to the no-tech rule. For example, GPS is a must for some teenage drivers, and some older kids might really benefit from learning to play chess on one of the family computers. But, all else being equal, parents should help their kids choose in-person activities over screens and intentionally use the latter as a tool to accomplish specific tasks rather than as an interface mediating their experience of the world.

Will kids growing up in a tech-free household lack the habit of moderation and simply binge on digital tech once they reach adulthood, in a sort of digital rumspringa? Morell provides anecdotes from the parents and adult children she interviewed. They report that, having developed good habits when they were young, they now use their smartphones as the tools they were designed to be. The evidence, though obviously limited, makes a certain amount of intuitive sense.

Community Solutions?

The book concludes with policy solutions. First, and most urgently, Morell argues that K-12 schools should go screen-free, prohibiting students from possessing smartphones on school grounds and getting rid of “educational” screens like Chromebooks, tablets, and laptops. Morell is on solid ground here. As of 2024, more than 30 percent of countries worldwide (and at least 19 states in America) ban cell phones in schools. Second, Morell argues that Congress should ban social media for minors—and, until it does, individual states should take steps towards that end by, for example, requiring parental consent before a minor creates a social media account.

But Morell’s book does not address the latest push for tech in schools: equipping students to use generative AI, as President Trump’s recent executive order seeks to do. Whereas one school in Texas boasts that replacing teachers with AI has boosted students’ enthusiasm and test scores while reducing time in class, some studies suggest that generative AI can harm learning, especially when used as a “crutch.” How should parents weigh the pros and cons of this powerful new tech? In addition, Morell too quickly dismisses legitimate concerns about the constitutionality of social media bans. For one, a nationwide ban would seem to transgress the limits of Congress’s constitutional authority. Her recitation of the many state laws passed in the past two years belies her argument that federal action is necessary. And courts have consistently ruled that social-media bans violate minors’ free speech rights, about which Morell says surprisingly little, only mentioning in a footnote that such laws “will have constitutional challenges to overcome.”

Still, Morell has done good work for parents concerned with the dangers of digital tech in their children’s lives. Screens are not an inevitable part of childhood, and The Tech Exit helps illuminate the path towards living in the real world—that is, as Wendell Berry put it, living as creatures and not as machines. Because the mind is free, but our days are numbered, we ought to turn our hearts to wisdom.

Resisting the temptation of screens is a constant struggle. A dad myself, I now find myself in my parents’ shoes, trying to find someplace dry in which my kids can run out their energy. Our local library has a small, indoor play place, but the last time we went, it was closed for cleaning. The four gaming computers in the kids’ area, however, were open.

The post Saying No to Smartphones appeared first on Law & Liberty.

The Right’s Descent into Fascism? https://lawliberty.org/book-review/the-rights-descent-into-fascism/ Wed, 21 May 2025 10:00:00 +0000

From the 1980s through 2007, numerous books warned of the threat that the “religious right” posed to America. Tiring of that language, beginning in 2006, critics began to speak instead of the dangers of “Christian nationalism” which, in the words of one of its most prominent students, poses “an existential threat to American democracy and the Christian church in the United States.” I critique much of this literature in my recent book on Christian nationalism and have suggested that critics would soon tire of the “Christian nationalism” slur and move on to more provocative language. Katherine Stewart obliges in her recent book, Money, Lies, and God: Inside the Movement to Destroy American Democracy.

Stewart’s manuscript went to press before the outcome of the 2024 presidential election was known, so she merely expresses concern that “the descent into fascism—if it hasn’t already happened by the time these pages reach you—remains the most likely path through which the American experiment ends, if it is to end.” She believes that American democracy is threatened by “a leadership driven movement” that has “no single headquarters” but instead consists of “powerful networks of leaders, strategists, and donors” who sometimes seem to seek different ends, e.g., small national government vs. massive administrative state, but who all apparently agree that America’s liberal democracy must go. The book is divided into three major sections that roughly track its title: Money, Lies, and Demons.

Following the Money

Central to this “movement” are wealthy donors who contribute money to conservative or libertarian causes. These include well-known individuals such as “Betsy DeVos, the Wilks brothers, Rebekah Mercer, Tim Dunn, and the Koch brothers” as well as less well-known ones such as Elbert Hubbard (a pseudonym), James and Joan Lindsey, and Timothy Busch. She occasionally concedes that there are plenty of politically progressive donors, but dismisses them because “with some exceptions, liberal and progressive money tends to go to siloed causes.”

Getting a particular person elected to the presidency would not seem to be a “siloed” cause, but the project certainly attracted a lot of wealthy donors in 2024. According to the New York Times, in the 2024 race for the presidency, “the Democrats, their allied super PACs and other groups raised about $2.9 billion, versus about $1.8 billion for the Republicans.” Although far more money was spent promoting Harris than Trump, she still lost a fair and democratic election. Perhaps democracy is healthier than Stewart thinks?

Throughout the book, Stewart writes as if conservative groups are extraordinarily wealthy and as if progressive groups are underfunded. For instance, she regularly references the Alliance Defending Freedom (ADF) as an example of a “massive,” well-funded, conservative legal advocacy group. It is not clear to me that defending religious liberty and freedom of speech are “conservative” causes, but ADF did raise $101 million in 2022. In the same fiscal year, the Southern Poverty Law Center had revenues of $169 million, and the American Civil Liberties Union had revenues of $162 million (all according to IRS Form 990 filings from 2022). But perhaps the latter two organizations are “siloed” and, as such, are irrelevant?

Similarly, Stewart describes the Becket Fund for Religious Liberty as a Catholic group that is one of the “beneficiaries of plutocratic largess,” even though it had revenues of “only” $22 million in 2022. But Becket does not describe itself as either a “Catholic” or “Christian” legal advocacy group, although one may be excused for thinking an organization named after Thomas à Becket is both. Indeed, I introduced a Muslim attorney who once worked for Becket as having worked for a Christian advocacy group. She immediately corrected me.

As it turns out, this Muslim attorney was on the Becket team that represented the Greens in Burwell v. Hobby Lobby Stores (2014), a case that Stewart describes as “extending religious privileges to corporations.” Religious liberty is a right, not a privilege, a fact understood by the Democrats and Republicans who approved the Religious Freedom Restoration Act of 1993. It was this law that protected the owners of a closely held corporation from having to provide abortifacients to employees against their religious convictions. Nor was this the first case to protect the religious liberty of a corporation (see Gonzales v. O Centro Espírita Beneficente União do Vegetal [2006]). And the church in that case was not the first corporation to be protected by the First Amendment, as cases such as New York Times Co. v. Sullivan (1964) demonstrate.

Exposing the Lies 

In addition to funding conservative legal advocacy groups, donors support organizations that, from Stewart’s perspective, provide intellectual justification for fascist policies. These include the Federalist Society, the Heritage Foundation, the “James Madison Center” (such a thing exists, but I’m pretty sure she means Princeton University’s James Madison Program), the Edmund Burke Foundation, the Claremont Institute, and the State Policy Network.

A good example of one of these organizations providing intellectual leadership to the new right is the National Conservatism Conference (NatCon). Here, conservative intellectuals gather to give speeches critiquing the old right and casting a new vision. Stewart attended NatCon 2023, where she heard conservative intellectuals advocate “a mix of nationalist rhetoric (in whatever nation the conversation happens to be taking place); vague (and typically insincere) gestures toward economic populism; and copious amounts of hate for liberals and, worst of all, ‘the woke.’” She offers no justification for calling proponents of economic populism “typically insincere,” and she admits that “what this new vision involves is very hard to specify with precision.”


It is understandable that a progressive such as Stewart would find speeches at NatCon to be objectionable. But to paint NatCon as part of an intellectual movement that includes the Federalist Society, Hillsdale College, the James Madison Program, the State Policy Network, and the Claremont Institute beggars belief. Stewart’s presentation overlooks the many disagreements between and sometimes even within these organizations. To name one obvious difference, many conservatives take NatCon speakers advocating economic populism seriously and are quite critical of them. But it gets worse.

Stewart often characterizes these organizations as having a united front (upon occasion, she concedes that they have some differences), but many of their positions (e.g., being pro-life, pro-religious liberty, and pro-school choice) have been held by conservatives since the 1980s. In order to make them seem scary, she engages in guilt by association. For instance, Stewart gives examples of a scholar associated with the Claremont Institute criticizing feminism and advocating for more “traditional” gender roles. She attempts to connect these positions to the worst sort of misogyny by noting that a former “recipient of a Lincoln Fellowship at Claremont [once] reportedly wrote ‘feminists need rape.’” I suspect “reportedly” is doing important legal work here. To support her claim, she cites a Mother Jones article stating that the individual in question posted but then deleted this statement on his blog. If I were a reporter, I’m not sure I would want to rely upon this evidence if I were sued for libel.

But let’s assume that this person wrote and then deleted this offensive and potentially dangerous statement. Claremont accepts 14-15 young professionals each year to participate in the week-long Lincoln Fellowship program. Is it reasonable to hold the Institute responsible for every foolish, hateful, or even dangerous thing past participants ever post?

Even worse, she then transitions from the scholar associated with Claremont to discuss books and Internet posts by misogynists like Bronze Age Pervert (BAP) and one of his fans, Justin Murphy. She connects BAP to the conservative movement by pointing out that Michael Anton reviewed BAP’s book, Bronze Age Mindset, in the Claremont Review of Books, although she does not make it clear that the review was a negative one. Make no mistake, Bronze Age Pervert and Murphy are troubled individuals, and they have Internet followers, but I don’t know anyone at the institutions Stewart highlights who shares their views.

Stewart’s second rhetorical strategy is to use adverbs and adjectives to make conservative thinkers and institutions sound scary. Take, for instance, the case of Princeton Professor Robert P. George, “the reactionary Catholic activist” who is also described as an “ultra conservative legal activist.” George is known for opposing activist decisions such as Roe v. Wade (1973) and Obergefell v. Hodges (2015), but he is also known for traveling around the country with the very progressive Cornel West, People’s Party presidential candidate in 2024, and engaging in civil dialogue about contentious issues. His most recent book, co-authored with West, provides a model for such dialogue.

Similarly, Father Richard John Neuhaus is described as an “ultra-right-wing convert” to Catholicism, the Acton Institute is “hard-right,” and Hillsdale College is a “Christian nationalist school” and “the far right institution that operates as a gathering spot for thinkers of the anti-democratic reaction.” Christopher Rufo is a “reactionary henchman” and the Claremont Institute is an organization “wallowing in misogyny and hate.”

Stewart’s rhetorical strategy of using lots of adverbs and adjectives will likely get lots of nods from readers already convinced she is right (for example, folks like Andrew Seidel and Bradley Onishi, whom she quotes as authorities and whose work I critique here). But it will have little impact on undecided or conservative readers open to being convinced that there is a serious movement committed to undermining America’s liberal democracy. 

Exorcising the Demons

The Heritage Foundation, Claremont Institute, etc., may provide intellectual arguments, but Stewart recognizes that they do not have many followers who will serve as foot soldiers in a war against democracy. In the third section of her book, which is entitled “Demons,” she explains that these soldiers are primarily generated by organizations committed to motivating conservative Christians to be involved in politics. 

Stewart is a journalist, and she is at her best when she visits meetings of groups like Faith Wins, the Conservative Political Action Conference, Moms for Liberty, ReAwaken America, etc., and is able to report silly, outlandish, and sometimes even dangerous things speakers or participants say. For instance, she describes a mom who is terribly concerned with “the homosexual content” of The Diary of Anne Frank, men at supposedly Christian events wearing shirts proclaiming “God, Guns, and Trump” and “F*ck Biden,” and speakers saying things like “Hitler was fighting the same people we’re trying to take down.” I presume her accounts of these events are true, but I also doubt that they characterize many of the people or speakers at the meetings she attended.

Stewart highlights, as does Matthew Taylor in a recent book, Pentecostal Christians such as Paula White-Cain who believe that Christians need to take “dominion” of the “seven peaks of modern culture—government, business, media, education, entertainment and arts, family, and religion.” Christians have long advocated for bringing Christian values into every part of life. That is why, for instance, the owners of Hobby Lobby and Chick-fil-A insist on closing their businesses on Sunday. And yet, it is understandable that non-Christians get nervous when Pentecostal leaders of the New Apostolic Reformation call for spiritual warfare against “demons” and “territorial spirits” and lead prayers declaring things like “we have been given legal power from heaven and now exercise our authority.” But adherents to this movement account for a small percentage of the 4 percent of Americans whom Pew deems to be Pentecostal.

Most of Stewart’s book is devoted to a scary story of America’s descent into fascism. There is truth to almost all of her claims—i.e., conservative donors exist, as do conservative intellectual organizations and individuals, and there are conservative Christian groups aimed at mobilizing voters. She occasionally acknowledges that these individuals and organizations have different goals and so perhaps shouldn’t be considered part of a coherent movement. She references individuals like Bronze Age Pervert who say horrible things, and she quotes Christian conservatives as saying things that are undemocratic, but in the final analysis, she doesn’t come close to making the case that there is a movement afoot to end American democracy and replace it with fascism. 

Despite the extended horror story, the book ends with some hopeful notes, including her claim that “those of us who reject the politics of consequence and division—are in the majority.” This claim is ironic coming from Stewart, given the divisive nature of her book, but I think it is mostly accurate. Although too many Americans (23 percent) agree with the statement that “because things have gotten so far off track, true American patriots may have to resort to violence in order to save our country,” the vast majority disagree. BLM riots, the January 6 assault on the Capitol, vandalism of Teslas, and attacks on Jewish students are very concerning, but the vast majority of Americans remain committed to non-violent political action and liberal democracy. 

Stewart’s concluding chapter contains a variety of proposals that would help fend off fascism. These include adopting a system of progressive taxation because the “present system of taxation is flat at best, and even regressive at the top.” This might come as a surprise to “the top 10% of U.S. earners who pay 72% of the nation’s [income] taxes.” And, ironically for someone concerned with preserving America’s constitutional order, she suggests that the Electoral College and the equal representation of states in the Senate may need to be abolished.

America, it is safe to say, is not about to descend into fascism. But that will not keep critics like Stewart from comparing conservatives to fascists and Trump to Hitler. Such rhetoric sells books and may be personally satisfying, but it does not contribute to the sort of healthy political discourse necessary for a well-functioning liberal democracy.
