A Thousand Years from Now
Mankind’s entering the third millennium was an event that may quickly turn into a non-event in spite of the media’s selfish attention to it. Quite different was the event which passed without really being noticed and yet became the starting point of a reckoning now in its third millennium. The only notable who took note of it, though unintentionally and very briefly, was a monster called Herod.
The event, the birth of Jesus in Bethlehem, was recognized in Christian antiquity as the birth of One who was a person in that transcendental measure in which a true God, who lives as three Persons in one divine Nature, transcends all nature. One may disagree with this profound theological (as well as metaphysical) insight, but one is not free to disregard a fact of intellectual history. A clear awareness of such a transcendence secured meaning to man as a person with inalienable rights. Nothing shows the fallen state of man more than his slighting of the very factor that vindicated his special status as a man. Oblivion to this factor set the tone of all the leading editorials about man’s passing from the second into the third millennium. They were resolved to sing at the top of their voices, “Glory to Man in the highest,” so that the tune, “Glory to God in the highest,” might be drowned out.
Tellingly, the name of Jesus was least to be heard in this cavorting in decibels. It was, of course, a blessing in disguise that Jesus was not mentioned by a brazenly amoral Washington officialdom as it greeted the onset of the new millennium. The celebrations held in Independence Mall were a further signal of the resolve to remain independent of the hold of any absolute truth and tenet. Nor was the name of Jesus to be heard under the Millennium Dome in London, the capital of a nation which in all its disbelief stubbornly clings to remaining a monarchy whose crown has the “Defender of the Faith” as its chief decor. As irony would have it, in the City of Light the huge clock set up on the Eiffel Tower stopped its countdown ten minutes before midnight. In the frenzy of excitement nothing symbolic was noticed in that mechanical failure. Progress has so far repeatedly stopped just before it seemed to cross the finish line.
When, a thousand years ago, the year 999 had to yield to the year 1000, the excitement was much less than some imagine today. Luckily for those times, there were not yet media-men around to create the news instead of reporting it. A certain class of historians can, of course, treat any phase of the past so that it may appear to conform to the specifications of the present. On reading books such as E. Weber’s Apocalypses: Prophecies, Cults, and Millennial Beliefs Through the Ages (1999), one cannot help thinking of Chesterton’s observation: “Human history is so rich and complicated that you can make out a case for any course of improvement or retrogression.” Some historians still have to learn that proper research into history demands more than raking up from the record only such facts as match some momentary predilections, or something even less creditable.
It would be most unreasonable to assume that the sudden upsurge of vigor in Western Christendom from the middle of the eleventh century on was due to a mere sigh of relief: the fearsome watershed had been crossed without anyone being harmed and therefore it was all right to get on with improving one’s lot. More substance than a mere sigh had to be behind the feverish rate at which new cathedrals began to be built with startlingly new techniques, and with schools around cathedrals that quickly grew into universities. Tellingly, it was in the twelfth-century cathedral school of Chartres that people first began to look upon themselves as moderns. They certainly did not claim to have invented the sixth-century Latin word moderni, which originally meant modish. If they had cared to label the time in which they lived, they would have called it the modern age. But they were more intent on substance than on coming up with catchy labels.
This was so because they were moderns in the sense which, a thousand years later, came to be tied to the word “postmodern.” That word was invented to abuse those who had had enough of the intellectual and moral farce that began to parade under the label “modern” once the medieval centuries had gone. The medievals boldly looked into the future, but only because they felt that they were “sitting on the shoulders of giants,” the great minds and characters of previous times. And they were sincere in admitting their indebtedness to the past. There was in them no posturing of the kind Newton indulged in when, in false modesty, he quoted the same phrase, which by then sounded rather hollow.
Not that those scholars in the school of Chartres were antiquarians. Unlike their Renaissance successors, they did not make meticulous collections of sundry dicta of Greek and Roman sages as if they alone could teach anything to man. The medievals handled ancient history with sound respect because they knew that antiquity had seen the moment that marked the fullness of time. And that fullness taught them something which classical antiquity was loath to accept, namely, that man was lacking something very important, because man had lost it long before antiquity set in. As with all losses, this, too, related to the loss of something additional to man’s nature. For what really comes with man’s nature cannot be lost.
Being taught about that loss, and about its partial recovery through redemption in Christ, the medievals learned something most important about the process of learning. As any sensible educator knows, learning is assimilation by the learner. This in turn is determined by the mental structure of the pupil himself. The mental structure of the medievals was riveted to the event that marked the beginning of the first millennium. They were consciously and enthusiastically Christians even when they fell far below the standards set by Christ. When they fell they knew they were down. They never took a slump for a rise, a descent for an ascent. They never gloried in what could only be their shame.
Rousseau, that prime herald of modern man, loathed nothing more than the doctrine of original sin. A curious loathing it was, because Rousseau tried to resurrect man on a purely empirical basis. Except that he took a fake ideal (marketed with great effectiveness by Margaret Mead earlier in this century), the allegedly innocent savages of Polynesia, for real beings. Not that Rousseau wanted to defend innocence. He rather wanted to make it appear that in his original form, as allegedly exemplified by the aborigines, man could not care less about innocence. Rousseau indeed ignored all the sad reality of man (and woman), including the one which he himself embodied. Even noble pagans, to say nothing of ignoble ones, evidenced ordinary man’s proneness to error and to doing evil. This (Rousseau provided many proofs in his Confessions) remains an outstanding empirical fact even when theology counts those pronenesses as the first two of the four secondary effects of original sin. Even more empirical, if possible, are the third and fourth—suffering and death—which again need not be labeled theologically in order to stare man in the face.
But whether one takes theology (or the Penny Catechism, for that matter) as a guide or not, it makes an immense difference if one takes reality for what it is, or if one tries to talk it away, let alone if one acts and thinks as if reality were not real. It became the mark of modern times to glory in turning things inside out and applaud those who claim that it is better to be a fake somebody than a real nobody. As long as man respects reality, he remains open to learning important things about himself and others. Otherwise he merely pretends that he has to learn some essentials, either about others or about himself. Actually, he thinks, as he looks condescendingly at himself in the mirror, that he has nothing essential to learn. This is the kind of man who has unlearned something essential about himself. And therein lies the drama of the modern age as it passes from the second to the third millennium.
Examples of man’s condescension towards himself are not a dime a dozen, but literally legion each and every day. But they took especially poignant forms around the very end of this second millennium, when editorials, filled with the euphoria of progress, handed down infallible views on sundry topics without leaving out any area of importance. They spoke about everything except that man was very fallible, and that his gravest failings were about most grievous matters. Of course, those editorials did not say that man could not fail, but they made it appear that it was entirely up to man to avoid any failure, even those that inevitably followed whenever he did something apparently very good.
Take, for instance, the sanctimonious warning which The New York Times handed down in an editorial, “Watching for the Y2K Bug,” on the next to last day of 1999. It came to a conclusion with the remark: “Unknown problems of our own making are an enduring part of existence.” Since among those in the know the problem of Y2K (a problem non-existent for the new generation of chips introduced in the late 1980s) had been an open secret for at least thirty years, the example was out of place. In fact, most of our problems arise out of plain disregard for obvious consequences, which we ignore because, either for profit’s or for comfort’s sake, we prefer to cut corners. Even twenty years ago, too many chips would have been needed to accommodate software with three-, let alone four-digit time markers. Considerations of marketing decided that the problems which surfaced as Y2K be swept under the rug.
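For readers who never met the bug in the flesh, here is a minimal sketch in Python (hypothetical code, not drawn from any actual legacy system) of what those two-digit time markers entailed: date arithmetic that is correct within one century and nonsense across the century boundary.

# A minimal sketch (hypothetical, not taken from any real system) of the
# two-digit year storage behind the Y2K bug.

def two_digit_year(year: int) -> int:
    """Keep only the last two digits, as memory-starved legacy software did."""
    return year % 100

def years_elapsed(start: int, end: int) -> int:
    """Naive elapsed-time computation on the truncated years."""
    return two_digit_year(end) - two_digit_year(start)

print(years_elapsed(1970, 1999))  # 29: correct within a single century
print(years_elapsed(1970, 2000))  # -70: the year '00' reads as 1900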
Then there was the no less sanctimonious preachment delivered by Francis Fukuyama, the prophet of the end of history, who in The Wall Street Journal ruminated: “It Could Have Been the German Century.” This might have been the case if the German Imperial Army had broken through the French lines in August 1914. Surely, in that case Lenin would not have been transported by desperate Germans from Switzerland to Russia, nor would Hitler have arisen. There would have been no World War II, no long and short marches by the minions of Mao, no opportunity for smaller madmen who thought they knew Marx’s thought better than all previous Marxist gurus. Fukuyama had to admit that such a hypothetical scenario had no small entries on its positive side. The European population would not have been decimated in 1914–1918; there would have been no Holocaust, no Gulags, no Great Leap Forward that meant starvation for perhaps as many as 60 million Chinese. But the victory of democracy would have been delayed, and with it the global triumph of capitalism. Thus the onset of “the end of history,” as imagined by Fukuyama, would also have been postponed, perhaps by a hundred years or more. Shed a tear. . . .
Fukuyama’s sermonette would have been distasteful enough had it been only an example of Hegelian cogitation for which there are no real differences between facts and ideas, and therefore no room for moral revulsion, for plain outrage. If even a hundred or so million innocents, sacrificed on the altar of mad ideologies, cannot prompt an academic to break down in uncontrollable lamentation, there is something patently wrong with his ideology.
It is, of course, never popular to rake up some dirt of recent history when things look so rosy for the many who are relatively very few. The explosive rise of the stock market allows the media to lull the public into thinking that the millennium is around the corner just as the new millennium begins. This new age, like any other age, will have its own ethos as well as its own ethics, because ethics is merely the reflection of ethos, or the climate of thought.
The ethos keeps revealing its shallowness, because pragmatism can have no depths. Surely there is something perversely shallow in the kind of ethics whose sole unethical parameter relates to what is inevitable in a capitalist democracy: most marvels of biotechnology will be available only to the well-to-do, and some of those marvels only to the very, very rich. Surely there is something revoltingly shallow in the pressure put on medical insurance companies to pay not only for Viagra for men but also for birth control devices for women, and thereby reduce the imbalance between male and female.
Surely, the utter hollowness of the pragmatism of the modern age reveals itself when frenzied recourse is taken to mere labels so that moral depths, or rather the depths of immorality, may be covered up as soon as they surface. A case in point is the media’s reaction to the first real political debate of the year 2000. The place was the Johnson Auditorium of the University of New Hampshire, with Bill Bradley and Al Gore as the actors, though by all evidence they would have been the last to want to confront the thorniest of issues. It fell indeed to a reporter to raise the issue, and right at the outset. From that moment on the debate would have been about something substantial, had it not been the fashion to deal with the manner of handling a case rather than with the case itself.
The case was Bill Clinton’s attitude during the long months of the Monica affair and Al Gore’s reaction to it. Bill Bradley could be pleased, but only up to a point. As an outright liberal he must have known that he was just as defenseless as his opponent. Bill Bradley, once the head of students at Princeton who took an active role in the services of the University Chapel there, failed to deplore Bill Clinton’s immorality on ethical grounds. Al Gore, once a theology student, has yet to come clean in matters of divinity. And it is precisely the media, so proud of its investigative role, that would never investigate either those ethical grounds or that theology.
Rather, the media began the cover-up under specious but universally accepted labels. Thus, in reporting that the issue was raised by a reporter at the very beginning of the debate, CNN hastened to label the reporter as one working for the “staunchly conservative” Union Leader, the premier daily of New Hampshire. In other words, whenever the categories of truth versus error, right versus wrong, good versus evil, virtue versus vice make their appearance, they are to be covered up by bringing in the categories of mere pragmatism, of which none is more effective and treacherous than the contrast between conservative and liberal.
It seems that the liberals are more in the know than are so-called conservatives as the two sides resort to the technique of labeling one another. The liberals, of course, have the advantage in that they are hardly ever pressed to define liberalism. That is a great advantage, because the luxury of not being forced to define one’s position leaves one free to wriggle without limit about the limits of programs of liberalization. Liberalism is the privilege, accorded by conservatives to liberals, of going on demolishing limits with no obligation to draw the line where even liberals would have to stop once and for all.
Conservatives, or rather their great majority, would be hard pressed if asked what exactly they want to conserve. The comfort of the mere status quo? The pleasure of seeing one’s children thinking no differently from their grandparents? The permanence of mere hairdos or the length of skirts? The stability of prevailing banking rules? The supple rigidity of political power brokering that favors the conservatives? Or something deeper?
Of course, there are conservatives who know better. But only a few of them dare to call a spade a spade, and when they do they all too often put their feet in their mouths. This they do when they identify conservatism with politics and, horribile dictu, with capitalism. Even from a purely tactical viewpoint these identifications are simplisms, to say the least. Today, more than ever, what John Henry Newman wrote to a nephew of his, a distinguished professor of mathematics at the University of Manchester, remains true of politics. For only one who is not thoroughly disgusted with all sorts of shady political deals would disagree with Newman that “to touch politics is to touch pitch,” a slimy, sticky, stinky substance indeed.
As to capitalism, conservatives, like anyone else, love to conserve their holdings. Money is not an unqualified evil, even though the love of money remains the root of all evil, to recall Paul’s dictum. That dictum is not only pithy, worth inserting in dictionaries of quotations, but should also count as a revealed truth for many a conservative. For not a few conservatives, and this present writer is one of them, the accumulation of money, or the art of capitalism, is acceptable only when hemmed in by the strict moral guidelines set forth in the great social Encyclicals. Some Catholic advocates of capitalism have now thrown those guidelines overboard to such an extent as to create some strange perspectives indeed.
One of these is the claim that John Paul II, in his Centesimus annus, drastically revised the teaching which Leo XIII had set forth in his Rerum novarum and which, forty years afterwards, Pius XI restated in his Quadragesimo anno. Those who make that claim seem to have muddled not only their theology but also their notion of capitalism. One need not go as far back as the time when the seventeenth century turned into the eighteenth, although it is never useless to keep in mind remote origins. It was around 1700 that John Locke laid for the nascent United Kingdom philosophical foundations that included a very new politico-economical theory, one which was to set the tone of the modern age as the harbinger of the ultimate victory of capitalism. According to Locke the purpose of the political state was to secure the unhindered accumulation of private holdings. There was no trace there of an income tax, let alone of the taxing of capital gains.
It should not be objected that in his Populorum progressio Paul VI called for breaking out of the hellish circle of poverty. Neither from his spirited appeal, nor from the fact that modern technology provides the tools necessary for abolishing at least the dire forms of poverty, should one conclude that what is possible will also be realized. Tools do not guarantee their proper use. No age knew that so well as those Middle Ages which some Catholic conservative gurus of capitalism try to paint as the cradle of capitalism. That effort is no better than the eagerness of those who tried to identify the Calvinist work ethic with that cradle.
The fact that one such presentation of the Middle Ages, indeed of Christianity, as the cradle of capitalism appeared just a week before the onset of the new millennium in The Wall Street Journal speaks as much of that paper as of the presentation itself. There was not a word in that presentation, “How Christianity Created Capitalism,” about the late-medieval agonies over whether usury was still present when modest interest was taken on a loan, or about the keen awareness at that time of the blunt dictum of Saint Jerome (no small mind, to be sure, and a saint at that) that merchants live in a permanent state of sin. Not that they necessarily live in that state nowadays, but some of them certainly do, which is not a fault of capitalism, but still a capital fault for which the remedy can come only from outside the frenzied atmosphere of the stock markets. At any rate, when a presentation, whether its claim is true or not, is so devoid of facts and data, it deserves to be dismissed, as a letter to the Journal declared a few days later, as nothing more than a fiction.
As to the Journal itself, its editorial comment on the worldwide celebrations of the onset of the new millennium is worth a comment or two. The editorial singled out the ceremonies in Saint Peter’s as the most moving of them all. But this is as far as that daily mouthpiece of liberal capitalism would go in the way of endorsing religion. The endorsement would gladly view religion as a technique for immersing man in higher forms of aestheticism, but not as a means of re-tying man to God, to recall the etymological origin of “religion” in re-ligare. For as long as man thinks of God, he can logically think of God only as the source of truth and not of mere opinions. But this is precisely the kind of thinking which is anathema to such critics of leftist “liberalism” as the Journal. That “liberalism” has its daily trumpeter in The New York Times, where it is still being bemoaned that the great experiment called socialism (the Soviet Union) came to grief. There accolades are still heaped on such giants of political science as Henry Kissinger, who as late as 1987 was certain that the Soviet Union would remain the other superpower for the next hundred years. In retrospect, even a sophomore in political science would rightly guess that if the Soviet Union had enough strength to survive for another hundred years, it would still be around a thousand years from now.
Why is it, one may ask, that in Kissinger (1992), an almost thousand-page-long semiofficial biography of Kissinger, there is not a single word about John Paul II, as if the Polish pope had not been indispensable for Reagan to break the back of the evil empire? Why is it that Reagan’s phrase, “the evil empire,” smacks of McCarthyism in “polite” circles, all hell-bent on their own evil politicking? Why is it that in reviewing the century, an editorial in The New York Times dismissed Reagan as one about whom the verdict is still out as to whether he was a visionary or a mere “simpleton”? Clearly some thinking characteristic of this modern age is out of joint. Spokesmen of this modern age try to separate what God joined together and to fuse what has always been separate.
For if there is an epitome of the perversity of the modern age, it is the coining of the acronym GAY. It was coined not so much to remove the legal strictures on strictly private homosexual acts as to launch a moral crusade on behalf of homosexual unions. GAY stands for “Good as You,” that is, for the claim that living in homosexual unions is morally as good as living in “straight” marriages. In this crusade there came to a head what has been the chief aim of the modern age from its inception. That inception is to be located in the Renaissance, which aimed at the rebirth of that paganism which found in Christianity its sole real challenge and antagonist. The truth about the Renaissance has long been an open secret, but recently Christians have become rather secretive about it. It took no small courage on Etienne Gilson’s part to put the matter bluntly two generations ago: “The Renaissance marks the opening of an era in which man will profess to be satisfied with the state of fallen nature.” It was most proper for Gilson to state this in his The Spirit of Mediaeval Philosophy (1932).
Today, although the evidence has since “centupled,” it takes heroism to say the same. But the facts are there for anyone who reads any daily paper with open eyes. To live in a fallen state is bad enough, but in no respect is man so fallen as in his taking a studied satisfaction in his fallenness. It is that satisfaction which eliminates the kind of reaction known as revulsion or outrage. There was no trace of outrage in that early January 2000 report in The New York Times, “Skin Cells Bring Cloning a Step Closer to Efficiency.” The report was an advance notice of an article to be published in the Proceedings of the National Academy of Sciences containing the results of experiments whereby cells taken from the skin of a bull’s ear led to the successful cloning of four calves, now 4 to 9 months old. According to the calm words of the reporter, the significance of the new technique goes far beyond animals: “Human cloning, if ethically acceptable, might find a useful niche in assisted-reproduction clinics when other methods do not work.”
The bland use of the word “if” suggests that there would be no barriers to that acceptability. The word “useful” betrays a pragmatism that aims at supplanting ethics. As to the expression “assisted-reproduction clinics,” it is another effort not to focus on ethical concerns. The new ethics is guided by practicability. And all this is offered by the reporter without batting an eye. The same impression is given by the comments of a scientist intimately involved in the new technique, comments offered as having immediate relevance for humans. Mario Capecchi of the University of Utah, the scientist in question, was reported as having stated that “he would not be particularly concerned if a very wealthy eccentric individual desired to produce a clone of him or herself.” His concerns, obviously purely pragmatic, were, so he stated, allayed by the obvious: “The drawbacks of the procedure would give it little chance of becoming popular.” Here is the modern age in a capsule.
The capsule’s contents are potent indeed, but of a piece with the resolve of the Renaissance to target what is most vital to Christianity: its moral core. This is why the latest upsurge of anti-Christianity, and especially of anti-Catholicism, is so significant. Believers are under increased pressure to rethink their lifestyle, to take it for just one of the alternative lifestyles. Clearly, a modern age that glories in deconstructionism on the intellectual level has only one intellectual weapon to use, which is to take a conceptual crowbar and monkey wrench to everything that appears to be “postmodern.”
This tactic can achieve something only with those who still claim to be thinkers, however confusedly. But any naive Tom, Dick, or Harry can be demolished in his inarticulate faith if he is ruined in his moral stance and integrity. Nothing comes so naturally to fallen man as constructing a set of perverse dogmas on behalf of moral perversity. The easiest means is to change the vocabulary. There is nothing new in this. It had to have been a widely accepted practice in pre-Confucian China; otherwise Confucius would not have stated that if he were appointed the lord of the universe, he would first restore the proper use of words. Today the dilution of words’ proper meaning goes on at an accelerated rate. Not only the stock market but also intellectual discourse is fueled by the new products that flood the marketplace and call for ever new twists to be given to words if they are to be successfully marketed.
The present writer, whose forty-odd books are in large part on science, its history, and its philosophy, may be allowed to take from those subjects his concluding considerations, or rather from what appeared in that connection in prominent news organs as the new millennium dawned on us. One is a lengthy though inept argument in The Wall Street Journal by Norman Podhoretz that “Science Hasn’t Killed God.” His essay, almost a full page, contains platitudes about both God and science, very little as to why they have come into repeated conflicts in the past, and next to nothing as to why God is worth worshiping. If a case is made for a God who is truly worth worshiping, there is no need to worry that He might be killed either by Nietzsche or by science. If the case is not made, there is certainly no need for science to do the job, as science, to recall a pithy phrase of Eddington, cannot handle even the multiplication table singlehandedly. The Journal might just as well have reserved that page for discussing the merits of Don Quixote’s battling the windmills of his imagination.
Of the many superficial statements in that article let me recall two, one briefly, the other at some length. The former concerns Podhoretz’s presentation of J. Robert Oppenheimer as a paragon of concern for the ethical parameters of the use of atomic bombs. Well, nine years after Hiroshima, Oppenheimer was still his overweening self, one who knew everything better than anyone else. Otherwise he would not have brushed aside the searching questions of a Congressional Committee with the defiant words: “It is my judgment in these things that when you see something that is technically sweet, you go ahead and you do it and you argue about what to do about it only after you have had your technical success. That is the way it was with the atomic bomb. I do not think anybody opposed making it; there were some concerns about what to do with it after it was made.” A strange ethical concern indeed, capsulized in the words “technically sweet,” which cast doubt on the depth of Oppenheimer’s often quoted admission of his and others’ “having known sin” once the bomb first exploded in the desert of New Mexico. The second statement is Podhoretz’s invoking of Einstein as one who endorsed religion, though religion for Einstein was nothing more than a pantheism, which cannot contain a God worth worshiping. Nor can such a religion raise a caveat against acts in which Einstein excelled. This I mention partly because Einstein was chosen by Time magazine for the honor of “Person of the Century.” That weekly could still, during its erstwhile ownership, voice some unconditionally valid standards, but not since it was bought out in the early 1970s. Its advocacy of feminazism, one-gender marriages, homosexuality, and so forth was, of course, the reason why it called Einstein the Person, not the Man, of the Century.
But there was a strange irony in this bow to the dogma of gender equality, that most unscientific dogma of all. The irony relates to Einstein as a person. As such, politically, he was a coward and, morally, a lecher, if not something worse. The evidence, taken from the Einstein Archives, is there for all to see in the 355 pages of The Private Lives of Albert Einstein (1993), in print now for eight years, and from a prominent London publisher at that. Tellingly, it never created a stir, much less revulsion, although feminists should have been in the forefront with their indignation about Einstein’s taking rank advantage of scores, yes scores, of women. But neither they nor the big gurus of the media cared to reflect on the evidence.
There was indeed something very disingenuous in a remark I once read in The New York Times, that, strangely, Einstein’s immorality created no revulsion. (The author of The Death of Outrage [1998] might have found it welcome grist for his very creditable mill.) Surely, that illustrious daily, so effective in fomenting revulsion, could have done it here too, if it really wanted. Obviously there is something very defective about its ability to blush. But such is the modern age in which man takes perverse pleasure in his fallen condition, by taking it for the height of his evolution. In fact, if pressed on the point that, to quote his words, it is not the know-how but the character that makes the scientist, Einstein should have been the first to protest his nomination as the Person of the Century. And if still alive, he should have been reminded of his explicit denial of the existence of free will. Some person, some willfulness.
The superficial reader may take all this for the voice of one predicting the demise of an immoral mankind within a thousand years, and perhaps within a hundred. Man, of course, may not be around a thousand years from now for a reason that has nothing to do with immorality. If one in a thousand is the probability that within the next hundred years the earth will be hit by a very large meteor or comet, then roughly one in a hundred is the probability of this happening during the next thousand years. That eventuality is not something to be taken lightly. In fact it has been proposed that man should hit such an incoming monster with a hydrogen bomb so as to nudge it away from its earthbound course.
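(The arithmetic of that extrapolation is easily made explicit. Assuming, as such estimates do, an equal and independent chance of one in a thousand in each of ten successive centuries, the probability of at least one impact is

$$1 - \left(1 - \tfrac{1}{1000}\right)^{10} \approx \tfrac{10}{1000} = \tfrac{1}{100},$$

so the one-in-a-hundred figure stands as a close approximation.)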
But one need not invoke the use of a gruesome scientific tool to secure the prospect of another thousand or ten thousand years for man. The reason is science itself, or rather man’s capability of coming up with science. Clearly man is not totally fallen, at least not in his intellect. But why is it that science is such a latecomer in human history, not more than four hundred years old? Why is it that although the great ancient cultures could boast of great intellects and of great cultural achievements, science, exact science, that is, was not among them?
Historians of science could tell a great deal, though most of them are very reticent, on the subject of why science was born in the Christian West. In fact, a careful study of this question also shows that the spark which ignited the long-accumulated material came from the impact of the most specifically Christian dogma, or event: the birth of an infant two thousand years ago. But this is not the place to elaborate. Readers of my The Savior of Science (1988; Eerdmans, 2000) will find the particulars there.
Such and similar particulars provide the basis for a very measured optimism about the future: a thousand years from now there will still be opportunity to rejoin the war for a culture which is steeped in true cult and not in the cultivation of the self. Without true cult, culture is a mere counterfeit, a mere frill of increasingly decivilized civilizations that are at each other’s throats. Meanwhile the only superpower is found fumbling more and more as it tries to prevent inhumanities, though only when this serves its “overriding national interest,” all too often a mere euphemism for rank national selfishness.
In its mad pursuit of a misinterpretation of the separation of Church and State, American officialdom, including its self-anointed delegates in academia, does everything to promote a society where it is uncivilized to speak of hallowed cults even in contexts where plain logic would impose this. Seven years have now gone by since Samuel P. Huntington, professor of government at Harvard, called attention to the “Coming Clash of Civilizations” on the op-ed page of The New York Times (June 6, 1993). There he prophesied the obvious, namely, that once again not the nation-state but civilizations would prove the driving force of history and that “it is to this pattern that the world returns.” No less prophetic proved to be the subtitle, “The West Against the Rest,” a fact which the President and the presidential candidates carefully tiptoe around for an obvious reason. It is no longer proper to recall that the roots of Western civilization are cultural and that the culture itself is steeped in the cult which is Christianity, or rather the Christian Church.
Neither in 1993, nor very recently, when Huntington took up the same topic in the same newspaper (“A Local Front in a Global War,” December 16, 1999), did he mention “cult”; he spoke of cultures and, mostly, of civilizations. The word “culture” is ominously close to “cult” and is therefore almost a taboo, which the word “cult” itself is rapidly becoming, unless it is used in a pejorative sense. The policy is nothing short of burying one’s head in the sand. Neither in reference to Kosovo nor in reference to Chechnya has it been permissible to call a spade a spade and refer to what has been the chief driving force behind the troubles: militant Muslim revivalism, a virulent cult if ever there was one. At the same time the press, which knows all too well what a post-Christian culture wants to hear, decries, whenever opportunity arises, Christians (and especially Catholics) for asserting their cultic roots.
Why, one may ask, was it deplored not too long ago, in the pages of the same newspaper, that the European Union takes, territorially, more and more the form of the long-defunct Holy Roman Empire? Why is there no probing into the fact that the establishment of Yugoslavia by the West in the wake of World War I was motivated by the countercultic resolve to finish off the markedly Catholic Habsburg monarchy? Had the Orthodox Serbs and Macedonians, the Catholic Slovenes and Croats, and the partly Muslim Bosnians and Kosovars loved one another at that time as much as the public was made to believe? Obviously not. Otherwise they would not have gone their separate ways as soon as opportunity arose.
Such are some facts of the modern age, which has tried to be agelessly non-religious, that is, non-cultic, for some centuries now. A thousand years from now many things will be enormously different, but some facts will remain exactly as they have been since time immemorial. Man will have cults because he cannot live without them. Cults will, unfortunately, clash. Unless the West wants to deny its very nature, it is most important for its survival to know which cult to cultivate. No tree has ever been known to flourish once it was uprooted, a truth valid even a thousand years from now and beyond.