A Bad Fifteen Minutes: An Excerpt from Last Rites


About twenty years ago, at the age of sixty-five, I wrote a kind of autobiography, entitled Confessions of an Original Sinner (published in 1990). It was fairly well received, and here and there is still in print. It was an auto-history rather than a routine autobiography. (I started it with two sentences: “This is not a history of my life. It is a history of my thoughts and beliefs.”) In one of its chapters, entitled “Writing,” I wrote about what and why I kept writing, and about some of the books I had written during the then forty years of my career as a historian. Well, now, during the following twenty, I wrote more books (though probably with fewer pages) than in the preceding forty, for all kinds of reasons. But in this there must be no place for a chortling summary listing (or even a melancholy one) of my published achievements. So my plan of this book is the reverse of Confessions, which proceeded, say, from 1924 to 1987, through the first sixty years of my life and from the personal to something impersonal, from something like an autobiography to something like a personal philosophy. Now my sequence will be from a summing up of my recognitions of our present knowledge of the world to memories of my private life, from something like a philosophy to something like an autobiography. The precedence of the former: because of my conviction of its importance. That is my obsessive insistence that human knowledge is neither objective nor subjective but personal and participant—and (among other things) that we, and our earth, are at the center of the universe.

First things first is not always, and not necessarily, the best way to begin a book. I am taking a risk: but then all art, including writing, must contain a risk. Besides—I do not know who the readers of this book will be. And I know that because of the circumstances and the conditions of the world we now live in, their attention spans (very much including those of academics, intellectuals, philosophers, scholars, yes, myself too) have become, even if not altogether “brutish” and “nasty”—narrowed, constricted, and short. So, readers: please bear with me for fifteen minutes or so.

“Un mauvais quart d’heure,” the French say, of those painful fifteen minutes when a son must tell his father that he failed in school; or that he stole; or when a man thinks he now must tell his woman that he will leave her.1 They have to tell the truth: a truth.

First things first. This is the most important part of this book. For fifteen minutes bear with me.

~

Un mauvais quart d’heure. Telling a truth.

Step by step.

Or: “Architecture of a new humanism.”

Oh, I was still very young when I saw that historians, or indeed scholars and scientists and human beings of all kinds, are not objective. And then, the trouble was with many who thought and wished to impress the world that they were objective. There are still many historians and even more scientists of that kind, men with gray ice on their faces.

But isn’t Objectivity an ideal? No: because the purpose of human knowledge—indeed, of human life itself—is not accuracy, and not even certainty; it is understanding.

An illustration. To attempt to be “objective” about Hitler or Stalin is one thing; to attempt to understand them is another; and the second is not inferior to the first. Can we expect a victim to be “objective” about someone who did him harm? Can we expect a Jewish man to be “objective” about Hitler? Perhaps not. Yet we may expect him, or indeed anyone, to attempt to understand. But that attempt must depend on the how, on the very quality of his participation, on the approach of his own mind, including at least a modicum of understanding his own self. After all, Hitler and Stalin were human beings, so they were not entirely or essentially different from any other person now thinking about them.

History involves the knowledge of human beings of other human beings. And this knowledge is different from other kinds of knowledge, since human beings are the most complex organisms in the entire universe.

The ideal of objectivity is the total, the antiseptic separation of the knower from the known. Understanding involves an approach, that of getting closer. In any event, and about everything: there is, there can be, no essential separation of the knower from the known.

But: are there no objective facts? Ah! Besides the limits of “objectivity” there are the limits of “facts.”

Yes, there are “facts.” The door was open. The water was at a boil. The house was on fire. Napoleon lost at Waterloo. But “facts” have at least three limits—perhaps especially for historians. One: for us the meaning of every “fact” exists only through our instant association and comparison of it with other facts. Two: for us the meaning of every fact depends on its statement, on the words with which it is expressed. Three: for us these words depend on their purposes. (There are statements in which every “fact” may be correct, and yet the meaning, tendency, purpose of their statements may be wrong.)

We are human beings, with our inevitable limitations. We think in words. Especially when it comes to history, which has no language of its own, no scientific terminology: we speak and write and teach it in words. Besides, words and language have their own histories. One pertinent example: four or five hundred years ago the very words objective, subjective, fact meant not what they now mean or pretend to mean. Words are not finite categories but meanings—what they mean for us, to us. They have their own histories and lives and deaths, their magical powers and their limits.

~

Historical knowledge—indeed, any kind of human knowledge— is necessarily subjective. That is what I tended to think in my early twenties. Soon I found that I was necessarily wrong: that subjectivity is merely the other, the obverse side of objectivism and objectivity, that there is something wrong with the entire Cartesian coin, of a world divided into Object and Subject: because Subjectivism as much as Objectivism is determinist.


Yes, every human being sees the world in his own way. That is inevitable: but not determined. We choose not only what and how we think but what and how we see. According to subjectivism I can think and see in only one (my) way; he in (his) another. This is wrong, because thinking and seeing are creative acts, coming from the inside, not the outside. Which is why we are responsible not only for how and what we do or say but for how and what we think and see. (Or: for what we want to think and for what we want to see.)

Very few people have recognized that the essence of National Socialism, including its biological racism, was something like subjectivist determinism, or call it idealistic determinism, or call it subjectivist idealism. The Jews are a spiritual, even more than a biological, race, Hitler once said. They think in a certain—their—way: they cannot think otherwise. A great historian, Johan Huizinga, saw something of this peril early. Around 1933—not referring to Germany or to Hitler—he wrote that “subjectivism” was a great danger. (The other great danger, for him, was the increasing domination of technology.)

There were a few historians who realized the limitations, indeed, of the very ideal of Scientific Objectivity, at least in their profession. (One of them was Charles A. Beard, who slid into Subjectivism from Objectivism around that very time: but, unlike Huizinga, he could not see further.) Twenty-five or thirty years later it took Edward Hallett Carr, a former Marxist, to make the academy of professional historians hear what they, probably, were getting inclined to hear. (This is how and why the history of ideas is almost always woefully incomplete: not what but when it is that people are finally willing to hear something.) In What Is History?, still a celebrated book, published in 1961, Carr declared: “Before you study the history, study the historian.” Well, yes (though the reverse of that applies too: before you study the historian, study his history).2 But Carr’s thesis is nothing but Subjectivist Determinism: in his view a historian’s background, and especially his social background, virtually determines the history he will write. This is nonsense: consider the sons of rich bourgeois who chose to become Marxists, or the offspring of Marxists who chose to become neoconservatives. The crucial word is: they chose.3

Besides—or perhaps more than “besides”—the subjectivist Carr could not really detach himself from the Cartesian, the Objective-Subjective terminology: “It does not follow that, because a mountain appears to take on different shapes from different angles of vision, it has objectively no shape at all or an infinity of shapes.” But the more “objective” our concept of the mountain, the more abstract that mountain becomes.4 A few years after Carr the old bourgeois ideal of Objectivism was falling apart. Postmodernism appeared, even though that term and the “postmodern” adjective were confusing. (Was the ideal of Objectivity just another bourgeois ideal, a “modern” one?) “Structuralism” and its proponents, many of them French, appeared; entire academic departments of literature took them seriously, even though they were hardly more than yet another academic fad. Their essence was, and remains, not much more than Subjectivism. They will not endure. What will, what must endure is the piecemeal recognition that the division of the world into objects and subjects belongs to history, as does every other human creation: that whatever realities Objectivity and its practical applications contained and may still contain, they are not perennial, not always and not forever valid.

~

Knowledge, neither “objective” nor “subjective,” is always personal. Not individual: personal. The concept of the “individual” has been one of the essential misconceptions of political liberalism. Every human being is unique: but he does not exist alone. Not only is he dependent on others (a human baby for much longer than the offspring of other animals), his existence is inseparable from his relations with other human beings.

Every person has four relationships: with God, with himself, with other human beings, and with other living beings. The last two we can see and judge; the first two we may but surmise. But connected they are: we know some things about others through knowing some things about ourselves. That much is—or at least should be—obvious.

But there is more to that. Our knowledge is not only personal. It is participant. There is not—there cannot be—a separation of the knower from the known. And we must see farther than this. It is not enough to recognize the impossibility (perhaps even the absurdity) of the ideal of their antiseptic, “objective” separation. What concerns—or what should concern—us is something more than the inseparability, it is the involvement of the knower with the known. That this is so when it comes to the reading and the researching and the writing and the thinking of history should be rather obvious. “Detachment” from one’s passions and memories is often commendable. But detachment, too, is something different from “separation”; it involves the ability (issuing from one’s willingness) to achieve a stance of a longer or higher perspective: and the choice for such a stance does not necessarily mean a reduction of one’s personal interest, of participation—perhaps even the contrary.

Interest includes, it involves participation. But keep in mind: participation is not, it cannot be complete. What “A” says to “B” is never exactly what “B” hears—usually because of his or her instant associations with some things other than the words of “A.” Yes: their communications, all human communications, are necessarily incomplete—because of the complexity and the limitations of the human mind. But there is a wondrous compensation for this. That is that the charm of human communications resides precisely in their incompleteness, in the condition that what “B” hears is not exactly what “A” says—whence, in some instances, even the attraction of “A.”5

But this inevitable involvement of the knower with the known does not exist only in the relations of human beings with other human beings. It involves, too, what we call “science,” man’s knowledge of physical things, of nature, of matter. I shall come to this too—soon. Before that, a mere few words about the relationship of mind and matter. Did—does—matter exist independent of, without, the human mind? It did and it does: but without the human mind, its existence is meaningless—indeed, without the human mind we cannot think of its “existence” at all. In this sense it may even be argued that Mind preceded and may precede Matter (or: what we see and then call matter).

~

In any case, or event, the relations of “mind” and “matter” are not simple. In any case or event6 they are not mechanical.

What happens is what people think happens. At the time of its happening—and at least for some time thereafter. History is formed thereby.

What happens and what people think happens are inseparable. That human condition is inevitable. (Does pain exist without one’s recognition of pain? When someone thinks he is unhappy he is unhappy. Etc.) What we think happens or happened may of course be wrong, something that we may recognize later, at another time. (Or not. Even then we may be right or wrong, since memory is not mechanical either but another creative function: we may clarify or deceive our memories too.)

This does not now matter. What matters is the necessary and historic recognition that the human mind intrudes into causality, into the relation of causes and effects.

Causality—the how? and why?—has varied forms and meanings (Aristotle and Saint Thomas Aquinas listed four): but for centuries the terms of mechanical causality have dominated our world and our categories of thinking. All of the practical applications of “science,” everything that is technical, inevitably depend on mechanical causality, on its three conditions: (1) the same causes must have the same effects; (2) there must be an equivalence of causes and effects; (3) the causes must precede their effects. None of this necessarily applies to human beings, to the functioning of their minds, to their lives, and especially to their history.

Illustrations thereof. (1) Steam rising in a kettle: at a certain point, at a measurable temperature, the pressure becomes intolerable, an explosion is inevitable and determined: the lid of the kettle will fly off. But in human life the lid is thinking about itself. “Intolerable” is what it chooses not to tolerate. What is intolerable is what people do not wish—or think—to tolerate. (2) There is no equivalence of causes and effects. Suppressions, restrictions, taxes imposed by one ruler on one people at one time are not the same when imposed on other people or even on the same people at another time. It depends on how they think about their rulers and about themselves, and when. (Under Hitler many Germans—the most educated people in the world at that time—thought that they were freer than they had been before.) (3) In life, in our histories, there are “effects” that may, at times, even precede “causes”: for instance, the fear (or anticipation) that something may or may not happen may cause it to happen (whence a view of “a future” may cause “a present”).

In sum, mechanical causality is insufficient to understand the functioning of our minds, and consequently of our lives, and even the sense and the meaning of our memories, of the past, of history. Every human action, every human thought is something more than a reaction. (That is, too, how and why history never repeats itself.) The human mind intrudes into, it complicates the very structure of events.7

To this let me add my own conclusion: that this relationship, this intrusion of Mind into Matter is not constant; that perhaps the evolution of human consciousness may be the only evolution there is: and that in this age of democracy this intrusion of mind into matter tends to increase.8 That is a startling paradox, a development occurring at the very time when the applications of mechanical causality govern the lives of mankind more than ever before. Wendell Berry wrote (in 1999): “It is easy for me to imagine that the next great division of the world will be between people who wish to live as creatures and people who wish to live as machines.”

~

Mind over matter; mind dominating matter. Is this nothing more than a categorical assertion of a philosophy of idealism?

Yes: the opposite of materialism is idealism. But any intelligent idealist must be a realist. The opposite of idealism is materialism, not realism. But a categorical denial of the importance of matter is not only wrong but dangerous. Idealists understand the primacy of mind over matter, but they must recognize matter; indeed, they must be grateful for its existence. (Or to God for it: because both man and matter are God’s creations.) A friar once said to the fourteenth-century German mystic Meister Eckhart: “I wish I had your soul in my body.” Whereupon Eckhart: “That would be useless. A soul can save itself only in its own appointed body.” A German poet, much later: “I have a great awe of the human body, for the soul is inside it.” Another German philosopher (Romano Guardini): Man is not “the creature that idealism makes of him” (At the End of the Modern World).

I cited Germans on purpose: because there are great and grave dangers in categorical idealism, as there are in categorical materialism. There is the—frequently German, but also at times Russian—tendency to, or belief in, an idealistic determinism. That was the essence of Hitler’s National Socialist ideology: that because the ideas of National Socialists were so much stronger and better than those of their reactionary or liberal or Communist opponents, they were bound, inevitably, to triumph.9

Wrong, in this sense, are not only applications of Hegel’s Zeitgeist. Wrong, too, was the English “idealist” historian R. G. Collingwood (sometimes referred to as a pioneer of “postmodernism”), who wrote that history is nothing but the history of ideas. But: no idea exists without the person who thinks it and represents it.

It took me, an anti-materialist idealist, perhaps forty or fifty years to recognize, suddenly, that people do not have ideas: they choose them.

And how and why and when (important, that!) they choose them: ah! there is the crux of the matter, of men’s predicaments, of their destinies, of history.

There is a deep difference between a merely anti-materialist idealism and a realistic idealism that includes the coexistence and the confluence of matter and spirit.10 Whence another duality, the perennial paradox of two different tendencies existing at the same time. Yes, now: when there is obvious and present danger of people succumbing entirely to materialism. Yet a deeper and greater danger exists that may burgeon and blossom forth from false idealisms and fake spiritualisms of many kinds—from a spiritual thirst or hunger that arises at the end of an age, and that materialism cannot satisfy.

~

Still, I am not a prophet, I am a historian. We live forward, but we can think only backward (Kierkegaard). The past contains all that we know.11 The “present” is a fleeting illusion; the “future” a sense into which we can only project, “predict,” this or that known from an evolving past. History is not the entire past; but it is more than the recorded past (which is what so many people and academic historians think it is); it is the recorded and the remembered past. Like memory, it is incomplete and fallible. Memory and thinking, like imagination and seeing, are inseparable. They have their limits: but our recognition of their limits, paradoxically, may enrich us. Such is human nature. Such must be the understanding of any old or new kind of humanism—historical, rather than “scientific.”

In the eighteenth century, when professional historianship first began to appear, people began to read more and more history, reading it as a form of literature. In the nineteenth century history became regarded as a science. That word, like scientist, had different meanings in different countries and in different languages, but let that go. It is sufficient to say that historians, many, though not all of them professionals, made immense and important contributions to historical knowledge during that century and the next. In the twentieth century the professional study of history broadened, dealing with subjects and people previously untouched. In 1694 the first edition of the Dictionary of the French Academy defined history as “the narration of actions and matters worth remembering.” In 1935 the eighth edition said much the same: “the accounts of acts, of events, of matters worth remembering.” Dignes de mémoire! Worth remembering! What nonsense this is! Is the historian the kind of person whose training qualifies him to tell ordinary people what is worth remembering, to label or authenticate persons or events as if they were fossil fish or pieces of rock? Is there such a thing as a person and another such thing as a historical person? Every source is a historical source; every person is a historical person.

Whatever the Dictionary of the French Academy said, some French historians in the early twentieth century were extending the fields and the horizons of their historical researches. Still the cultural and civilizational crisis of the twentieth century affected the state and the study of history, as it affected every other art and science. Furthered by the bureaucratization of professional historianship, the essence of the trouble remained the misconception of history as a “science”—perhaps as a social science, but as a science nonetheless—including science’s desideratum of the perfection of Objectivity.

At the very end of the nineteenth century one of its finest historians, Lord Acton, claimed and declared that the science of history had reached a stage when, say, a history of the Battle of Waterloo could be written that would not only be perfectly acceptable to French and British and Dutch and Prussian historians alike, but that would be unchanging, perennial, forever fixed. That was an illusion. (John Cardinal Newman once said that Acton “seems to me to expect from History more than History can furnish.”) A century later we have (or ought to have) a more chastened and realistic view of historical and scientific Objectivity—indeed, of truth. Acton believed that history was a supremely important matter—yes—and that the purpose of history is the definite, and final, establishment of truths—no. The purpose of historical knowledge is the reduction of untruth.12 (And the method of history is description, not definition.)

And such is (or should be) the purpose of every science too.

Truth is of a higher order than Justice (this primacy is there in the difference between the Old and the New Testaments). Pure Truth (again Kierkegaard) is the property of God alone: what is given to us is the pursuit of truth.13 History reveals to us human fallibilities, which include the variations, the changeability and relativity of human and particular knowledge—hence the pursuit of truth, so often through a jungle of untruths. Near the end of the eighteenth century Edmund Burke saw and stated this historic condition. In the second half of the twentieth century liberal or neoconservative philosophers proposing Open Societies or Absolute Truths (Karl Popper, Leo Strauss) accused and attacked history for its “relativism.” They were ignorant of the essential difference between historicism and historicity: the first being the (mostly German and idealist) categorical concept of history, the second the recognition of the historicity of human reasoning.

The evolution is from rationalism to historicism to historicity, from the propositions of objective knowledge to that of subjective knowledge to that of personal and participant knowledge—all matters of the conscious and not of the subconscious mind. This is why at this time,14 at the beginning of the twenty-first century, we must begin to think about thinking itself.

~

From dualities of human nature, to dualities of history, and to dualities of our world.

At first sight—or, rather, on the wide surface—it seems that we have entered into a world where traditions in learning are disappearing. One consequence of this is the diminution, and sometimes the elimination, of the teaching of history in schools, and the reduction of required history courses in colleges and universities. There is, too, much evidence of the ignorance of even a basic knowledge of history among large populations in this age of mass democracy and popular sovereignty.

At the same time—and not far beneath the surface—there is another, contrary development. There are multiple evidences of an interest in, nay, of an appetite for history that has reached masses of people untouched by such before. The evidences of such appetites are so various and so many that it would take pages merely to list them, which I regret that I cannot here do.15 Nor is this the place to speculate what the sources of this widespread and inchoate interest in and appetite for history might be. Of course such appetites may be easily fed with junk food. But my interest here is the appetite, not its nutrition.

Johan Huizinga, alone, saw this duality more than seventy years ago. He was despondent about mass democracy and populism and Americanization and technology. He wrote in his debate with the French rationalist Julien Benda in 1933: “Our common enemy is the fearful master, the spirit of technology. We must not underestimate its power.” Around that very time (1934) he also wrote: “Historical thinking has entered our very blood.”

~

It is arguable that the two greatest intellectual achievements of the now ended age of five hundred years have been the invention (invention, rather than discovery) of the scientific method, and the development of historical thinking. Towering, of course, above the recognition of the latter stood and stands the recognition of the importance of “science,” because of the fantastic and still increasing variety of its practical applications. Yet there is ample reason to recognize evidences of an increasing duality in our reactions to its ever more astonishing successful and successive applications.

At first (or even second) sight the rapid increase of the variety of the technical applications of “science” is stunning. Most of these have gone beyond even the vividest imaginations of our forebears. That they are beneficial in many fields, perhaps foremost in applications of medicine and techniques of surgery, leaves little room for doubt. That most people, including youngsters, are eager to acquire and to use the ever more complicated gadgets and machines available to them cannot be doubted either. Consider here how the natural (natural here means instinctive but not insightful) ability in dealing with pushbutton mechanical devices is normal for young, sometimes even very young, people who do not at all mind comparing or even imagining themselves as akin to those machines, unaware as they are of the complexity and the uniqueness of human nature.

At the same time consider how the reactions of people to the ever more and more complicated machines in their lives are increasingly passive. Few of them know how their machines are built and how they actually function. (Even fewer of them are capable of repairing them.) Inspired by them they are not. (Compare, for example, the popular enthusiasm that followed Lindbergh’s first flight across the Atlantic in 1927 with the much weaker excitement that followed the astronauts’ first flight to the moon and back forty-two years later.) Machines may make people’s physical lives easier, but they do not make their thinking easier. I am not writing about happiness or unhappiness but about thinking. It is because of thinking, because of the inevitable mental intrusion into the structure and sequence of events, that the entire scheme of mechanical causality is insufficient. Still every one of our machines is wholly, entirely, dependent on mechanical causality. Yes, we employ our minds when—meaning: before, during, and after—we use them: but their functioning is entirely dependent on the very same causes producing the very same effects. It is because of their mechanical causality that computers are more than two hundred and fifty years old, indeed, outdated. In 1749 a French rationalist, De la Mettrie, wrote a famous book: Man a Machine. That was a new proposition then (though perhaps even then not much more than one of those Ideas Whose Time Has Come): dismiss soul or spirit; man may be a very complicated, perhaps the most complicated machine, but a machine nevertheless. Two hundred and fifty years later there is something dull and antiquated in such a picture: a dusty and mouldy model of human nature. Hence, below the surface: our present passive (and sometimes sickish and unenthusiastic) dependence on and acceptance of many machines.

At this stage of my argumentation someone may ask: are these not merely the opinions of an old-fashioned humanist? A poet or even a historian of a particular kind may see the realities of the world otherwise from how (and why) a natural scientist may see them. They represent Two Cultures, a humanistic and a scientific one. That was the argument of a public intellectual and a popular scientist, C. P. (later Lord) Snow, around 1960. Readers: he was wrong. There may be dualities in our reactions: but— more important—there is evidence, and increasing evidence, that the dual division of the world ever since Descartes et al. into Objects and Subjects, into Known and Knower, is no longer valid. And such evidence is not only there in, relevant to, the so-called humanities. During the general crisis at the end of an age, in the twentieth century evidence for this has been there in physics, too, involving the very study of matter.

~

Having now less than a quart d’heure, I must sum up the what? before the how?


Whether we call it Uncertainty or Indeterminacy or Complementarity; whether we refer to quantum physics or nuclear physics or subatomic physics or particle physics, their practitioners found that the behavior of small particles (for instance, of electrons) is considerably unpredictable; and that this kind of uncertainty is not a result of inadequately precise measurements but may be proved by experiments.

When it comes to such small particles, their observation interferes with them. Due to this human participation, their complete or “objective” definition is not possible.

They may be described (rather than “defined”), but description, too, is constrained by the limitations of human language. The very definitions of words such as position or velocity are necessarily indefinite, incomplete, and variable, dependent on the moments and conditions of their observation. (So are the mathematical formulations of their relationships.)

A fundamental unit of matter is neither measurable nor ascertainable. Does such a unit “really” exist? Even atoms and electrons are not immutable “facts.” (We cannot see them. At best, we can see traces of their motions—but only with the help of machines invented by men.)

Neither are the earlier scientific distinctions between the categories of “organic” and “inorganic” matter any longer watertight. “Energy” may be transformed into matter or heat or light: but energy is a potentiality. An accurate definition, a measurement of the temperature of an atom is impossible, because its very “existence” is only a “potentia,” a probability.

In quantum physics, involving small particles, mechanical causality, as well as the complete separation of object from subject, of the knower from the known, cannot and does not apply.

~

This is a very short list of some of the more important discoveries (or rather, inventions) of quantum physics. All I hope is that some of my readers will recognize that they correspond with how we think about history—that is, with the knowledge human beings have not of things but of other human beings, involving the inevitable presence of participation.

But have historians preceded physicists with their wisdom? Oh no. The science of history, professional historianship, historians thought and said for a long time, must deal with what actually happened. That is the closest English translation of the dictum, or at least of the desideratum, that Leopold von Ranke, more than 120 years ago, stated in a famous phrase: history must be written (or taught) wie es eigentlich gewesen, “as it (actually, or really) happened.” We ought not criticize Ranke: at that time, for his time, he was largely right. But within this phrase there lurks an illusion of a perennial definitiveness (as in Acton’s earlier mentioned desideratum and illusion about a fixed, and thus forever valid, history of Waterloo). Yet the historian must always keep in mind the potentiality: that this or that may have happened otherwise.16

I happen to be a beneficiary of this. The modest success of two books I wrote, The Duel (1990) and Five Days in London (2000), dealing with May and June in 1940, has been largely due to my description of how difficult Churchill’s position was in those dramatic days and weeks—a description that is inseparable from the recognition of how easily it could have been otherwise, that is, of how close Hitler was to winning the war then and there. This is but one example, one illustration of the condition that every historical actuality includes a latent potentiality. (Also: that human characteristics, including mental ones, are not categories but tendencies).17

~

History is larger than science, since science is part of history and not the other way around. First came nature, then came man, and then the science of nature. No scientists, no “science.”

Whence I must sum up something about the recent history of physics. The 1920s were a—so-called—golden age of physics when the recognitions of quantum physics were born, in a decade which was already chock-full of the symptoms of the general cultural and civilizational crisis of the twentieth century. But then, after the Second World War, that general and profound and sickening crisis of an entire civilization, of its intellect and its arts, began to envelop physics too.

How? Why? Because physicists, too, are human beings, with their talents and shortcomings, with their strengths and weaknesses. During their golden age of the 1920s some of them thought seriously about what their new discoveries (or, perhaps more precisely: their new inventions) meant for human knowledge itself. As time went on (and as their reputations increased) fewer of them directed their attentions to that larger question. Heisenberg was among these few. Thirty years after his sudden, pioneering, and revolutionary formulation of the realities of quantum physics, and after the revolutionary and dramatic events of the Second World War, in 1955 he delivered the Gifford Lectures, amounting to his summation of what this new physics meant to our knowledge of the world. Many of his sentences were memorable. Among other things he stated that the scientific method has become its own limitation, since science by its intervention alters the objects of its investigations, “methods and objects can no longer be separated.” And: “The object of research is no longer nature itself, but man’s investigation of nature.” Note these two words, appearing in these two separate statements: no longer.

Yet there were and are very few scientists who agreed or who were interested in Heisenberg’s epistemological statements.18 And during the last twenty years of his life, Heisenberg too was moving, as were most other physicists,19 to seek a mathematical, a formulaic, “solution” of the problem of physical knowledge, in pursuit of what is called a Unified Theory of Matter (or, by some, a Theory of Everything). Another quarter-century later a number of physicists began to encompass absurdities.20 The decline of physics began.

All of this happened during and after three quarters of a century when physicists, inventing and relying on more and more powerful machines, have found21 more and more, smaller and smaller, particles of matter, affixing them with all kinds of names—until now, well into the twenty-first century, it is (or should be) more and more likely that not only A Basic Theory of Everything but even the smallest Basic Unit of Matter will never, and can never, be found. And why? Because these particles are produced by scientists, human beings themselves.

Every piece of matter, including the smallest—just as every number—is endlessly, infinitely divisible because of the human mind. Some scientists will admit this. Others won’t.

What science amounts to is a probabilistic kind of knowledge with its own limits, due to the limitations of the human mind, including the mental operations and the personal characters of scientists themselves, their potentialities ranging from sublime to fallible. There is only one kind of knowledge, human knowledge, with the inevitability of its participation, with the inevitable relationship of the knower to the known, of what and how and why and when man knows and wishes to know.

~

This has always been so—even as the recognitions of these conditions have varied.

But now, in the twenty-first century, at the end of the “modern” age, something new, something unprecedented has come about. For the first time since Adam and Eve, for the first time in the history of mankind, men have acquired the power to destroy much of the earth and much of mankind, potentially even most of it.

At the beginning of the Modern Age, some five centuries ago, Bacon wrote: Knowledge is Power. Near the end of this age we know, or ought to know, that the increase of power—including mental power—tends to corrupt.

Until now the great earth-shattering catastrophes—earthquakes, floods, firestorms, pests, plagues, epidemics—came from the outside. Now the potential dangers are coming from the inside: nuclear explosions, global warming, new kinds of contaminations, pestilences produced by mankind itself (for instance by genetic engineering).22 All such dangers come from man’s increasing knowledge—or, rather, from his increasing interference with elements of “nature.” There may be a shift now, from the potential dangers of material technology to biotechnology.

Of course a “danger” is a potentiality, not an actuality. Of course some of these developments may not happen. The road to hell may be paved with good intentions: but the road to heaven too may be paved with bad intentions that have not matured into acts. That is our saving grace, our hope. But we must recognize the sources of our new and enormous dangers: not outside us but inside this world, because of the minds of men, including “scientists” and those who support and cheer them on.23 We must rethink the very idea and meaning of “progress.”

And now a step—a last step—further. We must recognize, we must understand, that we are at the center of the universe.

~

Contrary to all accepted ideas we must now, at the end of an age, at the beginning of a new one, understand and recognize that we and our earth are at the center of our universe.24

We did not create the universe. But the universe is our invention: and, as are all human and mental inventions, time-bound, relative, and potentially fallible.

Because of this recognition of the human limitations of theories, indeed, of knowledge, this assertion of our centrality—in other words, of a new, rather than renewed, anthropocentric and geocentric view of the universe—is not arrogant or stupid. To the contrary: it is anxious and modest. Arrogance and stupidity, or at best shortsightedness, are the conditions of those who state that what human beings have figured out (most of these figurations occurring during the past five hundred years, a short period in the history of mankind!)—that water is H2O, that there cannot be a speed greater than 186,282 miles per second, that E = mc², etc., etc., that these scientific and mathematical formulas are absolute and eternal truths, everywhere and at any time in the universe, trillions of years ago as well as trillions of years in the future; that mathematics and geometry preceded the existence of our world—that these are eternally valid facts or truths even before the universe existed and even if and when our world or, indeed, the universe will cease to exist.

No. The known and visible and measurable conditions of the universe are not anterior but consequent to our existence and to our consciousness. The universe is such as it is because at the center of it there exist conscious and participant human beings who can see it, explore it, study it.25 This insistence on the centrality, and on the uniqueness of human beings is a statement not of arrogance but of humility. It is yet another recognition of the inevitable limitations of mankind.

I ask my readers to hear my voice. It is an appeal (appeal: call: ring) to think—yes, at a certain stage of history. I can only hope that for some people the peal may ring with at least a faint echo of truth. It is an appeal to the common sense of my readers.

When I, a frail and fallible man, say that every morning the sun comes up in the east and goes down in the west, I am not lying. I do not say that a Copernican or post-Copernican astronomer, stating the opposite, that the earth goes around the sun, is lying. There is accuracy, determinable, provable accuracy in his assertions. But my commonsense experience about the sun and the earth is both prior to and more basic than any astronomer’s formula.

Keep in mind that all prevalent scientific concepts of matter, and of the universe, are models. A model is manmade, dependent on its inventor. A model cannot, and must not, be mistaken for the world.

And now there exists an additional, and very significant, evidence of our central situation in the universe. Five centuries ago, the Copernican/Keplerian/Galilean/Cartesian/Newtonian discovery—a real discovery, a real invention, a calculable and demonstrable and provable one—removed us and the earth from the center of the universe. (Often with good intentions.) Thereafter, with the growth of scientism, and especially with the construction of ever more powerful instruments, among them telescopes (instruments separating ourselves ever more from what we can see with our naked eyes: but of course the human eye is never really “naked”), this movement led to our and to our earth having become less than a speck of dust at the rim of an enormous dustbin of a universe, with the solar system itself being nothing more than one tiniest whirl among innumerable galaxies. But the physicists’ (perhaps especially Niels Bohr’s) recognition that the human observer cannot be separated from the things he observes (especially when it comes to the smallest components of matter) reverses this. We, and the earth on and in which we live, are back at the center of the universe26—a universe which is, unavoidably, an anthropocentric and geocentric one.

This is something other than the returning movement of a pendulum. History, and our knowledge of the world, swings back, but not along the arc where it once was. It is due to our present historical and mental condition that we must recognize, and proceed from, a chastened view of ourselves, of our situation, at the center of our universe. For our universe is not more or less than our universe.27 That has been so since Adam and Eve, including Ptolemy, Copernicus, Galileo, Newton, Einstein, Heisenberg, and my own dual, because human (opinionated as well as humble), self.28

Our thinking of the world, our imagination (and we imagine and see together) anthropomorphizes and humanizes everything, even inanimate things, just as our exploration of the universe is inevitably geocentric. It is not only that “Know Thyself” is the necessary fundament of our understanding of other human beings. It is, too, that we can never go wholly outside of ourselves, just as we can never go outside the universe to see it.

~

In sum. Our consciousness, our central situation in space, cannot be separated from our consciousness of time.29 Does it not, for example, behoove Christian believers to think that the coming of Christ to this earth may have been the central event of the universe: that the most consequential event in the entire universe occurred here, on this earth, two thousand years ago? Has the son of God visited this earth during a tour of stars and planets, making a spectacular Command Performance for different audiences, arriving from some other place and—perhaps—going off to some other place?

And only two thousand years ago. The arguments of Creationism against Evolutionism entirely miss this essential matter. That is the unavoidable contradiction not between “Evolution” and “Creation” but between evolution and history. History: because in the entire universe we are the only historical beings. Our lives are not automatic; we are responsible for what we do, say, and think. The coming of Darwinism was historical, appearing at a time of unquestioned Progress. But its essence was, and remains, anti-historical. It elongated the presence of mankind to an ever-increasing extent, by now stretching the first appearance of “man” on this earth to more than a million years—implying that consequently there may be something like another million years to come for us. Ought we not question this kind of progressive optimism—and at a time when men are capable of altering nature here and there and of destroying much of the world, including many of themselves?

~

Such is my sketch of a new architecture of humanism, of a chastened humanism cognizant of our unavoidable limitations, and of the earth and ourselves being at the center of the universe, our universe. What an ambitious proposition! And isn’t ambition inseparable from vanity? Why did I place this at the beginning of a kind of autobiography that ought to be not much more than an entertaining account, an ambling causerie?


Read the rest of John Lukacs’s Last Rites by purchasing the book at Amazon.com.

Endnotes

1 Ah! But in 2009, at the end of an entire civilization, do we still live in a world where there still are, where there still must be, mauvais quarts d’heure?

2 That, in reality, calls for more honesty, for a greater strength of mind. How often will historians dismiss a book because of its author, whom they do not like!

3 To dabblers in the history of ideas it may be interesting that Carr’s book nearly coincided with Thomas Kuhn’s The Structure of Scientific Revolutions (1962), a much celebrated but an essentially useless and worthless book in which vocabulary substitutes for thought, and which slides close to Subjectivism, though it does not quite dare to espouse it, suggesting but unready to state that Science is but the result of scientists (that, à la Carr, “before you study the science, study the scientist”).

4 Perspective is a component of reality. So is history: because history had to pass until men began to call and see it a mountain, that is, something different from hills or from other outcroppings.

5 Another obvious illustration of this condition is this. When we know a foreign (that is: another) language well, there is a charm in knowing (or, rather, understanding) that the very same words, with their very same origins, in the two languages may mean something slightly different. (Example: the English honor and the French honneur.)

Yet another condition, illustrating the impossibility of dividing Object from Subject. Consider what happens when we are concerned, anxious about someone who is dear to us. Can we separate our concerns for her from how that concern affects (and will affect) us? There may be an imbalance of these two concerns, between our thinking mostly about her and our thinking mostly about how her state affects or will affect us. But in either case these concerns are inseparable: both “objectivity” (an exclusive concentration on her condition) and “subjectivity” (an exclusive concentration on my condition) are impossible. Our consciousness and our knowledge, our concern and our expectations are participant and thus inseparable.

6 Event, not fact. Consider but the sound and meaning of event, as it flows from past to present, while fact gives the impression of something definite and done, fixed in the past.

7 Tocqueville saw and described this clearly in his The Old Regime, without philosophizing about it: that revolutions emerge not when oppression by a regime is most severe but when it has begun appreciably to lessen. (Again: “intolerable” is what people no longer want to tolerate: in other words, when they begin to think and say that this or that must not be tolerated.)

8 Whence the structure of many of my books. In some of them my chapters follow each other according to a (my) ascending hierarchy: from economic to social and political and to mental, intellectual, spiritual and religious developments, from what I consider as less to what I consider more significant; that is, from the material measures of people’s lives to what (and how) their thoughts and beliefs appeared and formed. (The recent and rather latecoming interest in the study of the history of mentalités represents a stumbling toward such recognitions, “pursuing the obvious with the enthusiasm of shortsighted detectives” [Wilde] or, rather: pursuing it with the trendy ambitions of academics.)

9 Hitler was an idealist determinist. In 1940 he said that because of the superior strength of his ideology a German soldier was worth two or three French or British or Russian soldiers, just as before 1933 National Socialist street fighters were tougher than Socialist or Communist street fighters in Germany. National Socialism was bound to triumph then and there, just as Germany was bound to triumph in this war: a repetition of what happened before, on a larger scale. His faithful adjutant General Alfred Jodl in November 1943: “We will win because we must win, for otherwise world history will have lost its meaning.” Field Marshal Walter Model on 29 March 1945 (!): “In our struggle for the ideals (Ideenwelt) of National Socialism . . . it is a mathematical [!] certainty that we will win, as long as our beliefs and will remain unbroken.”

10 Another neoidealist, the English Michael Oakeshott: “History is the historian’s experience. It is ‘made’ by nobody save the historian: to write history is the only way of making it.” This is separating the idea of history from history: the past from the memory of a past and from the reconstruction of some of it with the help of a historian’s ideas about it, which, according to Oakeshott, alone is “history.”

11 Diary 14 September 2004. “Famous bromide by L. P. Hartley: ‘The past is a foreign country; they do things differently there.’ A classic half-truth. (Perhaps even less than half.)”

12 “We find no absolute perfection in this world; always there is a background of imperfection behind our achievement; and so it is that our guesses at the truth can never be more than light obscured by shadow. The humble man’s knowledge of himself is a surer way to God than any deep researches into truth.” “How Truth Is to Be Learned,” Chapter 3 in Thomas à Kempis’s The Imitation of Christ.

13 The purpose of law, too, must be the reduction of injustices but not a completion or perfection of justice (something to which Americans may be particularly inclined), an insane effort that may destroy men and much of the world.

14 Time, including nonhistoric time, is a mystery deeper than that of “space” (God’s creation of time being a condition of mankind, something with which Saint Augustine grappled). Among many other things, it suggests an answer to the perennial problem of human evil, which is not only that no human being has been absolutely evil or absolutely good, but that this is not a question of proportions. Was (or is) this or that evildoer, seen at times as being good to children, loyal to his friends, etc., only this or that much evil? No: this is not a question of percentages. It is that no human being can be good and evil at the same time—more precisely: in the same moment.

15 One (but only one) evidence: histories, of all kinds, now sell better and reach more readers than do novels. This is interesting, because in the eighteenth century professional history and the modern novel appeared around the same time, and because until about fifty years ago, vastly more people read novels than they read histories. Many of these novels were “historical novels,” a genre that appeared first about two hundred years ago, employing history as a background to a novel. But worth noticing now are the interests of more and more novelists in history, whence they confect novels in which history is the main theme, the foreground. That most of them do this very badly (indeed, illegitimately: attributing nonexistent acts and words and thoughts and desires to men and women who really existed) is not my point: my argument is the piecemeal swallowing of the novel by history.

16 In the twentieth century Huizinga went ahead of Ranke. “The sociologist, etc., . . . simply searches for the way in which the result was already determined in the facts. The historian, on the other hand, must always maintain towards his subject an indeterminist point of view. He must constantly put himself at a point in the past at which the known factors still seem to permit different outcomes. If he speaks of Salamis, then it must be as if the Persians might still win” (The Idea of History).

17 An—alas, trendy and unreasonable—awareness of the relations between actuality and potentiality has recently become a fad even among—again, alas—reputed historians. This fad, and the term, is that of “counterfactual” history. Both words, factual and counter, are wrong. History does not consist of “facts.” And the alternative of an event is not necessarily “counter” to it, that is, its actual opposite. Indeed, the posited or suggested alternative events must be close to it, plausible. To write a speculative “history” of what would have happened if Lee had won the battle of Gettysburg is one thing; to write another in which the South won because a Patagonian army had arrived in Pennsylvania to help fight the North would be quite another, implausible and senseless.

18 Epistemology is (or, rather, was) a branch of philosophy dealing with theories and conditions of knowledge. I happen to believe that we have now reached a stage in the evolution of our consciousness where all meaningful philosophy must become epistemology. (Or, as I wrote often: we must begin thinking about thinking itself.)

19 Diary 13 April 2007. “Heisenberg was a much greater physicist than Einstein (about whom even now giant ambitious biographies are written in America). Yet not exceptionally great, Heisenberg. Of course Newton, Galileo, etc., were not, either.”

20 There are many examples of this. Here is one I now cite (not for the first time), by a Nobel Prize–winning physicist, Steven Weinberg (1999): “The universe is very large, and it should be no surprise that, among the enormous number of planets that support only unintelligent life and the still vaster number of planets that cannot support life at all, there is some tiny fraction on which there are living beings who are capable of thinking about the universe, as we are doing here.” Whereupon I wrote: “What kind of language—and logic—is this? ‘No surprise’? Consider: The five boroughs of New York are very large, and it should be no surprise that, among the enormous number of its inhabitants who do not walk and the still vaster number who do not like to walk at all, there is some tiny fraction who are able to levitate.”

21 Found is not really a good word, because it may be argued that their findings have been inventions rather than discoveries.

22 In other words: mind leading to matter, mind preceding matter. The very history of medicine, the etiology (meaning: the study of the sources) of illnesses, is a startling proof of this. Especially among the most “advanced” peoples of the modern world, an increasing number and variety of illnesses now come not from outside, not from wounds or infections, but from the “inside”—another evidence of the increasing intrusion of mind, of the sometimes palpable but in essence deep and complex—confluences of mind and matter in human lives.

23 An illustration. “It has become part of accepted wisdom [?] to say that the twentieth century was the century of physics and the twenty-first century will be the century of biology. Two facts [?] about the coming century are agreed on by almost anyone [?]. Biology is now bigger than Physics. . . . These facts [?] raise an interesting question. Will the domestication of high technology, which we have seen marching from triumph to triumph with the advent of personal computers and GPS receivers and digital cameras, soon be extended from physical technology to biotechnology? I believe that the answer to this question is yes. Here, I am bold enough to make a definite prediction. I predict that the domestication [?] of biotechnology will dominate our lives during the next fifty years at least as much as the domestication of computers has dominated our lives during the previous fifty years.”

Thus begins the leading article in the 19 July 2007 number of the New York Review of Books—“many of his essays appeared in these pages”—by Freeman Dyson, Professor of Physics Emeritus at the Institute for Advanced Study in Princeton. “Accepted wisdom.” “Facts.” “Domestication.” Thus an idiot savant.

24 Diary 1 May 2001. “John Polkinghorne [religious physicist at Cambridge] in Religion and Science: he wants to square the circle. I want to circle the circle.”

25 For those readers who believe in God: the world, and this earth, were created by Him for the existence and consciousness of human beings.

26 Diary 20 December 2005. “All right: my knowledge that we on this earth are at the center of the universe, which (of course) is our invention. We have been inventing (and re-inventing) the universe. But God is more than our invention. And to those who think that God is nothing but our invention my question is: Why? What makes human beings want such an invention? Is it not that a spark of God may exist within us?”

27 Already more than four hundred years ago Montaigne wrote: “The question is whether if Ptolemy was therein deceived, upon the foundations of his reason, it were not very foolish to trust now in what these later people [the “Copernicans”] say: and whether it is not more likely that this great body, which we call the world, is not quite another thing than that we imagine.” And another fifty years later, Pascal, a different thinker from Montaigne: “Thought constitutes the greatness of man.” “Man is but a reed . . . but he is a thinking reed. The entire universe need not arm itself to crush him. A vapor, a drop of water suffices to kill him. But if the universe were to crush him, man would still be more noble than that which killed him, because he knows that he dies and the advantage which the universe has over him; the universe knows nothing of this. All our dignity consists, then, in thought. By it we must elevate ourselves, and not by space and time which we cannot fill.” “A thinking reed. It is not from space that I must seek my dignity, but from the government of my thought. I shall have no more if I possess worlds. By space the universe encompasses and swallows me up like an atom; by thought I comprehend the world.”

28 Diary 13 January 2006. “In the end my quick but strong vision that we are at the center of the universe, etc., has been—perhaps—an important recognition. Whether it will be later discovered by admirers does not matter. What matters, alas, is that I threw these things, these recognitions, off, without much insisting and developing and propagating them. This happened (and still happens) because of the frivolousness and failures of my character. My ‘historical’ philosophy, this new monism of our knowledge of ourselves and of the universe, may be my great mental achievement. But I do not feel particularly proud of this.”

29 Diary 18 April 2007. “Heidegger, Dasein, etc. ‘Any description of consciousness must include a world.’ No: any description of a world must include consciousness—of one’s own perspective, of one’s time, of one’s situation in history, etc.”