📙 Illustrated Philosophy


— A Brief History of Western Philosophy
by Anthony Kenny (1998)


REFERENCE

Kenny, A. (1998). An Illustrated Brief History of Western Philosophy. New York: John Wiley & Sons.


In 1998, the first edition of Anthony Kenny’s comprehensive history of Western philosophy was published, and it was met with considerable praise and critical acclaim. This was in no small part because it was the first book since Bertrand Russell’s A History of Western Philosophy (1945) to offer a concise single-author survey of the complete history of philosophy, from the pre-Socratics to the modern masters of the twentieth century. The edition of Kenny’s book pictured below is the twentieth-anniversary edition.

To be honest with you, I know that Kenny is an agnostic, a believer in sitting on the fence. Take a look below to see what I mean. Should this colour your judgment of his work? Well, no, not necessarily. But I do think we need to know it. For a wannabe wordsmith and student of English Lit., knowing the context of a work is little short of a necessity.

An Illustrated Brief History of Western Philosophy (20th anniversary edition)
Editable PDF (sample):
Kenny, A. (1998). An Illustrated Brief History of Western Philosophy. New York: John Wiley & Sons.

Selected Extracts

From the book’s ‘Introduction’:

A person wondering whether to read a history of philosophy may reasonably wish to ascertain in advance what is the nature of the discipline whose history she is offered. However, it is by no means easy to give a plain and uncontentious answer to the question ‘what is philosophy?’
 
The word has meant different things at different times and in different cultures, and even at the present time it carries different connotations in different places. If you look at the shelves in a bookstore labelled ‘philosophy’ you will find books on self-help and on the environment, books containing advice on how to make yourself a better person and the world a better place. On the other hand, if you look at the lecture lists of a university philosophy faculty you will be invited to be instructed on such topics as the metaphysics of entanglement and to hear the answer to questions such as ‘Are there synthetic a priori propositions?’
 
Philosophy, as treated in the present book, will be conceived neither as broadly as in the bookstore definition nor as narrowly as in the faculty definition. But, sadly, it will only be after reading the book that the reader will understand exactly how I believe the term is to be understood.
 
Philosophy is simultaneously the most exciting and frustrating of subjects. It is exciting because it is the broadest of all disciplines, since it explores the basic concepts which run through all our talking and thinking. It is frustrating because its great generality makes it extremely difficult: not even the greatest philosophers have succeeded in reaching a complete and coherent understanding even of the language that we use to think our simplest thoughts. The man who is, as it were, the patron saint of philosophers, Socrates, claimed that the only way in which he surpassed others in wisdom was that he was aware of his own ignorance.
 
This may well seem a dispiriting introduction. The counterbalancing good news is that philosophy does not require any special preliminary training, and can be undertaken by anyone who is willing to think hard and follow a line of reasoning. In itself, it does not call for any mathematical skill or literary connoisseurship.
 
A first crude attempt to define the subject is to say that philosophy is what the great philosophers did. This is fairly watertight as an initial account: we would laugh out of court any definition of the philosopher that ruled out Plato and Aristotle. But the problem remains unsolved. Those two giants were not only philosophers – Plato was a magnificent dramatist, and Aristotle a pioneering scientist – and we have to make up our minds what parts or aspects of their works count as philosophy pure and simple.
 
The philosophy section in the bookstore will very likely be placed between the section on religion and the section on science. Throughout its history philosophy has been entwined with both these activities: it has marched through the ages in a central position with religion on its right hand and science on its left hand. In many areas of study philosophical thought grew out of religious reflection and grew into empirical science. Many issues which in the past were discussed by philosophers would nowadays be regarded as the province of science: the structure of matter and the history of the cosmos, for instance. But long before philosophers addressed these issues they were the topics of religious myths.
 
Religion, philosophy, and science all offer answers to fundamental questions, responses to the wonder which is the starting point of the human intellectual quest. Religion suggests answers by appealing to sacred texts regarded as revelations from a superhuman power; science provides answers by observation of, and experiment upon, the natural world. Philosophy has no sacred texts, and operates not by experience but by pure thought. In this it resembles mathematics, which is perhaps its closest kin in the family of intellectual disciplines.
 
It can be said that philosophy is the younger sister of religion, and the elder sister of science. In Greek and Hebrew culture mythical accounts of the origin and nature of the world preceded any scientific conjectures. In ancient Athens it was Plato who divorced philosophy from religion by his devastating criticism of the theology of the Homeric poems that were the nearest thing the Greeks had to a Bible. Aristotle, on the other hand, brought under the umbrella of philosophy a number of sciences, such as astronomy, cosmology, physics, and biology. But the one discipline that he claimed to have invented, namely logic, was, like philosophy, closer to mathematics than to science. And logic remained the partner of philosophy when the sciences had, in the course of history, set up house independently.
 
Aristotle made a distinction between practical sciences and theoretical sciences. What he meant by practical sciences were disciplines such as ethics and politics, which guide behaviour and teach us how to relate to each other. Such studies, we might say, belong on the right-hand side of philosophy, where its concerns overlap with those of religion. Theoretical sciences have no practical goal, but pursue truth for its own sake. Prominent among these is what he called ‘physics’, from the Greek word for nature. For centuries it bore the name ‘Natural Philosophy’, and it belongs on the left-hand side of philosophy, where it is concerned with the same objects as what we would nowadays call science.
 
It can indeed be said that Aristotle invented the concept of science as we understand it and as it has been understood since the Renaissance. First, he is the first person whose surviving works show detailed observations of natural phenomena. Secondly, he was the first philosopher to have a sound grasp of the relationship in scientific method between observation and theory. Thirdly, he identified and classified different scientific disciplines and explored their relationships to each other. Indeed, the very concept of a distinct discipline is due to him. Fourthly, he is the first professor to have organized his lectures into courses, and to have taken trouble over their appropriate place in a syllabus. Fifthly, he set up the first research institute of which we have any detailed knowledge, the Lyceum, in which a number of scholars and investigators joined in collaborative inquiry and documentation. Sixthly, and not least important, he was the first person in history to build up a research library – not simply a handful of books for his own bookshelf, but a systematic collection to be used by his colleagues and to be handed on to posterity.
 
Aristotle’s contributions to practical philosophy, his treatises on ethics and politics, are still read today, and not just out of antiquarian interest. But his contributions to natural philosophy – physics, chemistry, biology, and physiology – have long ago been superseded. In the second century AD the medic Galen proved Aristotle wrong on a crucial point of physiology: it was the brain, and not the heart, that was the primary vehicle of human intellectual activity. In the sixth century an Aristotelian scholar called John Philoponus demolished his master’s physics, denying Aristotle’s account of motion and his thesis that the world had no beginning.
 
In late antiquity the most significant event for the history of philosophy was the advent, and eventual political triumph, of Christianity. A recent historian of philosophy, Anthony Gottlieb, describes its impact in terms of the tale of Sleeping Beauty. ‘Having pricked its finger on Christian theology, philosophy fell asleep for about a thousand years until awakened by the kiss of Descartes.’
 
Certainly, from the period when the Christian Emperor Justinian closed the schools of Athens in 529, philosophy was for many centuries subordinate to theology. Thinkers were no longer free to follow an argument wherever it led in accordance with the philosophical ideal held up long ago by Socrates. Henceforth, if an argument led to a conclusion in conflict with Christian doctrine, then it must be given up. But the relationship between philosophy and religion operated in both directions. The first great Christian philosopher, St Augustine, introduced a heavy dose of Platonic philosophy into a community that had begun as a Jewish sect. It must also be admitted that the religious strictures were not always harmful to philosophy. Philoponus’ improvement upon Aristotle’s physics was largely motivated by a desire to defend the Judeo-Christian doctrine of creation.
 
During the seventh and eighth centuries philosophy did go to sleep throughout Christendom, and its slumbers were hardly disturbed by the attempts of Charlemagne (crowned Holy Roman Emperor in 800) to revive the study of letters. The kiss that awoke it came from an unlikely quarter: from the realm of Islam which had spread from Arabia across Africa and southern Europe in the two centuries after the death of Muhammad in 632. In Islam as in Christendom philosophy was intertwined with religion in an embrace that was not always easy. The greatest Muslim philosopher of the period, Avicenna (980–1037), was anxious to ensure that his teachings did not come into conflict with the Koran, but his work was regarded as suspect by conservative mullahs.
 
During the ninth and tenth centuries it was Islam that kept alive the flame of Greek philosophy. It was not until the twelfth century that Aristotle’s works were available in Latin, in translations that were sometimes from the original Greek and sometimes from Arabic versions. They were studied in conjunction with the commentaries of the Arabic philosopher Averroes.
 
Initially regarded as suspect by Church authorities, the Aristotelian texts became the basis of university courses for several centuries.
 
The institution of universities was no less important a factor in the framework of medieval philosophy than the dominance of the Christian religion. The university is, in essentials, a thirteenth-century innovation, if by ‘university’ we mean a corporation of people engaged professionally, full-time, in the teaching and expansion of a corpus of knowledge in various subjects, handing it on to their pupils, with an agreed syllabus, agreed methods of teaching, and agreed professional standards. Universities and parliaments came into existence at roughly the same time, and have proved themselves the most long-lived of all medieval inventions.
 
A typical medieval university consisted of four faculties: the universal undergraduate faculty of arts, and the three higher faculties, linked to professions, of theology, law, and medicine. Students in the faculties learnt both by listening to lectures from their seniors and, as they progressed, by giving lectures to their juniors. A teacher licensed in one university could teach in any university, and graduates migrated freely in an age when all academics used Latin as a common language. Teaching was carried out not only by lecturing, but by means of disputations in which one student would argue for one side of a case, and another for another, and the master would sum up in favour of one or the other, or more likely resolve the dispute by drawing distinctions. This feature of medieval pedagogy survives today in the adversarial structure that marks the procedure in English-language courts, in contrast to the investigative methods of continental courts.
 
Philosophy belonged in the faculty of arts, but many of the topics it studied overlapped with those that formed the subject matter of the theology course. It was St Thomas Aquinas in the thirteenth century who demarcated clearly the boundary between the two disciplines. He is best known for his massive contribution to Christian theology, the Summa Theologiae. But he wrote another, shorter, treatise, the Summa contra Gentiles, which takes its initial stand on purely philosophical premisses that could be accepted by Jews and Muslims and pagans. He explained his method thus:
 
“Mahometans and pagans do not agree with us in accepting the authority of any Scripture we might use in refuting them, in the way in which we can dispute against Jews by appeal to the Old Testament and against heretics by appeal to the New. These people accept neither. Hence we must have recourse to natural reason, to which all men are forced to assent.”
 
Thus he sets out the difference between revealed theology, which is based on sacred texts, and natural theology, which is a branch of philosophy. He believed that there were some religious truths, such as the existence and attributes of God, that could be established by pure reason, while others, such as the doctrine of the Trinity, could be proved only by appeal to the authority of the Bible and the Church.
 
When we today look for material in medieval texts that is relevant to philosophy as nowadays understood, we can often find it in theological rather than philosophical treatises. But as the middle ages progressed the division between natural and revealed theology became sharper, as thinkers of a more sceptical turn than Thomas began to doubt the powers of natural theology and were thrown more and more upon an appeal to revelation. At the time of the Reformation, Martin Luther denounced natural theology as a delusion, as part of his demotion of human reason by comparison with divine grace.
 
The fact that medieval university courses were based on Aristotelian texts had in the long run a paradoxical result. It was that though Aristotle can claim to be the founder of science, his authority came to be a massive obstacle to science’s progress. The history of science from the time of the Renaissance is a series of secessions, in one discipline after another, from the dominion of Aristotelian philosophy. Almost all of these developments took place outside universities, and were initiated by independent thinkers.
 
Aristotelian physics, already challenged in antiquity, was the first to be discarded. In the sixteenth century Copernicus and his successors showed that Aristotle was wrong to believe that the earth was the centre of the universe and that it was surrounded by crystalline spheres that carried the heavenly bodies. In the seventeenth century Newton’s laws of motion replaced those of Aristotle that had been shown by Philoponus to be erroneous.
 
Chemistry was the next discipline to detach itself. The theory of the four elements, earth, air, fire, and water, each based on a combination of the properties of heat, cold, wetness, and dryness, had long survived its ancient formulation. The researches into air and fire of the eighteenth-century French chemist Lavoisier showed that combustion could take place only in the presence of a gas that he named ‘oxygen’ which was absorbed in the course of combustion. Lavoisier also discovered that water was a compound of oxygen and another element he named ‘hydrogen’. The four elements inherited from Greek philosophy were permanently displaced in favour of a new more rigorous table of elements.
 
In the nineteenth century Aristotle’s belief in the fixity of animal and plant species was undercut by Charles Darwin’s discovery of evolution by natural selection. Psychology, too, set up as an experimental discipline quite distinct from the mixture of philosophy and physiology presented in Aristotle’s treatise On the Soul. In the present century many would claim that psychology – whether philosophical or experimental – has itself been superseded by neurophysiology.
 
What do we learn from the way in which disciplines that in antiquity and in the middle ages were part of philosophy have since become independent sciences? We can say that a discipline remains philosophical as long as its concepts are unclarified and its methods are controversial. Perhaps no scientific concepts are ever fully clarified, and no scientific methods are ever totally uncontroversial; if so, there is always a philosophical element left in every science. But once problems can be unproblematically stated, when concepts are uncontroversially standardized, and where a consensus emerges for the methodology of solution, then we have a science setting up home independently, rather than a branch of philosophy.
 
Philosophy, once called the queen of the sciences, and once called their handmaid, is perhaps better thought of as the womb, or the midwife, of the sciences. But in fact sciences emerge from philosophy not so much by parturition as by fission, as a single example will suffice to show.
 
In the seventeenth century philosophers were much exercised by the problem of which of our ideas are innate and which are acquired. This problem split into two problems, one psychological (what do we owe to heredity and what do we owe to environment?) and one epistemological (how much of our knowledge depends on experience and how much is independent of it?). The first question was handed over to psychology; the second question remained philosophical. But the second question itself split into a number of questions, one of which was ‘is mathematics merely an extension of logic, or is it an independent body of truth?’ This was given a precise answer by the work of logicians and mathematicians in the twentieth century. The answer was not philosophical, but mathematical. So here we had an initial, confused, philosophical question which ramified in two directions – towards psychology and towards mathematics, leaving in the middle a philosophical residue which remains to be churned over, concerning the nature of mathematical propositions.
 
Despite its close links to the sciences, philosophy itself is not a science. Philosophy is not a matter of expanding knowledge, of acquiring new truths about the world. The philosopher is not in possession of information that is denied to others. Philosophy is not a matter of knowledge, it is a matter of understanding, that is to say, of organizing what is known.
 
If philosophy is not a science, shall we say that it is an art like poetry, fiction, and drama? It does indeed resemble the arts in certain ways. In the arts, classic works do not date. If we want to learn physics or chemistry, as opposed to their history, we don’t nowadays read Newton or Faraday. But we read the literature of Homer and Shakespeare not merely to learn about the quaint things that passed through people’s minds in far off days of long ago. Surely, it may well be argued, the same is true of philosophy. It is not merely in a spirit of antiquarian curiosity that we read Aristotle today. Great philosophy is essentially the work of individual genius, and Kant does not supersede Plato any more than Shakespeare supersedes Homer.
 
Philosophy resembles the arts in having a significant relation to a canon. A philosopher situates the problems to be addressed by reference to a series of classical texts. Because it has no specific subject matter, but only characteristic methods, philosophy is defined as a discipline by the activities of its great practitioners. As remarked above, the earliest people whom we recognize as philosophers, the pre-Socratics, were also scientists, and several of them were also religious leaders. They did not yet think of themselves as belonging to a common profession, the one with which we today claim continuity. Those of us who call ourselves philosophers today can genuinely claim to be the heirs of Plato and Aristotle. But we are only a small subset of their heirs. What distinguishes us from their other heirs, and what entitles us to inherit their name, is that — unlike the physicists, the astronomers, the medics, the linguists, and so on — we philosophers pursue the goals of Plato and Aristotle only by the same methods as were already available to them.
 
However, philosophy resembles the sciences, in that its primary aim is to teach. Poetry, fiction, and drama can tell us much about human nature and the natural world. But the instructive effect of literature is oblique in comparison with that of science and philosophy, because of its essential relationship to aesthetic pleasure, whether it entertains or elevates. Philosophy and science, on the other hand, are essentially directed to the pursuit of truth.
 
Though it is not a part of science, philosophy is something that must precede and underpin scientific investigation. Suppose a cognitive scientist tells us that he is going to find out what happens in the brain when we think. We ask him, before starting his research, to be quite sure that he knows what thinking is, what ‘think’ means. Perhaps he will reply that in order to get clear about the meaning of the word all we have to do is to watch ourselves while we think: what we observe will be what the word means. But if we give serious attention to the ways in which we use the word ‘think’ we see that this is a misunderstanding of the concept of thought. If a neurophysiologist does not have a sound grasp of that concept prior to his investigations, then whatever he discovers, it will not tell us much about thought. He may protest that he is not interested in the linguistic trivialities which entertain philosophers. But after all, he is using our ordinary language in order to identify the problem he wants to solve, and in order to define the boundaries of his research programme. He needs, therefore, to take ordinary language seriously: he should not dismiss it as ‘folk-psychology’.
 
In fact, it is possible for philosophy to be objectively rational without being a branch of science. Philosophy is, indeed, the quest for rationality across all disciplines, whether sciences, humanities, or arts; and its primary method is the attentive study of the language in which these different forms of rationality find their expression. A philosopher studies language, but not as a philologist does. On the one hand, the philosopher has a greater concern with the social practices and institutions in which the language is embedded; on the other hand, she is not concerned with the idioms and idiosyncrasies of particular natural languages, but seeks to identify among their great variety the conceptual structures that underlie them all.
 
Some thinkers hope that in a better future philosophy will be wholly replaced by science in the way in which Aristotelian natural philosophy has been replaced by fundamental physics. I believe that this is an illusion. There are branches of philosophy that will always retain an unscientific residue, in particular the disciplines that concern human beings, such as philosophy of mind and philosophy of language. They will remain forever philosophical because of their self-referential structure. The philosophy of mind uses thought to investigate thought, and the philosophy of language uses language to investigate language.
 
The ambition of philosophy is to reach an understanding of language and the world that transcends particular times and places; but any individual philosopher must accept that he will never reach that goal. This has been well put by Thomas Nagel in his book The View From Nowhere. ‘Even those who regard philosophy as real and important know that they are at a particular, and we may hope, early stage of its development, limited by their own primitive intellectual capacities, and relying on the partial insights of a few great figures of the past. As we judge their results to be mistaken in fundamental ways, so we must assume that even the best efforts of our own time will come to seem blind eventually.’
 
In his book Nagel urged those of us who are philosophers to combine unashamed pride in the loftiness of our goal, with undeluded modesty about the poverty of our achievement, and to resist the temptation to turn philosophy into something less difficult and more shallow than it is. He ended his treatment of philosophical problems with words that have long echoed in my mind. ‘I do not feel equal to the problems treated in this book. They seem to me to require an order of intelligence wholly different from mine.’ Others who have tried to address the central questions of philosophy will recognize the feeling.
 
You may approach the history of philosophy from the side of history or from the side of philosophy. If you are a historian, wishing to understand the peoples and societies of the past, you may read their philosophy to grasp the conceptual climate in which they thought and acted. If you are a philosopher you may study the great dead philosophers in order to seek illumination upon themes of your own philosophical inquiry. As a philosopher you will be most interested in those branches of philosophy, such as ethics and metaphysics, which remain relevant today; as a historian you may well take more interest in those branches of natural philosophy that have been superseded by science.
 
The historian of philosophy, whether primarily interested in philosophy or primarily interested in history, cannot help being both a philosopher and a historian. A historian of painting does not have to be a painter, a historian of medicine does not, qua historian, practice medicine. But a historian of philosophy cannot help doing philosophy in the very writing of history. It is not just that someone who knows no philosophy will be a bad historian of philosophy; it is equally true that someone who has no idea how to cook will be a bad historian of cookery. The link between philosophy and its history is a far closer one. The historical task itself forces historians of philosophy to paraphrase their subjects’ opinions, to offer reasons why past thinkers held the opinions they did, to speculate on the premisses left tacit in their arguments, and to evaluate the coherence and cogency of the inferences they drew. But the supplying of reasons for philosophical conclusions, the detection of hidden premisses in philosophical arguments, and the logical evaluation of philosophical inferences are themselves full-blooded philosophical activities. Consequently any serious history of philosophy must itself be an exercise in philosophy as well as in history.
 
How are we to view the different forms that philosophy has taken over the centuries? If philosophy were a science we could look on it as an ongoing, co-operative, cumulative intellectual venture in which from time to time fresh discoveries are made. On that view, we twenty-first-century philosophers have an advantage over earlier practitioners of the discipline. We stand, no doubt, on the shoulders of other and greater philosophers, but we do stand above them. We have superannuated Plato and Kant. But this, as we have seen, is a mistaken view: the great works of the best philosophers do not date.
 
Is there, then, any sense in which philosophy makes progress? Philosophy does not make progress in the way that science does, with the discoveries of the most recent generation building on, and making obsolete, the theories of previous generations. Contemporary philosophers, of course, know some things that the greatest philosophers of the past did not know; but the things they know are not philosophical matters but the truths that have been discovered by the sciences begotten of philosophy. New developments in philosophy commonly consist in the application of philosophy to new areas of discourse as human life becomes more complicated. Thus in addition to simple ethics – the ethics of the human condition – we now have business ethics, medical ethics, and environmental ethics.
 
Some people believe that the major task of philosophy is to cure us of intellectual confusion. On this modest view of the philosopher’s role, the tasks to be addressed differ across history, since each period needs a different form of therapy. The knots into which the undisciplined mind ties itself differ from age to age and different mental motions are necessary to untie the knots. A prevalent malady of our own age, for instance, is the temptation to think of the mind as a computer, whereas earlier ages were tempted to think of it as a telephone exchange, a pedal organ, a homunculus, or a spirit. Maladies of earlier ages may be dormant, such as the belief that the stars are living beings; or they may return, such as the belief that the stars enable one to predict human behaviour.
 
The therapeutic view of philosophy, however, may seem to allow only for variation over time, not for genuine progress. But that is not necessarily true. There are some things that philosophers of the present day understand which even the greatest philosophers of earlier generations failed to understand. For instance, philosophers clarify language by distinguishing between different senses of words; and once a distinction has been made, future philosophers have to take account of it in their deliberations. A confusion of thought may be so satisfactorily cleared up by a philosopher that it no longer offers temptation to the unwary thinker.
 
One such example appears early in this history. Parmenides, the founder of the discipline of ontology (the science of being) based much of his system on a systematic confusion between different senses of the verb ‘to be’. Plato, in one of his dialogues, sorted out the issues so successfully that there has never again been an excuse for mixing them up: indeed, it now takes a great effort of philosophical imagination to work out exactly what led Parmenides into confusion in the first place.
 
Another example is the issue of free-will. At a certain point in the history of philosophy a distinction was made between two kinds of human freedom: liberty of indifference (ability to do otherwise) and liberty of spontaneity (ability to do what you want). Once this distinction has been made the question ‘Do human beings enjoy freedom of the will?’ has to be answered in a way that takes account of the distinction. Even someone who believes that the two kinds of liberty in fact coincide has to provide arguments to show this; he cannot simply ignore the distinction and hope to be taken seriously on the topic.
 
It is unsurprising, given the relationship of philosophy to a canon, that one notable feature of philosophical progress consists in coming to terms with, and interpreting, the thoughts of the great philosophers of the past. The great works of the past do not lose their importance in philosophy – but their intellectual contributions are not static. Each age interprets and applies philosophical classics to its own problems and aspirations. This is, in recent years, most visible in the field of ethics. The ethical works of Plato and Aristotle are as influential in moral thinking today as the works of any twentieth century moralists – this is easily verified by consulting any citation index – but they are being interpreted and applied in ways quite different from the ways in which they were used in the past. These new interpretations and applications do effect a genuine advance in our understanding of Plato and Aristotle, but of course it is understanding of quite a different kind from that which is given by a new study of the chronology of Plato’s early dialogues, or a stylometric comparison between Aristotle’s various ethical works. The new light we receive resembles rather the enhanced appreciation of Shakespeare we may get by seeing a new and intelligent production of King Lear.
 
The history of philosophy presented in this book is not based on any notion that the current state of philosophy represents the highest point of philosophical endeavour up to the present. On the contrary, its primary assumption is that in many respects the philosophy of the great dead philosophers has not dated, and that even today one may gain great illumination by a careful reading of the great works that we have been privileged to inherit.
 
The kernel of any kind of history of philosophy is exegesis: the close reading and interpretation of philosophical texts. Exegesis may be of two kinds, internal or external. In internal exegesis the interpreter tries to make the text coherent and consistent, employing the principle of charity in interpretation. In external exegesis the interpreter seeks to bring out the significance of the text by comparing it and contrasting it with other texts.
 
Exegesis is the common basis of the two quite different historical endeavours which I described earlier. In one, which we may call historical philosophy, the aim is to reach philosophical understanding about the matter or issue under discussion in the text. Typically, historical philosophy looks for the reasons behind, or the justification for, the statements made in the text under study. Even so, the historical philosopher must also have a knowledge of the historical context in which past philosophers wrote their works. If he finds a good reason for a past philosopher’s doctrines, his task is done. But if he concludes that the past philosopher had no good reason, he has a further and much more difficult task, of explaining the doctrine in terms of the context in which it appeared – social, perhaps, as well as intellectual.
 
In the other endeavour, the history of ideas, this contextual aim is paramount. The aim of the historian is not to solve the philosophical problem in hand, but to achieve understanding of a person or an age or a historical succession. Typically, the historian of ideas looks not for the reasons so much as the sources, or causes, or motives, for saying what is said in the target text.
 
Both historical philosophy and the history of ideas base themselves on exegesis, but of the two, the history of ideas is the one most closely bound up with the accuracy and sensitivity of the reading of the text. In different histories of philosophy, the skills of the historian and those of the philosopher are exercised in different proportions. The due proportion varies in accordance with the purpose of the work and the field of philosophy in question. The pursuit of historical understanding and the pursuit of philosophical enlightenment are both legitimate approaches to the history of philosophy, but both have their dangers. Historians who study the history of thought without being themselves involved in the philosophical problems that exercised past philosophers are likely to sin by superficiality. Philosophers who read ancient, medieval, or early modern texts without a knowledge of the historical context in which they were written are likely to sin by anachronism.
 
Each of these errors can nullify the purpose of the enterprise. The historian who is unconcerned by the philosophical problems that troubled past writers has not really understood how they themselves conducted their thinking. The philosopher who ignores the historical background of past classics will gain no fresh light on the issues which concern us today, but merely present contemporary prejudices in fancy dress.
 
When we look back at the long history of philosophy there are two striking features. One is that despite the emergence of innumerable new scientific disciplines over the centuries, there are still areas where science, philosophy, and religion are as entangled with each other as they were in the age of the pre-Socratics. The other is that in very recent times access to philosophy has been widened immeasurably.
 
The most obvious area in which science has not yet cut free from philosophy and religion is evolutionary biology. Examination of the fossil record and experiments on generations of fruit flies are patently scientific enterprises. Yet in many places, ranging from Turkey to a number of American states, scientists are hindered from presenting the results of their researches because they are believed to be in conflict with sacred texts, whether Genesis or the Koran. On the other hand, different scientific accounts of the mechanisms of evolution themselves involve assumptions of a philosophical nature: the neo-Darwinist claim that the only mechanism of evolution is natural selection upon random genetic mutation is not an empirical discovery, but a philosophical postulate. In the debates between neo-Darwinists and creationists both sides start from a position of faith. Whether it is possible to make sense of the notion of supernatural intelligent design is itself a question of pure philosophy.


The art of sitting on the fence:

No delusions here:

Richard Dawkins Q&A

By Kristian Hammerstad for New Statesman

The English academic Richard Dawkins was born in Nairobi, Kenya in 1941. He is the bestselling author of 16 books including The Selfish Gene and The God Delusion.

Who are your heroes?
As a child, Hugh Lofting’s Doctor Dolittle (I haven’t seen any of the films and don’t want to). As an adult, Charles Darwin.

What book last changed your thinking?
Democracy Hacked by Martin Moore, a penetrating exposé of the mercenaries who got us into our present appalling mess (see below for a review of Moore’s work).

Which political figure do you look up to?
Clement Attlee, who presided over the foundation of the welfare state.

In which time and place, other than your own, would you like to live?
The moment language was first invented. Did it start with vocabulary but no grammar? Was it invented by a single individual, in which case to whom did that genius talk?

What TV show could you not live without?
David Attenborough, of course.

Who would paint your portrait?
I have a blind spot where visual art is concerned. My answer to this question, were I to give one, would be founded in ignorance and therefore of no interest.

What’s the best piece of advice you’ve ever received?
Don’t follow your instincts, don’t follow a leader, don’t follow a tradition, don’t follow a holy book. Follow the evidence, follow logic and reason.

What’s currently bugging you?
The subversion of democracy by our current political leaders. We are, or should be, a parliamentary democracy, not a plebiscocracy and not a dictatorship ruled by a loud-mouthed Hooray-Henry Bullingdon bully.*

Are we all doomed?
Realistically yes, but it’s terrible to live your life in the shadow of such pessimism, so I prefer to feign optimism to myself. Almost every species that has ever lived has gone extinct, leaving no descendants. We might be a rare exception.


Review of: Democracy Hacked

The digital age was supposed to be democratic, but under Amazon, Google, Facebook etc., it has become a quest for profit at any cost.

By Steven Poole for New Statesman

Last month, Apple unveiled the latest version of its watch, featuring new health-monitoring features such as alerts for unusually low or high heart rates, and a way to sense when the wearer has fallen over and, if so, call the emergency services. In itself, that sounds pretty cool, and might even help save lives. But it’s also another nail in the coffin of social solidarity.

Why? Because shortly after the Apple announcement, one of America’s biggest insurance companies, John Hancock, announced it would stop selling traditional life insurance, and would now offer only “interactive” policies that required customers to wear a health-monitoring device – such as an Apple Watch or Fitbit. But such personalised insurance plans undermine the social spreading of risk that makes insurance a public good. Knowing every little dirty secret about our lifestyles, such an insurer will be heavily incentivised to make the riskier customers pay more in premiums than the healthy-livers. Eventually, the fortunate will subsidise the less fortunate to a far smaller degree than they do on traditional insurance models. For those who get sick, this will literally add insult to injury.

This happened too late to be mentioned in Martin Moore’s excellent new book, but he wouldn’t be surprised, having devoted an alarming section to the race into data-mining health applications by Apple, Google and Amazon. As he explains, “the big tech platforms – and many of their investors – can imagine a future in which each of them becomes our main gateway to health care.” This will, of course, undermine the NHS and cost us our biomedical privacy.

Silicon Valley’s dream of “disrupting” or “reimagining” health care is just one example of the way the tech giants long to muscle their way in to, and extract large profits from, social institutions they don’t understand. Tech CEOs know nothing in particular about education, for another thing, but are canny enough to see that it is a huge potential revenue centre, if only they could persuade schools to use their software and computers.

Actually, Google is already doing a very good job of that. By mid-2017, the majority of schoolchildren in America were using Google’s education apps, which of course track the activity of every child, creating a store of data that – who knows? – might come in useful when those children grow up to be attractive targets for advertising. In the near future, Moore points out, we might no longer have a choice: “It will be a brave parent who chooses to opt out of a data-driven system, if by opting out it means their child has less chance of gaining entry to the college of their choice, or of entering the career they aspire to.”

If, practically speaking, you can’t opt out of a health care platform, or switch from the education platform your local school uses, then unaccountable corporate monopolies have usurped the functions of government. Moore calls this “platform democracy”. You might equally suggest it as a new meaning for “technocracy”, which up till now has meant rule by experts. Soon, technocracy might mean rule by people who don’t understand anything, but think that data alone constitutes expertise; people who glory in the “engineering ethos” of rapid prototyping and deployment; or, as Facebook’s old motto had it, “move fast and break things”. This is fine when you are building a trivial app; it’s not so fine if the things you are breaking are people and social institutions.

It already seems a long time ago that people were hailing the so-called Facebook and Twitter revolutions in the Middle East, and that hacker-pranksters such as the Anonymous collective chose targets such as Scientology. Now these have been replaced by Russian bot-farms and total surveillance. Moore’s book is an investigation of how we got here from there, and a troubling warning about how the future might unfold.

He begins by bringing the reader up to speed, in lucid detail, on Steve Bannon and the Breitbart website, as well as the story of Cambridge Analytica. He explains what we know about Russian interference in the 2016 US presidential election, while making the important point that such operations are not at all new. During the Cold War, the USSR and its puppet regimes ran energetic fake-news operations against the West. The only difference now is that modern technology makes disinformation operations much more effective, as falsehoods can go viral around the globe in a matter of minutes. Putin now has his own social-media sock-puppet farm, hidden in plain sight under the bland name of the “Internet Research Agency”. (It does about as much research as Jacob Rees-Mogg’s “European Research Group” for hard Brexiteers.)

This leads directly into Moore’s larger argument, which is that for reasons of profit the tech platforms actively turned themselves into machines perfectly suited to the dissemination of anarcho-nationalist hatred and untruth. Until recently, Moore notes, Facebook rarely thought about politics, and if it did “it tended to assume the platform was by its nature democratising”. But ahead of its 2012 stock-market flotation, it went “all out to create an intelligent, scalable, global, targeted advertising machine” that gave advertisers granular access to users. And so it created the most efficient delivery system for targeted political propaganda the world had ever seen.

It wasn’t just the bad guys who noticed this. In 2012, Barack Obama’s blog director Sam Graham-Felsen enthused: “If you can figure out how to leverage the power of friendship, that opens up incredible possibilities.” The possibilities that Facebook has since opened up would have seemed incredible six years ago. A member of the Trump campaign team openly described one aspect of their Facebook campaign as “voter suppression operations” aimed at Democrats, using something called “dark posts”. These allowed operators to conduct sophisticated testing comparing the effects of different kinds of adverts, creating, as Moore puts it, “a remarkably sophisticated behavioural response propaganda system”.

For its part, Google contributed to the global miasma of virtual bullshit through its innovations in advertising to create what is known as “ad tech”. Moore calls this “the poison at the heart of our digital democracy”, because “it cannot function without behavioural tracking, it does not work unless done at a gargantuan scale, and it is chronically and inherently opaque”. Famously, the Google founders, Larry Page and Sergey Brin, noted in 1998 that any search engine that depended on advertising revenue would be biased and not serve its users well. But then they, too, realised that they wanted to make tons of money, and advertising would be how. It was Google’s innovations in selling online advertising, Moore argues, that created the obsession with clicks that came to dominate the internet and drive the commissioning of ever more trivial click bait by terrified publishers. Apart from that it represents a terrible waste of formidable talent: as a former Facebook engineer, Jeff Hammerbacher, said in 2011: “The best minds of my generation are thinking about how to make people click ads. That sucks.”

Because Google’s “theology of engineering” placed a premium on removing friction – “friction for the most part meaning people”, Moore observes sharply – the system was designed to be automated and accessible to everyone. Google didn’t care whether you were a hawker of vacuum cleaners or a neo-Nazi. But you’d get the personal touch if you had a lot of money to spend. Remarkably, Moore reports, Google as well as Facebook sent employees to work with the Trump campaign in 2016, to help them optimise and create “engagement” with their propaganda (Facebook offered to do the same for the Clinton campaign). The metric of engagement (meaning clicks) also created an inbuilt bias even in the standard automated system, Moore points out: “Thanks to the way the ad tech model prioritised ads that were engaging, incendiary political advertisements were cheaper to post than more measured ones.”

Moore’s chapter about Twitter is really about the death of local journalism and the decline of national newsrooms, and the void of political accountability that has opened up because of it. Twitter has its own well-documented problems with toxic trolls and bots, but the slow death of news isn’t its fault. More clearly culpable is Google. On 9/11 Google employees were instructed to simply copy the text and code of news websites and display it on Google’s homepage. As Google’s former communications man Douglas Edwards relates in his memoir, I’m Feeling Lucky: “No one asked whether it was within our legal rights to appropriate others’ content.” That innovation became Google News. Now, in the US, there are four PR people for every journalist.

Moore also limns an ever-more-intense “surveillance democracy”, to be enabled by new forms of compulsory computerised ID and the shiny networked gewgaws of what is sold as the “smart city”. “By 2020,” Moore notes, “every car in Singapore has to have a built-in GPS that communicates location and speed not just to the driver but to authorities,” while already in one housing development, officials have access to real-time data about energy, water, and waste usage. “In layman’s terms,” Moore explains, “this translates to the local authority knowing when you have just flushed the toilet.” The Black Mirror-style “social credit” scheme already under way in China, meanwhile, gives citizens a trust score based on their communication and purchasing behaviour. If you have a low score you might not be able to book a train ticket. In Moore’s view, such advances amount to “reimagining the state as a digital platform”, and this is even more dangerous than giving pieces of the state over to the existing tech platforms.

So what can we do? There are some green shoots of resistance, and they all share the general idea that our creaking institutions of democracy need to be brought into the modern age, partly so as to resist the threat of “for-profit platform democracy”, and partly so as to renew public trust. (In one Journal of Democracy study, only 40 per cent of millennials in the UK and the US were wholly committed to living in a democracy.) Emmanuel Macron’s much-vaunted “citizens’ consultations” have not as yet amounted to much, but at least, Moore says, he “acknowledged the scale of the challenge”. In 2017, Paris mayor Anne Hidalgo let schoolchildren vote on how their budget should be allocated: this and other experiments in direct mass consultation show that it’s now much easier to know exactly what the people want, if you sincerely care to find out.

The best example of a dynamic democracy that is technologically literate enough not to be in danger of a takeover by the corporate giants is Estonia. There, the digital infrastructure was built with democracy and public accountability in mind. ID is electronic, but the data the state holds on each citizen is held in separate subject-area “silos” that can’t be amalgamated, and the citizen has the right not only to see it all, but to be notified whenever the state looks at it. It is a transparent system that Estonians themselves are rightly proud of. And its example ought to remind us that if we don’t follow their lead and design digital democracy ourselves, there is no shortage of rapacious corporations that will line up to do it for us.

In the space of one election cycle, authoritarian governments, moneyed elites and fringe hackers figured out how to game elections, bypass democratic processes, and turn social networks into battlefields. Facebook, Google and Twitter – where our politics now takes place – have lost control and are struggling to claw it back. Prepare for a new strain of democracy. A world of datafied citizens, real-time surveillance, enforced wellness and pre-crime. Where switching your mobile platform will have more impact on your life than switching your government. Where freedom and privacy are seen as incompatible with social wellbeing and compulsory transparency. As our lives migrate online, we have become increasingly vulnerable to digital platforms founded on selling your attention to the highest bidder. Our laws don’t cover what is happening and our politicians don’t understand it. But if we don’t change the system now, we may not get another chance.


Footnote

*  Political satire can (1) find entertainment in politics, but it can also (2) say or draw things in clever ways to evade censorship: a method of advancing political arguments in countries where such arguments are expressly forbidden. Read more:

01. — Left wing vs. Right wing
02. — The “–isms”
03. — Theocracy
04. — Terminology
05. — Political figures
06. — Political satire


RELATED READINGS

Eldridge, R. T. (Ed.). (2009). The Oxford Handbook of Philosophy and Literature. Oxford: Oxford University Press.

Gottlieb, A. (2000). The Dream of Reason: a History of Philosophy from the Greeks to the Renaissance. New York: W. W. Norton & Co.

Gottlieb, A. (2016). The Dream of Enlightenment: The Rise of Modern Philosophy. London: Penguin.

Grayling, A. C. (2019). The History of Philosophy. London: Penguin.

Russell, B. (1945). A History of Western Philosophy. London: George Allen & Unwin Ltd.

