The history of philosophy is generally written by subject experts working within a tradition of thought about which figures and topics were "pivotal" and thereby created an ongoing research field. This is illustrated, for example, in Stephen Schwartz's A Brief History of Analytic Philosophy: From Russell to Rawls. Consider the history of Anglophone philosophy since 1880 as told by a standard narrative of the period. One important component was "logicism" -- the idea that the truths of mathematics can be derived from purely logical axioms using symbolic logic. Peano and Frege formulated questions about the foundations of arithmetic; Russell and Whitehead sought to carry out the program of logicism; and Gödel proved that the program cannot be completed: any set of axioms rich enough to derive the theorems of arithmetic is either incomplete or inconsistent. This narrative serves to connect the dots in this particular map of philosophical development. We might want to add details like the impact of logicism on Wittgenstein and the influence of the Tractatus Logico-Philosophicus in turn, but the map is developed by tracing contacts from one philosopher to another, identifying influences, and aggregating groups of topics and philosophers into "schools".
Brian Weatherson, a philosopher at the University of Michigan, has a different idea about how we might proceed in mapping the development of philosophy over the past century (link) (Brian Weatherson, A History of Philosophy Journals, Volume 1: Evidence from Topic Modeling, 1876-2013. Published by the author on GitHub, 2020; link). Professional philosophy in the past century has been expressed primarily in the pages of academic journals. So perhaps we can use a "big data" approach to discovering and tracking the emergence of topics and fields within philosophy by analyzing the frequency and timing of the concepts that appear in academic philosophy journals.
Weatherson pursues this idea systematically. He has downloaded from JSTOR the full contents of twelve leading journals in Anglophone philosophy for the period 1876-2013, producing a database of some 32,000 articles and lists of all words appearing in each article (along with their frequencies). Using the big data technique called "topic modeling," he has arrived at 90 topics (clusters of terms) that recur in these articles. Here is a quick description of topic modeling.
Topic modeling is a type of statistical modeling for discovering the abstract “topics” that occur in a collection of documents. Latent Dirichlet Allocation (LDA) is an example of a topic model and is used to classify the text in a document to a particular topic. It builds a topic-per-document model and a words-per-topic model, modeled as Dirichlet distributions. (link)

Here is Weatherson's description of topic modeling:
An LDA model takes the distribution of words in articles and comes up with a probabilistic assignment of each paper to one of a number of topics. The number of topics has to be set manually, and after some experimentation it seemed that the best results came from dividing the articles up into 90 topics. And a lot of this book discusses the characteristics of these 90 topics. But to give you a more accessible sense of what the data looks like, I’ll start with a graph that groups those topics together into familiar contemporary philosophical subdisciplines, and displays their distributions in the 20th and 21st century journals. (Weatherson, introduction)

Now we are ready to do some history. Weatherson applies the algorithms of LDA topic modeling to this database of journal articles and examines the results. It is important to emphasize that this method is not guided by the intuitions or background knowledge of the researcher; rather, it algorithmically groups documents into clusters based on the frequencies of the various words appearing in them. Weatherson also generates a short list of keywords for each topic: words of reasonable frequency for which the probability of the word appearing in articles in the topic is significantly greater than the probability of its occurring in a random article. And he further groups the 90 topics into a dozen familiar "categories" of philosophy (History of Philosophy, Idealism, Ethics, Philosophy of Science, etc.). This exercise of assigning topics to categories requires judgment and expertise on Weatherson's part; it is not algorithmic. Likewise, the assignment of names to the 90 topics requires expertise and judgment. From the point of view of the LDA model, the topics could be given entirely meaningless names: T1, T2, ..., T90.
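To give a concrete sense of the mechanics, here is a minimal sketch of this kind of pipeline in Python with scikit-learn. It is not Weatherson's code; the file name, the column layout of the article database, and the keyword heuristic are illustrative assumptions.

```python
# Minimal LDA sketch (illustrative only; not Weatherson's actual pipeline).
import numpy as np
import pandas as pd
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Hypothetical article database: one row per article, with the word content
# plus journal and year metadata (the schema and file name are assumptions).
articles = pd.read_csv("philosophy_articles.csv")   # columns: text, journal, year

# Bag-of-words counts; common stop words and very rare terms dropped.
vectorizer = CountVectorizer(stop_words="english", min_df=5)
counts = vectorizer.fit_transform(articles["text"])

# 90 topics, the number Weatherson settled on; it has to be set manually.
lda = LatentDirichletAllocation(n_components=90, random_state=0)
doc_topic = lda.fit_transform(counts)            # article-by-topic probabilities
articles["topic"] = doc_topic.argmax(axis=1)     # assign each article to its most probable topic

# Keywords for a topic: words whose probability within the topic is much
# higher than their probability in the corpus at large.
vocab = np.asarray(vectorizer.get_feature_names_out())
topic_word = lda.components_ / lda.components_.sum(axis=1, keepdims=True)
corpus_word = np.asarray(counts.sum(axis=0)).ravel() / counts.sum()
for t in range(5):                               # print keywords for a few topics
    ratio = topic_word[t] / corpus_word
    print(t, vocab[np.argsort(-ratio)[:10]])
```

The manual steps Weatherson describes -- naming the 90 topics and grouping them into categories -- would happen after this point, outside the algorithm.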
Now every article has been assigned to a topic and a category, and every topic has a set of algorithmically determined keywords. Weatherson then goes back and examines the frequency of each topic and category over time, presented as graphs of the frequency of each category both in the aggregate (across all twelve journals) and singly (for each journal). The graphs look like this:
We can look at these graphs as measures of the rise and fall of prevalence of various fields of philosophy research in the Anglophone academic world over the past century. Most striking is the contrast between idealism (precipitous decline since 1925) and ethics (steady increase in frequency since about the same time), but each category shows some interesting characteristics.
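Curves like these are straightforward to compute once every article carries a topic (and hence a category) label. A sketch, continuing the hypothetical setup above and assuming a hand-built mapping from topics to categories:

```python
# Yearly share of each category across all twelve journals (illustrative sketch).
import matplotlib.pyplot as plt

# Hand-labelled mapping from topic number to category, as in Weatherson's
# grouping step (the numbers and names here are placeholders).
topic_to_category = {0: "Idealism", 1: "Ethics", 2: "Philosophy of Science"}  # ...
articles["category"] = articles["topic"].map(topic_to_category)

# For each year, the fraction of articles falling in each category.
shares = (articles.groupby(["year", "category"]).size()
                  .unstack(fill_value=0))
shares = shares.div(shares.sum(axis=1), axis=0)

shares.rolling(5, min_periods=1).mean().plot()   # mild smoothing for readability
plt.ylabel("share of articles")
plt.show()
```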
Now consider the disaggregation of a single topic across the twelve journals. Weatherson presents this breakdown for all ninety topics. Here is the set of graphs for the topic "Methodology of Science":
All the journals -- including Ethics and Mind -- have articles classified under the topic of "Methodology of Science". For most journals the topic declines in frequency from roughly the 1950s to 2013. Specialty journals in the philosophy of science -- BJPS and Philosophy of Science -- show a generally higher frequency of "Methodology of Science" articles, but they too reveal a decline in frequency over that period. Does this suggest that the discipline of the philosophy of science declined in the second half of the twentieth century (not the impression most philosophers would have)? Or does it rather reflect the fact that the abstract level of analysis identified by the topic of "Methodology of Science" was replaced with more specific and concrete studies of certain areas of the sciences (biology, psychology, neuroscience, social science, chemistry)?
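The per-journal graphs are the same computation restricted to a single topic and grouped by journal. A sketch under the same assumptions (the topic number standing in for "Methodology of Science" is a placeholder):

```python
# Share of one topic per journal per year (illustrative sketch).
import matplotlib.pyplot as plt

methodology_topic = 42   # placeholder for whichever topic was labelled "Methodology of Science"

by_journal = (articles.assign(hit=articles["topic"] == methodology_topic)
                      .groupby(["journal", "year"])["hit"].mean()
                      .unstack("journal"))

# One small panel per journal, echoing the grid of graphs in the book.
by_journal.rolling(5, min_periods=1).mean().plot(subplots=True, layout=(4, 3), figsize=(10, 8))
plt.show()
```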
These results permit many other kinds of questions and discoveries. For example, in chapter 7 Weatherson distills the progression of topics across decades by listing the five most popular topics in each decade:
This table too presents intriguing patterns and interesting questions for further research. For example, from the 1930s through the 1980s a topic within the general field of the philosophy of science is in the list of the top five topics: methodology of science, verification, theories and realism. These topics fall off the list in the 1990s and 2000s. What does this imply -- if anything -- about the prominence or importance of the philosophy of science within Anglophone philosophy in the last several decades? Or as another example -- idealism is the top-ranked topic from the 1890s through the 1940s, only disappearing from the list in the 1960s. This is surprising because the standard narrative would say that idealism was vanquished within philosophy in the 1930s. And another interesting example -- ordinary language. Ordinary language is a topic on the top five list for every decade, and is the most popular topic from the 1950s through the present. And yet "ordinary language philosophy" would generally be thought to have arisen in the 1940s and declined permanently in the 1960s. Finally, topics in the field of ethics are scarce in these lists; "promises and imperatives" is the only clear example from the topics listed here, and this topic appears only in the 1960s and 1970s. That seems to imply that the fields of ethics and social-political philosophy were unimportant throughout this long sweep of time -- hard to reconcile with the impetus given to substantive ethical theory and theory of justice in the 1960s and 1970s. For that matter, the original list of 90 topics identified by the topic-modeling algorithm is surprisingly sparse when it comes to topics in ethics and political philosophy: 2.16 Value, 2.25 Moral Conscience, 2.31 Social Contract Theory, 2.33 Promises and Imperatives, 2.41 War, 2.49 Virtues, 2.53 Liberal Democracy, 2.53 Duties, 2.65 Egalitarianism, 2.70 Medical Ethics and Freud, 2.83 Population Ethics, 2.90 Norms. Where is "Justice" in the corpus?
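The decade table itself can be reproduced mechanically from the same labelled data; a sketch, again under the assumptions above:

```python
# Five most frequent topics per decade (illustrative sketch).
articles["decade"] = (articles["year"] // 10) * 10

top5 = (articles.groupby("decade")["topic"]
                .apply(lambda s: s.value_counts(normalize=True).head(5)))
print(top5)   # (decade, topic) pairs with each topic's share of that decade's articles
```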
Above I described this project as a new approach to the history of philosophy (surely applicable as well to other fields such as art history, sociology, or literary criticism). But it seems clear that the modeling approach Weatherson pursues is not a replacement for other conceptions of intellectual history, but rather a highly valuable new source of data and questions that historians of philosophy will want to address. And in fact, this is how Weatherson treats the results of this work: not as a replacement but rather as a supplement and a source of new puzzles for expert historians of philosophy.
(There is an interesting parallel between this use of big data and the use of Ngrams, the tool Google created to map the frequency of occurrence of words in books over several centuries. Here are several earlier posts on the use of Ngrams: link, link. Gabriel Abend made use of this tool in his research on the history of business ethics in The Moral Background: An Inquiry into the History of Business Ethics. Here is a discussion of Abend's work; link. The topic-modeling approach is substantially more sophisticated because it does not reduce to simple word frequencies over time. As such it is a very significant and innovative contribution to the emerging field of "digital humanities" (link).)