September 19, 2001, 6-7:30 PM in 234 Moses Hall
Aladdin M. Yaqub (University of New Mexico, Albuquerque/Stanford University)
The View from Within: Mathematics as an Objective Study
In this paper I present a defense of the thesis that mathematics is an objective study whose subject matter is abstract structures. The central argument is carried out by examining and analyzing a few examples in which mathematical explanations and applications are presented within mathematics itself, i.e., for the sake of illuminating other parts of mathematics. This approach departs from customary philosophical practice, in which explanation is thought to exist exclusively in the natural and social sciences, and mathematical application is understood to be the application of mathematics in the sciences. Much of the current research in the philosophy of mathematics focuses on the philosophical implications of the applicability of mathematics in science - its external applicability. My goal is to rethink this focus and redirect our attention to the internal applicability of mathematics and to its (usually neglected) explanatory nature.
October 24, 2001, 6-7:30 PM in 234 Moses Hall
Jean-Yves Beziau (CSLI, Stanford University)
Possible Worlds: A Fashionable Nonsense?
The concept of a possible world, originally due to Leibniz, became extremely popular during the twentieth century. The notion was used in a technical way to develop the semantics of modal logics, and this gave a real kick-start to the growth of modal logic. It then started spreading in philosophy and further contaminated all other fields, from literary theory to quantum physics. As a witness to this glory, a Nobel symposium has even been organized on the topic.
After presenting some historical comments about the resurgence of this notion, I will show that:
- From a technical point of view in logic, possible worlds are useless and can easily be eliminated.
- In philosophy of language, people use this notion of possible worlds in an ambiguous way. For example Kripke’s distinction between definite descriptions and proper names is based on a notion of possible worlds which does not fit with the standard technical definition (which was promoted by Kripke himself).
- In the other fields, it is used in a metaphorical and rhetorical way, which leads to philosophy ranging from bad science-fiction to post-modern prose.
This is joint work with Darko Sarenac (Philosophy, Stanford University)
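The "standard technical definition" the talk refers to is the Kripke-style possible-worlds semantics for modal logic. As a minimal sketch of that definition (the model, worlds, and formulas below are invented for illustration, not taken from the talk):

```python
# Minimal sketch of Kripke (possible-worlds) semantics for modal logic.
# A model is a set of worlds, an accessibility relation, and a valuation
# saying at which worlds each atomic proposition is true.
worlds = {"w1", "w2", "w3"}
access = {"w1": {"w2", "w3"}, "w2": {"w2"}, "w3": set()}
valuation = {"p": {"w2", "w3"}, "q": {"w2"}}

def true_at(world, formula):
    """Evaluate a formula at a world. Formulas are nested tuples:
    an atom like 'p', ('not', f), ('and', f, g), ('box', f), ('dia', f)."""
    if isinstance(formula, str):                 # atomic proposition
        return world in valuation[formula]
    op = formula[0]
    if op == "not":
        return not true_at(world, formula[1])
    if op == "and":
        return true_at(world, formula[1]) and true_at(world, formula[2])
    if op == "box":   # necessity: true at every accessible world
        return all(true_at(v, formula[1]) for v in access[world])
    if op == "dia":   # possibility: true at some accessible world
        return any(true_at(v, formula[1]) for v in access[world])
    raise ValueError(f"unknown operator: {op}")

print(true_at("w1", ("box", "p")))  # p holds at both worlds accessible from w1
print(true_at("w1", ("dia", "q")))  # q holds at w2, which w1 accesses
print(true_at("w3", ("box", "q")))  # vacuously true: w3 accesses no worlds
```

On this definition the "worlds" are bare points of evaluation; the elimination claim in the first bullet trades on exactly this thinness.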
November 07, 2001, 6-7:30 PM in 234 Moses Hall
Jenann Ismael (University of Arizona, Tucson)
The Reality of Time
Most philosophers, partly out of commitment to the physicists’ picture of an unchanging “block universe”, composed of a timeless web of “world-lines” in a four-dimensional space, and partly in reaction to McTaggart’s celebrated 1908 argument, have rejected the notion that time passes as incoherent or, at least, physically disreputable. I defend a metaphysically benign conception of passage that is quite available to adherents of a block universe, and recommend combining it with objective features of the world as a general strategy for arriving at a full reconstruction of our experience of temporality.
November 14, 2001, 6-7:30 PM in 234 Moses Hall
Guido Bacciagaluppi (University of California, Berkeley)
Remarks on Spacetime and Locality in Everett’s Interpretation
Interpretations that follow Everett’s idea that (at some level of description) the universal wave function contains a multiplicity of coexisting realities usually claim to give a completely local account of quantum mechanics. That is, they claim to give an account that avoids both a non-local collapse of the wave function and the action at a distance needed in hidden variable theories in order to reproduce the quantum mechanical violation of the Bell inequalities. In this paper, I sketch how these claims can be substantiated in two renderings of Everett’s ideas, namely the many-minds interpretation of Albert and Loewer, and versions of many-worlds interpretations that rely on the concepts of the theory of decoherence.
December 05, 2001, 6-7:30 PM in 234 Moses Hall
Dr. Dirk Hartmann (Visiting Scholar, Department of Philosophy)
Philosophical Pseudoproblems and Relevant Logics
The purpose of teaching logic in philosophy is to enable us to evaluate arguments with respect to (formal) validity. Standard classical logic rests on a concept of validity which allows the relation of implication to hold between premises and conclusion even in cases where there is no “relevant” connection between them. A prominent example of this is the rule “Ex Falso Quodlibet” (EFQ), which allows us to infer an arbitrary proposition from a contradiction. As alternatives, there are non-standard systems called “relevance logics” or “relevant logics” meant to avoid irrelevance. (What “relevance” is precisely supposed to mean will be made clearer in the talk.) The most outstanding property of these systems is their paraconsistency, i.e. they do not “trivialize” inconsistent theories (because EFQ does not hold in them). But until now only comparatively few logicians and philosophers have regarded paraconsistent and relevant logics as superior to standard logic. There are several reasons for this, but the most important ones, in my opinion, are certain arguments to the effect that the principles denied by relevant logics are nevertheless valid inferences. So it seems that while we can indeed devise “relevant” calculi, these are necessarily incomplete: we can perhaps use them in contexts where we want to focus on certain inferences for extralogical reasons, but they are not apt to evaluate inferences with respect to their logical validity.
In my talk I will first show that the arguments against relevant logics beg the question. Then I will go one step further and try to support the claim that relevant logics are in fact to be preferred to standard logic in all discursive contexts. (The crucial point is that standard logics cannot fulfill their intended task of analyzing and evaluating arguments; far from doing so, they even generate a multitude of artificial philosophical pseudoproblems.)
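The rule EFQ at issue in the talk can be stated and machine-checked directly. As an illustration (mine, not the speaker's), here it is verified in the Lean proof assistant, where it is a one-line consequence of the classical treatment of negation:

```lean
-- Ex Falso Quodlibet: from a contradiction P ∧ ¬P, any proposition Q follows.
-- `absurd` takes a proof of P and a proof of ¬P and yields anything.
example (P Q : Prop) (h : P ∧ ¬P) : Q :=
  absurd h.1 h.2
```

Relevant logics reject exactly this inference: Q shares no content with the premises, so the implication is deemed irrelevant even though it is classically valid.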
February 12, 2002, 6-7:30 PM in 234 Moses Hall
Penelope Maddy (U.C. Irvine)
Three Forms of Naturalism
I will compare and contrast Quine’s naturalistic perspective with those of two post-Quinean naturalists (John Burgess and myself).
February 26, 2002, 6-7:30 PM in 234 Moses Hall
Michael Strevens (Stanford University)
Understanding Complexity through Probability
Many complex systems behave in surprisingly simple ways. In spite of the complexity of animal behavior, for example, most ecosystems exhibit a fairly simple and stable behavior at the population level. By examining the foundations of statistical reasoning about complex systems such as gases, ecosystems, and certain social systems, I aim to provide an understanding of how macrolevel simplicity emerges from microlevel complexity. The talk will outline the main arguments and methods of my forthcoming book on this topic.
February 27, 2002, 6-7:30 PM in 234 Moses Hall
William Demopoulos (University of Western Ontario)
On the Philosophical Interest of Frege Arithmetic
After reviewing the methodological basis for the claims of traditional and neo-Fregean logicism to have established the a prioricity of arithmetic, I outline an account of the truth of Frege arithmetic that can be exploited to address the issue of a prioricity.
March 06, 2002, 6-7:30 PM in 234 Moses Hall
Peter Godfrey-Smith (Stanford University)
Goodman’s Problem and Scientific Methodology
A solution to Goodman’s “grue” problem is proposed. There is no appeal to simplicity, naturalness, projectibility, or prior probabilities. Instead, building on proposals made by Frank Jackson and Alexis Burgess, a connection is made to ideas from statistical methodology and data analysis.
April 03, 2002, 6-7:30 PM in 234 Moses Hall
Gerd Grasshoff (University of Bern)
The Copernican Revolution, Kepler’s New Astronomy, and Inferences to the Best Explanation
Inference to the best explanation (IBE) seems to capture a wide range of criteria for scientific theory choice: given the alternative between two competing theories of the same natural phenomena, prefer the one with the better explanation of the phenomena and infer its truth. The critical notions of IBE are “inference”, “explanation”, and what it means to be the best explanation. Their clarification will be pursued through two case studies. The development of astronomical theories during the period of scientific revolution from Copernicus to Kepler is taken as a testing ground for the applicability of various versions of IBE. New historical results on Copernicus’s and Kepler’s scientific objectives and procedures will shed light on the methodological choices that led to one of the most remarkable scientific achievements.
April 17, 2002, 6-7:30 PM in 234 Moses Hall
Ronald Anderson (Boston College)
Mathematics, Logic, and Language: Tracing the Influence of Interpreting Symbolic Structures in 19th-Century British Electromagnetism
During the 19th century the physical significance of a set of mathematical terms known as potentials became a contested issue in British electromagnetism. Such terms formed a vital part of James Clerk Maxwell’s formulation of the laws of electromagnetism (largely between 1855 and 1873), reflecting both the path he followed in developing electromagnetism and the dynamical formulation of his approach. For those who followed Maxwell in extending his electromagnetic theory, the group known as the Maxwellians (FitzGerald, Heaviside, Hertz, and Larmor), apparent problems to do with the physical significance of the potential terms, as well as the sheer complexity of Maxwell’s own formulation, led to the development and use of a set of basic laws for electromagnetism without such terms (the four “Maxwell equations”). The potential terms came to be seen as useful for mathematical calculations but largely without physical significance. Nevertheless the potential terms remained woven into the texture of electromagnetic theory, and rather remarkably, even for some of those at the heart of the Maxwellian enterprise such as Larmor and FitzGerald, aspects of their physical significance and interpretation remained unclear as late as 1901.
The details of this history have been well charted. Woven into these developments, however, is a larger set of issues yet to be fully traced concerning the ways physical meaning and significance are ascribed to mathematics. These concerns occur explicitly in the writings of the Maxwellians and are revealing of the ways mathematics was taken to function in physical theories and of how general features of the nature of physical reality were inferred from those theories. I will trace a number of the ways these interpretative issues arose for the Maxwellians and introduce a project of locating these practices within a broader concern with language, abstract structures, and their meanings in other disciplinary contexts in early and mid-19th-century Britain.
The first context is mathematics, where residues of the 17th- and 18th-century concern with the meaning of imaginary and negative numbers persisted, and where a debate on the meaning and interpretation of the newly emergent “symbolic algebra” flowered in the 1830s (Peacock, De Morgan, William R. Hamilton). The second context is the development of logic as an abstract formal science using symbolic structures, with an associated discussion of their meaning (Whately, Boole, De Morgan, and later Jevons); the third is the emergence of British philology from the late 1830s onwards, which included an older concern with the status of language and words and their relationship to thought (e.g. J. S. Mill, Herschel, Trench, Whewell). A fourth, minor influence is the interpretation of scripture, which arose as a contested issue mid-century. Maxwell in his twenties was reading Boole, Mill, Trench, Whewell, and Herschel, and was also in touch with the theological disputes; interesting resonances from these worlds appear in his writings.
The project also highlights the emergence, in various contexts in mid-19th-century Britain, of a concern with formal abstract structures and their interpretation, as well as hints of an interesting and more fundamental concern with the very ways such structures acquire meanings.
May 08, 2002, 6-7:30 PM in 234 Moses Hall
John MacFarlane (University of California, Berkeley, Philosophy)
A Valuational (but not Supervaluational) Approach to Vagueness
What’s good about supervaluationism is that it allows us to classify vague sentences as neither true nor false in borderline cases, while keeping classical logic largely intact and respecting “penumbral connections.” What’s bad about supervaluationism is that it engenders semantic and logical anomalies, like true existential sentences with no true instances, a truth predicate that does not satisfy the T-schema, and counterexamples to forms of the deduction theorem, argument by cases, and reductio ad absurdum. In this talk, I propose a simple modification of supervaluationism that keeps all of the good without any of the bad. I retain the framework of admissible valuations, the relativized notion of truth on a valuation, and even the notion of supertruth (truth on all valuations). But I refrain from identifying truth simpliciter with supertruth. The only argument I can find for identifying truth simpliciter with supertruth is that there is no better candidate (in the valuational framework). But that is only a good argument if we need to assign unrelativized truth values to sentences in contexts of utterance. Why not just make do with valuation-relative truth assessments?
The best answer I know goes like this: A semantics is only worthy of the name if it can help us understand the use that can be made of sentences in speech acts, e.g. assertions. It can do this if it tells us in what contexts sentences are true (simpliciter), because truth is the objective condition for the correctness of an assertion. (One is obliged to withdraw an assertion that has been shown to be untrue.) But if our “semantics” just tells us in what contexts sentences are true on a valuation, it cannot play this role, because truth on a valuation does not have any direct connection with the use of sentences in assertion. Truth on a valuation, like truth on an assignment of values to variables, is an auxiliary notion in semantics; it connects with use only indirectly, through its role in systematizing the (unrelativized) truth values of complex sentences in various contexts of utterance. Thus there must be a way of moving from valuation-relative truth assessments to unrelativized ones, and the identification of truth with supertruth provides just that.
One way to reject this argument is to connect assessments of truth on a valuation directly with the use of sentences in assertions. Using Belnap and Green’s treatment of future contingents as a model, I show how this might be done. Assertions of vague sentences, I suggest, can be assessed as (objectively) correct or incorrect only relative to both a context of utterance and what I call a context of assessment. Since the set of admissible valuations can change from one context of assessment to another (due to accommodation), a single vague assertion can be correct relative to one context of assessment and incorrect relative to another. To understand the use of vague sentences in assertions, then, we need a semantics that tells us when an uttered sentence is true on a valuation, not when it is true simpliciter. We need a valuational, not a supervaluational, semantics.