October 16, 2002, 6-7:30 PM in 234 Moses Hall

Karen Neander

Content ascriptions in cognitive science

Cognitive scientists and neuroethologists use a notion of content that allows for the possibility of error. In that minimal sense at least, they treat neurological processes as intentional. This paper argues that if we attend to the role that content ascriptions play in these sciences, there are useful lessons for theories of mental representation. In particular, morals are drawn for the functional indeterminacy problem in relation to teleosemantics.

November 02, 2002, 6-7:30 PM in 234 Moses Hall

Branden Fitelson (San Jose State)

Studies in Bayesian Confirmation Theory

According to Bayesian confirmation theory, evidence E (incrementally) confirms (or supports) a hypothesis H (roughly) just in case E and H are positively probabilistically correlated (under an appropriate probability function Pr). There are many logically equivalent ways of saying that E and H are correlated under Pr. Surprisingly, this leads to a plethora of non-equivalent quantitative measures of the degree to which E confirms H (under Pr). In fact, many non-equivalent Bayesian measures of the degree to which E confirms (or supports) H have been proposed and defended in the literature on confirmation and inductive logic. I will provide a brief (partial) survey of the various proposals, and a discussion of the philosophical ramifications of the differences between them. I will argue that the set of candidate measures can be narrowed drastically by just a few intuitive and simple desiderata. This leads, ultimately, to a unified theory of confirmation and logical support which subsumes both deductive and inductive cases in a single quantitative theory. If time permits, I will discuss (in more detail) one or two well-known confirmation-theoretic (or inductive-logical) problems from the perspective of this unified theory.
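The non-equivalence of candidate measures can be illustrated with a toy computation. The three measures below (difference, log-ratio, and log-likelihood-ratio) are standard proposals from the confirmation literature; the function name and the example probabilities are illustrative choices, not taken from the talk.

```python
# Toy illustration: three standard Bayesian confirmation measures applied
# to the same probability assignments. All three agree on the *sign* of
# confirmation, but they can disagree about comparative *degree*.
import math

def measures(p_h, p_e_given_h, p_e_given_not_h):
    """Return (difference, log-ratio, log-likelihood-ratio) for H given E."""
    # Total probability of the evidence E
    p_e = p_h * p_e_given_h + (1 - p_h) * p_e_given_not_h
    # Posterior probability of H given E, by Bayes' theorem
    p_h_given_e = p_h * p_e_given_h / p_e
    d = p_h_given_e - p_h                        # difference measure
    r = math.log(p_h_given_e / p_h)              # log-ratio measure
    l = math.log(p_e_given_h / p_e_given_not_h)  # log-likelihood-ratio measure
    return d, r, l

# Two setups with the same likelihoods but different priors: E confirms H
# in both (all measures positive), yet the measures rank the cases
# differently, so they are not even ordinally equivalent.
case_high_prior = measures(0.5, 0.9, 0.3)
case_low_prior = measures(0.05, 0.9, 0.3)
```

With these numbers the difference measure ranks the high-prior case as more strongly confirmed, the log-ratio measure ranks the low-prior case higher, and the log-likelihood-ratio measure treats them as equal, since it depends only on the likelihoods.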

Slides for the talk

November 04, 2002, 6-7:30 PM in 234 Moses Hall

Richard Zach (University of Calgary, Philosophy)

The Early History of the Epsilon Calculus

The epsilon calculus is a logical formalism introduced by David Hilbert in the early 1920s. It consists in adding a term-forming operator epsilon to formal systems of logic or mathematics. A term of the form epsilon x A(x) intuitively denotes an x for which A(x) is true, if there is one. Such a term-forming operator can replace quantifiers, and in fact it was this feature that led Hilbert to introduce epsilons. We will survey both the motivations that led Hilbert to the introduction of the epsilon calculus and the early results obtained about it. This includes Hilbert’s ‘Ansatz’ for a consistency proof for arithmetic, his second, ‘failed’ proof for the first epsilon theorem, and the contributions of Bernays and Ackermann to the development of the theory of the epsilon calculus in the 1920s and early 1930s.
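The replacement of quantifiers by epsilon terms that the abstract mentions is standardly stated as follows (notation varies across sources):

```latex
% Hilbert's epsilon terms allow both quantifiers to be defined away:
\exists x\, A(x) \;\longleftrightarrow\; A(\varepsilon x\, A(x))
\qquad
\forall x\, A(x) \;\longleftrightarrow\; A(\varepsilon x\, \neg A(x))
```

Intuitively, epsilon x A(x) names a witness for A if one exists, so the existential equivalence is immediate; the universal equivalence holds because A holds of everything just in case it holds even of the term picking out a counterexample to A, if there is one.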


Hilbert’s “Verunglückter Beweis,” the first epsilon theorem, and consistency proofs

The Epsilon Calculus

November 25, 2002, 6-7:30 PM in 234 Moses Hall

Paul Teller (UC Davis, Philosophy)

Science and Truth

Recent work on the nature and function of scientific theories has broad repercussions for philosophy. The conventional attitude towards science holds that science aims to discover the exact, universally applicable laws that describe all features of the world. Recent work by Nancy Cartwright, Ronald Giere, and others reminds us that this is not what is accomplished by the science we know, not even our best theories in physics. Rather, science produces models, always limited in their application and accuracy. This school of thought urges that, in the words of Laplace, “our efforts will always fall infinitely short of [the] mark” of the perfect scientific representation of the world. If so, the limited representative and epistemic achievements we know from actual science, and not the utopian perfect model, constitute the best of such human endeavors, and so should be taken to apply to human representation very generally. I conclude that we need to take seriously the project of rethinking the idea of truth: Rather than exact representation or “correspondence” with the world, our representational contact with the world should be understood as something fundamentally, and not incidentally, inexact.

Such general considerations are all well and good, but not very illuminating without a more specific account of how fundamentally inexact representation works. The present talk will explore some initial efforts towards developing such an account.

December 04, 2002, 6-7:30 PM in 234 Moses Hall

Reviel Netz (Stanford University)

Mathematics is a Symbolic System: a Historical Note

It is natural to us to picture mathematics in terms of systems expressed in symbols. In this talk I briefly present two of the historical contexts that made such a picture natural. Where do symbols come from, and where do systems?

Late Antiquity and the Middle Ages are found to have a surprisingly central role in the development of this image of mathematics. New practices of writing made it natural to arrange mathematics systematically, as well as to introduce symbolism where, earlier, natural language alone was used. In conclusion, then, the talk illustrates what we might gain by considering the history of mathematical writing as part of the history of writing itself.

March 12, 2003, 6-7:30 PM in 234 Moses Hall

Mark van Atten (Philosophy, Catholic University of Leuven, Belgium)

On Gödel’s philosophical development

It is by now well known that Gödel first advocated the philosophy of Leibniz and then, from 1959 on, that of Husserl. This raises three questions:

  1. How is this turn to Husserl to be interpreted? Is it a complete dismissal of the Leibnizian philosophy, or a different way to achieve similar goals, or could the relation be even closer than that?

  2. Why did Gödel turn specifically to the later Husserl’s transcendental idealism?

  3. Is there any detectable influence from Husserl on Gödel’s writings?

The second question is particularly pressing, given that Gödel was, by his own admission, a realist in mathematics since 1925. Wouldn’t the uncompromising realism of the early Husserl’s Logical Investigations have been a more obvious choice for a Platonist like Gödel?

We want to suggest that the answer to the first question follows immediately from the answer to the second; and the third question can only be approached when an answer to the second has been given. We will present an answer to the second question and then see how it sheds light on the other two. To support our argument, we adduce unpublished material from the Gödel archive.

This is joint work with Juliette Kennedy (Mathematics, University of Helsinki).

April 02, 2003, 6-7:30 PM in 234 Moses Hall

James Tappenden (University of Michigan)

The Riemannian Context of Frege’s Philosophy of Mathematics

Standard interpretations of Frege’s logicism take at face value a drastically oversimplified picture of nineteenth century mathematics. Against this background, Frege can easily seem to be outside the mathematical mainstream, and many commentators have recently concluded exactly this. This paper digs into the historical background to show that Frege (and nineteenth century foundations more generally) was more profoundly engaged with ongoing mathematics than has been realized. Among other things that are relevant to assessing the mathematical thrust of Frege’s work are a contrast between the Riemann-inspired “conceptual” style of Göttingen and the arithmetical style of Weierstrass and Berlin, differences between Riemann and Weierstrass on definitional practices, and the early applications in number theory of (what is now called in English) the theory of cyclotomic extensions. This historical background is not just interesting in its own right, but it also prompts a revised assessment of what Frege was trying to do in Grundlagen, and in turn suggests a reevaluation of the proper relation between the philosophy of mathematics and mathematical practice.

April 16, 2003, 6-7:30 PM in 234 Moses Hall

Øystein Linnebo (University of Oslo, Norway)

Neo-logicism and Impredicativity

The neo-logicist philosophy of arithmetic is based on Hume’s Principle (HP), which says that the number of Fs is identical to the number of Gs if and only if the Fs and the Gs can be one-to-one correlated. Let Frege Arithmetic (FA) be the second-order theory with HP as its sole non-logical axiom. FA can be shown to be technically adequate as a basis for arithmetic: It is consistent, and it allows us to interpret second-order Peano Arithmetic. But is it philosophically adequate? My paper discusses how the highly impredicative character of FA bears on the question of philosophical adequacy. I distinguish two dimensions of impredicativity involved in FA and prove that both are needed to establish that every number has a successor. I argue that this dependence of the Successor Axiom on impredicative comprehension is both unnatural and probably circular. I therefore make an alternative proposal, which allows a purely predicative proof of the Successor Axiom.
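A standard formal statement of HP, with the right-hand side spelling out the second-order condition that a relation R one-to-one correlates the Fs and the Gs:

```latex
% Hume's Principle: the number of Fs equals the number of Gs iff
% some relation R correlates the Fs and the Gs one-to-one.
\#F = \#G \;\longleftrightarrow\; \exists R\, \bigl[
  \forall x\, \bigl(Fx \rightarrow \exists! y\, (Gy \wedge Rxy)\bigr) \wedge
  \forall y\, \bigl(Gy \rightarrow \exists! x\, (Fx \wedge Rxy)\bigr) \bigr]
```

Part of the impredicativity at issue is that the numbers #F are themselves objects in the first-order domain over which the concepts F and G are defined, so instances of HP can involve concepts defined in terms of the number operator itself.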