“Knowledge ascriptions and the psychological consequences of thinking about error,” Jennifer Nagel

Philosophical Quarterly, volume 60, number 239 (April 2010), pages 287-306

When attempting to determine whether the subject of an epistemology case has knowledge or mere true belief, many of us find that our judgments are influenced by whether the case explicitly describes a way in which the subject’s belief could have turned out to be false. Nagel uses the following cases to illustrate this phenomenon:

(a)  John A. Doe is in a furniture store. He is looking at a bright red table under normal lighting conditions. He believes the table is red. Q: Does he know that the table is red?

(b)  John B. Doe is in a furniture store. He is looking at a bright red table under normal lighting conditions. He believes the table is red. However, a white table under red lighting would look exactly the same to him, and he has not checked whether the lighting is normal, or whether there might be a red spotlight shining on the table. Q: Does he know that the table is red?

Whereas most of us are inclined to judge that John A. Doe (henceforth ‘A’) knows that the table is red when we consider case (a), many of us judge that John B. Doe (henceforth ‘B’) does not know that the table is red when we consider case (b).

Non-skeptical invariantists hold that A and B both know that the table is red, and that the intuition that B does not know that the table is red is misleading. Thus they are obliged to provide some explanation for why many of us have this misleading intuition. One explanation, offered separately by Hawthorne and Williamson, appeals to the psychological phenomenon known as the availability heuristic, a pattern of thought whereby one estimates an event’s probability on the basis of how easily one can recall or imagine events of that type occurring. The availability heuristic can lead to error, since some types of event are both very improbable and very easily brought to mind; e.g., stranger abduction. Hawthorne and Williamson’s idea is that if certain error possibilities come easily to mind when considering a case, then the availability heuristic can lead one to overestimate the likelihood that the subject of the case is mistaken. Since knowledge is, plausibly, incompatible with a sufficiently grave danger of error, this can in turn lead one to incorrectly judge that the subject does not know.

Nagel has two goals in this article: first, to argue that the preceding explanation based on the availability heuristic is not empirically or conceptually plausible (I will focus on Nagel’s empirical objections in what follows); second, to propose an alternative explanation that will also be congenial to non-skeptical invariantists, but that appeals to a different psychological phenomenon, epistemic egocentrism.

Nagel’s empirical case against the Hawthorne-Williamson explanation is very compelling. It consists of three objections, each supported by careful discussion of empirical work in psychology examining the conditions under which the availability heuristic, and heuristics in general, are deployed. The point of these objections is, roughly, that we should not expect the availability heuristic to be deployed in the situation in which a person considers case (b) and has the intuition that B does not know that the table is red.

First, for the Hawthorne-Williamson explanation to succeed, the possibility that the table is illuminated by a red spotlight must be easily imagined when considering case (b), but far less easily imagined when considering case (a). Since the only difference between (a) and (b) is that (b) explicitly mentions the red spotlight possibility, the explanation depends on supposing that merely mentioning a possibility makes it easy to imagine. But some possibilities are difficult to imagine even when they are explicitly mentioned. Nagel discusses a study in which subjects were presented with an abstractly described and unusual scenario in which they contract a fictitious disease; these subjects did not overestimate the probability of contracting the disease, because the scenario was difficult to imagine and the availability heuristic therefore did not kick in. So why think that the red spotlight possibility, which is itself quite unusual, becomes easy to imagine simply by being mentioned?

Second, the Hawthorne-Williamson explanation assumes that, provided the red spotlight possibility is easily brought to mind, people will overestimate its probability. But this overlooks the spontaneous discounting of availability: when there is an obvious explanation for why a certain type of event is present to a subject’s mind, such as a researcher explicitly mentioning it just before asking a question, subjects do not overestimate its probability. So we should not expect people who consider case (b) to overestimate the probability of the red spotlight possibility.

Third, and finally, Nagel notes that heuristics tend to weaken or evaporate when subjects are placed in a context where careful reflection is called for. But these conditions obtain in the epistemology seminar room, where the intuition to be explained is presumably most common. It is therefore unlikely that the availability heuristic can explain the intuition in the situations in which it is most commonly elicited.

Nagel’s own explanation for our differing intuitions with respect to (a) and (b) appeals to epistemic egocentrism, our tendency to implicitly assume that other people share more of our beliefs and concerns than they in fact do. In brief, Nagel’s explanation is this: when we read case (b) we falsely assume that B is, like us, concerned with the red spotlight possibility. But B does not bother to look up and check whether there is a red spotlight above the table. This indicates that B is in a distracted or careless state of mind that is in tension with supposing that B knows the table is red. Nagel offers this explanation as an empirical hypothesis, and she briefly sketches the kind of empirical work that would be required to confirm it.
