Browse by Author
| No. | Title/Abstract | Author(s) | Volume/Issue | Date |
| --- | --- | --- | --- | --- |
| 11 | **Lying and Deception.** According to the standard philosophical definition of lying, you lie if you say something that you believe to be false with the intent to deceive. Recently, several philosophers have argued that an intention to deceive is not a necessary condition on lying. But even if they are correct, it might still be suggested that the standard philosophical definition captures the type of lie that philosophers are primarily interested in (viz., lies that are intended to deceive). In this paper, I argue that the standard philosophical definition is not adequate as a definition of deceptive lying either. I then suggest two plausible alternative definitions of this concept. | Don Fallis | vol. 10 | November 2010 |
| 04 | **Groundwork for an Explanationist Account of Epistemic Coincidence.** Many philosophers hold out hope that some final condition on knowledge will allow us to overcome the limitations of the classic "justified true belief" analysis. The most popular intuitive glosses on this condition frame it as an absence of epistemic coincidence (accident, luck). In this paper, I lay the groundwork for an explanationist account of epistemic coincidence—one according to which, roughly, beliefs are non-coincidentally true if and only if they bear the right sort of explanatory relation to the truth. The paper contains both positive arguments for explanationism and negative arguments against its competitors: views that understand coincidence in terms of causal, modal, and/or counterfactual relations. But the relationship between these elements is tighter than typical. I aim to show not only that explanationism is independently plausible, and superior to its competitors, but also that it helps make sense of both the appeal and failings of those competitors. | David Faraci | vol. 19 | 2019 |
| 23 | **Imprecise Chance and the Best System Analysis.** Much recent philosophical attention has been devoted to the prospects of the Best System Analysis (BSA) of chance for yielding high-level chances, including statistical mechanical and special science chances. But a foundational worry about the BSA lurks: there don’t appear to be uniquely correct measures of the degree to which a system exhibits theoretical virtues, such as simplicity, strength, and fit. Nor does there appear to be a uniquely correct exchange rate at which the theoretical virtues trade off against one another in the determination of an overall best system. I argue that there’s no robustly best system for our world – no system that comes out best under every reasonable measure of the theoretical virtues and exchange rate between them – but rather a set of ‘tied-for-best’ systems: a set of very good systems, none of which is robustly best. Among the tied-for-best systems are systems that entail differing high-level probabilities. I argue that the advocate of the BSA should conclude that the high-level chances for our world are imprecise. | Luke Fenton-Glynn | vol. 19 | 2019 |
| 02 | **Decisions, Diachronic Autonomy, and the Division of Deliberative Labor.** It is often argued that future-directed decisions are effective at shaping our future conduct because they give rise, at the time of action, to a decisive reason to act as originally decided. In this paper, I argue that standard accounts of decision-based reasons are unsatisfactory. For they focus either on tie-breaking scenarios or cases of self-directed distal manipulation. I argue that future-directed decisions are better understood as tools for the non-manipulative, intrapersonal division of deliberative labor over time. A future-directed decision to ϕ gives rise to a defeasible exclusionary reason to ϕ. This reason is grounded on the default authority that is normally granted to one’s prior self as an “expert” deliberator. I argue that this kind of exclusionary reason is the only one that can account for the effectiveness of future-directed decisions at shaping our diachronic agency without violating our autonomy over time. | Luca Ferrero | vol. 10 | February 2010 |
| 12 | **Epistemology from an Evaluativist Perspective.** The paper presents a kind of normative anti-realist view of epistemology, in the same ballpark as recent versions of expressivism. But the primary focus of the paper is less on this meta-epistemological view itself than on how it should affect ground-level issues in epistemology: for instance, how it should deal with certain forms of skepticism, and how it allows for fundamental revision in epistemic practices (deductive, inductive and perceptual). It is hoped that these methodological consequences will seem attractive independent of the normative anti-realism. Indeed, some normative realists seem to embrace the view on skepticism, but it is argued that their position is unstable: the realism undermines the methodology. The general theme of the paper is that the issue of normative realism is deeply entwined with issues of methodology, in strong contrast to the common claim that meta-epistemological views in the tradition of expressivism have no first-order impact. | Hartry Field | vol. 18 | June 2018 |
| 02 | **The Question of Realism.** This paper distinguishes two kinds of realist issue -- the issue of whether the propositions of a given domain are factual and the issue of whether they are fundamental. It criticizes previous accounts of what these issues come to and suggests that they are to be understood in terms of a basic metaphysical concept of reality. This leaves open the question of how such issues are to be resolved; and it is argued that this may be done through consideration of what grounds the facts of a given domain, when fundamentality is in question, and what grounds our engagement with the putative facts, when factuality is in question. | Kit Fine | vol. 1 | June 2001 |
| 07 | **The Obscurity of Internal Reasons.** This article suggests that the argument of Bernard Williams’ classic paper ‘Internal and External Reasons’ has been widely misunderstood. The first section sketches four variants of the Standard Argument, catalogs their weaknesses and observes the exegetical obstacles they face. The second section proposes an alternative reading immune to all these objections and better supported by the text and charity. On this interpretation, Williams gives one consistent argument that unites his central concerns with (i) the ‘explanatory dimension’ of reasons statements, (ii) their conceptual content, and (iii) the connection between reasons and deliberation. His argument is normally thought to be based on the common claim that reasons must be capable of motivating; I argue that it rather begins from a substantive analysis of the concept of a normative reason: that to believe that R is for you a reason for action just is to believe that R is an explanation of why you would act if you were to deliberate soundly. | Stephen Finlay | vol. 9 | July 2009 |
| 04 | **Value and Implicature.** Moral assertions express attitudes, but it is unclear how. This paper examines proposals by David Copp, Stephen Barker, and myself that moral attitudes are expressed as implicature (Grice), and Copp's and Barker's claim that this supports expressivism about moral speech acts. I reject this claim on the ground that implicatures of attitude are more plausibly conversational than conventional. I argue that Copp's and my own relational theory of moral assertions is superior to the indexical theory offered by Barker and Jamie Dreier, and that since the relational theory supports conversational implicatures of attitude, expressive conventions would be redundant. Furthermore, moral expressions of attitude behave like conversational and not conventional implicatures, and there are reasons for doubting that conventions of the suggested kind could exist. | Stephen Finlay | vol. 5 | July 2005 |
| 03 | **The Principle of Stability.** How can inferences from models to the phenomena they represent be justified when those models represent only imperfectly? Pierre Duhem considered just this problem, arguing that inferences from mathematical models of phenomena to real physical applications must also be demonstrated to be approximately correct when the assumptions of the model are only approximately true. Despite being little discussed among philosophers, this challenge was taken up (if only sometimes implicitly) by mathematicians and physicists both contemporaneous with and subsequent to Duhem, yielding a novel and rich mathematical theory of stability with epistemological consequences. | Samuel C. Fletcher | vol. 20 | 2020 |
| 22 | **Semantics and the Plural Conception of Reality.** According to the singular conception of reality, there are objects and there are singular properties, i.e. properties that are instantiated by objects separately. It has been argued that semantic considerations about plurals give us reasons to embrace a plural conception of reality. This is the view that, in addition to singular properties, there are plural properties, i.e. properties that are instantiated jointly by many objects. In this article, I propose and defend a novel semantic account of plurals which dispenses with plural properties and thus undermines the semantic argument in favor of the plural conception of reality. | Salvatore Florio | vol. 14 | July 2014 |
| 01 | **The Dear Self.** Frankfurt argues that self-love is the purest and -- paradoxically, perhaps -- most disinterested form of love. | Harry Frankfurt | vol. 1 | January 2001 |
| 15 | **Austerity and Illusion.** Many contemporary theorists charge that naïve realists are incapable of accounting for illusions. Various sophisticated proposals have been ventured to meet this charge. Here, we take a different approach and dispute whether the naïve realist owes any distinctive account of illusion. To this end, we begin with a simple, naïve account of veridical perception. We then examine the case that this account cannot be extended to illusions. By reconstructing an explicit version of this argument, we show that it depends critically on the contention that perceptual experience is diaphanous, or more minimally and precisely, that there can be no difference in phenomenal properties between two experiences without a difference in the scenes presented in those experiences. Finding no good reason to accept this claim, we develop and defend a simple, naïve account of both veridical perception and illusion, here dubbed Simple, Austere Naïve Realism. | Craig French; Ian Phillips | vol. 20 | 2020 |
| 07 | **Kant's Empirical Account of Human Action.** In the first Critique, Kant says, "[A]ll the actions of a human being are determined in accord with the order of nature," adding that "if we could investigate all the appearances . . . there would be no human action we could not predict with certainty." Most Kantian treatments of human action discuss action from a practical perspective, according to which human beings are transcendentally free, and thus do not sufficiently lay out Kant's empirical, causal description of human action. Drawing on Kant's lectures in empirical psychology and his anthropological writings, this paper offers a clear and detailed elucidation of Kant's empirical account of human action. After explaining the connection between cognitions, feelings, desires, and actions, I show how the lower faculty of desire is governed by various instincts, inclinations, and propensities, and how the higher faculty of desire is governed by (empirical) character. I also discuss how character and inclinations arise from natural human propensities combined with other empirical causes. By looking at both Kant's faculty psychology and his account of predispositions, I lay out an overall Kantian framework for explaining any kind of human action. | Patrick R. Frierson | vol. 5 | December 2005 |
| 05 | **Touch Without Touching.** In this paper, I argue that in touch, as in vision and audition, we can and often do perceive objects and properties even when we are not in direct or even apparent bodily contact with them. Unlike those senses, however, touch experiences require a special kind of mutually interactive connection between our sensory surfaces and the objects of our experience. I call this constraint the Connection Principle. This view has implications for the proper understanding of touch, and perceptual reference generally. In particular, spelling out the implications of this principle yields a rich and compelling picture of the spatial character of touch. | Matthew Fulkerson | vol. 12 | February 2012 |
| 28 | **Deontic Modality and the Semantics of Choice.** I propose a unified solution to two puzzles: Ross's puzzle (the apparent failure of 'ought φ' to entail 'ought [φ or ψ]') and free choice permission (the apparent fact that 'may [φ or ψ]' entails both 'may φ' and 'may ψ'). I begin with a pair of cases from the decision theory literature illustrating the phenomenon of act dependence, where what an agent ought to do depends on what she does. The notion of permissibility distilled from these cases forms the basis for my analysis of 'may' and 'ought'. This framework is then combined with a generalization of the classical semantics for disjunction — equivalent to Boolean disjunction on the diagonal, but with a different two-dimensional character — that explains the puzzling facts in terms of semantic consequence. | Melissa Fusco | vol. 15 | October 2015 |