Though my research interests range widely across epistemology and the philosophy of language, much of my work over the past several years has focused on questions concerning the metasemantics and epistemology of nonempirical areas of discourse such as logic and mathematics. More recently I have also taken up, as a side project, the question of how to integrate a reasonable approach to higher-order evidence into a broadly Bayesian picture of belief updating. Below are brief descriptions of a few of the things I’m working on.

In progress

Naturalism and the A Priori or: The Inevitability of Conventionalism (under contract at Cambridge University Press)
I explain why conventionalism—i.e., the doctrine that certain sentences of our language are true by convention alone—provides the only real hope of a satisfying, naturalist-friendly explanation of our knowledge in areas of discourse such as logic and mathematics. I then develop a conventionalist view that remains attractive even in the face of objections that have almost universally been taken to be decisive. Material from the manuscript is available on request.

“Whence admissibility constraints? From inferentialism to tolerance”
I argue that, despite what most inferentialists insist, there’s no inferentialist-friendly way to motivate constraints on admissibility, which means badly behaved expressions like Prior’s ‘tonk’ turn out, from an inferentialist perspective, to be legitimate. I then explain why this isn’t actually a problem for inferentialism.
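For readers unfamiliar with Prior’s connective, the rules standardly used to present it (a gloss on my part, not language from the paper) are

\[
\frac{A}{A \ \text{tonk}\ B}\ \text{(tonk-I)}
\qquad
\frac{A \ \text{tonk}\ B}{B}\ \text{(tonk-E)}
\]

so that chaining the two licenses an inference from any A to any B, trivializing the consequence relation.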

“The omega rule and the Categoricity Problem”, with Julien Murzi
We argue that Jared Warren’s recent argument for the followability of the omega rule is unconvincing but that this is no problem for inferentialism: despite several recent claims to the contrary, the followability of that rule plays no essential role in an inferentialist-friendly account of either the categoricity of the quantifiers or the determinacy of arithmetic.
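For reference, the omega rule at issue, on its usual formulation in the language of arithmetic (again a gloss rather than a quotation from the paper), is the infinitary rule

\[
\frac{\varphi(0),\ \ \varphi(1),\ \ \varphi(2),\ \ \ldots}{\forall n\, \varphi(n)}
\]

which licenses the inference to a universal generalization from all of its numerical instances.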