Inferential Role Semantics for Natural Language

PhD Thesis, 2017

Peter Blouw

Abstract

The most general goal of semantic theory is to explain facts about language use. In keeping with this goal, I introduce a framework for thinking about linguistic expressions in terms of (a) the inferences they license, (b) the behavioral predictions that their uses thereby sustain, and (c) the affordances that they provide to language users in virtue of these inferential and predictive involvements. Within this framework, linguistic expressions acquire meanings by regulating social practices that involve “intentional interpretation,” wherein people explain and predict one another’s behavior through linguistically specified mental state attributions. Developing a theory of meaning therefore requires formalizing the inferential roles that determine how linguistic expressions license predictions in the context of intentional interpretation. Accordingly, the view I develop is an inferential role semantics for natural language. To describe this semantics, I take advantage of recently developed techniques in the field of natural language processing. I introduce a model that assigns inferential roles to arbitrary linguistic expressions by learning from examples of how sentences are distributed as premises and conclusions in a space of possible inferences. I then empirically evaluate the model’s ability to generate accurate entailments for novel sentences not used as training examples. I argue that this model takes a small but important step towards codifying the meanings of the expressions it manipulates. Next, I examine the theoretical implications of this work with respect to debates about the compositionality of language, the relationship between language and cognition, and the relationship between language and the world. With respect to compositionality, I argue that the debate is really about generalization in language use, and that the required sort of generalization can be achieved by “interpolating” between familiar examples of correct inferential transitions. With respect to the relationship between thought and language, I argue that it is a mistake to try to derive a theory of natural language semantics from a prior theory of mental representation because theories of mental representation invoke the sort of intentional interpretation at play in language use from the get-go. With respect to the relationship between language and the world, I argue that questions about truth conditions and reference relations are best thought of in terms of questions about the norms governing language use. These norms, in turn, are best characterized in primarily inferential terms. I conclude with an all-things-considered evaluation of my theory that demonstrates how it overcomes a number of challenges associated with semantic theories that take inference, rather than reference, as their starting point.
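
To make the learning setup described above concrete, the sketch below shows one crude way to learn premise-to-conclusion transitions from examples and generalize them to a novel sentence by interpolating between familiar inferences. It is purely illustrative and is not the model developed in the thesis: the bag-of-words sentence representations, the least-squares mapping, and the toy data are all assumptions introduced here for the sake of a small, runnable example.

    import numpy as np

    # Toy (premise, conclusion) pairs standing in for a corpus of licensed inferences.
    train_pairs = [
        ("a man is walking his dog", "a person is outside with an animal"),
        ("a woman is slicing bread", "a person is preparing food"),
        ("two kids are playing soccer", "children are playing a game"),
    ]
    test_premise = "a man is walking a cat"      # novel sentence, not in training data
    candidates = [c for _, c in train_pairs]     # candidate conclusions to score

    # Shared vocabulary over every sentence we will need to embed.
    all_sentences = [s for pair in train_pairs for s in pair] + [test_premise]
    vocab = {w: i for i, w in enumerate(sorted({w for s in all_sentences for w in s.split()}))}

    def embed(sentence):
        """Bag-of-words count vector (a deliberately crude sentence representation)."""
        v = np.zeros(len(vocab))
        for w in sentence.split():
            v[vocab[w]] += 1.0
        return v

    P = np.stack([embed(p) for p, _ in train_pairs])   # premises, shape (n, |V|)
    C = np.stack([embed(c) for _, c in train_pairs])   # conclusions, shape (n, |V|)

    # Least-squares map W with P @ W ~= C: generalizing to a novel premise amounts to
    # interpolating between the premise-to-conclusion transitions seen in training.
    W, *_ = np.linalg.lstsq(P, C, rcond=None)

    def cosine(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

    predicted = embed(test_premise) @ W
    best = max(candidates, key=lambda c: cosine(predicted, embed(c)))
    print("novel premise:          ", test_premise)
    print("best-scoring conclusion:", best)

The thesis's actual model, as the abstract notes, is trained on a large space of possible inferences and evaluated on held-out sentences; this toy only mirrors the structure of that learning problem, not its scale or representations.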

Thesis

School: University of Waterloo
Type: PhD thesis
