Computational Linguistics

Type Theory Semantics

Type theory provides a rich framework for natural language semantics in which meanings are organized by types, enabling fine-grained distinctions and compositional control over semantic well-formedness.

e : τ — the typing judgment stating that expression e has type τ

Type theory, originating in Russell's efforts to resolve logical paradoxes and developed extensively by Church, Martin-Löf, and others, provides a principled framework for organizing linguistic meanings. In type-theoretic semantics, every meaningful expression is assigned a type that determines how it can combine with other expressions. The simply typed lambda calculus used in Montague Grammar employs basic types e (entities) and t (truth values) with function types built by the arrow constructor. More expressive type theories introduce dependent types, polymorphism, and rich type hierarchies that capture nuanced aspects of natural language meaning.
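The basic and function types of the simply typed fragment can be represented directly. The following is a minimal sketch in Python (the class and function names are illustrative, not a standard library):

```python
from dataclasses import dataclass

# Basic types of the Montagovian fragment: e (entities) and t (truth values).
@dataclass(frozen=True)
class Basic:
    name: str  # "e" or "t"

# Function types <a, b>: functions from type a to type b.
@dataclass(frozen=True)
class Arrow:
    dom: object  # domain type
    cod: object  # codomain type

def show(ty):
    """Render a type in the angle-bracket notation used in this article."""
    if isinstance(ty, Basic):
        return ty.name
    return f"<{show(ty.dom)},{show(ty.cod)}>"

E, T = Basic("e"), Basic("t")
iv = Arrow(E, T)            # intransitive verb: <e,t>
tv = Arrow(E, Arrow(E, T))  # transitive verb: <e,<e,t>>
print(show(tv))             # prints <e,<e,t>>
```

The frozen dataclasses give structural equality for free, which is exactly what type-compatibility checks need.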

Simple Types in Montague Semantics

Semantic Type System

Basic types: e (entity), t (truth value), s (possible world)

Function types: if a, b are types, then ⟨a, b⟩ is a type

Common lexical types:
Proper noun: e
Intransitive verb: ⟨e, t⟩
Transitive verb: ⟨e, ⟨e, t⟩⟩
Quantifier: ⟨⟨e, t⟩, ⟨⟨e, t⟩, t⟩⟩
Sentence: t

The simple type system enforces semantic well-formedness: only expressions of compatible types can combine. A transitive verb of type ⟨e, ⟨e, t⟩⟩ first takes its object (type e) to yield a VP meaning of type ⟨e, t⟩, which then takes the subject. Quantified noun phrases like "every student" receive the generalized quantifier type ⟨⟨e, t⟩, t⟩, a function that takes a property and returns a truth value. This type assignment elegantly handles the semantic difference between proper names and quantified NPs.
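The combination regime described above can be sketched as a small type checker. In this Python sketch (lexicon entries and names are illustrative), function application succeeds only when the argument's type matches the function's domain:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Basic:
    name: str

@dataclass(frozen=True)
class Arrow:
    dom: object
    cod: object

E, T = Basic("e"), Basic("t")
ET = Arrow(E, T)

def apply_type(fn_ty, arg_ty):
    """Type of applying fn_ty to arg_ty, or None if the combination is ill-typed."""
    if isinstance(fn_ty, Arrow) and fn_ty.dom == arg_ty:
        return fn_ty.cod
    return None

lexicon = {
    "Mary": E,                      # proper noun: e
    "sleeps": ET,                   # intransitive verb: <e,t>
    "sees": Arrow(E, ET),           # transitive verb: <e,<e,t>>
    "every_student": Arrow(ET, T),  # generalized quantifier: <<e,t>,t>
}

# "sees Mary" -> VP of type <e,t>; "every student (sees Mary)" -> t
vp = apply_type(lexicon["sees"], lexicon["Mary"])
s = apply_type(lexicon["every_student"], vp)
assert vp == ET and s == T

# An ill-typed combination is rejected:
assert apply_type(lexicon["Mary"], lexicon["sleeps"]) is None
```

Note how the quantified subject applies to the VP (rather than the other way around), reflecting the generalized-quantifier type assignment in the table above.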

Rich Type Theories

Several researchers have proposed richer type theories for natural language semantics. Dependent type theory, following Martin-Löf, allows types to depend on terms, enabling types like "the type of proofs that S is true." Ranta (1994) applied dependent types to provide constructive semantic analyses of donkey anaphora, generics, and other phenomena. Type Theory with Records (TTR), developed by Cooper, uses record types to model situations and provides a framework for dialogue semantics and perceptual meaning.
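The record-type idea behind TTR can be glossed with a toy check: a record belongs to a record type when it supplies every field the type requires, with values of the required kinds. This Python sketch is a loose approximation only (field names are invented, and real TTR also supports dependent fields, which are omitted here):

```python
def check_record(record, record_type):
    """A record satisfies a record type if every required field is
    present with a value of the required (Python-level) type."""
    return all(
        field in record and isinstance(record[field], ty)
        for field, ty in record_type.items()
    )

# Toy record type for a simple "situation": an agent and a location.
Sit = {"agent": str, "location": str}

s1 = {"agent": "Kim", "location": "Paris", "time": "noon"}  # extra fields allowed
s2 = {"agent": "Kim"}                                       # missing a required field

assert check_record(s1, Sit)
assert not check_record(s2, Sit)
```

That records with extra fields still satisfy the type is the record-subtyping behavior that makes TTR natural for modeling partially specified situations.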

Coercive Subtyping and Selectional Restrictions

Modern type-theoretic approaches use coercive subtyping to model phenomena like logical polysemy and selectional restrictions. When "the newspaper" appears as subject of "fired the editor" (institution reading) versus "is on the table" (physical object reading), the type system can coerce the meaning between subtypes. Pustejovsky's Generative Lexicon and Asher's Type Composition Logic formalize these operations, providing a type-theoretic account of the flexibility and creativity of natural language meaning.
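The effect of such a coercion can be illustrated schematically: the predicate's selectional restriction determines which aspect of the polysemous noun is selected. In this Python sketch, the sense labels ("Institution", "PhysObject") and table entries are assumptions for exposition, not Pustejovsky's or Asher's actual formalism:

```python
# Aspects of a logically polysemous noun (illustrative values).
SENSES = {
    "newspaper": {"Institution": "the publishing organization",
                  "PhysObject": "the printed copy"},
}

# Selectional restriction (required subject type) of each predicate.
SELECTS = {
    "fired the editor": "Institution",
    "is on the table": "PhysObject",
}

def coerce(noun, predicate):
    """Pick the sense of `noun` that the predicate's type selects for."""
    required = SELECTS[predicate]
    return SENSES[noun][required]

assert coerce("newspaper", "fired the editor") == "the publishing organization"
assert coerce("newspaper", "is on the table") == "the printed copy"
```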

Applications in Computational Semantics

Type theory plays a central role in computational semantics. Type-logical grammars such as the Lambek calculus and its extensions use types as syntactic categories, with derivations corresponding to both syntactic parses and semantic compositions via the Curry-Howard correspondence. In modern NLP, the distributional compositional semantics program of Coecke, Sadrzadeh, and Clark uses pregroup types to guide the composition of vector space meanings, bridging formal and distributional semantics through categorical type theory.
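The pregroup types used in the Coecke–Sadrzadeh–Clark program license a sentence when the concatenated word types contract to the sentence type s. The following toy Python check handles only adjacent cancellations (x · xʳ → 1 and xˡ · x → 1), which suffices for simple sentences; it is a sketch, not the full pregroup calculus:

```python
# A simple pregroup type is a (base, adjoint) pair: adjoint 0 is the
# base type, +1 a right adjoint (x^r), -1 a left adjoint (x^l).
def reduces_to_sentence(types):
    """Stack-based contraction of adjacent adjoint pairs."""
    stack = []
    for base, adj in types:
        # x^k cancels with an adjacent x^(k+1): x . x^r -> 1, x^l . x -> 1
        if stack and stack[-1][0] == base and adj == stack[-1][1] + 1:
            stack.pop()
        else:
            stack.append((base, adj))
    return stack == [("s", 0)]

# "Mary sleeps": n . (n^r s) reduces to s
mary = [("n", 0)]
sleeps = [("n", 1), ("s", 0)]
assert reduces_to_sentence(mary + sleeps)

# "sleeps Mary": (n^r s) . n does not reduce to s
assert not reduces_to_sentence(sleeps + mary)
```

In the distributional compositional setting, the same reduction diagram tells the model which tensor contractions to perform on the word vectors.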

Practical semantic parsing systems also leverage type information to constrain the search space of possible logical forms. By assigning types to database entities and relations, parsers can prune semantically ill-typed candidate parses early in the derivation. Typed lambda calculus representations provide a natural interface between natural language and structured knowledge bases, supporting applications in question answering and information extraction.
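Type-based pruning of this kind can be sketched in a few lines. In this Python example, the schema, relations, and entities are invented for illustration; each relation declares the entity type it selects for, and candidate logical forms that violate the declaration are discarded before scoring:

```python
# Toy knowledge-base schema: each relation's required argument type.
SCHEMA = {
    "capital_of": "Country",  # capital_of : Country -> City
    "population": "City",     # population : City -> Int
}
ENTITY_TYPES = {"France": "Country", "Paris": "City"}

def well_typed(relation, entity):
    """A candidate (relation, entity) application is kept only if the
    entity's type matches the relation's declared argument type."""
    return SCHEMA[relation] == ENTITY_TYPES[entity]

candidates = [("capital_of", "France"), ("capital_of", "Paris"),
              ("population", "Paris"), ("population", "France")]
pruned = [c for c in candidates if well_typed(*c)]
assert pruned == [("capital_of", "France"), ("population", "Paris")]
```

Because ill-typed candidates are eliminated before any derivation is scored, the parser's search space shrinks combinatorially as logical forms grow.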

References

  1. Ranta, A. (1994). Type-Theoretical Grammar. Oxford University Press.
  2. Cooper, R. (2012). Type theory and semantics in flux. In R. Kempson, T. Fernando, & N. Asher (Eds.), Handbook of the Philosophy of Science: Philosophy of Linguistics (pp. 271–323). Elsevier. doi:10.1016/B978-0-444-51747-0.50009-8
  3. Asher, N. (2011). Lexical Meaning in Context: A Web of Words. Cambridge University Press. doi:10.1017/CBO9780511793936
  4. Coecke, B., Sadrzadeh, M., & Clark, S. (2010). Mathematical foundations for a compositional distributional model of meaning. Linguistic Analysis, 36(1–4), 345–384.