Head-Driven Phrase Structure Grammar (HPSG), developed by Carl Pollard and Ivan Sag in the late 1980s and early 1990s, is a constraint-based grammatical framework in which linguistic knowledge is expressed through richly structured feature structures organized by a type hierarchy. Unlike derivational grammars that generate strings through successive rule applications, HPSG treats grammatical well-formedness as the simultaneous satisfaction of constraints. The framework is strongly lexicalist, meaning that most grammatical information resides in the lexicon, and syntactic rules serve primarily to combine lexical entries according to general principles.
Typed Feature Structures
The basic data structure in HPSG is the typed feature structure — a directed graph in which nodes carry type labels and arcs carry feature labels. Types are organized in a multiple-inheritance hierarchy, allowing linguistic generalizations to be stated at the most abstract level and inherited by more specific types. Structure sharing (reentrancy) within feature structures is used to enforce agreement, subcategorization requirements, and the binding of long-distance dependencies.
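As an informal illustration, unification over typed structures can be sketched in Python. The type names, toy hierarchy, and dict encoding below are simplifications invented for this sketch; real systems compute greatest lower bounds in a bounded-complete type lattice and represent reentrancy with graph nodes rather than nested dicts.

```python
# A minimal sketch of typed feature structure unification (illustrative,
# not any particular HPSG system's API). Feature structures are nested
# dicts; the special key "*type*" holds the node's type label.

# Toy type hierarchy mapping each type to its parent; the names are
# assumptions for this sketch.
HIERARCHY = {"noun": "head", "verb": "head", "head": "top", "top": None}

def ancestors(t):
    """Yield t and all its ancestors up to the root."""
    while t is not None:
        yield t
        t = HIERARCHY.get(t)

def unify_types(t1, t2):
    """Return the more specific of two types if one subsumes the other.

    A real system would compute the greatest lower bound in the lattice;
    here we only handle the case where the types stand in a subtype chain.
    """
    if t2 in ancestors(t1):
        return t1            # t1 is at least as specific as t2
    if t1 in ancestors(t2):
        return t2
    return None              # incompatible types: unification fails

def unify(fs1, fs2):
    """Recursively unify two feature structures; None signals failure."""
    result = {}
    for feat in set(fs1) | set(fs2):
        if feat == "*type*":
            t = unify_types(fs1.get("*type*", "top"), fs2.get("*type*", "top"))
            if t is None:
                return None
            result["*type*"] = t
        elif feat in fs1 and feat in fs2:
            sub = unify(fs1[feat], fs2[feat])
            if sub is None:
                return None
            result[feat] = sub
        else:
            # One-sided features are carried over unchanged.
            result[feat] = fs1.get(feat, fs2.get(feat))
    return result

# A constraint demanding a nominal head, unified with an underspecified word:
subj_constraint = {"*type*": "top", "HEAD": {"*type*": "noun"}}
word = {"*type*": "top", "HEAD": {"*type*": "head"}, "NUM": {"*type*": "top"}}
print(unify(word, subj_constraint)["HEAD"])  # {'*type*': 'noun'}
```

Unification refines the underspecified HEAD value `head` to the more specific `noun`, while incompatible types (e.g. `noun` against `verb`) make the whole unification fail.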
A simplified AVM for the lexical entry of devours:

  [ PHON              ⟨devours⟩
    SYNSEM|LOCAL|CAT  [ HEAD   verb
                        SUBJ   ⟨NP_i⟩
                        COMPS  ⟨NP_j⟩ ]
    SYNSEM|LOCAL|CONT [ RELN   devour
                        AGENT  i
                        THEME  j ] ]
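The structure sharing expressed by the indices i and j in the AVM above can be modeled in Python by reusing one object in two positions, so that a constraint placed on one occurrence automatically holds of the other. The dict encoding and index values below are illustrative assumptions, not a real system's representation.

```python
# Illustrative encoding of the AVM for "devours" as nested Python dicts.
# Reentrancy (the i/j indices) is modeled by object identity: the subject
# NP's content and the AGENT role are literally the same object.

agent = {"INDEX": "i"}   # hypothetical index objects for illustration
theme = {"INDEX": "j"}

devours = {
    "PHON": ["devours"],
    "SYNSEM": {"LOCAL": {
        "CAT": {
            "HEAD": "verb",
            "SUBJ":  [{"HEAD": "noun", "CONT": agent}],   # NP_i
            "COMPS": [{"HEAD": "noun", "CONT": theme}],   # NP_j
        },
        "CONT": {"RELN": "devour", "AGENT": agent, "THEME": theme},
    }},
}

# Structure sharing: the AGENT value IS the subject's content, not a copy.
assert devours["SYNSEM"]["LOCAL"]["CONT"]["AGENT"] is \
       devours["SYNSEM"]["LOCAL"]["CAT"]["SUBJ"][0]["CONT"]
```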
Principles and Schemata
HPSG organizes grammatical knowledge through general principles, such as the Head Feature Principle (the head features of a phrase are shared with its head daughter), the Valence Principle (a phrase's valence requirements are those of its head daughter minus any daughters realized in that phrase), and the Semantics Principle (semantic content is projected from the head). These principles interact with a small number of immediate dominance schemata that license particular phrase types: head-complement, head-subject, head-adjunct, head-filler (for long-distance dependencies), and others.
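The interaction of these principles with the head-complement schema can be sketched as a function that licenses a phrase from a head daughter and one complement. The dict representation with HEAD, SUBJ, and COMPS keys, and the crude equality check on the selected complement, are simplifications for illustration.

```python
# A minimal sketch of the head-complement schema combined with the
# Head Feature Principle and the Valence Principle. Signs are dicts
# with HEAD, SUBJ, and COMPS keys; this is an illustrative toy, not
# a faithful HPSG implementation.

def head_complement(head_dtr, comp_dtr):
    """License a head-complement phrase, or return None if the head
    does not select a compatible complement."""
    if not head_dtr["COMPS"]:
        return None                      # head is already saturated
    selected = head_dtr["COMPS"][0]
    if selected != comp_dtr["HEAD"]:     # crude compatibility check
        return None                      # (real HPSG unifies SYNSEM values)
    return {
        "HEAD": head_dtr["HEAD"],        # Head Feature Principle
        "SUBJ": head_dtr["SUBJ"],        # subject not yet saturated
        "COMPS": head_dtr["COMPS"][1:],  # Valence Principle: slot cancelled
    }

devours  = {"HEAD": "verb", "SUBJ": ["NP"], "COMPS": ["NP"]}
the_cake = {"HEAD": "NP",   "SUBJ": [],     "COMPS": []}

vp = head_complement(devours, the_cake)
print(vp)  # {'HEAD': 'verb', 'SUBJ': ['NP'], 'COMPS': []}
```

The resulting VP inherits the verb's head features, has an empty COMPS list, and still awaits its subject, which a head-subject schema would saturate in the same fashion.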
Grammar Engineering Resources
The LinGO Grammar Matrix, developed at the University of Washington in collaboration with Stanford's LinGO laboratory, provides a cross-linguistically motivated starter kit for developing HPSG grammars of specific languages. It includes implementations of core phenomena such as agreement, case, coordination, and basic word order, allowing grammar engineers to rapidly develop broad-coverage precision grammars. The English Resource Grammar (ERG), one of the largest precision grammars ever built, is an HPSG grammar covering a substantial fragment of English with deep syntactic and semantic analysis.
Computational Implementation
HPSG grammars are implemented using unification-based parsing platforms such as the LKB (Linguistic Knowledge Builder), PET (a platform for experimentation with efficient HPSG processing techniques), and agree. Parsing proceeds by unifying feature structures according to the grammar's schemata and principles, with subsumption-based packing and selective unpacking for efficiency. Parsing with unrestricted HPSG grammars is undecidable in the worst case, but practical grammars are carefully engineered for efficient parsing through constraint ordering and type-driven reduction of the search space.
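The subsumption test behind ambiguity packing can be sketched as follows: a chart edge may be packed under an existing edge when the existing feature structure is at least as general, so the packed edge need not be expanded unless it is later selected. The dict-based structures and edge names below are illustrative assumptions that ignore types and reentrancy.

```python
# Sketch of the subsumption test used for ambiguity packing in
# unification-based chart parsers. Feature structures are nested dicts;
# types and reentrancy are omitted for brevity.

def subsumes(general, specific):
    """True if every constraint in `general` is satisfied by `specific`."""
    if not isinstance(general, dict):
        return general == specific       # atomic values must match exactly
    if not isinstance(specific, dict):
        return False
    return all(feat in specific and subsumes(val, specific[feat])
               for feat, val in general.items())

# Hypothetical chart edges: edge_b carries one more constraint than edge_a.
edge_a = {"HEAD": "verb", "SUBJ": ["NP"]}
edge_b = {"HEAD": "verb", "SUBJ": ["NP"], "TENSE": "pres"}

print(subsumes(edge_a, edge_b))  # True: edge_b can be packed under edge_a
print(subsumes(edge_b, edge_a))  # False: edge_a lacks the TENSE constraint
```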
HPSG has been applied extensively to grammar engineering for machine translation, information extraction, and linguistic analysis. The DELPH-IN (Deep Linguistic Processing with HPSG Initiative) consortium maintains a collection of tools and grammars covering over a dozen languages. The framework's emphasis on precision, deep linguistic analysis, and cross-linguistic applicability makes it one of the most successful grammar formalisms for applied computational linguistics.