The branch of linguistics known as generative linguistics rests on the idea of a generative grammar, a set of rules that generates an endless variety of sentences that are considered grammatically correct and no sentences that aren’t. The set of assumptions underpinning the philosophy of generative linguistics includes two important ideas. The first is that the human ability for language is innate, and the second is that human language is based on a set of logical rules that allow a speaker to produce novel sentences that can be understood by others who speak the same language.
The formal rules that model the human cognitive ability to create language are said to be structure-dependent. In other words, the rules of a generative grammar must refer to the structural units of the language. Once those structural units are defined, algorithmic rules can be written to model the cognitive language-building processes that underlie spoken and written language.
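The idea of structure-dependent, algorithmic rules can be made concrete with a toy rewrite grammar. The sketch below is purely illustrative; the grammar symbols (S, NP, VP, and so on) and vocabulary are hypothetical stand-ins, not a grammar proposed by any linguist. Each rule expands a structural unit into smaller units until only words remain:

```python
import random

# A toy generative grammar (hypothetical rules, for illustration only).
# Each nonterminal maps to a list of possible expansions:
# S = sentence, NP = noun phrase, VP = verb phrase,
# Det = determiner, N = noun, V = verb.
GRAMMAR = {
    "S":   [["NP", "VP"]],
    "NP":  [["Det", "N"]],
    "VP":  [["V", "NP"]],
    "Det": [["the"], ["a"]],
    "N":   [["dog"], ["cat"]],
    "V":   [["sees"], ["chases"]],
}

def generate(symbol="S", rng=random):
    """Recursively expand a symbol into a list of words."""
    if symbol not in GRAMMAR:          # a terminal: emit the word itself
        return [symbol]
    expansion = rng.choice(GRAMMAR[symbol])
    words = []
    for sym in expansion:
        words.extend(generate(sym, rng))
    return words

print(" ".join(generate()))  # e.g. "the dog chases a cat"
```

Because every expansion refers only to structural units, not to positions in a flat word string, the rules are structure-dependent in the sense described above: the same small rule set can produce many distinct sentences, all of them well-formed with respect to the grammar.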
The concept of a generative grammar was first applied in the field of syntactic theory, where it was employed in attempts to describe the human ability to construct sentences. The generative approach has since been vigorously extended and has proved useful in phonology, morphology, and semantics. There are now many different models of generative grammar, each attempting to explain how the human mind processes language.
Several assumptions underpin the philosophy of generative linguistics. Foremost is the idea that the human ability for natural language is innate. Additionally, the generative approach assumes that a speaker of a given language must have command of certain linguistic knowledge in order to produce grammatically correct, or well-formed, sentences in that language. This linguistic knowledge theoretically includes a generative grammar that allows the speaker to construct sentences that have never before been uttered. Other speakers of the language who hear those sentences use the same grammar to decode them, and are thus able to understand sentences they have never heard before.
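The claim that a hearer can use the same grammar to decode novel sentences can be sketched with the same kind of toy grammar: a sentence counts as well-formed exactly when the grammar can generate it. The grammar and vocabulary below are hypothetical, chosen only to make the recognition step concrete; this toy grammar is finite, so brute-force enumeration suffices, which would not hold for a recursive grammar:

```python
from itertools import product

# The same toy grammar used for generation (hypothetical, illustrative).
GRAMMAR = {
    "S":   [["NP", "VP"]],
    "NP":  [["Det", "N"]],
    "VP":  [["V", "NP"]],
    "Det": [["the"], ["a"]],
    "N":   [["dog"], ["cat"]],
    "V":   [["sees"], ["chases"]],
}

def all_sentences(symbol="S"):
    """Enumerate every word sequence this finite grammar can generate."""
    if symbol not in GRAMMAR:          # terminal word
        return [[symbol]]
    results = []
    for expansion in GRAMMAR[symbol]:
        # combine each sub-symbol's possibilities in order
        parts = [all_sentences(sym) for sym in expansion]
        for combo in product(*parts):
            words = []
            for part in combo:
                words.extend(part)
            results.append(words)
    return results

def is_well_formed(sentence):
    """A hearer's check: does the shared grammar generate this sentence?"""
    return sentence.split() in all_sentences()

print(is_well_formed("the cat sees a dog"))   # True
print(is_well_formed("cat the sees dog a"))   # False
```

The point of the sketch is that speaker and hearer need not share a memorized list of sentences, only the rules: any string the speaker's grammar can build, the hearer's identical grammar can recognize, even if neither party has encountered that particular sentence before.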
The first technical use of the term generative within the discipline of linguistics occurred in 1957, when the linguist Noam Chomsky published a book entitled Syntactic Structures. In the book, Chomsky proposed a theory of generative grammar that he called “transformational grammar.” Many consider the publication of Syntactic Structures to be the birth of generative linguistics as a subfield of linguistics.