Transformational grammar

In linguistics, transformational grammar (TG) or transformational-generative grammar (TGG) is part of the theory of generative grammar, especially of natural languages. It considers grammar to be a system of rules that generate exactly those combinations of words that form grammatical sentences in a given language and involves the use of defined operations (called transformations) to produce new sentences from existing ones.

The method is commonly associated with the American linguist Noam Chomsky's biologically oriented concept of language. In logical syntax, however, the term "transformation" was introduced by Rudolf Carnap in his application of Alfred North Whitehead's and Bertrand Russell's Principia Mathematica. In that context, the addition of the values one and two, for example, transforms into the value three; many types of transformation are possible.[1]

Generative algebra was first introduced to general linguistics by the structural linguist Louis Hjelmslev,[2] although the method was described before him by Albert Sechehaye in 1908.[3] Chomsky adopted the concept of transformation from his teacher Zellig Harris, who followed the American descriptivist separation of semantics from syntax. Hjelmslev's structuralist conception, which includes semantics and pragmatics, is incorporated into functional grammar.[4]

Historical context

Transformational analysis is a part of the classical Western grammatical tradition based on the metaphysics of Plato and Aristotle and on the grammar of Apollonius Dyscolus. These were joined to establish linguistics as a natural science in the Middle Ages. Transformational analysis was later developed by humanistic grammarians such as Thomas Linacre (1524), Julius Caesar Scaliger (1540), and Sanctius (Francisco Sánchez de las Brozas, 1587). The core observation is that grammatical rules alone do not constitute elegance, so learning to use a language correctly requires certain additional effects such as ellipsis. It is more desirable, for example, to say "Maggie and Alex went to the market" than to express the full underlying idea "Maggie went to the market and Alex went to the market". Such phenomena were described in terms of understood elements. In modern terminology, the first expression is the surface structure of the second, and the second expression is the deep structure of the first. The notions of ellipsis and restoration are complementary: the deep structure is converted into the surface structure and restored from it by what were later known as transformational rules.[5]

It was generally agreed that a degree of simplicity improves the quality of speech and writing, but closer inspection of the deep structures of different types of sentences led to many further insights, such as the concept of agent and patient in active and passive sentences. Transformations were given an explanatory role. Sanctius, among others, argued that surface structures pertaining to the choice of grammatical case in certain Latin expressions could not be understood without the restoration of the deep structure. His full transformational system included

  1. ellipsis, the deletion of understood semantic or syntactic elements;
  2. pleonasm, the occurrence of syntactically superfluous elements;
  3. syllepsis, the violation of a rule of agreement;
  4. hyperbaton, the violation of normal word order.[6]

Transformational analysis fell out of favor with the rise of historical-comparative linguistics in the 19th century, and the historical linguist Ferdinand de Saussure argued for limiting linguistic analysis to the surface structure.[7] By contrast, Edmund Husserl, in his 1921 elaboration of the 17th-century Port-Royal Grammar, based his version of generative grammar on classical transformations (Modifikationen).[8] Husserl's concept influenced Roman Jakobson, who advocated it in the Prague linguistic circle, which was likewise influenced by Saussure.[9] Based on opposition theory, Jakobson developed his theory of markedness and, having moved to the United States, influenced Noam Chomsky, especially through Morris Halle. Chomsky and his colleagues, including Jerrold Katz and Jerry Fodor, developed what they called transformational generative grammar in the 1960s.[10][11]

The transformational grammar of the 1960s differs from Renaissance linguistics in its relation to the theory of language. While the humanistic grammarians considered language man-made, Chomsky and his colleagues exploited markedness and transformation theory in their attempt to uncover an innate grammar.[12] It was later proposed that such a grammar arises from a brain structure caused by a mutation in humans.[13] In particular, generative linguists tried to reconstruct the underlying innate structure on the basis of deep structure and unmarked forms. Thus, in contrast to the humanistic classics, the modern notion of universal grammar suggests that the basic word order of biological grammar is unmarked and unmodified in transformational terms.[14][15]

Transformational generative grammar included two kinds of rules: phrase-structure rules and transformational rules, but scholars abandoned the project in the 1970s. Based on Chomsky's concept of I-language as the proper subject of linguistics as a cognitive science, Katz and Fodor had conducted their research on English grammar through introspection. Their findings could not be generalized cross-linguistically and therefore could not belong to an innate universal grammar.[16]

The concept of transformation was nevertheless not fully rejected. In Chomsky's Minimalist Program of the 1990s, transformations pertain to the lexicon and the move operation.[16] This more flexible approach offers better prospects of universalizability. It is, for example, argued that the English SVO word order (subject, verb, object) represents the initial state of the cognitive language faculty. In languages like Classical Arabic, which has a basic VSO order, sentences are derived by the move operation from the underlying SVO order, which serves as the matrix on which all sentences in all languages are reconstructed. There is therefore no longer a need for separate deep and surface levels, or for additional rules of conversion between the two. According to Chomsky, this solution provides sufficient descriptive and explanatory adequacy: descriptive because all languages are analyzed on the same matrix, and explanatory because the analysis shows in which particular way a sentence is derived from the (hypothesized) initial cognitive state.[17][16]

Basic mechanisms

Deep structure and surface structure

While Chomsky's 1957 book Syntactic Structures followed Harris's distributionalist practice of excluding semantics from structural analysis, his 1965 book Aspects of the Theory of Syntax developed the idea that each sentence in a language has two levels of representation: a deep structure and a surface structure.[18][19] These are, however, not quite identical to Hjelmslev's content plane and expression plane.[2] The deep structure represents the core semantic relations of a sentence and is mapped, via transformations, onto the surface structure, which follows the phonological form of the sentence very closely. The concept of transformations had been proposed before the development of deep structure as a means of increasing the mathematical and descriptive power of context-free grammars; deep structure, in turn, was developed largely for technical reasons related to early semantic theory. Chomsky emphasized the importance of modern formal mathematical devices in the development of grammatical theory:

But the fundamental reason for [the] inadequacy of traditional grammars is a more technical one. Although it was well understood that linguistic processes are in some sense "creative," the technical devices for expressing a system of recursive processes were simply not available until much more recently. In fact, a real understanding of how a language can (in Humboldt's words) "make infinite use of finite means" has developed only within the last thirty years, in the course of studies in the foundations of mathematics.

— Aspects of the Theory of Syntax

Transformations

The usual usage of the term "transformation" in linguistics refers to a rule that takes an input, typically called the deep structure (in the Standard Theory) or D-structure (in the extended standard theory or government and binding theory), and changes it in some restricted way to result in a surface structure (or S-structure). In TG, phrase structure rules generate deep structures. For example, a typical transformation in TG is subject-auxiliary inversion (SAI). That rule takes as its input a declarative sentence with an auxiliary, such as "John has eaten all the heirloom tomatoes", and transforms it into "Has John eaten all the heirloom tomatoes?" In the original formulation (Chomsky 1957), those rules were stated as rules that held over strings of terminals, constituent symbols or both.

X NP AUX Y → X AUX NP Y

(NP = Noun Phrase and AUX = Auxiliary)
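
As a rough illustration of how such a string-based rule might be modelled, the following Python sketch applies the schema above to a list of category-labelled words. The (word, category) representation and the apply_sai function are invented here for illustration; they are not part of Chomsky's formalization.

```python
# Minimal sketch of subject-auxiliary inversion (SAI) stated over a string of
# category-labelled words, as in the rule X NP AUX Y -> X AUX NP Y.
# The tagged-word representation and the function name are illustrative only.

def apply_sai(tagged_words):
    """Swap the first NP AUX sequence found in a flat list of (word, category) pairs."""
    for i in range(len(tagged_words) - 1):
        if tagged_words[i][1] == "NP" and tagged_words[i + 1][1] == "AUX":
            inverted = list(tagged_words)
            inverted[i], inverted[i + 1] = inverted[i + 1], inverted[i]
            return inverted
    return list(tagged_words)  # no NP AUX sequence: the rule does not apply

declarative = [("John", "NP"), ("has", "AUX"),
               ("eaten", "V"), ("all the heirloom tomatoes", "NP")]
words = [word for word, _ in apply_sai(declarative)]
sentence = " ".join(words)
print(sentence[0].upper() + sentence[1:] + "?")
# -> Has John eaten all the heirloom tomatoes?
```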

In the 1970s, by the time of the Extended Standard Theory, following Joseph Emonds's work on structure preservation, transformations came to be viewed as holding over trees. By the end of government and binding theory, in the late 1980s, transformations were no longer structure-changing operations at all; instead, they added information to already existing trees by copying constituents.

The earliest conceptions of transformations were that they were construction-specific devices. For example, there was a transformation that turned active sentences into passive ones. A different transformation raised embedded subjects into main clause subject position in sentences such as "John seems to have gone", and a third reordered arguments in the dative alternation. With the shift from rules to principles and constraints in the 1970s, those construction-specific transformations morphed into general rules (all the examples just mentioned are instances of NP movement), which eventually changed into the single general rule move alpha or Move.

Transformations actually come in two types: the post-deep structure kind mentioned above, which are string- or structure-changing, and generalized transformations (GTs). GTs were originally proposed in the earliest forms of generative grammar (such as in Chomsky 1957). They take small structures, either atomic or generated by other rules, and combine them. For example, the generalized transformation of embedding would take the kernel "Dave said X" and the kernel "Dan likes smoking" and combine them into "Dave said Dan likes smoking." GTs are thus structure-building rather than structure-changing. In the Extended Standard Theory and government and binding theory, GTs were abandoned in favor of recursive phrase structure rules, but they are still present in tree-adjoining grammar as the Substitution and Adjunction operations, and have recently reemerged in mainstream generative grammar in Minimalism, as the operations Merge and Move.
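
The structure-building character of a generalized transformation can be pictured with the small sketch below, which combines the two kernels from the example above. Real GTs operate on phrase markers (trees); the plain-string representation, the placeholder "X", and the embed function are illustrative inventions.

```python
# Schematic sketch of a generalized transformation (GT) of embedding: two small
# kernel structures are combined into a larger one, so the operation builds
# structure rather than changing it. Strings stand in for phrase markers here.

def embed(matrix_kernel, placeholder, embedded_kernel):
    """Substitute one kernel into another at the placeholder position."""
    return matrix_kernel.replace(placeholder, embedded_kernel)

print(embed("Dave said X", "X", "Dan likes smoking"))
# -> Dave said Dan likes smoking
```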

In generative phonology, another form of transformation is the phonological rule, which describes a mapping between an underlying representation (the phoneme) and the surface form that is articulated during natural speech.[20]
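
Such a mapping can be pictured with a familiar textbook case, the English regular plural: the suffix is commonly analyzed as underlyingly /z/, surfacing as [ɪz] after sibilants, [s] after other voiceless consonants, and [z] elsewhere. The segment classes and the function below are a deliberately simplified sketch of that analysis, not a complete phonological account.

```python
# Toy mapping from an underlying representation to a surface form, modelled on
# the common textbook analysis of the English regular plural suffix /z/.
# The segment classes are simplified and the function is illustrative only.

SIBILANTS = {"s", "z", "ʃ", "ʒ", "tʃ", "dʒ"}
OTHER_VOICELESS = {"p", "t", "k", "f", "θ"}

def plural_allomorph(stem_final_segment):
    """Return the surface form of underlying /z/ after the given stem-final segment."""
    if stem_final_segment in SIBILANTS:
        return "ɪz"
    if stem_final_segment in OTHER_VOICELESS:
        return "s"
    return "z"

for stem, final in [("cat", "t"), ("dog", "g"), ("bus", "s")]:
    print(f"{stem} + /z/ -> surface [{plural_allomorph(final)}]")
# cat -> [s], dog -> [z], bus -> [ɪz]
```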

Formal definition

Chomsky's advisor, Zellig Harris, took transformations to be relations between sentences such as "I finally met this talkshow host you always detested" and simpler (kernel) sentences "I finally met this talkshow host" and "You always detested this talkshow host."[need quotation to verify] A transformational-generative (or simply transformational) grammar thus involved two types of productive rules: phrase structure rules, such as "S → NP VP" (a sentence may consist of a noun phrase followed by a verb phrase) etc., which could be used to generate grammatical sentences with associated parse trees (phrase markers, or P markers); and transformational rules, such as rules for converting statements to questions or active to passive voice, which acted on the phrase markers to produce other grammatically correct sentences. Hjelmslev had called word-order conversion rules "permutations".[21]
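
To make the generative side of this division of labour concrete, the sketch below enumerates the strings produced by a tiny set of phrase structure rules of the "S → NP VP" kind. The particular grammar, vocabulary, and expand function are invented for illustration and are in no sense a grammar of English.

```python
# A toy, finite set of phrase structure rules of the form "S -> NP VP", used to
# enumerate the strings the rules generate. The grammar and helper function are
# illustrative inventions.
import itertools

RULES = {
    "S":  [["NP", "VP"]],
    "NP": [["John"], ["the", "N"]],
    "N":  [["man"], ["sandwich"]],
    "VP": [["V", "NP"]],
    "V":  [["bit"], ["saw"]],
}

def expand(symbol):
    """Yield every terminal string derivable from `symbol` in this finite grammar."""
    if symbol not in RULES:                      # terminal word
        yield [symbol]
        return
    for right_hand_side in RULES[symbol]:
        for parts in itertools.product(*(list(expand(s)) for s in right_hand_side)):
            yield [word for part in parts for word in part]

for sentence in expand("S"):
    print(" ".join(sentence))
# e.g. "the man bit the sandwich", "John saw the man", ...
```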

In this context, transformational rules are not strictly necessary to generate the set of grammatical sentences in a language, since that can be done using phrase structure rules alone, but the use of transformations provides economy in some cases (the number of rules can be reduced), and it also provides a way of representing the grammatical relations between sentences, which would not be reflected in a system with phrase structure rules alone.[22]

This notion of transformation proved adequate for subsequent versions, including the "extended", "revised extended", and Government-Binding (GB) versions of generative grammar, but it may no longer be sufficient for minimalist grammar, as merge may require a formal definition that goes beyond the tree manipulation characteristic of Move α.

Mathematical representation

An important feature of all transformational grammars is that they are more powerful than context-free grammars.[23] Chomsky formalized this idea in the Chomsky hierarchy. He argued that it is impossible to describe the structure of natural languages with context-free grammars.[24] His general position on the non-context-freeness of natural language has held up since then, though his specific examples of the inadequacy of CFGs in terms of their weak generative capacity were disproved.[25][26]
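
A standard formal-language illustration of a pattern beyond context-free power is the language of matched triples shown below; Shieber's argument from Swiss German rests on cross-serial dependencies of a structurally similar, non-context-free kind. The example itself is textbook material rather than being drawn from the papers cited above.

```latex
% A classic non-context-free language: equal numbers of a's, b's and c's, in order.
\[
  L \;=\; \{\, a^{n} b^{n} c^{n} \mid n \ge 1 \,\}
\]
```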

Core concepts

Innate linguistic knowledge

Using a term such as "transformation" may give the impression that theories of transformational generative grammar are intended as a model of the processes by which the human mind constructs and understands sentences. Chomsky, however, clearly stated that a generative grammar models only the knowledge that underlies the human ability to speak and understand, arguing that because most of that knowledge is innate, a baby can have a large body of knowledge about the structure of language in general and so needs to learn only the idiosyncratic features of the language(s) to which it is exposed.[citation needed]

Chomsky is not the first person to suggest that all languages have certain fundamental things in common. He quoted philosophers who posited the same basic idea several centuries ago. But Chomsky helped make the innateness theory respectable after a period dominated by more behaviorist attitudes towards language. He made concrete and technically sophisticated proposals about the structure of language as well as important proposals about how grammatical theories' success should be evaluated.[27]

Grammaticality

Chomsky argued that "grammatical" and "ungrammatical" can be meaningfully and usefully defined. In contrast, an extreme behaviorist linguist would argue that language can be studied only through recordings or transcriptions of actual speech and that the role of the linguist is to look for patterns in such observed speech, not to hypothesize about why such patterns might occur or to label particular utterances grammatical or ungrammatical. Few linguists in the 1950s actually took such an extreme position, but Chomsky was on the opposite extreme, defining grammaticality in an unusually mentalistic way for the time.[28] He argued that the intuition of a native speaker is enough to define the grammaticality of a sentence; that is, if a particular string of English words elicits a double-take or a feeling of wrongness in a native English speaker, with various extraneous factors affecting intuitions controlled for, it can be said that the string of words is ungrammatical. That, according to Chomsky, is entirely distinct from the question of whether a sentence is meaningful or can be understood. It is possible for a sentence to be both grammatical and meaningless, as in Chomsky's famous example, "colorless green ideas sleep furiously".[29] But such sentences manifest a linguistic problem that is distinct from that posed by meaningful but ungrammatical (non)-sentences such as "man the bit sandwich the", the meaning of which is fairly clear, but which no native speaker would accept as well-formed.

The use of such intuitive judgments permitted generative syntacticians to base their research on a methodology in which the study of language through a corpus of observed speech was downplayed, since the grammatical properties of constructed sentences were considered appropriate data on which to build a grammatical model.

Theory evaluation

In the 1960s, Chomsky introduced two central ideas relevant to the construction and evaluation of grammatical theories.

Competence versus performance

One was the distinction between competence and performance.[30] Chomsky noted the obvious fact that when people speak in the real world, they often make linguistic errors, such as starting a sentence and then abandoning it midway through. He argued that such errors in linguistic performance are irrelevant to the study of linguistic competence, the knowledge that allows people to construct and understand grammatical sentences. Consequently, the linguist can study an idealised version of language, which greatly simplifies linguistic analysis.

Descriptive versus explanatory adequacy

The other idea related directly to evaluation of theories of grammar. Chomsky distinguished between grammars that achieve descriptive adequacy and those that go further and achieve explanatory adequacy. A descriptively adequate grammar for a particular language defines the (infinite) set of grammatical sentences in that language; that is, it describes the language in its entirety. A grammar that achieves explanatory adequacy has the additional property that it gives insight into the mind's underlying linguistic structures. In other words, it does not merely describe the grammar of a language, but makes predictions about how linguistic knowledge is mentally represented. For Chomsky, such mental representations are largely innate and so if a grammatical theory has explanatory adequacy, it must be able to explain different languages' grammatical nuances as relatively minor variations in the universal pattern of human language.

Chomsky argued that even though linguists were still a long way from constructing descriptively adequate grammars, progress in descriptive adequacy would come only if linguists held explanatory adequacy as their goal: real insight into individual languages' structure can be gained only by comparative study of a wide range of languages, on the assumption that they are all cut from the same cloth.[citation needed]

Development of concepts

Though transformations continue to be important in Chomsky's theories, he has now abandoned the original notion of deep structure and surface structure. Initially, two additional levels of representation were introduced: logical form (LF) and phonetic form (PF). In the 1990s, Chomsky sketched a new program of research known at first as Minimalism, in which deep structure and surface structure no longer feature and PF and LF remain as the only levels of representation.[31]

To complicate the understanding of the development of Chomsky's theories, the precise meanings of deep structure and surface structure have changed over time. By the 1970s, Chomskyan linguists normally called them D-Structure and S-Structure. In particular, when LF took over the role of determining meaning, Chomskyan linguists dropped for good the idea that a sentence's deep structure determines its meaning (an idea taken to its logical conclusion by the generative semanticists during the same period); previously, Chomsky and Ray Jackendoff had begun to argue that meaning is determined by both deep and surface structure.[32][33]

"I-language" and "E-language"

In 1986, Chomsky proposed a distinction between I-language and E-language that is similar but not identical to the competence/performance distinction.[34] "I-language" is internal language; "E-language" is external language. I-language is taken to be the object of study in linguistic theory; it is the mentally represented linguistic knowledge a native speaker of a language has and thus a mental object. From that perspective, most of theoretical linguistics is a branch of psychology. E-language encompasses all other notions of what a language is, such as a body of knowledge or behavioural habits shared by a community. Thus E-language is not a coherent concept by itself,[35] and Chomsky argues that such notions of language are not useful in the study of innate linguistic knowledge or competence even though they may seem sensible and intuitive and useful in other areas of study. Competence, he argues, can be studied only if languages are treated as mental objects.

Minimalist program

From the mid-1990s onward, much research in transformational grammar has been inspired by Chomsky's minimalist program.[36] It aims to further develop ideas involving "economy of derivation" and "economy of representation", which had started to become significant in the early 1990s but were still rather peripheral aspects of transformational-generative grammar theory:

  • Economy of derivation is the principle that movements, or transformations, occur only to match interpretable features with uninterpretable features (a schematic sketch of this feature-matching idea follows the list). An example of an interpretable feature is the plural inflection on regular English nouns, e.g., dogs. The word dogs can be used to refer only to several dogs, not a single dog, so this inflection contributes to meaning and is interpretable. English verbs are inflected according to the number of their subject ("Dogs bite" vs. "A dog bites"), but in most sentences that inflection merely duplicates the number information already carried by the subject noun, and it is therefore uninterpretable.
  • Economy of representation is the principle that grammatical structures must exist for a purpose: the structure of a sentence should be no larger or more complex than required to satisfy constraints on grammaticality.
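
The sketch below, referenced in the first bullet above, is a very rough model of the feature-matching intuition behind economy of derivation: a movement step is licensed only if it checks an uninterpretable feature against a matching interpretable one. The Item class, the feature names, and the licensing function are invented for illustration and are not Chomsky's formalism.

```python
# Rough sketch of the feature-matching idea behind "economy of derivation":
# a movement step is licensed only if it checks an uninterpretable feature
# against a matching interpretable one. All names here are illustrative.
from dataclasses import dataclass, field

@dataclass
class Item:
    label: str
    interpretable: set = field(default_factory=set)
    uninterpretable: set = field(default_factory=set)

def movement_licensed(probe, goal):
    """Allow movement only when it would check at least one uninterpretable feature."""
    return bool(probe.uninterpretable & goal.interpretable)

verbal_inflection = Item("T", uninterpretable={"number"})   # duplicated number marking
subject = Item("dogs", interpretable={"number"})            # plural on the noun is meaningful

print(movement_licensed(verbal_inflection, subject))           # True: the step is justified
print(movement_licensed(verbal_inflection, Item("yesterday"))) # False: nothing to check
```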

Both notions, as described here, are somewhat vague, and their precise formulation is controversial.[37][38] An additional aspect of minimalist thought is the idea that the derivation of syntactic structures should be uniform: rules should not be stipulated as applying at arbitrary points in a derivation but instead apply throughout derivations. Minimalist approaches to phrase structure have resulted in "Bare Phrase Structure", an attempt to eliminate X-bar theory. In 1998, Chomsky suggested that derivations proceed in phases. The distinction between deep structure and surface structure is absent in Minimalist theories of syntax, and the most recent phase-based theories also eliminate LF and PF as unitary levels of representation.

Critical reception

In 1978, linguist and historian E. F. K. Koerner hailed transformational grammar as the third and last Kuhnian revolution in linguistics, arguing that it had brought about a shift from Ferdinand de Saussure's sociological approach to a Chomskyan conception of linguistics as analogous to chemistry and physics. Koerner also praised the philosophical and psychological value of Chomsky's theory.[39]

In 1983, Koerner retracted his earlier assessment, suggesting instead that transformational grammar was a 1960s fad that had spread across the U.S. at a time when the federal government was investing heavily in new linguistics departments. He also claimed that Chomsky's work was unoriginal when compared with other syntactic models of the time. According to Koerner, Chomsky's rise to fame was orchestrated by Bernard Bloch, editor of Language, the journal of the Linguistic Society of America, and by Roman Jakobson, a personal friend of Chomsky's father. Koerner suggests that great sums of money were spent to fly foreign students to the 1962 International Congress at Harvard, where an exceptional opportunity was arranged for Chomsky to give a keynote speech in which he made questionable claims of belonging to the rationalist tradition of Saussure, Humboldt and the Port-Royal Grammar in order to win popularity among the Europeans. The transformational agenda was subsequently forced through at American conferences, where students, instructed by Chomsky, regularly verbally attacked and ridiculed his potential opponents.[40]

References

  1. ^ Carnap, Rudolf (1935). Philosophy and Logical Syntax. AMS Press.
  2. ^ a b Seuren, Pieter A. M. (1998). Western linguistics: An historical introduction. Wiley-Blackwell. ISBN 0-631-20891-7.
  3. ^ Seuren, Pieter (2018). Saussure and Sechehaye: Myth and Genius. Brill. ISBN 978-90-04-37815-5.
  4. ^ Butler, Christopher S. (2003). Structure and Function: A Guide to Three Major Structural-Functional Theories, part 1 (PDF). John Benjamins. ISBN 9781588113580. Retrieved 2020-01-19.
  5. ^ Percival, William Keith (1976). "Deep and surface structure concepts in renaissance and mediaeval syntactic theory". In Parret (ed.). History of Linguistic Thought and Contemporary Linguistics. Walter de Gruyter. pp. 238–253.
  6. ^ Percival, William Keith (1976). "Deep and surface structure concepts in renaissance and mediaeval syntactic theory". In Parret (ed.). History of Linguistic Thought and Contemporary Linguistics. Walter de Gruyter. pp. 238–253.
  7. ^ de Saussure, Ferdinand (1959) [First published 1916]. Course in general linguistics (PDF). New York: Philosophy Library. ISBN 9780231157278. Archived from the original (PDF) on 2020-04-14. Retrieved 2022-08-08.
  8. ^ Bianchin, Matteo (2018). "Husserl on Meaning, Grammar, and the Structure of Content" (PDF). Husserl Studies. 34 (2): 101–121. doi:10.1007/s10743-017-9223-2. S2CID 254553890. Retrieved 2022-08-08.
  9. ^ Holenstein, Elmar (2018). "Jakobson und Husserl: Ein beitrag zur genealogie Des strukturalismus" (PDF). Tijdschrift voor Filosofie. 35 (3): 560–607. JSTOR 40882437. Retrieved 2022-08-08.
  10. ^ Battistella, Edwin (2015). "Markedness in Linguistics". In Wright, James D. (ed.). International Encyclopedia of the Social & Behavioral Sciences (2nd ed.). Elsevier. pp. 533–537. doi:10.1016/B978-0-08-097086-8.52037-6. ISBN 978-0-08-097087-5.
  11. ^ Partee, Barbara (2011). "Formal Semantics: Origins, Issues, Early Impact". The Baltic International Yearbook of Cognition, Logic and Communication. Vol. 6. BIYCLC. pp. 1–52. doi:10.4148/biyclc.v6i0.1580.
  12. ^ Percival, William Keith (1976). "Deep and surface structure concepts in renaissance and mediaeval syntactic theory". In Parret (ed.). History of Linguistic Thought and Contemporary Linguistics. Walter de Gruyter. pp. 238–253.
  13. ^ de Boer, Bart; Thompson, Bill; Ravignani, Andrea; Boeckx, Cedric (2020). "Evolutionary Dynamics Do Not Motivate a Single-Mutant Theory of Human Language". Scientific Reports. 10 (451): 451. Bibcode:2020NatSR..10..451D. doi:10.1038/s41598-019-57235-8. PMC 6965110. PMID 31949223. S2CID 92035839.
  14. ^ Battistella, Edwin (2015). "Markedness in Linguistics". In Wright, James D. (ed.). International Encyclopedia of the Social & Behavioral Sciences (2nd ed.). Elsevier. pp. 533–537. doi:10.1016/B978-0-08-097086-8.52037-6. ISBN 978-0-08-097087-5.
  15. ^ Partee, Barbara (2011). "Formal Semantics: Origins, Issues, Early Impact". The Baltic International Yearbook of Cognition, Logic and Communication. Vol. 6. BIYCLC. pp. 1–52. doi:10.4148/biyclc.v6i0.1580.
  16. ^ a b c Chomsky, Noam (2015). The Minimalist Program. 20th Anniversary Edition. MIT Press. ISBN 978-0-262-52734-7.
  17. ^ Benmamoun, Elabbas; Choueiri, Lina (2013). "The Syntax of Arabic From A Generative Perspective". In Owens (ed.). The Oxford Handbook of Arabic Linguistics. Oxford University Press. pp. 115–164. doi:10.1093/oxfordhb/9780199764136.013.0006. ISBN 978-0199764136.
  18. ^ Chomsky, Noam (1965). Aspects of the Theory of Syntax. MIT Press. ISBN 0-262-53007-4.
  19. ^ The Port-Royal Grammar of 1660 identified similar principles; Chomsky, Noam (1972). Language and Mind. Harcourt Brace Jovanovich. ISBN 0-15-147810-4.
  20. ^ Goldsmith, John A (1995). "Phonological Theory". In John A. Goldsmith (ed.). The Handbook of Phonological Theory. Blackwell Handbooks in Linguistics. Blackwell Publishers. p. 2. ISBN 1-4051-5768-2.
  21. ^ Hjelmslev, Louis (1969) [First published 1943]. Prolegomena to a Theory of Language. University of Wisconsin Press. ISBN 0299024709.
  22. ^ Emmon Bach, An Introduction to Transformational Grammars, Holt, Rinehart and Winston. Inc., 1966, pp. 59–69.
  23. ^ Peters, Stanley; R. Ritchie (1973). "On the generative power of transformational grammars" (PDF). Information Sciences. 6: 49–83. doi:10.1016/0020-0255(73)90027-3.
  24. ^ Chomsky, Noam (1956). "Three models for the description of language" (PDF). IRE Transactions on Information Theory. 2 (3): 113–124. doi:10.1109/TIT.1956.1056813. S2CID 19519474. Archived from the original (PDF) on 2010-09-19.
  25. ^ Shieber, Stuart (1985). "Evidence against the context-freeness of natural language" (PDF). Linguistics and Philosophy. 8 (3): 333–343. doi:10.1007/BF00630917. S2CID 222277837.
  26. ^ Pullum, Geoffrey K.; Gerald Gazdar (1982). "Natural languages and context-free languages". Linguistics and Philosophy. 4 (4): 471–504. doi:10.1007/BF00360802. S2CID 189881482.
  27. ^ McLeod, S. "Language Acquisition". Simply Psychology. Retrieved 21 February 2019.
  28. ^ Newmeyer, Frederick J. (1986). Linguistic Theory in America (Second ed.). Academic Press.[page needed]
  29. ^ Chomsky 1957:15
  30. ^ Kordić, Snježana (1991). "Transformacijsko-generativni pristup jeziku u Sintaktičkim strukturama i Aspektima teorije sintakse Noama Chomskog" [Transformational-generative approach to language in Syntactic structures and Aspects of the theory of syntax of Noam Chomsky] (PDF). SOL: Lingvistički časopis (in Serbo-Croatian). 6 (12–13): 105. ISSN 0352-8715. S2CID 186964128. SSRN 3445224. CROSBI 446914. ZDB-ID 1080348-8. (CROLIB). Archived (PDF) from the original on January 16, 2013. Retrieved 7 September 2020.
  31. ^ In a review of The Minimalist Program, Zwart 1998 observed, "D-Structure is eliminated in the sense that there is no base component applying rewrite rules to generate an empty structure which is to be fleshed out later by 'all at once' lexical insertion. Instead, structures are created by combining elements drawn from the lexicon, and there is no stage in the process at which we can stop and say: this is D-Structure." Similarly, "there is no need for language particular S-Structure conditions in order to describe word order variation", which can instead be handled at LF.
  32. ^ Jackendoff, Ray (1974). Semantic Interpretation in Generative Grammar. MIT Press. ISBN 0-262-10013-4.
  33. ^ May, Robert C. (1977). The Grammar of Quantification. MIT PhD dissertation. ISBN 0-8240-1392-1. (Supervised by Noam Chomsky, this dissertation introduced the idea of "logical form".)
  34. ^ Chomsky, Noam (1986). Knowledge of Language. New York: Praeger. ISBN 0-275-90025-8.[page needed]
  35. ^ Chomsky, Noam (2001). "Derivation by Phase". In Michael Kenstowicz (ed.), Ken Hale: A Life in Language. MIT Press. pp. 1–52. (See p. 49 fn. 2 for the comment on E-language: in algebraic terms, the I-language is the actual function, whereas the E-language is the extension of this function.)
  36. ^ Chomsky, Noam (1995). The Minimalist Program. MIT Press. ISBN 0-262-53128-3.
  37. ^ Lappin, Shalom; Levine, Robert; Johnson, David (2000). "Topic ... Comment". Natural Language & Linguistic Theory. 18 (3): 665–671. doi:10.1023/A:1006474128258. S2CID 189900915.
  38. ^ Lappin, Shalom; Levine, Robert; Johnson, David (2001). "The Revolution Maximally Confused". Natural Language & Linguistic Theory. 19 (4): 901–919. doi:10.1023/A:1013397516214. S2CID 140876545.
  39. ^ Koerner, E. F. K. (1978). "Towards a historiography of linguistics". Toward a Historiography of Linguistics: Selected Essays. John Benjamins. pp. 21–54.
  40. ^ Koerner, E. F. K. (1983). "The Chomskyan 'revolution' and its historiography: a few critical remarks". Language & Communication. 3 (2): 147–169. doi:10.1016/0271-5309(83)90012-5.
