Friday, April 18, 2008

cool

Coolie (1936) by Mulk Raj Anand gives a clear and poignant description of the poor face of India, telling the story of a 15-year-old boy who has to work as a child labourer and eventually dies of tuberculosis. It is the story of Munoo, who is forced by necessity and poverty to leave his village and work in the city as a child labourer. The novel follows his adventures and escapades as he works as a servant, factory worker, and rickshaw driver far away from his home. Told through the eyes of the narrator, the story brings to light the hidden evils of the Raj: exploitation, a caste-ridden society, communal riots, and police injustice. The novel takes us to different places and cities, showing the inhuman and degrading treatment that poor Munoo receives at the hands of the socially, economically, and politically affluent classes of Indian society, and how he copes with every circumstance alone.

Anand struck a chord in the hearts of conscientious Indians with his beautiful and true-to-life portrayal of the downtrodden masses of Indian society, the so-called have-nots. He was much appreciated and recognized for this novel, and he was among those deeply influenced by Mahatma Gandhi; that influence is clearly seen in all his works, including Coolie. True to his Marxist spirit, he always portrayed the real India, and more specifically the poor India. He is also regarded as one of the first Indian writers in English to use Hindi and Punjabi phrases in his writing to enrich and enhance the language. Called the Charles Dickens of India by many literary figures, Anand wrote novels that deal with the underdog; Coolie is a classic example of the story of the underprivileged class of society and of oppressed people who cannot even make ends meet.

Guidance

The comprehensive school counseling program is a sequential, developmental program designed to benefit all students in preparation for their futures. Such a program includes a curriculum organized around three areas essential for students' growth and development: Academic Development, Career Development, and Personal/Social Development. Through it, students:
Demonstrate a positive attitude toward self as a unique and worthy person.
Gain life-planning skills that are consistent with their needs, interests, and abilities.
Develop responsible social skills and an understanding and appreciation of being a contributing member of society.
Demonstrate an understanding and appreciation of the life-long process of learning, growing, and changing.
Activities and strategies for achieving identified student outcomes in these three areas can be integrated across the curriculum by teachers and counselors. A goal for this guide is to illustrate the connections between the National Standards, the ABCs Goals, the SCANS, and the National Career Development Guidelines. This Guidance Curriculum for a Comprehensive School Counseling Program is student centered and teacher friendly. Counselors should use it as a blueprint for collaboratively building a sequential and developmentally appropriate school counseling program.

Behaviourism

Behavioural (or "behavioral") theory in psychology is a very substantial field: follow the links to the left or right for introductions to some of its more detailed contributions impinging on how people learn in the real world. How I have the effrontery to produce a single page on it amazes even me, whatever my reservations about it!
Behaviourism is primarily associated with Pavlov (classical conditioning) in Russia and with Thorndike, Watson and particularly Skinner in the United States (operant conditioning).
Behaviourism is dominated by the constraints of its (naïve) attempts to emulate the physical sciences, which entails a refusal to speculate about what happens inside the organism. Anything which relaxes this requirement slips into the cognitive realm.
Much behaviourist experimentation is undertaken with animals and the findings generalised.
In educational settings, behaviourism implies the dominance of the teacher, as in behaviour modification programmes. It can, however, be applied to an understanding of unintended learning.
For our purposes, behaviourism is relevant mainly to:
Skill development, and
The "substrate" (or "conditions", as Gagné puts it) of learning
If you want to follow your own links, search for "behaviorism" (sic.). Most of the material is US-based, and "behaviorism" and "behaviorist" are how they spell it; I freely admit that this side-bar is purely to get the stupid search-engine "bots" to register "behavior".

Classical conditioning:
is the process of reflex learning—investigated by Pavlov—through which an unconditioned stimulus (e.g. food) which produces an unconditioned response (salivation) is presented together with a conditioned stimulus (a bell), such that the salivation is eventually produced on the presentation of the conditioned stimulus alone, thus becoming a conditioned response.
This is a disciplined account of our common-sense experience of learning by association (or "contiguity", in the jargon), although that is often much more complex than a reflex process, and is much exploited in advertising. Note that it does not depend on us doing anything.
Such associations can be chained and generalised (for better or for worse): thus "smell of baking" associates with "kitchen at home in childhood", which associates with "love and care". (Smell creates potent conditioning because of the way it is perceived by the brain.) But "sitting at a desk" associates with "classroom at school" and hence perhaps with "humiliation and failure"...
More on Pavlov
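To make the pairing process concrete, here is a toy Python sketch of my own (not Pavlov's model, nor anyone else's): repeated bell-plus-food trials strengthen the bell-salivation association with an arbitrary Rescorla-Wagner-style increment, until the bell alone is enough to elicit the response.

# Toy illustration (invented numbers): pairing a conditioned stimulus (bell)
# with an unconditioned stimulus (food) gradually strengthens the association,
# so the bell alone ends up eliciting the (conditioned) response.
def condition(trials=10, learning_rate=0.3):
    association = 0.0              # strength of the bell -> salivation link
    for t in range(1, trials + 1):
        # bell and food presented together; association moves toward 1.0
        association += learning_rate * (1.0 - association)
        print(f"trial {t:2d}: association = {association:.2f}")
    return association

strength = condition()
# after conditioning, the bell alone produces salivation
print("bell alone elicits salivation:", strength > 0.5)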


Operant Conditioning
If, when an organism emits a behaviour (does something), the consequences of that behaviour are reinforcing, it is more likely to emit (do) it again. What counts as reinforcement, of course, is based on the evidence of the repeated behaviour, which makes the whole argument rather circular.
Learning is really about the increased probability of a behaviour based on reinforcement which has taken place in the past, so that the antecedents of the new behaviour include the consequences of previous behaviour.
Summary of Skinner's ideas
On operant conditioning
Skinner's own account

Wikipedia on operant conditioning
The schedule of reinforcement of behaviour is central to the management of effective learning on this basis, and working it out is a very skilled procedure: simply reinforcing every instance of desired behaviour is just bribery, not the promotion of learning.
Withdrawal of reinforcement eventually leads to the extinction of the behaviour, except in some special cases such as anticipatory-avoidance learning.
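A similarly hedged sketch of the operant side, with made-up numbers: reinforcing an emitted behaviour raises the probability that it is emitted again, and withdrawing reinforcement lets that probability decay towards extinction.

import random

# Toy sketch (illustrative numbers only): reinforcement after a response
# raises the probability that the behaviour is emitted again; withdrawing
# reinforcement lets that probability decay ("extinction").
def run(trials, reinforce, p=0.2, up=0.15, down=0.05):
    for _ in range(trials):
        emitted = random.random() < p
        if emitted and reinforce:
            p = min(1.0, p + up)       # reinforced response -> more likely
        elif emitted:
            p = max(0.0, p - down)     # unreinforced response -> extinction
    return p

random.seed(0)
p = run(50, reinforce=True)            # acquisition phase
print(f"after reinforcement phase: p = {p:.2f}")
p = run(200, reinforce=False, p=p)     # extinction phase
print(f"after withdrawal (extinction): p = {p:.2f}")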

Behaviorism
Definition
Behaviorism is a theory of animal and human learning that only focuses on objectively observable behaviors and discounts mental activities. Behavior theorists define learning as nothing more than the acquisition of new behavior.
Discussion
Experiments by behaviorists identify conditioning as a universal learning process. There are two different types of conditioning, each yielding a different behavioral pattern:
Classic conditioning occurs when a natural reflex responds to a stimulus. The most popular example is Pavlov's observation that dogs salivate when they eat or even see food. Essentially, animals and people are biologically "wired" so that a certain stimulus will produce a specific response.
Behavioral or operant conditioning occurs when a response to a stimulus is reinforced. Basically, operant conditioning is a simple feedback system: If a reward or reinforcement follows the response to a stimulus, then the response becomes more probable in the future. For example, leading behaviorist B.F. Skinner used reinforcement techniques to teach pigeons to dance and bowl a ball in a mini-alley.
There have been many criticisms of behaviorism, including the following:
Behaviorism does not account for all kinds of learning, since it disregards the activities of the mind.
Behaviorism does not explain some learning--such as the recognition of new language patterns by young children--for which there is no reinforcement mechanism.
Research has shown that animals adapt their reinforced patterns to new information. For instance, a rat can shift its behavior to respond to changes in the layout of a maze it had previously mastered through reinforcement.
How Behaviorism Impacts Learning
This theory is relatively simple to understand because it relies only on observable behavior and describes several universal laws of behavior. Its positive and negative reinforcement techniques can be very effective--both in animals, and in treatments for human disorders such as autism and antisocial behavior. Behaviorism often is used by teachers, who reward or punish student behaviors.

Piaget
Definition
Swiss biologist and psychologist Jean Piaget (1896-1980) is renowned for constructing a highly influential model of child development and learning. Piaget's theory is based on the idea that the developing child builds cognitive structures--in other words, mental "maps," schemes, or networked concepts for understanding and responding to physical experiences within his or her environment. Piaget further asserted that a child's cognitive structure increases in sophistication with development, moving from a few innate reflexes such as crying and sucking to highly complex mental activities.
Discussion
Piaget's theory identifies four developmental stages and the processes by which children progress through them. The four stages are:
Sensorimotor stage (birth - 2 years old)--The child, through physical interaction with his or her environment, builds a set of concepts about reality and how it works. This is the stage where a child does not know that physical objects remain in existence even when out of sight (object permanence).
Preoperational stage (ages 2-7)--The child is not yet able to conceptualize abstractly and needs concrete physical situations.
Concrete operations (ages 7-11)--As physical experience accumulates, the child starts to conceptualize, creating logical structures that explain his or her physical experiences. Abstract problem solving is also possible at this stage. For example, arithmetic equations can be solved with numbers, not just with objects.
Formal operations (beginning at ages 11-15)--By this point, the child's cognitive structures are like those of an adult and include conceptual reasoning.
Piaget outlined several principles for building cognitive structures. During all development stages, the child experiences his or her environment using whatever mental maps he or she has constructed so far. If the experience is a repeated one, it fits easily--or is assimilated--into the child's cognitive structure so that he or she maintains mental "equilibrium." If the experience is different or new, the child loses equilibrium, and alters his or her cognitive structure to accommodate the new conditions. This way, the child erects more and more adequate cognitive structures.
How Piaget's Theory Impacts Learning
Curriculum--Educators must plan a developmentally appropriate curriculum that enhances their students' logical and conceptual growth.
Instruction--Teachers must emphasize the critical role that experiences--or interactions with the surrounding environment--play in student learning. For example, instructors have to take into account the role that fundamental concepts, such as the permanence of objects, play in establishing cognitive structures.

Government and binding theory
Government and binding is a theory of syntax in the tradition of transformational grammar developed principally by Noam Chomsky in the 1980s.[1][2][3] This theory is a radical revision of his earlier theories [4][5][6] and was later revised in The Minimalist Program (1995)[7] and several subsequent papers, the latest being Three Factors in Language Design (2005).[8] Although there is a large literature on government and binding theory which is not written by Chomsky, Chomsky's papers have been foundational in setting the research agenda.
The name refers to two central subtheories of the theory: government, which is an abstract syntactic relation, and binding, which deals with the referents of pronouns, anaphors, and R-expressions. GB was the first theory to be based on the principles and parameters model of language, which also underlies the later developments of the Minimalist Program.

Government
The main application of the government relation concerns the assignment of case. Government is defined as follows:
A governs B if and only if
A is a governor and
A m-commands B and
no barrier intervenes between A and B.
Governors are heads of the lexical categories (V, N, A, P) and tensed I (T). A m-commands B if A does not dominate B and B does not dominate A and the first maximal projection of A dominates B. The maximal projection of a head X is XP. This means that, for example, in a structure like the following, A m-commands B, but B does not m-command A:
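As a rough illustration (my own toy code, not part of the theory's formal apparatus), the dominance and m-command checks can be applied to a small hand-built structure of the relevant kind; the tree shape, the labels, and the test for what counts as a maximal projection are assumptions made purely for this example.

# Minimal sketch of the m-command check defined above, over a toy tree.
class Node:
    def __init__(self, label, *children):
        self.label = label
        self.children = list(children)
        self.parent = None
        for c in children:
            c.parent = self

    def dominates(self, other):
        # proper domination: walk up from `other` looking for self
        n = other.parent
        while n is not None:
            if n is self:
                return True
            n = n.parent
        return False

def is_maximal_projection(node):
    # assumption for this toy: XP labels and the clause node count as maximal
    return node.label.endswith("P") or node.label in {"S", "IP"}

def m_commands(a, b):
    if a.dominates(b) or b.dominates(a):
        return False
    n = a.parent
    while n is not None and not is_maximal_projection(n):
        n = n.parent
    return n is not None and n.dominates(b)

# toy structure: [VP V [DP D [NP N]]]
n_ = Node("N")
np = Node("NP", n_)
d = Node("D")
dp = Node("DP", d, np)
v = Node("V")
vp = Node("VP", v, dp)

print(m_commands(v, n_))   # True: VP, the first XP over V, dominates N
print(m_commands(n_, v))   # False: NP, the first XP over N, does not dominate V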

In addition, barrier is defined as follows:[9] A barrier is any node Z such that
Z is a potential governor for B and
Z c-commands B and
Z does not c-command A
The government relation makes case assignment unambiguous. The tree diagram below illustrates how DPs are governed and assigned case by their governing heads:

Another important application of the government relation constrains the occurrence and identity of traces as the Empty Category Principle requires them to be properly governed.

Binding
Binding can be defined as follows:
An element α binds an element β if and only if α c-commands β, and α and β are co-referent.
Consider the sentence "John saw his mother", which is diagrammed below using simple phrase structure rules.

"John" c-commands "his" because the first non-trivial parent of "John", S, contains "his". "John" and "his" are also co-referent (they refer to the same person), therefore "John" binds "his".
On the other hand, in the sentence "A friend of John saw his mother", "John" does not c-command "his", so they have no binding relationship, regardless of whether they are co-referent (which they may be; the example is ambiguous).
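The same kind of toy code can illustrate the binding definition itself: alpha binds beta iff alpha c-commands beta and the two are co-indexed. The tree for "John saw his mother" and the co-reference indices below are supplied by hand; this is a sketch, not a parser.

# Minimal sketch of the binding check quoted above, on a hand-built tree.
class Node:
    def __init__(self, label, *children, index=None):
        self.label = label
        self.index = index             # co-reference index, set by hand
        self.children = list(children)
        self.parent = None
        for c in children:
            c.parent = self

    def dominates(self, other):
        n = other.parent
        while n is not None:
            if n is self:
                return True
            n = n.parent
        return False

def c_commands(a, b):
    if a.dominates(b) or b.dominates(a):
        return False
    # first branching ("non-trivial") node properly containing a
    n = a.parent
    while n is not None and len(n.children) < 2:
        n = n.parent
    return n is not None and n.dominates(b)

def binds(a, b):
    # alpha binds beta iff alpha c-commands beta and they are co-indexed
    return c_commands(a, b) and a.index is not None and a.index == b.index

# "John saw his mother", with John and his co-indexed (index 1)
john = Node("NP", Node("John"), index=1)
his = Node("Det", Node("his"), index=1)
obj = Node("NP", his, Node("N", Node("mother")))
vp = Node("VP", Node("V", Node("saw")), obj)
s = Node("S", john, vp)

print(binds(john, his))    # True: John c-commands and is co-indexed with his
print(binds(his, john))    # False: his does not c-command John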
The importance of binding is shown in the grammaticality of the following sentences:
[1] *Johnᵢ saw himᵢ. (ungrammatical with co-reference)
[2] John saw himself. (unambiguously co-referent)
[3] *Himself saw John. (ungrammatical)
[4] *Johnᵢ saw Johnᵢ. (ungrammatical, unless it refers to two distinct Johns)
Binding is used, along with particular binding principles, to explain the ungrammaticality of those statements. The applicable rules are called Binding Principle A, Binding Principle B, and Binding Principle C.
Principle A states that anaphors (reflexives and reciprocals, such as "each other") must always be bound in their domains. Since there is nothing to bind "himself" in sentence [3], that principle is violated, and the sentence is ungrammatical.
Principle B states that a pronoun must never be bound within its domain. If, in sentence [1], "John" and "him" are co-referent, then there is a binding relationship between them, violating the principle and resulting in ungrammaticality.
Principle C states that R-expressions must never be bound. R-expressions are referential expressions: non-pronoun, uniquely identifiable entities, such as "the dog", or proper names such as "John". In sentence [4], the first instance of "John" binds the second, resulting in the ungrammaticality.
Note that Principles A and B refer to domains. It is difficult to define a domain in a way that explains all the data, though the definition may be related to movement islands and the Phase Impenetrability Constraint.

Transformational grammar
From Wikipedia, the free encyclopedia
Jump to: navigation, search
In linguistics, a transformational grammar, or transformational-generative grammar (TGG), is a generative grammar, especially of a natural language, that has been developed in a Chomskyan tradition. Additionally, transformational grammar is the Chomskyan tradition that gives rise to specific transformational grammars. Much current research in transformational grammar is inspired by Chomsky's Minimalist Program.
Deep and surface:

In 1957, Noam Chomsky published Syntactic Structures, in which he developed the idea that each sentence in a language has two levels of representation — a deep structure and a surface structure.[2][3] The deep structure represented the core semantic relations of a sentence, and was mapped onto the surface structure (which followed the phonological form of the sentence very closely) via transformations. Chomsky believed that there would be considerable similarities between languages' deep structures, and that these structures would reveal properties, common to all languages, which were concealed by their surface structures. However, this was perhaps not the central motivation for introducing deep structure. Transformations had been proposed prior to the development of deep structure as a means of increasing the mathematical and descriptive power of context-free grammars. Similarly, deep structure was devised largely for technical reasons relating to early semantic theory. Chomsky emphasizes the importance of modern formal mathematical devices in the development of grammatical theory:
But the fundamental reason for [the] inadequacy of traditional grammars is a more technical one. Although it was well understood that linguistic processes are in some sense "creative", the technical devices for expressing a system of recursive processes were simply not available until much more recently. In fact, a real understanding of how a language can (in Humboldt's words) "make infinite use of finite means" has developed only within the last thirty years, in the course of studies in the foundations of mathematics.
(Aspects of the Theory of Syntax, p. 8 [2])

Development of basic concepts
Though transformations continue to be important in Chomsky's current theories, he has now abandoned the original notion of Deep Structure and Surface Structure. Initially, two additional levels of representation were introduced (LF — Logical Form, and PF — Phonetic Form), and then in the 1990s Chomsky sketched out a new program of research known as Minimalism, in which Deep Structure and Surface Structure no longer featured and PF and LF remained as the only levels of representation.
To complicate the understanding of the development of Noam Chomsky's theories, the precise meanings of Deep Structure and Surface Structure have changed over time — by the 1970s, the two were normally referred to simply as D-Structure and S-Structure by Chomskian linguists. In particular, the idea that the meaning of a sentence was determined by its Deep Structure (taken to its logical conclusions by the generative semanticists during the same period) was dropped for good by Chomskian linguists when LF took over this role (previously, Chomsky and Ray Jackendoff had begun to argue that meaning was determined by both Deep and Surface Structure).
Innate linguistic knowledge
Terms such as "transformation" can give the impression that theories of transformational generative grammar are intended as a model for the processes through which the human mind constructs and understands sentences. Chomsky is clear that this is not in fact the case: a generative grammar models only the knowledge that underlies the human ability to speak and understand. One of the most important of Chomsky's ideas is that most of this knowledge is innate, with the result that a baby can have a large body of prior knowledge about the structure of language in general, and need only actually learn the idiosyncratic features of the language(s) it is exposed to. Chomsky was not the first person to suggest that all languages had certain fundamental things in common (he quotes philosophers writing several centuries ago who had the same basic idea), but he helped to make the innateness theory respectable after a period dominated by more behaviorist attitudes towards language. Perhaps more significantly, he made concrete and technically sophisticated proposals about the structure of language, and made important proposals regarding how the success of grammatical theories should be evaluated.
Chomsky goes so far as to suggest that a baby need not learn any actual rules specific to a particular language at all. Rather, all languages are presumed to follow the same set of rules, but the effects of these rules and the interactions between them can vary greatly depending on the values of certain universal linguistic parameters. This is a very strong assumption, and is one of the most subtle ways in which Chomsky's current theory of language differs from most others.

Grammatical theories
In the 1960s, Chomsky introduced two central ideas relevant to the construction and evaluation of grammatical theories. The first was the distinction between competence and performance. Chomsky noted the obvious fact that people, when speaking in the real world, often make linguistic errors (e.g. starting a sentence and then abandoning it midway through). He argued that these errors in linguistic performance were irrelevant to the study of linguistic competence (the knowledge that allows people to construct and understand grammatical sentences). Consequently, the linguist can study an idealised version of language, greatly simplifying linguistic analysis (see the "Grammaticalness" section below). The second idea related directly to the evaluation of theories of grammar. Chomsky made a distinction between grammars which achieved descriptive adequacy and those which went further and achieved explanatory adequacy. A descriptively adequate grammar for a particular language defines the (infinite) set of grammatical sentences in that language; that is, it describes the language in its entirety. A grammar which achieves explanatory adequacy has the additional property that it gives an insight into the underlying linguistic structures in the human mind; that is, it does not merely describe the grammar of a language, but makes predictions about how linguistic knowledge is mentally represented. For Chomsky, the nature of such mental representations is largely innate, so if a grammatical theory has explanatory adequacy it must be able to explain the various grammatical nuances of the languages of the world as relatively minor variations in the universal pattern of human language. Chomsky argued that, even though linguists were still a long way from constructing descriptively adequate grammars, progress in terms of descriptive adequacy would only come if linguists held explanatory adequacy as their goal. In other words, real insight into the structure of individual languages could only be gained through the comparative study of a wide range of languages, on the assumption that they are all cut from the same cloth.

"I-Language" and "E-Language"
In 1986, Chomsky proposed a distinction between I-Language and E-Language, similar but not identical to the competence/performance distinction.[6] I-Language is taken to be the object of study in syntactic theory; it is the mentally represented linguistic knowledge that a native speaker of a language has, and is therefore a mental object — from this perspective, most of Linguistics is a branch of psychology. E-Language encompasses all other notions of what a language is, for example that it is a body of knowledge or behavioural habits shared by a community. Thus, E-Language is not itself a coherent concept[7], and Chomsky argues that such notions of language are not useful in the study of innate linguistic knowledge, i.e. competence, even though they may seem sensible and intuitive, and useful in other areas of study. Competence, he argues, can only be studied if languages are treated as mental objects.

Grammaticality
Further information: Grammaticality
Chomsky argued that the notions "grammatical" and "ungrammatical" could be defined in a meaningful and useful way. In contrast, an extreme behaviorist linguist would argue that language can only be studied through recordings or transcriptions of actual speech, the role of the linguist being to look for patterns in such observed speech, but not to hypothesize about why such patterns might occur, nor to label particular utterances as either "grammatical" or "ungrammatical". Although few linguists in the 1950s actually took such an extreme position, Chomsky was at an opposite extreme, defining grammaticality in an unusually (for the time) mentalistic way.[8] He argued that the intuition of a native speaker is enough to define the grammaticalness of a sentence; that is, if a particular string of English words elicits a double take, or feeling of wrongness in a native English speaker, it can be said that the string of words is ungrammatical (when various extraneous factors affecting intuitions are controlled for). This (according to Chomsky) is entirely distinct from the question of whether a sentence is meaningful, or can be understood. It is possible for a sentence to be both grammatical and meaningless, as in Chomsky's famous example "colorless green ideas sleep furiously". But such sentences manifest a linguistic problem distinct from that posed by meaningful but ungrammatical (non)-sentences such as "man the bit sandwich the", the meaning of which is fairly clear, but which no native speaker would accept as being well formed.
The use of such intuitive judgments freed syntacticians from studying language through a corpus of observed speech, since they were now able to study the grammatical properties of contrived sentences. Without this change in philosophy, the construction of generative grammars would have been almost impossible, since it is often the relatively obscure and rarely-used features of a language which give linguists clues about its structure, and it is very difficult to find good examples of such features in everyday speech.

Minimalism
Main article: Linguistic minimalism
In the mid-1990s to mid-2000s, much research in transformational grammar was inspired by Chomsky's Minimalist Program.[9] The "Minimalist Program" aims at the further development of ideas involving economy of derivation and economy of representation, which had started to become significant in the early 1990s, but were still rather peripheral aspects of TGG theory.
Economy of derivation is a principle stating that movements (i.e. transformations) only occur in order to match interpretable features with uninterpretable features. An example of an interpretable feature is the plural inflection on regular English nouns, e.g. dogs. The word dogs can only be used to refer to several dogs, not a single dog, and so this inflection contributes to meaning, making it interpretable. English verbs are inflected according to the grammatical number of their subject (e.g. "Dogs bite" vs "A dog bites"), but in most sentences this inflection just duplicates the information about number that the subject noun already has, and it is therefore uninterpretable.
Economy of representation is the principle that grammatical structures must exist for a purpose, i.e. the structure of a sentence should be no larger or more complex than required to satisfy constraints on grammaticality.
Both notions, as described here, are somewhat vague, and indeed the precise formulation of these principles is controversial.[10][11] An additional aspect of minimalist thought is the idea that the derivation of syntactic structures should be uniform; that is, rules should not be stipulated as applying at arbitrary points in a derivation, but instead apply throughout derivations. Minimalist approaches to phrase structure have resulted in "Bare Phrase Structure", an attempt to eliminate X-bar theory. In 1998, Chomsky suggested that derivations proceed in "phases". The distinction of Deep Structure vs. Surface Structure is not present in Minimalist theories of syntax, and the most recent phase-based theories also eliminate LF and PF as unitary levels of representation.

Mathematical representation
Returning to the more general mathematical notion of a grammar, an important feature of all transformational grammars is that they are more powerful than context free grammars.[12] This idea was formalized by Chomsky in the Chomsky hierarchy. Chomsky argued that it is impossible to describe the structure of natural languages using context free grammars.[13] His general position regarding the non-context-freeness of natural language has held up since then, although his specific examples regarding the inadequacy of CFGs in terms of their weak generative capacity were later disproven. [14] [15]

Transformations
The usual usage of the term 'transformation' in linguistics refers to a rule that takes an input typically called the Deep Structure (in the Standard Theory) or D-structure (in the extended standard theory or government and binding theory) and changes it in some restricted way to result in a Surface Structure (or S-structure). In TGG, Deep structures were generated by a set of phrase structure rules.
For example, a typical transformation in TG is the operation of subject-auxiliary inversion (SAI). This rule takes as its input a declarative sentence with an auxiliary, "John has eaten all the heirloom tomatoes.", and transforms it into "Has John eaten all the heirloom tomatoes?". In their original formulation (Chomsky 1957), these rules were stated as rules that held over strings of either terminals or constituent symbols or both.
X NP AUX Y → X AUX NP Y
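Read as a string-rewriting rule, SAI can be sketched in a few lines of Python over a flat list of (word, tag) pairs; the tags and the toy sentence are assumptions for illustration only, not the theory's actual formalism.

# Sketch of subject-auxiliary inversion as a string rewrite:
# X NP AUX Y  ->  X AUX NP Y, over a flat list of (word, tag) pairs.
def subject_aux_inversion(tagged):
    # look for the "... NP AUX ..." pattern and swap the two constituents
    tags = [t for _, t in tagged]
    for i in range(len(tags) - 1):
        if tags[i] == "NP" and tags[i + 1] == "AUX":
            return tagged[:i] + [tagged[i + 1], tagged[i]] + tagged[i + 2:]
    return tagged

declarative = [("John", "NP"), ("has", "AUX"),
               ("eaten", "V"), ("all the heirloom tomatoes", "NP")]
question = subject_aux_inversion(declarative)
print(" ".join(word for word, _ in question) + "?")
# -> has John eaten all the heirloom tomatoes?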
In the 1970s, by the time of the Extended Standard Theory, following the work of Joseph Emonds on structure preservation, transformations came to be viewed as holding over trees. By the end of government and binding theory in the late 1980s, transformations were no longer structure-changing operations at all; instead they added information to already existing trees by copying constituents.
The earliest conceptions of transformations were that they were construction-specific devices. For example, there was a transformation that turned active sentences into passive ones. A different transformation raised embedded subjects into main clause subject position in sentences such as "John seems to have gone"; and yet a third reordered arguments in the dative alternation. With the shift from rules to principles and constraints that was found in the 1970s, these construction specific transformations morphed into general rules (all the examples just mentioned being instances of NP movement), which eventually changed into the single general rule of move alpha or Move.
Transformations actually come in two types: (i) the post-Deep structure kind mentioned above, which are string or structure changing, and (ii) Generalized Transformations (GTs). Generalized transformations were originally proposed in the earliest forms of generative grammar (e.g. Chomsky 1957). They take small structures which are either atomic or generated by other rules, and combine them. For example, the generalized transformation of embedding would take the kernel "Dave said X" and the kernel "Dan likes smoking" and combine them into "Dave said Dan likes smoking". GTs are thus structure building rather than structure changing. In the Extended Standard Theory and government and binding theory, GTs were abandoned in favor of recursive phrase structure rules. However, they are still present in tree-adjoining grammar as the Substitution and Adjunction operations and they have recently re-emerged in mainstream generative grammar in Minimalism as the operations Merge and Move.

Phrase structure rules
Phrase-structure rules are a way to describe a given language's syntax. They are used to break a natural language sentence down into its constituent parts (also known as syntactic categories), namely phrasal categories and lexical categories (aka parts of speech). Phrasal categories include the noun phrase, verb phrase, and prepositional phrase; lexical categories include noun, verb, adjective, adverb, and many others. Phrase structure rules were commonly used in transformational grammar (TGG), although they were not an invention of TGG; rather, early TGGs added to phrase structure rules (the most obvious example being transformations; see the page transformational grammar for an overview of the development of TGG). A grammar which uses phrase structure rules is called a phrase structure grammar - except in computer science, where it is known as just a grammar, usually context-free.

Definition
Phrase structure rules are usually of the form A → B C, meaning that the constituent A is separated into the two subconstituents B and C. Some examples are:
S → NP VP
NP → Det N
The first rule reads: An S consists of an NP followed by a VP. This means a sentence consists of a noun phrase followed by a verb phrase. The next one: A noun phrase consists of a determiner followed by a noun.
Further explanations of the constituents: S, Det, NP, VP, AP, PP
Associated with phrase structure rules is a famous example of a grammatically correct sentence. The sentence was constructed by Noam Chomsky as an illustration that syntactically but not semantically correct sentences are possible.
Colorless green ideas sleep furiously can be diagrammed as a phrase tree, as below:

where S represents a grammatical sentence. The theory of antisymmetry proposed in the early '90s by Richard Kayne is an attempt to derive phrase structure from a single axiom.
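A minimal sketch of how such rules derive the example sentence, using an invented rule set and lexicon (the NP rule is flattened for simplicity rather than kept strictly binary):

# Toy rule set and lexicon (my own, for illustration only)
rules = {
    "S":  ["NP", "VP"],
    "NP": ["Adj", "Adj", "N"],   # flattened: two adjectives then a noun
    "VP": ["V", "Adv"],
}
lexicon = {
    "Adj": ["colorless", "green"],
    "N":   ["ideas"],
    "V":   ["sleep"],
    "Adv": ["furiously"],
}

def derive(symbols):
    # rewrite non-terminals left to right until only lexical categories remain
    out = []
    for s in symbols:
        if s in rules:
            out.extend(derive(rules[s]))
        else:
            out.append(s)
    return out

categories = derive(["S"])
print(categories)                 # ['Adj', 'Adj', 'N', 'V', 'Adv']

# spell each category out with the next word listed for it in the lexicon
words, used = [], {}
for c in categories:
    i = used.get(c, 0)
    words.append(lexicon[c][i])
    used[c] = i + 1
print(" ".join(words))            # colorless green ideas sleep furiously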

Alternative approaches
A number of theories of grammar dispense with the notion of phrase structure rules and operate with the notion of schema instead. Here phrase structures are not derived from rules that combine words, but from the specification or instantiation of syntactic schemata or configurations, often expressing some kind of semantic content independently of the specific words that appear in them. This approach is essentially equivalent to a system of phrase structure rules combined with a noncompositional semantic theory, since grammatical formalisms based on rewriting rules are generally equivalent in power to those based on substitution into schemata.
So, in this type of approach, instead of being derived from the application of a number of phrase structure rules, the sentence "colorless green ideas sleep furiously" would be generated by filling the words into the slots of a schema having the following structure:
(NP(ADJ N) VP(V) AP(ADV))
And which would express the following conceptual content
X DOES Y IN THE MANNER OF Z
Though they are noncompositional, such models are monotonic. This approach is highly developed within Construction grammar, and has had some influence in Head-Driven Phrase Structure Grammar and Lexical functional grammar.

Generative grammar
Main article: Generative grammar
Generative grammar hypothesizes that language is a mental structure of the human mind. The goal of generative grammar is to make a complete model of this inner-language (or i-language) which could be used to describe all human speech, and predict the grammaticality of any given speech utterance (that is, whether speech would sound correct to native speakers of the language). This approach to language was pioneered by Noam Chomsky. Most generative theories (although not all of them) assume that syntax is based in constituent structure. Generative grammars are among the theories that focus primarily on the form of the sentence rather than the function.
Among the many Chomskyan generative theories of linguistics are:
Transformational Grammar (TG) (now largely out of date)
Government and binding theory (GB) (common in the late 1970s and 1980s)
Minimalism (MP) (the most recent Chomskyan version of generative grammar)
Other theories that find their origin in the generative paradigm are:
Generative semantics (now largely out of date)
Relational grammar (RG) (now largely out of date)
Arc Pair grammar
Generalised phrase structure grammar (now largely out of date)
Head-driven phrase structure grammar
Lexical-functional grammar
HPSG and LFG also fall in the category of unification grammars.

Categorial grammar
Categorial grammar is an approach that focuses on the combinatoric properties of categories. For example, an intransitive verb has the property that it requires a noun phrase (NP) to complete it, and the result is a sentence (S); thus the category of such a verb is NP\S (in one notation).
Tree-adjoining grammar is a categorial grammar but adds partial tree structures to the categories.
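A tiny sketch of that combination (backward application of an NP\S category to an NP on its left), with a toy two-word lexicon assumed purely for illustration:

# Toy categorial combination: X  X\Y  =>  Y (backward application)
lexicon = {"John": "NP", "sleeps": r"NP\S"}

def combine(left_cat, right_cat):
    if "\\" in right_cat:
        arg, result = right_cat.split("\\", 1)
        if arg == left_cat:
            return result
    return None

cats = [lexicon[w] for w in ["John", "sleeps"]]
print(combine(cats[0], cats[1]))   # S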

Dependency grammar
Dependency grammar is a different type of approach in which structure is determined by the relation between a word (a head) and its dependents rather than being based in constituent structure.
Some dependency-based theories of syntax:
Algebraic syntax
Word grammar
Operator Grammar

Stochastic/Probabilistic grammars/Network Theories
Theoretical approaches to syntax that are based in probability theory are known as stochastic grammars. One common implementation of such an approach makes use of a Neural network or Connectionism. Some theories based in this are:
Optimality Theory
stochastic context-free grammar
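A minimal sketch of the stochastic case: each rule carries a probability, and the probability of a derivation is the product of the probabilities of the rules used. The grammar and the numbers below are invented for illustration.

# Toy stochastic (probabilistic) context-free grammar
pcfg = {
    "S":  [(["NP", "VP"], 1.0)],
    "NP": [(["ideas"], 0.6), (["dogs"], 0.4)],
    "VP": [(["sleep"], 0.7), (["bite"], 0.3)],
}

def derivation_probability(symbol, words):
    # probability of deriving `words` from `symbol` in this toy grammar:
    # the product of the probabilities of the rules used
    for expansion, p in pcfg.get(symbol, []):
        if expansion == words:
            return p
        if len(expansion) == 2 and len(words) == 2:
            return (p
                    * derivation_probability(expansion[0], words[:1])
                    * derivation_probability(expansion[1], words[1:]))
    return 0.0

print(f"{derivation_probability('S', ['ideas', 'sleep']):.2f}")  # 1.0 * 0.6 * 0.7 = 0.42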

Functionalist grammars
Functionalist theories, although concerned about form, are driven by explanation based in the function of a sentence (i.e. its communicative function). Some typical functionalist theories include:
Functional grammar (Dik)
Prague Linguistic Circle
Systemic functional grammar
Cognitive grammar
Construction grammar (CxG)
Role and reference grammar (RRG)

Functional grammar
Functional Grammar is a model of grammar motivated by functions. The model was originally developed by Simon C. Dik at the University of Amsterdam in the 1980s, and has undergone several revisions since, the latest being the integration of discourse as a major component by Kees Hengeveld. This has led to a renaming of the theory to "Functional Discourse Grammar". This type of grammar is quite distinct from systemic functional grammar as developed by Michael Halliday and many other linguists since the 1970s.

"Functions"
The notion of "function" in FG generalizes the standard distinction of grammatical functions such as subject and object. Constituents (parts of speech) of a linguistic utterance are assigned three types or levels of functions:
Semantic function (Agent, Patient, Recipient, etc.), describing the role of participants in states of affairs or actions expressed
Syntactic functions (Subject and Object), defining different perspectives in the presentation of a linguistic expression
Pragmatic functions (Theme and Tail, Topic and Focus), defining the informational status of constituents, determined by the pragmatic context of the verbal interaction
case grammar

Case Grammar is a system of linguistic analysis, focusing on the link between the valence of a verb and the grammatical context it requires, created by the American linguist Charles J. Fillmore in 1968, in the context of Transformational Grammar. This theory analyzes the surface syntactic structure of sentences by studying the combination of deep cases (i.e. semantic roles) -- Agent, Object, Benefactor, Location or Instrument -- which are required by a specific verb. For instance, the verb "give" in English requires an Agent (A), an Object (O), and a Beneficiary (B); e.g. "Jones (A) gave money (O) to the school (B)."
According to Fillmore, each verb selects a certain number of deep cases which form its case frame. Thus, a case frame describes important aspects of the semantic valency of verbs, adjectives, and nouns. Case frames are subject to certain constraints, such as that a deep case can occur only once per sentence. Some of the cases are obligatory and others are optional. Obligatory cases may not be deleted, at the risk of producing ungrammatical sentences. For example, "Mary gave the apples" is ungrammatical in this sense.
A fundamental hypothesis of case grammar is that grammatical functions, such as subject or object, are determined by the deep, semantic valence of the verb, which finds its syntactic correlate in such grammatical categories as Subject and Object, and in grammatical cases such as Nominative, Accusative, etc. Fillmore (1968) puts forward the following hierarchy for a universal subject selection rule:
Agent < Instrumental < Objective
That means that if the case frame of a verb contains an agent, this one is realized as the subject of an active sentence; otherwise, the deep case following the agent in the hierarchy (i.e. Instrumental) is promoted to subject.
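The subject-selection rule itself is easy to sketch in code: take the highest-ranked deep case present in the verb's case frame. The example frames below are illustrative assumptions, not Fillmore's own data.

# Sketch of the subject-selection hierarchy quoted above: the highest-ranked
# deep case present in a verb's case frame is realised as the subject of an
# active sentence.
HIERARCHY = ["Agent", "Instrumental", "Objective"]

def select_subject(case_frame):
    for case in HIERARCHY:
        if case in case_frame:
            return case
    return None

# "Jones gave money to the school": an Agent is present, so it becomes subject
print(select_subject({"Agent": "Jones", "Objective": "money", "Beneficiary": "the school"}))
# "The key opened the door": no Agent, so the Instrumental is promoted to subject
print(select_subject({"Instrumental": "the key", "Objective": "the door"}))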
The influence of case grammar on contemporary linguistics has been significant, to the extent that numerous linguistic theories incorporate deep roles in one or other form, such as the so-called Thematic Structure in Government and Binding theory. It has also inspired the development of frame-based representations in AI research[citation needed].
During the 1970s and the 1980s, Charles Fillmore developed his original theory into what was called Frame Semantics. Walter A. Cook, SJ, a linguistics professor at Georgetown University, was one of the foremost case grammar theoreticians following Fillmore's original work. Cook devoted most of his scholarly research from the early 1970s until the 1990s to further developing case grammar as a tool for linguistic analysis, language teaching methodology, and other applications, and was the author of several major texts and many articles on case grammar. Cook directed several doctoral dissertations applying case grammar to various areas of theoretical and applied linguistics research.

universal grammar

Universal grammar is a theory of linguistics postulating principles of grammar shared by all languages, thought to be innate to humans (linguistic nativism). It attempts to explain language acquisition in general, not describe specific languages. Universal grammar proposes a set of rules intended to explain language acquisition in child development. The application of the idea to the area of second language acquisition (SLA) is represented mainly by the McGill University linguist Lydia White.
Some students of universal grammar study a variety of grammars to abstract generalizations called linguistic universals, often in the form of "If X holds true, then Y occurs." These have been extended to a range of traits, from the phonemes found in languages, to what word orders languages choose, to why children exhibit certain linguistic behaviors. Such work has been motivated in part by the argument from the poverty of the stimulus, which its proponents take to be a problem for constructivist approaches to linguistic theory. The contrasting school of thought is known as functionalism.
History
The idea can be traced to Roger Bacon's observation that all languages are built upon a common grammar, substantially the same in all languages, even though it may undergo accidental variations, and to the 13th-century speculative grammarians who, following Bacon, postulated universal rules underlying all grammars. The concept of a universal grammar or language was at the core of the 17th-century projects for philosophical languages. Charles Darwin described language as an instinct in humans, like the upright posture.[1]
The idea rose to notability in modern linguistics with theorists such as Noam Chomsky and Richard Montague, developed in the 1950s to 1970s, as part of the "Linguistics Wars".

Chomsky's theory
Further information: Language acquisition device, Generative grammar, X-bar theory, Government and Binding, Principles and parameters, and Minimalist Program
Linguist Noam Chomsky made the argument that the human brain contains a limited set of rules for organizing language. In turn, there is an assumption that all languages have a common structural basis. This set of rules is known as universal grammar.
Speakers proficient in a language know what expressions are acceptable in their language and what expressions are unacceptable. The key puzzle is how speakers come to know the restrictions of their language, since expressions which violate those restrictions are not present in the input and are not indicated as such. This absence of negative evidence -- that is, absence of evidence that an expression is part of the class of ungrammatical sentences in one's language -- is the core of the poverty of the stimulus argument. For example, in English one cannot relate a question word like 'what' to a predicate within a relative clause (1):
(1) *What did John meet a man who sold?
Such expressions are not available to language learners: they are, by hypothesis, ungrammatical for speakers of the local language, so speakers of the local language do not utter them, nor do they point out to learners that such expressions are unacceptable. Universal grammar offers a solution to the poverty of the stimulus problem by making certain restrictions universal characteristics of human languages. Language learners are consequently never tempted to generalize in an illicit fashion.
The presence of creole languages is cited as further support for this theory. These languages were developed and formed when different societies came together and devised their own system of language. Originally these languages were pidgins and later became more mature languages that developed some sense of rules and native speakers.
The idea of universal grammar is supported by the creole languages by virtue of the fact that all or most of these languages share certain features. Syntactically, they use particles to form future and past tenses and multiple negation to deny or negate. Another similarity among creoles is that a question can be formed by changing inflection rather than changing words.

Criticism
Some linguists oppose the universal grammar theory. It is outspokenly opposed by Geoffrey Sampson, who maintains that universal grammar theories are not falsifiable, arguing that the grammatical generalizations made are simply observations about existing languages and not predictions about what is possible in a language.
Some feel that the basic assumptions of Universal Grammar are unfounded. Another way of defusing the poverty of the stimulus argument is if language learners notice the absence of classes of expressions in the input and, on this basis, hypothesize a restriction. This solution is closely related to Bayesian reasoning. Elman et al. argue that the unlearnability of languages assumed by UG is based on a too-strict, "worst-case" model of grammar.
Critics argue that the postulate of a "language acquisition device" essentially amounts to the trivial claim that languages are, in fact, learnt by humans, and that the LAD isn't a theory so much as the explanandum looking for theories.[2]
The Pirahã language has been claimed by the linguist Daniel Everett to be a counterexample to Universal Grammar, showing properties allegedly unexpected under current views of Universal Grammar. Among other things, this language is alleged to lack all evidence for recursion, including embedded clauses, as well as quantifiers and color terms.[3] Some other linguists have argued, however, that some of these properties have been misanalyzed, and that others are actually expected under current theories of Universal Grammar.[4] While most languages studied in this respect do indeed seem to share common underlying rules, research is hampered by considerable sampling bias: the most linguistically diverse areas, such as tropical Africa and the Americas, as well as the Indigenous Australian and Papuan languages, have been insufficiently studied. Furthermore, language extinction has apparently most affected those areas where most examples of unconventional languages have been found to date[citation needed].