De Quincey, Thomas. Confessions of an English Opium Eater. 1821. Rpt. New York: The Walter Scott Publishing Co., Ltd, 1886. 104 pages.
Selected Passages: Here is the classic "disclaimer" which seems, in some form or another, to have graced the beginning of every literary work dealing with sin up to the 1970s. From Moll Flanders to Naked Lunch, there was always, in the foreword or first chapter, some kind of apology, rationalization, explanation, or, in De Quincey’s case, an attempt to distance himself from the other confessions and then divert the reader’s attention entirely by slamming the French!
"Nothing, indeed, is more revolting to English feelings, than the spectacle of a human being obtruding on our notice his moral ulcers or scars, and tearing away that ‘decent drapery,’ which time, or indulgence to human frailty, may have drawn over them: accordingly, the greater part of our confessions (that is, spontaneous and extra-judicial confessions) proceed from demireps, adventurers, or swindlers: and for any such acts of gratuitous self-humiliation from those who can be supposed in sympathy with the decent and self-respecting part of society, we must look to French literature, or to that part of the German which is tainted with the spurious and defective sensibility of the French." ("To the Reader," p.xxii-xxiv)
De Quincey brings up a very good point here that most people still don’t quite understand: drugs have different effects on different types of people. The literary folks can’t help but read deeper meanings into the whole experience, whereas "regular folks" just like the way the drugs make them feel.
"If a man ‘whose talk is of oxen,’ should become an Opium-eater, the probability is, that (if he is not too dull to dream at all) - he will dream about oxen : whereas, in the case before him, the reader will find that the Opium-eater boasteth himself to be a philosopher; and accordingly, that the phantasmagoria of his dreams (waking or sleeping, day-dreams or night dreams) is suitable to one who is in that character." ("Preliminary Confessions," p.2)
Like Charles Lamb, De Quincey varies his style by occasionally slipping into anachronistic language (or dropping Latin and Greek) when he feels the urge to wax poetic. Here he bids farewell to his life of poverty.
"So then, Oxford Street, stony-hearted step-mother! thou that listenest to the sighs of orphans, and drinkest the tears of children, at length I was dismissed from thee: the time was come at last that I no more should pace in anguish thy never-ending terraces; no more should dream, and wake in captivity to the pangs of hunger. Successors, too many, to myself and Ann, have, doubtless, since then trodden in our footsteps - inheritors of our calamities : other orphans than Ann have sighed : tears have been shed by other children : and thou, Oxford Street, hast since, doubtless, echoed to the groans of innumerable hearts." ("Preliminary Confessions," p. 42)
By the time he finally gets around to really talking about opium, De Quincey delivers the most passionate writing of the entire book. Almost all of that writing is in praise of the drug.
"Oh! just, subtle, and mighty opium! that to the hearts of poor and rich alike, for the wounds that will never heal, and for ‘the pangs that tempt the spirit to rebel,’ bringest an assuaging balm; eloquent opium! that with thy potent rhetoric stealest away the purposes of wrath; to the guilty man, for one night gives back the hopes of his youth, and hands washed pure from blood; and to the proud man, a brief oblivion for ‘wrongs undressed and insults unaveng'd;’ that summonest to the chancery of dreams, for the triumphs of suffering innocence, false witnesses; and confoundest perjury; and dost reverse the sentences of unrighteous judges; - thou buildest upon the bosom of darkness, out of the fantastic imagery of the brain, cities and temples, beyond the art of Phidias and Praxiteles - beyond the splendour of Babylon and Hekatompylos; and ‘from the anarchy of dreaming sleep,’ callest into sunny light the faces of long-buried beauties, and the blessed household countenances, cleansed from the ‘dishonours of the grave.’ Thou only givest these gifts to man; and thou hast the keys of Paradise, oh, just, subtle, and mighty opium!" ("The Pleasures of Opium," p. 62-63)
A chilling portrait of how opium had insinuated itself into his daily routine.
"Whether desperate or not, however, the issue of the struggle in 1813 was what I have mentioned; and from this date, the reader is to consider me as a regular and confirmed opium-eater, of whom to ask whether on any particular day he had or had not taken opium, would be to ask whether his lungs had performed respiration, or the heart fulfilled its functions." ("Introduction to the Pains of Opium," p. 70)
De Quincey delivers some more inspired writing in his description of the happiest days of his life, which consisted of many winter hours spent sitting by the fire, reading, blissed-out on opium.
"Surely everybody is aware of the divine pleasures which attend a winter fireside; candles at four o'clock, warm hearth-rugs, tea, a fair tea-maker, shutters closed, curtains flowing in ample draperies on the floor, whilst the wind and rain are raging audibly without," ("Introduction to the Pains of Opium," p. 76)
De Quincey defends his "frankness": "You will think, perhaps, that I am too confidential and communicative of my own private history. It may be so. But my way of writing is rather to think aloud, and follow my own humours, than much to consider who is listening to me; and, if I stop to consider what is proper to be said to this or that person, I shall soon come to doubt whether any part at all is proper." ("The Pains of Opium," p. 81)
Inexplicably, De Quincey is roused from his extended opium torpor by…a book about political economics?
"At length, in 1819, a friend in Edinburgh sent me down Mr. Ricardo's book : and recurring to my own prophetic anticipation of the advent of some legislator for this science, I said, before I had finished the first chapter, ‘Thou art the man!’" ("The Pains of Opium," p. 85)
When it comes time for De Quincey to detail "The Pains of Opium," he side-steps the issue for the most part, and instead goes into long, detailed descriptions of his dreams and how extended opium use altered their character. Here is one of the insights he gained from these dreams:
"Of this, at least, I feel assured, that there is no such thing as forgetting possible to the mind; a thousand accidents may and will interpose a veil between our present consciousness and the secret inscriptions of the mind; accidents of the same sort will also rend away this veil; but alike, whether veiled or unveiled, the inscription remains for ever; just as the stars seem to withdraw before the common light of day, whereas, in fact, we all know that it is the light which is drawn over them as a veil, and that they are waiting to be revealed when the obscuring daylight shall have withdrawn." ("The Pains of Opium," p. 90)
De Quincey describes what it was like to give up drugs in the days before Betty Ford.
"I triumphed : but think not, reader, that therefore my sufferings were ended; nor think of me as one sitting in a dejected state. Think of me as of one, even when four months had passed, still agitated, writhing, throbbing, palpitating, shattered; and much, perhaps, in the situation of him who has been racked, as I collect the torments of that state from the affecting account of them left by a most innocent sufferer." ("The Pains of Opium," p. 103)
In the book’s final passage, De Quincey discusses some lingering effects of his long period of drug use. (Funny, maybe he was still experiencing these symptoms because he didn’t actually quit. Hmmm.)
"One memorial of my former condition still remains: my dreams are not yet perfectly calm : the dread swell and agitation of the storm have not wholly subsided : the legions that encamped in them are drawing off, but not all departed : my sleep is still tumultuous, and, like the gates of Paradise to our first parents when looking back from afar, it is still (in the tremendous line of Milton)- ‘with dreadful faces throng’d and fiery arms.’" ("The Pains of Opium," p. 104)
FOOTNOTES : {1} "Not yet RECORDED," I say; for there is one celebrated man of the present day, who, if all be true which is reported of him, has greatly exceeded me in quantity.
{2} A third exception might perhaps have been added; and my reason for not adding that exception is chiefly because it was only in his juvenile efforts that the writer whom I allude to expressly addressed hints to philosophical themes; his riper powers having been all dedicated (on very excusable and very intelligible grounds, under the present direction of the popular mind in England) to criticism and the Fine Arts. This reason apart, however, I doubt whether he is not rather to be considered an acute thinker than a subtle one. It is, besides, a great drawback on his mastery over philosophical subjects that he has obviously not had the advantage of a regular scholastic education: he has not read Plato in his youth (which most likely was only his misfortune), but neither has he read Kant in his manhood (which is his fault).
{3} I disclaim any allusion to EXISTING professors, of whom indeed I know only one.
{4} To this same Jew, by the way, some eighteen months afterwards, I applied again on the same business; and, dating at that time from a respectable college, I was fortunate enough to gain his serious attention to my proposals. My necessities had not arisen from any extravagance or youthful levities (these my habits and the nature of my pleasures raised me far above), but simply from the vindictive malice of my guardian, who, when he found himself no longer able to prevent me from going to the university, had, as a parting token of his good nature, refused to sign an order for granting me a shilling beyond the allowance made to me at school--viz., 100 pounds per annum. Upon this sum it was in my time barely possible to have lived in college, and not possible to a man who, though above the paltry affectation of ostentatious disregard for money, and without any expensive tastes, confided nevertheless rather too much in servants, and did not delight in the petty details of minute economy. I soon, therefore, became embarrassed, and at length, after a most voluminous negotiation with the Jew (some parts of which, if I had leisure to rehearse them, would greatly amuse my readers), I was put in possession of the sum I asked for, on the "regular" terms of paying the Jew seventeen and a half per cent. by way of annuity on all the money furnished; Israel, on his part, graciously resuming no more than about ninety guineas of the said money, on account of an attorney's bill (for what services, to whom rendered, and when, whether at the siege of Jerusalem, at the building of the second Temple, or on some earlier occasion, I have not yet been able to discover). How many perches this bill measured I really forget; but I still keep it in a cabinet of natural curiosities, and some time or other I believe I shall present it to the British Museum.
{5} The Bristol mail is the best appointed in the Kingdom, owing to the double advantages of an unusually good road and of an extra sum for the expenses subscribed by the Bristol merchants.
{6} It will be objected that many men, of the highest rank and wealth, have in our own day, as well as throughout our history, been amongst the foremost in courting danger in battle. True; but this is not the case supposed; long familiarity with power has to them deadened its effect and its attractions.
{7} (Greek text) / {8} (Greek text). EURIPIDES. Orestes. / {9} (Greek text)
{10} (Greek text). The scholar will know that throughout this passage I refer to the early scenes of the Orestes; one of the most beautiful exhibitions of the domestic affections which even the dramas of Euripides can furnish. To the English reader it may be necessary to say that the situation at the opening of the drama is that of a brother attended only by his sister during the demoniacal possession of a suffering conscience (or, in the mythology of the play, haunted by the Furies), and in circumstances of immediate danger from enemies, and of desertion or cold regard from nominal friends.
{11} EVANESCED: this way of going off the stage of life appears to have been well known in the 17th century, but at that time to have been considered a peculiar privilege of blood-royal, and by no means to be allowed to druggists. For about the year 1686 a poet of rather ominous name (and who, by-the-bye, did ample justice to his name), viz., Mr. FLAT-MAN, in speaking of the death of Charles II. expresses his surprise that any prince should commit so absurd an act as dying, because, says he, "Kings should disdain to die, and only DISAPPEAR." They should ABSCOND, that is, into the other world.
{12} Of this, however, the learned appear latterly to have doubted; for in a pirated edition of Buchan's Domestic Medicine, which I once saw in the hands of a farmer's wife, who was studying it for the benefit of her health, the Doctor was made to say--"Be particularly careful never to take above five-and-twenty OUNCES of laudanum at once;" the true reading being probably five-and-twenty DROPS, which are held equal to about one grain of crude opium.
{13} Amongst the great herd of travellers, etc., who show sufficiently by their stupidity that they never held any intercourse with opium, I must caution my readers specially against the brilliant author of Anastasius. This gentleman, whose wit would lead one to presume him an opium-eater, has made it impossible to consider him in that character, from the grievous misrepresentation which he gives of its effects at pp. 215-17 of vol. i. Upon consideration it must appear such to the author himself, for, waiving the errors I have insisted on in the text, which (and others) are adopted in the fullest manner, he will himself admit that an old gentleman "with a snow-white beard," who eats "ample doses of opium," and is yet able to deliver what is meant and received as very weighty counsel on the bad effects of that practice, is but an indifferent evidence that opium either kills people prematurely or sends them into a madhouse. But for my part, I see into this old gentleman and his motives: the fact is, he was enamoured of "the little golden receptacle of the pernicious drug" which Anastasius carried about him; and no way of obtaining it so safe and so feasible occurred as that of frightening its owner out of his wits (which, by the bye, are none of the strongest). This commentary throws a new light upon the case, and greatly improves it as a story; for the old gentleman's speech, considered as a lecture on pharmacy, is highly absurd; but considered as a hoax on Anastasius, it reads excellently.
{14} I have not the book at this moment to consult; but I think the passage begins--"And even that tavern music, which makes one man merry, another mad, in me strikes a deep fit of devotion," etc.
{15} A handsome newsroom, of which I was very politely made free in passing through Manchester by several gentlemen of that place, is called, I think, The Porch; whence I, who am a stranger in Manchester, inferred that the subscribers meant to profess themselves followers of Zeno. But I have been since assured that this is a mistake.
{16} I here reckon twenty-five drops of laudanum as equivalent to one grain of opium, which, I believe, is the common estimate. However, as both may be considered variable quantities (the crude opium varying much in strength, and the tincture still more), I suppose that no infinitesimal accuracy can be had in such a calculation. Teaspoons vary as much in size as opium in strength. Small ones hold about 100 drops; so that 8,000 drops are about eighty times a teaspoonful. The reader sees how much I kept within Dr. Buchan's indulgent allowance.
{17} This, however, is not a necessary conclusion; the varieties of effect produced by opium on different constitutions are infinite. A London magistrate (Harriott's Struggles through Life, vol. iii. p. 391, third edition) has recorded that, on the first occasion of his trying laudanum for the gout he took FORTY drops, the next night SIXTY, and on the fifth night EIGHTY, without any effect whatever; and this at an advanced age. I have an anecdote from a country surgeon, however, which sinks Mr. Harriott's case into a trifle; and in my projected medical treatise on opium, which I will publish provided the College of Surgeons will pay me for enlightening their benighted understandings upon this subject, I will relate it; but it is far too good a story to be published gratis.
{18} See the common accounts in any Eastern traveller or voyager of the frantic excesses committed by Malays who have taken opium, or are reduced to desperation by ill-luck at gambling.
{19} The reader must remember what I here mean by THINKING, because else this would be a very presumptuous expression. England, of late, has been rich to excess in fine thinkers, in the departments of creative and combining thought; but there is a sad dearth of masculine thinkers in any analytic path. A Scotchman of eminent name has lately told us that he is obliged to quit even mathematics for want of encouragement.
{20} William Lithgow. His book (Travels, etc.) is ill and pedantically written; but the account of his own sufferings on the rack at Malaga is overpoweringly affecting.
{21} In saying this I mean no disrespect to the individual house, as the reader will understand when I tell him that, with the exception of one or two princely mansions, and some few inferior ones that have been coated with Roman cement, I am not acquainted with any house in this mountainous district which is wholly waterproof. The architecture of books, I flatter myself, is conducted on just principles in this country; but for any other architecture, it is in a barbarous state, and what is worse, in a retrograde state.
The Romantic Critical Imagination: Thomas De Quincey's Confessions of an English Opium Eater (Part One), examined for its understanding and social criticism of drugs, and for other specific aspects of early nineteenth-century British society. You may wish to read Sigmund Freud's Cocaine Papers for a comparative study of two fine minds, one Romantic and the other modern, thinking about drugs.
Coolie (1936) by Mulk Raj Anand gives a clear and poignant description of the poor face of India, telling the story of a 15-year-old boy who has to work as a child labourer and eventually dies of tuberculosis. It is the story of Munoo, who is forced to leave his village out of necessity and poverty to work in the city as a child labourer. The novel shows his adventures and escapades as he works as a servant, factory worker, and rickshaw puller far away from his home. The story is told through the eyes of the narrator and brings to light the inevitable and hidden evils of the Raj: exploitation, a caste-ridden society, communal riots, and police injustice. The novel takes us to different places and cities, showing the inhuman and degrading treatment that poor Munoo receives at the hands of the socially, economically, and politically affluent higher classes of Indian society, and how he copes with all circumstances alone. Anand struck a chord in the hearts of conscientious Indians with his beautiful and true-to-life portrayal of the downtrodden masses of Indian society, the so-called have-nots. Mulk Raj Anand was much appreciated and recognized for this novel, and he was one of those writers who were deeply influenced by Mahatma Gandhi; this influence is clearly seen in all his works, including Coolie. True to his Marxist spirit, he always portrayed the real India, and more specifically the poor India. He is also regarded as one of the first Indian writers in English to use Hindi and Punjabi phrases in his writing to enrich and enhance the language. Called the Charles Dickens of India by many literary figures, Anand wrote novels that deal with the underdog. Coolie is a classic example: the story of the underprivileged class of society, of oppressed people who cannot even make ends meet.
Guidance
The comprehensive school counseling program refers to a sequential, developmental program designed to benefit all students in preparation for their futures. Such a program includes a curriculum organized around three areas essential for students' growth and development: Academic Development, Career Development, and Personal/Social Development.
Students will: demonstrate a positive attitude toward self as a unique and worthy person; gain life-planning skills that are consistent with their needs, interests, and abilities; develop responsible social skills and an understanding and appreciation of being a contributing member of society; and demonstrate an understanding and appreciation of the life-long process of learning, growing, and changing. Activities and strategies for achieving identified student outcomes in these three areas can be integrated across the curriculum by teachers and counselors. A goal for this guide is to illustrate the connectivity between the National Standards, the ABCs Goals, the SCANS, and the National Career Development Guidelines. This Guidance Curriculum for a Comprehensive School Counseling Program is student-centered and teacher-friendly. Counselors should use it as a blueprint for collaboratively building a sequential and developmentally appropriate school counseling program.
Behaviourism
Behavioural (or "behavioral") theory in psychology is a very substantial field: follow the links to the left or right for introductions to some of its more detailed contributions impinging on how people learn in the real world. How I have the effrontery to produce a single page on it amazes even me, whatever my reservations about it!
Behaviourism is primarily associated with Pavlov (classical conditioning) in Russia and with Thorndike, Watson and particularly Skinner in the United States (operant conditioning).
Behaviourism is dominated by the constraints of its (naïve) attempts to emulate the physical sciences, which entails a refusal to speculate about what happens inside the organism. Anything which relaxes this requirement slips into the cognitive realm.
Much behaviourist experimentation is undertaken with animals and generalised.
In educational settings, behaviourism implies the dominance of the teacher, as in behaviour modification programmes. It can, however, be applied to an understanding of unintended learning.
For our purposes, behaviourism is relevant mainly to:
Skill development, and
The "substrate" (or "conditions", as Gagné puts it) of learning
If you want to follow your own links, use "behaviorism" (sic.). Most of the material is US-based, and "behaviorism" and "behaviorist" are how they spell it; I freely admit that this side-bar is purely to get the stupid search engine "bots" to register "behavior".
Classical conditioning:
is the process of reflex learning—investigated by Pavlov—through which an unconditioned stimulus (e.g. food) which produces an unconditioned response (salivation) is presented together with a conditioned stimulus (a bell), such that the salivation is eventually produced on the presentation of the conditioned stimulus alone, thus becoming a conditioned response.
This is a disciplined account of our common-sense experience of learning by association (or "contiguity", in the jargon), although that is often much more complex than a reflex process, and is much exploited in advertising. Note that it does not depend on us doing anything.
Such associations can be chained and generalised (for better or for worse): thus "smell of baking" associates with "kitchen at home in childhood" associates with "love and care". (Smell creates potent conditioning because of the way it is perceived by the brain.) But "sitting at a desk" associates with "classroom at school" and hence perhaps with "humiliation and failure"...
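As a toy illustration (not part of Pavlov's own formulation), the build-up of a conditioned response through repeated pairing can be sketched in a few lines of Python; the asymptotic learning rule and the 0.3 rate are purely illustrative assumptions:

```python
# Toy model of classical conditioning: pairing a conditioned stimulus
# (bell) with an unconditioned stimulus (food) strengthens an association
# until the bell alone evokes the response.

class ConditioningModel:
    def __init__(self, learning_rate=0.3):
        self.learning_rate = learning_rate
        self.association = 0.0   # strength of the CS -> response link

    def pair(self, trials):
        """Present the CS together with the US for a number of trials."""
        for _ in range(trials):
            self.association += self.learning_rate * (1.0 - self.association)

    def responds_to_cs_alone(self, threshold=0.5):
        """Has the CS become a conditioned stimulus in its own right?"""
        return self.association > threshold

dog = ConditioningModel()
print(dog.responds_to_cs_alone())   # False: no pairing yet
dog.pair(trials=10)
print(dog.responds_to_cs_alone())   # True: conditioned response established
```

Note that, as the text says, nothing here depends on the organism doing anything: the association grows purely from co-presentation.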
More on Pavlov
Operant Conditioning
If, when an organism emits a behaviour (does something), the consequences of that behaviour are reinforcing, it is more likely to emit (do) it again. What counts as reinforcement, of course, is based on the evidence of the repeated behaviour, which makes the whole argument rather circular.
Learning is really about the increased probability of a behaviour based on reinforcement which has taken place in the past, so that the antecedents of the new behaviour include the consequences of previous behaviour.
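The idea that reinforcement raises the future probability of a behaviour, and that withdrawing it produces gradual extinction, can be sketched as a toy simulation; the update rule and the 0.2 step size are illustrative assumptions, not Skinner's formulae:

```python
# Toy operant-conditioning loop: reinforcement raises the probability
# that the behaviour is emitted again; withdrawing reinforcement lets it
# decay (extinction).

def reinforce(probability, reward, step=0.2):
    """Return the updated emission probability after one consequence."""
    if reward:
        return min(1.0, probability + step * (1.0 - probability))
    return max(0.0, probability - step * probability)

p = 0.1                                # initially unlikely behaviour
for _ in range(20):
    p = reinforce(p, reward=True)      # every emission is reinforced
print(p > 0.95)                        # True: behaviour now highly probable

for _ in range(20):
    p = reinforce(p, reward=False)     # reinforcement withdrawn
print(p < 0.05)                        # True: behaviour close to extinction
```

The circularity the text complains about is visible here: "reward" is whatever makes the probability go up.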
Summary of Skinner's ideas
On operant conditioning
Skinner's own account
Wikipedia on operant conditioning
The schedule of reinforcement of behaviour is central to the management of effective learning on this basis, and working it out is a very skilled procedure: simply reinforcing every instance of desired behaviour is just bribery, not the promotion of learning.
Withdrawal of reinforcement eventually leads to the extinction of the behaviour, except in some special cases such as anticipatory-avoidance learning.
Behaviorism
Definition
Behaviorism is a theory of animal and human learning that only focuses on objectively observable behaviors and discounts mental activities. Behavior theorists define learning as nothing more than the acquisition of new behavior.
Discussion
Experiments by behaviorists identify conditioning as a universal learning process. There are two different types of conditioning, each yielding a different behavioral pattern:
Classic conditioning occurs when a natural reflex responds to a stimulus. The most popular example is Pavlov's observation that dogs salivate when they eat or even see food. Essentially, animals and people are biologically "wired" so that a certain stimulus will produce a specific response.
Behavioral or operant conditioning occurs when a response to a stimulus is reinforced. Basically, operant conditioning is a simple feedback system: If a reward or reinforcement follows the response to a stimulus, then the response becomes more probable in the future. For example, leading behaviorist B.F. Skinner used reinforcement techniques to teach pigeons to dance and bowl a ball in a mini-alley.
There have been many criticisms of behaviorism, including the following:
Behaviorism does not account for all kinds of learning, since it disregards the activities of the mind.
Behaviorism does not explain some learning--such as the recognition of new language patterns by young children--for which there is no reinforcement mechanism.
Research has shown that animals adapt their reinforced patterns to new information. For instance, a rat can shift its behavior to respond to changes in the layout of a maze it had previously mastered through reinforcements.
How Behaviorism Impacts Learning
This theory is relatively simple to understand because it relies only on observable behavior and describes several universal laws of behavior. Its positive and negative reinforcement techniques can be very effective--both in animals, and in treatments for human disorders such as autism and antisocial behavior. Behaviorism often is used by teachers, who reward or punish student behaviors.
Piaget
Definition
Swiss biologist and psychologist Jean Piaget (1896-1980) is renowned for constructing a highly influential model of child development and learning. Piaget's theory is based on the idea that the developing child builds cognitive structures--in other words, mental "maps," schemes, or networked concepts for understanding and responding to physical experiences within his or her environment. Piaget further attested that a child's cognitive structure increases in sophistication with development, moving from a few innate reflexes such as crying and sucking to highly complex mental activities.
Discussion
Piaget's theory identifies four developmental stages and the processes by which children progress through them. The four stages are:
Sensorimotor stage (birth - 2 years old)--The child, through physical interaction with his or her environment, builds a set of concepts about reality and how it works. This is the stage where a child does not know that physical objects remain in existence even when out of sight (object permanence).
Preoperational stage (ages 2-7)--The child is not yet able to conceptualize abstractly and needs concrete physical situations.
Concrete operations (ages 7-11)--As physical experience accumulates, the child starts to conceptualize, creating logical structures that explain his or her physical experiences. Abstract problem solving is also possible at this stage. For example, arithmetic equations can be solved with numbers, not just with objects.
Formal operations (beginning at ages 11-15)--By this point, the child's cognitive structures are like those of an adult and include conceptual reasoning.
Piaget outlined several principles for building cognitive structures. During all development stages, the child experiences his or her environment using whatever mental maps he or she has constructed so far. If the experience is a repeated one, it fits easily--or is assimilated--into the child's cognitive structure so that he or she maintains mental "equilibrium." If the experience is different or new, the child loses equilibrium, and alters his or her cognitive structure to accommodate the new conditions. This way, the child erects more and more adequate cognitive structures.
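Piaget's assimilation/accommodation cycle, as described above, can be caricatured in code; representing a "scheme" as a bare label is of course a drastic simplification made only for illustration:

```python
# Caricature of assimilation and accommodation: familiar experiences fit
# the child's existing schemes; novel ones disturb equilibrium and force
# the scheme set to grow.

class Child:
    def __init__(self):
        self.schemes = {"crying", "sucking"}   # a few innate reflexes

    def experience(self, event):
        if event in self.schemes:
            return "assimilated"               # fits the current structure
        self.schemes.add(event)                # structure altered to fit
        return "accommodated"                  # equilibrium restored

child = Child()
print(child.experience("sucking"))        # assimilated
print(child.experience("object hidden"))  # accommodated: a new scheme
print(child.experience("object hidden"))  # assimilated on the next encounter
```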
How Piaget's Theory Impacts Learning
Curriculum--Educators must plan a developmentally appropriate curriculum that enhances their students' logical and conceptual growth.
Instruction--Teachers must emphasize the critical role that experiences--or interactions with the surrounding environment--play in student learning. For example, instructors have to take into account the role that fundamental concepts, such as the permanence of objects, play in establishing cognitive structures.
Government and binding theory
From Wikipedia, the free encyclopedia
Government and binding is a theory of syntax in the tradition of transformational grammar developed principally by Noam Chomsky in the 1980s.[1][2][3] This theory is a radical revision of his earlier theories [4][5][6] and was later revised in The Minimalist Program (1995)[7] and several subsequent papers, the latest being Three Factors in Language Design (2005).[8] Although there is a large literature on government and binding theory which is not written by Chomsky, Chomsky's papers have been foundational in setting the research agenda.
The name refers to two central subtheories of the theory: government, which is an abstract syntactic relation, and binding, which deals with the referents of pronouns, anaphors, and R-expressions. GB was the first theory to be based on the principles and parameters model of language, which also underlies the later developments of the Minimalist Program.
Government
The main application of the government relation concerns the assignment of case. Government is defined as follows:
A governs B if and only if
A is a governor and
A m-commands B and
no barrier intervenes between A and B.
Governors are heads of the lexical categories (V, N, A, P) and tensed I (T). A m-commands B if A does not dominate B, B does not dominate A, and the first maximal projection of A dominates B. The maximal projection of a head X is XP. This means that, for example, where A is a head and B sits inside A's complement, A m-commands B, but B does not m-command A.
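Under the definitions just given, m-command can be checked mechanically on a tree. The sketch below hand-builds a tiny structure in which A is a head and B sits inside A's complement; treating any node whose label ends in "P" as a maximal projection is a simplifying assumption of this sketch:

```python
# Checking m-command on a hand-built tree. A m-commands B iff neither
# dominates the other and the first maximal projection above A dominates B.

class Node:
    def __init__(self, label, parent=None):
        self.label, self.parent = label, parent

    def dominates(self, other):
        n = other.parent
        while n is not None:
            if n is self:
                return True
            n = n.parent
        return False

def m_commands(a, b):
    if a.dominates(b) or b.dominates(a):
        return False
    n = a.parent
    while n is not None and not n.label.endswith("P"):
        n = n.parent                 # climb to A's first maximal projection
    return n is not None and n.dominates(b)

# [VP [V' A [NP B]]]: A is the head, B sits inside A's complement NP.
vp   = Node("VP")
vbar = Node("V'", vp)
a    = Node("A", vbar)
np_  = Node("NP", vbar)
b    = Node("B", np_)

print(m_commands(a, b))   # True: VP, A's maximal projection, dominates B
print(m_commands(b, a))   # False: B's first projection is NP, which excludes A
```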
In addition, barrier is defined as follows:[9] A barrier is any node Z such that
Z is a potential governor for B and
Z c-commands B and
Z does not c-command A
The government relation makes case assignment unambiguous: in a tree diagram of a clause, each DP is governed, and assigned case, by its governing head.
Another important application of the government relation is to constrain the occurrence and identity of traces, since the Empty Category Principle requires them to be properly governed.
Binding
Binding can be defined as follows:
An element α binds an element β if and only if α c-commands β, and α and β are co-referent.
Consider the sentence "John saw his mother", which can be diagrammed using simple phrase structure rules.
"John" c-commands "his" because the first non-trivial parent of "John", S, contains "his". "John" and "his" are also co-referent (they refer to the same person), therefore "John" binds "his".
On the other hand, in the sentence "A friend of John saw his mother", "John" does not c-command "his", so they have no binding relationship, regardless of whether they are co-referent (which they may be; the example is ambiguous).
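These c-command judgments can be reproduced mechanically. The sketch below encodes "John saw his mother" as a nested list and applies the standard definition (A c-commands B iff neither dominates the other and A's first branching, i.e. non-trivial, ancestor dominates B); the particular bracketing chosen for the tree is an assumption made for illustration:

```python
# Reproducing c-command for "John saw his mother".
# The tree is a nested list: [label, child, child, ...].

tree = ["S",
        ["NP", "John"],
        ["VP", ["V", "saw"],
               ["NP", ["Det", "his"], ["N", "mother"]]]]

def find_path(node, word, path=()):
    """Path of child indices from the root down to the given word."""
    if isinstance(node, str):
        return path if node == word else None
    for i, child in enumerate(node[1:], 1):
        hit = find_path(child, word, path + (i,))
        if hit is not None:
            return hit
    return None

def node_at(tree, path):
    n = tree
    for i in path:
        n = n[i]
    return n

def c_commands(tree, a, b):
    pa, pb = find_path(tree, a), find_path(tree, b)
    if pa[:len(pb)] == pb or pb[:len(pa)] == pa:
        return False                        # one dominates the other
    k = len(pa) - 1
    while k > 0 and len(node_at(tree, pa[:k])) <= 2:
        k -= 1                              # skip non-branching ancestors
    return pb[:k] == pa[:k]                 # branching ancestor contains B?

print(c_commands(tree, "John", "his"))  # True: S contains "his"
print(c_commands(tree, "his", "John"))  # False: no binding relationship
```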
The importance of binding is shown in the grammaticality of the following sentences:
*Johni saw himi. (ungrammatical with co-reference)
John saw himself. (unambiguously co-referent)
*Himself saw John. (ungrammatical)
*Johni saw Johni. (ungrammatical, unless it refers to two distinct Johns)
Binding is used, along with particular binding principles, to explain the ungrammaticality of those statements. The applicable rules are called Binding Principle A, Binding Principle B, and Binding Principle C.
Principle A states that anaphors (reflexives and reciprocals, such as "each other") must always be bound in their domains. Since there is nothing to bind "himself" in sentence [3], that principle is violated, and the sentence is ungrammatical.
Principle B states that a pronoun must never be bound within its domain. If, in sentence [1], "John" and "him" are co-referent, then there is a binding relationship between them, violating the principle and resulting in ungrammaticality.
Principle C states that R-expressions must never be bound. R-expressions are referential expressions: non-pronoun, uniquely identifiable entities, such as "the dog", or proper names such as "John". In sentence [4], the first instance of "John" binds the second, resulting in the ungrammaticality.
Note that Principles A and B refer to domains. It is difficult to define a domain in a way that explains all the data, though the definition may be related to movement islands and the Phase Impenetrability Constraint.
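Setting aside the hard problem of computing domains, the three principles themselves reduce to a small decision table. In the sketch below, whether an element is bound, and whether it is bound within its local domain, are taken as given inputs rather than computed from structure:

```python
# The three binding principles as a decision table. Computing domains is
# the hard part this sketch deliberately skips.

def licensed(kind, bound_in_domain, bound_at_all):
    if kind == "anaphor":         # Principle A: bound in its domain
        return bound_in_domain
    if kind == "pronoun":         # Principle B: free in its domain
        return not bound_in_domain
    if kind == "r-expression":    # Principle C: free everywhere
        return not bound_at_all
    raise ValueError(f"unknown kind: {kind}")

print(licensed("anaphor", True, True))        # "John saw himself."       -> True
print(licensed("pronoun", True, True))        # "John saw him." (co-ref)  -> False
print(licensed("anaphor", False, False))      # "Himself saw John."       -> False
print(licensed("r-expression", False, True))  # "John saw John." (co-ref) -> False
```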
Transformational grammar
In linguistics, a transformational grammar, or transformational-generative grammar (TGG), is a generative grammar, especially of a natural language, that has been developed in a Chomskyan tradition. Additionally, transformational grammar is the Chomskyan tradition that gives rise to specific transformational grammars. Much current research in transformational grammar is inspired by Chomsky's Minimalist Program.
Deep and surface:
In 1957, Noam Chomsky published Syntactic Structures, in which he developed the idea that each sentence in a language has two levels of representation — a deep structure and a surface structure.[2] [3] The deep structure represented the core semantic relations of a sentence, and was mapped on to the surface structure (which followed the phonological form of the sentence very closely) via transformations. Chomsky believed that there would be considerable similarities between languages' deep structures, and that these structures would reveal properties, common to all languages, which were concealed by their surface structures. However, this was perhaps not the central motivation for introducing deep structure. Transformations had been proposed prior to the development of deep structure as a means of increasing the mathematical and descriptive power of Context-free grammars. Similarly, deep structure was devised largely for technical reasons relating to early semantic theory. Chomsky emphasizes the importance of modern formal mathematical devices in the development of grammatical theory:
But the fundamental reason for [the] inadequacy of traditional grammars is a more technical one. Although it was well understood that linguistic processes are in some sense "creative", the technical devices for expressing a system of recursive processes were simply not available until much more recently. In fact, a real understanding of how a language can (in Humboldt's words) "make infinite use of finite means" has developed only within the last thirty years, in the course of studies in the foundations of mathematics.
(Aspects of the Theory of Syntax, p. 8 [2])
Development of basic concepts
Though transformations continue to be important in Chomsky's current theories, he has now abandoned the original notion of Deep Structure and Surface Structure. Initially, two additional levels of representation were introduced (LF — Logical Form, and PF — Phonetic Form), and then in the 1990s Chomsky sketched out a new program of research known as Minimalism, in which Deep Structure and Surface Structure no longer featured and PF and LF remained as the only levels of representation.
To complicate the understanding of the development of Noam Chomsky's theories, the precise meanings of Deep Structure and Surface Structure have changed over time — by the 1970s, the two were normally referred to simply as D-Structure and S-Structure by Chomskian linguists. In particular, the idea that the meaning of a sentence was determined by its Deep Structure (taken to its logical conclusions by the generative semanticists during the same period) was dropped for good by Chomskian linguists when LF took over this role (previously, Chomsky and Ray Jackendoff had begun to argue that meaning was determined by both Deep and Surface Structure).
Innate linguistic knowledge
Terms such as "transformation" can give the impression that theories of transformational generative grammar are intended as a model for the processes through which the human mind constructs and understands sentences. Chomsky is clear that this is not in fact the case: a generative grammar models only the knowledge that underlies the human ability to speak and understand. One of the most important of Chomsky's ideas is that most of this knowledge is innate, with the result that a baby can have a large body of prior knowledge about the structure of language in general, and need only actually learn the idiosyncratic features of the language(s) it is exposed to. Chomsky was not the first person to suggest that all languages had certain fundamental things in common (he quotes philosophers writing several centuries ago who had the same basic idea), but he helped to make the innateness theory respectable after a period dominated by more behaviorist attitudes towards language. Perhaps more significantly, he made concrete and technically sophisticated proposals about the structure of language, and made important proposals regarding how the success of grammatical theories should be evaluated.
Chomsky goes so far as to suggest that a baby need not learn any actual rules specific to a particular language at all. Rather, all languages are presumed to follow the same set of rules, but the effects of these rules and the interactions between them can vary greatly depending on the values of certain universal linguistic parameters. This is a very strong assumption, and is one of the most subtle ways in which Chomsky's current theory of language differs from most others.
Grammatical theories
In the 1960s, Chomsky introduced two central ideas relevant to the construction and evaluation of grammatical theories. The first was the distinction between competence and performance. Chomsky noted the obvious fact that people, when speaking in the real world, often make linguistic errors (e.g. starting a sentence and then abandoning it midway through). He argued that these errors in linguistic performance were irrelevant to the study of linguistic competence (the knowledge that allows people to construct and understand grammatical sentences). Consequently, the linguist can study an idealised version of language, greatly simplifying linguistic analysis (see the "Grammaticalness" section below). The second idea related directly to the evaluation of theories of grammar. Chomsky made a distinction between grammars which achieved descriptive adequacy and those which went further and achieved explanatory adequacy. A descriptively adequate grammar for a particular language defines the (infinite) set of grammatical sentences in that language; that is, it describes the language in its entirety. A grammar which achieves explanatory adequacy has the additional property that it gives an insight into the underlying linguistic structures in the human mind; that is, it does not merely describe the grammar of a language, but makes predictions about how linguistic knowledge is mentally represented. For Chomsky, the nature of such mental representations is largely innate, so if a grammatical theory has explanatory adequacy it must be able to explain the various grammatical nuances of the languages of the world as relatively minor variations in the universal pattern of human language. Chomsky argued that, even though linguists were still a long way from constructing descriptively adequate grammars, progress in terms of descriptive adequacy would only come if linguists held explanatory adequacy as their goal. 
In other words, real insight into the structure of individual languages could only be gained through the comparative study of a wide range of languages, on the assumption that they are all cut from the same cloth.
"I-Language" and "E-Language"
In 1986, Chomsky proposed a distinction between I-Language and E-Language, similar but not identical to the competence/performance distinction.[6] I-Language is taken to be the object of study in syntactic theory; it is the mentally represented linguistic knowledge that a native speaker of a language has, and is therefore a mental object — from this perspective, most of Linguistics is a branch of psychology. E-Language encompasses all other notions of what a language is, for example that it is a body of knowledge or behavioural habits shared by a community. Thus, E-Language is not itself a coherent concept[7], and Chomsky argues that such notions of language are not useful in the study of innate linguistic knowledge, i.e. competence, even though they may seem sensible and intuitive, and useful in other areas of study. Competence, he argues, can only be studied if languages are treated as mental objects.
Grammaticality
Chomsky argued that the notions "grammatical" and "ungrammatical" could be defined in a meaningful and useful way. In contrast, an extreme behaviorist linguist would argue that language can only be studied through recordings or transcriptions of actual speech, the role of the linguist being to look for patterns in such observed speech, but not to hypothesize about why such patterns might occur, nor to label particular utterances as either "grammatical" or "ungrammatical". Although few linguists in the 1950s actually took such an extreme position, Chomsky was at an opposite extreme, defining grammaticality in an unusually (for the time) mentalistic way.[8] He argued that the intuition of a native speaker is enough to define the grammaticalness of a sentence; that is, if a particular string of English words elicits a double take, or feeling of wrongness in a native English speaker, it can be said that the string of words is ungrammatical (when various extraneous factors affecting intuitions are controlled for). This (according to Chomsky) is entirely distinct from the question of whether a sentence is meaningful, or can be understood. It is possible for a sentence to be both grammatical and meaningless, as in Chomsky's famous example "colorless green ideas sleep furiously". But such sentences manifest a linguistic problem distinct from that posed by meaningful but ungrammatical (non)-sentences such as "man the bit sandwich the", the meaning of which is fairly clear, but which no native speaker would accept as being well formed.
The use of such intuitive judgments freed syntacticians from studying language through a corpus of observed speech, since they were now able to study the grammatical properties of contrived sentences. Without this change in philosophy, the construction of generative grammars would have been almost impossible, since it is often the relatively obscure and rarely-used features of a language which give linguists clues about its structure, and it is very difficult to find good examples of such features in everyday speech.
Minimalism
In the mid-1990s to mid-2000s, much research in transformational grammar was inspired by Chomsky's Minimalist Program.[9] The "Minimalist Program" aims at the further development of ideas involving economy of derivation and economy of representation, which had started to become significant in the early 1990s, but were still rather peripheral aspects of TGG theory.
Economy of derivation is a principle stating that movements (i.e. transformations) only occur in order to match interpretable features with uninterpretable features. An example of an interpretable feature is the plural inflection on regular English nouns, e.g. dogs. The word dogs can only be used to refer to several dogs, not a single dog, and so this inflection contributes to meaning, making it interpretable. English verbs are inflected according to the grammatical number of their subject (e.g. "Dogs bite" vs "A dog bites"), but in most sentences this inflection just duplicates the information about number that the subject noun already has, and it is therefore uninterpretable.
Economy of representation is the principle that grammatical structures must exist for a purpose, i.e. the structure of a sentence should be no larger or more complex than required to satisfy constraints on grammaticality.
Both notions, as described here, are somewhat vague, and indeed the precise formulation of these principles is controversial.[10][11] An additional aspect of minimalist thought is the idea that the derivation of syntactic structures should be uniform; that is, rules should not be stipulated as applying at arbitrary points in a derivation, but instead apply throughout derivations. Minimalist approaches to phrase structure have resulted in "Bare Phrase Structure", an attempt to eliminate X-bar theory. In 1998, Chomsky suggested that derivations proceed in "phases". The distinction of Deep Structure vs. Surface Structure is not present in Minimalist theories of syntax, and the most recent phase-based theories also eliminate LF and PF as unitary levels of representation.
Mathematical representation
Returning to the more general mathematical notion of a grammar, an important feature of all transformational grammars is that they are more powerful than context-free grammars.[12] This idea was formalized by Chomsky in the Chomsky hierarchy. Chomsky argued that it is impossible to describe the structure of natural languages using context-free grammars.[13] His general position regarding the non-context-freeness of natural language has held up since then, although his specific examples regarding the inadequacy of CFGs in terms of their weak generative capacity were later disproven.[14][15]
Transformations
The usual usage of the term 'transformation' in linguistics refers to a rule that takes an input typically called the Deep Structure (in the Standard Theory) or D-structure (in the extended standard theory or government and binding theory) and changes it in some restricted way to result in a Surface Structure (or S-structure). In TGG, Deep structures were generated by a set of phrase structure rules.
For example, a typical transformation in TG is the operation of subject-auxiliary inversion (SAI). This rule takes as its input a declarative sentence with an auxiliary, such as "John has eaten all the heirloom tomatoes", and transforms it into "Has John eaten all the heirloom tomatoes?". In their original formulation (Chomsky 1957), these rules were stated as rules that held over strings of either terminals or constituent symbols or both.
X NP AUX Y → X AUX NP Y
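As a rough illustration, SAI can be simulated as a string operation matching the X NP AUX Y → X AUX NP Y pattern; restricting the subject to sentence-initial position and fixing a small list of auxiliaries are simplifying assumptions of this sketch, not part of the theory:

```python
# Subject-auxiliary inversion as a toy string transformation.

AUXILIARIES = {"has", "have", "is", "are", "will", "can"}

def invert(sentence):
    """Turn a declarative containing an auxiliary into a yes/no question."""
    words = sentence.rstrip(".").split()
    for i, w in enumerate(words):
        if w in AUXILIARIES:
            subject, rest = words[:i], words[i + 1:]
            return " ".join([w.capitalize()] + subject + rest) + "?"
    return None   # no auxiliary: this formulation of SAI does not apply

print(invert("John has eaten all the heirloom tomatoes."))
# Has John eaten all the heirloom tomatoes?
```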
In the 1970s, by the time of the Extended Standard Theory, following the work of Joseph Emonds on structure preservation, transformations came to be viewed as holding over trees. By the end of government and binding theory in the late 1980s, transformations are no longer structure changing operations at all, instead they add information to already existing trees by copying constituents.
The earliest conceptions of transformations were that they were construction-specific devices. For example, there was a transformation that turned active sentences into passive ones. A different transformation raised embedded subjects into main clause subject position in sentences such as "John seems to have gone"; and yet a third reordered arguments in the dative alternation. With the shift from rules to principles and constraints that was found in the 1970s, these construction specific transformations morphed into general rules (all the examples just mentioned being instances of NP movement), which eventually changed into the single general rule of move alpha or Move.
Transformations actually come in two types: (i) the post-Deep structure kind mentioned above, which are string or structure changing, and (ii) Generalized Transformations (GTs). Generalized transformations were originally proposed in the earliest forms of generative grammar (e.g. Chomsky 1957). They take small structures which are either atomic or generated by other rules, and combine them. For example, the generalized transformation of embedding would take the kernel "Dave said X" and the kernel "Dan likes smoking" and combine them into "Dave said Dan likes smoking". GTs are thus structure building rather than structure changing. In the Extended Standard Theory and government and binding theory, GTs were abandoned in favor of recursive phrase structure rules. However, they are still present in tree-adjoining grammar as the Substitution and Adjunction operations and they have recently re-emerged in mainstream generative grammar in Minimalism as the operations Merge and Move.
Phrase structure rules
Phrase-structure rules are a way to describe a given language's syntax. They are used to break a natural language sentence down into its constituent parts (also known as syntactic categories) namely phrasal categories and lexical categories (aka parts of speech). Phrasal categories include the noun phrase, verb phrase, and prepositional phrase; lexical categories include noun, verb, adjective, adverb, and many others. Phrase structure rules were commonly used in transformational grammar (TGG), although they were not an invention of TGG; rather, early TGG's added to phrase structure rules (the most obvious example being transformations; see the page transformational grammar for an overview of the development of TGG.) A grammar which uses phrase structure rules is called a phrase structure grammar - except in computer science, where it is known as just a grammar, usually context-free.
Definition
Phrase structure rules are usually of the form A → B C, meaning that the constituent A is separated into the two subconstituents B and C. Some examples are:
S → NP VP
NP → Det N
The first rule reads: an S consists of an NP followed by a VP. This means a sentence consists of a noun phrase followed by a verb phrase. The next one: a noun phrase consists of a determiner followed by a noun.
Further explanations of the constituents: S, Det, NP, VP, AP, PP
Associated with phrase structure rules is a famous example of a grammatically correct sentence. The sentence was constructed by Noam Chomsky as an illustration that syntactically but not semantically correct sentences are possible.
Colorless green ideas sleep furiously can be diagrammed as a phrase tree whose root node S represents a grammatical sentence. The theory of antisymmetry proposed in the early '90s by Richard Kayne is an attempt to derive phrase structure from a single axiom.
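A leftmost expansion of such rules down to lexical categories, followed by lexical insertion, can be sketched as follows; the flattened rule NP → Adj Adj N is an illustrative shortcut chosen for this one sentence rather than a standard rule:

```python
# Top-down expansion of a few phrase structure rules, then lexical
# insertion, deriving Chomsky's example sentence.

RULES = {
    "S":  ["NP", "VP"],
    "NP": ["Adj", "Adj", "N"],
    "VP": ["V", "Adv"],
}

def expand(symbol):
    """Leftmost expansion of a symbol down to lexical categories."""
    if symbol not in RULES:
        return [symbol]
    out = []
    for child in RULES[symbol]:
        out.extend(expand(child))
    return out

categories = expand("S")
words = ["colorless", "green", "ideas", "sleep", "furiously"]
for cat, word in zip(categories, words):
    print(f"{cat}: {word}")
# Adj: colorless / Adj: green / N: ideas / V: sleep / Adv: furiously
```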
Alternative approaches
A number of theories of grammar dispense with the notion of phrase structure rules and operate with the notion of schema instead. Here phrase structures are not derived from rules that combine words, but from the specification or instantiation of syntactic schemata or configurations, often expressing some kind of semantic content independently of the specific words that appear in them. This approach is essentially equivalent to a system of phrase structure rules combined with a noncompositional semantic theory, since grammatical formalisms based on rewriting rules are generally equivalent in power to those based on substitution into schemata.
So, in this type of approach, instead of being derived from the application of a number of phrase structure rules, the sentence "colorless green ideas sleep furiously" would be generated by filling the words into the slots of a schema having the following structure:
(NP(ADJ N) VP(V) AP(ADV))
which would express the following conceptual content:
X DOES Y IN THE MANNER OF Z
Though they are noncompositional, such models are monotonic. This approach is highly developed within Construction grammar, and has had some influence in Head-Driven Phrase Structure Grammar and Lexical functional grammar.
Generative grammar
Generative grammar hypothesizes that language is a mental structure of the human mind. The goal of generative grammar is to make a complete model of this inner-language (or i-language) which could be used to describe all human speech, and predict the grammaticality of any given speech utterance (that is, whether speech would sound correct to native speakers of the language). This approach to language was pioneered by Noam Chomsky. Most generative theories (although not all of them) assume that syntax is based in constituent structure. Generative grammars are among the theories that focus primarily on the form of the sentence rather than the function.
Among the many Chomskyan generative theories of linguistics are:
Transformational Grammar (TG) (now largely out of date)
Government and binding theory (GB) (common in the late 1970s and 1980s)
Minimalism (MP) (the most recent Chomskyan version of generative grammar)
Other theories that find their origin in the generative paradigm are:
Generative semantics (now largely out of date)
Relational grammar (RG) (now largely out of date)
Arc Pair grammar
Generalised phrase structure grammar (now largely out of date)
Head-driven phrase structure grammar
Lexical-functional grammar
HPSG and LFG also fall in the category of unification grammars.
Categorial grammar
Categorial grammar is an approach that focuses on the combinatoric properties of categories. For example, an intransitive verb has the property that it requires a noun phrase (NP) to complete it, and the result is a sentence (S); thus the category of such a verb is NP\S (in one notation).
Tree-adjoining grammar is a categorial grammar, but adds partial tree structures to the categories.
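The combinatorics can be illustrated with categories as plain strings; only backward application (a function of category NP\S consuming an NP to its left) is implemented in this sketch:

```python
# Category combination in a toy categorial grammar. "NP\S" is a function
# that consumes an NP to its left and yields an S.

def backward_apply(left_cat, right_cat):
    """Combine X (left) with X\\Y (right) to give Y, else None."""
    arg, sep, result = right_cat.partition("\\")
    if sep and left_cat == arg:
        return result
    return None

# "John sleeps": NP plus the intransitive verb's NP\S yields S.
print(backward_apply("NP", "NP\\S"))   # S
print(backward_apply("VP", "NP\\S"))   # None: category mismatch
```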
Dependency grammar
Dependency grammar is a different type of approach in which structure is determined by the relation between a word (a head) and its dependents rather than being based in constituent structure.
Some dependency-based theories of syntax:
Algebraic syntax
Word grammar
Operator Grammar
Stochastic/Probabilistic grammars/Network Theories
Theoretical approaches to syntax that are based in probability theory are known as stochastic grammars. One common implementation of such an approach makes use of neural networks (connectionism). Some theories based on this approach are:
Optimality Theory
stochastic context-free grammar
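A stochastic context-free grammar attaches a probability to each rewrite rule, and the probability of a derivation is the product of the probabilities of the rules used. The rules and numbers below are invented for illustration:

```python
# A toy stochastic (probabilistic) context-free grammar: each rule
# carries a probability; rules expanding the same category sum to 1.
RULES = {
    ("S",  ("NP", "VP")): 1.0,
    ("NP", ("dogs",)):    0.6,
    ("NP", ("cats",)):    0.4,
    ("VP", ("bark",)):    0.7,
    ("VP", ("sleep",)):   0.3,
}

def derivation_probability(rules_used):
    """Multiply the probabilities of the rules in a derivation."""
    p = 1.0
    for rule in rules_used:
        p *= RULES[rule]
    return p

# Derivation of "dogs bark": S -> NP VP, NP -> dogs, VP -> bark
p = derivation_probability([
    ("S", ("NP", "VP")),
    ("NP", ("dogs",)),
    ("VP", ("bark",)),
])
print(p)  # 0.42 (up to floating-point rounding)
```

This is the sense in which such grammars assign not just grammaticality but a probability to each sentence of the language.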
Functionalist grammars
Functionalist theories, although concerned with form, are driven by explanations based on the function of a sentence (i.e. its communicative function). Some typical functionalist theories include:
Functional grammar (Dik)
Prague Linguistic Circle
Systemic functional grammar
Cognitive grammar
Construction grammar (CxG)
Role and reference grammar (RRG)
Functional grammar
From Wikipedia, the free encyclopedia
Functional Grammar is a model of grammar motivated by functions. The model was originally developed by Simon C. Dik at the University of Amsterdam in the 1980s and has undergone several revisions since, the latest being the integration of discourse as a major component by Kees Hengeveld. This has led to a renaming of the theory to "Functional Discourse Grammar". This type of grammar is quite distinct from systemic functional grammar as developed by Michael Halliday and many other linguists since the 1970s.
"Functions"
The notion of "function" in FG generalizes the standard distinction of grammatical functions such as subject and object. Constituents of a linguistic utterance are assigned three types or levels of functions:
Semantic function (Agent, Patient, Recipient, etc.), describing the role of participants in states of affairs or actions expressed
Syntactic functions (Subject and Object), defining different perspectives in the presentation of a linguistic expression
Pragmatic functions (Theme and Tail, Topic and Focus), defining the informational status of constituents, determined by the pragmatic context of the verbal interaction
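The three levels of function assignment can be sketched as a table. The sentence and the particular label choices below are my own illustrative picks following the text's categories, not Dik's worked example:

```python
# Function assignment on three levels for "John gave the book to Mary".
# Labels per constituent are illustrative, not a canonical FG analysis.
functions = {
    "John":     {"semantic": "Agent",     "syntactic": "Subject", "pragmatic": "Topic"},
    "the book": {"semantic": "Patient",   "syntactic": "Object",  "pragmatic": "Focus"},
    "to Mary":  {"semantic": "Recipient", "syntactic": None,      "pragmatic": None},
}

def functions_of(constituent, level):
    """Look up a constituent's function at a given level."""
    return functions[constituent][level]

print(functions_of("John", "semantic"))    # Agent
print(functions_of("the book", "syntactic"))  # Object
```

The point of the layered table is that one constituent carries all three kinds of information at once, which is exactly the generalization of "subject" and "object" the text describes.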
Case grammar
Case Grammar is a system of linguistic analysis, focusing on the link between the valence of a verb and the grammatical context it requires, created by the American linguist Charles J. Fillmore in 1968, in the context of Transformational Grammar. This theory analyzes the surface syntactic structure of sentences by studying the combination of deep cases (i.e. semantic roles) -- Agent, Object, Benefactor, Location or Instrument -- which are required by a specific verb. For instance, the verb "give" in English requires an Agent (A), an Object (O), and a Beneficiary (B); e.g. "Jones (A) gave money (O) to the school (B)."
According to Fillmore, each verb selects a certain number of deep cases which form its case frame. Thus, a case frame describes important aspects of the semantic valency of verbs, adjectives and nouns. Case frames are subject to certain constraints, such as that a deep case can occur only once per sentence. Some of the cases are obligatory and others are optional. Obligatory cases may not be deleted, at the risk of producing ungrammatical sentences. For example, Mary gave the apples is ungrammatical in this sense.
A fundamental hypothesis of case grammar is that grammatical functions, such as subject or object, are determined by the deep, semantic valence of the verb, which finds its syntactic correlate in such grammatical categories as Subject and Object, and in grammatical cases such as Nominative, Accusative, etc. Fillmore (1968) puts forward the following hierarchy for a universal subject selection rule:
Agent < Instrumental < Objective
That means that if the case frame of a verb contains an Agent, it is realized as the subject of an active sentence; otherwise, the deep case following the Agent in the hierarchy (i.e. Instrumental) is promoted to subject.
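The subject-selection rule is simple enough to state as code: the highest-ranked deep case present in a verb's case frame is promoted to subject. This sketch follows the hierarchy given above:

```python
# Fillmore's subject-selection hierarchy: the highest-ranked deep
# case present in the verb's case frame becomes the subject of an
# active sentence.
HIERARCHY = ["Agent", "Instrumental", "Objective"]

def select_subject(case_frame):
    """Return the deep case realized as subject, per the hierarchy."""
    for case in HIERARCHY:
        if case in case_frame:
            return case
    return None

# "give" requires Agent, Object(ive), and Beneficiary: the Agent wins.
print(select_subject({"Agent", "Objective", "Beneficiary"}))  # Agent
# "The key opened the door": no Agent, so Instrumental is promoted.
print(select_subject({"Instrumental", "Objective"}))          # Instrumental
```

The second call models sentences like "The key opened the door", where the instrument surfaces as subject precisely because no Agent is in the frame.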
The influence of case grammar on contemporary linguistics has been significant, to the extent that numerous linguistic theories incorporate deep roles in one or other form, such as the so-called Thematic Structure in Government and Binding theory. It has also inspired the development of frame-based representations in AI research[citation needed].
During the 1970s and the 1980s, Charles Fillmore developed his original theory into what was called Frame Semantics. Walter A. Cook, SJ, a linguistics professor at Georgetown University, was one of the foremost case grammar theoreticians following Fillmore's original work. Cook devoted most of his scholarly research from the early 1970s until the 1990s to further developing case grammar as a tool for linguistic analysis, language teaching methodology, and other applications, and was the author of several major texts and many articles on case grammar. Cook directed several doctoral dissertations applying case grammar to various areas of theoretical and applied linguistics research.
Universal grammar
Universal grammar is a theory of linguistics postulating principles of grammar shared by all languages, thought to be innate to humans (linguistic nativism). It attempts to explain language acquisition in general, not describe specific languages. Universal grammar proposes a set of rules intended to explain language acquisition in child development. The application of the idea to the area of second language acquisition (SLA) is represented mainly by the McGill University linguist Lydia White.
Some students of universal grammar study a variety of grammars to abstract generalizations called linguistic universals, often in the form of "If X holds true, then Y occurs." These have been extended to a range of traits, from the phonemes found in languages, to the word orders languages choose, to why children exhibit certain linguistic behaviors, with the argument from the poverty of the stimulus taken to tell against constructivist approaches to linguistic theory. The contrasting school of thought is known as functionalism.
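An implicational universal of the "If X holds true, then Y occurs" form can be checked mechanically against a language sample. The three-language dataset below is a tiny illustrative sample (the implication tested, that verb-initial languages have prepositions, is a classic Greenberg-style universal):

```python
# A toy language sample; real typological databases run to thousands
# of languages, and the features here are deliberately simplified.
LANGUAGES = {
    "Welsh":    {"order": "VSO", "adpositions": "prepositions"},
    "Japanese": {"order": "SOV", "adpositions": "postpositions"},
    "English":  {"order": "SVO", "adpositions": "prepositions"},
}

def holds(universal, data):
    """universal = (condition, consequence), each a (feature, value)
    pair; the universal holds if every language meeting the condition
    also shows the consequence."""
    (ck, cv), (qk, qv) = universal
    return all(lang[qk] == qv for lang in data.values() if lang[ck] == cv)

vso_prepositional = (("order", "VSO"), ("adpositions", "prepositions"))
print(holds(vso_prepositional, LANGUAGES))  # True
```

Note that this only states observations about the sample; whether such generalizations are predictions about possible languages is exactly the dispute raised in the Criticism section below.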
History
The idea can be traced to Roger Bacon's observation that all languages are built upon a common grammar, substantially the same in all languages, even though it may undergo accidental variations, and to the 13th century speculative grammarians who, following Bacon, postulated universal rules underlying all grammars. The concept of a universal grammar or language was at the core of the 17th century projects for philosophical languages. Charles Darwin described language as an instinct in humans, like the upright posture.[1]
The idea rose to notability in modern linguistics with theorists such as Noam Chomsky and Richard Montague, whose theories developed from the 1950s to the 1970s, partly in the course of the "Linguistics Wars".
Chomsky's theory
Further information: Language acquisition device, Generative grammar, X-bar theory, Government and Binding, Principles and parameters, and Minimalist Program
Linguist Noam Chomsky made the argument that the human brain contains a limited set of rules for organizing language. In turn, there is an assumption that all languages have a common structural basis. This set of rules is known as universal grammar.
Speakers proficient in a language know which expressions are acceptable in their language and which are unacceptable. The key puzzle is how speakers come to know the restrictions of their language, since expressions which violate those restrictions are not present in the input marked as such. This absence of negative evidence -- that is, absence of evidence that an expression is part of the class of ungrammatical sentences in one's language -- is the core of the poverty of the stimulus argument. For example, in English one cannot relate a question word like 'what' to a predicate within a relative clause (1):
(1) *What did John meet a man who sold?
Such expressions are not available to language learners: they are, by hypothesis, ungrammatical for speakers of the local language, who neither utter them nor point out to learners that they are unacceptable. Universal grammar offers a solution to the poverty of the stimulus problem by making certain restrictions universal characteristics of human languages. Language learners are consequently never tempted to generalize in an illicit fashion.
The presence of creole languages is cited as further support for this theory. These languages developed and formed when different societies came together and devised their own system of language. Originally these languages were pidgins, which later matured into creoles with rules of their own and native speakers.
The idea of universal grammar is supported by creole languages by virtue of the fact that all or most of these languages share certain features. Syntactically, they use pre-verbal particles to form future and past tenses and multiple negation to deny or negate. Another similarity among creoles is that questions are formed by changing intonation rather than word order.
Criticism
Some linguists oppose the universal grammar theory. It is outspokenly opposed by Geoffrey Sampson, who maintains that universal grammar theories are not falsifiable, arguing that the grammatical generalizations made are simply observations about existing languages and not predictions about what is possible in a language.
Some feel that the basic assumptions of Universal Grammar are unfounded. One way of defusing the poverty of the stimulus argument is to suppose that language learners notice the absence of classes of expressions in the input and, on this basis, hypothesize a restriction. This solution is closely related to Bayesian reasoning. Elman et al. argue that the unlearnability of languages assumed by UG rests on a too-strict, "worst-case" model of grammar.
Critics argue that the postulate of a "language acquisition device" essentially amounts to the trivial claim that languages are, in fact, learnt by humans, and that the LAD isn't a theory so much as the explanandum looking for theories.[2]
The Pirahã language has been claimed by the linguist Daniel Everett to be a counterexample to Universal Grammar, showing properties allegedly unexpected under current views of Universal Grammar. Among other things, this language is alleged to lack all evidence for recursion, including embedded clauses, as well as quantifiers and color terms.[3] Other linguists have argued, however, that some of these properties have been misanalyzed, and that others are actually expected under current theories of Universal Grammar.[4] While most languages studied in this respect do indeed seem to share common underlying rules, research is hampered by considerable sampling bias. The linguistically most diverse areas, such as tropical Africa and the Americas, along with the diversity of Indigenous Australian and Papuan languages, have been insufficiently studied. Furthermore, language extinction has apparently affected most severely those areas where most examples of unconventional languages have been found to date[citation needed].
Guidance
The comprehensive school counseling program refers to a sequential, developmental program designed to benefit all students in preparation for their futures. Such a program includes a curriculum organized around three areas essential for students' growth and development: Academic Development, Career Development, and Personal/Social Development.
Demonstrate a positive attitude toward self as a unique and worthy person.
Gain life-planning skills that are consistent with their needs, interests, and abilities.
Develop responsible social skills and an understanding and appreciation of being a contributing member of society.
Demonstrate an understanding and appreciation of the life-long process of learning, growing, and changing.
Activities and strategies for achieving identified student outcomes in these three areas can be integrated across the curriculum by teachers and counselors. A goal for this guide is to illustrate the connectivity between the National Standards, the ABCs Goals, the SCANS, and the National Career Development Guidelines. This Guidance Curriculum for a Comprehensive School Counseling Program is student centered and teacher friendly. Counselors should use it as a blueprint for collaboratively building a sequential and developmentally appropriate school counseling program.
Behaviourism
Behavioural (or "behavioral") theory in psychology is a very substantial field: this page can offer only an introduction to some of its more detailed contributions impinging on how people learn in the real world. How I have the effrontery to produce a single page on it amazes even me, whatever my reservations about it!
Behaviourism is primarily associated with Pavlov (classical conditioning) in Russia and with Thorndike, Watson and particularly Skinner in the United States (operant conditioning).
Behaviourism is dominated by the constraints of its (naïve) attempts to emulate the physical sciences, which entails a refusal to speculate about what happens inside the organism. Anything which relaxes this requirement slips into the cognitive realm.
Much behaviourist experimentation is undertaken with animals and generalised.
In educational settings, behaviourism implies the dominance of the teacher, as in behaviour modification programmes. It can, however, be applied to an understanding of unintended learning.
For our purposes, behaviourism is relevant mainly to:
Skill development, and
The "substrate" (or "conditions", as Gagné puts it) of learning
If you want to follow your own links, use "behaviorism" (sic). Most of the material is US-based, and "behaviorism" and "behaviorist" is how they spell it; I freely admit that this side-bar is purely to get the stupid search engine "bots" to register "behavior".
Classical conditioning:
is the process of reflex learning—investigated by Pavlov—through which an unconditioned stimulus (e.g. food) which produces an unconditioned response (salivation) is presented together with a conditioned stimulus (a bell), such that the salivation is eventually produced on the presentation of the conditioned stimulus alone, thus becoming a conditioned response.
This is a disciplined account of our common-sense experience of learning by association (or "contiguity", in the jargon), although that is often much more complex than a reflex process, and is much exploited in advertising. Note that it does not depend on us doing anything.
Such associations can be chained and generalised (for better or for worse): thus "smell of baking" associates with "kitchen at home in childhood" associates with "love and care". (Smell creates potent conditioning because of the way it is perceived by the brain.) But "sitting at a desk" associates with "classroom at school" and hence perhaps with "humiliation and failure"...
Operant Conditioning
If, when an organism emits a behaviour (does something), the consequences of that behaviour are reinforcing, it is more likely to emit (do) it again. What counts as reinforcement, of course, is based on the evidence of the repeated behaviour, which makes the whole argument rather circular.
Learning is really about the increased probability of a behaviour based on reinforcement which has taken place in the past, so that the antecedents of the new behaviour include the consequences of previous behaviour.
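The idea that reinforcement raises the future probability of a behaviour can be illustrated with a toy simulation. The learning rule here (a fixed probability increment per reinforced emission) is my own crude assumption for illustration, not a model from the behaviourist literature:

```python
import random

# Toy operant-conditioning simulation: each reinforced emission of a
# behaviour nudges up the probability of emitting it again.
random.seed(0)

def condition(trials, reinforced, p=0.1, step=0.05):
    """Return the emission probability after a series of trials."""
    for _ in range(trials):
        emitted = random.random() < p
        if emitted and reinforced:
            p = min(1.0, p + step)   # consequence strengthens behaviour
    return p

print(condition(100, reinforced=True))   # typically well above the 0.1 baseline
print(condition(100, reinforced=False))  # stays at the baseline 0.1
```

Withdrawal of reinforcement could be modelled the same way with a decrement, giving the extinction curve mentioned below.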
The schedule of reinforcement of behaviour is central to the management of effective learning on this basis, and working it out is a very skilled procedure: simply reinforcing every instance of desired behaviour is just bribery, not the promotion of learning.
Withdrawal of reinforcement eventually leads to the extinction of the behaviour, except in some special cases such as anticipatory-avoidance learning.
Behaviorism
Definition
Behaviorism is a theory of animal and human learning that focuses only on objectively observable behaviors and discounts mental activities. Behavior theorists define learning as nothing more than the acquisition of new behavior.
Discussion
Experiments by behaviorists identify conditioning as a universal learning process. There are two different types of conditioning, each yielding a different behavioral pattern:
Classic conditioning occurs when a natural reflex responds to a stimulus. The most popular example is Pavlov's observation that dogs salivate when they eat or even see food. Essentially, animals and people are biologically "wired" so that a certain stimulus will produce a specific response.
Behavioral or operant conditioning occurs when a response to a stimulus is reinforced. Basically, operant conditioning is a simple feedback system: If a reward or reinforcement follows the response to a stimulus, then the response becomes more probable in the future. For example, leading behaviorist B.F. Skinner used reinforcement techniques to teach pigeons to dance and bowl a ball in a mini-alley.
There have been many criticisms of behaviorism, including the following:
Behaviorism does not account for all kinds of learning, since it disregards the activities of the mind.
Behaviorism does not explain some learning--such as the recognition of new language patterns by young children--for which there is no reinforcement mechanism.
Research has shown that animals adapt their reinforced patterns to new information. For instance, a rat can shift its behavior to respond to changes in the layout of a maze it had previously mastered through reinforcements.
How Behaviorism Impacts Learning
This theory is relatively simple to understand because it relies only on observable behavior and describes several universal laws of behavior. Its positive and negative reinforcement techniques can be very effective--both in animals, and in treatments for human disorders such as autism and antisocial behavior. Behaviorism often is used by teachers, who reward or punish student behaviors.
Piaget
Definition
Swiss biologist and psychologist Jean Piaget (1896-1980) is renowned for constructing a highly influential model of child development and learning. Piaget's theory is based on the idea that the developing child builds cognitive structures--in other words, mental "maps," schemes, or networked concepts for understanding and responding to physical experiences within his or her environment. Piaget further attested that a child's cognitive structure increases in sophistication with development, moving from a few innate reflexes such as crying and sucking to highly complex mental activities.
Discussion
Piaget's theory identifies four developmental stages and the processes by which children progress through them. The four stages are:
Sensorimotor stage (birth - 2 years old)--The child, through physical interaction with his or her environment, builds a set of concepts about reality and how it works. This is the stage where a child does not know that physical objects remain in existence even when out of sight (object permanence).
Preoperational stage (ages 2-7)--The child is not yet able to conceptualize abstractly and needs concrete physical situations.
Concrete operations (ages 7-11)--As physical experience accumulates, the child starts to conceptualize, creating logical structures that explain his or her physical experiences. Abstract problem solving is also possible at this stage. For example, arithmetic equations can be solved with numbers, not just with objects.
Formal operations (beginning at ages 11-15)--By this point, the child's cognitive structures are like those of an adult and include conceptual reasoning.
Piaget outlined several principles for building cognitive structures. During all development stages, the child experiences his or her environment using whatever mental maps he or she has constructed so far. If the experience is a repeated one, it fits easily--or is assimilated--into the child's cognitive structure so that he or she maintains mental "equilibrium." If the experience is different or new, the child loses equilibrium, and alters his or her cognitive structure to accommodate the new conditions. This way, the child erects more and more adequate cognitive structures.
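The assimilation/accommodation cycle described above can be sketched as a loop over experiences. This is a deliberately crude abstraction of the text, not a formal model of cognition:

```python
# Piaget's assimilation/accommodation cycle, crudely abstracted:
# familiar experiences fit the existing schema set (assimilation);
# novel ones force the set to change (accommodation).
def experience(schemas, event):
    if event in schemas:
        return "assimilated"        # fits existing cognitive structure
    schemas.add(event)              # structure altered to fit the event
    return "accommodated"

schemas = {"sucking", "crying"}
print(experience(schemas, "sucking"))   # assimilated
print(experience(schemas, "grasping"))  # accommodated
print(experience(schemas, "grasping"))  # assimilated (now familiar)
```

The third call shows the "more and more adequate cognitive structures" point: once accommodated, a formerly novel experience assimilates without disturbance.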
How Piaget's Theory Impacts Learning
Curriculum--Educators must plan a developmentally appropriate curriculum that enhances their students' logical and conceptual growth.
Instruction--Teachers must emphasize the critical role that experiences--or interactions with the surrounding environment--play in student learning. For example, instructors have to take into account the role that fundamental concepts, such as the permanence of objects, play in establishing cognitive structures.
Government and binding theory
Government and binding is a theory of syntax in the tradition of transformational grammar developed principally by Noam Chomsky in the 1980s.[1][2][3] This theory is a radical revision of his earlier theories [4][5][6] and was later revised in The Minimalist Program (1995)[7] and several subsequent papers, the latest being Three Factors in Language Design (2005).[8] Although there is a large literature on government and binding theory which is not written by Chomsky, Chomsky's papers have been foundational in setting the research agenda.
The name refers to two central subtheories of the theory: government, which is an abstract syntactic relation, and binding, which deals with the referents of pronouns, anaphors, and R-expressions. GB was the first theory to be based on the principles and parameters model of language, which also underlies the later developments of the Minimalist Program.
Government
The main application of the government relation concerns the assignment of case. Government is defined as follows:
A governs B if and only if
A is a governor and
A m-commands B and
no barrier intervenes between A and B.
Governors are heads of the lexical categories (V, N, A, P) and tensed I (T). A m-commands B if A does not dominate B and B does not dominate A and the first maximal projection of A dominates B. The maximal projection of a head X is XP. This means that for example in a structure like the following, A m-commands B, but B does not m-command A:
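The tree diagram referred to here is missing from this copy, but the structure can be reconstructed as a small sketch: a head A whose maximal projection AP dominates a phrase BP containing B. The code below checks the m-command definition from the text on such a toy tree (node labels are my own):

```python
# A toy tree encoded as child -> parent; AP is the root:
#   AP -> A' -> { A, BP }, BP -> B
PARENT = {
    "A'": "AP",
    "A":  "A'",
    "BP": "A'",
    "B":  "BP",
}

def dominates(x, y):
    """x dominates y if x lies on the path from y up to the root."""
    node = PARENT.get(y)
    while node is not None:
        if node == x:
            return True
        node = PARENT.get(node)
    return False

def maximal_projection(head):
    """First XP node above the head (labels ending in 'P' here)."""
    node = PARENT.get(head)
    while node is not None and not node.endswith("P"):
        node = PARENT.get(node)
    return node

def m_commands(a, b):
    """A m-commands B: neither dominates the other, and the maximal
    projection of A dominates B."""
    return (not dominates(a, b) and not dominates(b, a)
            and dominates(maximal_projection(a), b))

print(m_commands("A", "B"))  # True: AP dominates B
print(m_commands("B", "A"))  # False: BP does not dominate A
```

The asymmetry in the two calls is exactly the "A m-commands B, but B does not m-command A" configuration the text describes.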
In addition, barrier is defined as follows:[9] A barrier is any node Z such that
Z is a potential governor for B and
Z c-commands B and
Z does not c-command A
The government relation makes case assignment unambiguous. The tree diagram below illustrates how DPs are governed and assigned case by their governing heads:
Another important application of the government relation is to constrain the occurrence and identity of traces, since the Empty Category Principle requires them to be properly governed.
Binding
Binding can be defined as follows:
An element α binds an element β if and only if α c-commands β, and α and β are co-referent.
Consider the sentence "John saw his mother", which is diagrammed below using simple phrase structure rules.
"John" c-commands "his" because the first non-trivial parent of "John", S, contains "his". "John" and "his" are also co-referent (they refer to the same person), therefore "John" binds "his".
On the other hand, in the sentence "A friend of John saw his mother", "John" does not c-command "his", so they have no binding relationship, regardless of whether they are co-referent (which they may be; the example is ambiguous).
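The binding check on "John saw his mother" can be sketched over a toy constituent tree. The tree encoding (child -> parent) and node names are my own; the c-command test simplifies "first branching node above A" to A's parent, which suffices for this tree:

```python
# Constituent tree for "John saw his mother": S -> NP_John VP,
# VP -> V_saw NP_obj, NP_obj -> his mother.
PARENT = {
    "NP_John": "S",
    "VP":      "S",
    "V_saw":   "VP",
    "NP_obj":  "VP",
    "his":     "NP_obj",
    "mother":  "NP_obj",
}
COREFERENT = {("NP_John", "his")}   # both refer to John

def dominates(x, y):
    node = PARENT.get(y)
    while node is not None:
        if node == x:
            return True
        node = PARENT.get(node)
    return False

def c_commands(a, b):
    """A c-commands B: A's parent dominates B and neither dominates
    the other (simplified to parents, adequate for this tree)."""
    return (not dominates(a, b) and not dominates(b, a)
            and dominates(PARENT[a], b))

def binds(a, b):
    """Binding per the text: c-command plus co-reference."""
    return c_commands(a, b) and (a, b) in COREFERENT

print(binds("NP_John", "his"))       # True: "John" binds "his"
print(c_commands("his", "NP_John"))  # False: no binding the other way
```

In the "A friend of John saw his mother" case, "John" would sit inside the subject NP, so the c-command test would fail just as the text says, whatever the co-reference facts.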
The importance of binding is shown in the grammaticality of the following sentences:
[1] *Johni saw himi. (ungrammatical with co-reference)
[2] John saw himself. (unambiguously co-referent)
[3] *Himself saw John. (ungrammatical)
[4] *Johni saw Johni. (ungrammatical, unless it refers to two distinct Johns)
Binding is used, along with particular binding principles, to explain the ungrammaticality of those statements. The applicable rules are called Binding Principle A, Binding Principle B, and Binding Principle C.
Principle A states that anaphors (reflexives and reciprocals, such as "each other") must always be bound in their domains. Since there is nothing to bind "himself" in sentence [3], that principle is violated, and the sentence is ungrammatical.
Principle B states that a pronoun must never be bound within its domain. If, in sentence [1], "John" and "him" are co-referent, then there is a binding relationship between them, violating the principle and resulting in ungrammaticality.
Principle C states that R-expressions must never be bound. R-expressions are referential expressions: non-pronoun, uniquely identifiable entities, such as "the dog", or proper names such as "John". In sentence [4], the first instance of "John" binds the second, resulting in the ungrammaticality.
Note that Principles A and B refer to domains. It is difficult to define a domain in a way that explains all the data, though the definition may be related to movement islands and the Phase Impenetrability Constraint.
Transformational grammar
In linguistics, a transformational grammar, or transformational-generative grammar (TGG), is a generative grammar, especially of a natural language, that has been developed in a Chomskyan tradition. Additionally, transformational grammar is the Chomskyan tradition that gives rise to specific transformational grammars. Much current research in transformational grammar is inspired by Chomsky's Minimalist Program.
Deep and surface:
In 1957, Noam Chomsky published Syntactic Structures, in which he developed the idea that each sentence in a language has two levels of representation — a deep structure and a surface structure.[2] [3] The deep structure represented the core semantic relations of a sentence, and was mapped on to the surface structure (which followed the phonological form of the sentence very closely) via transformations. Chomsky believed that there would be considerable similarities between languages' deep structures, and that these structures would reveal properties, common to all languages, which were concealed by their surface structures. However, this was perhaps not the central motivation for introducing deep structure. Transformations had been proposed prior to the development of deep structure as a means of increasing the mathematical and descriptive power of Context-free grammars. Similarly, deep structure was devised largely for technical reasons relating to early semantic theory. Chomsky emphasizes the importance of modern formal mathematical devices in the development of grammatical theory:
But the fundamental reason for [the] inadequacy of traditional grammars is a more technical one. Although it was well understood that linguistic processes are in some sense "creative", the technical devices for expressing a system of recursive processes were simply not available until much more recently. In fact, a real understanding of how a language can (in Humboldt's words) "make infinite use of finite means" has developed only within the last thirty years, in the course of studies in the foundations of mathematics.
(Aspects of the Theory of Syntax, p. 8 [2])
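The mapping from deep structure to surface structure via a transformation can be illustrated schematically. Subject-auxiliary inversion for yes/no questions is a standard textbook transformation; the rule format below is my own simplification, not Chomsky's 1957 notation:

```python
# A schematic transformation: deep structure [NP, Aux, VP] maps to
# surface structure [Aux, NP, VP] (yes/no question formation).
def invert_aux(deep):
    """Subject-auxiliary inversion as a structure-to-structure map."""
    np, aux, vp = deep
    return [aux, np, vp]

deep_structure = [["John"], ["will"], ["leave"]]
surface = invert_aux(deep_structure)
print(" ".join(w for part in surface for w in part) + "?")  # will John leave?
```

The point is that the transformation operates on structured representations, not on word strings: "John will leave" and "Will John leave?" share a deep structure and differ only in the transformation applied.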
Development of basic concepts
Though transformations continue to be important in Chomsky's current theories, he has now abandoned the original notion of Deep Structure and Surface Structure. Initially, two additional levels of representation were introduced (LF — Logical Form, and PF — Phonetic Form), and then in the 1990s Chomsky sketched out a new program of research known as Minimalism, in which Deep Structure and Surface Structure no longer featured and PF and LF remained as the only levels of representation.
To complicate the understanding of the development of Noam Chomsky's theories, the precise meanings of Deep Structure and Surface Structure have changed over time — by the 1970s, the two were normally referred to simply as D-Structure and S-Structure by Chomskian linguists. In particular, the idea that the meaning of a sentence was determined by its Deep Structure (taken to its logical conclusions by the generative semanticists during the same period) was dropped for good by Chomskian linguists when LF took over this role (previously, Chomsky and Ray Jackendoff had begun to argue that meaning was determined by both Deep and Surface Structure).
Innate linguistic knowledge
Terms such as "transformation" can give the impression that theories of transformational generative grammar are intended as a model for the processes through which the human mind constructs and understands sentences. Chomsky is clear that this is not in fact the case: a generative grammar models only the knowledge that underlies the human ability to speak and understand. One of the most important of Chomsky's ideas is that most of this knowledge is innate, with the result that a baby can have a large body of prior knowledge about the structure of language in general, and need only actually learn the idiosyncratic features of the language(s) it is exposed to. Chomsky was not the first person to suggest that all languages had certain fundamental things in common (he quotes philosophers writing several centuries ago who had the same basic idea), but he helped to make the innateness theory respectable after a period dominated by more behaviorist attitudes towards language. Perhaps more significantly, he made concrete and technically sophisticated proposals about the structure of language, and made important proposals regarding how the success of grammatical theories should be evaluated.
Chomsky goes so far as to suggest that a baby need not learn any actual rules specific to a particular language at all. Rather, all languages are presumed to follow the same set of rules, but the effects of these rules and the interactions between them can vary greatly depending on the values of certain universal linguistic parameters. This is a very strong assumption, and is one of the most subtle ways in which Chomsky's current theory of language differs from most others.
Grammatical theories
In the 1960s, Chomsky introduced two central ideas relevant to the construction and evaluation of grammatical theories. The first was the distinction between competence and performance. Chomsky noted the obvious fact that people, when speaking in the real world, often make linguistic errors (e.g. starting a sentence and then abandoning it midway through). He argued that these errors in linguistic performance were irrelevant to the study of linguistic competence (the knowledge that allows people to construct and understand grammatical sentences). Consequently, the linguist can study an idealised version of language, greatly simplifying linguistic analysis (see the "Grammaticalness" section below). The second idea related directly to the evaluation of theories of grammar. Chomsky made a distinction between grammars which achieved descriptive adequacy and those which went further and achieved explanatory adequacy. A descriptively adequate grammar for a particular language defines the (infinite) set of grammatical sentences in that language; that is, it describes the language in its entirety. A grammar which achieves explanatory adequacy has the additional property that it gives an insight into the underlying linguistic structures in the human mind; that is, it does not merely describe the grammar of a language, but makes predictions about how linguistic knowledge is mentally represented. For Chomsky, the nature of such mental representations is largely innate, so if a grammatical theory has explanatory adequacy it must be able to explain the various grammatical nuances of the languages of the world as relatively minor variations in the universal pattern of human language. Chomsky argued that, even though linguists were still a long way from constructing descriptively adequate grammars, progress in terms of descriptive adequacy would only come if linguists held explanatory adequacy as their goal. 
In other words, real insight into the structure of individual languages could only be gained through the comparative study of a wide range of languages, on the assumption that they are all cut from the same cloth.
"I-Language" and "E-Language"
In 1986, Chomsky proposed a distinction between I-Language and E-Language, similar but not identical to the competence/performance distinction.[6] I-Language is taken to be the object of study in syntactic theory; it is the mentally represented linguistic knowledge that a native speaker of a language has, and is therefore a mental object — from this perspective, most of Linguistics is a branch of psychology. E-Language encompasses all other notions of what a language is, for example that it is a body of knowledge or behavioural habits shared by a community. Thus, E-Language is not itself a coherent concept[7], and Chomsky argues that such notions of language are not useful in the study of innate linguistic knowledge, i.e. competence, even though they may seem sensible and intuitive, and useful in other areas of study. Competence, he argues, can only be studied if languages are treated as mental objects.
Grammaticality
Further information: Grammaticality
Chomsky argued that the notions "grammatical" and "ungrammatical" could be defined in a meaningful and useful way. In contrast, an extreme behaviorist linguist would argue that language can only be studied through recordings or transcriptions of actual speech, the role of the linguist being to look for patterns in such observed speech, but not to hypothesize about why such patterns might occur, nor to label particular utterances as either "grammatical" or "ungrammatical". Although few linguists in the 1950s actually took such an extreme position, Chomsky was at an opposite extreme, defining grammaticality in an unusually (for the time) mentalistic way.[8] He argued that the intuition of a native speaker is enough to define the grammaticality of a sentence; that is, if a particular string of English words elicits a double take, or feeling of wrongness, in a native English speaker, it can be said that the string of words is ungrammatical (when various extraneous factors affecting intuitions are controlled for). This (according to Chomsky) is entirely distinct from the question of whether a sentence is meaningful, or can be understood. It is possible for a sentence to be both grammatical and meaningless, as in Chomsky's famous example "colorless green ideas sleep furiously". But such sentences manifest a linguistic problem distinct from that posed by meaningful but ungrammatical (non)-sentences such as "man the bit sandwich the", the meaning of which is fairly clear, but which no native speaker would accept as being well formed.
The use of such intuitive judgments freed syntacticians from studying language through a corpus of observed speech, since they were now able to study the grammatical properties of contrived sentences. Without this change in philosophy, the construction of generative grammars would have been almost impossible, since it is often the relatively obscure and rarely-used features of a language which give linguists clues about its structure, and it is very difficult to find good examples of such features in everyday speech.
Minimalism
Main article: Linguistic minimalism
In the mid-1990s to mid-2000s, much research in transformational grammar was inspired by Chomsky's Minimalist Program.[9] The "Minimalist Program" aims at the further development of ideas involving economy of derivation and economy of representation, which had started to become significant in the early 1990s, but were still rather peripheral aspects of TGG theory.
Economy of derivation is a principle stating that movements (i.e. transformations) only occur in order to match interpretable features with uninterpretable features. An example of an interpretable feature is the plural inflection on regular English nouns, e.g. dogs. The word dogs can only be used to refer to several dogs, not a single dog, and so this inflection contributes to meaning, making it interpretable. English verbs are inflected according to the grammatical number of their subject (e.g. "Dogs bite" vs "A dog bites"), but in most sentences this inflection just duplicates the information about number that the subject noun already has, and it is therefore uninterpretable.
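The interpretable/uninterpretable contrast can be sketched in a few lines of code. This is a deliberately simplified illustration (the set-based representation is invented here, not part of any minimalist formalism): an uninterpretable feature on a verb is licensed only if a matching interpretable feature, such as the plural on the subject noun, is present to check it.

```python
# A toy sketch of feature checking in the minimalist sense.  Features
# are modeled as plain string sets; this representation is invented
# for illustration and is far simpler than any real implementation.

def features_check(interpretable, uninterpretable):
    """True iff every uninterpretable feature finds a matching
    interpretable feature (and so can be checked and deleted)."""
    return set(uninterpretable) <= set(interpretable)

# "Dogs bite": the verb's uninterpretable [plural] agreement is matched
# by the subject noun's interpretable [plural] inflection.
print(features_check(interpretable={"plural"}, uninterpretable={"plural"}))  # True

# "*Dogs bites": the verb's [singular] agreement finds no match.
print(features_check({"plural"}, {"singular"}))  # False
```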
Economy of representation is the principle that grammatical structures must exist for a purpose, i.e. the structure of a sentence should be no larger or more complex than required to satisfy constraints on grammaticality.
Both notions, as described here, are somewhat vague, and indeed the precise formulation of these principles is controversial.[10][11] An additional aspect of minimalist thought is the idea that the derivation of syntactic structures should be uniform; that is, rules should not be stipulated as applying at arbitrary points in a derivation, but instead apply throughout derivations. Minimalist approaches to phrase structure have resulted in "Bare Phrase Structure", an attempt to eliminate X-bar theory. In 1998, Chomsky suggested that derivations proceed in "phases". The distinction of Deep Structure vs. Surface Structure is not present in Minimalist theories of syntax, and the most recent phase-based theories also eliminate LF and PF as unitary levels of representation.
Mathematical representation
Returning to the more general mathematical notion of a grammar, an important feature of all transformational grammars is that they are more powerful than context free grammars.[12] This idea was formalized by Chomsky in the Chomsky hierarchy. Chomsky argued that it is impossible to describe the structure of natural languages using context free grammars.[13] His general position regarding the non-context-freeness of natural language has held up since then, although his specific examples regarding the inadequacy of CFGs in terms of their weak generative capacity were later disproven.[14][15]
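The kind of pattern at stake in these weak-generative-capacity arguments can be illustrated with two toy recognizers (a sketch invented here, not a formal proof): the language {a^n b^n} is context-free, while {a^n b^n c^n}, a triple matching dependency of the sort cited in non-context-freeness arguments, is not context-free, even though it is trivially checked by counting.

```python
# Two toy formal-language recognizers illustrating the Chomsky
# hierarchy.  {a^n b^n} can be generated by a CFG (S -> a S b | empty);
# {a^n b^n c^n} provably cannot, yet direct counting recognizes it.

def is_anbn(s: str) -> bool:
    """Recognize the context-free language {a^n b^n : n >= 0}."""
    half = len(s) // 2
    return len(s) % 2 == 0 and s == "a" * half + "b" * half

def is_anbncn(s: str) -> bool:
    """Recognize {a^n b^n c^n : n >= 0}, a classic non-context-free
    language, by simple counting."""
    if len(s) % 3:
        return False
    k = len(s) // 3
    return s == "a" * k + "b" * k + "c" * k

print(is_anbn("aabb"), is_anbncn("aabbcc"))    # True True
print(is_anbn("aab"), is_anbncn("aabbc"))      # False False
```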
Transformations
The usual usage of the term 'transformation' in linguistics refers to a rule that takes an input typically called the Deep Structure (in the Standard Theory) or D-structure (in the extended standard theory or government and binding theory) and changes it in some restricted way to result in a Surface Structure (or S-structure). In TGG, Deep structures were generated by a set of phrase structure rules.
For example, a typical transformation in TG is the operation of subject-auxiliary inversion (SAI). This rule takes as its input a declarative sentence with an auxiliary, "John has eaten all the heirloom tomatoes.", and transforms it into "Has John eaten all the heirloom tomatoes?". In their original formulation (Chomsky 1957), these rules were stated as rules that held over strings of terminals, constituent symbols, or both:
X NP AUX Y → X AUX NP Y
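A minimal sketch of this string-level rule follows. It is not Chomsky's formalism: the segmentation into NP and AUX is supplied by hand (a real grammar would derive it), and only surface punctuation and capitalization are patched up around the swap.

```python
# A toy string-level implementation of subject-auxiliary inversion,
# in the spirit of the 1957-style rule X NP AUX Y -> X AUX NP Y.
# The NP/AUX segmentation is given as input, not computed.

def subject_aux_inversion(np, aux, rest):
    """Turn a declarative (NP AUX REST.) into a question (AUX NP REST?)."""
    words = [aux] + np + rest            # the swap of NP and AUX
    words[0] = words[0].capitalize()     # sentence-initial capitalization
    return " ".join(words).rstrip(".") + "?"

print(subject_aux_inversion(
    ["John"], "has", ["eaten", "all", "the", "heirloom", "tomatoes."]))
# Has John eaten all the heirloom tomatoes?
```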
In the 1970s, by the time of the Extended Standard Theory, following the work of Joseph Emonds on structure preservation, transformations came to be viewed as holding over trees. By the end of government and binding theory in the late 1980s, transformations were no longer structure-changing operations at all; instead they added information to already existing trees by copying constituents.
The earliest conceptions of transformations were that they were construction-specific devices. For example, there was a transformation that turned active sentences into passive ones. A different transformation raised embedded subjects into main clause subject position in sentences such as "John seems to have gone"; and yet a third reordered arguments in the dative alternation. With the shift from rules to principles and constraints that was found in the 1970s, these construction specific transformations morphed into general rules (all the examples just mentioned being instances of NP movement), which eventually changed into the single general rule of move alpha or Move.
Transformations actually come in two types: (i) the post-Deep structure kind mentioned above, which are string or structure changing, and (ii) Generalized Transformations (GTs). Generalized transformations were originally proposed in the earliest forms of generative grammar (e.g. Chomsky 1957). They take small structures which are either atomic or generated by other rules, and combine them. For example, the generalized transformation of embedding would take the kernel "Dave said X" and the kernel "Dan likes smoking" and combine them into "Dave said Dan likes smoking". GTs are thus structure building rather than structure changing. In the Extended Standard Theory and government and binding theory, GTs were abandoned in favor of recursive phrase structure rules. However, they are still present in tree-adjoining grammar as the Substitution and Adjunction operations, and they have recently re-emerged in mainstream generative grammar in Minimalism as the operations Merge and Move.
Phrase structure rules
From Wikipedia, the free encyclopedia
Phrase-structure rules are a way to describe a given language's syntax. They are used to break a natural language sentence down into its constituent parts (also known as syntactic categories), namely phrasal categories and lexical categories (aka parts of speech). Phrasal categories include the noun phrase, verb phrase, and prepositional phrase; lexical categories include noun, verb, adjective, adverb, and many others. Phrase structure rules were commonly used in transformational grammar (TGG), although they were not an invention of TGG; rather, early TGGs added to phrase structure rules (the most obvious example being transformations; see the page transformational grammar for an overview of the development of TGG). A grammar which uses phrase structure rules is called a phrase structure grammar - except in computer science, where it is known simply as a grammar, usually a context-free one.
Definition
Phrase structure rules are usually of the form A → B C, meaning that the constituent A is rewritten as the two subconstituents B and C. Some examples are:

S → NP VP
NP → Det N

The first rule reads: an S consists of an NP followed by a VP; that is, a sentence consists of a noun phrase followed by a verb phrase. The second: a noun phrase consists of a determiner followed by a noun.
The constituent symbols used here: S (sentence), Det (determiner), NP (noun phrase), VP (verb phrase), AP (adjective phrase), PP (prepositional phrase).
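Phrase structure rules of this kind can be sketched directly as rewrite rules in code. The tiny grammar and lexicon below are illustrative inventions, not a fragment from any published analysis; each nonterminal is expanded top-down until only words remain.

```python
# A sketch of phrase structure rules as top-down rewrite rules.
# Grammar and lexicon are invented for illustration.

RULES = {
    "S":  ["NP", "VP"],      # S  -> NP VP
    "NP": ["Det", "N"],      # NP -> Det N
    "VP": ["V", "NP"],       # VP -> V NP
}
LEXICON = {"Det": "the", "N": "dog", "V": "chased"}

def expand(symbol):
    """Rewrite a symbol until only terminal words remain."""
    if symbol in RULES:
        return [word for child in RULES[symbol] for word in expand(child)]
    return [LEXICON[symbol]]

print(" ".join(expand("S")))   # the dog chased the dog
```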
Associated with phrase structure rules is a famous example of a grammatically correct sentence. The sentence was constructed by Noam Chomsky as an illustration that syntactically but not semantically correct sentences are possible.
Colorless green ideas sleep furiously can be diagrammed as a phrase tree whose root node S represents a grammatical sentence. The theory of antisymmetry proposed in the early '90s by Richard Kayne is an attempt to derive phrase structure from a single axiom.
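One conventional bracketing of Chomsky's example can be encoded as nested tuples; the exact constituency assumed below (two stacked adjectives inside the NP, an adverb inside the VP) is an illustrative choice, since analyses differ.

```python
# A phrase tree for "colorless green ideas sleep furiously" encoded as
# nested (label, children...) tuples.  The constituency is one common
# textbook-style bracketing, chosen here for illustration.

tree = ("S",
        ("NP", ("Adj", "colorless"), ("Adj", "green"), ("N", "ideas")),
        ("VP", ("V", "sleep"), ("Adv", "furiously")))

def leaves(node):
    """Collect the terminal words of a bracketed tree, left to right."""
    label, *children = node
    if len(children) == 1 and isinstance(children[0], str):
        return [children[0]]            # pre-terminal: (POS, word)
    return [w for child in children for w in leaves(child)]

print(" ".join(leaves(tree)))  # colorless green ideas sleep furiously
```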
Alternative approaches
A number of theories of grammar dispense with the notion of phrase structure rules and operate with the notion of schema instead. Here phrase structures are not derived from rules that combine words, but from the specification or instantiation of syntactic schemata or configurations, often expressing some kind of semantic content independently of the specific words that appear in them. This approach is essentially equivalent to a system of phrase structure rules combined with a noncompositional semantic theory, since grammatical formalisms based on rewriting rules are generally equivalent in power to those based on substitution into schemata.
So, in this type of approach, instead of being derived from the application of a number of phrase structure rules, the sentence "colorless green ideas sleep furiously" would be generated by filling the words into the slots of a schema having the following structure:
(NP(ADJ N) VP(V) AP(ADV))
This schema would express the following conceptual content:
X DOES Y IN THE MANNER OF Z
Though they are noncompositional, such models are monotonic. This approach is highly developed within Construction grammar, and has had some influence in Head-Driven Phrase Structure Grammar and Lexical functional grammar.
Generative grammar
Main article: Generative grammar
Generative grammar hypothesizes that language is a mental structure of the human mind. The goal of generative grammar is to make a complete model of this inner-language (or i-language) which could be used to describe all human speech, and predict the grammaticality of any given speech utterance (that is, whether speech would sound correct to native speakers of the language). This approach to language was pioneered by Noam Chomsky. Most generative theories (although not all of them) assume that syntax is based in constituent structure. Generative grammars are among the theories that focus primarily on the form of the sentence rather than the function.
Among the many Chomskyan generative theories of linguistics are:
Transformational Grammar (TG) (now largely out of date)
Government and binding theory (GB) (common in the late 1970s and 1980s)
Minimalism (MP) (the most recent Chomskyan version of generative grammar)
Other theories that find their origin in the generative paradigm are:
Generative semantics (now largely out of date)
Relational grammar (RG) (now largely out of date)
Arc Pair grammar
Generalised phrase structure grammar (now largely out of date)
Head-driven phrase structure grammar
Lexical-functional grammar
HPSG and LFG also fall in the category of unification grammars.
Categorial grammar
Categorial grammar is an approach that focuses on the combinatoric properties of categories. For example, an intransitive verb has the property that it requires a noun phrase (NP) to complete it and the result is a sentence (S) thus the category of such a verb is NP\S (in one notation).
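The combinatoric idea can be sketched as function application over categories. This is a toy model invented for illustration (real categorial grammars distinguish directionality, which is ignored here): a complex category is a pair of a result and the argument it is looking for.

```python
# A toy sketch of categorial combination: a complex category is a
# (result, argument) pair; applying it to a matching argument category
# yields the result.  Directionality (\ vs /) is ignored here.

def apply_cat(func, arg):
    """Combine a function category with a matching argument category."""
    result, wanted = func
    if wanted != arg:
        raise ValueError(f"category {func} cannot combine with {arg}")
    return result

# An intransitive verb (NP\S in the text's notation): it needs an NP
# and the result of the combination is a sentence, S.
intransitive_verb = ("S", "NP")
print(apply_cat(intransitive_verb, "NP"))   # S
```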
Tree-adjoining grammar is a categorial grammar, but adds partial tree structures to the categories.
Dependency grammar
Dependency grammar is a different type of approach in which structure is determined by the relation between a word (a head) and its dependents rather than being based in constituent structure.
Some dependency-based theories of syntax:
Algebraic syntax
Word grammar
Operator Grammar
Stochastic/Probabilistic grammars/Network Theories
Theoretical approaches to syntax that are based in probability theory are known as stochastic grammars. One common implementation of such an approach makes use of neural networks (connectionism). Some theories based in this approach are:
Optimality Theory
stochastic context-free grammar
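A stochastic (probabilistic) context-free grammar can be sketched by attaching probabilities to each nonterminal's expansions and sampling from them. The grammar below is invented for illustration; a real PCFG would also assign each derivation a probability (the product of the rule probabilities used), which is omitted here.

```python
# A minimal stochastic CFG: each nonterminal rewrites by sampling one
# of its weighted expansions.  Grammar invented for illustration.

import random

PCFG = {
    "S":   [(1.0, ["NP", "VP"])],
    "NP":  [(0.7, ["Det", "N"]), (0.3, ["N"])],
    "VP":  [(1.0, ["V", "NP"])],
    "Det": [(1.0, ["the"])],
    "N":   [(0.5, ["dog"]), (0.5, ["cat"])],
    "V":   [(1.0, ["sees"])],
}

def sample(symbol, rng=random):
    """Expand a symbol by sampling rules until only terminals remain."""
    if symbol not in PCFG:
        return [symbol]                       # a terminal word
    weights, expansions = zip(*PCFG[symbol])
    chosen = rng.choices(expansions, weights=weights, k=1)[0]
    return [w for child in chosen for w in sample(child, rng)]

print(" ".join(sample("S")))   # e.g. a sentence like "the dog sees the cat"
```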
Functionalist grammars
Functionalist theories, although concerned about form, are driven by explanation based in the function of a sentence (i.e. its communicative function). Some typical functionalist theories include:
Functional grammar (Dik)
Prague Linguistic Circle
Systemic functional grammar
Cognitive grammar
Construction grammar (CxG)
Role and reference grammar (RRG)
Functional grammar
Functional Grammar is a model of grammar motivated by functions. The model was originally developed by Simon C. Dik at the University of Amsterdam in the 1980s and has undergone several revisions since, the latest being the integration of discourse as a major component by Kees Hengeveld. This has led to a renaming of the theory to "Functional Discourse Grammar". This type of grammar is quite distinct from systemic functional grammar as developed by Michael Halliday and many other linguists since the 1970s.
"Functions"
The notion of "function" in FG generalizes the standard distinction of grammatical functions such as subject and object. Constituents (parts of speech) of a linguistic utterance are assigned three types or levels of functions:
Semantic function (Agent, Patient, Recipient, etc.), describing the role of participants in states of affairs or actions expressed
Syntactic functions (Subject and Object), defining different perspectives in the presentation of a linguistic expression
Pragmatic functions (Theme and Tail, Topic and Focus), defining the informational status of constituents, determined by the pragmatic context of the verbal interaction
Case grammar
Case Grammar is a system of linguistic analysis, focusing on the link between the valence of a verb and the grammatical context it requires, created by the American linguist Charles J. Fillmore in 1968, in the context of Transformational Grammar. This theory analyzes the surface syntactic structure of sentences by studying the combination of deep cases (i.e. semantic roles) -- Agent, Object, Beneficiary, Location or Instrument -- which are required by a specific verb. For instance, the verb "give" in English requires an Agent (A), an Object (O), and a Beneficiary (B); e.g. "Jones (A) gave money (O) to the school (B)."
According to Fillmore, each verb selects a certain number of deep cases which form its case frame. A case frame thus describes important aspects of the semantic valency of verbs, adjectives and nouns. Case frames are subject to certain constraints, such as that a deep case can occur only once per sentence. Some of the cases are obligatory and others are optional; deleting an obligatory case produces an ungrammatical sentence. For example, "Mary gave the apples" is ungrammatical in this sense, since the obligatory Beneficiary of "give" is missing.
A fundamental hypothesis of case grammar is that grammatical functions, such as subject or object, are determined by the deep, semantic valence of the verb, which finds its syntactic correlate in such grammatical categories as Subject and Object, and in grammatical cases such as Nominative and Accusative. Fillmore (1968) puts forward the following hierarchy for a universal subject selection rule:
Agent < Instrumental < Objective
This means that if the case frame of a verb contains an Agent, the Agent is realized as the subject of an active sentence; otherwise, the highest-ranked deep case present in the frame (e.g. the Instrumental) is promoted to subject.
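The subject selection rule can be sketched as a simple walk down the hierarchy; the code below is an illustrative toy, with case frames represented as plain string sets rather than any formal case-grammar notation.

```python
# A sketch of Fillmore's subject-selection hierarchy
# (Agent < Instrumental < Objective): the highest-ranked deep case
# present in a verb's case frame surfaces as subject of an active clause.

HIERARCHY = ["Agent", "Instrumental", "Objective"]

def select_subject(case_frame):
    """Return the deep case realized as subject, per the hierarchy."""
    for case in HIERARCHY:
        if case in case_frame:
            return case
    raise ValueError("case frame contains no rankable deep case")

print(select_subject({"Agent", "Instrumental", "Objective"}))  # Agent
print(select_subject({"Instrumental", "Objective"}))           # Instrumental
```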
The influence of case grammar on contemporary linguistics has been significant, to the extent that numerous linguistic theories incorporate deep roles in one or other form, such as the so-called Thematic Structure in Government and Binding theory. It has also inspired the development of frame-based representations in AI research[citation needed].
During the 1970s and the 1980s, Charles Fillmore developed his original theory into what was called Frame Semantics. Walter A. Cook, SJ, a linguistics professor at Georgetown University, was one of the foremost case grammar theoreticians following Fillmore's original work. Cook devoted most of his scholarly research from the early 1970s until the 1990s to further developing case grammar as a tool for linguistic analysis, language teaching methodology, and other applications, and was the author of several major texts and many articles in case grammar. Cook directed several doctoral dissertations applying case grammar to various areas of theoretical and applied linguistics research.
Universal grammar
Universal grammar is a theory of linguistics postulating principles of grammar shared by all languages, thought to be innate to humans (linguistic nativism). It attempts to explain language acquisition in general, not describe specific languages. Universal grammar proposes a set of rules intended to explain language acquisition in child development. The application of the idea to the area of second language acquisition (SLA) is represented mainly by the McGill University linguist Lydia White.
Some students of universal grammar study a variety of grammars to abstract generalizations called linguistic universals, often in the form of "If X holds true, then Y occurs." These have been extended to a range of traits, from the phonemes found in languages, to the word orders languages choose, to why children exhibit certain linguistic behaviors. Proponents appeal to the argument from the poverty of the stimulus, which they take to be a problem for constructivist approaches to linguistic theory. The contrasting school of thought is known as functionalism.
History
The idea can be traced to Roger Bacon's observation that all languages are built upon a common grammar, substantially the same in all languages, even though it may undergo accidental variations, and to the 13th-century speculative grammarians who, following Bacon, postulated universal rules underlying all grammars. The concept of a universal grammar or language was at the core of the 17th-century projects for philosophical languages. Charles Darwin described language as an instinct in humans, like the upright posture.[1]
The idea rose to notability in modern linguistics with theorists such as Noam Chomsky and Richard Montague, developed in the 1950s to 1970s, as part of the "Linguistics Wars".
Chomsky's theory
Further information: Language acquisition device, Generative grammar, X-bar theory, Government and Binding, Principles and parameters, and Minimalist Program
Linguist Noam Chomsky made the argument that the human brain contains a limited set of rules for organizing language. In turn, there is an assumption that all languages have a common structural basis. This set of rules is known as universal grammar.
Speakers proficient in a language know which expressions are acceptable in their language and which are unacceptable. The key puzzle is how speakers come to know these restrictions, since expressions which violate them are never presented in the input marked as unacceptable. This absence of negative evidence -- that is, of evidence that an expression belongs to the class of ungrammatical sentences in one's language -- is the core of the poverty-of-the-stimulus argument. For example, in English one cannot relate a question word like 'what' to a predicate within a relative clause (1):
(1) *What did John meet a man who sold?
Such expressions are not available to language learners: because they are, by hypothesis, ungrammatical for speakers of the local language, those speakers never utter them, and never point out to learners that they are unacceptable. Universal grammar offers a solution to the poverty-of-the-stimulus problem by making certain restrictions universal characteristics of human languages. Language learners are consequently never tempted to generalize in an illicit fashion.
The existence of creole languages is cited as further support for this theory. These languages developed when societies with different languages came together and had to devise a common system of communication. Originally pidgins, they later matured into full languages with consistent rules and native speakers.
Creole languages are taken to support the idea of universal grammar by virtue of the fact that all or most of them share certain features. Syntactically, they use preverbal particles to form future and past tenses, and multiple negation to deny or negate. Another similarity among creoles is that questions are marked by a change in intonation rather than by a change in word order.
Criticism
Some linguists oppose the universal grammar theory. It is outspokenly opposed by Geoffrey Sampson, who maintains that universal grammar theories are not falsifiable, arguing that the grammatical generalizations made are simply observations about existing languages and not predictions about what is possible in a language.
Some feel that the basic assumptions of Universal Grammar are unfounded. The poverty-of-the-stimulus argument can also be defused if language learners notice the absence of classes of expressions in the input and, on this basis, hypothesize a restriction; this solution is closely related to Bayesian reasoning. Elman et al. argue that the unlearnability of languages assumed by UG rests on a too-strict, "worst-case" model of grammar.
Critics argue that the postulate of a "language acquisition device" essentially amounts to the trivial claim that languages are, in fact, learnt by humans, and that the LAD isn't a theory so much as the explanandum looking for theories.[2]
The Pirahã language has been claimed by the linguist Daniel Everett to be a counterexample to Universal Grammar, showing properties allegedly unexpected under current views of Universal Grammar. Among other things, this language is alleged to lack all evidence for recursion, including embedded clauses, as well as quantifiers and color terms.[3] Some other linguists have argued, however, that some of these properties have been misanalyzed, and that others are actually expected under current theories of Universal Grammar.[4] While most languages studied in this respect do indeed seem to share common underlying rules, research is hampered by considerable sampling bias. The linguistically most diverse areas, such as tropical Africa and the Americas, as well as Indigenous Australian and Papuan languages, have been insufficiently studied. Furthermore, language extinction has apparently affected most severely those areas where most examples of unconventional languages have been found to date[citation needed].