

Parent-child Principle in Dependency Grammar


Linguistics sounds boring.  Worse still, many linguists have a hard time finding a job after years of PhD training.  But there is no shortage of fun once we really get into it.  Today I am presenting a fun piece called the Parent-Child Principle in Grammar.

Grammar (syntax) is a branch of linguistics.  Remember the sentence diagrams you had to draw in grammar school?  They may look boring or silly to some but intriguing to others.  Anyway, a grammarian draws diagrams by profession.  He draws all types of diagrams, called trees.  These trees look different in shape, but they are the magic wand in grammarians' hands for uncovering the mystery of language.  Ever since the Lord decided to confuse human languages, the mystery of language had been hidden deep behind all the speeches, until grammarians like Lucien Tesnière and Noam Chomsky started drawing trees.  I have not researched who the first linguist to draw a grammar tree was, but that must have been a historic magic moment, breaking the linguistic curse set by the Lord.

Let us look at this monster of human languages.  We humans speak in a linear fashion: words are uttered one after another in sequence, in one dimension.  But this linear thing turns into a picture in the eyes of grammarians: one-dimensional strings are represented as two-dimensional trees.  What a wonder!  A tree is a magic device for reading structure out of an unstructured list of words, making our subconscious understanding explicit.  We cannot overrate this operation of tree drawing, called parsing in linguistics.  It is the very core of Natural Language Understanding and Artificial Intelligence.  Open Sesame: once you have an automatic parser in hand, the door to languages opens.  Illustrated below are two trees automatically drawn by our English parser and our Chinese parser, proudly implemented by ourselves.  See how beautiful they are, despite imperfections in the automation.

Our parsers are robots, insatiable and never tired.  As long as we feed them a sea of sentences, they will draw trees one by one, non-stop, with lightning speed and consistent quality, eventually forming a gigantic forest.  We have the largest linguistic forest in the world in our storage, ready to serve our text mining products as well as language search and research.  The Penn Treebank, a collection of human-drawn trees widely used in the NLP community, can in no way catch up in quantity.  That is the beauty of a robot.  A human in its place might well go crazy endlessly drawing trees until completely drowned in the forest, seeing everything in the world as a tree and incapable of seeing the big picture of the forest.  I myself once went into that state, and fortunately I jumped out of it when I embarked on building a robot to replace (and save) myself.

There are two types of trees, both drawn upside down, with the root up and the branches down.  They are based on two types of linguistic theories, represented by the aforementioned grammar masters: Prof. Tesnière, a French linguist of the last century, and Prof. Chomsky, MIT Institute Professor and living god of linguistics.

Tesnière's grammar is represented by the dependency tree, which directly links a child word to its parent word, treating each word as the basic unit without resorting to abstract nodes.  Chomsky's grammar, on the other hand, works via the phrase structure tree, which is characterized by multiple levels of abstraction through intermediaries (non-terminal nodes) for the purpose of pattern generalization.  See the wiki for details of phrase structure trees, which we decided not to use as our core representation (there is good reason for this choice).  What is illustrated in this post is a hybrid tree based on a dependency backbone, retaining some flavor of phrase structure for basic nodes such as NP (Noun Phrase), PP (Prepositional Phrase), etc.
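To make the contrast concrete, here is a minimal Python sketch of one toy sentence in both styles (the formats are illustrative conventions, not the exact representations used by our parsers):

```python
# Dependency tree: every node is a word; each arc links a child word
# directly to its parent word, labeled with a grammatical relation.
dependency_tree = [
    # (child, relation, parent)
    ("the",  "Mod", "girl"),   # determiner modifies the noun
    ("girl", "S",   "left"),   # "girl" is the subject of the verb
    # "left" is the root: it has no parent
]

# Phrase structure tree: abstract non-terminal nodes (S, NP, VP) group
# words into nested phrases; words appear only at the leaves.
phrase_structure_tree = (
    "S",
    ("NP", ("DT", "the"), ("NN", "girl")),
    ("VP", ("VBD", "left")),
)
```

The dependency version mentions only the words themselves, while the phrase structure version hangs the abstract nodes S, NP, and VP above them.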

Tesnière's dependency grammar has a famous principle, which we can call the Parent-Child Principle.  It states something like common sense: a parent can have n (n >= 0) children, but each child can have at most one parent (one father or one mother).
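Stated as code, the principle is just a well-formedness check over the arcs of a tree.  A minimal sketch in Python, reusing the illustrative (child, relation, parent) arc format from above:

```python
from collections import Counter

def satisfies_parent_child_principle(arcs):
    """Return True if no child word takes more than one parent.

    In a real parser, words would be indexed by sentence position so
    that repeated words do not collide; plain strings suffice here.
    """
    parents_per_child = Counter(child for child, _, _ in arcs)
    return all(count <= 1 for count in parents_per_child.values())

# A well-formed tree: each child has exactly one parent.
good = [("the", "Mod", "dog"), ("dog", "S", "barked")]

# An ill-formed "tree": "dog" tries to serve two parents at once.
bad = good + [("dog", "O", "chased")]

print(satisfies_parent_child_principle(good))  # True
print(satisfies_parent_child_principle(bad))   # False
```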

In real life, a child can of course have more than one father or mother.  Steve Jobs, for example, had a biological father and an adoptive father, and he also had a father-in-law.  But from the legal-guardian perspective, there can be only one.  The legal system is thus properly reflected in grammar.  Compare the following two English sentences to see how this principle is manifested in sentence structure:

We asked him to leave immediately.
We asked that he leave immediately.

The meanings of these two sentences are basically the same (with some nuances, of course), but the syntax differs, following the Parent-Child Principle.  In the deep-layer logic, he/him is both the object of "asked" and the subject of "leave".  But when this is expressed in an utterance, English grammar does not allow a noun to serve two parents as both subject and object.  It must follow the principle of one child, one parent.  To do this legitimately, the first sentence uses two grammatical devices: the object pronoun "him" marks its role as object, not subject, and the infinitive ("to" before the bare verb: "to leave") cuts off any possible subject-predicate connection between "him" and "leave" (an English infinitive cannot serve as a predicate).  Similarly, the second sentence also uses two means: the subject pronoun "he" shows its status as subject (of "leave") rather than object (of "asked"), and "that", as an object-clause indicator, eliminates the possibility of linking "he" directly to "asked".  The corresponding dependency trees are illustrated below (S for subject, O for object, Com for complement, Adv for adverbial, Mod for attributive, Cl for clause):
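In arc form, the two trees come out as follows (a simplified hand-written sketch following the legend above, not actual parser output; the function words "to" and "that" are omitted for brevity):

```python
# "We asked him to leave immediately."
# "him" is the object of "asked"; the infinitive "to leave" hangs off
# "asked" as a complement, so "him" never becomes a subject of "leave".
tree_1 = [
    ("We",          "S",   "asked"),
    ("him",         "O",   "asked"),
    ("leave",       "Com", "asked"),   # infinitive "to leave"
    ("immediately", "Adv", "leave"),
]

# "We asked that he leave immediately."
# The that-clause as a whole is the object of "asked"; inside it,
# "he" is the subject of "leave", so "he" never links to "asked".
tree_2 = [
    ("We",          "S",   "asked"),
    ("leave",       "Cl",  "asked"),   # object clause headed by "leave"
    ("he",          "S",   "leave"),
    ("immediately", "Adv", "leave"),
]
# In both trees, "him"/"he" has exactly one parent, as the principle demands.
```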

Think about it.  The Parent-Child Principle is about the legal uniqueness of a role.  If you are already John's son, you should not also be Peter's son.  There is only one family to which you legally belong.  Likewise, if you are the subject of a predicate verb, you should not be an object at the same time; otherwise it leads to confusion or ambiguity.  However, ambiguity does exist when a pattern offers few formal clues for determining the role of a unit.  The most famous case is PP-attachment in the pattern V + NP + PP, where the PP may take the preceding NP as its parent (serving as its modifier) or link to the parent node V (serving as its adverbial).  The existence of ambiguous sentences is arguably evidence supporting this principle rather than violating it: an ambiguous pattern corresponds to two different tree structures, and hence to two different meanings.  So thanks to this principle, hidden ambiguity in language is made explicit in the tree representations.  A famous example of such "structural ambiguity" in PP-attachment is:

They saw the girl with the telescope.

The above sentence is intended as a pun-like expression, a pure ambiguity where even human readers cannot decide between "saw the girl using (with) the telescope" and "the girl wearing (with) the telescope".  But the majority of ambiguous structures are not of this type; they can be disambiguated by humans.  In other words, the grammar structure indicates two possibilities, but semantic constraints help select one.  For example,

They saw [the girl with a hat].
They [hit [the nail] with a hammer].

Much like the eternal love triangle, the relationships among the three entities (V, NP, PP) imply two candidate couples ("V, PP" and "NP, PP"), but in the end you can legally have only one marriage in the modern legal system.  Who marries whom depends on mutual attraction (semantic coherence, or chemistry).  Clearly, "the nail with a hammer" does not sound as coherent as "hit with a hammer".  Similarly, between "the girl with a hat" and "saw with a hat", the former accords with common sense while the latter is nonsense.  Admittedly, this is a true challenge for a parser: a parser based on syntactic evidence is good at detecting such ambiguity but cannot resolve it unless supported by semantic evidence or even common sense.  How to computationally model this decoding process of the human mind is another topic.
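As a toy illustration of this "mutual attraction" idea, attachment can be decided by comparing pairwise coherence scores.  The numbers below are invented purely for the example; a real system would estimate such preferences from data or a semantic lexicon:

```python
# Hand-set coherence scores between a candidate parent and the PP noun.
# These values are made up for illustration only.
coherence = {
    ("hit",  "hammer"): 0.9,   # "hit ... with a hammer": instrument reading
    ("nail", "hammer"): 0.1,   # "the nail with a hammer": incoherent
    ("saw",  "hat"):    0.1,   # "saw ... with a hat": nonsense
    ("girl", "hat"):    0.8,   # "the girl with a hat": natural
}

def attach_pp(verb, noun, pp_noun):
    """Return "Adv" if the PP attaches to the verb, "Mod" if to the noun."""
    v_score = coherence.get((verb, pp_noun), 0.0)
    n_score = coherence.get((noun, pp_noun), 0.0)
    return "Adv" if v_score >= n_score else "Mod"

print(attach_pp("hit", "nail", "hammer"))  # Adv: the PP modifies the verb
print(attach_pp("saw", "girl", "hat"))     # Mod: the PP modifies the noun
```

The syntax detects the two candidate parents; a semantic scorer of this kind picks the one marriage that survives.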

Human language differs from computer languages in the ambiguity it involves.  Ambiguity comes in two basic types: lexical ambiguity involves one word with multiple senses (node-level ambiguity), while structural ambiguity involves one pattern leading to multiple structures in semantic interpretation (branch-level ambiguity).  The Parent-Child Principle is the key to detecting many types of structural ambiguity.

Human communication is a process of information encoding and decoding, with natural language as the vehicle.  To facilitate this process by detecting and minimizing ambiguity, the Parent-Child Principle has naturally emerged.  The principle is supported by numerous linguistic facts across many languages (with only a few exceptions), and it is largely followed subconsciously.  It is a powerful theoretical formulation of universal grammar that guides our implementation of the robot for parsing human language and comprehending its meanings.

 

[Related posts in my original Chinese blog]

【科普小品:文法里的父子原则】 (A popular-science piece: The Parent-Child Principle in Grammar)

乔氏 X 杠杠理论 以及各式树形图表达法 (Chomsky's X-bar theory and various tree diagram notations)

【置顶:立委科学网博客NLP博文一览(定期更新版)】 (Pinned: an overview of NLP posts on this ScienceNet blog, periodically updated)




