<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<TEI xmlns="http://www.tei-c.org/ns/1.0" xmlns:ns2="http://www.tei-c.org/ns/Examples">
    <teiHeader>
        <fileDesc>
            <titleStmt>
                <title>Using Active Constraints to Parse ID/LP Grammars</title>
            </titleStmt>
        </fileDesc>
    </teiHeader>
    <text>
        <front/>
        <body>
            <div>
                <p>Philippe Blache</p>
                <p>Institut d'Informatique, Université de Neuchâtel (Suisse), e-mail: blache@info.unine.ch</p>
                <p>Abstract : … with the goals of generality and control. Active constraints of the constraint logic programming paradigm allow (1) the reduction of the search space of programs and (2) a very concise representation of the problems. These two properties are particularly interesting for parsing problems : they can help us to reduce non-determinism and to use large coverage grammars. In this paper, we describe how to use such constraints for parsing ID/LP grammars and propose an implementation in Prolog III. Keywords : constraints, syntax, ID/LP formalism, bottom-up filtering, Prolog III</p>
                <p>1 Introduction</p>
                <p>Logic programming is one of the most useful tools in computational linguistics. These two domains are progressing very rapidly. The former with the emergence of the constraint paradigm and the latter with the systematic use of well-formalized linguistic theories. In the last few years, natural language processing (hereafter NLP) and more precisely syntax have created tools allowing expression of general knowledge.</p>
                <p>Constraints simplify parsing problems to a considerable extent, both in a formal and computational way. From a formal point of view, we will see that they allow a very good adequacy between linguistic and computational theories. We know that this property is essential to solve generality, reusability and coverage problems. On the other hand, from a computational point of view, constraints set up a control of the processes which reduces non-determinism in parsing.</p>
                <p>The question is to know whether it is possible to implement a parsing method based on actual constraints. The answer depends on the choice of the grammatical formalism. We think that the ID/LP formalism used in GPSG theory can bring a solution to this problem.</p>
                <p>In this paper, we will describe a parsing method based on ID/LP formalism using boolean constraints. We will show that this method agrees with the goals of generality and control.</p>
                <p>ACTES DE COLING-92, NANTES, 23-28 AOÛT 1992</p>
                <p>2 Parsing and deduction</p>
                <p>Both for historical and formal reasons, parsing has close relations with logic. The birth of Prolog, for example, was conditioned by that, and NLP was one of the early applications of this language. One of the reasons, as shown in [Pereira83], is that we can compare parsing and deduction. More precisely, a phrase-structure rule (hereafter PS-rule) can be interpreted as a formula (an implication), like a classical inference rule.</p>
                <p>Thus, a PS-rule of the form : SX → C₁, …, Cₙ can be interpreted as the following implication : C₁ ∧ … ∧ Cₙ ⊃ SX, the clausal form of which is : ¬C₁ ∨ … ∨ ¬Cₙ ∨ SX</p>
                <p>Because of the uniqueness of the positive literal, we can interpret a PS-rule as a Horn clause, with a direct translation into Prolog. Thus, a context-free grammar, represented by a set of PS-rules, corresponds to a set of clauses. To verify the grammaticality of a sentence is thus equivalent to proving the consistency of a set of clauses.</p>
                <p>There is, however, a restriction in the analogy between PS-rules and clauses : a rule defines an order on its right-hand-side elements, whereas a clause does not. This restriction has important consequences on the generality of the mechanisms. Indeed, the notion of order involves a multiplication of the rules describing a given phrase : we get as many rules as there are configurations. This is one of the limits of phrase-structure grammars.</p>
                <p>ID/LP formalism and boolean constraints will allow us to solve this problem. We will obtain a nearly perfect adequacy between the theoretical model and its implementation. Within the classification proposed in [Evans87], it will be a strong direct interpretation of the model.</p>
                <p>3 Constraints and linguistic theory</p>
                <p>The basic mechanism of constraint logic programming is the restriction of the search space, or the reduction of the domain-variables. This goal can be reached differently depending on the active or passive constraint type (cf [VanHentenryck89]). In the classical logic programming framework, the basic technique is that of generate-and-test. In this case, the program generates values for the variables before verifying some of their properties : the search space is reduced a posteriori. On the other hand, in the CLP paradigm, the use of constraints allows the reduction of this space a priori. Moreover, the set of constraints forms a system which incorporates new constraints during the process, while the use of simple predicates verifying a property only has a local scope.</p>
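<p>The contrast between generate-and-test and a priori constraint propagation can be sketched in Python. This is a toy illustration with hypothetical category names and LP pairs, not the paper's implementation :</p>

```python
from itertools import permutations

# Toy data (illustrative, not the paper's grammar): order the NP
# constituents so that every LP pair (a, b), read "a precedes b", holds.
CONSTITUENTS = ["Det", "AP", "N"]
LP = {("Det", "AP"), ("Det", "N"), ("AP", "N")}

def ok(seq):
    # passive check: every applicable LP pair holds in the full sequence
    return all(seq.index(a) < seq.index(b)
               for (a, b) in LP if a in seq and b in seq)

# Generate-and-test: enumerate all orderings, filter a posteriori.
generate_and_test = [s for s in permutations(CONSTITUENTS) if ok(s)]

# Active-constraint style: extend a partial ordering only while no LP
# constraint is already violated, pruning the search space a priori.
def extend(partial, remaining, results):
    if not remaining:
        results.append(tuple(partial))
        return
    for c in sorted(remaining):
        # placing c after `partial` is illegal if c must precede
        # something already placed
        if all((c, p) not in LP for p in partial):
            extend(partial + [c], remaining - {c}, results)

pruned = []
extend([], set(CONSTITUENTS), pruned)
print(generate_and_test)  # [('Det', 'AP', 'N')]
print(pruned)             # [('Det', 'AP', 'N')]
```

<p>Both variants find the same solutions, but the second never explores an ordering whose prefix already violates a constraint.</p>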
                <p>This active/passive distinction can be useful for parsing, especially according to the type of knowledge that is constrained. Active constraints can easily be defined for syntactic structures and their formation. On the other hand, expressing relations between these structures with this kind of constraint is not always possible.</p>
                <p>We will describe the principles governing the formation of the structures. A syntactic structure can be of two types : • simple structures : lexical categories (e.g. Det, N, V ...) • complex structures : phrases or propositions (e.g. NP, VP ...)</p>
                <p>The formation of complex structures is governed by two types of knowledge : • internal : specific information within a structure • external : relations between structures</p>
                <p>Internal knowledge concerns the structure composition, independently of its context. For a phrase, it is the set of its constituents. External knowledge describes interactions between structures. They concern on the one hand the order and on the other hand the government (in the sense of phrase-structure grammars : selection, agreement ...).</p>
                <p>ID/LP formalism uses such a distinction : it separates information about immediate dominance (i.e. the set of possible constituents of a phrase) from that on linear precedence (i.e. the partial order relation between these constituents).</p>
                <p>It is possible to consider these two types of knowledge as constraints (cf [Saint-Dizier91]). But it is important to distinguish their respective functionings. We will illustrate this point by presenting principles for each type.</p>
                <p>• Internal knowledge</p>
                <p>Each complex structure must contain at least one particular element called the head. This category gives the phrase its type and its presence is compulsory. The other constituents are usually optional. We must specify that local constraints could require the presence of a particular category, but it is a sub-categorization aspect : it concerns relations between the sub-structures of the complex structure and is not specific to the structure itself. We will see that this distinction between optional and compulsory constituents can be represented directly as an active constraint.</p>
                <p>• External knowledge</p>
                <p>In the case of ID/LP formalism, the order constraints (i.e. linear precedence) cannot be easily used with an a priori reduction of the search space. Indeed, LP-rules define a partial order upon the set of categories. The LP-acceptability relation uses this order and can be regarded as a constraint upon the domain-variables. It is a symbolic user-defined constraint. The use of this kind of constraint is possible in Chip (cf [Dincbas88]), but not in Prolog III (cf [Colmerauer90]).</p>
                <p>However, using this order relation as an actual constraint allowing the reduction of domain-variables is difficult. In so far as it is a partial order, the LP notion cannot be used to predict the categories that can follow a constituent. It is used during the parse to verify the possibility for each new category to appear at a given place in the syntactic structure.</p>
                <p>Generally speaking, internal properties allow an easier use of active constraints than external ones.</p>
                <p>4 Constraints and ID/LP formalism</p>
                <p>As we have seen, ID-rules of ID/LP formalism only contain the set of possible constituents (without any notion of order). Therefore, an ID-rule is strictly equivalent to a clause. Example : NP →id Det, N, AP ≡ NP ∨ ¬Det ∨ ¬N ∨ ¬AP</p>
                <p>This equivalence is the basis of the conciseness and generality properties of GPSG. But it is difficult to represent. As we have seen, logic programming cannot directly represent the non-ordered aspect of a clause. However, it is possible to represent this kind of information as active constraints. These must allow the expression of the simple fact that a phrase is well-formed if it is at least composed of the constituents C₁, …, Cₙ. Other relations between the structures (like order or selection) will only be verified if this constraint is satisfied.</p>
                <p>Practically, each rule describing a phrase corresponds to a clause whose literals represent categories. An ID-rule is thus translated into a boolean formula where each category corresponds to a boolean. The semantics of this representation is the following :</p>
                <p>A literal is true if it corresponds to a well-formed structure. A structure is well-formed if it corresponds to a lexical category (simple structure) or to a well-formed phrase (complex structure).</p>
                <p>Thus, the boolean value of a complex structure is the interpretation of this formula, and so depends on the value of its constituents. Example :</p>
                <p>Given the following set of ID-rules describing a NP :</p>
                <p>NP →id Det, N</p>
                <p>NP →id N</p>
                <p>NP →id Det, AP, PP, N</p>
                <p>NP →id Det, AP, N</p>
                <p>NP →id Det, PP, N</p>
                <p>This set of rules corresponds to the following formula : (Det ∧ N) ∨ (N) ∨ (Det ∧ AP ∧ PP ∧ N) ∨ (Det ∧ AP ∧ N) ∨ (Det ∧ PP ∧ N) ⊃ NP</p>
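<p>As an illustrative sketch (not the paper's Prolog III code), the antecedent of this formula can be evaluated as a plain boolean function over the realized categories :</p>

```python
# The five ID-rules for NP collapsed into one boolean implication:
# (Det∧N) ∨ N ∨ (Det∧AP∧PP∧N) ∨ (Det∧AP∧N) ∨ (Det∧PP∧N) ⊃ NP.
# Each argument is true when the corresponding well-formed structure
# has been realized in the input.
def np_formula(det, n, ap, pp):
    return ((det and n) or n or (det and ap and pp and n)
            or (det and ap and n) or (det and pp and n))

# A bare N licenses an NP; a Det alone does not.
print(np_formula(det=False, n=True, ap=False, pp=False))   # True
print(np_formula(det=True, n=False, ap=False, pp=False))   # False
```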
                <p>It is interesting to note that the ID/LP formalism strongly reduces the problem of PS-rules multiplication inherent in phrase-structure grammars. However, as we have seen in the previous example, there is still a redundancy in the information. Indeed, a set of rules describing a phrase allows us to distinguish between two types of constituents according to their optional or compulsory aspect. Hence, for each phrase we can define a minimal set of compulsory constituents (generally limited to the head of the phrase), which we call the minimal set of a phrase. Example :</p>
                <p>In the previous example, the minimal set of the NP is {N}.</p>
                <p>We introduce an additional restriction preventing the repetition of an identical category within a phrase. This restriction is very strong and has to be relaxed for some categories (such as PP). But it remains a general principle : most of the categories should not be repeated.</p>
                <p>We then construct a principle defining the well-formedness of complex structures. This principle only concerns internal knowledge :</p>
                <p>A phrase is well-formed iff it respects the following properties : • it contains at least one head • no constituent is repeated • all its embedded phrases are well-formed</p>
                <p>In the logical paradigm (equivalence between a rule and a clause), we say that a literal is true iff it corresponds to a lexical category of the parsed sentence or if it corresponds to a well-formed phrase.</p>
                <p>This formation rule allows us to simplify the verification of the grammaticality of a sentence. We simply need to verify the presence of the minimal set of compulsory constituents to indicate the well-formedness of a phrase. The boolean value of the complete structure is then evaluated recursively. If all the intermediate structures are true, the complete structure is also true and corresponds to a grammatical sentence.</p>
                <p>We will call realization the actual presence of a category in the syntactic structure corresponding to a sentence. The verification process of the well-formedness of a phrase follows these steps :</p>
                <p>1. verification of the realization of the minimal set</p>
                <p>2. verification of the membership of the realized constituents within the minimal set</p>
                <p>3. verification of the uniqueness of the constituents in a phrase</p>
                <p>4. verification of the well-formedness of embedded phrases</p>
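<p>These verification steps can be sketched as a recursive check. The data structures and the grammar fragment below are hypothetical, chosen only to illustrate the principle (step 2 is read here as membership in the phrase's possible-constituent set) :</p>

```python
# constituents: list of (category, subtree) pairs; subtree is None for a
# lexical category, or the constituent list of an embedded phrase.
def well_formed(phrase, constituents, grammar):
    possible, minimal = grammar[phrase]
    cats = [c for c, _ in constituents]
    return (
        any(c in cats for c in minimal)             # 1. minimal set realized
        and all(c in possible for c in cats)        # 2. constituents belong to the phrase
        and len(cats) == len(set(cats))             # 3. no repeated constituent
        and all(sub is None or well_formed(c, sub, grammar)
                for c, sub in constituents)         # 4. embedded phrases well-formed
    )

# Hypothetical grammar fragment: phrase -> (possible constituents, minimal set)
GRAMMAR = {
    "NP": ({"Det", "AP", "PP", "N", "PRel"}, {"N"}),
    "AP": ({"Adj"}, {"Adj"}),
}

np = [("Det", None), ("AP", [("Adj", None)]), ("N", None)]
print(well_formed("NP", np, GRAMMAR))               # True
print(well_formed("NP", [("Det", None)], GRAMMAR))  # False: no head N
```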
                <p>In an active constraint, we replace the set of clauses describing all the possible constructions with a system of constraints S defining the set of possible constituents and the condition of realization for the minimal set. We can represent it as follows :</p>
                <p>Let C be the set of possible constituents of a phrase XP, let X be the head of XP, let M be the minimal set such as M = {X} ∪ C′ (where C′ ⊆ C), and let Δ be the disjunction of the literals of M. The well-formedness constraint is : S = {Δ ⊃ XP} Example :</p>
                <p>The well-formedness constraint for a NP is : {N ⊃ NP}</p>
                <p>The well-formedness constraint for a PP is : {Prep ∧ NP ⊃ PP}</p>
                <p>It is interesting to note that the implication corresponding to the set of rules describing the NP in the previous example forms a system of constraints that can be simplified to {N ⊃ NP}. This property is verified for all phrases :</p>
                <p>Given a grammar G, ∀XP such that XP ∈ G, let Δ be the disjunction of the literals of the minimal set of XP, then the formula corresponding to the rules describing XP is simplified to {Δ ⊃ XP}.</p>
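<p>For the NP example, this simplification can be checked mechanically. The following sketch brute-forces the truth table of the NP antecedent and confirms that it collapses to the bare head N, so the whole constraint reduces to {N ⊃ NP} :</p>

```python
from itertools import product

# Antecedent of the NP implication built from the five ID-rules:
# (Det∧N) ∨ N ∨ (Det∧AP∧PP∧N) ∨ (Det∧AP∧N) ∨ (Det∧PP∧N)
def antecedent(det, n, ap, pp):
    return ((det and n) or n or (det and ap and pp and n)
            or (det and ap and n) or (det and pp and n))

# Every disjunct contains N, and the bare disjunct N absorbs the rest,
# so the antecedent is logically equivalent to N alone.
equivalent = all(antecedent(det, n, ap, pp) == n
                 for det, n, ap, pp in product([False, True], repeat=4))
print(equivalent)  # True
```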
                <p>We thus have both a linguistic and a formal justification of the active constraint used to verify the well-formedness of a phrase.</p>
                <p>5 Implementation in Prolog III</p>
                <p>We will now describe the parsing strategy and its implementation.</p>
                <p>5.1 Bottom-up filtering</p>
                <p>Our parsing strategy relies on the concept of left boundary of a phrase. It is an improvement of the left-corner strategy (cf [Rosenkrantz70]) called bottom-up filtering (cf [Blache90]). It consists in using the information extracted from LP constraints to determine all the left-bounds of the phrases from the list of lexical categories corresponding to a sentence. This process, unlike the left-corner one, relies on a distributional analysis of the categories and the verification of some properties.</p>
                <p>We define the following functions which allow the initialization of the left boundaries.</p>
                <p>• First legal daughters (noted FLD(P)) : this function defines for each phrase P the set of categories that can appear as left boundaries. It is defined as follows (the LP relation between sets is noted ≺) :</p>
                <p>Let P be a phrase, ∀α such that P → α, then FLD, the set of first legal daughters, is defined as follows : FLD(P) = {c ∈ α such that c ≺ α − {c}}</p>
                <p>• Immediate precedence (noted IPₚ(c)) : this function defines for each FLD c of a phrase P the set of categories that can precede c in P. It is defined as follows :</p>
                <p>Let P be a phrase, ∀α such that P → α, let x be a non-terminal, let c ∈ FLD(P), then IPₚ(c), the set of immediate precedence of c for P, is defined as follows : IPₚ(c) = {x such that (x ≺ c) or (x ∈ α and neither x ≺ c nor c ≺ x exist)}</p>
                <p>• Initialize : this function verifies whether a category c is the actual left boundary of a phrase P. It is defined as follows :</p>
                <p>Let l be a string, let C be the list of lexical categories of l, ∀c ∈ C, c′ ∈ N (set of non-terminal symbols) such that c′ precedes c in C ; c initializes S iff c ∈ FLD(S) and c′ ∉ IPₛ(c)</p>
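<p>A minimal sketch of these functions, assuming a toy rule set and LP order (both hypothetical and far smaller than a real grammar) :</p>

```python
# Toy data: ID right-hand sides for NP and an LP order over categories.
RULES = {"NP": [["Det", "N"], ["N"], ["Det", "AP", "N"]]}
LP = {("Det", "AP"), ("Det", "N"), ("AP", "N")}  # (a, b) means a ≺ b

def precedes(a, b):
    return (a, b) in LP

def fld(p):
    """First legal daughters: categories of some right-hand side of P
    that precede every other element of that right-hand side."""
    out = set()
    for rhs in RULES[p]:
        for c in rhs:
            if all(precedes(c, o) for o in rhs if o != c):
                out.add(c)
    return out

def ip(p, c):
    """Categories that may precede c inside P: either x ≺ c, or x is
    unordered with respect to c (i.e. c ≺ x does not hold)."""
    out = set()
    for rhs in RULES[p]:
        if c in rhs:
            out |= {x for x in rhs if x != c and not precedes(c, x)}
    return out

def initializes(c, prev, p):
    """c opens phrase P iff c is a first legal daughter of P and the
    previous category cannot continue P in front of c."""
    return c in fld(p) and prev not in ip(p, c)

print(fld("NP"))                      # {'Det', 'N'}
print(initializes("Det", "V", "NP"))  # True: Det after a V opens a new NP
print(initializes("N", "Det", "NP"))  # False: N may continue the same NP
```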
                <p>The syntactic structure of the sentence is built from a list of partially evaluated structures. The process consists in determining all the left bounds and, from this structure, in completing the partial structures by an analysis of the other constituents of the phrase. This is done by verifying whether the current category can or cannot belong to the current phrase. We have at our disposal the set of possible constituents for each phrase, the LP constraints and the other instantiation principles of the GPSG theory. After these verifications, if the current category cannot belong to the current phrase, then we have reached the right boundary of the current phrase. Example :</p>
                <p>Input sentence : The old man sings. Categorization : Det.Adj.N.V Partial structure : S.(NP, Det).(AP, Adj).N.(VP, V) Complete structure : (S,(NP, Det,(AP, Adj),N),(VP, V))</p>
                <p>This strategy allows a reduction of the search space. Parsing becomes a simple membership test of a category within a set.</p>
                <p>5.2 Implementation</p>
                <p>The following implementation considers only the ID/LP formalism (instead of the entire GPSG theory). We will not speak here about the other GPSG principles, but their insertion in the ID/LP module is very simple.</p>
                <p>The parsing mechanism consists in assigning the value true to the booleans corresponding to the categories as and when they appear. If the structure is simple (i.e. a lexical category), the LP-acceptability of this category in the phrase is checked and the corresponding boolean is assigned the value true. In the case where the bottom-up filtering detects a left-bound, the corresponding boolean of the current category is assigned the value true and the embedded phrase is parsed before coming back to the construction of the current phrase. When we reach the right boundary, the well-formedness of the embedded structures is checked (i.e. all the corresponding booleans must be true). If this is the case, the corresponding boolean value is that of the disjunction Δ of the literals corresponding to the minimal set.</p>
                <p>The representation of the categories and their associated booleans will be done through two parallel lists which will be examined simultaneously during an affectation (or any other operation).</p>
                <p>A phrase is described by the set of its possible constituents, the set of its optional categories and a formula using its minimal set. The two sets are represented by lists and the formula is an implication of the form {Δ ⊃ XP}. This information is collected into a system of constraints characterizing each phrase.</p>
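<p>The two-parallel-lists representation might be emulated as follows ; the class and method names are illustrative, not those of the Prolog III program :</p>

```python
# Two aligned lists: the categories of a phrase and, at the same index,
# the boolean recording whether each category has been realized yet.
class PhraseState:
    def __init__(self, categories):
        self.cats = list(categories)
        self.bools = [False] * len(self.cats)

    def instanciate(self, cat):
        """Assign true to the boolean paired with cat, provided cat is a
        possible constituent of the phrase; return False otherwise."""
        if cat not in self.cats:
            return False
        self.bools[self.cats.index(cat)] = True
        return True

np = PhraseState(["Det", "N", "AP", "PP", "PRel"])
np.instanciate("Det")
np.instanciate("N")
print(np.bools)             # [True, True, False, False, False]
print(np.instanciate("V"))  # False: V is not a possible NP constituent
```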
                <p>Here is a simplified version of our parsing process. The following predicates allow the parsing of a phrase and its simple or complex constituents.</p>
                <p>It can be noted that the grammatical knowledge is pushed at a low level. It is represented by the set of constraints associated to each phrase. Moreover, at this level we do not use the notion of sub-categorization, but only rules concerning the general structure. We will also notice the conciseness of this representation with regard to classical phrase-structure formalisms.</p>
                <p>Description of the implementation</p>
                <p>Let G be the following ID/LP grammar :</p>
                <p>NP →id Det, N</p>
                <p>NP →id N</p>
                <p>NP →id Det, AP, PP, N</p>
                <p>NP →id Det, AP, N</p>
                <p>NP →id Det, PP, N</p>
                <p>NP →id Det, AP, PP, N, PRel</p>
                <p>NP →id Det, AP, N, PRel</p>
                <p>NP →id Det, PP, N, PRel</p>
                <p>NP →id Det, N, PRel</p>
                <p>NP →id N, PRel</p>
                <p>VP →id V</p>
                <p>VP →id V, NP, PP</p>
                <p>VP →id V, NP</p>
                <p>VP →id V, PP</p>
                <p>AP →id Adj</p>
                <p>PP →id Prep, NP</p>
                <p>PRel →id Pro, NP, VP</p>
                <p>The following predicates correspond to the heart of the parser for the grammar G :</p>
                <p>APhrase(&lt;S(c)&gt;.l, l2, Cat, Bool, T) →</p>
                <p>Constituent(S, Cat, Bool)</p>
                <p>LpAcceptable(S, Cat, Bool)</p>
                <p>AnEmbeddedPhrase(&lt;S, c&gt;.l, l1, Cat, Bool, A1)</p>
                <p>APhrase(l1, l2, Cat, Bool, A2)</p>
                <p>Tree(&lt;S[&lt;c&gt;.A1]&gt;.A2, T) ;</p>
                <p>APhrase(&lt;c&gt;.l, l1, Cat, Bool, &lt;c&gt;.A) →</p>
                <p>LpAcceptable(c, Cat, Bool)</p>
                <p>Instanciate(c, Cat, Bool)</p>
                <p>APhrase(l, l1, Cat, Bool, A) ;</p>
                <p>The APhrase rule takes as input the list of partial structures returned by bottom-up filtering. It distinguishes between two cases according to the type of the current structure : complex (rule #1) or simple (rule #2). In the first case, the following processes are called :</p>
                <p>• verification of the membership of the current structure within the set of the possible constituents of the current phrase (Constituent rule)</p>
                <p>• verification of the LP-acceptability (LpAcceptable rule)</p>
                <p>• parse of the embedded complex structure (AnEmbeddedPhrase rule)</p>
                <p>• parse of the rest of the phrase (APhrase rule)</p>
                <p>• construction and verification of the syntactic tree (Tree rule)</p>
                <p>In the case of simple structures, after checking the LP-acceptability, the corresponding boolean is assigned the value true (Instanciate rule) and the parse of the current phrase is pursued.</p>
                <p>If the APhrase rule fails, the right-bound of the phrase is reached and the parse is pursued at a superior level.</p>
                <p>AnEmbeddedPhrase(&lt;S, c&gt;.l, l1, Cat, Bool, A) →</p>
                <p>Constraints(S, C, B, R, S')</p>
                <p>Instanciate(c, C, B)</p>
                <p>APhrase(l, l1, C, B, A)</p>
                <p>CorrectConstituents(R, r)</p>
                <p>Valid(r, S, S', Cat, Bool) ;</p>
                <p>The AnEmbeddedPhrase rule allows the parse of a new complex structure. It begins with installing the system of constraints describing this structure (Constraints rule). The validity of the constituents is checked (CorrectConstituents and Valid rules) before returning the boolean value of the parse for this phrase (variable S').</p>
                <p>Constraints(NP, C, B, R, N_P) →</p>
                <p>{ C = &lt;Det, N, AP, PP, PRel&gt;,</p>
                <p>B = &lt;D_et, N, A_P, P_P, P_Rel&gt;,</p>
                <p>R = &lt;A_P, P_P, P_Rel&gt;,</p>
                <p>N ⇒ N_P } ;</p>
                <p>Constraints(VP, C, B, R, V_P) →</p>
                <p>{ C = &lt;V, NP, PP&gt;,</p>
                <p>B = &lt;V, N_P, P_P&gt;,</p>
                <p>R = &lt;N_P, P_P&gt;,</p>
                <p>V ⇒ V_P } ;</p>
                <p>Constraints(AP, C, B, R, A_P) →</p>
                <p>{ C = &lt;Adj&gt;, B = &lt;A_dj&gt;, R = &lt;&gt;, A_dj ⇒ A_P } ;</p>
                <p>Constraints(PP, C, B, R, P_P) →</p>
                <p>{ C = &lt;Prep, NP&gt;,</p>
                <p>B = &lt;P_rep, N_P&gt;, R = &lt;N_P&gt;, (P_rep &amp; N_P) ⇒ P_P } ;</p>
                <p>Constraints(PRel, C, B, R, P_Rel) →</p>
                <p>{ C = &lt;Pro, NP, VP&gt;,</p>
                <p>B = &lt;P_ro, N_P, V_P&gt;,</p>
                <p>R = &lt;N_P, V_P&gt;,</p>
                <p>(P_ro &amp; V_P) ⇒ P_Rel } ;</p>
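<p>A rough Python analogue of these Constraints facts can make their role concrete : each phrase is paired with its possible constituents and the minimal-set implication used as its well-formedness constraint (a sketch, not the paper's Prolog III code) :</p>

```python
# Each entry: phrase -> (possible constituents, minimal-set implication).
# The lambda plays the role of the boolean formula, e.g. N ⇒ NP.
CONSTRAINTS = {
    "NP":   (["Det", "N", "AP", "PP", "PRel"], lambda b: b["N"]),
    "VP":   (["V", "NP", "PP"],                lambda b: b["V"]),
    "AP":   (["Adj"],                          lambda b: b["Adj"]),
    "PP":   (["Prep", "NP"],                   lambda b: b["Prep"] and b["NP"]),
    "PRel": (["Pro", "NP", "VP"],              lambda b: b["Pro"] and b["VP"]),
}

def phrase_value(phrase, realized):
    """Boolean value of a phrase: its minimal-set formula evaluated on
    the booleans of the realized (assumed well-formed) constituents."""
    possible, formula = CONSTRAINTS[phrase]
    bools = {c: (c in realized) for c in possible}
    return formula(bools)

print(phrase_value("NP", {"Det", "N"}))  # True: the head N is realized
print(phrase_value("PP", {"Prep"}))      # False: the NP is missing
```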
                <p>We can notice that in this representation, sub-categorization consists in verifying the boolean values corresponding to the categories concerned.</p>
                <p>6 Conclusion</p>
                <p>The ID/LP formalism distinguishes between internal and external knowledge about syntactic structures. This characteristic allows the expression of parsing mechanisms at a very high level of generality. We can represent the description of a phrase in an extremely concise way with a rule clustering operation. These properties allow the use of active constraints. The result is an implementation in agreement with the theoretical model, respecting in particular the generality and conciseness properties of GPSG. Moreover, active constraints efficiently control the progress of the processes and limit the non-determinism of parsing. This last characteristic is very important for the ID/LP formalism which uses non-ordered rules implying an increase of the search space.</p>
                <p>We have shown in this paper how to use active constraints for ID/LP formalism. We can apply the same approach to the entire GPSG theory, interpreting feature structures and instantiation principles as formulas (cf [Blache92]).</p>
                <p>The implementation presented here has been done in Prolog III on a Macintosh. From a coverage point of view, we can indicate that the rules of the grammatical formalism presented in our example roughly amount to twenty standard ID-rules.</p>
                <p>References</p>
                <p>[Blache90] Blache P. &amp; J.-Y. Morin (1990) Bottom-Up Filtering : a Parsing Strategy for GPSG, COLING'90.</p>
                <p>[Blache92] Blache P. (1992) Interpretation of GPSG with Constraint Logic Grammars, ICEBOL'92.</p>
                <p>[Colmerauer90] Colmerauer A. (1990) An Introduction to Prolog III, CACM, 33:7.</p>
                <p>[Damas91] Damas L., Moreira N. &amp; Varile G. (1991) The Formal and Processing Models of CLG, proceedings of the 5th European Chapter of the ACL.</p>
                <p>[Dincbas88] Dincbas M., VanHentenryck P., Simonis H., Aggoun A., Graf T. &amp; Berthier F. (1988) The Constraint Logic Programming Language CHIP, International Conference on 5th Generation Computer Systems, ICOT.</p>
                <p>[Evans87] Evans R. (1987) Theoretical and Computational Interrelations of GPSG, Thesis, University of Sussex.</p>
                <p>[Guenthner88] Guenthner F. (1988) Features and Values 1988, CIS-Bericht-90-2, München.</p>
                <p>[Johnson90] Johnson M. (1990) Features, Frames and Quantifier-free Formulae, in Logic and Logic Grammars for Language Processing, P. Saint-Dizier &amp; S. Szpakowicz eds, Ellis Horwood.</p>
                <p>[Kasper90] Kasper R. &amp; W. Rounds (1990) The Logic of Unification in Grammar, in Linguistics and Philosophy, 13:1.</p>
                <p>[Pereira83] Pereira F. &amp; D. Warren (1983) Parsing as Deduction, ACL83, 21st Annual Meeting.</p>
                <p>[Rosenkrantz70] Rosenkrantz D. &amp; P. Lewis (1970) Deterministic Left-corner Parser, IEEE Conference Record of the 11th Annual Symposium on Switching and Automata Theory.</p>
                <p>[Saint-Dizier91] Saint-Dizier P. (1991) Processing Language with Logical Types and Active Constraints, proceedings of the 5th European Chapter of the ACL.</p>
                <p>[Stabler90] Stabler E. (1990) Parsing as Logical Constraint Satisfaction, in Logic and Logic Grammars for Language Processing, P. Saint-Dizier &amp; S. Szpakowicz eds, Ellis Horwood.</p>
                <p>[VanHentenryck89] VanHentenryck P. (1989) Constraint Satisfaction in Logic Programming, MIT Press.</p>
            </div>
        </body>
        <back/>
    </text>
</TEI>
